id,node_id,number,title,user,user_label,state,locked,assignee,assignee_label,milestone,milestone_label,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,repo_label,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 1054244712,I_kwDOBm6k_c4-1n9o,1510,Datasette 1.0 documented template context (maybe via API docs),9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-11-15T23:23:58Z,2023-06-28T02:05:21Z,,OWNER,,Documented context plus protective unit tests. Goal is that custom templates built for 1.x will not break without a 2.x release.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1510/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1054243511,I_kwDOBm6k_c4-1nq3,1509,Datasette 1.0 JSON API (and documentation),9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-11-15T23:22:45Z,2022-03-15T20:38:56Z,,OWNER,,"The new JSON API in a stable, documented form.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1509/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1059509927,I_kwDOBm6k_c4_Jtan,1525,"""Links from other tables"" broken for columns starting with underscore",9599,simonw,closed,0,,,,,3,2021-11-21T22:55:08Z,2021-11-30T06:39:01Z,2021-11-30T06:34:35Z,OWNER,,"Same bug as #1506, this time it's this link or the row page: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1525/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1060631257,I_kwDOBm6k_c4_N_LZ,1528,"Add new `""sql_file""` key to Canned Queries in metadata?",15178711,asg017,open,0,,,,,3,2021-11-22T21:58:01Z,2022-06-10T03:23:08Z,,CONTRIBUTOR,,"Currently for canned queries, you have to inline SQL in your `metadata.yaml` like so: ```yaml databases: fixtures: queries: neighborhood_search: sql: |- select neighborhood, facet_cities.name, state from facetable join facet_cities on facetable.city_id = facet_cities.id where neighborhood like '%' || :text || '%' order by neighborhood title: Search neighborhoods ``` This works fine, but for a few reasons, I usually have my canned queries already written in separate `.sql` files. I'd like to instead re-use those instead of re-writing it. So, I'd like to see a new `""sql_file""` key that works like so: `metadata.yaml`: ```yaml databases: fixtures: queries: neighborhood_search: sql_file: neighborhood_search.sql title: Search neighborhoods ``` `neighborhood_search.sql`: ```sql select neighborhood, facet_cities.name, state from facetable join facet_cities on facetable.city_id = facet_cities.id where neighborhood like '%' || :text || '%' order by neighborhood ``` Both of these would work in the exact same way, where Datasette would instead open + include `neighborhood_search.sql` on startup. A few reasons why I'd like to keep my canned queries SQL separate from metadata.yaml: - Keeping SQL in standalone SQL files means syntax highlighting and other text editor integrations in my code - Multiline strings in yaml, while functional, are a tad cumbersome and are hard to edit - Works well with other tools (can pipe `.sql` files into the `sqlite3` CLI, or use with other SQLite clients easier) - Typically my canned queries are quite long compared to everything else in my metadata.yaml, so I'd love to separate it where possible Let me know if this is a feature you'd like to see, I can try to send up a PR if this sounds right!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1528/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1066288689,I_kwDOBm6k_c4_jkYx,1538,Research pattern for re-registering existing Click tools with register_commands,9599,simonw,closed,0,,,,,3,2021-11-29T17:09:47Z,2021-11-29T17:32:44Z,2021-11-29T17:27:16Z,OWNER,,"Building a Datasette plugin that imports an existing Click CLI tool and re-registers it is proving hard - Click doesn't really want you to do that. I tried this: ```python from datasette import hookimpl from git_history.cli import file as git_history_file @hookimpl def register_commands(cli): cli.command(name=""git-history"")(git_history_file.callback) ``` But when I run this: ``` % datasette git-history --help Usage: datasette git-history [OPTIONS] Analyze the history of a specific file and write it to SQLite Options: --help Show this message and exit. ``` The options are all missing - which means that the command doesn't actually work. Will need to research this pattern separately. _Originally posted by @simonw in https://github.com/simonw/git-history/issues/21#issuecomment-981835305_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1538/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1978023780,I_kwDOBm6k_c515j9k,2205,request.post_vars() method obliterates form keys with multiple values,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,3,2023-11-05T23:25:08Z,2023-11-06T04:10:34Z,,OWNER,,"https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/utils/asgi.py#L137-L139 In GET requests you can do `?foo=1&foo=2` - you can do the same in POST requests, but the `dict()` call here eliminates those duplicates. You can't even try calling `post_body()` and implement your own custom parsing because of: - #2204",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2205/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1075893249,I_kwDOBm6k_c5AINQB,1545,Custom pages don't work on windows,559711,ryascott,closed,0,,,,,3,2021-12-09T18:53:05Z,2022-02-03T02:08:31Z,2022-02-03T01:58:35Z,NONE,,"It seems that custom pages don't work when put in templates/pages To reproduce on datasette version 0.59.4 using PowerShell on WIndows 10 with Python 3.10.0 mkdir -p templates/pages echo ""hello world"" >> templates/pages/about.html Start datasette datasette --template-dir templates/ Navigate to [http://127.0.0.1:8001/about](url) and receive: Error 404: Database not found: about ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1545/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1077893013,I_kwDOBm6k_c5AP1eV,1551,`keep_blank_values=True` when parsing `request.args`,9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2021-12-12T19:53:07Z,2022-01-13T22:26:04Z,2021-12-12T20:02:01Z,OWNER,,"This code in `TableView` wouldn't be necessary: https://github.com/simonw/datasette/blob/492f9835aa7e90540dd0c6324282b109f73df71b/datasette/views/table.py#L396-L399 If that happened here instead: https://github.com/simonw/datasette/blob/492f9835aa7e90540dd0c6324282b109f73df71b/datasette/utils/asgi.py#L98-L100 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1518#issuecomment-991827468_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1551/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1079111498,I_kwDOBm6k_c5AUe9K,1553,if csv export is truncated in non streaming mode set informative response header,536941,fgregg,open,0,,,,,3,2021-12-13T22:50:44Z,2021-12-16T19:17:28Z,,CONTRIBUTOR,,"streaming mode is currently not enabled for custom queries, so the queries will be truncated to max row limit. it would be great if a response is truncated that an header signalling that was set in the header. i need to write some pagination code for getting full results back for a custom query and it would make the code much better if i could reliably known when there is nothing more to limit/offset ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1553/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1082584499,I_kwDOBm6k_c5Ahu2z,1558,Redesign `facet_results` JSON structure prior to Datasette 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2021-12-16T19:45:10Z,2023-01-09T15:31:17Z,,OWNER,,"> Decision: as an initial fix I'm going to de-duplicate those keys by using `tags__array` etc - with a `_2` on the end if that key is already used. > > I'll open a separate issue to redesign this better for Datasette 1.0. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/625#issuecomment-996130862_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1558/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1082765654,I_kwDOBm6k_c5AibFW,1561,"add hash id to ""_memory"" url if hashed url mode is turned on and crossdb is also turned on",536941,fgregg,closed,0,,,,,3,2021-12-17T00:45:12Z,2022-03-19T04:45:40Z,2022-03-19T04:45:40Z,CONTRIBUTOR,,"If hashed_url mode is turned on and crossdb is also turned on, then queries to _memory should have a hash_id. One way that it could work is to have the _memory hash be a hash of all the individual databases. Otherwise, crossdb queries can get quit out of data if using aggressive caching. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1561/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097101917,I_kwDOBm6k_c5BZHJd,1588,`explain query plan select` is too strict about whitespace,9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2022-01-09T04:22:42Z,2022-01-13T22:28:19Z,2022-01-13T20:35:05Z,OWNER,,"`explain query plan select * from facetable` is allowed: https://latest.datasette.io/fixtures?sql=explain+query+plan+select+*+from+facetable But... `explain query plan select * from facetable` (with two spaces before the `select`) returns a ""Statement must be a SELECT"" error: https://latest.datasette.io/fixtures?sql=explain+query+plan++select+*+from+facetable",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1588/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1102359726,I_kwDOBm6k_c5BtKyu,1594,"Add a CLI reference page to the docs, inspired by sqlite-utils",9599,simonw,closed,0,,,7571612,Datasette 0.60,3,2022-01-13T20:55:08Z,2022-01-13T22:28:22Z,2022-01-13T21:38:48Z,OWNER,,"Thought of this while posting this comment: https://github.com/simonw/datasette/issues/1591#issuecomment-1012506595 I added https://sqlite-utils.datasette.io/en/stable/cli-reference.html to `sqlite-utils` in https://github.com/simonw/sqlite-utils/issues/383 and I _really_ like it - it's a page showing the `--help` output of every CLI command for that tool. It's maintained using `cog`. One of the benefits is that I get a free commit history of changes to `--help` at https://github.com/simonw/sqlite-utils/commits/main/docs/cli-reference.rst",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1594/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1121618041,I_kwDOBm6k_c5C2oh5,1620,"Link: rel=""alternate"" to JSON for queries too",9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-02-02T08:02:42Z,2022-02-02T21:53:02Z,2022-02-02T21:33:00Z,OWNER,,"Following: - #1533 I implemented it for tables and rows but I should have done queries as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1620/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1108846067,I_kwDOBm6k_c5CF6Xz,1606,Tests failing against Python 3.6,9599,simonw,closed,0,,,,,3,2022-01-20T04:22:44Z,2022-01-20T04:36:42Z,2022-01-20T04:36:42Z,OWNER,,"https://github.com/simonw/datasette/runs/4877484366 ``` E File ""/opt/hostedtoolcache/Python/3.6.15/x64/lib/python3.6/site-packages/uvicorn/server.py"", line 67, in run E return asyncio.run(self.serve(sockets=sockets)) E AttributeError: module 'asyncio' has no attribute 'run' ``` I think this may mean `uvicorn` has dropped support for Python 3.6.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1606/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1157182254,I_kwDOBm6k_c5E-TMu,1646,Configuration directory mode does not pick up other file extensions than .db,15640196,dnsos,closed,0,,,,,3,2022-03-02T13:15:23Z,2022-10-07T23:06:17Z,2022-10-07T23:03:35Z,NONE,,"Hello, I've been trying to run Datasette with the [configuration directory mode](https://docs.datasette.io/en/stable/settings.html#configuration-directory-mode) with a structure such as this one: ```plain some-directory/ example.sqlite3 another-example.db one-more.custom [...] ``` (In my scenario I can't just change the filename extension without other problems arising) Now databases with the `.sqlite3` or the custom filename extension are ignored by Datasette in this case. I'm aware that the docs state that a `.db` extension is required, but I was wondering if there is a reason for restricting this or any workaround available? When I run `datasette example.sqlite3` or `datasette one-more.custom` the databases are served by Datasette without a problem. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1646/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174162781,I_kwDOBm6k_c5F_E1d,1666,Refactor URL routing to enable testing,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-03-19T03:52:29Z,2022-03-19T16:32:03Z,2022-03-19T16:32:03Z,OWNER,,"I ran into some bugs earlier with URL routing - having more robust testing around this (especially since they are defined using regular expressions) would be really useful. - A utility function that resolves a path against a list of reflexes and returns the match - Make the routes and regular expressions available from a private Datasette method - Add tests that exercise them Related: - #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1666/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1174302994,I_kwDOBm6k_c5F_nES,1667,Make route matched pattern groups more consistent,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2022-03-19T16:32:35Z,2022-03-19T20:37:42Z,2022-03-19T20:37:41Z,OWNER,,"> ... highlights how inconsistent the way the capturing works is. Especially `as_format` which can be `None` or `""""` or `.json` or `json` or not used at all in the case of `TableView`. https://github.com/simonw/datasette/blob/764738dfcb16cd98b0987d443f59d5baa9d3c332/tests/test_routes.py#L12-L36 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1666#issuecomment-1073039670_ Part of: - #1660",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1667/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1175690070,I_kwDOBm6k_c5GE5tW,1676,"Reconsider ensure_permissions() logic, can it be less confusing?",9599,simonw,open,0,,,3268330,Datasette 1.0,3,2022-03-21T17:14:57Z,2022-12-02T01:23:40Z,,OWNER,,"> Updated documentation: https://github.com/simonw/datasette/blob/e627510b760198ccedba9e5af47a771e847785c9/docs/internals.rst#await-ensure_permissionsactor-permissions > >> This method allows multiple permissions to be checked at onced. It raises a `datasette.Forbidden` exception if any of the checks are denied before one of them is explicitly granted. >> >> This is useful when you need to check multiple permissions at once. For example, an actor should be able to view a table if either one of the following checks returns `True` or not a single one of them returns `False`: > > That's pretty hard to understand! I'm going to open a separate issue to reconsider if this is a useful enough abstraction given how confusing it is. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1675#issuecomment-1074177827_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1676/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1181432624,I_kwDOBm6k_c5Gazsw,1688,[plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins?,9020979,hydrosquall,closed,0,,,,,3,2022-03-26T01:17:44Z,2022-03-27T01:01:14Z,2022-03-26T21:34:47Z,CONTRIBUTOR,,"I'm trying to make a small plugin that depends on static assets, by following the guide [here](https://docs.datasette.io/en/stable/writing_plugins.html#writing-one-off-plugins). I made a `plugins/` directory with `datasette_nteract_data_explorer.py`. I am trying to follow the example of `datasette_vega`, and serving static assets. I created a `statics/` directory within `plugins/` to serve my JS and CSS. https://github.com/simonw/datasette-vega/blob/00de059ab1ef77394ba9f9547abfacf966c479c4/datasette_vega/__init__.py#L13 Unfortunately, datasette doesn't seem to be able to find my assets. Input: ```bash datasette ~/Library/Safari/History.db --plugins-dir=plugins/ ``` ![Image 2022-03-25 at 9 18 17 PM](https://user-images.githubusercontent.com/9020979/160218979-a3ff474b-5255-4a76-85d1-6f90ab2e3b44.jpg) Output: ![Image 2022-03-25 at 9 11 00 PM](https://user-images.githubusercontent.com/9020979/160218733-ca5144cf-f23f-43d8-a8d3-e3a871e57f3a.jpg) I suspect this issue might go away if I move away from ""one-off"" plugin mode, but it's been a while since I created a new python package so I'm not sure how much work there is to go between ""one off"" and ""packaged for PyPI"". I'd like to try to avoid needing to repackage a new `tar.gz` file and or reinstall my library repeatedly when developing new python code. 1. Is there a way to serve a static assets when using the `plugins/` directory method instead of installing plugins as a new python package? 2. If not, is there a way I can work on developing a plugin without creating and repackaging tar.gz files after every change, or is that the recommended path? Thanks for your help! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1688/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1202227104,I_kwDOBm6k_c5HqIeg,1712,"Make """" easier to read",9599,simonw,closed,0,,,,,3,2022-04-12T18:17:07Z,2022-04-12T19:12:22Z,2022-04-12T18:44:20Z,OWNER,,"`Binary: 2,427,344 bytes` would be nicer - even better, include a tooltip showing that size translated using this function: https://github.com/simonw/datasette/blob/138e4d9a53e3982137294ba383303c3a848cfca4/datasette/utils/__init__.py#L837-L846 ![CleanShot 2022-04-12 at 11 15 04@2x](https://user-images.githubusercontent.com/9599/163027324-b0b6092e-6e11-438b-8077-789025d0bb37.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1712/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1223241647,I_kwDOBm6k_c5I6S-v,1734,Remove python-baseconv dependency,9599,simonw,closed,0,,,,,3,2022-05-02T19:08:37Z,2022-05-02T23:25:49Z,2022-05-02T19:39:20Z,OWNER,,"> I was going to vendor `baseconv.py`, but then I reconsidered - what if there are plugins out there that expect `import baseconv` to work because they have depended on Datasette? > > I used https://cs.github.com/ and as far as I can tell there aren't any! > > So I'm going to remove that dependency and work out a smarter way to do this - probably by providing a utility function within Datasette itself. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1733#issuecomment-1115258737_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1734/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1214859703,I_kwDOBm6k_c5IaUm3,1719,Refactor `RowView` and remove `RowTableShared`,9599,simonw,closed,0,,,,,3,2022-04-25T18:06:24Z,2022-12-01T21:15:19Z,2022-04-25T18:33:44Z,OWNER,,"> The `RowTableShared` class is making this a whole lot more complicated. > > I'm going to split the `RowView` view out into an entirely separate `views/row.py` module. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1715#issuecomment-1108875068_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1719/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1216619276,I_kwDOBm6k_c5IhCMM,1724,?_trace=1 doesn't work on Global Power Plants demo,9599,simonw,closed,0,,,,,3,2022-04-27T00:15:02Z,2022-04-27T06:15:14Z,2022-04-27T00:18:30Z,OWNER,,"https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_trace=1 is not showing the trace JSON at the bottom of the page. Confirmed that `trace_debug` is `true` on https://global-power-plants.datasettes.com/-/settings Possibly related: - https://github.com/simonw/datasette-total-page-time/issues/1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1724/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1362363685,I_kwDOBm6k_c5RNAUl,1800,Remove upper bound dependencies as a default policy,9599,simonw,closed,0,,,,,3,2022-09-05T18:23:45Z,2022-09-05T18:39:52Z,2022-09-05T18:35:41Z,OWNER,,"https://iscinumpy.dev/post/bound-version-constraints/ has convinced me not to use upper bound dependencies unless I'm certain they are needed. Relevant PR: - https://github.com/simonw/datasette/pull/1799 Also: https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L45-L46 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L48-L49 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L51-L55 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L57-L59 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L75-L78 https://github.com/simonw/datasette/blob/ba35105eee2d3ba620e4f230028a02b2e2571df2/setup.py#L81-L82 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1800/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1385026210,I_kwDOBm6k_c5SjdKi,1819,Preserve query on timeout,2182,danp,closed,0,,,,,3,2022-09-25T13:32:31Z,2022-09-26T23:16:15Z,2022-09-26T23:06:06Z,CONTRIBUTOR,,"If a query hits the timeout it shows a message like: > SQL query took too long. The time limit is controlled by the [sql_time_limit_ms](https://docs.datasette.io/en/stable/settings.html#sql-time-limit-ms) configuration option. But the query is lost. Hitting the browser back button shows the query _before_ the one that errored. It would be nice if the query that errored was preserved for more tweaking. This would make it similar to how ""invalid syntax"" works since #1346 / #619.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1819/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1386854246,I_kwDOBm6k_c5Sqbdm,1822,Switch to keyword-only arguments for a bunch of internal methods,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2022-09-26T23:20:38Z,2022-09-27T00:44:04Z,,OWNER,,"This is a good idea, and one that needs to happen before Datasette 1.0: > While you are adding features, would you be future-proofing your APIs if you switched over some arguments over to keyword-only arguments or would that be too disruptive? > > Thinking out loud: > > ``` > async def render_template( > self, templates, *, context=None, plugin_context=None, request=None, view_name=None > ): > ``` _Originally posted by @jefftriplett in https://github.com/simonw/datasette/issues/1817#issuecomment-1256781274_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1822/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1388631785,I_kwDOBm6k_c5SxNbp,1826,render_cell documentation example doesn't match the method signature,66709385,pjamargh,closed,0,,,,,3,2022-09-28T02:37:59Z,2022-09-28T04:30:28Z,2022-09-28T04:05:16Z,NONE,,"Open Datasette stable doc at https://docs.datasette.io/en/stable/plugin_hooks.html?highlight=render_cell#render-cell-row-value-column-table-database-datasette render_cell plugin hook method signature is `render_cell(row, value, column, table, database, datasette)`, the example shown inline uses `render_cell(value)`. ![image](https://user-images.githubusercontent.com/66709385/192674691-34265b81-6cdd-41d2-8424-aa12f8bc8c94.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1826/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1422973111,I_kwDOBm6k_c5U0Ni3,1854,Flaky test: test_serve_localhost_http,9599,simonw,closed,0,,,,,3,2022-10-25T19:37:35Z,2022-10-25T19:53:02Z,2022-10-25T19:53:02Z,OWNER,,Failing on Python 3.10 at the moment: https://github.com/simonw/datasette/actions/runs/3323629947/jobs/5494340302,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1854/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423336122,I_kwDOBm6k_c5U1mK6,1856,allow_signed_tokens setting for disabling API signed token mechanism,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,3,2022-10-26T02:20:55Z,2022-11-15T19:57:05Z,2022-10-26T02:58:35Z,OWNER,,"Had some design thoughts here: https://github.com/simonw/datasette/issues/1852#issuecomment-1291272280 I liked this option the most: --setting allow_create_tokens off",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1856/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423369494,I_kwDOBm6k_c5U1uUW,1859,datasette create-token CLI command,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,3,2022-10-26T03:12:59Z,2022-11-15T19:59:00Z,2022-10-26T04:31:39Z,OWNER,,The CLI equivalent of the `/-/create-token` page.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1859/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1410305897,I_kwDOBm6k_c5UD49p,1845,Reconsider the Datasette first-run experience,9599,simonw,open,0,,,,,3,2022-10-15T22:21:31Z,2022-10-16T08:54:53Z,,OWNER,,"Had a really interesting conversation today about how hard it is to get from ""I installed Datasette"" to ""I've done something useful with it"": https://news.ycombinator.com/item?id=33216789#33218590 Spending some time focusing on that first-run experience feels very worthwhile.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1845/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1420090659,I_kwDOBm6k_c5UpN0j,1848,Private database page should show padlock on every table,9599,simonw,closed,0,,,,,3,2022-10-24T02:28:38Z,2022-10-24T02:50:29Z,2022-10-24T02:42:34Z,OWNER,,"Following: - #1829 https://latest.datasette.io/_internal looks like this: But those queries and tables are private too, and should also show the padlock icon.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1848/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1426253476,I_kwDOBm6k_c5VAuak,1869,Release 0.63,9599,simonw,closed,0,,,,,3,2022-10-27T20:53:01Z,2022-10-27T22:24:38Z,2022-10-27T22:11:33Z,OWNER,,"Most of the release notes are already written: - https://github.com/simonw/datasette/releases/tag/0.63a0 - https://github.com/simonw/datasette/releases/tag/0.63a1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1869/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1428560020,I_kwDOBm6k_c5VJhiU,1872,"SITE-BUSTING ERROR: ""render_template() called before await ds.invoke_startup()""",192568,mroswell,closed,0,,,,,3,2022-10-30T02:28:39Z,2022-10-30T06:26:01Z,2022-10-30T06:26:01Z,CONTRIBUTOR,,"1. My https://list.saferdisinfectants.org/disinfectants/listN page (linked from https://SaferDisinfectants.org ) has been running beautifully for a year and a half, including a GitHub Actions workflow that's been routinely updating the database. 2. I received a recent report that the list page is down. I don't know when it went down, but the content is replaced with: ""render_template() called before await ds.invoke_startup()"" 3. The local datasette repo runs without incident. 4. The site is hosted on vercel, linked to my github repo. Perhaps some vercel changes were made, but not by anyone on our side. Here is a screenshot of the current project settings: Here a screenshot of the latest deployment status: This is my repository: https://github.com/mroswell/list-N (I notice: datasette==0.59 in my requirements.txt file) Because it's been long while since I actively worked on this or any other datasette project, I forget a lot of what I knew at one point. Perhaps some configuration file could be missing? Or perhaps I just need to know the right incantation to add to that vercel settings page. Help is welcome as the nonprofit org is soon hosting its annual conference, and we'd love to have the page working again. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1872/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1450312343,I_kwDOBm6k_c5WcgKX,1892,Merge 1.0-dev branch back to main,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,3,2022-11-15T20:04:25Z,2022-11-29T19:40:23Z,2022-11-29T19:40:23Z,OWNER,,"I'm committed enough to the 1.0 work now that I'm ready for the `main` branch to reflect that instead. If I need to make any dot-releases against 0.63 I can do those from a branch.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1892/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1470320227,I_kwDOBm6k_c5Xo05j,1923,latest.datasette.io Cloud Run deploys failing,9599,simonw,closed,0,,,,,3,2022-11-30T22:49:34Z,2022-11-30T23:04:56Z,2022-11-30T23:04:56Z,OWNER,,"https://github.com/simonw/datasette/actions/runs/3587402085/jobs/6038106719v ``` Warning: ""service_account_key"" has been deprecated. Please switch to using google-github-actions/auth which supports both Workload Identity Federation and Service Account Key JSON authentication. For more details, see https://github.com/google-github-actions/setup-gcloud#authorization Error: google-github-actions/setup-gcloud failed with: failed to execute command `gcloud --quiet auth activate-service-account *** --key-file -`: /opt/hostedtoolcache/gcloud/275.0.0/x64/lib/googlecloudsdk/core/console/console_io.py:544: SyntaxWarning: ""is"" with a literal. Did you mean ""==""? if answer is None or (answer is '' and default is not None): ERROR: gcloud failed to load: module 'collections' has no attribute 'MutableMapping' ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1923/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1525815985,I_kwDOBm6k_c5a8hqx,1983,Make CustomJSONEncoder a documented public API,9599,simonw,open,0,,,,,3,2023-01-09T15:27:05Z,2023-01-09T15:35:58Z,,OWNER,,It's used by `datasette-geojson` here: https://github.com/eyeseast/datasette-geojson/commit/902bf135a5a33a0dc8264673d00a59a67cb05152,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1983/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1515185383,I_kwDOBm6k_c5aT-Tn,1971,Upgrade for Sphinx 6.0 (once Furo has support for it),9599,simonw,closed,0,,,,,3,2022-12-31T19:04:35Z,2023-01-10T02:02:34Z,2023-01-10T02:02:34Z,OWNER,,"A deployment of #1967 to ReadTheDocs just failed like this: https://readthedocs.org/projects/datasette/builds/19045460/ ``` Running Sphinx v6.0.0 making output directory... done building [mo]: targets for 0 po files that are out of date building [html]: targets for 28 source files that are out of date updating environment: [new config] 28 added, 0 changed, 0 removed reading sources... [ 3%] authentication reading sources... [ 7%] binary_data reading sources... [ 10%] changelog Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 299, in next_line self.line = self.input_lines[self.line_offset] File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 1136, in __getitem__ return self.data[i] IndexError: list index out of range During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 226, in run self.next_line() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 302, in next_line raise EOFError EOFError During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/cmd/build.py"", line 281, in build_main app.build(args.force_all, args.filenames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/application.py"", line 344, in build self.builder.build_update() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 310, in build_update self.build(to_build, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 326, in build updated_docnames = set(self.read()) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 433, in read self._read_serial(docnames) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 454, in _read_serial self.read_doc(docname) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/builders/__init__.py"", line 510, in read_doc publisher.publish() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/core.py"", line 224, in publish self.document = self.reader.read(self.source, self.parser, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/io.py"", line 103, in read self.parse() File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/readers/__init__.py"", line 76, in parse self.parser.parse(self.input, document) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/parsers.py"", line 78, in parse self.statemachine.run(inputlines, document, inliner=self.inliner) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 169, in run results = StateMachineWS.run(self, input_lines, input_offset, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 3024, in text self.section(title.lstrip(), source, style, lineno + 1, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 325, in section self.new_subsection(title, lineno, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 391, in new_subsection newabsoffset = self.nested_parse( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2785, in underline self.section(title, source, style, lineno - 1, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 325, in section self.new_subsection(title, lineno, messages) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 391, in new_subsection newabsoffset = self.nested_parse( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 233, in run context, next_state, result = self.check_line( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 445, in check_line return method(match, context, next_state) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 1273, in bullet i, blank_finish = self.list_item(match.end()) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 1295, in list_item self.nested_parse(indented, input_offset=line_offset, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 279, in nested_parse state_machine.run(block, input_offset, memo=self.memo, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 195, in run results = StateMachineWS.run(self, input_lines, input_offset) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/statemachine.py"", line 239, in run result = state.eof(context) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2725, in eof self.blank(None, context, None) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 2716, in blank paragraph, literalnext = self.paragraph( File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 416, in paragraph textnodes, messages = self.inline_text(text, lineno) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 425, in inline_text nodes, messages = self.inliner.parse(text, lineno, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 649, in parse before, inlines, remaining, sysmessages = method(self, match, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 792, in interpreted_or_phrase_ref nodelist, messages = self.interpreted(rawsource, escaped, role, File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/docutils/parsers/rst/states.py"", line 889, in interpreted nodes, messages2 = role_fn(role, rawsource, text, lineno, self) File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/ext/extlinks.py"", line 101, in role title = caption % part TypeError: not all arguments converted during string formatting Exception occurred: File ""/home/docs/checkouts/readthedocs.org/user_builds/datasette/envs/latest/lib/python3.9/site-packages/sphinx/ext/extlinks.py"", line 101, in role title = caption % part TypeError: not all arguments converted during string formatting The full traceback has been saved in /tmp/sphinx-err-kq7ylgqo.log, if you want to report the issue to the developers. Please also report this if it was a user error, so that a better error message can be provided next time. A bug report can be filed in the tracker at . Thanks! ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1971/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1531991339,I_kwDOBm6k_c5bUFUr,1989,Suggestion: Hiding columns,116795,pax,open,0,,,,,3,2023-01-13T09:33:32Z,2023-03-31T06:18:05Z,,NONE,,As there's the possibility of [hiding tables](https://docs.datasette.io/en/stable/metadata.html#hiding-tables) - I've run into the **need of hiding specific columns** - data that's either not relevant for public or can't be shown due to privacy reasons. ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1989/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1615891776,I_kwDOBm6k_c5gUI1A,2037,Test failure: FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError,9599,simonw,closed,0,,,,,3,2023-03-08T20:30:06Z,2023-03-09T22:33:39Z,2023-03-09T22:33:39Z,OWNER,,"> FAILED tests/test_cli.py::test_install_requirements - FileNotFoundError: [Errno 2] No such file or directory From https://github.com/simonw/datasette/actions/runs/4348548218/jobs/7597208191 ``` =================================== FAILURES =================================== __________________________ test_install_requirements ___________________________ run_module = @mock.patch(""datasette.cli.run_module"") def test_install_requirements(run_module): runner = CliRunner() > with runner.isolated_filesystem(): /home/runner/work/datasette/datasette/tests/test_cli.py:184: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/contextlib.py:119: in __enter__ return next(self.gen) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , temp_dir = None @contextlib.contextmanager def isolated_filesystem( self, temp_dir: t.Optional[t.Union[str, os.PathLike]] = None ) -> t.Iterator[str]: """"""A context manager that creates a temporary directory and changes the current working directory to it. This isolates tests that affect the contents of the CWD to prevent them from interfering with each other. :param temp_dir: Create the temporary directory under this directory. If given, the created directory is not removed when exiting. .. versionchanged:: 8.0 Added the ``temp_dir`` parameter. """""" > cwd = os.getcwd() E FileNotFoundError: [Errno 2] No such file or directory /opt/hostedtoolcache/Python/3.9.16/x64/lib/python3.9/site-packages/click/testing.py:466: FileNotFoundError ``` Not sure why it only affected the ""[Calculate test coverage](https://github.com/simonw/datasette/actions/workflows/test-coverage.yml)"" one.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2037/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1781022369,I_kwDOBm6k_c5qKD6h,2091,Drop support for Python 3.7,9599,simonw,closed,0,,,,,3,2023-06-29T15:06:38Z,2023-08-23T18:18:18Z,2023-08-23T18:18:18Z,OWNER,,"It's EOL now, as of 2023-06-27 (two days ago): https://devguide.python.org/versions/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2091/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1811824307,I_kwDOBm6k_c5r_j6z,2105,When reverse proxying datasette with nginx an URL element gets erronously added,2235371,aki-k,open,0,,,,,3,2023-07-19T12:16:53Z,2023-07-21T21:17:09Z,,NONE,,"I use this nginx config: ``` location /datasette-llm { return 302 /datasette-llm/; } location /datasette-llm/ { proxy_set_header Upgrade $http_upgrade; proxy_set_header Connection ""Upgrade""; proxy_http_version 1.1; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header X-Forwarded-Proto https; proxy_set_header X-Forwarded-Host $http_host; proxy_set_header Host $host; proxy_max_temp_file_size 0; proxy_pass http://127.0.0.1:8001/datasette-llm/; proxy_redirect http:// https://; proxy_buffering off; proxy_request_buffering off; proxy_set_header Origin ''; client_max_body_size 0; auth_basic ""datasette-llm""; auth_basic_user_file /etc/nginx/custom-userdb; } ``` Then I start datasette with this command: ``` datasette serve --setting base_url /datasette-llm/ $(llm logs path) ``` Everything else works right, except the links in ""This data as json, CSV"". They get an extra URL element ""datasette-llm"" like this: https://192.168.1.3:5432/datasette-llm/datasette-llm/logs.json?sql=select+*+from+_llm_migrations https://192.168.1.3:5432/datasette-llm/datasette-llm/logs.csv?sql=select+*+from+_llm_migrations&_size=max When I remove that extra ""datasette-llm"" from the URL, those links work too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1816857442,I_kwDOBm6k_c5sSwti,2106,`datasette install -e` option,9599,simonw,closed,0,,,,,3,2023-07-22T18:33:42Z,2023-07-26T18:28:33Z,2023-07-22T18:42:54Z,OWNER,,"As seen in LLM and now in `sqlite-utils` too: - https://github.com/simonw/sqlite-utils/issues/570 Useful for developing plugins, see tutorial at https://llm.datasette.io/en/stable/plugins/tutorial-model-plugin.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822939274,I_kwDOBm6k_c5sp9iK,2113,Implement and document extras for the new query view page,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,3,2023-07-26T18:24:01Z,2023-08-09T17:35:22Z,,OWNER,,- #2109 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2113/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1822940263,I_kwDOBm6k_c5sp9xn,2114,Implement canned queries against new query JSON work,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-07-26T18:24:50Z,2023-08-09T15:26:58Z,2023-08-09T15:26:57Z,OWNER,,- #2109 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822949756,I_kwDOBm6k_c5sqAF8,2116,Turn DatabaseDownload into an async view function,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-07-26T18:31:59Z,2023-07-26T18:44:00Z,2023-07-26T18:44:00Z,OWNER,,"A minor refactor, but it is a good starting point for this new branch. Refs: - #2109",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2116/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1823393475,I_kwDOBm6k_c5srsbD,2119,"database color shows only on index page, not other pages",9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2023-07-27T00:19:39Z,2023-08-11T05:25:45Z,2023-08-11T05:16:24Z,OWNER,,"I think this has been a bug for a long time. https://latest.datasette.io/ currently shows: Those colors are based on a hash of the database name. But when you click through to https://latest.datasette.io/fixtures It's red on all sub-pages too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2119/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1843600087,I_kwDOBm6k_c5t4xrX,2135,Release notes for 1.0a3,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,3,2023-08-09T16:09:26Z,2023-08-09T19:17:07Z,2023-08-09T19:17:06Z,OWNER,,118 commits! https://github.com/simonw/datasette/compare/1.0a2...26be9f0445b753fb84c802c356b0791a72269f25,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2135/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1838266862,I_kwDOBm6k_c5tkbnu,2126,Permissions in metadata.yml / metadata.json,36199671,ctsrc,closed,0,,,,,3,2023-08-06T16:24:10Z,2023-08-11T05:52:30Z,2023-08-11T05:52:29Z,NONE,,"https://docs.datasette.io/en/latest/authentication.html#other-permissions-in-metadata says the following: > For all other permissions, you can use one or more ""permissions"" blocks in your metadata. > To grant access to the permissions debug tool to all signed in users you can grant permissions-debug to any actor with an id matching the wildcard * by adding this a the root of your metadata: ```yaml permissions: debug-menu: id: '*' ``` I tried this. My `metadata.yml` file looks like: ```yaml permissions: debug-menu: id: '*' permissions-debug: id: '*' plugins: datasette-auth-passwords: myuser_password_hash: $env: ""PASSWORD_HASH_MYUSER"" ``` And then I run ```zsh datasette -m metadata.yml tiddlywiki.db --root ``` And I open a session for the ""root"" user of datasette with the link given. I open a private browser session and log in as ""myuser"" from http://127.0.0.1:8001/-/login Then I check http://127.0.0.1:8001/-/actor which confirms that I am logged in as the ""myuser"" actor ```json { ""actor"": { ""id"": ""myuser"" } } ``` In the session where I am logged in as ""myuser"" I then try to go to http://127.0.0.1:8001/-/permissions But all I get there as the logged in user ""myuser"" is > Forbidden > > Permission denied And then if I check the http://127.0.0.1:8001/-/permissions as the datasette ""root"" user from another browser session, I see: > permissions-debug checked at 2023-08-06T16:22:58.997841 ✗ (used default) > > Actor: {""id"": ""myuser""} It seems that in spite of having tried to give the `permissions-debug` permission to the ""myuser"" user in my `metadata.yml` file, datasette does not agree that ""myuser"" has permission `permissions-debug`.. What do I need to do differently so that my ""myuser"" user is able to access http://127.0.0.1:8001/-/permissions ?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2126/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1838469176,I_kwDOBm6k_c5tlNA4,2127,Context base class to support documenting the context,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2023-08-07T00:01:02Z,2023-08-10T01:30:25Z,,OWNER,,"This idea first came up here: - https://github.com/simonw/datasette/issues/2112#issuecomment-1652751140 If `datasette.render_template(...)` takes an optional `Context` subclass as an alternative to a context dictionary, I could then use dataclasses to define the context made available to specific templates - which then gives me something I can use to help document what they are. Also refs: - https://github.com/simonw/datasette/issues/1510",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2127/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1865869205,I_kwDOBm6k_c5vNueV,2157,"Proposal: Make the `_internal` database persistent, customizable, and hidden",15178711,asg017,open,0,,,,,3,2023-08-24T20:54:29Z,2023-08-31T02:45:56Z,,CONTRIBUTOR,,"The current `_internal` database is used by Datasette core to cache info about databases/tables/columns/foreign keys of databases in a Datasette instance. It's a temporary database created at startup, that can only be seen by the root user. See an [example `_internal` DB here](https://latest.datasette.io/_internal), after [logging in as root](https://latest.datasette.io/login-as-root). The current `_internal` database has a few rough edges: - It's part of `datasette.databases`, so many plugins have to specifically exclude `_internal` from their queries [examples here](https://github.com/search?q=datasette+hookimpl+%22_internal%22+language%3APython+-path%3Adatasette%2F&ref=opensearch&type=code) - It's only used by Datasette core and can't be used by plugins or 3rd parties - It's created from scratch at startup and stored in memory. Why is fine, the performance is great, but persistent storage would be nice. Additionally, it would be really nice if plugins could use this `_internal` database to store their own configuration, secrets, and settings. For example: - `datasette-auth-tokens` [creates a `_datasette_auth_tokens` table](https://github.com/simonw/datasette-auth-tokens/blob/main/datasette_auth_tokens/__init__.py#L15) to store auth token metadata. This could be moved into the `_internal` database to avoid writing to the gues database - `datasette-socrata` [creates a `socrata_imports`](https://github.com/simonw/datasette-socrata/blob/1409aa9b4d2fc3aff286b52e73af33b5786d56d0/datasette_socrata/__init__.py#L190-L198) table, which also can be in `_internal` - `datasette-upload-csvs` [creates a `_csv_progress_`](https://github.com/simonw/datasette-upload-csvs/blob/main/datasette_upload_csvs/__init__.py#L154) table, which can be in `_internal` - `datasette-write-ui` wants to have the ability for users to toggle whether a table appears editable, which can be either in `datasette.yaml` or on-the-fly by storing config in `_internal` In general, these are specific features that Datasette plugins would have access to if there was a central internal database they could read/write to: - **Dynamic configuration**. Changing the `datasette.yaml` file works, but can be tedious to restart the server every time. Plugins can define their own configuration table in `_internal`, and could read/write to it to store configuration based on user actions (cell menu click, API access, etc.) - **Caching**. If a plugin or Datasette Core needs to cache some expensive computation, they can store it inside `_internal` (possibly as a temporary table) instead of managing their own caching solution. - **Audit logs**. If a plugin performs some sensitive operations, they can log usage info to `_internal` for others to audit later. - **Long running process status**. Many plugins (`datasette-upload-csvs`, `datasette-litestream`, `datasette-socrata`) perform tasks that run for a really long time, and want to give continue status updates to the user. They can store this info inside` _internal` - **Safer authentication**. Passwords and authentication plugins usually store credentials/hashed secrets in configuration files or environment variables, which can be difficult to handle. Now, they can store them in `_internal` ## Proposal - We remove `_internal` from [`datasette.databases`](https://docs.datasette.io/en/latest/internals.html#databases) property. - We add new `datasette.get_internal_db()` method that returns the `_internal` database, for plugins to use - We add a new `--internal internal.db` flag. If provided, then the `_internal` DB will be sourced from that file, and further updates will be persisted to that file (instead of an in-memory database) - When creating internal.db, create a new `_datasette_internal` table to mark it a an ""datasette internal database"" - In `datasette serve`, we check for the existence of the `_datasette_internal` table. If it exists, we assume the user provided that file in error and raise an error. This is to limit the chance that someone accidentally publishes their internal database to the internet. We could optionally add a `--unsafe-allow-internal` flag (or database plugin) that allows someone to do this if they really want to. ## New features unlocked with this These features don't really need a standardized `_internal` table per-say (plugins could currently configure their own long-time storage features if they really wanted to), but it would make it much simpler to create these kinds of features with a persistent application database. - **`datasette-comments`** : A plugin for commenting on rows or specific values in a database. Comment contents + threads + email notification info can be stored in `_internal` - **Bookmarks**: ""Bookmarking"" an SQL query could be stored in `_internal`, or a URL link shortener - **Webhooks**: If a plugin wants to either consume a webhook or create a new one, they can store hashed credentials/API endpoints in `_internal`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2157/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1930008379,I_kwDOBm6k_c5zCZc7,2197,click-default-group-wheel dependency conflict,1176293,ar-jan,closed,0,,,,,3,2023-10-06T11:49:20Z,2023-10-12T21:53:17Z,2023-10-12T21:53:17Z,NONE,,"I upgraded my dependencies, then ran into this problem running `datasette inspect`: > env/lib/python3.9/site-packages/datasette/cli.py"", line 6, in > from click_default_group import DefaultGroup > ModuleNotFoundError: No module named 'click_default_group' Turns out the released version of datasette still depends on `click-default-group-wheel`, so `click-default-group` doesn't get installed/recognized: ``` $ virtualenv venv $ source venv/bin/activate $ pip install datasette $ pip list | grep click-default-group click-default-group 1.2.4 click-default-group-wheel 1.2.3 $ python -c ""from click_default_group import DefaultGroup"" Traceback (most recent call last): File """", line 1, in ModuleNotFoundError: No module named 'click_default_group' $ pip install --force-reinstall click-default-group ... ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. datasette 0.64.4 requires click-default-group-wheel>=1.2.2, which is not installed. Successfully installed click-8.1.7 click-default-group-1.2.4 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2197/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1004613267,I_kwDOCGYnMM474S6T,328,Invalid JSON output when no rows,12752,gravis,closed,0,,,,,3,2021-09-22T18:37:26Z,2021-09-22T20:21:34Z,2021-09-22T20:20:18Z,NONE,,"`sqlite-utils query` generates a JSON output with the result from the query: ```json [{...},{...}] ``` If no rows are returned by the query, I'm expecting an empty JSON array: ```json [] ``` But actually I'm getting an empty string. To be consistent, the output should be `[]` when the request succeeds (return code == `0`).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/328/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1071531082,I_kwDOCGYnMM4_3kRK,349,A way of creating indexes on newly created tables,9599,simonw,open,0,,,,,3,2021-12-05T18:56:12Z,2021-12-07T01:04:37Z,,OWNER,,"I'm writing code for https://github.com/simonw/git-history/issues/33 that creates a table inside a loop: ```python item_pk = db[item_table].lookup( {""_item_id"": item_id}, item_to_insert, column_order=(""_id"", ""_item_id""), pk=""_id"", ) ``` I need to look things up by `_item_id` on this table, which means I need an index on that column (the table can get very big). But there's no mechanism in SQLite utils to detect if the table was created for the first time and add an index to it. And I don't want to run `CREATE INDEX IF NOT EXISTS` every time through the loop. This should work like the `foreign_keys=` mechanism. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/349/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1072435124,I_kwDOCGYnMM4_7A-0,350,Optional caching mechanism for table.lookup(),9599,simonw,open,0,,,,,3,2021-12-06T17:54:25Z,2021-12-06T17:56:57Z,,OWNER,,"Inspired by work on `git-history` where I used this pattern: ```python column_name_to_id = {} def column_id(column): if column not in column_name_to_id: id = db[""columns""].lookup( {""namespace"": namespace_id, ""name"": column}, foreign_keys=((""namespace"", ""namespaces"", ""id""),), ) column_name_to_id[column] = id return column_name_to_id[column] ``` If you're going to be doing a large number of `table.lookup(...)` calls and you know that no other script will be modifying the database at the same time you can presumably get a big speedup using a Python in-memory cache - maybe even a LRU one to avoid memory bloat.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/350/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1097087280,I_kwDOCGYnMM5BZDkw,368,Offer `python -m sqlite_utils` as an alternative to `sqlite-utils`,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T02:29:30Z,2022-01-10T19:27:20Z,2022-01-09T02:40:50Z,OWNER,,"> Add this to `sqlite_utils/cli.py`: > > ```python > if __name__ == ""__main__"": > cli() > ``` > Now the tool can be run using `python -m sqlite_utils.cli --help` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/364#issuecomment-1008214998_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/368/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097135732,I_kwDOCGYnMM5BZPZ0,373,List `--fmt` options in the docs ,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T08:22:11Z,2022-01-10T19:27:24Z,2022-01-09T17:49:00Z,OWNER,,https://sqlite-utils.datasette.io/en/stable/cli.html#table-formatted-output currently cheats and tells the user to run `--help` - can fix this using `cog`. ,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/373/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097251014,I_kwDOCGYnMM5BZrjG,375,`sqlite-utils bulk` command,9599,simonw,closed,0,,,7558727,3.21,3,2022-01-09T17:12:38Z,2022-01-11T02:12:58Z,2022-01-11T02:10:55Z,OWNER,,"The `.executemany()` method is a very efficient way to execute the same SQL query against a huge list of parameters. `sqlite-utils insert` supports a bunch of ways of loading a list of dictionaries - from CSV, TSV, JSON, newline JSON and more thanks to: - #361 What if you could load a list of dictionaries and provide a SQL query with `:named` parameters that correspond to keys in those dictionaries instead? This would need to be a new command - I thought about adding a `--sql` option to `insert` but that doesn't make sense as that command already requires a table name.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/375/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1123903919,I_kwDOCGYnMM5C_Wmv,397,Support IF NOT EXISTS for table creation,738408,rafguns,closed,0,,,,,3,2022-02-04T07:41:15Z,2022-02-06T01:30:46Z,2022-02-06T01:29:01Z,NONE,,"Currently, I have a bunch of code that looks like this: ```python subjects = db[""subjects""] if db[""subjects""].exists() else db[""subjects""].create({ ... }) ``` It would be neat if sqlite-utils could simplify that by supporting `CREATE TABLE IF NOT EXISTS`, so that I'd be able to write, e.g. ```python subjects = db[""subjects""].create({...}, if_not_exists=True) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/397/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1149661489,I_kwDOCGYnMM5EhnEx,409,`with db:` for transactions,9599,simonw,open,0,,,,,3,2022-02-24T19:22:06Z,2022-10-01T03:42:50Z,,OWNER,,This can be a documented wrapper around `with db.conn:`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/409/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1175744654,I_kwDOCGYnMM5GFHCO,417,insert fails on JSONL with whitespace,9954,blaine,closed,0,,,,,3,2022-03-21T17:58:14Z,2022-03-25T21:19:06Z,2022-03-25T21:17:13Z,NONE,,"Any JSON that is newline-delimited and has whitespace (newlines) between the start of a JSON object and an attribute fails due to a parse error. e.g. given the valid JSONL: ```{ ""attribute"": ""value"" } { ""attribute"": ""value2"" } ``` I would expect that `sqlite-utils insert --nl my.db mytable file.jsonl` would properly import the data into `mytable`. However, the following error is thrown instead: `json.decoder.JSONDecodeError: Expecting property name enclosed in double quotes: line 2 column 1 (char 2)` It makes sense that since the file is intended to be newline separated, the thing being parsed is ""{"" (which obviously fails), however the default newline-separated output of `jq` isn't compact. Using `jq -c` avoids this problem, but the fix is unintuitive and undocumented. Proposed solutions: 1. Default to a ""loose"" newline-separated parse; this could be implemented internally as [the equivalent of] a `jq -c` filter ahead of the insert step. 2. Catch the JSONDecodeError (or pre-empt it in the case of a record === ""{\n"") and give the user a ""it looks like your json isn't _actually_ newline-delimited; try running it through `jq -c` instead"" error message. It might just have been too early in the morning when I was playing with this, but running pipes of data through sqlite-utils without the 'knack' of it led to some false starts.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/417/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1227571375,I_kwDOCGYnMM5JK0Cv,431,Allow making m2m relation of a table to itself,738408,rafguns,open,0,,,,,3,2022-05-06T08:30:43Z,2022-06-23T14:12:51Z,,NONE,,"I am building a database, in which one of the tables has a many-to-many relationship to itself. As far as I can see, this is not (yet) possible using `.m2m()` in sqlite-utils. This may be a bit of a niche use case, so feel free to close this issue if you feel it would introduce too much complexity compared to the benefits. Example: suppose I have a table of people, and I want to store the information that John and Mary have two children, Michael and Suzy. It would be neat if I could do something like this: ```python from sqlite_utils import Database db = Database(memory=True) db[""people""].insert({""name"": ""John""}, pk=""name"").m2m( ""people"", [{""name"": ""Michael""}, {""name"": ""Suzy""}], m2m_table=""parent_child"", pk=""name"" ) db[""people""].insert({""name"": ""Mary""}, pk=""name"").m2m( ""people"", [{""name"": ""Michael""}, {""name"": ""Suzy""}], m2m_table=""parent_child"", pk=""name"" ) ``` But if I do that, the many-to-many table `parent_child` has only one column: ``` CREATE TABLE [parent_child] ( [people_id] TEXT REFERENCES [people]([name]), PRIMARY KEY ([people_id], [people_id]) ) ``` This could be solved by adding one or two keyword_arguments to `.m2m()`, e.g. `.m2m(..., left_name=None, right_name=None)` or `.m2m(..., names=(None, None))`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/431/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1243151184,I_kwDOCGYnMM5KGPtQ,434,`detect_fts()` identifies the wrong table if tables have names that are subsets of each other,559711,ryascott,closed,0,,,,,3,2022-05-20T13:28:31Z,2022-06-14T23:24:09Z,2022-06-14T23:24:09Z,NONE,,"Windows 10 Python 3.9.6 When I was running a full text search through the Python library, I noticed that the query was being run on a different full text search table than the one I was trying to search. I took a look at the following function https://github.com/simonw/sqlite-utils/blob/841ad44bacaff05ec79ef78166d12e80c82ba6d7/sqlite_utils/db.py#L2213 and noticed: ```python sql LIKE '%VIRTUAL TABLE%USING FTS%content=%{table}%' ``` My database contains tables with similar names and %{table}% was matching another table that ended differently in its name. I have included a sample test that shows this occurring: I search for Marsupials in db[""books""] and The Clue of the Broken Blade is returned. This occurs since the search for Marsupials was ""successfully"" done against db[""booksb""] and rowid 1 is returned. ""The Clue of the Broken Blade"" has a rowid of 1 in db[""books""] and this is what is returned from the search. ```python def test_fts_search_with_similar_table_names(fresh_db): db = Database(memory=True) db[""books""].insert_all( [ { ""title"": ""The Clue of the Broken Blade"", ""author"": ""Franklin W. Dixon"", }, { ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", }, ] ) db[""booksb""].insert( { ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", } ) db[""booksb""].enable_fts([""title"", ""author""]) db[""books""].enable_fts([""title"", ""author""]) query = ""Marsupials"" assert [ { ""rowid"": 1, ""title"": ""Habits of Australian Marsupials"", ""author"": ""Marlee Hawkins"", }, ] == list(db[""books""].search(query)) ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/434/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1277295119,I_kwDOCGYnMM5MIfoP,445,`sqlite_utils.utils.TypeTracker` should be a documented API,9599,simonw,closed,0,,,,,3,2022-06-20T19:08:28Z,2022-06-20T19:49:02Z,2022-06-20T19:46:58Z,OWNER,,"I've used it in a couple of external places now: - https://github.com/simonw/datasette-socrata/blob/32fb256a461bf0e790eca10bdc7dd9d96c20f7c4/datasette_socrata/__init__.py#L264-L280 - https://github.com/simonw/datasette-lite/blob/caa8eade10f0321c64f9f65c4561186f02d57c5b/webworker.js#L55-L64 Refs: - https://github.com/simonw/datasette-lite/issues/32",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/445/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1278571700,I_kwDOCGYnMM5MNXS0,447,Incorrect syntax highlighting in docs CLI reference,9599,simonw,closed,0,,,,,3,2022-06-21T14:53:10Z,2022-06-21T18:48:47Z,2022-06-21T18:48:46Z,OWNER,,"https://sqlite-utils.datasette.io/en/stable/cli-reference.html#insert ![CE020DDA-27FB-49C3-9EA6-37457DC4C321](https://user-images.githubusercontent.com/9599/174830380-06530537-b870-41c0-a8af-03c7fa720c6f.jpeg) It looks like Python keywords are being incorrectly highlighted here.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/447/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1353074021,I_kwDOCGYnMM5QpkVl,474,Add an option for specifying column names when inserting CSV data,14294,hubgit,open,0,,,,,3,2022-08-27T15:29:59Z,2022-08-31T03:42:36Z,,NONE,,"https://sqlite-utils.datasette.io/en/stable/cli.html#csv-files-without-a-header-row > The first row of any CSV or TSV file is expected to contain the names of the columns in that file. > If your file does not include this row, you can use the `--no-headers` option to specify that the tool should not use that fist row as headers. > If you do this, the table will be created with column names called `untitled_1` and `untitled_2` and so on. You can then rename them using the `sqlite-utils transform ... --rename` command. It would be nice to be able to specify the column names when importing CSV/TSV without a header row, via an extra command line option. (renaming a column of a large table can take a long time, which makes it an inconvenient workaround)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/474/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1386562662,I_kwDOCGYnMM5SpURm,493,Tiny typographical error in install/uninstall docs,9599,simonw,open,0,,,,,3,2022-09-26T19:00:42Z,2022-10-25T21:31:15Z,,OWNER,,"Added in: - #483 I don't know how to fix this in Sphinx: I'm getting this: https://sqlite-utils.datasette.io/en/latest/cli.html#cli-install > The [insert –convert](https://sqlite-utils.datasette.io/en/latest/cli.html#cli-insert-convert) and [query –functions](https://sqlite-utils.datasette.io/en/latest/cli.html#cli-query-functions) options But I want it to display `insert --convert` and not `insert –convert` there. Here's the code: https://github.com/simonw/sqlite-utils/blob/85247038f70d7eb2f3e272cfeaa4c44459cafba8/docs/cli.rst#L2125",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/493/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1392690202,I_kwDOCGYnMM5TAsQa,495,Support JSON values returned from .convert() functions,649467,mhalle,closed,0,,,,,3,2022-09-30T16:33:49Z,2022-10-25T21:23:37Z,2022-10-25T21:23:28Z,NONE,,"When using the convert function on a JSON column, the result of the conversion function must be a string. If the return value is either a dict (object) or a list (array), the convert call will error out with an unhelpful user defined function exception. It makes sense that since the original column value was a string and required conversion to data structures, the result should be converted back into a JSON string as well. However, other functions auto-convert to JSON string representation, so the fact that convert doesn't could be surprising. At least the documentation should note this requirement, because the sqlite error messages won't readily reveal the issue. Jf only sqlite's JSON column type meant something :)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/495/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1429029604,I_kwDOCGYnMM5VLULk,506,Make `cursor.rowcount` accessible (wontfix),9599,simonw,closed,0,,,,,3,2022-10-30T21:51:55Z,2022-11-01T17:37:47Z,2022-11-01T17:37:13Z,OWNER,,"In building this Datasette feature on top of `sqlite-utils` I thought it might be useful to expose the number of rows that had been affected by a bulk insert or update - the `cursor.rowcount`: - https://github.com/simonw/datasette/issues/1866 This isn't currently exposed by `sqlite-utils`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/506/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1434911255,I_kwDOCGYnMM5VhwIX,510,Cannot enable FTS5 despite it being available,1176293,ar-jan,closed,0,,,,,3,2022-11-03T16:03:49Z,2022-11-18T18:37:52Z,2022-11-17T10:36:28Z,NONE,,"When I do `sqlite-utils enable-fts my.db table_name column_name` (with or without `--fts5`), I get an FTS4 virtual table instead of the expected FTS5. FTS5 is however available and Python/SQLite versions do not seem to be the issue. I can manually create the FTS5 virtual table, and then Datasette also works with it from this same Python environment. `>>> sqlite3.version` `2.6.0` `>>> sqlite3.sqlite_version` `3.39.4` `PRAGMA compile_options;` includes `ENABLE_FTS5`. `sqlite-utils, version 3.30`. Any ideas what's happening and how to fix?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/510/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1450952393,I_kwDOCGYnMM5We8bJ,512,mypy failures in CI,9599,simonw,closed,0,,,,,3,2022-11-16T06:22:48Z,2022-11-16T07:49:51Z,2022-11-16T07:49:50Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/3472012235 failed on Python 3.11: Truncated output: ``` sqlite_utils/db.py:2467: note: PEP 484 prohibits implicit Optional. Accordingly, mypy has changed its default to no_implicit_optional=True sqlite_utils/db.py:2467: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase sqlite_utils/db.py:2530: error: Incompatible default for argument ""where"" (default has type ""None"", argument has type ""str"") [assignment] sqlite_utils/db.py:2530: note: PEP 484 prohibits implicit Optional. Accordingly, mypy has changed its default to no_implicit_optional=True sqlite_utils/db.py:2530: note: Use https://github.com/hauntsaninja/no_implicit_optional to automatically upgrade your codebase sqlite_utils/db.py:2658: error: Argument 1 to ""count_where"" of ""Queryable"" has incompatible type ""Optional[str]""; expected ""str"" [arg-type] Found 23 errors in 1 file (checked 51 source files) ``` Best look at https://github.com/hauntsaninja/no_implicit_optional",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/512/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1516644980,I_kwDOCGYnMM5aZip0,520,rows_from_file() raises confusing error if file-like object is not in binary mode,9599,simonw,closed,0,,,,,3,2023-01-02T19:00:14Z,2023-05-08T22:08:07Z,2023-05-08T22:08:07Z,OWNER,,"I got this error: ``` File ""/Users/simon/Dropbox/Development/openai-to-sqlite/openai_to_sqlite/cli.py"", line 27, in embeddings rows, _ = rows_from_file(input) ^^^^^^^^^^^^^^^^^^^^^ File ""/Users/simon/.local/share/virtualenvs/openai-to-sqlite-jt4obeb2/lib/python3.11/site-packages/sqlite_utils/utils.py"", line 305, in rows_from_file first_bytes = buffered.peek(2048).strip() ^^^^^^^^^^^^^^^^^^^ ``` From this code: ```python @cli.command() @click.argument( ""db_path"", type=click.Path(file_okay=True, dir_okay=False, allow_dash=False), ) @click.option( ""-i"", ""--input"", type=click.File(""r""), default=""-"", ) def embeddings(db_path, input): ""Store embeddings for one or more text documents"" click.echo(""Here is some output"") db = sqlite_utils.Database(db_path) rows, _ = rows_from_file(input) print(list(rows)) ``` The error went away when I changed it to `type=click.File(""rb"")`. This should either be called out in the documentation or `rows_from_file()` should be fixed to handle text-mode files in addition to binary files.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/520/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1718595700,I_kwDOCGYnMM5mb7B0,550,AttributeError: 'EntryPoints' object has no attribute 'get' for flake8 on Python 3.7,9599,simonw,closed,0,,,,,3,2023-05-21T18:24:39Z,2023-05-21T18:42:25Z,2023-05-21T18:41:58Z,OWNER,,"https://github.com/simonw/sqlite-utils/actions/runs/5039064797/jobs/9036965488 ``` Traceback (most recent call last): File ""/opt/hostedtoolcache/Python/3.7.16/x64/bin/flake8"", line 8, in sys.exit(main()) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/cli.py"", line 22, in main app.run(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 363, in run self._run(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 350, in _run self.initialize(argv) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 330, in initialize self.find_plugins(config_finder) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/main/application.py"", line 153, in find_plugins self.check_plugins = plugin_manager.Checkers(local_plugins.extension) File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 357, in __init__ self.namespace, local_plugins=local_plugins File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 238, in __init__ self._load_entrypoint_plugins() File ""/opt/hostedtoolcache/Python/3.7.16/x64/lib/python3.7/site-packages/flake8/plugins/manager.py"", line 254, in _load_entrypoint_plugins eps = importlib_metadata.entry_points().get(self.namespace, ()) AttributeError: 'EntryPoints' object has no attribute 'get' Error: Process completed with exit code 1. ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/550/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816918185,I_kwDOCGYnMM5sS_ip,574,`prepare_connection()` plugin hook,9599,simonw,closed,0,,,,,3,2023-07-22T22:52:47Z,2023-07-22T23:13:14Z,2023-07-22T22:59:10Z,OWNER,,"> Splitting off an issue for `prepare_connection()` since Alex got the PR in seconds before I shipped 3.34! _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1646686424_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/574/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816852402,I_kwDOCGYnMM5sSvey,569,register_command plugin hook,9599,simonw,closed,0,,,,,3,2023-07-22T18:17:27Z,2023-07-22T19:19:35Z,2023-07-22T19:19:35Z,OWNER,,"> I'm going to start by adding the `register_command` hook using the exact same pattern as Datasette and LLM. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1646643450_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/569/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1856075668,I_kwDOCGYnMM5uoXeU,586,.transform() fails to drop column if table is part of a view,9599,simonw,open,0,,,,,3,2023-08-18T05:25:22Z,2023-08-18T06:13:47Z,,OWNER,,"I got this error trying to drop a column from a table that was part of a SQL view: > error in view plugins: no such table: main.pypi_releases Upon further investigation I found that this pattern seemed to fix it: ```python def transform_the_table(conn): # Run this in a transaction: with conn: # We have to read all the views first, because we need to drop and recreate them db = sqlite_utils.Database(conn) views = {v.name: v.schema for v in db.views if table.lower() in v.schema.lower()} for view in views.keys(): db[view].drop() db[table].transform( types=types, rename=rename, drop=drop, column_order=[p[0] for p in order_pairs], ) # Now recreate the views for name, schema in views.items(): db.create_view(name, schema) ``` So grab a copy of any view that might reference this table, start a transaction, drop those views, run the transform, recreate the views again. > I wonder if this should become an option in `sqlite-utils`? Maybe a `recreate_views=True` argument for `table.tranform(...)`? Should it be opt-in or opt-out? _Originally posted by @simonw in https://github.com/simonw/datasette-edit-schema/issues/35#issuecomment-1683370548_ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/586/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1857851384,I_kwDOCGYnMM5uvI_4,587,New .add_foreign_key() can break if PRAGMA legacy_alter_table=ON and there's an invalid foreign key reference,9599,simonw,closed,0,,,,,3,2023-08-19T20:01:26Z,2023-08-19T20:04:33Z,2023-08-19T20:04:32Z,OWNER,,"Extremely detailed story of how I got to this point: - https://github.com/simonw/llm/issues/162 Steps to reproduce (only if that pragma is on though): ```bash python -c ' import sqlite_utils db = sqlite_utils.Database(memory=True) db.execute("""""" CREATE TABLE ""logs"" ( [id] INTEGER PRIMARY KEY, [model] TEXT, [prompt] TEXT, [system] TEXT, [prompt_json] TEXT, [options_json] TEXT, [response] TEXT, [response_json] TEXT, [reply_to_id] INTEGER, [chat_id] INTEGER REFERENCES [log]([id]), [duration_ms] INTEGER, [datetime_utc] TEXT ); """""") db[""logs""].add_foreign_key(""reply_to_id"", ""logs"", ""id"") ' ``` This succeeds in some environments, fails in others.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/587/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1879209560,I_kwDOCGYnMM5wAnZY,589,Mechanism for de-registering registered SQL functions,9599,simonw,open,0,,,,,3,2023-09-03T19:32:39Z,2023-09-03T19:36:34Z,,OWNER,,I used a custom SQL function in a migration script and then realized that it should be de-registered before the end of the script to avoid leaking into the calling code.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/589/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1246826792,I_kwDODLZ_YM5KUREo,10,"When running `auth` command, don't overwrite an existing auth.json file",11887,ashanan,closed,0,,,,,3,2022-05-24T16:42:20Z,2022-09-07T15:07:38Z,2022-08-22T16:17:19Z,NONE,,"Ran the `auth` command in the same directory I'd previously set up an auth.json file for `twitter-to-sqlite` and it was completely overwritten. Not the biggest issue, but still unexpected. Ideally, for me, the keys would just be added to the existing file, but getting a warning and a chance to back out would be a good solution as well.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1345452427,I_kwDODLZ_YM5QMfmL,11,"-a option is used for ""--auth"" and for ""--all""",2467,fernand0,closed,0,,,,,3,2022-08-21T10:50:48Z,2022-08-21T21:11:57Z,2022-08-21T21:11:57Z,NONE,,"I'm not sure which option is best, instead of -a -all.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1616347574,I_kwDOJHON9s5gV4G2,1,Initial proof of concept with ChatGPT,9599,simonw,closed,0,,,,,3,2023-03-09T03:44:39Z,2023-03-09T03:51:55Z,2023-03-09T03:51:55Z,MEMBER,,I'm using ChatGPT to figure out enough AppleScript to get at my notes data.,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1618130434,I_kwDOJHON9s5gcrYC,11,Implement a SQL view to make it easier to query files in a nested folder,9599,simonw,open,0,,,,,3,2023-03-09T23:19:28Z,2023-03-09T23:24:01Z,,MEMBER,,"Working with nested data in SQL is tricky, can I make it easier with a view or canned query?",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 322741659,MDExOlB1bGxSZXF1ZXN0MTg3NzcwMzQ1,258,Add new metadata key persistent_urls which removes the hash from all database urls,247131,philroche,closed,0,,,,,3,2018-05-14T09:39:18Z,2018-05-21T07:38:15Z,2018-05-21T07:38:15Z,NONE,simonw/datasette/pulls/258,"Add new metadata key ""persistent_urls"" which removes the hash from all database urls when set to ""true"" This PR is just to gauge if this, or something like it, is something you would consider merging? I understand the reason why the substring of the hash is included in the url but there are some use cases where the urls should persist across deployments. For bookmarks for example or for scripts that use the JSON API. This is the initial commit for this feature. Tests and documentation updates to follow.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/258/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 313494458,MDExOlB1bGxSZXF1ZXN0MTgxMDMzMDI0,200,Hide Spatialite system tables,45057,russss,closed,0,,,,,3,2018-04-11T21:26:58Z,2018-04-12T21:34:48Z,2018-04-12T21:34:48Z,CONTRIBUTOR,simonw/datasette/pulls/200,They were getting on my nerves.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/200/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314319372,MDExOlB1bGxSZXF1ZXN0MTgxNjQyMTE0,205,Support filtering with units and more,45057,russss,closed,0,,,,,3,2018-04-14T10:47:51Z,2018-04-14T15:24:04Z,2018-04-14T15:24:04Z,CONTRIBUTOR,simonw/datasette/pulls/205,"The first commit: * Adds units to exported JSON * Adds units key to metadata skeleton * Adds some docs for units The second commit adds filtering by units by the first method I mentioned in #203: ![image](https://user-images.githubusercontent.com/45057/38767463-7193be16-3fd9-11e8-8a5f-ac4159415c6d.png) [Try it here](https://wtr-api.herokuapp.com/wtr-663ea99/license_frequency?frequency__gt=50GHz&height__lt=50ft). I think it integrates pretty neatly. The third commit adds support for registering custom units with Pint from metadata.json. Probably pretty niche, but I need decibels!",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/205/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 403499298,MDExOlB1bGxSZXF1ZXN0MjQ3OTIzMzQ3,404,Experiment: run Jinja in async mode,9599,simonw,closed,0,,,,,3,2019-01-27T00:28:44Z,2019-11-12T05:02:18Z,2019-11-12T05:02:13Z,OWNER,simonw/datasette/pulls/404,"See http://jinja.pocoo.org/docs/2.10/api/#async-support Tests all pass. Have not checked performance difference yet. Creating pull request to run tests in Travis. This is not ready to merge - I'm not yet sure if this is a good idea.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/404/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 452901999,MDExOlB1bGxSZXF1ZXN0Mjg1Njk4MzEw,501,Test against Python 3.8-dev using Travis,9599,simonw,closed,0,,,,,3,2019-06-06T08:37:53Z,2019-11-11T03:23:29Z,2019-11-11T03:23:29Z,OWNER,simonw/datasette/pulls/501,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/501/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 505818256,MDExOlB1bGxSZXF1ZXN0MzI3MTcyNTQ1,590,Handle spaces in DB names,2657547,rixx,closed,0,,,,,3,2019-10-11T12:18:22Z,2019-11-04T23:16:31Z,2019-11-04T23:16:30Z,CONTRIBUTOR,simonw/datasette/pulls/590,Closes #503,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/590/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 499954048,MDExOlB1bGxSZXF1ZXN0MzIyNTI5Mzgx,578,Added support for multi arch builds,887095,heussd,closed,0,,,,,3,2019-09-29T18:43:03Z,2019-11-13T19:13:15Z,2019-11-13T19:13:15Z,NONE,simonw/datasette/pulls/578,Minor changes in Dockerfile and new Makefile to support Docker multi architecture builds. `make`will build one image per architecture and push them as one Docker manifest to Docker Hub. Feel free to change `IMAGE_NAME ` to `datasetteproject/datasette` to update your official Docker Hub image(s).,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/578/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 509535510,MDExOlB1bGxSZXF1ZXN0MzMwMDc2MjYz,602,Offer to format readonly SQL,2657547,rixx,closed,0,,,,,3,2019-10-20T02:29:32Z,2019-11-04T07:29:33Z,2019-11-04T02:39:56Z,CONTRIBUTOR,simonw/datasette/pulls/602,"Following discussion in #601, this PR adds a ""Format SQL"" button to read-only SQL (if the SQL actually differs from the formatting result). It also removes a console error on readonly SQL queries.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/602/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 562085508,MDExOlB1bGxSZXF1ZXN0MzcyNzYzOTA2,666,"Use inspect-file, if possible, for total row count",13896256,kevindkeogh,closed,0,,,,,3,2020-02-08T22:10:35Z,2020-03-09T02:47:15Z,2020-02-25T20:19:29Z,CONTRIBUTOR,simonw/datasette/pulls/666,"For large tables, counting the number of rows in the table can take a signficant amount of time. Instead, where an inspect-file is provided for an immutable database, look up the row-count for a plain count(*).",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/666/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 585597133,MDExOlB1bGxSZXF1ZXN0MzkxOTI0NTA5,703,WIP implementation of writable canned queries,9599,simonw,closed,0,,,,,3,2020-03-21T22:23:51Z,2020-06-03T00:08:14Z,2020-06-02T23:57:35Z,OWNER,simonw/datasette/pulls/703,Refs #698.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/703/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 607107849,MDExOlB1bGxSZXF1ZXN0NDA5MTUzODcw,739,Configuration directory mode,9599,simonw,closed,0,,,,,3,2020-04-26T20:37:46Z,2020-04-27T16:30:25Z,2020-04-27T16:30:25Z,OWNER,simonw/datasette/pulls/739,"Refs #731 TODO: - [x] Decide how to combine explicit command-line options with items detected from the directory structure - [x] Add unit tests - [x] Implement `inspect-data.json` mechanism for populating `immutables` - [x] Add documentation",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/739/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 598891570,MDExOlB1bGxSZXF1ZXN0NDAyNjQ1OTg0,725,"Update aiofiles requirement from ~=0.4.0 to >=0.4,<0.6",27856297,dependabot-preview[bot],closed,0,,,,,3,2020-04-13T13:32:47Z,2020-05-04T18:16:54Z,2020-05-04T16:17:49Z,CONTRIBUTOR,simonw/datasette/pulls/725,"Updates the requirements on [aiofiles](https://github.com/Tinche/aiofiles) to permit the latest version.
Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) - `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language - `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language - `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language - `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language - `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com): - Update frequency (including time of day and day of week) - Pull request limits (per update run and/or open at any time) - Out-of-range updates (receive only lockfile updates, if desired) - Security updates (receive only security updates, if desired)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/725/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 646448486,MDExOlB1bGxSZXF1ZXN0NDQwNzM1ODE0,868,initial windows ci setup,702729,joshmgrant,open,0,,,,,3,2020-06-26T18:49:13Z,2021-07-10T23:41:43Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/868,Picking up the work done on #557 with a new PR. Seeing if I can get this working.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/868/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 688386219,MDExOlB1bGxSZXF1ZXN0NDc1NjY1OTg0,142,"insert_all(..., alter=True) should work for new columns introduced after the first 100 records",96218,simonwiles,closed,0,,,,,3,2020-08-28T22:22:57Z,2020-08-30T07:28:23Z,2020-08-28T22:30:14Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/142,Closes #139.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/142/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 726910999,MDExOlB1bGxSZXF1ZXN0NTA3OTAzMzky,1040,/db/table/-/blob/pk/column.blob download URL,9599,simonw,closed,0,,,6026070,0.51,3,2020-10-21T22:39:15Z,2020-10-24T23:09:20Z,2020-10-24T23:09:19Z,OWNER,simonw/datasette/pulls/1040,"Refs #1036. Still needs: - [x] Comprehensive tests across all of the code branches, plus permissions - [x] A bit more refactoring to share logic cleanly with `RowView` - ~~A configuration option to disable this feature (probably)~~",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1040/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 735663855,MDExOlB1bGxSZXF1ZXN0NTE1MDE0ODgz,195,table.search() improvements plus sqlite-utils search command,9599,simonw,closed,0,,,,,3,2020-11-03T22:02:08Z,2020-11-06T18:30:49Z,2020-11-06T18:30:42Z,OWNER,simonw/sqlite-utils/pulls/195,Refs #192. Still needs tests.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/195/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 729818242,MDExOlB1bGxSZXF1ZXN0NTEwMjM1OTA5,189,Allow iterables other than Lists in m2m records,35681,adamwolf,closed,0,,,,,3,2020-10-26T18:47:44Z,2020-10-27T16:28:37Z,2020-10-27T16:24:21Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/189,"I was playing around with sqlite-utils, creating a Roam Research dogsheep-style importer for Datasette, and ran into a slight snag. I wanted to use a generator to add an order column in an importer. It looked something like: ``` def order_generator(iterable, attr=None): if attr is None: attr = ""order"" order: int = 0 for i in iterable: i[attr] = order order += 1 yield i ``` When I used this with `insert_all` and other things, it worked fine--but it didn't work as the `records` argument to `m2m`. I dug into it, and sqlite-utils is explicitly checking if the records argument is a list or a tuple. I flipped the check upside down, and now it checks if the argument is a mapping. If it's a mapping, it wraps it in a list, otherwise it leaves it alone. (I get that it might not really make sense to put the order column on the second table. I changed my import schema a bit, and no longer have a real example, but maybe this change still makes sense.) The automated tests still pass, but I did not add any new ones. Let me know what you think! I'm really loving Datasette and its ecosystem; thanks for everything!",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/189/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 752749485,MDExOlB1bGxSZXF1ZXN0NTI4OTk3NjE0,1112,Fix --metadata doc usage,50527,jefftriplett,closed,0,,,6055094,Datasette 0.52,3,2020-11-28T19:19:51Z,2020-11-28T23:28:21Z,2020-11-28T19:53:48Z,CONTRIBUTOR,simonw/datasette/pulls/1112,"I stumbled on this while trying to figure out how to configure datasette-ripgrep via https://github.com/simonw/datasette-ripgrep/issues/15 You may not want to update the changelog (those are annoying) so I added two commits in case that's easier. ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1112/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 778126516,MDExOlB1bGxSZXF1ZXN0NTQ4MjcxNDcy,1170,Install Prettier via package.json,3637,benpickles,closed,0,,,6346396,Datasette 0.54,3,2021-01-04T14:18:03Z,2021-01-24T21:21:01Z,2021-01-04T19:52:34Z,CONTRIBUTOR,simonw/datasette/pulls/1170,This adds a package.json with Prettier and means that developers/CI will use the same version. It also ensures that NPM packages are cached on GitHub Actions which fixes #1169.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1170/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 771872303,MDExOlB1bGxSZXF1ZXN0NTQzMjQ2NTM1,59,Remove unneeded exists=True for -a/--auth flag.,631242,frosencrantz,closed,0,,,,,3,2020-12-21T06:03:55Z,2021-05-22T14:06:19Z,2021-05-19T16:08:12Z,CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/59,The file does not need to exist when using an environment variable.,207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/59/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 797649915,MDExOlB1bGxSZXF1ZXN0NTY0NjA4MjY0,1211,Use context manager instead of plain open,4488943,kbaikov,closed,0,,,,,3,2021-01-31T07:58:10Z,2021-03-11T16:15:50Z,2021-03-11T16:15:50Z,CONTRIBUTOR,simonw/datasette/pulls/1211,"Context manager with open closes the files after usage. Fixes: https://github.com/simonw/datasette/issues/1208 When the object is already a pathlib.Path i used read_text write_text functions In some cases pathlib.Path.open were used in context manager, it is basically the same as builtin open. Tests are passing: 850 passed, 5 xfailed, 10 xpassed",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1211/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 793002853,MDExOlB1bGxSZXF1ZXN0NTYwNzYwMTQ1,1204,WIP: Plugin includes,9599,simonw,open,0,,,,,3,2021-01-25T03:59:06Z,2021-12-17T07:10:49Z,,OWNER,simonw/datasette/pulls/1204,"Refs #1191 Next steps: - [ ] Get comfortable that this pattern is the right way to go - [ ] Implement it for all of the other pages, not just the table page - [ ] Add a new set of plugin tests that exercise ALL of these new hook locations - [ ] Document, then ship",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1204/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 793086333,MDExOlB1bGxSZXF1ZXN0NTYwODMxNjM4,1206,Release 0.54,9599,simonw,closed,0,,,,,3,2021-01-25T06:45:47Z,2021-01-25T17:33:30Z,2021-01-25T17:33:29Z,OWNER,simonw/datasette/pulls/1206,Refs #1201,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1206/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 810507413,MDExOlB1bGxSZXF1ZXN0NTc1MTg3NDU3,1229,ensure immutable databses when starting in configuration directory mode with,295329,camallen,closed,0,,,,,3,2021-02-17T20:18:26Z,2022-04-22T13:16:36Z,2021-03-29T00:17:32Z,CONTRIBUTOR,simonw/datasette/pulls/1229,"fixes #1224 This PR ensures all databases found in a configuration directory that match the files in `inspect-data.json` will be set to `immutable` as outlined in https://docs.datasette.io/en/latest/settings.html#configuration-directory-mode specifically on building the `datasette` instance it checks: - if `immutables` is an empty tuple - as passed by the cli code - if `immutables` is the default function value `None` - when it's not explicitly set And correctly builds the immutable database list from the `inspect-data[file]` keys. Note for this to work the `inspect-data.json` file must contain `file` paths which are relative to the configuration directory otherwise the file paths won't match and the dbs won't be set to immutable. I couldn't find an easy way to test this due to the way `make_app_client` works, happy to take directions on adding a test for this. I've updated the relevant docs as well, i.e. use the `inspect` cli cmd from the config directory path to create the relevant file ``` cd $config_dir datasette inspect *.db --inspect-file=inspect-data.json ``` https://docs.datasette.io/en/latest/performance.html#using-datasette-inspect",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1229/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 827341657,MDExOlB1bGxSZXF1ZXN0NTg5MjYzMjk3,1256,Minor type in IP adress,6371750,JBPressac,closed,0,,,,,3,2021-03-10T08:28:22Z,2021-03-10T18:26:46Z,2021-03-10T18:26:40Z,CONTRIBUTOR,simonw/datasette/pulls/1256,127.0.01 replaced by 127.0.0.1,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1256/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 837956424,MDExOlB1bGxSZXF1ZXN0NTk4MjEzNTY1,1271,Use SQLite conn.interrupt() instead of sqlite_timelimit(),9599,simonw,open,0,,,,,3,2021-03-22T17:34:20Z,2021-03-22T21:49:27Z,,OWNER,simonw/datasette/pulls/1271,"Refs #1270, #1268, #1249 Before merging this I need to do some more testing (to make sure that expensive queries really are properly cancelled). I also need to delete a bunch of code relating to the old mechanism of cancelling queries. [See comment below: this doesn't actually cancel the query due to a thread-local confusion]",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1271/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 904537568,MDExOlB1bGxSZXF1ZXN0NjU1Njg0NDc3,1346,Re-display user's query with an error message if an error occurs,9599,simonw,closed,0,,,,,3,2021-05-28T02:04:20Z,2021-06-02T03:46:21Z,2021-06-02T03:46:21Z,OWNER,simonw/datasette/pulls/1346,Refs #619,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1346/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 929748885,MDExOlB1bGxSZXF1ZXN0Njc3NTU0OTI5,293,Test against Python 3.10-dev,9599,simonw,closed,0,,,,,3,2021-06-25T01:40:39Z,2021-10-13T21:49:33Z,2021-10-13T21:49:33Z,OWNER,simonw/sqlite-utils/pulls/293,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/293/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 988325628,MDExOlB1bGxSZXF1ZXN0NzI3MjY1MDI1,1455,Add scientists to target groups,198537,rgieseke,closed,0,,,,,3,2021-09-04T16:28:58Z,2021-09-04T16:32:21Z,2021-09-04T16:31:38Z,CONTRIBUTOR,simonw/datasette/pulls/1455,"Not sure if you want them mentioned explicitly (it's already a long list), but following up on https://twitter.com/simonw/status/1434176989565382656",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1455/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 991575770,MDExOlB1bGxSZXF1ZXN0NzMwMDIwODY3,1467,Add Authorization header when CORS flag is set,3058200,jameslittle230,closed,0,,,,,3,2021-09-08T22:14:41Z,2021-10-17T02:29:07Z,2021-10-14T18:54:18Z,NONE,simonw/datasette/pulls/1467,"This PR adds the [`Access-Control-Allow-Headers`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Allow-Headers) flag when CORS mode is enabled. This would fix https://github.com/simonw/datasette-auth-tokens/issues/4. When making cross-origin requests, the server must respond with all allowable HTTP headers. A Datasette instance using auth tokens must accept the `Authorization` HTTP header in order for cross-origin authenticated requests to take place. Please let me know if there's a better way of doing this! I couldn't figure out a way to change the app's response from the plugin itself, so I'm starting here. If you'd rather this logic live in the plugin, I'd love any guidance you're able to give.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1467/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 400229984,MDU6SXNzdWU0MDAyMjk5ODQ=,401,How to pass configuration to plugins?,1055831,dazzag24,closed,0,,,,,3,2019-01-17T11:20:41Z,2019-01-18T11:48:13Z,2019-01-18T06:49:07Z,NONE,,"Hi, Firstly, thanks for your work on datasette, it is a hugely useful tool! I've been working on a fork [https://github.com/dazzag24/datasette-cluster-map] of datasette-cluster-map to allow the tileserver to be easily switched. Primarily because the tiles being served in the current version use localised text for labels and I'd like to have English used for these names instead. It uses http://leaflet-extras.github.io/leaflet-providers/preview/ to allow you to simply set the tile provider using a call like so: ``` let tiles = L.tileLayer.provider('Esri.WorldTopoMap'); ``` instead of the current: ``` let tiles = L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', { maxZoom: 19, detectRetina: true, attribution: '© OpenStreetMap contributors' }), ``` However I've got stuck in trying to work out how to pass the provider string to the plugin. In the documentation: https://datasette.readthedocs.io/en/stable/plugins.html you discuss configuration of plugins and use an example of passing in which latitude and longitude columns should be used. However I cannot seem to see anywhere in the current datasette-cluster-map code where these config params are passed in or used. Can you please point me to an example or how to pass configuration from the metadata.json down into a plugin. Once I've over come this issue I was wondering if you would be interested in taking this change into your version? Many thanks Darren",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/401/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 400340905,MDU6SXNzdWU0MDAzNDA5MDU=,402,Use SQLITE_DBCONFIG_DEFENSIVE plus other recommendations from SQLite security docs,9599,simonw,open,0,,,,,3,2019-01-17T15:52:28Z,2019-01-17T16:15:21Z,,OWNER,,"> Was just having a skim through the datasette source. Given that the vuln impacts shadow tables, wasn't sure whether these are also covered by the immutable flag. Latest release introduced a SQLITE_DBCONFIG_DEFENSIVE flag that they recommend setting: https://sqlite.org/security.html https://twitter.com/ignoredambience/status/1085926961413869568",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/402/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 403625674,MDU6SXNzdWU0MDM2MjU2NzQ=,7,.insert_all() should accept a generator and process it efficiently,9599,simonw,closed,0,,,,,3,2019-01-28T02:11:58Z,2019-01-28T06:26:53Z,2019-01-28T06:26:53Z,OWNER,,"Right now you have to load every record into memory before passing the list to `.insert_all()` and friends. If you want to process millions of rows, this is inefficient. Python has generators - we should use them! The only catch here is that part of the magic of `sqlite-utils` is that it guesses the column types and creates the table for you. This code will need to be updated to notice if the table needs creating and, if it does, create it using the first X (where x=1,000 but can be customized) records. If a record outside of those first 1,000 has a rogue column, we can crash with an error. This will free us up to make the `--nl` option added in #6 much more efficient.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 403922644,MDU6SXNzdWU0MDM5MjI2NDQ=,8,Problems handling column names containing spaces or - ,82988,psychemedia,closed,0,,,,,3,2019-01-28T17:23:28Z,2019-04-14T15:29:33Z,2019-02-23T21:09:03Z,NONE,,"Irrrespective of whether using column names containing a space or - character is good practice, SQLite does allow it, but `sqlite-utils` throws an error in the following cases: ```python from sqlite_utils import Database dbname = 'test.db' DB = Database(sqlite3.connect(dbname)) import pandas as pd df = pd.DataFrame({'col1':range(3), 'col2':range(3)}) #Convert pandas dataframe to appropriate list/dict format DB['test1'].insert_all( df.to_dict(orient='records') ) #Works fine ``` However: ```python df = pd.DataFrame({'col 1':range(3), 'col2':range(3)}) DB['test1'].insert_all(df.to_dict(orient='records')) ``` throws: ``` --------------------------------------------------------------------------- OperationalError Traceback (most recent call last) in () 1 import pandas as pd 2 df = pd.DataFrame({'col 1':range(3), 'col2':range(3)}) ----> 3 DB['test1'].insert_all(df.to_dict(orient='records')) /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order) 327 jsonify_if_needed(record.get(key, None)) for key in all_columns 328 ) --> 329 result = self.db.conn.execute(sql, values) 330 self.db.conn.commit() 331 self.last_id = result.lastrowid OperationalError: near ""1"": syntax error ``` and: ```python df = pd.DataFrame({'col-1':range(3), 'col2':range(3)}) DB['test1'].upsert_all(df.to_dict(orient='records')) ``` results in: ``` --------------------------------------------------------------------------- OperationalError Traceback (most recent call last) in () 1 import pandas as pd 2 df = pd.DataFrame({'col-1':range(3), 'col2':range(3)}) ----> 3 DB['test1'].insert_all(df.to_dict(orient='records')) /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order) 327 jsonify_if_needed(record.get(key, None)) for key in all_columns 328 ) --> 329 result = self.db.conn.execute(sql, values) 330 self.db.conn.commit() 331 self.last_id = result.lastrowid OperationalError: near ""-"": syntax error ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 408376825,MDU6SXNzdWU0MDgzNzY4MjU=,409,Zeit API v1 does not work for new users - need to migrate to v2,209967,michaelmcandrew,closed,0,,,,,3,2019-02-09T00:50:33Z,2020-04-06T15:44:46Z,2020-04-06T15:44:46Z,NONE,,"Hello there, This looks like a great tool. Thanks. Unfortunately, I hit the following error: ``` michael@hazel ~/src/cc-datasette/data/out datasette publish now cc-datasette.db > WARN! You are using an old version of the Now Platform. More: https://zeit.co/docs/v1-upgrade > Deploying /tmp/tmpjtrxwsyf/datasette under michaelmcandrew > Using project datasette > Error! You tried to create a Now 1.0 deployment. Please use Now 2.0 instead: https://zeit.co/upgrade ``` I'm guessing you might not hit this because you are not a 'new user' of Zeit (https://github.com/zeit/now-cli/issues/1805#issuecomment-452470953). Would it be a lot of work to upgrade to the new Zeit API, do you think?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/409/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 410384988,MDU6SXNzdWU0MTAzODQ5ODg=,411,How to pass named parameter into spatialite MakePoint() function,1055831,dazzag24,closed,0,,,,,3,2019-02-14T16:30:22Z,2023-10-25T13:23:04Z,2019-05-05T12:25:04Z,NONE,,"Hi, datasette version: ""0.26.2"" extensions: spatialite: ""4.4.0-RC0"" sqlite version: ""3.22.0"" I have a table of airports with latitude and longitude columns. I've added spatialite (with KNN support). After creating the db using csvs-to-sqlit, I run these commands to setup the spatialite tables: ``` conn.execute('SELECT InitSpatialMetadata(1)') conn.execute(""SELECT AddGeometryColumn('airports', 'point_geom', 4326, 'POINT', 2);"") conn.execute('''UPDATE airports SET point_geom = GeomFromText('POINT('||""longitude""||' '||""latitude""||')',4326);''') conn.execute(""SELECT CreateSpatialIndex('airports', 'point_geom');"") ``` I'm attempting to create a canned query and have this in my metadata.json file: ``` ""find_airports_nearest_to_point"":{ ""sql"":""SELECT a.pos AS rank, b.id, b.name, b.country, b.latitude AS latitude, b.longitude AS longitude, a.distance / 1000.0 AS dist_km FROM KNN AS a JOIN airports AS b ON (b.rowid = a.fid) WHERE f_table_name = \""airports\"" AND ref_geometry = MakePoint( :Long , :Lat ) AND max_items = 10;""} ``` which doesn't seem to perform the templating of the name parameters correctly and I get no results. Have also tired: ``` MakePoint( || :Long || , || :Lat || ) ``` which returns this error: ``` near ""||"": syntax error ``` However I cannot seem to find the correct combination of named parameter syntax (:Lat) or sqlite concatenation operator to make it work. Any ideas if using named parameters inside functions is supported? Thanks Darren",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/411/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 413842611,MDU6SXNzdWU0MTM4NDI2MTE=,14,Utilities for adding indexes,9599,simonw,closed,0,,,,,3,2019-02-24T16:57:28Z,2019-02-24T19:11:28Z,2019-02-24T19:11:28Z,OWNER,,"Both in the Python API and the CLI tool. For the CLI tool this should work: $ sqlite-utils create-index mydb.db mytable col1 col2 This will create a compound index across col1 and col2. The name of the index will be automatically chosen unless you use the `--name=...` option. Support a `--unique` option too.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 418329842,MDU6SXNzdWU0MTgzMjk4NDI=,415,Add query parameter to hide SQL textarea,36796532,ad-si,closed,0,,,,,3,2019-03-07T14:11:30Z,2019-03-15T09:30:57Z,2019-03-15T05:22:43Z,NONE,,It would be cool if there was a query parameter to hide / remove the SQL textarea. Then I could simply save a bookmark for a certain query and open it to see the data without having to scroll below the (long) SQL query first.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/415/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 432870248,MDU6SXNzdWU0MzI4NzAyNDg=,431,Datasette doesn't reload when database file changes,82988,psychemedia,closed,0,,,,,3,2019-04-13T16:50:43Z,2019-05-02T05:13:55Z,2019-05-02T05:13:54Z,CONTRIBUTOR,,"My understanding of the `--reload` option was that if the database file changed `datasette` would automatically reload. I'm running on a Mac and from the `datasette` UI queries don't seem to be picking up data in a newly changed db (I checked the db timestamp - it certainly updated). I was also expecting to see some sort of log statement in the datasette logging to say that it had detected a file change and restarted, but don't see anything there? Will try to check on an Ubuntu box when I get a chance to see if this is a Mac thing.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/431/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 442327592,MDU6SXNzdWU0NDIzMjc1OTI=,456,Installing installs the tests package,7725188,hellerve,closed,0,,,,,3,2019-05-09T16:35:16Z,2020-07-24T20:39:54Z,2020-07-24T20:39:54Z,CONTRIBUTOR,,"Because `setup.py` uses `find_packages` and `tests` is on the top-level, `pip install datasette` will install a top-level package called `tests`, which is probably not desired behavior. The offending line is here: https://github.com/simonw/datasette/blob/bfa2ae0d16d39bb82dbe4da4f3fdc3c7f6257418/setup.py#L40 And only `pip uninstall datasette` with a conflicting package would warn you by default; apparently another package had the same problem, which is why I get this message when uninstalling: ``` $ pip uninstall datasette Uninstalling datasette-0.27: Would remove: /usr/local/bin/datasette /usr/local/lib/python3.7/site-packages/datasette-0.27.dist-info/* /usr/local/lib/python3.7/site-packages/datasette/* /usr/local/lib/python3.7/site-packages/tests/* Would not remove (might be manually added): [ .. snip .. ] Proceed (y/n)? ``` This should be a relatively simple fix, and I could drop a PR if desired! Cheers",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/456/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 443020048,MDU6SXNzdWU0NDMwMjAwNDg=,459,"Fix the ""datasette now publish ... --alias=x"" option",9599,simonw,closed,0,,,4305096,0.28,3,2019-05-11T17:48:40Z,2019-05-11T20:22:08Z,2019-05-11T20:22:08Z,OWNER,,"Now have deprecated the mechanism we were using for this - running `now alias` without any parameters - in favour of something new: https://zeit.co/blog/automatic-aliasing",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/459/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 445868234,MDU6SXNzdWU0NDU4NjgyMzQ=,478,Make it so Docker build doesn't delay PyPI release,9599,simonw,closed,0,,,4471010,Datasette 0.29,3,2019-05-19T21:52:10Z,2019-07-08T03:30:41Z,2019-07-07T20:03:20Z,OWNER,,"Datasette automated releases currently include building a Docker image that has a full custom-compiled version of SQLite and SpatiaLite. This takes ages! I still want to publish this Docker image (to https://hub.docker.com/r/datasetteproject/datasette/tags ) but I'd like it if this wasn't a blocker on pushing the new package to PyPI. Ideally PyPI publish would happen first.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/478/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 448391492,MDU6SXNzdWU0NDgzOTE0OTI=,21,Option to ignore inserts if primary key exists already,9599,simonw,closed,0,,,,,3,2019-05-25T00:17:12Z,2019-05-29T05:09:01Z,2019-05-29T04:18:26Z,OWNER,,"> I've just noticed that SQLite lets you IGNORE inserts that collide with a pre-existing key. This can be quite handy if you have a dataset that keeps changing in part, and you don't want to upsert and replace pre-existing PK rows but you do want to ignore collisions to existing PK rows. > > Do `sqlite_utils` support such (cavalier!) behaviour? _Originally posted by @psychemedia in https://github.com/simonw/sqlite-utils/issues/18#issuecomment-480621924_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 451585764,MDU6SXNzdWU0NTE1ODU3NjQ=,499,Accessibility for non-techie newsies? ,7936571,chrismp,open,0,,,,,3,2019-06-03T16:49:37Z,2019-06-05T21:22:55Z,,NONE,,"Hi again, I'm having fun uploading datasets to Heroku via datasette. I'd like to set up datasette so that it's easy for other newsroom workers, who don't use Linux and aren't programmers, to upload datasets. Does datsette provide this out-of-the-box, or as a plugin? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/499/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 453131917,MDU6SXNzdWU0NTMxMzE5MTc=,502,Exporting sqlite database(s)?,7936571,chrismp,closed,0,,,,,3,2019-06-06T16:39:53Z,2021-04-03T05:16:54Z,2019-06-11T18:50:42Z,NONE,,"I'm working on datasette from one computer. But if I want to work on it from another computer and want to copy the SQLite database(s) already on the Heroku datasette instance, how to I copy the database(s) to the second computer so that I can then update it and push to online via datasette's command line code that pushes code to Heroku?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 457147936,MDU6SXNzdWU0NTcxNDc5MzY=,512,"""about"" parameter in metadata does not appear when alone",7936571,chrismp,open,0,,,,,3,2019-06-17T21:04:20Z,2019-10-11T15:49:13Z,,NONE,,"Here's an example of metadata I have for one database on datasette. ``` ""Records-requests"": { ""tables"": { ""Some table"": { ""about"": ""This table has data."" } } } ``` The text in `about` does not show up when I publish the data. But it shows up after I add a `""source""` parameter in the metadata. Is this intended?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/512/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 459598080,MDU6SXNzdWU0NTk1OTgwODA=,520,asgi_wrapper plugin hook,9599,simonw,closed,0,9599,simonw,,,3,2019-06-23T17:16:45Z,2019-07-03T04:40:34Z,2019-07-03T04:06:28Z,OWNER,,"After #272 we can finally add this hook. It will allow plugins to wrap their own ASGI middleware around Datasette. Potential use-cases include: * adding authentication * custom CORS headers (see #454) * maybe gzip support? * possibly defining entirely new routes, though that may be better handled by a separate hook",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/520/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459936585,MDU6SXNzdWU0NTk5MzY1ODU=,527,Unable to use rank when fts-table generated with csvs-to-sqlite,2181410,clausjuhl,closed,0,,,,,3,2019-06-24T14:49:48Z,2019-06-24T15:21:18Z,2019-06-24T15:09:10Z,NONE,,"Hi Simon. If i generate a fts-table with the csvs-to-sqlite f-option, I'm unable to use (in datasette's GUI) the internal ranking of the table for sorting or viewing, but if I generate the fts-table with the enable-fts argument from sqlite-utils, everyrthing works ok. Eg.: datasette, version 0.28 sqlite-utils, version 1.2.1 csvs-to-sqlite, version 0.9 No column named rank with these commands: $ csvs-to-sqlite minutes.csv minutes.db -f text_data $ datasette -i minutes.db select rank, * from minutes_fts where minutes_fts match 'dog' Everything ok with these commands: $ csvs-to-sqlite minutes.csv minutes.db $ sqlite-utils enable-fts minutes.db text_data $ datasette -i minutes.db select rank, * from minutes_fts where minutes_fts match 'dog' Am I doing something wrong? Thank you for a great application!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/527/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 463915863,MDU6SXNzdWU0NjM5MTU4NjM=,538,Mechanism for secrets in plugin configuration,9599,simonw,closed,0,,,,,3,2019-07-03T19:23:34Z,2019-07-04T05:47:54Z,2019-07-04T05:47:54Z,OWNER,,"See https://github.com/simonw/datasette-auth-github/issues/1 We need a mechanism where by plugins can tap into ""secret"" config options without exposing them in the visible metadata.json (where plugin configs currently live, see https://datasette.readthedocs.io/en/stable/plugins.html#plugin-configuration )",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/538/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 464868844,MDU6SXNzdWU0NjQ4Njg4NDQ=,543,datasette publish option for setting plugin configuration secrets,9599,simonw,closed,0,,,4471010,Datasette 0.29,3,2019-07-06T16:21:23Z,2019-07-08T02:06:34Z,2019-07-08T02:06:34Z,OWNER,,Follow-on from #538 - the `datasette publish` command needs a way of passing secrets which will be made available to plugin configuration but will not be exposed in `/-/metadata.json`.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/543/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 465327844,MDU6SXNzdWU0NjUzMjc4NDQ=,553,Potential improvements to facet-by-date,9599,simonw,open,0,,,,,3,2019-07-08T15:37:53Z,2019-07-08T15:41:55Z,,OWNER,,"In addition to #483 Tobias had some useful suggestions on Twitter: https://twitter.com/rixxtr/status/1148253926476701696 > I think for date facets, it might be more meaningful to order them by date, rather than by size? Or offer both? I'm *definitely* often interested in size-over-time, so https://data.rixx.de/django_tickets/tickets?_facet_date=created#facet-created … isn't all that helpful! Screenshot of that link: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/553/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 467790646,MDU6SXNzdWU0Njc3OTA2NDY=,560,CodeMirror fails to load on database page,9599,simonw,closed,0,,,,,3,2019-07-14T03:31:00Z,2019-09-03T01:03:02Z,2019-07-14T03:38:59Z,OWNER,,"It's not loading on https://latest.datasette.io/fixtures But it does load on https://latest.datasette.io/fixtures?sql=select+*+from+facetable",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/560/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470691999,MDU6SXNzdWU0NzA2OTE5OTk=,43,.add_column() doesn't match indentation of initial creation,9599,simonw,closed,0,,,,,3,2019-07-20T16:33:10Z,2019-07-23T13:09:11Z,2019-07-23T13:09:05Z,OWNER,,"I spotted a table which was created once and then had columns added to it and the formatted SQL looks like this: ```sql CREATE TABLE [records] ( [type] TEXT, [sourceName] TEXT, [sourceVersion] TEXT, [unit] TEXT, [creationDate] TEXT, [startDate] TEXT, [endDate] TEXT, [value] TEXT, [metadata_Health Mate App Version] TEXT, [metadata_Withings User Identifier] TEXT, [metadata_Modified Date] TEXT, [metadata_Withings Link] TEXT, [metadata_HKWasUserEntered] TEXT , [device] TEXT, [metadata_HKMetadataKeyHeartRateMotionContext] TEXT, [metadata_HKDeviceManufacturerName] TEXT, [metadata_HKMetadataKeySyncVersion] TEXT, [metadata_HKMetadataKeySyncIdentifier] TEXT, [metadata_HKSwimmingStrokeStyle] TEXT, [metadata_HKVO2MaxTestType] TEXT, [metadata_HKTimeZone] TEXT, [metadata_Average HR] TEXT, [metadata_Recharge] TEXT, [metadata_Lights] TEXT, [metadata_Asleep] TEXT, [metadata_Rating] TEXT, [metadata_Energy Threshold] TEXT, [metadata_Deep Sleep] TEXT, [metadata_Nap] TEXT, [metadata_Edit Slots] TEXT, [metadata_Tags] TEXT, [metadata_Daytime HR] TEXT) ``` It would be nice if the columns that were added later matched the indentation of the initial columns.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/43/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 471780443,MDU6SXNzdWU0NzE3ODA0NDM=,46,extracts= option for insert/update/etc,9599,simonw,closed,0,,,,,3,2019-07-23T15:55:46Z,2020-03-01T16:53:40Z,2019-07-23T17:00:44Z,OWNER,,"Relates to #42 and #44. I want the ability to extract values out into lookup tables during bulk insert/upsert operations. `db.insert_all(rows, extracts=[""species""])` - creates species table for values in the species column `db.insert_all(rows, extracts={""species"": ""Species""})` - as above but the new table is called `Species`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/46/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 472097220,MDU6SXNzdWU0NzIwOTcyMjA=,7,Script uses a lot of RAM,9599,simonw,closed,0,,,,,3,2019-07-24T06:11:11Z,2019-07-24T06:35:52Z,2019-07-24T06:35:52Z,MEMBER,,"I'm using an XML pull parser which should avoid the need to slurp the whole XML file into memory, but it's not working - the script still uses over 1GB of RAM when it runs according to Activity Monitor. I think this is because I'm still causing the full root element to be incrementally loaded into memory just in case I try and access it later. http://effbot.org/elementtree/iterparse.htm says I should use `elem.clear()` as I go. It also says: > The above pattern has one drawback; it does not clear the root element, so you will end up with a single element with lots of empty child elements. If your files are huge, rather than just large, this might be a problem. To work around this, you need to get your hands on the root element. So I will try that recipe and see if it helps.",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 476573875,MDU6SXNzdWU0NzY1NzM4NzU=,567,Datasette Edit,9599,simonw,closed,0,,,,,3,2019-08-04T17:09:28Z,2020-02-25T03:40:50Z,2020-02-25T03:40:50Z,OWNER,,"Datasette started out immutable. Then it gained the ability to run against read-only databases that were being modified by other processes. It's time for the next logical progression: the option to allow Datasette (or more likely individual plugins) to write to the database! This is going to require some careful rethinking of how connection management works.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/567/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488833698,MDU6SXNzdWU0ODg4MzM2OTg=,2,"""twitter-to-sqlite user-timeline"" command for pulling tweets by a specific user",9599,simonw,closed,0,,,,,3,2019-09-03T21:29:12Z,2019-09-04T20:02:11Z,2019-09-04T20:02:11Z,MEMBER,,"Twitter only allows up to 3,200 tweets to be retrieved from https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-user_timeline.html I'm going to do: $ twitter-to-sqlite tweets simonw ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 492153532,MDU6SXNzdWU0OTIxNTM1MzI=,573,Exposing Datasette via Jupyter-server-proxy,82988,psychemedia,closed,0,,,,,3,2019-09-11T10:32:36Z,2020-03-26T09:41:30Z,2020-03-26T09:41:30Z,CONTRIBUTOR,,"It is possible to expose a running `datasette` service in a Jupyter environment such as a MyBinder environment using the [`jupyter-server-proxy`](https://github.com/jupyterhub/jupyter-server-proxy). For example, using [this demo Binder](https://mybinder.org/v2/gh/binder-examples/r/master?filepath=index.ipynb) which has the server proxy installed, we can then upload a simple test database from the notebook homepage, from a Jupyter termianl install datasette and set it running against the test db on eg port 8001 and then view it via the path `proxy/8001`. Clicking links results in 404s though because the `datasette` links aren't relative to the current path? ![image](https://user-images.githubusercontent.com/82988/64689964-44b69280-d487-11e9-8f9f-3681422bcc9f.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/573/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503053800,MDU6SXNzdWU1MDMwNTM4MDA=,12,"Extract ""source"" into a separate lookup table",9599,simonw,closed,0,,,,,3,2019-10-06T05:17:23Z,2019-10-17T15:49:24Z,2019-10-17T15:49:24Z,MEMBER,,"It's pretty bulky and ugly at the moment: ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503234169,MDU6SXNzdWU1MDMyMzQxNjk=,2,Track and use the 'since' value,9599,simonw,closed,0,,,,,3,2019-10-07T05:02:59Z,2020-03-27T22:22:30Z,2020-03-27T22:22:30Z,MEMBER,,"Pocket says: > Whenever possible, you should use the since parameter, or count and and offset parameters when retrieving a user's list. After retrieving the list, you should store the current time (which is provided along with the list response) and pass that in the next request for the list. This way the server only needs to return a small set (changes since that time) instead of the user's entire list every time. At the bottom of https://getpocket.com/developer/docs/v3/retrieve",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 505512251,MDU6SXNzdWU1MDU1MTIyNTE=,588,Queries per DB table in metadata.json,12617395,bsilverm,closed,0,,,,,3,2019-10-10T21:08:19Z,2019-10-21T12:58:22Z,2019-10-21T01:48:42Z,NONE,,"It doesn't appear possible to have separate queries defined per database table. When I do something like below, my table descriptions show up but not the queries: ` ""databases"": { ""MYDB"": { ""tables"": { ""MYFIRSTTABLE"": { ""source"": ""Test"", ""source_url"": ""https://www.google.com"", ""queries"": { ""Query 1"": { ""sql"": ""select * from MYFIRSTTABLE"", ""title"": ""Query 1"", ""description"": ""This is the first query"" }, } }, ""MYSECONDTABLE"": { ""source"":""Test2"", ""source_url"":""https://www.google.com"", ""queries"": { ""Query 2"" : { ""sql"":""select * from MYSECONDTABLE;"", ""title"": ""Query 2"", ""description"":""This is the second query"" } } } }`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/588/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506087267,MDU6SXNzdWU1MDYwODcyNjc=,19,since_id support for home-timeline,9599,simonw,closed,0,,,,,3,2019-10-11T22:48:24Z,2019-10-16T19:13:06Z,2019-10-16T19:12:46Z,MEMBER,,Currently every time you run `home-timeline` we pull all 800 available tweets. We should offer to support `since_id` (which can be provided or can be pulled directly from the database) in order to work more efficiently if this command is executed e.g. on a cron.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506183241,MDU6SXNzdWU1MDYxODMyNDE=,593,make uvicorn optional dependancy (because not ok on windows python yet),4312421,stonebig,closed,0,,,,,3,2019-10-12T12:51:07Z,2019-10-13T06:22:08Z,2019-10-13T06:22:07Z,NONE,,"would it be possible to: - remove uvicorn mandatory dependancy ? - eventually make a fallback to hypercorn ? reason: - uvloop not yet supported on Windows/Python-3.8 and below, may happen with Python-3.9 only. - it seems a 6 lines effort (but I'm not expert)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/593/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506268945,MDU6SXNzdWU1MDYyNjg5NDU=,20,--since support for various commands for refresh-by-cron,9599,simonw,closed,0,,,,,3,2019-10-13T03:40:46Z,2019-10-21T03:32:04Z,2019-10-16T19:26:11Z,MEMBER,,"I want to run a cron that updates my Twitter database every X minutes. It should be able to retrieve the following without needing to paginate through everything: - [x] Tweets I have tweeted - [x] My home timeline (see #19) - [x] Tweets I have favourited It would be nice if this could be standardized across all commands as a `--since` option.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506297048,MDU6SXNzdWU1MDYyOTcwNDg=,594,upgrade to uvicorn-0.9 to be Python-3.8 friendly,4312421,stonebig,closed,0,,,,,3,2019-10-13T09:23:43Z,2019-11-12T04:47:04Z,2019-11-12T04:47:04Z,NONE,,uvicorn-0.8 relies on websockets-0.7 which lacks python-3.8 compatiblity,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/594/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 509693773,MDU6SXNzdWU1MDk2OTM3NzM=,604,_where= parameter is not persisted in hidden form fields,9599,simonw,closed,0,,,,,3,2019-10-21T02:14:10Z,2019-10-30T19:12:38Z,2019-10-30T18:49:44Z,OWNER,,"e.g. on this page: https://v0-30.datasette.io/fixtures/roadside_attractions?_where=name%20like%20%27%museum%%27 Click the ""Apply"" button and the `_where=` parameter will be dropped.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/604/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 512218858,MDU6SXNzdWU1MTIyMTg4NTg=,606,/-/plugins shows incorrect name for plugins,9599,simonw,closed,0,,,,,3,2019-10-24T22:53:25Z,2019-11-01T05:41:04Z,2019-11-01T05:40:07Z,OWNER,,"https://fivethirtyeight.datasettes.com/-/plugins ```json [ { ""name"": ""datasette_jellyfish"", ""static"": false, ""templates"": false, ""version"": ""0.3"" }, { ""name"": ""datasette_vega"", ""static"": true, ""templates"": false, ""version"": ""0.6.2"" } ] ``` These should be shown as `datasette-jellyfish` and `datasette-vega` since those are the names on PyPI.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/606/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516967682,MDU6SXNzdWU1MTY5Njc2ODI=,10,Add this repos_starred view,9599,simonw,closed,0,,,,,3,2019-11-04T05:44:38Z,2020-05-02T16:37:36Z,2020-05-02T16:37:36Z,MEMBER,,"```sql create view repos_starred as select stars.starred_at, users.login, repos.* from repos join stars on repos.id = stars.repo join users on repos.owner = users.id order by starred_at desc; ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516370822,MDU6SXNzdWU1MTYzNzA4MjI=,611,Static assets no longer loading for installed plugins,9599,simonw,closed,0,,,,,3,2019-11-01T22:07:00Z,2019-11-01T22:15:55Z,2019-11-01T22:15:55Z,OWNER,,"Caused by fix I made in #606 e.g. `/-/static-plugins/datasette_leaflet_geojson/datasette-leaflet-geojson.js` is a 404, but view-`/-/static-plugins/datasette-leaflet-geojson/datasette-leaflet-geojson.js` works correctly.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/611/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 522334771,MDU6SXNzdWU1MjIzMzQ3NzE=,633,"Publish to Heroku is broken: ""WARNING: You must pass the application as an import string to enable 'reload' or 'workers""",9599,simonw,closed,0,,,,,3,2019-11-13T16:32:11Z,2020-04-28T20:37:50Z,2019-11-13T16:43:23Z,OWNER,,"``` 2019-11-13T16:27:59.821483+00:00 heroku[web.1]: Starting process with command `datasette serve --host 0.0.0.0 -i fixtures.db --cors --port 36817 --inspect-file inspect-data.json` 2019-11-13T16:28:01.856471+00:00 heroku[web.1]: State changed from starting to crashed 2019-11-13T16:28:01.750253+00:00 app[web.1]: Serve! files=() (immutables=('fixtures.db',)) on port 36817 2019-11-13T16:28:01.771524+00:00 app[web.1]: WARNING: You must pass the application as an import string to enable 'reload' or 'workers'. 2019-11-13T16:28:01.837839+00:00 heroku[web.1]: Process exited with status 1 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/633/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 525993034,MDU6SXNzdWU1MjU5OTMwMzQ=,637,"Custom queries with 0 results should say ""0 results""",9599,simonw,closed,0,,,,,3,2019-11-20T18:28:14Z,2019-11-23T06:17:23Z,2019-11-23T06:07:08Z,OWNER,,"Consider https://latest.datasette.io/fixtures/neighborhood_search?text=foop It's currently not obvious that the query executed and returned 0 results.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/637/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 525254973,MDU6SXNzdWU1MjUyNTQ5NzM=,636,rowid is not included in dropdown filter menus,9599,simonw,closed,0,,,,,3,2019-11-19T20:43:04Z,2019-11-19T23:01:17Z,2019-11-19T23:01:17Z,OWNER,,"For `rowid` tables the `rowid` column isn't shown in the list of filter options: This also means if you link to e.g. `?rowid__gt=1060124` the resulting filter interface will be slightly broken: clicking the ""apply"" button again will lose your filter for example.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/636/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 526913133,MDU6SXNzdWU1MjY5MTMxMzM=,638,Don't suggest column for faceting if all values are 1,9599,simonw,closed,0,,,,,3,2019-11-22T00:14:22Z,2019-11-22T01:14:59Z,2019-11-22T00:57:49Z,OWNER,,"https://www.niche-museums.com/museums/museums?_facet=wikipedia_url Challenge is how to do this efficiently, since suggested facet queries need to be lightning fast.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/638/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 530491074,MDU6SXNzdWU1MzA0OTEwNzQ=,14,Command for importing events,9599,simonw,open,0,,,,,3,2019-11-29T21:28:58Z,2020-04-14T19:38:34Z,,MEMBER,,"Eg from https://api.github.com/users/simonw/events Docs here: https://developer.github.com/v3/activity/events/#list-events-performed-by-a-user",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 531502365,MDU6SXNzdWU1MzE1MDIzNjU=,646,Make database level information from metadata.json available in the index.html template,18017473,lagolucas,open,0,,,3268330,Datasette 1.0,3,2019-12-02T19:55:10Z,2022-03-15T20:50:34Z,,NONE,,"Did a search on the issues here and didn't find anything related to what I want. I want to have information that is on the database level of the JSON like title, source and source_url, and use it on the index page. I tried some small tweaks on the python and html files, but failed to get that result. Is there a way? Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/646/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 534507142,MDU6SXNzdWU1MzQ1MDcxNDI=,69,Feature request: enable extensions loading,30607,aborruso,closed,0,,,,,3,2019-12-08T08:06:25Z,2022-02-05T00:04:25Z,2020-10-16T18:42:49Z,NONE,,"Hi, it would be great to add a parameter that enables the load of a sqlite extension you need. Something like ""-ext modspatialite"". In this way your great tool would be even more comfortable and powerful. Thank you very much",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 534629631,MDU6SXNzdWU1MzQ2Mjk2MzE=,650,Add a glossary to the documentation,9599,simonw,open,0,,,,,3,2019-12-09T00:23:45Z,2022-01-13T22:04:56Z,,OWNER,,"Call it `glossary.rst` - it can use a definition list something like this: ```rst .. _glossary: Glossary ======== Term A definition of the term. Another term Another definition. ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/650/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 539590148,MDU6SXNzdWU1Mzk1OTAxNDg=,651,fts5 syntax error when using punctuation,2181410,clausjuhl,closed,0,,,,,3,2019-12-18T10:25:35Z,2021-07-14T19:26:06Z,2019-12-30T06:42:55Z,NONE,,"Hi Simon I get a syntax error when using punctuation or special characters in a fulltext search (using fts5). I created the virtual table using sqlite-utils' ""enable-fts""-command. The same error appears on Niche Museums [https://www.niche-museums.com/browse/search?q=park.](https://www.niche-museums.com/browse/search?q=park.), but works fine in most of your other datasette-examples, e.g. register-of-members-interests [https://register-of-members-interests.datasettes.com/regmem-98dc8b7/items?_search=mins.](https://register-of-members-interests.datasettes.com/regmem-98dc8b7/items?_search=mins.) What am I doing wrong? Many thanks! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/651/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 541467590,MDU6SXNzdWU1NDE0Njc1OTA=,654,Template debug mode that outputs template context,9599,simonw,closed,0,,,,,3,2019-12-22T15:51:25Z,2019-12-22T16:13:11Z,2019-12-22T16:04:51Z,OWNER,,It would make writing templates (including custom templates) easier if there was an option to dump out the full template context - maybe `?_context=1`,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/654/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 542553350,MDU6SXNzdWU1NDI1NTMzNTA=,655,Copy and paste doesn't work reliably on iPhone for SQL editor,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2019-12-26T13:15:10Z,2020-09-30T20:36:12Z,2020-08-30T17:51:40Z,OWNER,,I'm having a lot of trouble copying and pasting from the codemirror editor on my iPhone.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/655/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 542814756,MDU6SXNzdWU1NDI4MTQ3NTY=,71,Tests are failing due to missing FTS5,9599,simonw,closed,0,,,,,3,2019-12-27T09:41:16Z,2019-12-27T09:49:37Z,2019-12-27T09:49:37Z,OWNER,,"https://travis-ci.com/simonw/sqlite-utils/jobs/268436167 This is a recent change: 2 months ago they worked fine. I'm not sure what changed here. Maybe something to do with https://launchpad.net/~jonathonf/+archive/ubuntu/backports ?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/71/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 546073980,MDU6SXNzdWU1NDYwNzM5ODA=,74,Test failures on openSUSE 15.1: AssertionError: Explicit other_table and other_column,15092,jayvdb,open,0,,,,,3,2020-01-07T04:35:50Z,2020-01-12T07:21:17Z,,CONTRIBUTOR,,"openSUSE 15.1 is using python 3.6.5 and click-7.0 , however it has test failures while openSUSE Tumbleweed on py37 passes. Most fail on the cli exit code like ```py [ 74s] =================================== FAILURES =================================== [ 74s] _________________________________ test_tables __________________________________ [ 74s] [ 74s] db_path = '/tmp/pytest-of-abuild/pytest-0/test_tables0/test.db' [ 74s] [ 74s] def test_tables(db_path): [ 74s] result = CliRunner().invoke(cli.cli, [""tables"", db_path]) [ 74s] > assert '[{""table"": ""Gosh""},\n {""table"": ""Gosh2""}]' == result.output.strip() [ 74s] E assert '[{""table"": ""...e"": ""Gosh2""}]' == '' [ 74s] E - [{""table"": ""Gosh""}, [ 74s] E - {""table"": ""Gosh2""}] [ 74s] [ 74s] tests/test_cli.py:28: AssertionError ``` packaging project at https://build.opensuse.org/package/show/home:jayvdb:py-new/python-sqlite-utils I'll keep digging into this after I have github-to-sqlite working on Tumbleweed, as I'll need openSUSE Leap 15.1 working before I can submit this into the main python repo.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/74/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 555832585,MDU6SXNzdWU1NTU4MzI1ODU=,661,"--port option to expose a port other than 8001 in ""datasette package""",134771,dvhthomas,closed,0,,,,,3,2020-01-27T21:05:56Z,2020-01-30T04:17:52Z,2020-01-29T22:46:45Z,NONE,,"I see how to alter the port using `datasette serve -p XXX` per the docs. However, I'm packaging up to server the container on AppEngine flexible, which [requires](https://cloud.google.com/appengine/docs/flexible/custom-runtimes/build#listening_to_port_8080) that the container is serving traffic on port 8080. https://github.com/simonw/datasette/blob/7950105c278b140e6cb665c68b59df219870f9bc/Dockerfile#L41 Is there a way to inject a non-default port into the Dockerfile, or should I just do something like `sed` to replace 8001 with 8080 after `dataset package` has done it's thing? Thanks for the advice.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/661/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 559197745,MDU6SXNzdWU1NTkxOTc3NDU=,82,Tutorial command no longer works,10350886,petey284,closed,0,,,,,3,2020-02-03T16:36:11Z,2020-02-27T04:16:43Z,2020-02-27T04:16:30Z,NONE,,"Issue with command on [tutorial](https://simonwillison.net/2019/Feb/25/sqlite-utils/) on Simon's site. The following command no longer works, and breaks with the previous too many variables error: #50 ``` cmd > curl ""https://data.nasa.gov/resource/y77d-th95.json"" | \ sqlite-utils insert meteorites.db meteorites - --pk=id ``` Output: ``` cmd Traceback (most recent call last): File ""continuum\miniconda3\envs\main\lib\runpy.py"", line 193, in _run_module_as_main ""__main__"", mod_spec) File ""continuum\miniconda3\envs\main\lib\runpy.py"", line 85, in _run_code exec(code, run_globals) File ""Continuum\miniconda3\envs\main\Scripts\sqlite-utils.exe\__main__.py"", line 9, in File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 717, in main rv = self.invoke(ctx) File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 555, in invoke return callback(*args, **kwargs) File ""continuum\miniconda3\envs\main\lib\site-packages\sqlite_utils\cli.py"", line 434, in insert default=default, File ""continuum\miniconda3\envs\main\lib\site-packages\sqlite_utils\cli.py"", line 384, in insert_upsert_implementation docs, pk=pk, batch_size=batch_size, alter=alter, **extra_kwargs File ""continuum\miniconda3\envs\main\lib\site-packages\sqlite_utils\db.py"", line 1081, in insert_all result = self.db.conn.execute(query, params) sqlite3.OperationalError: too many SQL variables ``` My thought is that maybe the dataset grew over the last few years and so didn't run into this issue before. No error when I reduce the count of entries to 83. Once the number of entries hits 84 the command fails. // This passes ``` cmd type meteorite_83.txt | sqlite-utils insert meteorites.db meteorites - --pk=id ``` // But this fails ``` cmd type meteorite_84.txt | sqlite-utils insert meteorites.db meteorites - --pk=id ``` A potential fix might be to chunk the incoming data? I can work on a PR if pointed in right direction. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/82/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 569253072,MDU6SXNzdWU1NjkyNTMwNzI=,678,prepare_connection() plugin hook should accept optional datasette argument,9599,simonw,closed,0,,,,,3,2020-02-22T00:50:26Z,2020-02-22T03:53:19Z,2020-02-22T02:28:51Z,OWNER,,"I want to build a plugin that allows users to configure certain database columns to be ""masked"" - so the `password` column on a users table is never revealed, for example. To do this, I need to use the `conn.set_authorizer()` SQLite mechanism. So the plugin needs to build off the `prepare_connection(conn)` hook. But that hook doesn't currently get passed `datasette` so it doesn't have a way of looking up its plugin configuration!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/678/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 573578548,MDU6SXNzdWU1NzM1Nzg1NDg=,89,Ability to customize columns used by extracts= feature,9599,simonw,open,0,,,,,3,2020-03-01T16:54:48Z,2020-10-16T19:17:50Z,,OWNER,,"@simonw any thoughts on allow extracts to specify the lookup column name? If I'm understanding the documentation right, `.lookup()` allows you to define the ""value"" column (the documentation uses name), but when you use `extracts` keyword as part of `.insert()`, `.upsert()` etc. the lookup must be done against a column named ""value"". I have an existing lookup table that I've populated with columns ""id"" and ""name"" as opposed to ""id"" and ""value"", and seems I can't use `extracts=`, unless I'm missing something... Initial thought on how to do this would be to allow the dictionary value to be a tuple of table name column pair... so: ``` table = db.table(""trees"", extracts={""species_id"": (""Species"", ""name""}) ``` I haven't dug too much into the existing code yet, but does this make sense? Worth doing? _Originally posted by @chrishas35 in https://github.com/simonw/sqlite-utils/issues/46#issuecomment-592999503_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/89/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 574043218,MDU6SXNzdWU1NzQwNDMyMTg=,693,Variables from extra_template_vars() not exposed in _context=1,9599,simonw,closed,0,,,,,3,2020-03-02T15:14:51Z,2020-04-05T19:12:48Z,2020-04-05T19:12:48Z,OWNER,,The `_context=1` debugging mode does not show variables that should have been added to the context by the `extra_template_vars()` plugin hook.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/693/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 583970196,MDU6SXNzdWU1ODM5NzAxOTY=,701,Search box CSS doesn't look great on OS X Safari,9599,simonw,closed,0,,,5234079,Datasette 0.39,3,2020-03-18T20:00:52Z,2020-03-24T22:57:18Z,2020-03-24T22:57:18Z,OWNER,," ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/701/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585353598,MDU6SXNzdWU1ODUzNTM1OTg=,37,"Handle ""User not found"" error",9599,simonw,closed,0,,,,,3,2020-03-20T22:14:32Z,2020-04-17T23:43:46Z,2020-04-17T23:43:46Z,MEMBER,,"While running `user-timeline` I got this bug (because a screen name I asked for didn't exist): ``` File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 185, in transform_user user[""created_at""] = parser.parse(user[""created_at""]) KeyError: 'created_at' >>> import pdb >>> pdb.pm() > /Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py(185)transform_user() -> user[""created_at""] = parser.parse(user[""created_at""]) (Pdb) user {'errors': [{'code': 50, 'message': 'User not found.'}]} ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 598013965,MDU6SXNzdWU1OTgwMTM5NjU=,724,--plugin-secret over-rides existing metadata.json plugin config,9599,simonw,closed,0,,,,,3,2020-04-10T17:56:30Z,2020-04-16T04:58:12Z,2020-04-10T18:34:21Z,OWNER,,"This means if you use `--plugin-secret` at all (with e.g. `publish cloudrun`) any existing plugin configuration in your `metadata.json` will be ignored. https://github.com/simonw/datasette/blob/af9cd4ca64652fae262e6f7b5d201f6e0adc989b/datasette/publish/cloudrun.py#L98-L109 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/724/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601333634,MDU6SXNzdWU2MDEzMzM2MzQ=,28,Pull repository contributors,9599,simonw,closed,0,,,,,3,2020-04-16T18:46:40Z,2020-04-18T15:05:10Z,2020-04-18T15:05:10Z,MEMBER,,"https://developer.github.com/v3/repos/#list-contributors `GET /repos/:owner/:repo/contributors` Not sure if this should be a separate command or should be part of the existing `repos` command. I'm leaning towards a new `contributors` command.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601358649,MDU6SXNzdWU2MDEzNTg2NDk=,100,"Mechanism for forcing column-type, over-riding auto-detection",9599,simonw,closed,0,,,,,3,2020-04-16T19:12:52Z,2020-04-17T23:53:32Z,2020-04-17T23:53:32Z,OWNER,,"As seen in https://github.com/dogsheep/github-to-sqlite/issues/27#issuecomment-614843406 - there's a problem where you insert a record with a `None` value for a column and that column is created as `TEXT` - but actually you intended it to be an `INT` (as later examples will demonstrate). Some kind of mechanism for over-riding the detected types of columns would be useful here.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/100/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 603624862,MDU6SXNzdWU2MDM2MjQ4NjI=,31,Issue and milestone should have foreign key to repo,9599,simonw,closed,0,,,,,3,2020-04-21T00:46:24Z,2020-04-22T01:20:19Z,2020-04-22T01:20:19Z,MEMBER,,"Currently the `repo` column on those tables is a string `simonw/datasette` rather than an ID referencing a row in `repos`. _Originally posted by @simonw in https://github.com/dogsheep/github-to-sqlite/issues/29#issuecomment-616883275_",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 604222295,MDU6SXNzdWU2MDQyMjIyOTU=,32,Issue comments don't appear to populate issues foreign key,9599,simonw,closed,0,,,,,3,2020-04-21T19:17:32Z,2020-04-22T01:17:44Z,2020-04-22T01:17:44Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github?sql=select+html_url%2C+id%2C+issue+from+issue_comments+order+by+updated_at+desc+limit+101 ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 606720674,MDU6SXNzdWU2MDY3MjA2NzQ=,736,strange behavior using accented characters,30607,aborruso,closed,0,,,,,3,2020-04-25T08:34:51Z,2020-04-28T06:09:28Z,2020-04-27T18:59:16Z,NONE,,"Hi, when I search `incompatibilità` [here](https://my-database.now.sh/commissioniComunePalermo/youtube), using full text search, it becomes `incompatibilità` and I have no result. If I encode the `à` char in the URL (`incompatibilit%C3%A0`) I have the right result. ![image](https://user-images.githubusercontent.com/30607/80275201-00a79380-86e0-11ea-865e-f7e1474e8098.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/736/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610511450,MDU6SXNzdWU2MTA1MTE0NTA=,35,Create index on issue_comments(user) and other foreign keys,9599,simonw,closed,0,,,,,3,2020-05-01T02:06:56Z,2020-05-02T18:26:24Z,2020-05-02T18:26:24Z,MEMBER,,"``` create index issue_comments_user on issue_comments(user) ``` I'm sure there are other user columns that could benefit from an index.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/35/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610843136,MDU6SXNzdWU2MTA4NDMxMzY=,37,Mechanism for creating views if they don't yet exist,9599,simonw,closed,0,,,,,3,2020-05-01T16:34:10Z,2020-05-02T16:19:47Z,2020-05-02T16:19:31Z,MEMBER,,Needed for #36 #10 #12 ,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611252244,MDU6SXNzdWU2MTEyNTIyNDQ=,750,Add notlike table filter,9599,simonw,closed,0,,,,,3,2020-05-02T18:54:36Z,2020-05-02T19:10:44Z,2020-05-02T19:10:44Z,OWNER,,"I found myself wanting that for applying the opposite of this: https://github-to-sqlite.dogsheep.net/github/dependent_repos?dependent__like=%25simonw%2F%25&_sort_desc=dependent_stars ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/750/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612673948,MDU6SXNzdWU2MTI2NzM5NDg=,759,fts search on a column doesn't work anymore due to escape_fts,133845,Krazybug,closed,0,,,,,3,2020-05-05T15:03:44Z,2021-07-16T02:11:54Z,2020-05-06T17:50:57Z,NONE,,"Hi and first, thank you for this awesome work you make with this projet. On a db indexed in full text search, I can't query on indexed column anymore. This request ""cauvin language:ita"": is running smoothly on a old version of datasette but not on the current version. Compare the current version query `select uuid, title, authors, year, series, language, formats, publisher, tags, identifiers from summary where rowid in (select rowid from summary_fts where summary_fts match escape_fts(:search)) order by uuid limit 101` To an older version: `select title, authors, series, uuid, language, identifiers, tags, publisher, formats, year, links from summary where rowid in (select rowid from summary_fts where summary_fts match :search) order by uuid limit 101` _language_ is a searchable column but now the search string is known as ""cauvin language:ita"" literally as a search term. columns are not parsed. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/759/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612860531,MDU6SXNzdWU2MTI4NjA1MzE=,17,Only install osxphotos if running on macOS,9599,simonw,closed,0,,,,,3,2020-05-05T20:03:26Z,2020-05-05T20:20:05Z,2020-05-05T20:11:23Z,MEMBER,,The build is broken right now because you can't `pip install osxphotos` on Ubuntu.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612378203,MDU6SXNzdWU2MTIzNzgyMDM=,757,Question: Any fixed date for the release with the uft8-encoding fix?,2181410,clausjuhl,closed,0,,,,,3,2020-05-05T06:51:20Z,2020-05-06T18:41:29Z,2020-05-06T18:41:29Z,NONE,,Just a little impatient :),107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/757/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 613422636,MDU6SXNzdWU2MTM0MjI2MzY=,760,Way of seeing full schema for a database,9599,simonw,open,0,,,,,3,2020-05-06T15:46:08Z,2020-05-06T23:49:06Z,,OWNER,,"I find myself wanting to quickly figure out all of the BLOB columns in a database. A `/-/schema` page showing the full schema (actually since it's per-database probably `/dbname/-/schema` or `/-/schema/dbname`) would be really handy. It would need to be carefully constructed from various queries against `sqlite_master` - just doing `select * from sqlite_master where type='table'` isn't quite enough because I also want to show indexes, triggers etc.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/760/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 616012427,MDU6SXNzdWU2MTYwMTI0Mjc=,764,Add PyPI project urls to setup.py,9599,simonw,closed,0,,,5471110,Datasette 0.43,3,2020-05-11T16:23:08Z,2020-05-27T20:21:36Z,2020-05-11T18:28:55Z,OWNER,,"Spotted this example here: ```python project_urls={ ""Issues"": ""https://gitlab.com/Cyb3r-Jak3/ExifReader/issues"", ""Source Code"": ""https://gitlab.com/Cyb3r-Jak3/ExifReader/-/tree/publish"", ""CI"": ""https://gitlab.com/Cyb3r-Jak3/ExifReader/pipelines"", ""Releases"": ""https://github.com/Cyb3r-Jak3/ExifReader"" }, ``` Results in this on https://pypi.org/project/ExifReader/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/764/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 626663119,MDU6SXNzdWU2MjY2NjMxMTk=,781,request.url and request.scheme should obey force_https_urls config setting,9599,simonw,closed,0,,,,,3,2020-05-28T16:54:47Z,2020-05-28T17:39:54Z,2020-05-28T17:10:13Z,OWNER,,"I'm trying to get the https://www.niche-museums.com/browse/feed.atom feed to validate and I git this from https://validator.w3.org/feed/check.cgi?url=https%3A%2F%2Fwww.niche-museums.com%2Fbrowse%2Ffeed.atom > This feed is valid, but interoperability with the widest range of feed readers could be improved by implementing the following recommendations. > > [line 6](https://validator.w3.org/feed/check.cgi?url=https%3A%2F%2Fwww.niche-museums.com%2Fbrowse%2Ffeed.atom#l6), column 73: Self reference doesn't match document location [[help](https://validator.w3.org/feed/docs/warning/SelfDoesntMatchLocation.html ""more information about this error"")] > > I tried to fix this using `force_https_urls` ([commit](https://github.com/simonw/museums/commit/5dc8e2c717c59f9e949b65e47a59878e01f929e4)) but it didn't work - because that setting isn't respected by the Request class: https://github.com/simonw/datasette/blob/40885ef24e32d91502b6b8bbad1c7376f50f2830/datasette/utils/asgi.py#L15-L32",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/781/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 629524205,MDU6SXNzdWU2Mjk1MjQyMDU=,793,CSRF protection for /-/messages tool and writable canned queries,9599,simonw,closed,0,,,5512395,Datasette 0.44,3,2020-06-02T21:22:21Z,2020-06-06T00:43:41Z,2020-06-05T19:05:59Z,OWNER,,"> The `/-/messages` debug tool will need CSRF protection or people will be able to add messages using a hidden form on another website. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/790#issuecomment-637790860_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/793/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 630120235,MDU6SXNzdWU2MzAxMjAyMzU=,797,"Documentation for new ""params"" setting for canned queries",9599,simonw,closed,0,,,5512395,Datasette 0.44,3,2020-06-03T15:55:11Z,2020-06-09T04:00:40Z,2020-06-03T21:04:51Z,OWNER,,Added here: https://github.com/simonw/datasette/commit/aa82d0370463580f2cb10d9617f1bcbe45cc994a#diff-5e0ffd62fced7d46339b9b2cd167c2f9R236,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/797/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 634112607,MDU6SXNzdWU2MzQxMTI2MDc=,812,Ability to customize what happens when a view permission fails,9599,simonw,closed,0,,,5533512,Datasette 0.45,3,2020-06-08T04:26:14Z,2020-07-01T04:17:46Z,2020-07-01T04:17:45Z,OWNER,,"Currently view permission failures raise a `Forbidden` error which is transformed into a 403. It would be good if this page could offer a way forward - maybe just by linking to (or redirecting to) a login screen. This behaviour will vary based on authentication plugins, so a new plugin hook is probably the best way to do this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/812/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 635519358,MDU6SXNzdWU2MzU1MTkzNTg=,826,Document the ds_actor signed cookie,9599,simonw,closed,0,,,5512395,Datasette 0.44,3,2020-06-09T15:06:52Z,2020-06-09T22:33:12Z,2020-06-09T22:32:31Z,OWNER,,"Most authentication plugins (https://github.com/simonw/datasette-auth-github for example) are likely to work by setting the `ds_actor` signed cookie, which is already magically decoded and supported by default Datasette here: https://github.com/simonw/datasette/blob/4fa7cf68536628344356d3ef8c92c25c249067a0/datasette/actor_auth_cookie.py#L1-L13 I should document this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/826/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 635914822,MDU6SXNzdWU2MzU5MTQ4MjI=,828,Horizontal scrollbar on changelog page on mobile,9599,simonw,closed,0,,,5512395,Datasette 0.44,3,2020-06-10T04:18:54Z,2020-06-10T04:28:17Z,2020-06-10T04:28:17Z,OWNER,,"You can scroll sideways on that page and it looks bad: The cause is these long links: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/828/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 635107393,MDU6SXNzdWU2MzUxMDczOTM=,823,"Documentation is inconsistent about ""id"" as required field on actor",9599,simonw,closed,0,,,5512395,Datasette 0.44,3,2020-06-09T04:47:58Z,2020-06-09T14:58:36Z,2020-06-09T14:58:19Z,OWNER,,"Docs at https://github.com/simonw/datasette/blob/5a6a73e3190cac103906b479d56129413e5ef190/docs/authentication.rst#actors say: > The only required field in an actor is `""id""`, which must be a string. But the example here returns `{""token"": token}`: ```python @hookimpl def actor_from_request(datasette, request): async def inner(): token = request.args.get(""_token"") if not token: return None # Look up ?_token=xxx in sessions table result = await datasette.get_database().execute( ""select count(*) from sessions where token = ?"", [token] ) if result.first()[0]: return {""token"": token} else: return None return inner ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/823/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 636511683,MDU6SXNzdWU2MzY1MTE2ODM=,830,Redesign register_facet_classes plugin hook,9599,simonw,open,0,,,3268330,Datasette 1.0,3,2020-06-10T20:03:27Z,2021-12-16T19:58:22Z,,OWNER,,"Nothing uses this plugin hook yet, so the design is not yet proven. I'm going to build a real plugin against it and use that process to inform any design changes that may need to be made. I'll add a warning about this to the documentation.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/830/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 639542974,MDU6SXNzdWU2Mzk1NDI5NzQ=,47,Fall back to FTS4 if FTS5 is not available,73579,hpk42,open,0,,,,,3,2020-06-16T10:11:23Z,2020-06-17T20:13:48Z,,NONE,,"got this with version 0.21.1 from pypi. twitter-to-sqlite auth worked but then ""twitter-to-sqlite user-timeline USER.db"" produced a tracekback ending in ""no such module: FTS5"". ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/47/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 642652808,MDU6SXNzdWU2NDI2NTI4MDg=,861,Script to generate larger SQLite test files,9599,simonw,closed,0,,,,,3,2020-06-21T22:30:58Z,2020-06-23T03:44:18Z,2020-06-23T03:44:18Z,OWNER,,"> I'll write a little script which generates a 300MB SQLite file with a bunch of tables with lots of randomly generated rows in to help test this. > > Having a tool like that which can generate larger databases with different gnarly performance characteristics will be useful for other performance work too. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/859#issuecomment-647189948_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/861/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 642296989,MDU6SXNzdWU2NDIyOTY5ODk=,856,Consider pagination of canned queries,9599,simonw,open,0,,,,,3,2020-06-20T03:15:59Z,2021-05-21T14:22:41Z,,OWNER,,The new `canned_queries()` plugin hook from #852 combined with plugins like https://github.com/simonw/datasette-saved-queries could mean that some installations end up with hundreds or even thousands of canned queries. I should consider pagination or some other way of ensuring that this doesn't cause performance problems for Datasette.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/856/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 644161221,MDU6SXNzdWU2NDQxNjEyMjE=,117,Support for compound (composite) foreign keys,9599,simonw,open,0,,,,,3,2020-06-23T21:33:42Z,2020-06-23T21:40:31Z,,OWNER,,"It turns out SQLite supports composite foreign keys: https://www.sqlite.org/foreignkeys.html#fk_composite Their example looks like this: ```sql CREATE TABLE album( albumartist TEXT, albumname TEXT, albumcover BINARY, PRIMARY KEY(albumartist, albumname) ); CREATE TABLE song( songid INTEGER, songartist TEXT, songalbum TEXT, songname TEXT, FOREIGN KEY(songartist, songalbum) REFERENCES album(albumartist, albumname) ); ``` Here's what that looks like in sqlite-utils: ``` In [1]: import sqlite_utils In [2]: import sqlite3 In [3]: conn = sqlite3.connect("":memory:"") In [4]: conn Out[4]: In [5]: conn.executescript("""""" ...: CREATE TABLE album( ...: albumartist TEXT, ...: albumname TEXT, ...: albumcover BINARY, ...: PRIMARY KEY(albumartist, albumname) ...: ); ...: ...: CREATE TABLE song( ...: songid INTEGER, ...: songartist TEXT, ...: songalbum TEXT, ...: songname TEXT, ...: FOREIGN KEY(songartist, songalbum) REFERENCES album(albumartist, albumname) ...: ); ...: """""") Out[5]: In [6]: db = sqlite_utils.Database(conn) In [7]: db.tables Out[7]: [,
] In [8]: db.tables[0].foreign_keys Out[8]: [] In [9]: db.tables[1].foreign_keys Out[9]: [ForeignKey(table='song', column='songartist', other_table='album', other_column='albumartist'), ForeignKey(table='song', column='songalbum', other_table='album', other_column='albumname')] ``` The table appears to have two separate foreign keys, when actually it has a single compound composite foreign key.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/117/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 647103735,MDU6SXNzdWU2NDcxMDM3MzU=,875,"""Logged in as: XXX - logout"" navigation item",9599,simonw,closed,0,,,5533512,Datasette 0.45,3,2020-06-29T04:31:14Z,2020-07-02T00:13:24Z,2020-06-29T18:43:50Z,OWNER,,_Originally posted by @simonw in https://github.com/simonw/datasette/issues/840#issuecomment-650895874_,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/875/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 649437530,MDU6SXNzdWU2NDk0Mzc1MzA=,887,Canned query page should show the name of the canned query,9599,simonw,closed,0,,,5607421,Datasette 0.46,3,2020-07-02T00:10:39Z,2020-07-02T00:31:33Z,2020-07-02T00:23:45Z,OWNER,,"This page here - the URL is http://127.0.0.1:8001/data/all_tables but ""all_tables"" is not shown in the UI: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/887/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 649907676,MDU6SXNzdWU2NDk5MDc2NzY=,889,asgi_wrapper plugin hook is crashing at startup,49260,amjith,closed,0,,,,,3,2020-07-02T12:53:13Z,2020-09-15T20:40:52Z,2020-09-15T20:40:52Z,CONTRIBUTOR,,"Steps to reproduce: 1. Install datasette-media plugin `pip install datasette-media` 2. Launch datasette `datasette databasename.db` 3. Error ``` INFO: Started server process [927704] INFO: Waiting for application startup. ERROR: Exception in 'lifespan' protocol Traceback (most recent call last): File ""/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/uvicorn/lifespan/on.py"", line 48, in main await app(scope, self.receive, self.send) File ""/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/uvicorn/middleware/proxy_headers.py"", line 45, in __call__ return await self.app(scope, receive, send) File ""/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/datasette_media/__init__.py"", line 9, in wrapped_app path = scope[""path""] KeyError: 'path' ERROR: Application startup failed. Exiting. ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/889/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 652700770,MDU6SXNzdWU2NTI3MDA3NzA=,119,Ability to remove a foreign key,9599,simonw,closed,0,,,,,3,2020-07-07T22:31:37Z,2020-09-24T20:36:59Z,2020-09-24T20:36:59Z,OWNER,,Useful if you add one but make a mistake and need to undo it without recreating the database from scratch.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/119/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 652961907,MDU6SXNzdWU2NTI5NjE5MDc=,121,Improved (and better documented) support for transactions,9599,simonw,open,0,,,,,3,2020-07-08T04:56:51Z,2020-09-24T20:36:46Z,,OWNER,,"_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/pull/118#issuecomment-655283393_ We should put some thought into how this library supports and encourages smart use of transactions.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/121/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 661605489,MDU6SXNzdWU2NjE2MDU0ODk=,900,Some links don't honor base_url,50220,noteed,closed,0,,,6026070,0.51,3,2020-07-20T09:40:50Z,2020-10-23T19:44:04Z,2020-10-15T22:57:55Z,NONE,,"Hi, I've been playing with Datasette behind Nginx (awesome tool, thanks !). It seems some URLs are OK but some aren't. For instance in https://github.com/simonw/datasette/blob/master/datasette/templates/query.html#L61 it seems that `url_csv` includes a `/` prefix, resulting in the `base_url` not beeing honored. Actually here, it seems that dropping the prefix `/` to make the link relative is enough (so it may not be strictly related to `base_url`). Additional information: ``` datasette, version 0.45+0.gf1f581b.dirty ``` Relevant Nginx configuration (note that all the trailing slashes have some effect): ``` location /datasette/ { proxy_pass http://127.0.0.1:9001/; proxy_set_header Host $host; } ``` Relelvant Datasette configuration (slashes matter too): ``` --config base_url:/datasette/ ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/900/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 663976976,MDU6SXNzdWU2NjM5NzY5NzY=,48,Add a table of contents to the README,9599,simonw,closed,0,,,,,3,2020-07-22T18:54:33Z,2020-07-23T17:46:07Z,2020-07-22T19:03:02Z,MEMBER,,Using https://github.com/jonschlinkert/markdown-toc,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/48/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 665400224,MDU6SXNzdWU2NjU0MDAyMjQ=,906,"""allow"": true for anyone, ""allow"": false for nobody",9599,simonw,closed,0,,,5607421,Datasette 0.46,3,2020-07-24T20:28:10Z,2020-07-25T00:07:10Z,2020-07-25T00:05:04Z,OWNER,,"The ""allow"" syntax described at https://datasette.readthedocs.io/en/0.45/authentication.html#defining-permissions-with-allow-blocks currently says this: > An allow block can specify ""no-one is allowed to do this"" using an empty `{}`: > > ``` > { > ""allow"": {} > } > ``` `""allow"": null` allows all access, though this isn't documented (it should be though). These are not very intuitive. How about also supporting `""allow"": true` for ""allow anyone"" and `""allow"": false` for ""allow nobody""?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/906/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 665407663,MDU6SXNzdWU2NjU0MDc2NjM=,908,"Interactive debugging tool for ""allow"" blocks",9599,simonw,closed,0,,,5607421,Datasette 0.46,3,2020-07-24T20:43:44Z,2020-07-25T00:06:15Z,2020-07-24T22:56:52Z,OWNER,,"> It might be good to have a little interactive tool which helps debug these things, since there are quite a few edge-cases and the damage caused if people use them incorrectly is substantial. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/907#issuecomment-663726146_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/908/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 666040390,MDU6SXNzdWU2NjYwNDAzOTA=,127,Ability to insert files piped to insert-files stdin,9599,simonw,closed,0,,,,,3,2020-07-27T07:09:33Z,2020-07-30T03:08:52Z,2020-07-30T03:08:18Z,OWNER,,"> Inserting files by piping them in should work - but since a filename cannot be derived this will need a `--name blah.gif` option. > > cat blah.gif | sqlite-utils insert-files files.db files - --name=blah.gif > _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/122#issuecomment-664128071_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/127/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 671763164,MDU6SXNzdWU2NzE3NjMxNjQ=,915,Refactor TableView class so things like datasette-graphql can reuse the logic,9599,simonw,closed,0,,,,,3,2020-08-03T03:13:33Z,2020-08-18T22:28:37Z,2020-08-18T22:28:37Z,OWNER,,_Originally posted by @simonw in https://github.com/simonw/datasette-graphql/issues/2#issuecomment-667780040_,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/915/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 671130371,MDU6SXNzdWU2NzExMzAzNzE=,130,Support tokenize option for FTS,9599,simonw,closed,0,,,,,3,2020-08-01T19:27:22Z,2020-08-01T20:51:28Z,2020-08-01T20:51:14Z,OWNER,,"FTS5 supports things like porter stemming using a `tokenize=` option: https://www.sqlite.org/fts5.html#tokenizers Something like this in code: ``` CREATE VIRTUAL TABLE [{table}_fts] USING {fts_version} ( {columns}, tokenize='porter', content=[{table}] ); ``` I tried this out just now and it worked exactly as expected. So... `db[table].enable_fts(...) should accept a 'tokenize=` argument, and `sqlite-utils enable-fts ...` should support a `--tokenize` option.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/130/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 675727366,MDU6SXNzdWU2NzU3MjczNjY=,919,"Travis should not build the master branch, only the main branch",9599,simonw,closed,0,,,,,3,2020-08-09T16:18:25Z,2020-08-09T16:26:18Z,2020-08-09T16:19:37Z,OWNER,,"Caused by #849 - since we are mirroring the two branches (to ensure old links to `master` keep working) Travis is building both. The following in `.travis.yml` should fix that: ``` branches: except: - master ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/919/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 677227912,MDU6SXNzdWU2NzcyMjc5MTI=,925,"""datasette install"" and ""datasette uninstall"" commands",9599,simonw,closed,0,,,,,3,2020-08-11T22:04:32Z,2020-08-11T22:34:37Z,2020-08-11T22:32:12Z,OWNER,,"When installing Datasette plugins it's crucial that they end up in the same virtual environment as Datasette itself. It's not necessarily obvious how to do this, especially if you install Datasette via pipx or homebrew. Solution: `datasette install datasette-vega` and `datasette uninstall datasette-vega` commands that know how to install to the correct place - a very thin wrapper around `pip install`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/925/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 679646710,MDU6SXNzdWU2Nzk2NDY3MTA=,935,"db.execute_write_fn(create_tables, block=True) hangs a thread if connection fails",9599,simonw,closed,0,,,,,3,2020-08-15T21:49:17Z,2020-08-15T22:35:33Z,2020-08-15T22:35:33Z,OWNER,,Discovered in https://github.com/simonw/latest-datasette-with-all-plugins/issues/3#issuecomment-674449757,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/935/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 679700269,MDU6SXNzdWU2Nzk3MDAyNjk=,938,Pass columns to extra CSS/JS/etc plugin hooks,9599,simonw,closed,0,,,,,3,2020-08-16T06:37:47Z,2020-09-30T20:36:12Z,2020-08-16T18:09:59Z,OWNER,,"I'd like `datasette-cluster-map` to only add links to JavaScript on pages that have tables with latitude and longitude columns. Passing the names of the columns to the plugin hook can support this and will be backwards compatible thanks to pluggy.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/938/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 683805434,MDU6SXNzdWU2ODM4MDU0MzQ=,135,Code for finding SpatiaLite in the usual locations,9599,simonw,closed,0,,,,,3,2020-08-21T20:15:34Z,2022-02-05T00:04:26Z,2020-08-21T20:30:13Z,OWNER,,"I built this for `shapefile-to-sqlite` but it would be useful in `sqlite-utils` too: https://github.com/simonw/shapefile-to-sqlite/blob/e754d0747ca2facf9a7433e2d5d15a6a37a9cf6e/shapefile_to_sqlite/utils.py#L16-L19 ```python SPATIALITE_PATHS = ( ""/usr/lib/x86_64-linux-gnu/mod_spatialite.so"", ""/usr/local/lib/mod_spatialite.dylib"", ) ``` https://github.com/simonw/shapefile-to-sqlite/blob/e754d0747ca2facf9a7433e2d5d15a6a37a9cf6e/shapefile_to_sqlite/utils.py#L105-L109 ```python def find_spatialite(): for path in SPATIALITE_PATHS: if os.path.exists(path): return path return None ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/135/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 683812642,MDU6SXNzdWU2ODM4MTI2NDI=,136,--load-extension=spatialite shortcut option,9599,simonw,closed,0,,,,,3,2020-08-21T20:31:25Z,2022-02-05T00:04:26Z,2020-10-16T19:14:32Z,OWNER,,In conjunction with #135 - this would do the same thing as `--load-extension=path-to-spatialite` (see #134),140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/136/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 689810340,MDU6SXNzdWU2ODk4MTAzNDA=,3,"Datasette plugin to provide custom page for running faceted, ranked searches",9599,simonw,closed,0,,,,,3,2020-09-01T05:00:22Z,2020-09-03T21:01:41Z,2020-09-03T21:01:41Z,MEMBER,,"This will be a page at `/-/beta` which renders using a custom template. It will offer a default timeline view plus search and facet by type/date.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 694493566,MDU6SXNzdWU2OTQ0OTM1NjY=,16,Timeline view,9599,simonw,open,0,,,,,3,2020-09-06T19:13:58Z,2020-09-21T02:42:29Z,,MEMBER,,Ability to browse (and facet) by date.,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 695377804,MDU6SXNzdWU2OTUzNzc4MDQ=,153,table.optimize() should delete junk rows from *_fts_docsize,9599,simonw,closed,0,,,,,3,2020-09-07T20:31:09Z,2020-09-24T20:35:46Z,2020-09-07T21:16:33Z,OWNER,,"> The second challenge here is cleaning up all of those junk rows in existing `*_fts_docsize` tables. Doing that just to the demo database from https://github-to-sqlite.dogsheep.net/github.db dropped its size from 22MB to 16MB! Here's the SQL: > ```sql > DELETE FROM [licenses_fts_docsize] WHERE id NOT IN ( > SELECT rowid FROM [licenses_fts]); > ``` > I can do that as part of the existing `table.optimize()` method, which optimizes FTS tables. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688501064_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/153/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 701294727,MDU6SXNzdWU3MDEyOTQ3Mjc=,965,"Documentation for 404.html, 500.html templates",9599,simonw,closed,0,,,5818042,Datasette 0.49,3,2020-09-14T17:36:59Z,2020-09-14T18:49:49Z,2020-09-14T18:47:22Z,OWNER,,This mechanism is not documented: https://github.com/simonw/datasette/blob/30b98e4d2955073ca2bca92ca7b3d97fcd0191bf/datasette/app.py#L1119-L1129,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/965/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 705057955,MDU6SXNzdWU3MDUwNTc5NTU=,969,"Add --tar option to ""datasette publish heroku""",1448859,betatim,closed,0,,,5971510,Datasette 0.50,3,2020-09-20T06:54:53Z,2020-10-08T23:55:59Z,2020-10-08T23:30:59Z,NONE,,"This issue is about how best to pass additional options to tools used for publishing datasettes. A concrete example is wanting to pass the `--tar` flag to the heroku CLI tool. I think there are at least two options for doing this: documentation for each publishing tool to explain how to set flags via env variables (if possible) or building a mechanism that lets users pass additional flags through datasette. When using `datasette publish heroku binder-launches.db --extra-options=""--config facet_time_limit_ms:35000 --config sql_time_limit_ms:35000"" --name=binderlytics --install=datasette-vega` to publish https://binderlytics.herokuapp.com/ the following error happens: ``` › Warning: heroku update available from 7.42.1 to 7.43.0. › Warning: heroku update available from 7.42.1 to 7.43.0. › Warning: heroku update available from 7.42.1 to 7.43.0. Setting WEB_CONCURRENCY and restarting ⬢ binderlytics... done, v13 WEB_CONCURRENCY: 1 › Warning: heroku update available from 7.42.1 to 7.43.0. ▸ Couldn't detect GNU tar. Builds could fail due to decompression errors ▸ See https://devcenter.heroku.com/articles/platform-api-deploying-slugs#create-slug-archive ▸ Please install it, or specify the '--tar' option ▸ Falling back to node's built-in compressor buffer.js:358 throw new ERR_INVALID_OPT_VALUE.RangeError('size', size); ^ RangeError [ERR_INVALID_OPT_VALUE]: The value ""3303763968"" is invalid for option ""size"" at Function.alloc (buffer.js:367:3) at new Buffer (buffer.js:281:19) at Readable. (/Users/thead/.local/share/heroku/node_modules/archiver-utils/index.js:39:15) at Readable.emit (events.js:322:22) at endReadableNT (/Users/thead/.local/share/heroku/node_modules/readable-stream/lib/_stream_readable.js:1010:12) at processTicksAndRejections (internal/process/task_queues.js:84:21) { code: 'ERR_INVALID_OPT_VALUE' } ``` After installing GNU tar with `brew install gnu-tar` and modifying `datasette/publish/heroku.py` to include the `--tar=/path/to/gnu-tar` publishing works. I think the problem occurs once your heroku slug reaches a certain size. At least when I add only a few 100 entries to the datasette then the error does not occcur. datasette version 0.49.1 OSX 10.14.6 (18G103)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/969/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 708261775,MDU6SXNzdWU3MDgyNjE3NzU=,175,Add docs for .transform(column_order=),9599,simonw,closed,0,,,,,3,2020-09-24T15:19:04Z,2020-09-24T20:35:48Z,2020-09-24T16:00:56Z,OWNER,,"> Need to update docs for `.transform()` now that `column_order=` is available. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/pull/174#discussion_r494403327_ Maybe also add this as an option to `sqlite-utils transform` - since reordering columns is actually a pretty nice capability.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/175/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 710650633,MDU6SXNzdWU3MTA2NTA2MzM=,979,Default table view JSON should include CREATE TABLE,9599,simonw,closed,0,,,,,3,2020-09-28T23:54:58Z,2023-01-09T15:32:39Z,2023-01-09T15:32:22Z,OWNER,,"https://latest.datasette.io/fixtures/facetable.json doesn't currently include the CREATE TABLE statement for the page, even though it's available on the HTML version at https://latest.datasette.io/fixtures/facetable",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/979/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 716756082,MDU6SXNzdWU3MTY3NTYwODI=,996,Better handling of multiple matching template wildcard paths,9599,simonw,closed,0,,,5971510,Datasette 0.50,3,2020-10-07T18:25:40Z,2020-10-08T23:55:41Z,2020-10-07T22:51:17Z,OWNER,,"I tried building this: templates/pages/{topic}.html templates/pages/{topic}/{slug}.html And it didn't work - hits to /foo/bar which should have been rendered by the `{slug}.html` template were instead rendered by the top level `{topic.html}` template.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/996/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 716988478,MDU6SXNzdWU3MTY5ODg0Nzg=,997,Documentation covering buildpack deployment,9599,simonw,closed,0,,,5971510,Datasette 0.50,3,2020-10-08T03:21:52Z,2020-10-08T23:56:03Z,2020-10-08T23:32:10Z,OWNER,,A tidied up version of https://til.simonwillison.net/til/til/digitalocean_datasette-on-digitalocean-app-platform.md - but mention that you can deploy to Heroku using the same mechanism.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/997/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 717729056,MDU6SXNzdWU3MTc3MjkwNTY=,999,Datasette should default to running Uvicorn with workers=1,9599,simonw,closed,0,,,5971510,Datasette 0.50,3,2020-10-08T23:07:03Z,2020-10-08T23:55:46Z,2020-10-08T23:21:36Z,OWNER,,"Uvicorn uses the `WEB_CONCURRENCY` variable, if set, to specify the number of workers to use. Datasette does not work with options for this other than 1: ``` WEB_CONCURRENCY=2 datasette . WARNING: You must pass the application as an import string to enable 'reload' or 'workers'. ``` This was the cause of the Heroku bug in #627. I fixed that issue by setting `WEB_CONCURRENCY=1` in `datasette publish heroku`, but a better fix would be to hard-code `workers=1` in Datasette itself.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/999/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718484082,MDU6SXNzdWU3MTg0ODQwODI=,1010,json / CSV links are broken in Datasette 0.50,9599,simonw,closed,0,,,,,3,2020-10-10T00:07:42Z,2020-10-10T02:32:03Z,2020-10-10T02:32:03Z,OWNER,,"e.g. on https://latest.datasette.io/fixtures/sortable That export link block is broken. The HTML is: ```html

This data as json, CSV (advanced)

```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1010/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718953669,MDU6SXNzdWU3MTg5NTM2Njk=,1016,"Add a ""delete"" icon next to filters (in addition to ""remove filter"")",9599,simonw,closed,0,,,6026070,0.51,3,2020-10-11T23:49:53Z,2020-10-23T19:44:06Z,2020-10-12T03:01:58Z,OWNER,,"The ""remove filter"" option in the select box is not very discoverable. It would be good to have an additional remove icon, pointed to by the pink arrow, which removes a specific selected filter.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1016/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718259202,MDU6SXNzdWU3MTgyNTkyMDI=,1005,Remove xfail tests when new httpx is released,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2020-10-09T16:00:19Z,2021-02-28T22:41:08Z,2021-02-28T22:41:08Z,OWNER,,"> My `httpx` pull request adding `raw_path` support was just merged: https://github.com/encode/httpx/pull/1357 - but it's not in a release yet. > > I'm going to mark these tests as `xfail` so I can land this change - I'll remove that once an `httpx` release comes out that I can use to get the tests passing. > _Originally posted by @simonw in https://github.com/simonw/datasette/pull/1000#issuecomment-706263157_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1005/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 722673818,MDU6SXNzdWU3MjI2NzM4MTg=,1023,Fix issues relating to base_url,9599,simonw,closed,0,,,6026070,0.51,3,2020-10-15T21:02:06Z,2020-11-24T19:51:44Z,2020-10-31T20:51:01Z,OWNER,,Lots of `base_url` bugs that I'd like to solve at once.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1023/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 725743755,MDU6SXNzdWU3MjU3NDM3NTU=,1035,"datasette.urls.table(..., format=""json"") argument",9599,simonw,closed,0,,,6026070,0.51,3,2020-10-20T16:09:34Z,2020-10-31T18:16:43Z,2020-10-31T18:16:43Z,OWNER,,"> That `datasette.urls.table(""db"", ""table"") + "".json""` example is bad because if the table name contains a `.` it should be `?_format=json` instead. > > Maybe `.table()` should have a `format=""json""` option that knows how to do this. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1026#issuecomment-712962517_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1035/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 726094754,MDU6SXNzdWU3MjYwOTQ3NTQ=,1037,Add horizontal scrollbar to tables,9599,simonw,closed,0,,,6026070,0.51,3,2020-10-21T03:13:34Z,2020-10-27T20:52:04Z,2020-10-21T03:16:36Z,OWNER,,Currently you have to scroll the entire page sideways if a table is wide. Make the table `overflow-x: auto` instead.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1037/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 732685643,MDU6SXNzdWU3MzI2ODU2NDM=,1063,.csv should link to .blob downloads,9599,simonw,closed,0,,,6026070,0.51,3,2020-10-29T21:45:58Z,2021-06-17T18:12:30Z,2020-10-29T22:47:45Z,OWNER,,"- [x] Update `.csv` output to link to these things (and get that `xfail` test to pass) - ~~Add a `.csv?_blob_base64=1` argument that causes them to be output in base64 in the CSV~~ > Moving the CSV work to a separate ticket. _Originally posted by @simonw in https://github.com/simonw/datasette/pull/1061#issuecomment-719042601_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1063/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 732859030,MDU6SXNzdWU3MzI4NTkwMzA=,1066,Table actions menu plus plugin hook,9599,simonw,closed,0,,,6026070,0.51,3,2020-10-30T03:46:54Z,2020-10-30T05:18:36Z,2020-10-30T05:16:50Z,OWNER,,"> For the table actions: attaching it to a cog icon next to the table name could make sense. > > > > This is the column action icon at twice the size, color `#666`. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/690#issuecomment-709497595_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1066/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 733768037,MDU6SXNzdWU3MzM3NjgwMzc=,1074,latest.datasette.io should include plugins from fixtures,9599,simonw,closed,0,,,6026070,0.51,3,2020-10-31T17:23:23Z,2020-10-31T19:47:47Z,2020-10-31T19:47:47Z,OWNER,,"> It bothers me that these aren't visible in any public demos. Maybe `latest.datasette.io` should include the `my_plugins.py` and `my_plugins2.py` plugins? _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1067#issuecomment-719961701_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1074/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 733829385,MDU6SXNzdWU3MzM4MjkzODU=,1077,database_actions plugin hook,9599,simonw,closed,0,,,6055094,Datasette 0.52,3,2020-10-31T23:48:12Z,2020-11-02T18:43:25Z,2020-11-02T18:29:50Z,OWNER,,Like `column_actions` but adds a cog menu to the database page.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1077/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 735650864,MDU6SXNzdWU3MzU2NTA4NjQ=,194,3.0 release with some minor breaking changes,9599,simonw,closed,0,,,6079500,3.0,3,2020-11-03T21:36:31Z,2020-11-08T17:19:35Z,2020-11-08T17:19:34Z,OWNER,,"While working on search (#192) I've spotted a few small changes I would like to make that would break backwards compatibility in minor ways, hence requiring a 3.x release. `db[table].search()` - I would like this to default to sorting by rank Also I'd like to free up the `-c` and `-f` options for other purposes from the standard output formats here: https://github.com/simonw/sqlite-utils/blob/43eae8b193d362f2b292df73e087ed6f10838144/sqlite_utils/cli.py#L48-L58 I'd like `-f` to be used to indicate a full-text search column during an insert and `-c` to indicate a column (so you can specify which columns you want to output).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/194/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 735852274,MDU6SXNzdWU3MzU4NTIyNzQ=,1082,DigitalOcean buildpack memory errors for large sqlite db?,39538958,justmars,open,0,,,,,3,2020-11-04T06:35:32Z,2020-11-04T19:35:44Z,,NONE,,"1. Have a sqlite db stored in Dropbox 2. Previously tried the Digital Ocean build pack minimal approach (e.g. Procfile, requirements.txt, bin/post_compile) 3. bin/post_compile with wget from Dropbox 4. download of large sqlite db is successful 5. log reveals that when building Docker container, Digital Ocean runs out of memory for 5gb+ sqlite db but works fine for 2gb+ sqlite db",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1082/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 741862364,MDU6SXNzdWU3NDE4NjIzNjQ=,1090,Custom widgets for canned query forms,9599,simonw,open,0,,,,,3,2020-11-12T19:21:07Z,2021-03-27T16:25:25Z,,OWNER,,"This is an idea that was cut from the first version of writable canned queries: > I really want the option to use a `