id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 803356942,MDU6SXNzdWU4MDMzNTY5NDI=,1218, /usr/local/opt/python3/bin/python3.6: bad interpreter: No such file or directory,11855322,open,0,,,1,2021-02-08T09:07:00Z,2021-02-23T12:12:17Z,,NONE,,"Error as above, however I do have python3.8 and the readme indicates this is supported. ``` (venv) (base) Robins-MacBook:datasette robin$ ls /usr/local/opt/python3/bin/ .. pip3 python3 python3.8 ```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1218/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1665053646,I_kwDOBm6k_c5jPrPO,2059,"""Deceptive site ahead"" alert on Heroku deployment",1186275,open,0,,,1,2023-04-12T18:34:51Z,2023-04-13T01:13:01Z,,NONE,,"I deployed a fairly basic instance of Datasette (`datasette-auth-passwords` is the only plugin) using Heroku. The deployed URL now gives a ""Deceptive site ahead"" warning to users. Is there way around this? Maybe a way to add ownership verification [through Google's search console](https://search.google.com/search-console/welcome)? ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2059/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 959999095,MDU6SXNzdWU5NTk5OTkwOTU=,1421,"""Query parameters"" form shows wrong input fields if query contains ""03:31"" style times",6988,closed,0,,,11,2021-08-04T07:29:04Z,2021-08-09T03:41:07Z,2021-08-09T03:33:02Z,NONE,,"Datasette version `0.58.1`. I'm guessing this is a bug in the code that looks for `:param`-style query parameters.. ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1421/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 573583971,MDU6SXNzdWU1NzM1ODM5NzE=,689,"""Templates considered"" comment broken in >=0.35",35075,closed,0,,,6,2020-03-01T17:31:21Z,2020-04-05T19:39:44Z,2020-04-05T19:39:44Z,NONE,,"Noticed that the ""Templates Considered"" comment is missing in 0.37. Believe I traced it back to #664 as you can see it in https://v0-34.datasette.io/ but not https://v0-35.datasette.io/. Looking at the template context debug between the two you can see what is missing from 0.35 vs. 0.34: ```diff < ""datasette_version"": ""0.34"", < ""app_css_hash"": ""ffa51a"", < ""select_templates"": [ < ""*index.html"" < ], < ""zip"": """", < ""body_scripts"": [], < ""extra_css_urls"": """", < ""extra_js_urls"": """", < ""format_bytes"": """", < ""database_url"": "">"", < ""database_color"": "">"" --- > ""datasette_version"": ""0.35"", > ""database_url"": "">"", > ""database_color"": "">"" ```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/689/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 760312579,MDU6SXNzdWU3NjAzMTI1Nzk=,1134,"""_searchmode=raw"" throws an index out of range error when combined with ""_search_COLUMN""",2181410,closed,0,,,4,2020-12-09T13:05:37Z,2020-12-10T05:57:17Z,2020-12-09T19:56:55Z,NONE,,"Hi Simon! 
Maybe it's just me, but when [using _searchmode=raw (trying to enable wildcard-searching) in combination with the ""_search_COLUMN""-table argument](https://byraadsarkivet.aarhus.dk/db/cases?_searchmode=raw&_search_title=sundhedsfrem*), I get a list index out of range error. [When combining with the simpler ""_search""-argument everything works, including wildcard-seaches.](https://byraadsarkivet.aarhus.dk/db/cases?_search=sundhedsfrem*&_searchmode=raw). Here's the traceback: ``` Traceback (most recent call last): File ""/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 122, in route_path return await view(new_scope, receive, send) File ""/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 196, in view request, **scope[""url_route""][""kwargs""] File ""/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/base.py"", line 204, in get request, database, hash, correct_hash_provided, **kwargs File ""/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/base.py"", line 342, in view_get request, database, hash, **kwargs File ""/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/table.py"", line 393, in data search_col = key.split(""_search_"", 1)[1] IndexError: list index out of range ```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1134/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 457147936,MDU6SXNzdWU0NTcxNDc5MzY=,512,"""about"" parameter in metadata does not appear when alone",7936571,open,0,,,3,2019-06-17T21:04:20Z,2019-10-11T15:49:13Z,,NONE,,"Here's an example of metadata I have for one database on datasette. ``` ""Records-requests"": { ""tables"": { ""Some table"": { ""about"": ""This table has data."" } } } ``` The text in `about` does not show up when I publish the data. But it shows up after I add a `""source""` parameter in the metadata. Is this intended?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/512/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 567902704,MDU6SXNzdWU1Njc5MDI3MDQ=,675,--cp option for datasette publish and datasette package for shipping additional files and directories,141844,open,0,,,12,2020-02-19T22:55:56Z,2020-12-28T18:49:21Z,,NONE,,"I’m working on integrating Datasette into a documentation-oriented publishing workflow internally in my company, and in order to deploy the Docker image created by `datasette package` I need to add an additional file to the image — in my case, it’s a sort of a deployment directive. I’ve worked out a way to do this after the image has been created, but it’s convoluted and brittle. So it’d be excellent if there was an additional option for this command, something like, like, `--copy`. I’d envision it looking something like: ```shell $ datasette package --copy /the/source/path:/the/target/path data.db ``` I’d be happy to help design, specify, implement, and test this feature, if you’d be interested. 
Thanks for the fantastic tools!",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/675/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1931794126,I_kwDOBm6k_c5zJNbO,2198,--load-extension=spatialite not working with Windows,363004,open,0,,,0,2023-10-08T12:50:22Z,2023-10-08T12:50:22Z,,NONE,,"Using each of `python -m datasette counties.db -m metadata.yml --load-extension=SpatiaLite` and `python -m datasette counties.db --load-extension=""C:\Windows\System32\mod_spatialite.dll""` and `python -m datasette counties.db --load-extension=C:\Windows\System32\mod_spatialite.dll` I got the error: ``` File ""C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\database.py"", line 209, in in_thread self.ds._prepare_connection(conn, self.name) File ""C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\app.py"", line 596, in _prepare_connection conn.execute(""SELECT load_extension(?, ?)"", [path, entrypoint]) sqlite3.OperationalError: The specified module could not be found. ``` I finally tried modifying the code in app.py to read: ``` def _prepare_connection(self, conn, database): conn.row_factory = sqlite3.Row conn.text_factory = lambda x: str(x, ""utf-8"", ""replace"") if self.sqlite_extensions: conn.enable_load_extension(True) for extension in self.sqlite_extensions: # ""extension"" is either a string path to the extension # or a 2-item tuple that specifies which entrypoint to load. #if isinstance(extension, tuple): # path, entrypoint = extension # conn.execute(""SELECT load_extension(?, ?)"", [path, entrypoint]) #else: conn.execute(""SELECT load_extension('C:\Windows\System32\mod_spatialite.dll')"") ``` At which point the counties example worked. Is there a correct way to install/use the extension on Windows? My method will cause issues if there's a second extension to be used. On an unrelated note, my next step is to figure out how to write a query across the two loaded databases supplied from the command line: `python -m datasette rm_toucans_23_10_07.db counties.db -m metadata.yml --load-extension=SpatiaLite` ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2198/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 555832585,MDU6SXNzdWU1NTU4MzI1ODU=,661,"--port option to expose a port other than 8001 in ""datasette package""",134771,closed,0,,,3,2020-01-27T21:05:56Z,2020-01-30T04:17:52Z,2020-01-29T22:46:45Z,NONE,,"I see how to alter the port using `datasette serve -p XXX` per the docs. However, I'm packaging up to server the container on AppEngine flexible, which [requires](https://cloud.google.com/appengine/docs/flexible/custom-runtimes/build#listening_to_port_8080) that the container is serving traffic on port 8080. https://github.com/simonw/datasette/blob/7950105c278b140e6cb665c68b59df219870f9bc/Dockerfile#L41 Is there a way to inject a non-default port into the Dockerfile, or should I just do something like `sed` to replace 8001 with 8080 after `dataset package` has done it's thing? 
Thanks for the advice.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/661/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 435819321,MDU6SXNzdWU0MzU4MTkzMjE=,436,400 Error when trying to register new user via https://publish.datasettes.com/,317694,closed,0,,,1,2019-04-22T17:55:00Z,2021-01-04T20:15:42Z,2021-01-04T20:15:41Z,NONE,,"Behavior: When registering a new user via Zeit - confirmation is sent and screen acknowledges registered user... When clicking grant access the next screen is a white 400 error message. Replicated: Chrome and Firefox; 2 different email accounts",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/436/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 810397025,MDU6SXNzdWU4MTAzOTcwMjU=,1228,500 error caused by faceting if a column called `n` exists,7107523,closed,0,,,5,2021-02-17T17:41:20Z,2022-03-19T06:44:40Z,2022-03-19T01:38:04Z,NONE,,"I recently discovered `datasette` thanks to your great talk at FOSDEM and would like to use it for some projects. However, when trying to use it on databases created from some csv ot tsv files, I am sometimes getting this issue when going to http://127.0.0.1:8001/databasetest/databasetest and I don't exactly understand what it refers to. So far, I couldn't find anything relevant when reviewing the raw text files that could explain this issue, nor could I find something obvious between the files that generate this issue and those that don't. Does the error ring a bell and, if so, could you please point me to the right direction? ``` $ datasette databasetest.db INFO: Started server process [1408482] INFO: Waiting for application startup. INFO: Application startup complete. 
INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) INFO: 127.0.0.1:56394 - ""GET / HTTP/1.1"" 200 OK INFO: 127.0.0.1:56394 - ""GET /-/static/app.css?4e362c HTTP/1.1"" 200 OK INFO: 127.0.0.1:56396 - ""GET /-/static-plugins/datasette_vega/main.2acbb312.css HTTP/1.1"" 200 OK INFO: 127.0.0.1:56398 - ""GET /-/static-plugins/datasette_vega/main.08f5d3d8.js HTTP/1.1"" 200 OK Traceback (most recent call last): File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/app.py"", line 1099, in route_path response = await view(request, send) File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/views/base.py"", line 147, in view request, **request.scope[""url_route""][""kwargs""] File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/views/base.py"", line 121, in dispatch_request return await handler(request, *args, **kwargs) File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/views/base.py"", line 260, in get request, database, hash, correct_hash_provided, **kwargs File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/views/base.py"", line 434, in view_get request, database, hash, **kwargs File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/views/table.py"", line 782, in data suggested_facets.extend(await facet.suggest()) File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/facets.py"", line 168, in suggest and any(r[""n""] > 1 for r in distinct_values) File ""/home/kabouik/.local/lib/python3.7/site-packages/datasette/facets.py"", line 168, in and any(r[""n""] > 1 for r in distinct_values) TypeError: '>' not supported between instances of 'str' and 'int' INFO: 127.0.0.1:56402 - ""GET /databasetest/databasetest HTTP/1.1"" 500 Internal Server Error INFO: 127.0.0.1:56402 - ""GET /-/static/app.css?4e362c HTTP/1.1"" 200 OK INFO: 127.0.0.1:56404 - ""GET / HTTP/1.1"" 200 OK INFO: 127.0.0.1:56404 - ""GET /-/static/app.css?4e362c HTTP/1.1"" 200 OK INFO: 127.0.0.1:56406 - ""GET /-/static-plugins/datasette_vega/main.2acbb312.css HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /-/static-plugins/datasette_vega/main.08f5d3d8.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /databasetest HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /-/static/app.css?4e362c HTTP/1.1"" 200 OK INFO: 127.0.0.1:56404 - ""GET /-/static-plugins/datasette_vega/main.2acbb312.css HTTP/1.1"" 200 OK INFO: 127.0.0.1:56406 - ""GET /-/static/codemirror-5.57.0.min.css HTTP/1.1"" 200 OK INFO: 127.0.0.1:56410 - ""GET /-/static-plugins/datasette_vega/main.08f5d3d8.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56414 - ""GET /-/static/codemirror-5.57.0-sql.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56412 - ""GET /-/static/codemirror-5.57.0.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /-/static/sql-formatter-2.3.3.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /databasetest?sql=select+*+from+databasetest HTTP/1.1"" 200 OK INFO: 127.0.0.1:56410 - ""GET /-/static/app.css?4e362c HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /-/static-plugins/datasette_vega/main.2acbb312.css HTTP/1.1"" 200 OK INFO: 127.0.0.1:56412 - ""GET /-/static/codemirror-5.57.0.min.css HTTP/1.1"" 200 OK INFO: 127.0.0.1:56404 - ""GET /-/static/sql-formatter-2.3.3.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56406 - ""GET /-/static/codemirror-5.57.0.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56414 - ""GET /-/static-plugins/datasette_vega/main.08f5d3d8.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56408 - ""GET /-/static/codemirror-5.57.0-sql.min.js HTTP/1.1"" 200 OK INFO: 127.0.0.1:56410 
- ""GET /databasetest.json?sql=select+*+from+databasetest&_shape=array&_shape=array HTTP/1.1"" 200 OK ^CINFO: Shutting down INFO: Waiting for application shutdown. INFO: Application shutdown complete. INFO: Finished server process [1408482] ``` Note that there is no error if I go to http://127.0.0.1:8001/databasetest and then click on `Run SQL`.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1228/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 292011379,MDU6SXNzdWUyOTIwMTEzNzk=,184,500 from missing table name,222245,closed,0,,,4,2018-01-26T19:46:45Z,2019-05-21T16:17:29Z,2018-04-13T18:18:59Z,NONE,,"https://github.com/simonw/datasette/blob/56623e48da5412b25fb39cc26b9c743b684dd968/datasette/app.py#L517-L519 throws an error if it gets an empty list back. Simplest solution is to write a helper func that just says ```python result = list(await self.execute(name, sql, params) if result: return result[0][0] ``` and use it anywhere `[0][0]` is now.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/184/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 568091133,MDU6SXNzdWU1NjgwOTExMzM=,676,?_searchmode=raw option for running FTS searches without escaping characters,58088336,closed,0,,,9,2020-02-20T06:56:57Z,2020-02-25T05:57:24Z,2020-02-25T05:56:04Z,NONE,,"After the version 0.34. I am not able to use the wildchar in the _search option( or the full text search). It will not return any result unless I specify the whole word for text search. If I use 'match :search || ""*"" ' in the sql statement then it will work as expected.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/676/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 995098231,MDU6SXNzdWU5OTUwOTgyMzE=,1470,?_sort=rowid with _next= returns error,19851673,closed,0,,,4,2021-09-13T16:36:15Z,2021-10-18T19:30:15Z,2021-10-10T01:15:03Z,NONE,,"For example: - Go to https://cryptics.eigenfoo.xyz/clues/clues?_next=100 (this is the second page of results in a Datasette site) - Search anything using the FTS search bar. For example, searching for `hello` will take you to https://cryptics.eigenfoo.xyz/clues/clues?_search=hello&_sort=rowid&_next=100 - A `500 Error: list index out of range` is raised. This is because the search URL includes the `&_next=100` UTM parameter, carried over from where the FTS search was run. However, there isn't a second page in the search results, so a `list index out of range` error is raised. You can confirm that removing this UTM parameter from the URL returns the appropriate search results. The FTS search request should strip any `_next` UTM parameter. 
--- ```bash datasette, version 0.58.1 sqlite-utils, version 3.17 ```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1470/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 277589569,MDU6SXNzdWUyNzc1ODk1Njk=,155,A primary key column that has foreign key restriction associated won't rendering label column,388154,closed,0,,2949431,4,2017-11-29T00:40:02Z,2017-12-07T05:39:53Z,2017-12-07T05:39:53Z,NONE,,,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/155/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 448189298,MDU6SXNzdWU0NDgxODkyOTg=,486,Ability to add extra routes and related templates,2181410,closed,0,,,2,2019-05-24T14:04:25Z,2019-05-24T14:43:28Z,2019-05-24T14:43:09Z,NONE,,"Hi Simon Thank for an excellent job! Datasette is such an obviously good idea (once you have that idea!) and so well done. The only thing that I miss, is the ability to add extras routes (with associated jinja2-templates). For most of the datasets, that I would like to publish, I would also like at least a page, that describes the data (semantics, provenance, biases...) and a page explaining our cookie- and privacy-policies (which would allows us to use something like Goggle Analytics). ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/486/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276842536,MDU6SXNzdWUyNzY4NDI1MzY=,153,Ability to customize presentation of specific columns in HTML view,20264,closed,0,,2949431,14,2017-11-26T17:46:11Z,2017-12-10T02:08:45Z,2017-12-07T06:17:33Z,NONE,,"This ties into https://github.com/simonw/datasette/issues/3 in some ways. It would be great to have some adaptability in the HTML views and to specific some columns as displaying in certain ways. - [x] 1. **Auto-parsing URIs into in-browser links.** Why? Lots of public data around cultural commons stuff links to a specific URL. This would be a great utility to turn on at the command line, just parse everything for URLs. Maybe they need to be underlined or represented in a different way than internal URLs. - [x] 2. **Ability to identify a column as plain/preformatted text.** Why? Was trying to import the Enron emails, the body collapses. Hard to read. These fields also tend to screw up the ability to scan a table view. If you knew it was text the system could set an `overflow` property on the relevant CSS, so you could still scan. - [x] 3. **Ability to identify a column as HTML.** Why? I want to spider some stuff and drop sections into SQLite, and just keep them as HTML.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/153/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 791237799,MDU6SXNzdWU3OTEyMzc3OTk=,1196,Access Denied Error in Windows,2826376,open,0,,,2,2021-01-21T15:40:40Z,2021-04-14T19:28:38Z,,NONE,,"I am trying to publish a db to vercel. But while issuing the below command throwing `Access Denied` error which is leading to `RecursionError: maximum recursion depth exceeded while calling a Python object`. I am using PyCharm and Python 3.9. 
I have reinstalled both and launched PyCharm as Admin in Windows 10. But still the issue persists. Issued command `datasette publish vercel jmeter.db --project jmeter --install datasette-vega` PS: localhost is working fine.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1196/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 451585764,MDU6SXNzdWU0NTE1ODU3NjQ=,499,Accessibility for non-techie newsies? ,7936571,open,0,,,3,2019-06-03T16:49:37Z,2019-06-05T21:22:55Z,,NONE,,"Hi again, I'm having fun uploading datasets to Heroku via datasette. I'd like to set up datasette so that it's easy for other newsroom workers, who don't use Linux and aren't programmers, to upload datasets. Does datsette provide this out-of-the-box, or as a plugin? ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/499/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 705057955,MDU6SXNzdWU3MDUwNTc5NTU=,969,"Add --tar option to ""datasette publish heroku""",1448859,closed,0,,5971510,3,2020-09-20T06:54:53Z,2020-10-08T23:55:59Z,2020-10-08T23:30:59Z,NONE,,"This issue is about how best to pass additional options to tools used for publishing datasettes. A concrete example is wanting to pass the `--tar` flag to the heroku CLI tool. I think there are at least two options for doing this: documentation for each publishing tool to explain how to set flags via env variables (if possible) or building a mechanism that lets users pass additional flags through datasette. When using `datasette publish heroku binder-launches.db --extra-options=""--config facet_time_limit_ms:35000 --config sql_time_limit_ms:35000"" --name=binderlytics --install=datasette-vega` to publish https://binderlytics.herokuapp.com/ the following error happens: ``` › Warning: heroku update available from 7.42.1 to 7.43.0. › Warning: heroku update available from 7.42.1 to 7.43.0. › Warning: heroku update available from 7.42.1 to 7.43.0. Setting WEB_CONCURRENCY and restarting ⬢ binderlytics... done, v13 WEB_CONCURRENCY: 1 › Warning: heroku update available from 7.42.1 to 7.43.0. ▸ Couldn't detect GNU tar. Builds could fail due to decompression errors ▸ See https://devcenter.heroku.com/articles/platform-api-deploying-slugs#create-slug-archive ▸ Please install it, or specify the '--tar' option ▸ Falling back to node's built-in compressor buffer.js:358 throw new ERR_INVALID_OPT_VALUE.RangeError('size', size); ^ RangeError [ERR_INVALID_OPT_VALUE]: The value ""3303763968"" is invalid for option ""size"" at Function.alloc (buffer.js:367:3) at new Buffer (buffer.js:281:19) at Readable. (/Users/thead/.local/share/heroku/node_modules/archiver-utils/index.js:39:15) at Readable.emit (events.js:322:22) at endReadableNT (/Users/thead/.local/share/heroku/node_modules/readable-stream/lib/_stream_readable.js:1010:12) at processTicksAndRejections (internal/process/task_queues.js:84:21) { code: 'ERR_INVALID_OPT_VALUE' } ``` After installing GNU tar with `brew install gnu-tar` and modifying `datasette/publish/heroku.py` to include the `--tar=/path/to/gnu-tar` publishing works. I think the problem occurs once your heroku slug reaches a certain size. At least when I add only a few 100 entries to the datasette then the error does not occcur. 
datasette version 0.49.1 OSX 10.14.6 (18G103)",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/969/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 991575770,MDExOlB1bGxSZXF1ZXN0NzMwMDIwODY3,1467,Add Authorization header when CORS flag is set,3058200,closed,0,,,3,2021-09-08T22:14:41Z,2021-10-17T02:29:07Z,2021-10-14T18:54:18Z,NONE,simonw/datasette/pulls/1467,"This PR adds the [`Access-Control-Allow-Headers`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Allow-Headers) flag when CORS mode is enabled. This would fix https://github.com/simonw/datasette-auth-tokens/issues/4. When making cross-origin requests, the server must respond with all allowable HTTP headers. A Datasette instance using auth tokens must accept the `Authorization` HTTP header in order for cross-origin authenticated requests to take place. Please let me know if there's a better way of doing this! I couldn't figure out a way to change the app's response from the plugin itself, so I'm starting here. If you'd rather this logic live in the plugin, I'd love any guidance you're able to give.",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1467/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 285168503,MDU6SXNzdWUyODUxNjg1MDM=,176,Add GraphQL endpoint,173848,open,0,,,8,2017-12-29T23:21:01Z,2020-04-21T14:16:24Z,,NONE,,Would make it much easier to build React & similar frontends. Maybe with https://github.com/graphql-python/sanic-graphql ?,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/176/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 476852861,MDU6SXNzdWU0NzY4NTI4NjE=,568,Add database_color as a configurable option,50906992,open,0,,,1,2019-08-05T13:14:45Z,2023-08-11T05:19:42Z,,NONE,,This would be really useful as it would allow us to tie in with colour schemes.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/568/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 322741659,MDExOlB1bGxSZXF1ZXN0MTg3NzcwMzQ1,258,Add new metadata key persistent_urls which removes the hash from all database urls,247131,closed,0,,,3,2018-05-14T09:39:18Z,2018-05-21T07:38:15Z,2018-05-21T07:38:15Z,NONE,simonw/datasette/pulls/258,"Add new metadata key ""persistent_urls"" which removes the hash from all database urls when set to ""true"" This PR is just to gauge if this, or something like it, is something you would consider merging? I understand the reason why the substring of the hash is included in the url but there are some use cases where the urls should persist across deployments. For bookmarks for example or for scripts that use the JSON API. This is the initial commit for this feature. 
Tests and documentation updates to follow.",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/258/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 352768017,MDU6SXNzdWUzNTI3NjgwMTc=,362,Add option to include/exclude columns in search filters,78156,open,0,,,1,2018-08-22T01:32:08Z,2020-11-03T19:01:59Z,,NONE,,"I have a dataset with many columns, of which only some are likely to be of interest for searching. It would be great for usability if the search filters in the UI could be configured to include/exclude columns. See also: https://github.com/simonw/datasette/issues/292",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/362/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 282971961,MDU6SXNzdWUyODI5NzE5NjE=,175,"Add project topic ""automatic-api""",3179832,closed,0,,,1,2017-12-18T18:09:17Z,2017-12-21T18:33:55Z,2017-12-21T18:33:55Z,NONE,,"Hi there! Could you add the ~~tag~~ topic `automatic-api` to your repository? I am [making a list](https://github.com/dbohdan/automatic-api) of all projects that automatically expose APIs to databases. (Your Show HN made me do it. :-) I knew about PostgREST and PostGraphQL, but it took adding Datasette to sell me on the concept.) They will be easier to discover if there is a standard GitHub tag, and `automatic-api` seems as good a candidate as any. Two projects [already use it](https://github.com/topics/automatic-api).",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/175/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 418329842,MDU6SXNzdWU0MTgzMjk4NDI=,415,Add query parameter to hide SQL textarea,36796532,closed,0,,,3,2019-03-07T14:11:30Z,2019-03-15T09:30:57Z,2019-03-15T05:22:43Z,NONE,,It would be cool if there was a query parameter to hide / remove the SQL textarea. Then I could simply save a bookmark for a certain query and open it to see the data without having to scroll below the (long) SQL query first.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/415/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1251710928,I_kwDOBm6k_c5Km5fQ,1751,Add scrollbars to table presentation in default layout,408765,closed,0,,,1,2022-05-28T19:44:57Z,2022-05-28T19:52:17Z,2022-05-28T19:52:17Z,NONE,,"(As you will be able to tell from the terminology I use, I am not a frontend guy, but I hope you will understand.) When a table is wide and needs horizontal scrolling to see the columns towards the end, the user needs to scroll horizontally. However, since the container for the HTML table (`div` with class `table-wrapper`) isn't limited by the window size, I first need to vertically scroll near to the bottom of the page in order to scroll horizontally. Then I can scroll back up again. This isn't very user friendly. Instead, I think it would make sense to constrain the table's size (when necessary), so that the vertical and horizontal scrollbars either always are visible or at least not far out of reach. 
I understand that I could provide my own template and / or CSS, but I think it would probably make sense to adjust the default in this regard.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1751/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 893537744,MDU6SXNzdWU4OTM1Mzc3NDQ=,1331,Add support for Jinja2 version 3.0,475613,closed,0,,,10,2021-05-17T17:14:36Z,2021-05-23T00:57:39Z,2021-05-23T00:57:39Z,NONE,,"A week ago, [The Pallets Project](https://github.com/pallets) released [new major versions of several of its projects](https://palletsprojects.com/blog/flask-2-0-released/). Among those updates is one for Jinja2, which bumps it to version 3.0.0. I'd like for datasette to support Jinaj2 version 3.0.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1331/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1351949898,PR_kwDOBm6k_c492dPw,1793,Added a useful resource,111973926,closed,0,,,1,2022-08-26T08:41:26Z,2022-09-06T00:41:25Z,2022-09-06T00:41:24Z,NONE,simonw/datasette/pulls/1793,"Have added a useful resource about the types of databases in SQL i.e SQLite, PostgreSQL, MySQL &, etc from the scaler topics. ---- :books: Documentation preview :books:: https://datasette--1793.org.readthedocs.build/en/1793/ ",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1793/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 499954048,MDExOlB1bGxSZXF1ZXN0MzIyNTI5Mzgx,578,Added support for multi arch builds,887095,closed,0,,,3,2019-09-29T18:43:03Z,2019-11-13T19:13:15Z,2019-11-13T19:13:15Z,NONE,simonw/datasette/pulls/578,Minor changes in Dockerfile and new Makefile to support Docker multi architecture builds. `make`will build one image per architecture and push them as one Docker manifest to Docker Hub. Feel free to change `IMAGE_NAME ` to `datasetteproject/datasette` to update your official Docker Hub image(s).,107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/578/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 450862577,MDU6SXNzdWU0NTA4NjI1Nzc=,496,Additional options to gcloud build command in cloudrun - timeout,1740337,closed,0,,,1,2019-05-31T15:43:55Z,2019-05-31T23:05:05Z,2019-05-31T23:05:05Z,NONE,,"I am trying to deploy a 3.1 GB dataset to cloudrun with datasette. Currrently the docker build times out. Would be nice to have a timeout flag or additional gcloud commands that could be specified. Here is the line https://github.com/simonw/datasette/blob/f825e2012109247fa246e2b938f8174069e574f1/datasette/publish/cloudrun.py#L78 I would be happy to submit a PR to allow for a timeout option. 
What are your ideas of allowing the user additional build publishing flag options?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/496/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1251700382,I_kwDOBm6k_c5Km26e,1750,Allow `label_column` to specify array of columns,408765,open,0,,,0,2022-05-28T18:45:48Z,2022-05-28T18:45:48Z,,NONE,,"I think it would be great if the Datasette metadata would allow the `label_column` table key to list multiple columns. Something like: ```json ""tables"": { ""person"": { ""label_column"": [""first_name"", ""last_name""] }, ``` It would even be interesting with a ""label expression"" similar to a Python f-string. E.g. `{row.last_name}, {row.first_name}`.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1750/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 828858421,MDU6SXNzdWU4Mjg4NTg0MjE=,1258,Allow canned query params to specify default values,1385831,open,0,,,5,2021-03-11T07:19:02Z,2023-02-20T23:39:58Z,,NONE,,"If I call a canned query that includes named parameters, without passing any parameters, datasette runs the query anyway, resulting in an HTTP status code 400, and a visible error in the browser, with only a link back to home. This means that one of the default links on https://site/database/ will lead to a broken page with no apparent way out. ![image](https://user-images.githubusercontent.com/1385831/110748683-13e72300-820e-11eb-855c-32e03dfef5bf.png) Is there any way to skip performing the query when parameters aren't supplied, but otherwise render the usual canned query page? Alternatively, can I supply default values for my parameters, either when defining my canned queries or when linking to the canned query page from the default database template.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1258/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 548591089,MDU6SXNzdWU1NDg1OTEwODk=,657,Allow creation of virtual tables at startup,1055831,open,0,,,4,2020-01-12T16:10:55Z,2021-01-15T20:24:35Z,,NONE,,"Hi, I've been experimenting with SQLite reading from huge datasets using this excellent Parquet extension from @cldellow. https://cldellow.com/2018/06/22/sqlite-parquet-vtable.html https://github.com/cldellow/sqlite-parquet-vtable This works really well, but I was keen to see if I could combine datasette with this. Having previously experimented with the spatialite extension I knew that datasette supports loading extensions in the underlying sqlite instance. However I hit a blocker as the current design only allows SELECT statements to be executed and so I am unable to execute the crucial CREATE VIRTUAL TABLE ......... command that is required to load the data from the parquet file into the table. It seems like this would be a simple-ish change, but I don't know enough about the architecture of datasette to start implementing this myself? Could this be done as a datasette plugin? or would this require more fundamental changes at initialisation time? 
My thoughts are that something at init time could detect that the user was loading a *.parquet file and then switch to a mode were it loads that via the ""CREATE VIRTUAL TABLE..."" rather than loading the *.db file in the default case?? I'm happy to contribute code and testing, I just need some pointers on the best approach. Thanks Darren",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/657/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 712889459,MDExOlB1bGxSZXF1ZXN0NDk2Mjk4MTgw,986,"Allow facet by primary keys, fixes #985",39452697,closed,0,,,2,2020-10-01T14:18:55Z,2020-10-01T16:51:45Z,2020-10-01T16:51:45Z,NONE,simonw/datasette/pulls/986,"Hello! This PR makes it possible to facet by primary keys. Did I get it right that just removing the condition on UI side is enough? From testing it works fine with primary keys, just as with normal keys. If so, should I also remove unused `data-is-pk`?",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/986/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 814591962,MDU6SXNzdWU4MTQ1OTE5NjI=,1240,Allow facetting on custom queries,7107523,closed,0,,,3,2021-02-23T15:52:19Z,2021-02-26T18:19:46Z,2021-02-26T18:18:18Z,NONE,,"Facets are a tremendously useful feature, especially for people peeking at the database for the first time and still having little knowledge about the details of the data. It is of great assistance to discover interesting features to explore futher in advanced queries. Yet, it seems it's impossible to use facets when running a custom SQL query, be it from the little gear icons in column names, the facet suggestions at the top (hidden when performing a custom query), or by appending a facet code to the URL. Is there a technical limitation, or is this something that could be unlocked easily?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1240/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 562787785,MDU6SXNzdWU1NjI3ODc3ODU=,667,Allow injecting configuration data from plugins,870184,closed,0,,,2,2020-02-10T19:50:15Z,2020-02-12T16:18:22Z,2020-02-12T09:21:22Z,NONE,,"I'm trying to customize datasette as explorer for [CLDF](https://cldf.clld.org) datasets. Such datasets can be converted automatically to SQLite, which then can be fed to datasette, (e.g. https://github.com/cldf/cookbook/blob/master/recipes/datasette/README.md). Part of this customization would be support for the ""special"" data types described in the [CLDF ontology](https://cldf.clld.org/v1.0/terms.rdf). But while rendering of the values can be customized via the `render_cell` hook in a plugin, e.g. custom labels for foreign keys must be specified through the config file. It would be nice to be able to programmatically inject config data from plugins as well.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/667/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1504352503,I_kwDOBm6k_c5Zqpj3,1968,Allow to hide some queries in metadata.yml,562352,open,0,,,0,2022-12-20T10:45:41Z,2022-12-20T10:45:41Z,,NONE,,"By default all queries are displayed. 
But there are many cases where it would be interesting to hide the queries by default: * the website is targeting non-tech people * the query is veeeeeery long ([eg.](https://mirabelle.openfoodfacts.org/products/energy_calculator)) * reading the query is not important for the users, they only want to see the result Of course, the user still could have the option to see the query. It could be an option in the metadata file: ```yml databases: awesome_db: tables: products: hide_sql: true queries: great_query: hide_sql: true sql: select * from products where code = :barcode ``` The priority could be: * no option in the metadata and nothing in the URL: query displayed * hide_sql in the metadata and nothing in the URL: query displayed as asked in the metadata * hide_sql in the metadata and &_hide_sql= in the URL: query as asked in the URL See also: #1824 ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1968/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1452360613,I_kwDOBm6k_c5WkUOl,1895,Avoid using host name when building absolute URLs?,14294,open,0,,,0,2022-11-16T22:21:27Z,2022-11-16T22:21:27Z,,NONE,,"When deploying Datasette to Cloud Run and rewriting certain routes from a Firebase app to the Cloud Run service, some of the URLs in the page start with `https://[service].run.app` rather than the (custom) domain of the Firebase app. I guess this is because a) the custom domain of the Firebase app isn't being passed through in the `host` header of the request to the Cloud Run instance and b) the `absolute_url` function in Datasette is using information from the request to build the URL. Would it be possible to not use the host name when building the absolute URLs, i.e. only include the path in the URL?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1895/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1501900064,I_kwDOBm6k_c5ZhS0g,1966,Broken link to live demo in Getting started docs,7551922,closed,0,,,1,2022-12-18T13:17:00Z,2022-12-31T19:15:19Z,2022-12-31T19:15:10Z,NONE,,The link in [Play with a live demo in Getting started](https://github.com/simonw/datasette/blob/main/docs/getting_started.rst#play-with-a-live-demo) to [https://fivethirtyeight.datasettes.com/fivethirtyeight](https://fivethirtyeight.datasettes.com/fivethirtyeight) is broken and the datasette is no longer working (maybe due to the end of the free tier).,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1966/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314665147,MDU6SXNzdWUzMTQ2NjUxNDc=,216,Bug: Sort by column with NULL in next_page URL,222245,closed,0,,,15,2018-04-16T14:03:18Z,2018-04-17T01:45:24Z,2018-04-17T01:45:24Z,NONE,,"Copy-pasting from https://github.com/simonw/datasette/issues/189#issuecomment-381429213, since that issue is closed: I think I found a bug. I tried to sort by middle initial in my salaries set, and many middle initials are null. 
The `next_url` gets set by Datasette to: http://localhost:8001/salaries-d3a5631/2017+Maryland+state+salaries?_next=None%2C391&_sort=middle_initial But then None is interpreted literally and it tries to find a name with the middle initial ""None"" and ends up skipping ahead to O on page 2. ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/216/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 2029908157,I_kwDOBm6k_c54_fC9,2214,CSV export fails for some `text` foreign key references,2874,open,0,,,1,2023-12-07T05:04:34Z,2023-12-07T07:36:34Z,,NONE,,"I'm starting this issue without a clear reproduction in case someone else has seen this behavior, and to use the issue as a notebook for research. I'm using Datasette with the [SWITRS](https://iswitrs.chp.ca.gov/) data set, which is a California Highway Patrol collection of traffic incident data from the past decade or so. I receive data from them in CSV and want to work with it in Datasette, then export it to CSV for mapping in Felt.com. Their data makes extensive use of codes for incident column data (`1` for `Monday` and so on), some of it integer codes and some of it letter/text codes. The text codes are sometimes blank or `-`. During import, I'm creating lookup tables for foreign key references to make the Datasette UI presentation of the data easier to read. If I import the data and set up the integer foreign keys, everything works fine, but if I set up the text foreign keys, CSV export starts to fail. The foreign key configuration is as follows: ``` # Some tables use integer ids, like sensible tables do. Let's import them first # since we favor them. for TABLE in DAY_OF_WEEK CHP_SHIFT POPULATION SPECIAL_COND BEAT_TYPE COLLISION_SEVERITY do sqlite-utils create-table records.db $TABLE id integer name text --pk=id sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE id sqlite-utils create-index records.db collisions $TABLE done # *Other* tables use letter keys, like they were raised by WOLVES. Let's put them # at the end of the import queue. for TABLE in WEATHER_1 WEATHER_2 LOCATION_TYPE RAMP_INTERSECTION SIDE_OF_HWY \ PRIMARY_COLL_FACTOR PCF_CODE_OF_VIOL PCF_VIOL_CATEGORY TYPE_OF_COLLISION MVIW \ PED_ACTION ROAD_SURFACE ROAD_COND_1 ROAD_COND_2 LIGHTING CONTROL_DEVICE \ STWD_VEHTYPE_AT_FAULT CHP_VEHTYPE_AT_FAULT PRIMARY_RAMP SECONDARY_RAMP do sqlite-utils create-table records.db $TABLE key text name text --pk=key sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE key sqlite-utils create-index records.db collisions $TABLE done ``` You can see the full code and import script here: https://github.com/radical-bike-lobby/switrs-db If I run this code and then hit the CSV export link in the Datasette interface (the simple link or the ""advanced"" dialog), export fails after a small number of CSV rows are written. I am not seeing any detailed error messages but this appears in the logging output: ``` INFO: 127.0.0.1:57885 - ""GET /records/collisions.csv?_facet=PRIMARY_RD&PRIMARY_RD=ASHBY+AV&_labels=on&_size=max HTTP/1.1"" 200 OK Caught this error: ``` (No other output follows `error:` other than a blank line.) I've stared at the rows directly after the error occurs and can't yet see what is causing the problem. 
I'm going to set up a development environment and see if I get any more detailed error output, and then stare more at some problematic lines to see if I can get a simple reproduction.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2214/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 395236066,MDU6SXNzdWUzOTUyMzYwNjY=,393,"CSV export in ""Advanced export"" pane doesn't respect query",1727065,closed,0,,,6,2019-01-02T12:39:41Z,2021-06-17T18:14:24Z,2019-01-03T02:44:10Z,NONE,,"It looks like there's an inconsistency when exporting to CSV via the the web interface. Say I'm looking at [songs released in 1989](https://fivethirtyeight.datasettes.com/fivethirtyeight-c300360/classic-rock%2Fclassic-rock-song-list?Release+Year__exact=1989) in the `classic-rock/classic-rock-song-list` table from the Five Thirty Eight data. The JSON and CSV export links at the top of the page both give me filtered data using `Release+Year__exact=1989` in the URL. In the `Advanced export` tab, though, the CSV option gives me the whole data set, while the JSON options preserve the query. It may be that this is intended behaviour related to the streaming CSV stuff [discussed here](https://github.com/simonw/datasette/issues/266), but if that's the case then I think it should be a little clearer.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/393/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 2023057255,I_kwDOBm6k_c54lWdn,2212,Can't filter with numbers,605070,open,0,,,0,2023-12-04T05:26:29Z,2023-12-04T05:26:29Z,,NONE,,"I have a schema that uses numbers for a column (actually it's a boolean 1 or 0 but SQLite doesn't have Boolean). I can't seem to get the facet to work or even filtering on this column. My guess is that Datasette is ""stringifying"" the number and it's not matching? Example: https://debian-sqlelf.fly.dev/debian/elf_symbols?_sort_desc=name&_facet=exported&exported=0",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2212/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 826064552,MDU6SXNzdWU4MjYwNjQ1NTI=,1253,"Capture ""Ctrl + Enter"" or ""⌘ + Enter"" to send SQL query?",9308268,open,0,,,1,2021-03-09T15:00:50Z,2021-10-30T16:00:42Z,,NONE,,"It appears as though ""Shift + Enter"" triggers the form submit action to submit SQL, but could that action be bound to the ""Ctrl + Enter"" or ""⌘ + Enter"" action? I feel like that pattern already exists in a number of similar tools and could improve usability of the editor.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1253/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 569317377,MDU6SXNzdWU1NjkzMTczNzc=,681,Cashe-header missing in http-response,2181410,closed,0,,,4,2020-02-22T10:50:45Z,2020-02-24T20:53:57Z,2020-02-24T20:53:56Z,NONE,,"Hi Simon. I need some help with both understanding and adding http-headers. 
If I call datasette on localhost with --config default_cache_ttl:120 and --cors, I only get the following response-headers: access-control-allow-origin: * content-type: text/html; charset=utf-8 date: Sat, 22 Feb 2020 10:32:15 GMT referrer-policy: no-referrer server: uvicorn transfer-encoding: chunked Cors works, but no caching-header is set? Same thing happens if I use the command in a Dockerfile and run datasette with docker. Second, how can one add headers to uvicorn? I've tried to add uvicorn commands to the Dockerfile, before the final datasette command, but it doesn't work. Is there any way to add headers to the uvicorn.run() command i datasette? I particular, I would like to add some of the missing security-headers: Thank you for a great product!",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/681/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 781262510,MDU6SXNzdWU3ODEyNjI1MTA=,1181,"Certain database names results in 404: ""Database not found: None""",1470389,closed,0,,6346396,4,2021-01-07T12:01:16Z,2021-12-21T18:25:15Z,2021-01-25T05:13:19Z,NONE,,"I have a file named `test-database (1).sqlite`. When requesting the home route `/`, I see datasette is able to read it correctly: However, if I click any of the links, datasette replies with: `Error 404 Database not found: None` It seems the hash is crucial, as renaming the file to `database (1).sqlite` makes the error go away. This lines checks for a single dash: https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/views/base.py#L184 ``` $ datasette test-database\ \(1\).sqlite INFO: Started server process [68314] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) INFO: 127.0.0.1:54043 - ""GET /favicon.ico HTTP/1.1"" 200 OK INFO: 127.0.0.1:54043 - ""GET / HTTP/1.1"" 200 OK ... 
INFO: 127.0.0.1:54044 - ""GET /favicon.ico HTTP/1.1"" 200 OK INFO: 127.0.0.1:54044 - ""GET /test-database (1) HTTP/1.1"" 404 Not Found ``` Version: ``` $ datasette --version datasette, version 0.53 ``` ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1181/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508100844,MDU6SXNzdWU1MDgxMDA4NDQ=,598,Character encoding bug with CSV export,46313,closed,0,,,1,2019-10-16T21:09:30Z,2021-06-17T18:13:20Z,2019-10-18T22:52:21Z,NONE,,"I was just poking around, and at [this URL](https://sql-murder-mystery.datasette.io/sql-murder-mystery/crime_scene_report.csv?_stream=on&type=arson&_size=max), I encountered this error: ``` 'latin-1' codec can't encode character '\u2019' in position 27: ordinal not in range(256) ``` ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/598/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1157182254,I_kwDOBm6k_c5E-TMu,1646,Configuration directory mode does not pick up other file extensions than .db,15640196,closed,0,,,3,2022-03-02T13:15:23Z,2022-10-07T23:06:17Z,2022-10-07T23:03:35Z,NONE,,"Hello, I've been trying to run Datasette with the [configuration directory mode](https://docs.datasette.io/en/stable/settings.html#configuration-directory-mode) with a structure such as this one: ```plain some-directory/ example.sqlite3 another-example.db one-more.custom [...] ``` (In my scenario I can't just change the filename extension without other problems arising) Now databases with the `.sqlite3` or the custom filename extension are ignored by Datasette in this case. I'm aware that the docs state that a `.db` extension is required, but I was wondering if there is a reason for restricting this or any workaround available? When I run `datasette example.sqlite3` or `datasette one-more.custom` the databases are served by Datasette without a problem. ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1646/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 473307794,MDU6SXNzdWU0NzMzMDc3OTQ=,565,Conflict between datasette and uvicorn click versions,440503,closed,0,,,1,2019-07-26T11:13:40Z,2020-10-02T00:09:55Z,2020-10-02T00:09:55Z,NONE,,"Hello Datasette is awesome thanks so much! I not very familiar with Python but I think there is a problem with datasette docker builds I keep getting this error ``` ERROR: uvicorn 0.8.4 has requirement click==7.*, but you'll have click 6.0 which is incompatible. ERROR: datasette 0.29.2 has requirement click~=7.0, but you'll have click 6.0 which is incompatible. 
``` The full log from the docker build is here - https://gist.github.com/jonheslop/e01cd322e761cfaf34f0cb83f86411b0 Just in case it’s helpful, this is my setup - https://github.com/dotwatcher/dotwatcher-data",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/565/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1387712501,I_kwDOBm6k_c5Sts_1,1824,Convert &_hide_sql=1 to #_hide_sql,562352,open,0,,,1,2022-09-27T12:53:31Z,2022-10-05T12:56:27Z,,NONE,,"Hiding the SQL textarea with `&_hide_sql=1` enforces a page reload, which can take several seconds and use server resources (which is annoying for big databases or complex queries). It could probably be done with a few lines of JavaScript (I'm going to see if I can do that).",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1824/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 699947574,MDU6SXNzdWU2OTk5NDc1NzQ=,963,Currently selected array facets are not correctly persisted through hidden form fields,649467,closed,0,,5818042,1,2020-09-12T01:49:17Z,2020-09-12T21:54:29Z,2020-09-12T21:54:09Z,NONE,,"Faceted search uses JSON array elements as facets rather than the arrays. However, if a search is ""Apply""ed (using the Apply button), the array itself rather than its elements is used. To reproduce: https://latest.datasette.io/fixtures/facetable?_sort=pk&_facet=created&_facet=tags&_facet_array=tags Press ""Apply"", which might be done when removing a filter. Notice that the ""tags"" facet values are now arrays, not array elements. It appears the ""&_facet_array=tags"" element of the query string is dropped.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/963/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1221849746,I_kwDOBm6k_c5I0_KS,1732,Custom page variables aren't decoded,52649,open,0,,,2,2022-04-30T14:55:46Z,2022-05-03T01:50:45Z,,NONE,,"I have a page `templates/filer/{filer_id}.html`. It uses `filer_id` in a `sql()` call to fetch data. With 0.61.1 this no longer works because spaces in IDs aren't preserved. Instead, the escaped version is passed into the template and the id isn't present in my db. 
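For illustration, a minimal sketch (with a hypothetical percent-encoded value) of the decoding I would expect to happen before the variable reaches the template:
```python
# Sketch: decode an escaped path component back to the raw ID.
from urllib.parse import unquote

escaped = ""A%20B%20C""  # hypothetical filer_id as captured from the URL
assert unquote(escaped) == ""A B C""
```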
Datasette should unescape the URL component before passing it into the template.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1732/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1075893249,I_kwDOBm6k_c5AINQB,1545,Custom pages don't work on Windows,559711,closed,0,,,3,2021-12-09T18:53:05Z,2022-02-03T02:08:31Z,2022-02-03T01:58:35Z,NONE,,"It seems that custom pages don't work when put in templates/pages. To reproduce on datasette version 0.59.4, using PowerShell on Windows 10 with Python 3.10.0: mkdir -p templates/pages echo ""hello world"" >> templates/pages/about.html Start datasette: datasette --template-dir templates/ Navigate to http://127.0.0.1:8001/about and receive: Error 404: Database not found: about ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1545/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 813899472,MDU6SXNzdWU4MTM4OTk0NzI=,1238,Custom pages don't work with base_url setting,79913,closed,0,,,9,2021-02-22T21:58:58Z,2021-06-05T18:59:55Z,2021-06-05T18:59:55Z,NONE,,"It seems that custom pages aren't routing properly when the `base_url` setting is used. To reproduce, with Datasette 0.55: Create a `templates/pages/custom.html` with some text. ``` mkdir -p templates/pages/ echo ""Hello, world!"" > templates/pages/custom.html ``` Start Datasette. ``` datasette --template-dir templates/ ``` Visit http://localhost:8001/custom and see ""Hello, world!"". Start Datasette with a `base_url`. ``` datasette --template-dir templates/ --setting base_url /prefix/ ``` Visit http://localhost:8001/prefix/custom and see a ""Database not found: custom"" 404. Note that like all routes, http://localhost:8001/custom still works when run with `base_url`. ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1238/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 791381623,MDU6SXNzdWU3OTEzODE2MjM=,1197,DB size limit for publishing with Heroku,1186275,closed,0,,,1,2021-01-21T18:08:43Z,2021-01-24T20:53:44Z,2021-01-24T20:53:44Z,NONE,,"Hello, I tried searching for this, but can't seem to get a great answer: Does anybody know the size limit for databases deploying to Heroku? The files I'm working with are pretty large, but I might be able to pare them down if I have a limit in mind. I'm getting the following error when running `datasette heroku publish`: `RangeError [ERR_INVALID_OPT_VALUE]: The value ""14504095744"" is invalid for option ""size""`",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1197/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1571207083,I_kwDOBm6k_c5dprer,2016,"Database metadata fields like description are not available in the index page template's context",9993,open,0,,3268330,1,2023-02-05T02:25:53Z,2023-02-05T22:56:43Z,,NONE,,"When looping through `databases` in the index.html template, I'd like to print the description of each database alongside its name. But it appears that isn't passed in from the view, unless I'm missing it. 
It would be great to have that.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2016/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 518506242,MDU6SXNzdWU1MTg1MDYyNDI=,616,Datasette FTS detection bug,49656826,closed,0,,,2,2019-11-06T14:25:47Z,2019-11-08T15:31:33Z,2019-11-08T02:06:56Z,NONE,,"I'm having trouble with datasette. I deployed EXACTLY the same project on two different apps on Heroku. Both have databases (not all of them) with FTS activated, but only one app detects it and works fine. You can take a look here: With search: http://teste-templates.herokuapp.com/amazonia_protege/car Without search: http://bases.vortex.media/amazonia_protege/car ![teste](https://user-images.githubusercontent.com/49656826/68306310-11a80e00-0088-11ea-8d1c-db3bd3375518.jpg) ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/616/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1899310542,I_kwDOBm6k_c5xNS3O,2187,Datasette for serving JSON only,19705106,open,0,,,0,2023-09-16T05:48:29Z,2023-09-16T05:48:29Z,,NONE,,"Hi, is there any way to use datasette for serving JSON only, without displaying the webpage? I've tried searching the documentation but didn't find any information",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2187/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1553615704,I_kwDOBm6k_c5cmktY,2001,Datasette is not compatible with SQLite's strict quoting compilation option,406380,open,0,,,4,2023-01-23T19:10:07Z,2023-01-25T04:59:58Z,,NONE,,"I have linked Python 3.11 on macOS against a recent SQLite that was compiled using `-DSQLITE_DQS=0`. This option disables interpretation of double-quoted identifiers as string literals, described in the SQLite docs as a ""MySQL 3.x misfeature"". See https://www.sqlite.org/quirks.html#dblquote for background. Datasette uses the double-quote syntax in a number of key places, and is thus completely broken in this environment. My experience was to `pip install datasette`, then run `datasette serve -I my-data.db`. When I visit `http://127.0.0.1:8001` I get a 500 response. The error: `sqlite3.OperationalError: no such column: geometry_columns` The responsible SQL: `'select 1 from sqlite_master where tbl_name = ""geometry_columns""'` I then installed datasette from GitHub master in development mode and changed the offending SQL to use correct quotes: `""select 1 from sqlite_master where tbl_name = 'geometry_columns'""`. 
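As an aside, a minimal sketch that reproduces the quoting difference (Python's sqlite3 against an in-memory database; a default build accepts both statements, while a `-DSQLITE_DQS=0` build rejects the last one):
```python
import sqlite3

conn = sqlite3.connect("":memory:"")
conn.execute(""CREATE TABLE t (name TEXT)"")
# A single-quoted string literal works under both builds:
conn.execute(""SELECT 1 FROM sqlite_master WHERE tbl_name = 't'"")
# ""t"" names no column here, so under the default DQS misfeature it
# silently degrades to a string literal; a strict build raises instead:
conn.execute('SELECT 1 FROM sqlite_master WHERE tbl_name = ""t""')
```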
With this change, I get a little further, but have the same problem with the first table name in my database (in my case, ""Meta""): ``` OperationalError: no such column: Meta Traceback (most recent call last): File ""/Users/gwk/external/datasette/datasette/app.py"", line 1522, in route_path response = await view(request, send) ^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/views/base.py"", line 151, in view return await self.dispatch_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/views/base.py"", line 105, in dispatch_request response = await handler(request) ^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/views/index.py"", line 70, in get ""fts_table"": await db.fts_table(table), ^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 363, in fts_table return await self.execute_fn(lambda conn: detect_fts(conn, table)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/py/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/thread.py"", line 58, in run result = self.fn(*self.args, **self.kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 211, in in_thread return fn(conn) ^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 363, in <lambda> return await self.execute_fn(lambda conn: detect_fts(conn, table)) ^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/utils/__init__.py"", line 588, in detect_fts rows = conn.execute(detect_fts_sql(table)).fetchall() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ sqlite3.OperationalError: no such column: Meta INFO: 127.0.0.1:50258 - ""GET / HTTP/1.1"" 500 Internal Server Error ``` I will try to continue playing with this, but I also hope that the datasette developers will enable this mode in a test environment, as I am unlikely to be able to exercise all of the SQL in the codebase, or make a pull request very soon. Note that the DQS compile-time option can be overridden at runtime with calls to the C API: ``` sqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DDL, 0, (void*)0); sqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DML, 0, (void*)0); ``` As far as I can tell, `sqlite3_db_config` is not exposed in Python, but perhaps we could figure out how to invoke it using `ctypes`. ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2001/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 925406964,MDU6SXNzdWU5MjU0MDY5NjQ=,1382,Datasette with Glitch - is it possible to use CSV with ISO-8859-1 encoding?,23701514,closed,0,,,1,2021-06-19T14:37:20Z,2021-06-20T00:21:02Z,2021-06-20T00:20:06Z,NONE,,"Hi! I used Remix on Glitch to create a project and uploaded a CSV, but it's a CSV with ISO-8859-1 encoding (https://en.wikipedia.org/wiki/ISO/IEC_8859-1). Is it possible for me to change the encoding to correctly visualize the data? 
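One possible workaround (a sketch; `Emendas.csv` is a hypothetical filename matching the example link below) would be to re-encode the file to UTF-8 before uploading it:
```python
# Sketch: read the ISO-8859-1 file and write it back out as UTF-8.
with open(""Emendas.csv"", encoding=""iso-8859-1"") as src:
    data = src.read()
with open(""Emendas-utf8.csv"", ""w"", encoding=""utf-8"") as dst:
    dst.write(data)
```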
Example: https://emphasized-carpal-pillow.glitch.me/data/Emendas Best",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1382/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1433576351,I_kwDOBm6k_c5VcqOf,1880,"Datasette with many and large databases > Memory use",525934,open,0,,,4,2022-11-02T18:10:27Z,2022-11-16T17:50:29Z,,NONE,,"> Datasette maintains an in-memory SQLite database with details of the databases, tables and columns for all of the attached databases. The above is from the docs ^. There are two problems here - the number of datasette ""instances"" in a single server/VM and the size of the database itself. We want the **opposite** of in-memory, including what happens on SQLite - documented in https://www.sqlite.org/inmemorydb.html From the context in https://github.com/simonw/datasette/issues/1150 - does it mean datasette is memory-bound to the size of the dataset - which might be a deal-breaker for many large-scale use cases? In an extreme case - let's say a single server had 100 SQLite databases, which would enable 100 ""instances"" of datasette to run, one per client (e.g. in a SaaS multi-tenant environment). How could we achieve all these goals: 1. Allow any _one_ of these 100 databases to grow to say 2TB in size 2. Have one datasette instance, which connects to 1 of the 100 instances, based on incoming credentials/tenant ID 3. Minimize memory use entirely - both by datasette and SQLite, such that almost all operations are executed in real-time on-disk with little to no memory consumption per-tenant, or per-database. Any ideas appreciated - we're looking to use this in a SaaS type of setting - many instances, single server. @simonw great work on datasette, in general! Possibly related to https://github.com/simonw/datasette/issues/1480 but we don't want to use any kind of serverless infra - this is a long-running VM/server.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1880/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 735852274,MDU6SXNzdWU3MzU4NTIyNzQ=,1082,DigitalOcean buildpack memory errors for large sqlite db?,39538958,open,0,,,3,2020-11-04T06:35:32Z,2020-11-04T19:35:44Z,,NONE,,"1. Have a SQLite db stored in Dropbox 2. Previously tried the Digital Ocean buildpack minimal approach (e.g. Procfile, requirements.txt, bin/post_compile) 3. bin/post_compile with wget from Dropbox 4. download of the large SQLite db is successful 5. 
the log reveals that when building the Docker container, Digital Ocean runs out of memory for a 5GB+ SQLite db but works fine for a 2GB+ SQLite db",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1082/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1955676270,I_kwDOBm6k_c50kUBu,2201,Discord invite link is invalid,11708906,open,0,,,0,2023-10-21T21:50:05Z,2023-10-21T21:50:05Z,,NONE,,"https://datasette.io/discord leads to https://discord.com/invite/ktd74dm5mw and returns the following: ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2201/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1910269679,I_kwDOBm6k_c5x3Gbv,2196,Discord invite link returns 401,1892194,closed,0,,,2,2023-09-24T15:16:54Z,2023-10-13T00:07:08Z,2023-10-12T21:54:54Z,NONE,,"I found the link to the datasette discord channel via [this query](https://github.com/search?q=repo%3Asimonw%2Fdatasette%20discord&type=code). The following video should be self-explanatory: https://github.com/simonw/datasette/assets/1892194/8cd33e88-bcaa-41f3-9818-ab4d589c3f02 Link for reference: https://discord.com/invite/ktd74dm5mw",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2196/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1577548579,I_kwDOBm6k_c5eB3sj,2021,Docker images for 1.0 alphas?,1563881,open,0,,,0,2023-02-09T09:35:52Z,2023-02-09T09:35:52Z,,NONE,,"Hi, would you consider putting 1.0 alpha images on Docker Hub? (Also, how usable are the alphas?)",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2021/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 863884805,MDU6SXNzdWU4NjM4ODQ4MDU=,1304,"Document how to send multiple values for ""Named parameters"" ",9308268,open,0,,,4,2021-04-21T13:19:06Z,2021-12-08T03:23:14Z,,NONE,,"https://docs.datasette.io/en/stable/sql_queries.html#named-parameters I thought that I had seen an example of how to do this (example below), but I can't seem to find it. ```sql select * from bib where bib.bib_record_num in (1008088,1008092) ``` ```sql select * from bib where bib.bib_record_num in (:bib_record_numbers) ``` ![image](https://user-images.githubusercontent.com/9308268/115558839-2333a480-a281-11eb-85e6-ce3bada79140.png) https://ilsweb.cincinnatilibrary.org/collection-analysis/current_collection-204d100?sql=select%0D%0A++*%0D%0Afrom%0D%0A++bib%0D%0Awhere%0D%0A++bib.bib_record_num+in+%28%3Abib_record_numbers%29&bib_record_numbers=1008088%2C1008092 Or, maybe this isn't a fully supported feature. 
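If it isn't, a sketch of the only workaround I can think of (not datasette-specific, and untested here): pass one comma-separated :bib_record_numbers string and match each row against it, since a single named parameter can't expand into an IN (...) list:
```sql
select * from bib
where ',' || :bib_record_numbers || ',' like '%,' || bib.bib_record_num || ',%'
```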
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1304/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1575880841,I_kwDOBm6k_c5d7giJ,2020,"Documentation refers to ""off"" setting; doesn't seem to work, ""false"" does",1350673,open,0,,,0,2023-02-08T10:38:10Z,2023-02-08T10:38:10Z,,NONE,,"https://docs.datasette.io/en/stable/settings.html#suggest-facets, among others, suggests using ""off"" to disable the setting; however, this doesn't appear to work in the JSON config files, where it apparently needs to be a ""JSON boolean"" and have the values ""true"" or ""false"". Perhaps the Python code is more flexible?...but either way, the documentation probably should mention it.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2020/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 459397625,MDU6SXNzdWU0NTkzOTc2MjU=,514,Documentation with recommendations on running Datasette in production without using Docker,7936571,closed,0,,5971510,27,2019-06-21T22:48:12Z,2020-10-08T23:55:53Z,2020-10-08T23:33:05Z,NONE,,"I've got some SQLite databases too big to push to Heroku or the other services with built-in support in datasette. So instead I moved my datasette code and databases to a remote server on Kimsufi. In the folder containing the SQLite databases I run the following code. `nohup datasette serve -h 0.0.0.0 *.db --cors --port 8000 --metadata metadata.json > output.log 2>&1 &`. When I go to `http://my-remote-server.com:8000`, the site loads. But I know this is not a good long-term solution to running datasette on this server. What is the ""correct"" way to have this site run, preferably on server port 80?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/514/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1375792876,I_kwDOBm6k_c5SAO7s,1811,"Drop-down menu with ""REGEXP"" choice",562352,open,0,,,0,2022-09-16T11:06:18Z,2022-09-16T15:30:31Z,,NONE,,"Drop-down menu below could add ""REGEXP"" choice when REGEXP sqlite extension is installed and used ![image](https://user-images.githubusercontent.com/562352/190675352-810fbdca-0827-4034-8b9f-fd67d5c35afb.png) Not sure. Close the issue if you don't find it relevant.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1811/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 617323873,MDU6SXNzdWU2MTczMjM4NzM=,766,Enable wildcard-searches by default,2181410,open,0,,,2,2020-05-13T10:14:48Z,2021-03-05T16:35:21Z,,NONE,,"Hi Simon. It seems that datasette currently has wildcard-searches disabled by default (along with the boolean search-options, NEAR-queries and more, and despite the docs). If I try out the search-url provided in the [docs](https://datasette.readthedocs.io/en/stable/full_text_search.html#the-table-page-and-table-view-api) (https://fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort), it does not handle wildcard-searches, and I'm unable to make it work on my datasette-instance. I would argue that wildcard-searches is such a standard query, that it should be enabled by default. 
Requiring ""_searchmode=raw"" when using prefix-searches seems unnecessary. Plus: What happens to non-ascii searches when using ""_searchmode=raw""? Is the ""escape_fts""-function from datasette.utils ignored? Thanks! /Claus",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/766/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1578609658,I_kwDOBm6k_c5eF6v6,2022,Error 500 - not clear the cause,1667631,closed,0,,,1,2023-02-09T20:57:17Z,2023-02-09T21:13:50Z,2023-02-09T21:13:50Z,NONE,,"On the database that I have sent via linkedIn, datasette works great, but the following URL gives a 500 error. http://127.0.0.1:8001/literature/authors_papers?authorId=100550354 The cause of the error is not apparent. Is this expected behaviour? David",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2022/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 747702144,MDU6SXNzdWU3NDc3MDIxNDQ=,1100,Error on OPTIONS request to database,1319404,closed,0,,,2,2020-11-20T18:16:43Z,2020-12-03T00:57:35Z,2020-12-03T00:50:17Z,NONE,,"When I perform an OPTIONS request against a database or table datasette fails with an internal error. All these tests result in the traceback below. ``` curl -XOPTIONS http://127.0.0.1:8001/test-db curl -XOPTIONS http://127.0.0.1:8001/test-db/table1 curl -XOPTIONS http://127.0.0.1:8001/test-db/table1\?_search\=test ``` ``` Traceback (most recent call last): File ""[path-to-python]/site-packages/datasette/app.py"", line 1033, in route_path response = await view(request, send) File ""[path-to-python]/site-packages/datasette/views/base.py"", line 146, in view request, **request.scope[""url_route""][""kwargs""] File ""[path-to-python]/site-packages/datasette/views/base.py"", line 118, in dispatch_request return await handler(request, *args, **kwargs) TypeError: object Response can't be used in 'await' expression ``` Making the `options` function in the `DataView` class async fixed it for me. ```python async def options(self, request, *args, **kwargs): r = Response.text(""ok"") if self.ds.cors: r.headers[""Access-Control-Allow-Origin""] = ""*"" return r ```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1100/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 605806386,MDU6SXNzdWU2MDU4MDYzODY=,735,"Error when I click on ""View and edit SQL""",30607,closed,0,,,2,2020-04-23T19:31:32Z,2020-04-28T06:10:20Z,2020-04-27T19:00:30Z,NONE,,"Hi, when I do it [here](https://my-database.now.sh/commissioniComunePalermo/youtube), I have ""unrecognized token: ""["""" error. Is it normal? Thank you",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/735/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1579695809,I_kwDOBm6k_c5eKD7B,2023,Error: Invalid setting 'hash_urls' in settings.json in 0.64.1,80409402,closed,0,,,2,2023-02-10T13:35:01Z,2023-02-10T15:40:00Z,2023-02-10T15:39:59Z,NONE,,"On a Debian machine, using datasette 0.64.1 installed with `pip3`, I am getting a `datasette[114272]: Error: Invalid setting 'hash_urls' in settings.json` in `journalctl -xe`. 
The same settings work on 0.54.1 on another Debian server. This is my `settings.json`:
```json
{
    ""default_page_size"": 200,
    ""max_returned_rows"": 8000,
    ""num_sql_threads"": 3,
    ""sql_time_limit_ms"": 1000,
    ""default_facet_size"": 30,
    ""facet_time_limit_ms"": 200,
    ""facet_suggest_time_limit_ms"": 50,
    ""hash_urls"": false,
    ""allow_facet"": true,
    ""allow_download"": true,
    ""suggest_facets"": true,
    ""default_cache_ttl"": 5,
    ""default_cache_ttl_hashed"": 31536000,
    ""cache_size_kb"": 0,
    ""allow_csv_stream"": true,
    ""max_csv_mb"": 100,
    ""truncate_cells_html"": 2048,
    ""force_https_urls"": false,
    ""template_debug"": false,
    ""base_url"": ""/pclim/db/""
}
```
This looks OK to me. Would you have any ideas?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2023/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1435917503,I_kwDOBm6k_c5Vlly_,1883,Errors when using table filters behind a proxy,31312775,closed,0,,,13,2022-11-04T11:18:47Z,2022-11-11T09:20:22Z,2022-11-11T06:54:58Z,NONE,,"Using datasette==0.63, table filters do not respect the `base_url` setting as described [here](https://docs.datasette.io/en/stable/deploying.html#running-datasette-behind-a-proxy). To reproduce, go to: https://datasette-apache-proxy-demo.datasette.io/prefix/fixtures/binary_data Then use the table filter buttons. The `/prefix/` is dropped, resulting in URL not found: https://datasette-apache-proxy-demo.datasette.io/fixtures/binary_data?_sort=rowid&rowid__exact=1 ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1883/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 1}",,completed 556814876,MDU6SXNzdWU1NTY4MTQ4NzY=,662,Escape_fts5_query-hookimplementation does not work with queries to standard tables,2181410,closed,0,,,5,2020-01-29T11:56:03Z,2020-01-30T00:30:20Z,2020-01-30T00:30:19Z,NONE,,"Hi Simon! Thank you for adding the escape function, but it does not work on my datasette installation (0.33). I've added the following file to my datasette dir, /plugins/sql_functions.py:
```python
from datasette import hookimpl

def escape_fts_query(query):
    bits = query.split()
    return ' '.join('""{}""'.format(bit.replace('""', '')) for bit in bits)

@hookimpl
def prepare_connection(conn):
    conn.create_function(""escape_fts_query"", 1, escape_fts_query)
```
It has no effect on the standard queries to the tables though, as they still produce errors when including any characters like '-', '/', '+' or '?'. Does the function only work when using custom queries, where I can include the escape_fts function explicitly in the SQL query? PS. I'm calling datasette with --plugins=plugins, and my other plugins work just fine. PPS. The fts5 virtual table is created with 'sqlite3' like so:
```sql
CREATE VIRTUAL TABLE ""cases_fts"" USING FTS5(
    title,
    subtitle,
    resume,
    suggestion,
    presentation,
    detail = full,
    content_rowid = 'id',
    content = 'cases',
    tokenize='unicode61', 'remove_diacritics 2', 'tokenchars ""-_""'
);
```
Thanks! 
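(For context, a sketch of how I understand the custom function would be used in a custom SQL query, with the table names from above:)
```sql
select cases.* from cases
join cases_fts on cases_fts.rowid = cases.id
where cases_fts match escape_fts_query(:query)
```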
_Originally posted by @clausjuhl in https://github.com/simonw/datasette/issues/651#issuecomment-579675357_",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/662/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 322283067,MDU6SXNzdWUzMjIyODMwNjc=,254,Escaping named parameters in canned queries,247131,closed,0,,,4,2018-05-11T12:43:30Z,2020-05-10T14:54:14Z,2020-05-10T14:54:13Z,NONE,,"Thank you very much for this project. I have created some canned queries, but some of the filters include a colon, e.g. ""com.ubuntu.cloud:server:18.04:amd64"". When saved, these colons are parsed as named parameters. Is there a way to escape colons in a canned query?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/254/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 453131917,MDU6SXNzdWU0NTMxMzE5MTc=,502,Exporting sqlite database(s)?,7936571,closed,0,,,3,2019-06-06T16:39:53Z,2021-04-03T05:16:54Z,2019-06-11T18:50:42Z,NONE,,"I'm working on datasette from one computer. But if I want to work on it from another computer and want to copy the SQLite database(s) already on the Heroku datasette instance, how do I copy the database(s) to the second computer so that I can then update them and push them back online via datasette's command line code that pushes code to Heroku?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 340396247,MDU6SXNzdWUzNDAzOTYyNDc=,339,Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way,12617395,closed,0,,,4,2018-07-11T20:38:06Z,2022-03-21T22:22:40Z,2022-03-21T22:22:34Z,NONE,,"Is it possible to configure the sql_time_limit_ms beyond 60 seconds? It seems queries are still timing out at 60 seconds when sql_time_limit_ms is set to 180000. We have a very large data set and often encounter timeouts when testing new queries from the datasette UI. We are optimizing our database as much as we can, but still may require more than 60 seconds for complex queries.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/339/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 826700095,MDU6SXNzdWU4MjY3MDAwOTU=,1255,Facets timing out but work when filtering,1219001,open,0,,,2,2021-03-09T22:01:39Z,2021-04-02T20:50:08Z,,NONE,,"System info: Windows 10, Datasette 0.55 installed via pip, Python 3.8.5 in a conda environment. I'm getting the message `These facets timed out` on any faceting operation. However, when I apply a filter, the facets appear in the filtered view. The error returns when the filter is removed. My data only has 38,450 rows.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1255/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 849512840,MDU6SXNzdWU4NDk1MTI4NDA=,1288,Facets: show counts for null,1111743,open,0,,,0,2021-04-02T22:33:44Z,2021-04-02T22:33:44Z,,NONE,,"Hi, Thank you for Datasette, and for being a fan of SQLite! Not all rows will always contain data in every column. 
So when using a facet on a column where some records have data and others don't, you don't get an accurate count of the results. Please consider also counting and showing null records with facets.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1288/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 982803408,MDU6SXNzdWU5ODI4MDM0MDg=,1454,Feature Request: Publish to IPFS,1560788,open,0,,,0,2021-08-30T13:36:18Z,2021-08-30T13:36:18Z,,NONE,,"Hello, I am a huge fan of this being used for exploring data. I think it has a lot of flexibility not found in other tools. I'm not sure if what I'm asking for is possible: Can this be extended to publish to IPFS? IPFS is an attractive hosting option for decentralized journalism. Food for thought ~",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1454/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 334592281,MDExOlB1bGxSZXF1ZXN0MTk2NTI2ODYx,322,Feature/in operator,2691848,closed,0,,,0,2018-06-21T17:41:51Z,2018-06-21T17:45:25Z,2018-06-21T17:45:25Z,NONE,simonw/datasette/pulls/322,,107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/322/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1174655187,I_kwDOBm6k_c5GA9DT,1671,Filters fail to work correctly against calculated numeric columns returned by SQL views because type affinity rules do not apply,9308268,open,0,,,8,2022-03-20T19:17:24Z,2022-03-22T17:43:12Z,,NONE,,"I found a strange behavior, and I'm not sure if it's related to views and boolean values perhaps, or if there's something else weird going on here, but I'll provide an example that may help show what I'm seeing happen. 
```bash #!/bin/bash echo ""\""id\"",\""expiration_date\"" 0,2018-01-04 1,2019-01-05 2,2020-01-06 3,2021-01-07 4,2022-01-08 5,2023-01-09 6,2024-01-10 7,2025-01-11 8,2026-01-12 9,2027-01-13 "" > test.csv csvs-to-sqlite test.csv test.db sqlite-utils create-view --replace test.db test_view ""select id, expiration_date, case when julianday('NOW') >= julianday(expiration_date) then 1 else 0 end as has_expired FROM test"" ``` ```bash datasette test.db ``` ![image](https://user-images.githubusercontent.com/9308268/159178745-9c6152f7-eac6-4bf9-bef5-a2d63d3ee13f.png) ![image](https://user-images.githubusercontent.com/9308268/159178824-c8952137-270c-42a4-ad1c-f6ad2c51e499.png) ![image](https://user-images.githubusercontent.com/9308268/159178877-23e00b36-443a-43ef-83e5-e0bdddd3fdcd.png) ![image](https://user-images.githubusercontent.com/9308268/159178918-65922cc7-2514-4735-a72d-4904b99976d4.png) Thanks again and let me know if you want me to provide anything else!",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1671/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 831163537,MDExOlB1bGxSZXF1ZXN0NTkyNTQ4MTAz,1260,Fix: code quality issues,25361949,closed,0,,,2,2021-03-14T13:56:10Z,2021-03-29T00:22:41Z,2021-03-29T00:22:41Z,NONE,simonw/datasette/pulls/1260,"### Description Hi :wave: I work at [DeepSource](https://deepsource.io), I ran DeepSource analysis on the forked copy of this repo and found some interesting [code quality issues](https://deepsource.io/gh/withshubh/datasette/issues/?category=recommended) in the codebase, opening this PR so you can assess if our platform is right and helpful for you. ### Summary of changes - Replaced ternary syntax with if expression - Removed redundant `None` default - Used `is` to compare type of objects - Iterated dictionary directly - Removed unnecessary lambda expression - Refactored unnecessary `else` / `elif` when `if` block has a `return` statement - Refactored unnecessary `else` / `elif` when `if` block has a `raise` statement - Added .deepsource.toml to continuously analyze and detect code quality issues",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1260/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 787104850,MDU6SXNzdWU3ODcxMDQ4NTA=,1192,Form Plugin for in-depth Datasette Querying,1024355,open,0,,,0,2021-01-15T18:24:50Z,2021-01-15T18:24:50Z,,NONE,,I envision a sort of easy-to-build form plugin that would be able to map a user's inputs to different fields/columns in a Datasette database. ,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1192/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 451513541,MDU6SXNzdWU0NTE1MTM1NDE=,498,Full text search of all tables at once?,7936571,closed,0,,,12,2019-06-03T14:24:43Z,2020-05-30T17:26:02Z,2020-05-30T17:26:02Z,NONE,,"Does datasette have a built-in way, in a browser, to do a full-text search of all columns, in all databases and tables, that have full-text search enabled? 
Is there a plugin that does this?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/498/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1049946823,I_kwDOBm6k_c4-lOrH,1502,"Full-text search: No support for unary ""-"" operator",516827,open,0,,,0,2021-11-10T15:11:19Z,2021-11-10T15:11:19Z,,NONE,,"Reference: https://www.sqlite.org/fts3.html#set_operations_using_the_standard_query_syntax Test: https://fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort+-freedman&_sort=rowid",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 753668177,MDU6SXNzdWU3NTM2NjgxNzc=,1116,GENERATED column support,2789593,closed,0,,,9,2020-11-30T17:33:47Z,2020-11-30T21:29:59Z,2020-11-30T21:29:59Z,NONE,,"I think this is a feature request... perhaps I should just try to contribute it myself, but thought I'd check in case support is planned already. For a table with the following schema, datasette 0.51.1 doesn't pick up the GENERATED columns and the column list only contains `(rowid, body)`. If I edit the SQL and select the generated columns, it will happily show them. At first glance it appears that [`def table_column_details(conn, table):`](https://github.com/simonw/datasette/blob/4777362bf2692bc72b221ec47c3e6216151d1b89/datasette/utils/__init__.py#L575) would have to be refactored to use a different methodology to get the columns, since `PRAGMA table_info(deeds);` returns just `0|body|TEXT|0||0`, so maybe it wouldn't be worth it. ``` CREATE TABLE deeds ( body TEXT, id INT GENERATED ALWAYS AS (json_extract(body, '$.id')) STORED, consideration INT GENERATED ALWAYS AS (json_extract(body, '$.consideration')) STORED ); ```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1116/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 392610803,MDU6SXNzdWUzOTI2MTA4MDM=,391,Google Trends example doesn’t work,229881,closed,0,,,1,2018-12-19T13:51:38Z,2019-01-02T19:45:13Z,2019-01-02T19:45:12Z,NONE,,"https://google-trends.datasettes.com/ I see a Cloudflare error. ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/391/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1266207143,I_kwDOBm6k_c5LeMmn,1755,Gunicorn,1176293,open,0,,,0,2022-06-09T14:18:46Z,2022-06-09T14:18:46Z,,NONE,,"I've read issue #514, which resulted in running Datasette via systemd as the recommended approach. We've also adopted this (for now), but I notice that Uvicorn [says the following](https://www.uvicorn.org/#running-with-gunicorn): > Uvicorn includes a Gunicorn worker class allowing you to run ASGI applications, with all of Uvicorn's performance benefits, while also giving you Gunicorn's fully-featured process management. > > This allows you to increase or decrease the number of worker processes on the fly, restart worker processes gracefully, or perform server upgrades without downtime. > > For production deployments we recommend using gunicorn with the uvicorn worker class. We usually deploy Python applications via Gunicorn for these process management features (e.g. 
`--daemon` and `--pid`). Is this something that would/could work with Datasette as well?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1755/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 453243459,MDU6SXNzdWU0NTMyNDM0NTk=,503,Handle SQLite databases with spaces in their names?,7936571,closed,0,9599,,1,2019-06-06T21:20:59Z,2019-11-04T23:16:30Z,2019-11-04T23:16:30Z,NONE,,"I named my SQLite database ""Government workers"" and published it to Heroku. When I clicked the ""Government workers"" database online, it led to a 404 page: `Database not found: Government%20workers`. I believe this is because the database name has a space.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/503/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 550293770,MDU6SXNzdWU1NTAyOTM3NzA=,658,How do I use the app.css as a style sheet?,49656826,open,0,,,2,2020-01-15T16:27:57Z,2020-02-07T00:29:50Z,,NONE,,"Simon, I'm trying to use the app.css (in the static folder) as a style sheet, but datasette on Heroku simply ignores it! I read everything about customization here and on readthedocs, but still can't get it to work. Is this possible? Thanks!",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/658/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 400511206,MDU6SXNzdWU0MDA1MTEyMDY=,403,How does persistence work?,1794527,closed,0,,,2,2019-01-17T23:41:57Z,2019-01-19T05:47:55Z,2019-01-18T06:51:14Z,NONE,,I was under the impression that now.sh is for stateless microservices. So where are these SQLite databases stored and when do they get created and destroyed?,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/403/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 891969037,MDU6SXNzdWU4OTE5NjkwMzc=,1326,How to limit fields returned from the JSON API?,5268174,closed,0,,,1,2021-05-14T14:27:41Z,2021-05-23T02:55:06Z,2021-05-23T02:55:00Z,NONE,,"Hi, I have quite wide tables, and in many cases only want a subset of the data (to save on network bandwidth). I need to use the JSON API as handling pagination is so much easier, but I can't see a way to select specific columns. Is there a way to do this, or is it a feature request? Thanks!",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1326/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 400229984,MDU6SXNzdWU0MDAyMjk5ODQ=,401,How to pass configuration to plugins?,1055831,closed,0,,,3,2019-01-17T11:20:41Z,2019-01-18T11:48:13Z,2019-01-18T06:49:07Z,NONE,,"Hi, Firstly, thanks for your work on datasette, it is a hugely useful tool! I've been working on a fork [https://github.com/dazzag24/datasette-cluster-map] of datasette-cluster-map to allow the tileserver to be easily switched, primarily because the tiles being served in the current version use localised text for labels and I'd like to have English used for these names instead. 
It uses http://leaflet-extras.github.io/leaflet-providers/preview/ to allow you to simply set the tile provider using a call like so: ``` let tiles = L.tileLayer.provider('Esri.WorldTopoMap'); ``` instead of the current: ``` let tiles = L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', { maxZoom: 19, detectRetina: true, attribution: '© OpenStreetMap contributors' }), ``` However, I've gotten stuck trying to work out how to pass the provider string to the plugin. In the documentation: https://datasette.readthedocs.io/en/stable/plugins.html you discuss configuration of plugins and use an example of passing in which latitude and longitude columns should be used. However, I cannot seem to see anywhere in the current datasette-cluster-map code where these config params are passed in or used. Can you please point me to an example of how to pass configuration from the metadata.json down into a plugin? Once I've overcome this issue I was wondering if you would be interested in taking this change into your version? Many thanks Darren",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/401/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 410384988,MDU6SXNzdWU0MTAzODQ5ODg=,411,How to pass named parameter into spatialite MakePoint() function,1055831,closed,0,,,3,2019-02-14T16:30:22Z,2023-10-25T13:23:04Z,2019-05-05T12:25:04Z,NONE,,"Hi, datasette version: ""0.26.2"" extensions: spatialite: ""4.4.0-RC0"" sqlite version: ""3.22.0"" I have a table of airports with latitude and longitude columns. I've added spatialite (with KNN support). After creating the db using csvs-to-sqlite, I run these commands to set up the spatialite tables: ``` conn.execute('SELECT InitSpatialMetadata(1)') conn.execute(""SELECT AddGeometryColumn('airports', 'point_geom', 4326, 'POINT', 2);"") conn.execute('''UPDATE airports SET point_geom = GeomFromText('POINT('||""longitude""||' '||""latitude""||')',4326);''') conn.execute(""SELECT CreateSpatialIndex('airports', 'point_geom');"") ``` I'm attempting to create a canned query and have this in my metadata.json file: ``` ""find_airports_nearest_to_point"":{ ""sql"":""SELECT a.pos AS rank, b.id, b.name, b.country, b.latitude AS latitude, b.longitude AS longitude, a.distance / 1000.0 AS dist_km FROM KNN AS a JOIN airports AS b ON (b.rowid = a.fid) WHERE f_table_name = \""airports\"" AND ref_geometry = MakePoint( :Long , :Lat ) AND max_items = 10;""} ``` which doesn't seem to perform the templating of the named parameters correctly, and I get no results. I have also tried: ``` MakePoint( || :Long || , || :Lat || ) ``` which returns this error: ``` near ""||"": syntax error ``` However, I cannot seem to find the correct combination of named parameter syntax (:Lat) or sqlite concatenation operator to make it work. Any ideas if using named parameters inside functions is supported? Thanks Darren",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/411/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1590183272,I_kwDOBm6k_c5eyEVo,2027,"How to redirect from ""/"" to a specific db/table",1350673,open,0,,,4,2023-02-18T03:14:01Z,2023-03-08T04:42:22Z,,NONE,,"Using nginx to redirect public IP to the local uvicorn server as 'normal'. 
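For reference, the rough shape of what I've been trying (a sketch, untested; `/my-prefix/`, `mydb` and `mytable` are hypothetical names, and it assumes datasette is started with `--setting base_url /my-prefix/`):
```
location = / {
    return 302 /my-prefix/mydb/mytable;
}
location /my-prefix/ {
    proxy_pass http://127.0.0.1:8001/my-prefix/;
    proxy_set_header Host $host;
}
```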
I can't figure out how to redirect such that '/' results in accessing the one db/table I want to serve; redirecting / to /db/table breaks some of the CSS; fooling with base_url doesn't seem to help. Can someone explain this, if it's possible?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2027/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 408518024,MDU6SXNzdWU0MDg1MTgwMjQ=,410,How to set up a multi-database environment?,30607,closed,0,,,1,2019-02-10T09:39:24Z,2019-04-12T04:42:28Z,2019-04-12T04:42:27Z,NONE,,"Hi, first of all I need to write that Simon Willison and datasette are really great. I probably have a stupid question, but it seems to me that I cannot find the answer in the documentation. I have installed datasette and run it with `datasette mydb.db`, and I can reach it on `http://127.0.0.1:8001`. But how do I work with more than one db? Imagine I have ten SQLite databases that I need to explore/query via datasette; how do I run datasette? Is it possible to create a sort of db index and then run `datasette serve myindex`? Thank you",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/410/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed