id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 334592281,MDExOlB1bGxSZXF1ZXN0MTk2NTI2ODYx,322,Feature/in operator,2691848,closed,0,,,0,2018-06-21T17:41:51Z,2018-06-21T17:45:25Z,2018-06-21T17:45:25Z,NONE,simonw/datasette/pulls/322,,107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/322/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 343728754,MDU6SXNzdWUzNDM3Mjg3NTQ=,346,Logo design for DATASETTE,35750428,closed,0,,,0,2018-07-23T17:40:17Z,2018-08-02T02:31:59Z,2018-08-02T02:31:59Z,NONE,,"Hello :) , I'm a graphic designer, I'm interested in collaborating with open source projects, besides this helps me expand my portfolio. I would like to design a logo for your project. I will be happy to collaborate with you :). ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/346/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 407174173,MDU6SXNzdWU0MDcxNzQxNzM=,408,"Show metadata info (e.g. license, source) on custom SQL query pages",78356,closed,0,,,0,2019-02-06T10:43:34Z,2019-10-14T03:53:22Z,2019-10-14T03:53:22Z,NONE,,"Currently metadata info is not displayed on custom SQL pages. E.g. compare the footer of [this normal table page](https://register-of-members-interests.datasettes.com/regmem-98dc8b7/categories) with the footer [this custom SQL page](https://register-of-members-interests.datasettes.com/regmem-98dc8b7?sql=select+*+from+categories). This is important in order to adhere to attribution license requirements.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/408/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 787104850,MDU6SXNzdWU3ODcxMDQ4NTA=,1192,Form Plugin for in-depth Datasette Querying,1024355,open,0,,,0,2021-01-15T18:24:50Z,2021-01-15T18:24:50Z,,NONE,,I envision a sort of easy-to-build form plugin that would be able to map a user's inputs to different fields/columns in a Datasette database. ,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1192/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 808771690,MDU6SXNzdWU4MDg3NzE2OTA=,1225,More flexible formatting of records with CSS grid,649467,open,0,,,0,2021-02-15T19:28:17Z,2021-02-15T19:28:35Z,,NONE,,"In several applications I've been experimenting with alternate formatting of datasette query results. Lately I've found that CSS grids work very well and seem quite general for formatting rows. In CSS I use grid templates to define the layout of each record and the regions for each field, hiding the fields I don't want. It's pretty flexible and looks good. It's also a great basis for highly responsive layout. I initially thought I'd only use this feature for record detail views, but now I use it for index views as well. However, there are some limitations: * With the existing table templates, it seems that you can change the `display` property on the enclosing `table`, `tbody`, and `tr` to make them be grid-like, but that seems hacky (convert `table` and `tbody` to be `display: block` and `tr` to be `display: grid`). * More significantly, it's very nice to have the column name available when rendering each record to display headers/field labels. The existing templates don't do that, so a custom `_table` template is necessary. * I don't know if any plugins are sensitive to whether data is rendered as a table or not since I'm not completely clear how plugins get their data. * Regardless, you need custom CSS to take full advantage of grids. I don't have a proposal on how to integrate them more deeply. It would be helpful to at least have an official example or test that used a grid layout for records to make sure nothing in datasette breaks with it. ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1225/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 824750134,MDU6SXNzdWU4MjQ3NTAxMzQ=,1251,facet option not appearing when table is big,15836677,open,0,,,0,2021-03-08T16:54:04Z,2021-03-08T16:54:16Z,,NONE,,"I have a big table with more than 500.000 rows. Trying to facet by one of my columns, the options are not available as for the other smaller tables. I have tried to set it in URL as: `&_facet=city_id` to no avail. is there any limit? how can I force the option ""facet"" to appear for big tables? ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1251/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 849512840,MDU6SXNzdWU4NDk1MTI4NDA=,1288,Facets: show counts for null,1111743,open,0,,,0,2021-04-02T22:33:44Z,2021-04-02T22:33:44Z,,NONE,,"Hi, Thank you for Datasette and being a fan of SQLite! Not all rows in a record will always contain data. So when using a facet on a column where some records have data and others don't, you don't get an accurate count of the results. Please consider also counting and showing null records with facets.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1288/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 870946764,MDU6SXNzdWU4NzA5NDY3NjQ=,1312,how to query many-to-many relationship via json API?,5268174,open,0,,,0,2021-04-29T12:09:49Z,2021-04-29T12:09:49Z,,NONE,,"Hi, Firstly thanks for Datasette, it's great! I'm trying to use the JSON API to query data from a Datasette instance. I have a simple 3 table many-to-many relationship, like so: `category` - list of categories `document` - list of documents `document_category` - join table (a category contains many documents, and a document can be a member of multiple categories) the `document_category` table foreign keys to the other two using their respective row_ids. Now I want to return ""all documents within category X"" but I cannot see a way to do this without executing two queries; the first to lookup the row_id of category X, and the second to join `document` with `document_category` where category ID is . I could easily write this in SQL, but this makes programmatic handling of pagination much more difficult (we'd have to dynamically modify the SQL to select the row_id and include the correct where and limit clauses). Is there a way to achieve this using the JSON API? ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1312/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 982803408,MDU6SXNzdWU5ODI4MDM0MDg=,1454,Feature Request: Publish to IPFS,1560788,open,0,,,0,2021-08-30T13:36:18Z,2021-08-30T13:36:18Z,,NONE,,"Hello, I am a huge fan of this being used for exploring data. I think it has a lot of flexibility not found in other tools. I'm not sure if what I'm asking for is possible: Can this be extended to publish to IPFS? IPFS is an attractive hosting option for decentralized journalism. Food for thought ~",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1454/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1049946823,I_kwDOBm6k_c4-lOrH,1502,"Full-text search: No support to unary ""-"" operator",516827,open,0,,,0,2021-11-10T15:11:19Z,2021-11-10T15:11:19Z,,NONE,,"Reference: https://www.sqlite.org/fts3.html#set_operations_using_the_standard_query_syntax Test: https://fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort+-freedman&_sort=rowid",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1091257796,I_kwDOBm6k_c5BC0XE,1584,give error with recursive sql,58088336,open,0,,,0,2021-12-30T18:53:16Z,2021-12-30T18:53:16Z,,NONE,,"I got an error ""near ""WITH"": syntax error"" after I upgraded to version 0.59 from 0.52.4. This error is related to recursive sql. It works great on the previous version but it failed after upgraded. Below is an example of sql: WITH RECURSIVE manager_of(position, super_position) AS (SELECT position, case ifnull(INDIRECT_SUPER_POSITION,'') when '' then super_position else INDIRECT_SUPER_POSITION end as SUPER_POSITION FROM position where super_position<>'SGV000000001' and super_position!='' and position <> super_position),chain_manager_of_position(position, level) AS (SELECT super_position, 1 as level FROM manager_of WHERE super_position!='' and (position=:pos or position in (Select position from employee where employee=:ein)) UNION ALL SELECT super_position, level+1 as level FROM manager_of JOIN chain_manager_of_position USING(position)) SELECT * FROM chain_manager_of_position left join employee using(position) where employee is not NULL order by level limit 1",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1584/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1181037277,I_kwDOBm6k_c5GZTLd,1686,heroku bails if app name specifed in datasette publish is the same as existing app,2115933,open,0,,,0,2022-03-25T17:10:34Z,2022-03-25T17:10:34Z,,NONE,,"Seem that `heroku` does not accept an app overwrite triggered by specifying the app name using `datasette publish`, as below: ``` datasette publish heroku some.db --name ""jazzy-name"" ``` The resulting error has the below traceback: ``` Creating jazzy-name... ! ▸ Name jazzy-name is already taken Traceback (most recent call last): File ""/opt/homebrew/bin/datasette"", line 33, in sys.exit(load_entry_point('datasette==0.60.1', 'console_scripts', 'datasette')()) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/opt/homebrew/Cellar/datasette/0.60.1/libexec/lib/python3.10/site-packages/datasette/publish/heroku.py"", line 127, in heroku create_output = check_output(cmd).decode(""utf8"") File ""/opt/homebrew/Cellar/python@3.10/3.10.2/Frameworks/Python.framework/Versions/3.10/lib/python3.10/subprocess.py"", line 420, in check_output return run(*popenargs, stdout=PIPE, timeout=timeout, check=True, File ""/opt/homebrew/Cellar/python@3.10/3.10.2/Frameworks/Python.framework/Versions/3.10/lib/python3.10/subprocess.py"", line 524, in run raise CalledProcessError(retcode, process.args, subprocess.CalledProcessError: Command '['heroku', 'apps:create', 'jazzy-name', '--json']' returned non-zero exit status 1. ``` It's a solid failsafe, but does `datasette publish` have a way to force an overwrite?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1686/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1247315144,I_kwDOBm6k_c5KWITI,1749,LDAP auth plugin,380241,open,0,,,0,2022-05-25T01:35:12Z,2022-05-25T01:35:12Z,,NONE,,"A [search of the plugins directory](https://datasette.io/plugins?q=ldap) doesn't turn up anything, but is is possible to set up a Datasette app which uses my organisation's LDAP for auth? If not, how much work would it be to write one (I _may_ have some spare cycles on my team to do this, but we haven't written a datasette plugin before).",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1749/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1251700382,I_kwDOBm6k_c5Km26e,1750,Allow `label_column` to specify array of columns,408765,open,0,,,0,2022-05-28T18:45:48Z,2022-05-28T18:45:48Z,,NONE,,"I think it would be great if the Datasette metadata would allow the `label_column` table key to list multiple columns. Something like: ```json ""tables"": { ""person"": { ""label_column"": [""first_name"", ""last_name""] }, ``` It would even be interesting with a ""label expression"" similar to a Python f-string. E.g. `{row.last_name}, {row.first_name}`.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1750/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1266207143,I_kwDOBm6k_c5LeMmn,1755,Gunicorn,1176293,open,0,,,0,2022-06-09T14:18:46Z,2022-06-09T14:18:46Z,,NONE,,"I've read issue #514 which resulted in running Datasette via systemd as recommended approach. We've also adopted this (for now), but I notice that Uvicorn [says the following](https://www.uvicorn.org/#running-with-gunicorn): > Uvicorn includes a Gunicorn worker class allowing you to run ASGI applications, with all of Uvicorn's performance benefits, while also giving you Gunicorn's fully-featured process management. > > This allows you to increase or decrease the number of worker processes on the fly, restart worker processes gracefully, or perform server upgrades without downtime. > > For production deployments we recommend using gunicorn with the uvicorn worker class. We usually deploy Python applications via Gunicorn for these process management features (e.g. `--daemon` and `--pid`). Is this something that would/could work with Datasette as well?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1755/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1280799259,I_kwDOBm6k_c5MV3Ib,1761,ensure_ascii=False,1473102,open,0,,,0,2022-06-22T19:58:13Z,2022-06-22T19:58:30Z,,NONE,,"Hi, thanks for the project! For the JSON output, I would consider defaulting to `ensure_ascii=False` (UTF-8 seems pretty universal) or making it an option. When dealing with non-Latin text, `ensure_ascii=True` (the default) can triple the size of the output.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1761/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1323332006,I_kwDOBm6k_c5O4HGm,1774,Request of feature for mongo,428820,open,0,,,0,2022-07-31T01:00:05Z,2022-07-31T01:00:05Z,,NONE,,Will love if can we use datasette for mongo and all pipelines and workflows,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1774/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1365741480,I_kwDOBm6k_c5RZ4-o,1806,"UX to recover from Error 500: ""You can only execute one statement at a time.""",1470389,open,0,,,0,2022-09-08T08:01:27Z,2022-09-08T08:01:37Z,,NONE,,"When using the Custom SQL query view, when accidentally adding a semicolon in the middle of my query, datasette errors with: > # Error 500 > You can only execute one statement at a time. The error view doesn't contain the query textarea anymore, so it provides no easy way recover from the error. It would be nice if I could change and submit it again. ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1806/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1375792876,I_kwDOBm6k_c5SAO7s,1811,"Drop-down menu with ""REGEXP"" choice",562352,open,0,,,0,2022-09-16T11:06:18Z,2022-09-16T15:30:31Z,,NONE,,"Drop-down menu below could add ""REGEXP"" choice when REGEXP sqlite extension is installed and used ![image](https://user-images.githubusercontent.com/562352/190675352-810fbdca-0827-4034-8b9f-fd67d5c35afb.png) Not sure. Close the issue if you don't find it relevant.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1811/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1446657889,I_kwDOBm6k_c5WOj9h,1885,Integrate inside GUI app (tkinter),5115787,open,0,,,0,2022-11-13T00:10:43Z,2022-11-13T00:11:09Z,,NONE,,"Hi, I'd like to integrate datasette inside a tkinter app. The app should be able to start/stop datasette server. How could I integrate datasette inside my app, so it can start and stop datasette server?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1885/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1452360613,I_kwDOBm6k_c5WkUOl,1895,Avoid using host name when building absolute URLs?,14294,open,0,,,0,2022-11-16T22:21:27Z,2022-11-16T22:21:27Z,,NONE,,"When deploying Datasette to Cloud Run and rewriting certain routes from a Firebase app to the Cloud Run service, some of the URLs in the page start with `https://[service].run.app` rather than the (custom) domain of the Firebase app. I guess this is because a) the custom domain of the Firebase app isn't being passed through in the `host` header of the request to the Cloud Run instance and b) the `absolute_url` function in Datasette is using information from the request to build the URL. Would it be possible to not use the host name when building the absolute URLs, i.e. only include the path in the URL?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1895/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1504352503,I_kwDOBm6k_c5Zqpj3,1968,Allow to hide some queries in metadata.yml,562352,open,0,,,0,2022-12-20T10:45:41Z,2022-12-20T10:45:41Z,,NONE,,"By default all queries are displayed. But there are many cases where it would be interesting to hide the queries by default: * the website is targeting non-tech people * the query is veeeeeery long ([eg.](https://mirabelle.openfoodfacts.org/products/energy_calculator)) * reading the query is not important for the users, they only want to see the result Of course, the user still could have the option to see the query. It could be an option in the metadata file: ```yml databases: awesome_db: tables: products: hide_sql: true queries: great_query: hide_sql: true sql: select * from products where code = :barcode ``` The priority could be: * no option in the metadata and nothing in the URL: query displayed * hide_sql in the metadata and nothing in the URL: query displayed as asked in the metadata * hide_sql in the metadata and &_hide_sql= in the URL: query as asked in the URL See also: #1824 ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1968/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1532000914,I_kwDOBm6k_c5bUHqS,1990,Suggestion: Highlight error messages ('These facets timed out'),116795,open,0,,,0,2023-01-13T09:40:58Z,2023-01-13T09:40:58Z,,NONE,,"I had trouble figuring out why faceting didn't work in some instances, it took a while before I noticed the _These facets timed out_ notice. It might help if that would be highlighted, or fading out highlight - if one might think it would be too visually disturbing.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1990/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1533673397,I_kwDOBm6k_c5baf-1,1991,fts5 tables are not auto-detected and hidden,83819,open,0,,,0,2023-01-15T06:00:42Z,2023-01-20T04:54:24Z,,NONE,,"I set up a [Datasette instance](https://huggingface.co/spaces/Sygil/INE-dataset-explorer/tree/main) and was following the docs on full-text search. When I used fts4, datasette automatically hid the FTS tables and added the FTS search box where appropriate, but when I changed to fts5 it no longer does either. If I [manually set](https://huggingface.co/spaces/keturn/INED-datasette/blob/main/metadata.json#L9) `fts_table` for a view, then search does work as expected. My table and view creation code looks like this: ```py connection.execute(""""""CREATE TABLE IF NOT EXISTS captions(image_key text PRIMARY KEY, caption text NOT NULL) """""")   connection.execute(""""""CREATE VIRTUAL TABLE captions_fts USING fts5(caption, image_key UNINDEXED, content=captions) """""") ```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1991/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1538197093,I_kwDOBm6k_c5brwZl,1995,foreign_keys error 500,137183,open,0,,,0,2023-01-18T15:27:36Z,2023-01-18T16:44:01Z,,NONE,,"**Error 500 expected string or bytes-like object** [espial-new.sqlite3.zip](https://github.com/simonw/datasette/files/10447965/espial-new.sqlite3.zip) run `datasette espial-new.sqlite3` & click on any table other than `User` ``` /home/jon/.local/lib/python3.10/site-packages/datasette/app.py:814 in │ │ expand_foreign_keys │ │ │ │ 811 │ │ │ from {other_table} │ │ 812 │ │ │ where {other_column} in ({placeholders}) │ │ 813 │ │ """""".format( │ │ ❱ 814 │ │ │ other_column=escape_sqlite(fk[""other_column""]), │ │ 815 │ │ │ label_column=escape_sqlite(label_column), │ │ 816 │ │ │ other_table=escape_sqlite(fk[""other_table""]), │ │ 817 │ │ │ placeholders="", "".join([""?""] * len(set(values))), │ │ │ │ ╭───────────────────────────── locals ──────────────────────────────╮ │ │ │ column = 'user_id' │ │ │ │ database = 'espial-new' │ │ │ │ db = │ │ │ │ fk = { │ │ │ │ │ 'column': 'user_id', │ │ │ │ │ 'other_table': 'user', │ │ │ │ │ 'other_column': None │ │ │ │ } │ │ │ │ foreign_keys = [ │ │ │ │ │ { │ │ │ │ │ │ 'column': 'user_id', │ │ │ │ │ │ 'other_table': 'user', │ │ │ │ │ │ 'other_column': None │ │ │ │ │ } │ │ │ │ ] │ │ │ │ label_column = 'name' │ │ │ │ labeled_fks = {} │ │ │ │ self = │ │ │ │ table = 'bookmark' │ │ │ │ values = [] │ │ │ ╰───────────────────────────────────────────────────────────────────╯ │ │ │ │ /home/jon/.local/lib/python3.10/site-packages/datasette/utils/__init__.py:346 │ │ in escape_sqlite │ │ │ │ 343 │ │ 344 │ │ 345 def escape_sqlite(s): │ │ ❱ 346 │ if _boring_keyword_re.match(s) and (s.lower() not in reserved_words) │ │ 347 │ │ return s │ │ 348 │ else: │ │ 349 │ │ return f""[{s}]"" │ │ │ │ ╭─ locals ─╮ │ │ │ s = None │ │ │ ╰──────────╯ │ ╰─────────────────────────────────────────────────────────────────────────────────╯ TypeError: expected string or bytes-like object Traceback (most recent call last): File ""/home/jon/.local/lib/python3.10/site-packages/datasette/app.py"", line 1354, in route_path response = await view(request, send) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py"", line 134, in view return await self.dispatch_request(request) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py"", line 91, in dispatch_request return await handler(request) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/base.py"", line 361, in get response_or_template_contexts = await self.data(request, **data_kwargs) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/table.py"", line 158, in data return await self._data_traced(request, default_labels, _next, _size) File ""/home/jon/.local/lib/python3.10/site-packages/datasette/views/table.py"", line 603, in _data_traced await self.ds.expand_foreign_keys( File ""/home/jon/.local/lib/python3.10/site-packages/datasette/app.py"", line 814, in expand_foreign_keys other_column=escape_sqlite(fk[""other_column""]), File ""/home/jon/.local/lib/python3.10/site-packages/datasette/utils/__init__.py"", line 346, in escape_sqlite if _boring_keyword_re.match(s) and (s.lower() not in reserved_words): TypeError: expected string or bytes-like object INFO: 127.0.0.1:38574 - ""GET /espial-new/bookmark HTTP/1.1"" 500 Internal Server Error INFO: 127.0.0.1:38574 - ""GET /-/static/app.css?d59929 HTTP/1.1"" 200 OK ``` Schema: ``` CREATE TABLE IF NOT EXISTS ""user"" ( ""id"" INTEGER PRIMARY KEY, ""name"" VARCHAR NOT NULL, ""password_hash"" VARCHAR NOT NULL, ""api_token"" VARCHAR NULL, ""private_default"" BOOLEAN NOT NULL, ""archive_default"" BOOLEAN NOT NULL, ""privacy_lock"" BOOLEAN NOT NULL, CONSTRAINT ""unique_user_name"" UNIQUE (""name"") ); CREATE TABLE IF NOT EXISTS ""bookmark"" ( ""id"" INTEGER PRIMARY KEY, ""user_id"" INTEGER NOT NULL REFERENCES ""user"" ON DELETE RESTRICT ON UPDATE RESTRICT, ""slug"" VARCHAR NOT NULL DEFAULT (Lower(Hex(Randomblob(6)))), ""href"" VARCHAR NOT NULL, ""description"" VARCHAR NOT NULL, ""extended"" VARCHAR NOT NULL, ""time"" TIMESTAMP NOT NULL, ""shared"" BOOLEAN NOT NULL, ""to_read"" BOOLEAN NOT NULL, ""selected"" BOOLEAN NOT NULL, ""archive_href"" VARCHAR NULL, CONSTRAINT ""unique_user_href"" UNIQUE (""user_id"", ""href""), CONSTRAINT ""unique_user_slug"" UNIQUE (""user_id"", ""slug"") ); CREATE TABLE IF NOT EXISTS ""bookmark_tag"" ( ""id"" INTEGER PRIMARY KEY, ""user_id"" INTEGER NOT NULL REFERENCES ""user"" ON DELETE RESTRICT ON UPDATE RESTRICT, ""tag"" VARCHAR NOT NULL, ""bookmark_id"" INTEGER NOT NULL REFERENCES ""bookmark"" ON DELETE RESTRICT ON UPDATE RESTRICT, ""seq"" INTEGER NOT NULL, CONSTRAINT ""unique_user_tag_bookmark_id"" UNIQUE (""user_id"", ""tag"", ""bookmark_id""), CONSTRAINT ""unique_user_bookmark_id_tag_seq"" UNIQUE (""user_id"", ""bookmark_id"", ""tag"", ""seq"") ); CREATE TABLE IF NOT EXISTS ""note"" ( ""id"" INTEGER PRIMARY KEY, ""user_id"" INTEGER NOT NULL REFERENCES ""user"" ON DELETE RESTRICT ON UPDATE RESTRICT, ""slug"" VARCHAR NOT NULL DEFAULT (Lower(Hex(Randomblob(10)))), ""length"" INTEGER NOT NULL, ""title"" VARCHAR NOT NULL, ""text"" VARCHAR NOT NULL, ""is_markdown"" BOOLEAN NOT NULL, ""shared"" BOOLEAN NOT NULL DEFAULT false, ""created"" TIMESTAMP NOT NULL, ""updated"" TIMESTAMP NOT NULL ); CREATE INDEX idx_bookmark_time ON bookmark (user_id, time DESC); CREATE INDEX idx_bookmark_tag_bookmark_id ON bookmark_tag (bookmark_id, id, tag, seq); CREATE INDEX idx_note_user_created ON note (user_id, created DESC); ``` ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1995/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1575880841,I_kwDOBm6k_c5d7giJ,2020,"Documentation refers to ""off"" setting; doesn't seem to work, ""false"" does",1350673,open,0,,,0,2023-02-08T10:38:10Z,2023-02-08T10:38:10Z,,NONE,,"https://docs.datasette.io/en/stable/settings.html#suggest-facets, among others, suggests using ""off"" to disable the setting; however, this doesn't appear to work in the JSON config files, where it apparently needs to be a ""JSON boolean"" and have the values ""true"" or ""false"". Perhaps the Python code is more flexible?...but either way, the documentation probably should mention it.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2020/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1577548579,I_kwDOBm6k_c5eB3sj,2021,Docker images for 1.0 alphas?,1563881,open,0,,,0,2023-02-09T09:35:52Z,2023-02-09T09:35:52Z,,NONE,,"Hi, would you consider putting 1.0alpha images on Dockerhub? (Also, how usable are the alphas?)",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2021/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1592327343,I_kwDOBm6k_c5e6Pyv,2029,"Sorry Simon, didn't know how else to contact you",5804626,open,0,,,0,2023-02-20T19:02:53Z,2023-02-20T19:02:53Z,,NONE,,"Hi Simon, Would you be willing to chat with me about Datasette? I have some questions. I am working on a project to evaluate data ingestion tools for a research organization and I ran across Datasette. I have looked through a lot of your documentation, but still have some questions, which are very specific. If you would be willing to write me back about this, my email is laura@renci.org. Thanks, Laura",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2029/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1646068413,I_kwDOBm6k_c5iHQK9,2048,Test failures encountered while packaging for GNU Guix,8332263,open,0,,,0,2023-03-29T15:36:54Z,2023-03-29T15:36:54Z,,NONE,,"Hello, While reviewing a packaged submitted to Guix to add `datasette`, the test suite produces the following errors: ``` =================================== FAILURES =================================== _________________________ test_row_strange_table_name __________________________ [gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = def test_row_strange_table_name(app_client): response = app_client.get( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv/3.json?_shape=objects"" ) > assert response.status == 200 E assert 400 == 200 E + where 400 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:701: AssertionError ----------------------------- Captured stderr call ----------------------------- ERROR: conn=, sql = 'select rowid, * from [table%7E2Fwith%7E2Fslashes%7E2Ecsv] where ""rowid""=:p0', params = {'p0': '3'}: no such table: table%7E2Fwith%7E2Fslashes%7E2Ecsv _______________ test_database_page_for_database_with_dot_in_name _______________ [gw15] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_with_dot = def test_database_page_for_database_with_dot_in_name(app_client_with_dot): response = app_client_with_dot.get(""/fixtures~2Edot.json"") > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:633: AssertionError ___________________ test_tilde_encoded_database_names[fo%o] ____________________ [gw6] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python db_name = 'fo%o' @pytest.mark.asyncio @pytest.mark.parametrize(""db_name"", (""foo"", r""fo%o"", ""f~/c.d"")) async def test_tilde_encoded_database_names(db_name): ds = Datasette() ds.add_memory_database(db_name) response = await ds.client.get(""/.json"") assert db_name in response.json().keys() path = response.json()[db_name][""path""] # And the JSON for that database response2 = await ds.client.get(path + "".json"") > assert response2.status_code == 200 E assert 302 == 200 E + where 302 = .status_code /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:983: AssertionError __________________ test_tilde_encoded_database_names[f~/c.d] ___________________ [gw7] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python db_name = 'f~/c.d' @pytest.mark.asyncio @pytest.mark.parametrize(""db_name"", (""foo"", r""fo%o"", ""f~/c.d"")) async def test_tilde_encoded_database_names(db_name): ds = Datasette() ds.add_memory_database(db_name) response = await ds.client.get(""/.json"") assert db_name in response.json().keys() path = response.json()[db_name][""path""] # And the JSON for that database response2 = await ds.client.get(path + "".json"") > assert response2.status_code == 200 E assert 302 == 200 E + where 302 = .status_code /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:983: AssertionError ______________ test_database_with_space_in_name[/searchable.json] ______________ [gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/searchable.json' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ___________________ test_database_with_space_in_name[.json] ____________________ [gw19] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '.json' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ______________ test_database_with_space_in_name[/searchable_view] ______________ [gw22] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/searchable_view' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects _____________________ test_database_with_space_in_name[/] ______________________ [gw18] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ________________ test_database_with_space_in_name[/searchable] _________________ [gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/searchable' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ___________ test_database_with_space_in_name[/searchable_view.json] ____________ [gw23] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = path = '/searchable_view.json' @pytest.mark.parametrize( ""path"", ( ""/"", "".json"", ""/searchable"", ""/searchable.json"", ""/searchable_view"", ""/searchable_view.json"", ), ) def test_database_with_space_in_name(app_client_two_attached_databases, path): > response = app_client_two_attached_databases.get( ""/extra~20database"" + path, follow_redirects=True ) /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in __call__ return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(*args, **kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = request = follow_redirects = True history = [, , , , , , ...] async def _send_handling_redirects( self, request: Request, follow_redirects: bool, history: typing.List[Response], ) -> Response: while True: if len(history) > self.max_redirects: > raise TooManyRedirects( ""Exceeded maximum allowed redirects."", request=request ) E httpx.TooManyRedirects: Exceeded maximum allowed redirects. /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ________________ test_weird_database_names[database (1).sqlite] ________________ [gw7] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python tmpdir = local('/tmp/guix-build-datasette-0.64.2.drv-0/pytest-of-nixbld/pytest-0/popen-gw7/test_weird_database_names_data0') filename = 'database (1).sqlite' @pytest.mark.parametrize( ""filename"", [""test-database (1).sqlite"", ""database (1).sqlite""] ) def test_weird_database_names(tmpdir, filename): # https://github.com/simonw/datasette/issues/1181 runner = CliRunner() db_path = str(tmpdir / filename) sqlite3.connect(db_path).execute(""vacuum"") result1 = runner.invoke(cli, [db_path, ""--get"", ""/""]) assert result1.exit_code == 0, result1.output filename_no_stem = filename.rsplit(""."", 1)[0] expected_link = '{}'.format( tilde_encode(filename_no_stem), filename_no_stem ) assert expected_link in result1.output # Now try hitting that database page result2 = runner.invoke( cli, [db_path, ""--get"", ""/{}"".format(tilde_encode(filename_no_stem))] ) > assert result2.exit_code == 0, result2.output E AssertionError: E E assert 1 == 0 E + where 1 = .exit_code /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_cli.py:321: AssertionError _____________ test_weird_database_names[test-database (1).sqlite] ______________ [gw6] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python tmpdir = local('/tmp/guix-build-datasette-0.64.2.drv-0/pytest-of-nixbld/pytest-0/popen-gw6/test_weird_database_names_test0') filename = 'test-database (1).sqlite' @pytest.mark.parametrize( ""filename"", [""test-database (1).sqlite"", ""database (1).sqlite""] ) def test_weird_database_names(tmpdir, filename): # https://github.com/simonw/datasette/issues/1181 runner = CliRunner() db_path = str(tmpdir / filename) sqlite3.connect(db_path).execute(""vacuum"") result1 = runner.invoke(cli, [db_path, ""--get"", ""/""]) assert result1.exit_code == 0, result1.output filename_no_stem = filename.rsplit(""."", 1)[0] expected_link = '{}'.format( tilde_encode(filename_no_stem), filename_no_stem ) assert expected_link in result1.output # Now try hitting that database page result2 = runner.invoke( cli, [db_path, ""--get"", ""/{}"".format(tilde_encode(filename_no_stem))] ) > assert result2.exit_code == 0, result2.output E AssertionError: E E assert 1 == 0 E + where 1 = .exit_code /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_cli.py:321: AssertionError _ test_row_html_compound_primary_key[/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd-expected1] _ [gw11] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd' expected = [['a/b', '.c-d', 'c']] @pytest.mark.parametrize( ""path,expected"", ( ( ""/fixtures/compound_primary_key/a,b"", [ [ 'a', 'b', 'c', ] ], ), ( ""/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd"", [ [ 'a/b', '.c-d', 'c', ] ], ), ), ) def test_row_html_compound_primary_key(app_client, path, expected): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:370: AssertionError _ test_css_classes_on_body[/fixtures/table~2Fwith~2Fslashes~2Ecsv-expected_classes5] _ [gw3] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected_classes = ['table', 'db-fixtures', 'table-tablewithslashescsv-fa7563'] @pytest.mark.parametrize( ""path,expected_classes"", [ (""/"", [""index""]), (""/fixtures"", [""db"", ""db-fixtures""]), (""/fixtures?sql=select+1"", [""query"", ""db-fixtures""]), ( ""/fixtures/simple_primary_key"", [""table"", ""db-fixtures"", ""table-simple_primary_key""], ), ( ""/fixtures/neighborhood_search"", [""query"", ""db-fixtures"", ""query-neighborhood_search""], ), ( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv"", [""table"", ""db-fixtures"", ""table-tablewithslashescsv-fa7563""], ), ( ""/fixtures/simple_primary_key/1"", [""row"", ""db-fixtures"", ""table-simple_primary_key""], ), ], ) def test_css_classes_on_body(app_client, path, expected_classes): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:238: AssertionError _ test_templates_considered[/fixtures/table~2Fwith~2Fslashes~2Ecsv-table-fixtures-tablewithslashescsv-fa7563.html, *table.html] _ [gw3] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected_considered = 'table-fixtures-tablewithslashescsv-fa7563.html, *table.html' @pytest.mark.parametrize( ""path,expected_considered"", [ (""/"", ""*index.html""), (""/fixtures"", ""database-fixtures.html, *database.html""), ( ""/fixtures/simple_primary_key"", ""table-fixtures-simple_primary_key.html, *table.html"", ), ( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv"", ""table-fixtures-tablewithslashescsv-fa7563.html, *table.html"", ), ( ""/fixtures/simple_primary_key/1"", ""row-fixtures-simple_primary_key.html, *row.html"", ), ], ) def test_templates_considered(app_client, path, expected_considered): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:264: AssertionError _ test_alternate_url_json[/fixtures/table~2Fwith~2Fslashes~2Ecsv-http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json] _ [gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected = 'http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json' @pytest.mark.parametrize( ""path,expected"", ( # Instance index page (""/"", ""http://localhost/.json""), # Table page (""/fixtures/facetable"", ""http://localhost/fixtures/facetable.json""), ( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv"", ""http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json"", ), # Row page ( ""/fixtures/no_primary_key/1"", ""http://localhost/fixtures/no_primary_key/1.json"", ), # Database index page ( ""/fixtures"", ""http://localhost/fixtures.json"", ), # Custom query page ( ""/fixtures?sql=select+*+from+facetable"", ""http://localhost/fixtures.json?sql=select+*+from+facetable"", ), # Canned query page ( ""/fixtures/neighborhood_search?text=town"", ""http://localhost/fixtures/neighborhood_search.json?text=town"", ), # /-/ pages ( ""/-/plugins"", ""http://localhost/-/plugins.json"", ), ), ) def test_alternate_url_json(app_client, path, expected): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:948: AssertionError _ test_edit_sql_link_on_canned_queries[/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC-/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B] _ [gw18] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC' expected = '/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B' @pytest.mark.parametrize( ""path,expected"", [ ( ""/fixtures/neighborhood_search"", ""/fixtures?sql=%0Aselect+_neighborhood%2C+facet_cities.name%2C+state%0Afrom+facetable%0A++++join+facet_cities%0A++++++++on+facetable._city_id+%3D+facet_cities.id%0Awhere+_neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0Aorder+by+_neighborhood%3B%0A&text="", ), ( ""/fixtures/neighborhood_search?text=ber"", ""/fixtures?sql=%0Aselect+_neighborhood%2C+facet_cities.name%2C+state%0Afrom+facetable%0A++++join+facet_cities%0A++++++++on+facetable._city_id+%3D+facet_cities.id%0Awhere+_neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0Aorder+by+_neighborhood%3B%0A&text=ber"", ), (""/fixtures/pragma_cache_size"", None), ( # /fixtures/𝐜𝐢𝐭𝐢𝐞𝐬 ""/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC"", ""/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B"", ), (""/fixtures/magic_parameters"", None), ], ) def test_edit_sql_link_on_canned_queries(app_client, path, expected): response = app_client.get(path) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:841: AssertionError _______________________ test_table_with_slashes_in_name ________________________ [gw9] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = def test_table_with_slashes_in_name(app_client): response = app_client.get( ""/fixtures/table~2Fwith~2Fslashes~2Ecsv.json?_shape=objects"" ) > assert response.status == 200 E assert 302 == 200 E + where 302 = .status /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:141: AssertionError __________________ test_custom_query_with_unicode_characters ___________________ [gw8] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = def test_custom_query_with_unicode_characters(app_client): # /fixtures/𝐜𝐢𝐭𝐢𝐞𝐬.json response = app_client.get( ""/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC.json?_shape=array"" ) > assert [{""id"": 1, ""name"": ""San Francisco""}] == response.json /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:1042: _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:40: in json return json.loads(self.text) /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/__init__.py:346: in loads return _default_decoder.decode(s) /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/decoder.py:337: in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end()) _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ self = , s = '', idx = 0 def raw_decode(self, s, idx=0): """"""Decode a JSON document from ``s`` (a ``str`` beginning with a JSON document) and return a 2-tuple of the Python representation and the index in ``s`` where the document ended. This can be used to decode a JSON document from a string that may have extraneous data at the end. """""" try: obj, end = self.scan_once(s, idx) except StopIteration as err: > raise JSONDecodeError(""Expecting value"", s, err.value) from None E json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/decoder.py:355: JSONDecodeError _ test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3] _ [gw13] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = path = '/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']] @pytest.mark.parametrize( ""path,expected_rows"", [ ( ""/fixtures/searchable.json?_search=dog"", [ [1, ""barry cat"", ""terry dog"", ""panther""], [2, ""terry dog"", ""sara weasel"", ""puma""], ], ), ( # Special keyword shouldn't break FTS query ""/fixtures/searchable.json?_search=AND"", [], ), ( # Without _searchmode=raw this should return no results ""/fixtures/searchable.json?_search=te*+AND+do*"", [], ), ( # _searchmode=raw ""/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw"", [ [1, ""barry cat"", ""terry dog"", ""panther""], [2, ""terry dog"", ""sara weasel"", ""puma""], ], ), ( # _searchmode=raw combined with _search_COLUMN ""/fixtures/searchable.json?_search_text2=te*&_searchmode=raw"", [ [1, ""barry cat"", ""terry dog"", ""panther""], ], ), ( ""/fixtures/searchable.json?_search=weasel"", [[2, ""terry dog"", ""sara weasel"", ""puma""]], ), ( ""/fixtures/searchable.json?_search_text2=dog"", [[1, ""barry cat"", ""terry dog"", ""panther""]], ), ( ""/fixtures/searchable.json?_search_name%20with%20.%20and%20spaces=panther"", [[1, ""barry cat"", ""terry dog"", ""panther""]], ), ], ) def test_searchable(app_client, path, expected_rows): response = app_client.get(path) > assert expected_rows == response.json[""rows""] E AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E [ E - , E + [1, E + 'barry cat', E + 'terry dog', E + 'panther'], E + [2, E + 'terry dog', E + 'sara weasel', E + 'puma'], E ] /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:402: AssertionError _____ test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1] ______ [gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python table_metadata = {'searchmode': 'raw'}, querystring = '_search=te*+AND+do*' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']] @pytest.mark.parametrize( ""table_metadata,querystring,expected_rows"", [ ( {}, ""_search=te*+AND+do*"", [], ), ( {""searchmode"": ""raw""}, ""_search=te*+AND+do*"", _SEARCHMODE_RAW_RESULTS, ), ( {}, ""_search=te*+AND+do*&_searchmode=raw"", _SEARCHMODE_RAW_RESULTS, ), # Can be over-ridden with _searchmode=escaped ( {""searchmode"": ""raw""}, ""_search=te*+AND+do*&_searchmode=escaped"", [], ), ], ) def test_searchmode(table_metadata, querystring, expected_rows): with make_app_client( metadata={""databases"": {""fixtures"": {""tables"": {""searchable"": table_metadata}}}} ) as client: response = client.get(""/fixtures/searchable.json?"" + querystring) > assert expected_rows == response.json[""rows""] E AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E [ E - , E + [1, E + 'barry cat', E + 'terry dog', E + 'panther'], E + [2, E + 'terry dog', E + 'sara weasel', E + 'puma'], E ] /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:442: AssertionError _ test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2] _ [gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python table_metadata = {}, querystring = '_search=te*+AND+do*&_searchmode=raw' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']] @pytest.mark.parametrize( ""table_metadata,querystring,expected_rows"", [ ( {}, ""_search=te*+AND+do*"", [], ), ( {""searchmode"": ""raw""}, ""_search=te*+AND+do*"", _SEARCHMODE_RAW_RESULTS, ), ( {}, ""_search=te*+AND+do*&_searchmode=raw"", _SEARCHMODE_RAW_RESULTS, ), # Can be over-ridden with _searchmode=escaped ( {""searchmode"": ""raw""}, ""_search=te*+AND+do*&_searchmode=escaped"", [], ), ], ) def test_searchmode(table_metadata, querystring, expected_rows): with make_app_client( metadata={""databases"": {""fixtures"": {""tables"": {""searchable"": table_metadata}}}} ) as client: response = client.get(""/fixtures/searchable.json?"" + querystring) > assert expected_rows == response.json[""rows""] E AssertionError: assert [[1, 'barry cat', 'terry dog', 'panther'],\n [2, 'terry dog', 'sara weasel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E [ E - , E + [1, E + 'barry cat', E + 'terry dog', E + 'panther'], E + [2, E + 'terry dog', E + 'sara weasel', E + 'puma'], E ] /tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:442: AssertionError =========================== short test summary info ============================ FAILED tests/test_api.py::test_row_strange_table_name - assert 400 == 200 FAILED tests/test_api.py::test_database_page_for_database_with_dot_in_name - ... FAILED tests/test_api.py::test_tilde_encoded_database_names[fo%o] - assert 30... FAILED tests/test_api.py::test_tilde_encoded_database_names[f~/c.d] - assert ... FAILED tests/test_api.py::test_database_with_space_in_name[/searchable.json] FAILED tests/test_api.py::test_database_with_space_in_name[.json] - httpx.Too... FAILED tests/test_api.py::test_database_with_space_in_name[/searchable_view] FAILED tests/test_api.py::test_database_with_space_in_name[/] - httpx.TooMany... FAILED tests/test_api.py::test_database_with_space_in_name[/searchable] - htt... FAILED tests/test_api.py::test_database_with_space_in_name[/searchable_view.json] FAILED tests/test_cli.py::test_weird_database_names[database (1).sqlite] - As... FAILED tests/test_cli.py::test_weird_database_names[test-database (1).sqlite] FAILED tests/test_html.py::test_row_html_compound_primary_key[/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd-expected1] FAILED tests/test_html.py::test_css_classes_on_body[/fixtures/table~2Fwith~2Fslashes~2Ecsv-expected_classes5] FAILED tests/test_html.py::test_templates_considered[/fixtures/table~2Fwith~2Fslashes~2Ecsv-table-fixtures-tablewithslashescsv-fa7563.html, *table.html] FAILED tests/test_html.py::test_alternate_url_json[/fixtures/table~2Fwith~2Fslashes~2Ecsv-http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json] FAILED tests/test_html.py::test_edit_sql_link_on_canned_queries[/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC-/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B] FAILED tests/test_table_api.py::test_table_with_slashes_in_name - assert 302 ... FAILED tests/test_table_api.py::test_custom_query_with_unicode_characters - j... FAILED tests/test_table_api.py::test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3] FAILED tests/test_table_api.py::test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1] FAILED tests/test_table_api.py::test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2] =========== 22 failed, 1049 passed, 3 skipped in 1522.28s (0:25:22) ============ error: in phase 'check': uncaught exception: %exception #<&invoke-error program: ""/gnu/store/ziqwkzz6znb5d3c245xn0cq5ra2ly0w3-python-pytest-7.1.3/bin/pytest"" arguments: (""-vv"" ""-n"" ""24"" ""-m"" ""not serial"") exit-status: 1 term-signal: #f stop-signal: #f> phase `check' failed after 1523.3 seconds ``` The tests run in a private namespace without internet connectivity, and the Python dependencies are at: ``` python-aiofiles@0.6.0 python-asgi-csrf@0.9 python-asgiref@3.4.1 + python-beautifulsoup4@4.11.1 python-black@22.3.0 python-click-default-group@1.2.2 python-click@8.1.3 + python-cogapp@3.3.0 python-httpx@0.23.0 python-hupper@1.10.3 python-itsdangerous@2.0.1 + python-janus@1.0.0 python-jinja2@3.1.1 python-mergedeep@1.3.4 python-pint@0.20.1 python-pluggy@1.0.0 + python-pytest-asyncio@0.17.2 python-pytest-runner@5.2 python-pytest-timeout@2.0.2 + python-pytest-xdist@2.5.0 python-pytest@7.1.3 python-pyyaml@6.0 python-setuptools@64.0.3 + python-trustme@0.9.0 python-uvicorn@0.17.6 ``` With Python 3.9.9. Thank you!",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2048/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1727478903,I_kwDOBm6k_c5m9zx3,2081,Update Endpoints defined in metadata throws 403 Forbidden after a while,15085007,open,0,,,0,2023-05-26T11:52:30Z,2023-05-26T11:52:30Z,,NONE,,"Hello. I expose an endpoint to update `tasks`: ``` { ""title"": ""My Datasette Instance"", ""databases"": { ""tasks"": { ""queries"": { ""update_task"": { ""sql"": ""UPDATE tasks SET status = :status, result = :result, systemMessage = :systemMessage WHERE queueID = :queueID"", ""write"": true, ""on_success_message"": ""Task updated"", ""on_success_redirect"": ""/tasks/tasks.json"", ""on_error_message"": ""Task update failed"", ""on_error_redirect"": ""/tasks.json"", ""params"": [""queueID"", ""taskData"", ""status"", ""result"", ""systemMessage""] } } } } } ``` This works really well! But after a while, the Datasette Instanz answers with **403 Forbidden**. I have to delete the database and recreate it in order to work again. Any help here? (´。_。`)",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2081/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1761613778,I_kwDOBm6k_c5pABfS,2084,Support facets for columns that contain timestamps,19492893,open,0,,,0,2023-06-17T03:33:54Z,2023-06-17T03:33:54Z,,NONE,," Django has this very nice filter for datetime fields - It would be nice to have something similar to facet by a field that contains a timestamp in datasette too - Which doesn't seem to do anything with timestamps right now... ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2084/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1762180409,I_kwDOBm6k_c5pCL05,2085,Interactive row selection in Datasette ,24938923,open,0,,,0,2023-06-18T08:29:45Z,2023-06-18T08:31:23Z,,NONE,,"Simon did a excellent [prototype](https://til.simonwillison.net/datasette/row-selection-prototype) of an interactive row selection in Datasette. I hope this [functionality](https://camo.githubusercontent.com/3d4a0f31fb6a27fd279f809af5b53dc3b76faa63c7721e228951c5252b645a77/68747470733a2f2f7374617469632e73696d6f6e77696c6c69736f6e2e6e65742f7374617469632f323032332f6461746173657474652d7069636b65722e676966) can be turned into a Datasette plugin. ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2085/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1794097871,I_kwDOBm6k_c5q78LP,2095,"Introduce ""dark mode"" CSS",3315059,open,0,,,0,2023-07-07T19:15:58Z,2023-07-07T19:15:58Z,,NONE,,Using [the CSS media query `prefers-color-scheme`](https://developer.mozilla.org/en-US/docs/Web/CSS/@media/prefers-color-scheme) we can provide a dark-mode version of Datasette,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2095/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1824457306,I_kwDOBm6k_c5svwJa,2122,Parameters on canned queries: fixed or query-generated list?,1563881,open,0,,,0,2023-07-27T14:07:07Z,2023-07-27T14:07:07Z,,NONE,,"Hi, currently parameters in canned queries are just text fields. It would be cool to have one of the options below. Would you accept a PR doing something in this direction? (Possibly this could even work as a plugin.) * adding facets, which would work like facets on tables or views, giving a list of selectable options (and leaving parameters as is) * making it possible to provide a query which returns selectable values for a parameter, e.g. ``` calendar_entries_current_instrument: sql: | select * from calendar_entries where DTEND_UNIX > UNIXEPOCH() and DTSTART_UNIX < UNIXEPOCH() + :days *24*60*60 and current = 1 and MACHINE = :instrument order by DTSTART_UNIX params: days: sql: ""SELECT VALUE FROM generate_series(1, 30, 1)"" # this obviously requires the corresponding sqlite extension instrument: sql: ""SELECT DISTINCT MACHINE FROM calendar_entries"" ``` * making it possible to provide a fixed list of parameters ``` calendar_entries_current_instrument: sql: | select * from calendar_entries where DTEND_UNIX > UNIXEPOCH() and DTSTART_UNIX < UNIXEPOCH() + :days *24*60*60 and current = 1 and MACHINE = :instrument order by DTSTART_UNIX params: days: values: [1, 2, 3, 5, 10, 20, 30] instrument: values: [supermachine, crappymachine, boringmachine] ```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2122/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1899310542,I_kwDOBm6k_c5xNS3O,2187,Datasette for serving JSON only,19705106,open,0,,,0,2023-09-16T05:48:29Z,2023-09-16T05:48:29Z,,NONE,,"Hi, is there any way to use datasette for serving json only without displaying webpage? I've tried to search about this in documentation but didn't get any information",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2187/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1931794126,I_kwDOBm6k_c5zJNbO,2198,--load-extension=spatialite not working with Windows,363004,open,0,,,0,2023-10-08T12:50:22Z,2023-10-08T12:50:22Z,,NONE,,"Using each of `python -m datasette counties.db -m metadata.yml --load-extension=SpatiaLite` and `python -m datasette counties.db --load-extension=""C:\Windows\System32\mod_spatialite.dll""` and `python -m datasette counties.db --load-extension=C:\Windows\System32\mod_spatialite.dll` I got the error: ``` File ""C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\database.py"", line 209, in in_thread self.ds._prepare_connection(conn, self.name) File ""C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\app.py"", line 596, in _prepare_connection conn.execute(""SELECT load_extension(?, ?)"", [path, entrypoint]) sqlite3.OperationalError: The specified module could not be found. ``` I finally tried modifying the code in app.py to read: ``` def _prepare_connection(self, conn, database): conn.row_factory = sqlite3.Row conn.text_factory = lambda x: str(x, ""utf-8"", ""replace"") if self.sqlite_extensions: conn.enable_load_extension(True) for extension in self.sqlite_extensions: # ""extension"" is either a string path to the extension # or a 2-item tuple that specifies which entrypoint to load. #if isinstance(extension, tuple): # path, entrypoint = extension # conn.execute(""SELECT load_extension(?, ?)"", [path, entrypoint]) #else: conn.execute(""SELECT load_extension('C:\Windows\System32\mod_spatialite.dll')"") ``` At which point the counties example worked. Is there a correct way to install/use the extension on Windows? My method will cause issues if there's a second extension to be used. On an unrelated note, my next step is to figure out how to write a query across the two loaded databases supplied from the command line: `python -m datasette rm_toucans_23_10_07.db counties.db -m metadata.yml --load-extension=SpatiaLite` ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2198/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1955676270,I_kwDOBm6k_c50kUBu,2201,Discord invite link is invalid,11708906,open,0,,,0,2023-10-21T21:50:05Z,2023-10-21T21:50:05Z,,NONE,,"https://datasette.io/discord leads to https://discord.com/invite/ktd74dm5mw and returns the following: ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2201/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1977726056,I_kwDOBm6k_c514bRo,2203,custom plugin not seen as sql function,7113541,open,0,,,0,2023-11-05T10:30:19Z,2023-11-05T10:30:19Z,,NONE,,"Hi, I'm not sure if this is the right repo for this issue. I'm using datasette with the parquet (to read a duckdb), and jellyfish plugins. Both work perfectly. Now I need to create a simple plugin that uses the python rouge package and returns a similarity score (similarly to how the jellyfish plugin works). If I create a custom plugin, even the example hello_world one, copied directly from the tutorial, I get the following error: ```duckdb.duckdb.CatalogException: Catalog Error: Scalar Function with name hello_world does not exist!``` Since the jellyfish plugin doesn't do anything more complex, I'm wondering if there is some other kind of issue with my setup.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2203/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1994845152,I_kwDOBm6k_c525uvg,2207,ModuleNotFoundError: No module named 'click_default_group,283441,open,0,,,0,2023-11-15T14:04:32Z,2023-11-15T14:04:32Z,,NONE,,"No matter what I do, I'm getting this error: ``` $ datasette Traceback (most recent call last): File ""/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/bin/datasette"", line 5, in from datasette.cli import cli File ""/Users/honza/Library/Caches/pypoetry/virtualenvs/juniorguru-Lgaxwd2n-py3.11/lib/python3.11/site-packages/datasette/cli.py"", line 6, in from click_default_group import DefaultGroup ModuleNotFoundError: No module named 'click_default_group' ``` I have datasette in my dependencies like this: ```toml [tool.poetry.group.dev.dependencies] datasette = {version = ""1.0a7"", allow-prereleases = true} ``` I had the latest regular version (not pre-release) there originally, but the result was the same: ```toml [tool.poetry.group.dev.dependencies] datasette = ""0.64.5"" ``` Full pyproject.toml is at https://github.com/honzajavorek/junior.guru/ Previously datasette worked for me, but I guess something had to upgrade and now I can't even launch it.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2207/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 2019811176,I_kwDOBm6k_c54Y99o,2211,Unreachable exception handlers for `sqlite3.OperationalError`,1214074,open,0,,,0,2023-12-01T00:50:22Z,2023-12-01T00:50:22Z,,NONE,,"There are several places where `sqlite3.OperationalError` is caught as part of an exception handler which catches multiple exceptions, but is then caught again immediately afterwards by a dedicated exception handler. Because the exception will be caught by the first handler, the logic in the second handler is unreachable and will never be executed. If this is intended behavior, the second handler can be removed. If this is not intended, and the second handler should be the one that catches this exception, then `sqlite3.OperationalError` should be removed from the tuple of exceptions in the first handler. This issue was found via a CodeQL query on the repository, and I've listed the occurrences found by the query below. There may be other instances of this issue in the code that were not surfaced by the query. I'd be happy to share the query if others would like to view or run it. One example: https://github.com/simonw/datasette/blob/452a587e236ef642cbc6ae345b58767ea8420cb5/datasette/views/database.py#L534-L537 Other instances: https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L266-L270 https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L452-L456",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2211/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 2023057255,I_kwDOBm6k_c54lWdn,2212,Can't filter with numbers,605070,open,0,,,0,2023-12-04T05:26:29Z,2023-12-04T05:26:29Z,,NONE,,"I have a schema that uses numbers for a column (actually it's a boolean 1 or 0 but SQLite doesn't have Boolean). I can't seem to get the facet to work or even filtering on this column. My guess is that Datasette is ""stringifying"" the number and it's not matching? Example: https://debian-sqlelf.fly.dev/debian/elf_symbols?_sort_desc=name&_facet=exported&exported=0",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2212/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 274161964,MDU6SXNzdWUyNzQxNjE5NjQ=,101,TemplateAssertionError: no filter named 'tojson',450244,closed,0,,,1,2017-11-15T13:47:32Z,2017-11-15T13:48:55Z,2017-11-15T13:48:55Z,NONE,,"I get an exception clicking on the table link: ``` 2017-11-15 08:40:10 - (sanic)[ERROR]: Traceback (most recent call last): File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic/app.py"", line 503, in handle_request response = await response File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/app.py"", line 155, in get return await self.view_get(request, name, hash, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/app.py"", line 219, in view_get **context, File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic_jinja2/__init__.py"", line 84, in render return html(self.render_string(template, request, **context)) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic_jinja2/__init__.py"", line 81, in render_string return self.env.get_template(template).render(**context) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 812, in get_template return self._load_template(name, self.make_globals(globals)) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 786, in _load_template template = self.loader.load(self, name, globals) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/loaders.py"", line 125, in load code = environment.compile(source, name, filename) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 565, in compile self.handle_exception(exc_info, source_hint=source_hint) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 754, in handle_exception reraise(exc_type, exc_value, tb) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/_compat.py"", line 37, in reraise raise value.with_traceback(tb) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/templates/table.html"", line 29, in template
params = {{ query.params|tojson(4) }}
File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 515, in _generate return generate(source, self, name, filename, defer_init=defer_init) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 62, in generate generator.visit(node) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 849, in visit_Template self.blockvisit(block.body, block_frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 1172, in visit_If self.blockvisit(node.body, if_frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 1353, in visit_Output self.visit(argument, frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 1565, in visit_Filter self.fail('no filter named %r' % node.name, node.lineno) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 427, in fail raise TemplateAssertionError(msg, lineno, self.name, self.filename) jinja2.exceptions.TemplateAssertionError: no filter named 'tojson' ```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/101/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 282971961,MDU6SXNzdWUyODI5NzE5NjE=,175,"Add project topic ""automatic-api""",3179832,closed,0,,,1,2017-12-18T18:09:17Z,2017-12-21T18:33:55Z,2017-12-21T18:33:55Z,NONE,,"Hi there! Could you add the ~~tag~~ topic `automatic-api` to your repository? I am [making a list](https://github.com/dbohdan/automatic-api) of all projects that automatically expose APIs to databases. (Your Show HN made me do it. :-) I knew about PostgREST and PostGraphQL, but it took adding Datasette to sell me on the concept.) They will be easier to discover if there is a standard GitHub tag, and `automatic-api` seems as good a candidate as any. Two projects [already use it](https://github.com/topics/automatic-api).",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/175/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 319449852,MDU6SXNzdWUzMTk0NDk4NTI=,247,SQLite code decoupled from Datasette,11912854,open,0,,,1,2018-05-02T08:03:28Z,2018-05-21T15:29:31Z,,NONE,,"I'm working on the possibility of use Datasette with other file formats that aren't SQLite, like files with [PyTables](https://github.com/PyTables/PyTables) format. In order to accomplish that, I've started [a fork for decoupling the code related with SQLite](https://github.com/jsancho-gpl/datasette/tree/feature/db-type-plugin) and putting it in an external connector to allow future connectors for a lot of file formats. It'd be nice if you could look at it and suggest improvements for a possible PR.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/247/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 325553991,MDExOlB1bGxSZXF1ZXN0MTg5ODYwMDUy,281,Reduces image size using Alpine + Multistage (re: #278),487897,closed,0,,,1,2018-05-23T05:27:05Z,2018-05-26T02:10:38Z,2018-05-26T02:10:38Z,NONE,simonw/datasette/pulls/281,"Hey Simon! I got the image size down from 256MB to 110MB. Seems to be working okay, but you might want to test it a bit more. Example output of `docker run --rm -it datasette` ``` Serve! files=() on port 8001 [2018-05-23 05:23:08 +0000] [1] [INFO] Goin' Fast @ http://127.0.0.1:8001 [2018-05-23 05:23:08 +0000] [1] [INFO] Starting worker [1] ``` Related: https://github.com/simonw/datasette/issues/278 ",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/281/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 352768017,MDU6SXNzdWUzNTI3NjgwMTc=,362,Add option to include/exclude columns in search filters,78156,open,0,,,1,2018-08-22T01:32:08Z,2020-11-03T19:01:59Z,,NONE,,"I have a dataset with many columns, of which only some are likely to be of interest for searching. It would be great for usability if the search filters in the UI could be configured to include/exclude columns. See also: https://github.com/simonw/datasette/issues/292",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/362/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 392610803,MDU6SXNzdWUzOTI2MTA4MDM=,391,Google Trends example doesn’t work,229881,closed,0,,,1,2018-12-19T13:51:38Z,2019-01-02T19:45:13Z,2019-01-02T19:45:12Z,NONE,,"https://google-trends.datasettes.com/ I see a cloud flare error. ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/391/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 408518024,MDU6SXNzdWU0MDg1MTgwMjQ=,410,How to setup a multi database environment?,30607,closed,0,,,1,2019-02-10T09:39:24Z,2019-04-12T04:42:28Z,2019-04-12T04:42:27Z,NONE,,"Hi, first of all I need to write that Simon Willison and datasette are really great. I have probably a stupid question, but it seems to me that I do not have the reply in the documentation. I have installed datasette and run it with `datasette mydb.db`, and I can reach it on `http://127.0.0.1:8001`. But how to work with more than one db? Imagine I have ten sqlite databases, and that I need to explore/query these via datasette, how to run datasette? Is it possibile to create a sort of db index and than run `datasette serve myindex`? Thank you",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/410/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 435819321,MDU6SXNzdWU0MzU4MTkzMjE=,436,400 Error when trying to register new user via https://publish.datasettes.com/,317694,closed,0,,,1,2019-04-22T17:55:00Z,2021-01-04T20:15:42Z,2021-01-04T20:15:41Z,NONE,,"Behavior: When registering a new user via Zeit - confirmation is sent and screen acknowledges registered user... When clicking grant access the next screen is a white 400 error message. Replicated: Chrome and Firefox; 2 different email accounts",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/436/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 450862577,MDU6SXNzdWU0NTA4NjI1Nzc=,496,Additional options to gcloud build command in cloudrun - timeout,1740337,closed,0,,,1,2019-05-31T15:43:55Z,2019-05-31T23:05:05Z,2019-05-31T23:05:05Z,NONE,,"I am trying to deploy a 3.1 GB dataset to cloudrun with datasette. Currrently the docker build times out. Would be nice to have a timeout flag or additional gcloud commands that could be specified. Here is the line https://github.com/simonw/datasette/blob/f825e2012109247fa246e2b938f8174069e574f1/datasette/publish/cloudrun.py#L78 I would be happy to submit a PR to allow for a timeout option. What are your ideas of allowing the user additional build publishing flag options?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/496/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 453243459,MDU6SXNzdWU0NTMyNDM0NTk=,503,Handle SQLite databases with spaces in their names?,7936571,closed,0,9599,,1,2019-06-06T21:20:59Z,2019-11-04T23:16:30Z,2019-11-04T23:16:30Z,NONE,,"I named my SQLite database ""Government workers"" and published it to Heroku. When I clicked the ""Government workers"" database online it lead to a 404 page: `Database not found: Government%20workers`. I believe this is because the database name has a space.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/503/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 460396952,MDExOlB1bGxSZXF1ZXN0MjkxNTM0NTk2,529,Use keyed rows - fixes #521,1383872,closed,0,,,1,2019-06-25T12:33:48Z,2019-06-25T12:35:07Z,2019-06-25T12:35:07Z,NONE,simonw/datasette/pulls/529,"Supports template syntax like this: ``` {% for row in display_rows %}

{{ row[""First_Name""] }} {{ row[""Last_Name""] }}

... ```",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/529/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 473307794,MDU6SXNzdWU0NzMzMDc3OTQ=,565,Conflict between datasette and uvicorn click versions,440503,closed,0,,,1,2019-07-26T11:13:40Z,2020-10-02T00:09:55Z,2020-10-02T00:09:55Z,NONE,,"Hello Datasette is awesome thanks so much! I not very familiar with Python but I think there is a problem with datasette docker builds I keep getting this error ``` ERROR: uvicorn 0.8.4 has requirement click==7.*, but you'll have click 6.0 which is incompatible. ERROR: datasette 0.29.2 has requirement click~=7.0, but you'll have click 6.0 which is incompatible. ``` The full log from the docker build is here - https://gist.github.com/jonheslop/e01cd322e761cfaf34f0cb83f86411b0 Just in case it’s helpful this is my setup - https://github.com/dotwatcher/dotwatcher-data",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/565/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 476437213,MDU6SXNzdWU0NzY0MzcyMTM=,566,Unexpected keyword argument 'hidden',8330931,closed,0,,,1,2019-08-03T10:07:57Z,2019-08-03T16:13:36Z,2019-08-03T16:13:36Z,NONE,,"I couldn't get a test example running. I am running python 3.6.8 and tried both windows and windows subsystem for linux, getting the same error. My test.db was created by converting a five line csv file with csvs-to-sqlite. The csv file is: col1, col2, col3 1,2,3 4,5,6 7,8,9 10,11,12 Here is the error message: (myvenv) davido@DESKTOP-L29G79U:~/dot/datasette-eg$ datasette test.db Traceback (most recent call last): File ""/home/davido/dot/datasette-eg/myvenv/bin/datasette"", line 7, in from datasette.cli import cli File ""/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/datasette/cli.py"", line 2, in import uvicorn File ""/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/uvicorn/__init__.py"", line 2, in from uvicorn.main import Server, main, run File ""/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/uvicorn/main.py"", line 224, in headers: typing.List[str], File ""/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/click/decorators.py"", line 170, in decorator _param_memo(f, OptionClass(param_decls, **attrs)) File ""/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/click/core.py"", line 1430, in __init__ Parameter.__init__(self, param_decls, type=type, **attrs) TypeError: __init__() got an unexpected keyword argument 'hidden' Thanks.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/566/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 476852861,MDU6SXNzdWU0NzY4NTI4NjE=,568,Add database_color as a configurable option,50906992,open,0,,,1,2019-08-05T13:14:45Z,2023-08-11T05:19:42Z,,NONE,,This would be really useful as it would allow us to tie in with colour schemes.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/568/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 508100844,MDU6SXNzdWU1MDgxMDA4NDQ=,598,Character encoding bug with CSV export,46313,closed,0,,,1,2019-10-16T21:09:30Z,2021-06-17T18:13:20Z,2019-10-18T22:52:21Z,NONE,,"I was just poking around, and at [this URL](https://sql-murder-mystery.datasette.io/sql-murder-mystery/crime_scene_report.csv?_stream=on&type=arson&_size=max), I encountered this error: ``` 'latin-1' codec can't encode character '\u2019' in position 27: ordinal not in range(256) ``` ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/598/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 539985017,MDExOlB1bGxSZXF1ZXN0MzU0ODY5Mzkx,652,Quick (and uninformed and perhaps misguided) attempt to add a url for hosting datasette at a particular host/URI,132978,closed,0,,,1,2019-12-18T23:37:16Z,2020-03-24T22:14:50Z,2020-03-24T22:14:50Z,NONE,simonw/datasette/pulls/652,"As usual, I don't really know what I'm doing... so this is just a suggested approach. I've not written tests, I've not run the tests, I don't know if I've missed some absolute URLs that would need to have the leading slash dropped. BUT, I tested it with `--config base_url:http://127.0.0.1:8001/` on the command line and from what little I know about datasette it's at least working in some obvious cases. My changes are based on what I saw in https://github.com/simonw/datasette/commit/8da2db4b71096b19e7a9ef1929369b8483d448bf (thanks!) I'm happy to be more thorough on this if you think it's worth pursuing. Fixes #394 (he said, optimistically).",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/652/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 551834842,MDU6SXNzdWU1NTE4MzQ4NDI=,659,README information is obscured by feature history,55480210,closed,0,,,1,2020-01-18T22:34:51Z,2020-12-10T23:28:51Z,2020-12-10T23:28:51Z,NONE,,"While it's sometimes valuable to know how a project has developed, there is usually little justification for including this information in the README, and certainly not immediately after other key information such as ""what does this package do, and who might want to use it?"" Might I recommend that the feature history is migrated to an Appendix in the documentation?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/659/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611835285,MDU6SXNzdWU2MTE4MzUyODU=,752,Non-utf8 encoding in exceptionhandlers and custom-pages,2181410,closed,0,,,1,2020-05-04T12:24:42Z,2020-05-04T17:42:20Z,2020-05-04T17:42:20Z,NONE,,"Hi Simon. Whenever a response is not piped through a router-view, the template is encoded in latin-1 (I think). This is especially a problem (for me) with the new custom_pages-functionality, but also problematic with the 404- and 500-handlers. Thanks!",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/752/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 656959584,MDU6SXNzdWU2NTY5NTk1ODQ=,893,pip3 install datasette not serving static on linuxbrew.,44167,closed,0,,,1,2020-07-14T23:33:38Z,2021-06-02T04:29:56Z,2021-06-02T04:29:56Z,NONE,,"*This error wasn't thrown* ``` Traceback (most recent call last): File ""/home/linuxbrew/.linuxbrew/opt/python@3.8/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 289, in inner_static full_path.relative_to(root_path) File ""/home/linuxbrew/.linuxbrew/opt/python@3.8/lib/python3.8/pathlib.py"", line 904, in relative_to raise ValueError(""{!r} does not start with {!r}"" ValueError: '/home/linuxbrew/.linuxbrew/lib/python3.8/site-packages/datasette/static/app.css' does not start with '/home/linuxbrew/.linuxbrew/opt/python@3.8/lib/python3.8/site-packages/datasette/static' ``` Linuxbrew install python@3.8 with symbolic links when You call the full_path.relative_to(root_path) throw ValueError. This happened when you install from pip3 when you install with python3 setup.py develop , works good. Well at the end the static wasn't serving. ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/893/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 660827546,MDU6SXNzdWU2NjA4Mjc1NDY=,899,How to setup a request limit per user,133845,closed,0,,,1,2020-07-19T13:08:25Z,2020-07-31T23:54:42Z,2020-07-31T23:54:42Z,NONE,,"Hello, Until now I'm using datasette without any authentication system but I would like to setup a configuration or limiting the number of requests per user (eventually by IP or with a cookie mechanism) and eventually allowing me to ban specific users/IPs. Is there a plugin available for this use case ? If not what are your insights regarding this UC ? Should I write a plugin ? Should I deploy datasette behind a reverse proxy to manage this ? ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/899/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 699947574,MDU6SXNzdWU2OTk5NDc1NzQ=,963,Currently selected array facets are not correctly persisted through hidden form fields,649467,closed,0,,5818042,1,2020-09-12T01:49:17Z,2020-09-12T21:54:29Z,2020-09-12T21:54:09Z,NONE,,"Faceted search uses JSON array elements as facets rather than the arrays. However, if a search is ""Apply""ed (using the Apply button), the array itself rather than its elements used. To reproduce: https://latest.datasette.io/fixtures/facetable?_sort=pk&_facet=created&_facet=tags&_facet_array=tags Press ""Apply"", which might be done when removing a filter. Notice that the ""tags"" facet values are now arrays, not array elements. It appears the ""&_facet_array=tags"" element of the query string is dropped.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/963/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 707849175,MDU6SXNzdWU3MDc4NDkxNzU=,974,static assets and favicon aren't cached by the browser,45416,open,0,,,1,2020-09-24T04:44:55Z,2022-01-13T22:21:28Z,,NONE,,"Using datasette to solve some frustrating problems with our fulfillment provider today, I was surprised to see repeated requests for assets under /-/static and the favicon. While it won't likely be a huge performance bottleneck, I bet datasette would feel a bit zippier if you had Uvicorn serving up some caching-related headers telling the browser it was safe to cache static assets.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/974/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 743011397,MDU6SXNzdWU3NDMwMTEzOTc=,1094,import EX_CANTCREAT means datasette fails to work on Windows,1049910,closed,0,,,1,2020-11-14T14:17:11Z,2020-12-05T19:35:04Z,2020-12-05T19:35:04Z,NONE,,"Trying to use datasette 0.51.1 gives the following error: ``` ImportError: cannot import name 'EX_CANTCREAT' from 'os' (C:\Users\drkan\AppData\Local\Programs\Python\Python39\lib\os.py) ``` Looks like that code is only available on unix: https://docs.python.org/3/library/os.html#os.EX_CANTCREAT Removing the line makes it work fine (`EX_CANTCREAT` doesn't seem to be used anywhere?)",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1094/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 756818250,MDU6SXNzdWU3NTY4MTgyNTA=,1127,Make the custom SQL query text box larger or resizable,596279,closed,0,,,1,2020-12-04T05:37:11Z,2021-06-02T04:29:06Z,2021-06-02T04:28:55Z,NONE,,"The text entry field for custom SQL queries is too small to display a moderately complex query, especially when it's been formatted. Would it be easy to make the textbox resizable by the user rather than having a fixed height?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1127/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 791381623,MDU6SXNzdWU3OTEzODE2MjM=,1197,DB size limit for publishing with Heroku,1186275,closed,0,,,1,2021-01-21T18:08:43Z,2021-01-24T20:53:44Z,2021-01-24T20:53:44Z,NONE,,"Hello, I tried searching for this, but can't seem to get a great answer: Does anybody know the size limit for databases deploying to Heroku? The files I'm working with are pretty large, but I might be able to pare them down if I have a limit in mind. I'm getting the following error when running `datasette heroku publish`: `RangeError [ERR_INVALID_OPT_VALUE]: The value ""14504095744"" is invalid for option ""size""`",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1197/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 795367402,MDU6SXNzdWU3OTUzNjc0MDI=,1209,v0.54 500 error from sql query in custom template; code worked in v0.53; found a workaround,11788561,open,0,,,1,2021-01-27T19:08:13Z,2021-01-28T23:00:27Z,,NONE,,"v0.54 500 error in sql query template; code worked in v0.53; found a workaround **schema:** CREATE TABLE ""talks"" (""talk"" TEXT,""series"" INTEGER, ""talkdate"" TEXT) CREATE TABLE ""series"" (""id"" INTEGER PRIMARY KEY, ""series"" TEXT, talks_list TEXT default '', website TEXT default ''); **Live example of correctly rendered template in v.053:** https://cosmotalks-cy6xkkbezq-uw.a.run.app/cosmotalks/talks/1 **Description of problem:** I needed 'sql select' code in a custom row-mydatabase-mytable.html template to lookup the series name for a foreign key integer value in the talks table. So `metadata.json` specifies the `datasette-template-sql` plugin. The code below worked perfectly in v0.53 (just the relevant sql statement part is shown; full code is [here](https://github.com/jrdmb/cosmotalks-datasette/blob/main/templates/row-cosmotalks-talks.html)): ``` {# custom addition #} {% for row in display_rows %} ... {% set sname = sql(""select series from series where id = ?"", [row.series]) %} Series name: {{ sname[0].series }} ... {% endfor %} {# End of custom addition #} ``` **In v0.54, that code resulted in a 500 error with a 'no such table series' message.** A second query in that template also did not work but the above is fully illustrative of the problem. All templates were up-to-date along with datasette v0.54. **Workaround:** After fiddling around with trying different things, what worked was the syntax from [Querying a different database from the datasette-template-sql github repo](https://github.com/simonw/datasette-template-sql#querying-a-different-database) to add the database name to the sql statement: `{% set sname = sql(""select series from series where id = ?"", [row.series], database=""cosmotalks"") %}` Though this was found to work, it should not be necessary to add `database=""cosmotalks""` since per the `datasette-template-sql` README, it's only needed when querying a different database, but here it's a table within the same database. ",107914493,issue,,,,, 803356942,MDU6SXNzdWU4MDMzNTY5NDI=,1218, /usr/local/opt/python3/bin/python3.6: bad interpreter: No such file or directory,11855322,open,0,,,1,2021-02-08T09:07:00Z,2021-02-23T12:12:17Z,,NONE,,"Error as above, however I do have python3.8 and the readme indicates this is supported. ``` (venv) (base) Robins-MacBook:datasette robin$ ls /usr/local/opt/python3/bin/ .. pip3 python3 python3.8 ```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1218/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 811054000,MDU6SXNzdWU4MTEwNTQwMDA=,1230,"Vega charts are plotted only for rows on the visible page, cluster maps only for rows in the remaining pages",7107523,open,0,,,1,2021-02-18T12:27:02Z,2021-02-18T15:22:15Z,,NONE,,"I filtered a data set on some criteria and obtain 265 results, split over three pages (100, 100, 65), and reazlized that Vega plots are only applied to the results displayed on the current page, instead of the whole filtered data, _e.g._, 100 on page 1, 100 on page 2, 65 on page 3. Is there a way to force the graphs to consider all results instead of just the page, considering that pages rarely represent sensible information? Likewise, while the cluster map does show all results on the first page, if you go to next pages, it will show all remaining results except the previous page(s), _e.g._, 265 on page 1, 165 on page 2, 65 on page 3. In both cases, I don't see many situations where one would like to represent the data this way, and it might even lead to interpretation errors when viewing the data. Am I missing some cases where this would be best? Perhaps a clickable option to subset visual representations according visible pages _vs._ display all search results would do? [Edit] Oh, I just saw the ""Load all"" button under the cluster map as well as the [setting to alter the max number or results](https://docs.datasette.io/en/stable/settings.html#max-returned-rows). So I guess this issue only is about the Vega charts.",107914493,issue,,,,, 823035080,MDU6SXNzdWU4MjMwMzUwODA=,1248,duckdb database (very low performance in SQLite),15836677,closed,0,,,1,2021-03-05T12:20:29Z,2021-03-08T00:25:27Z,2021-03-08T00:25:27Z,NONE,,"My sqlite is getting too big to be processed by datasette (more than 10 minutes waiting to load) so I am working with duckdb and is waaaaay faster. I think the fastest embeddable database actually. https://duckdb.org/ Taking into account DuckDb is SQLite based it would be GREAT to use it with datasette. is that possible? Regards and thanks for a superb job",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1248/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 826064552,MDU6SXNzdWU4MjYwNjQ1NTI=,1253,"Capture ""Ctrl + Enter"" or ""⌘ + Enter"" to send SQL query?",9308268,open,0,,,1,2021-03-09T15:00:50Z,2021-10-30T16:00:42Z,,NONE,,"It appears as though ""Shift + Enter"" triggers the form submit action to submit SQL, but could that action be bound to the ""Ctrl + Enter"" or ""⌘ + Enter"" action? I feel like that pattern already exists in a number of similar tools and could improve usability of the editor.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1253/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 837208901,MDU6SXNzdWU4MzcyMDg5MDE=,1267,Update Datasette alternativeto listening with details,921217,closed,0,,,1,2021-03-21T23:20:20Z,2021-03-22T04:37:26Z,2021-03-22T04:37:26Z,NONE,,"Hello, I recently learned about Datasette from an old hackernews post. It seems like an awesome project and I actually have use case I might be trying out in the coming months. Alas, to get a better understanding of your project I looked it up on alternativeto to see what it is similar too. I promise it's not spam, it's reputable enough to have a [Wikipedia](https://en.wikipedia.org/wiki/AlternativeTo) page. There was no listing on the website so I went ahead and created a listing that is now approved. I encourage anyone who likes this project and hopes to spread the word to help update the listing by: 1. Adding to the list of software it compares to 2. Uploading screenshots 3. Writing a review 4. Adding ""features"" I know this may seem spammy but I promise I have no affiliation with alternativeto I'm just a happy user and know it's a popular site for discovering software. Here is the listing for datasette: https://alternativeto.net/software/datasette/about/ Cheers",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1267/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 891969037,MDU6SXNzdWU4OTE5NjkwMzc=,1326,How to limit fields returned from the JSON API?,5268174,closed,0,,,1,2021-05-14T14:27:41Z,2021-05-23T02:55:06Z,2021-05-23T02:55:00Z,NONE,,"Hi, I have quite wide tables, and in many cases only want a subset of the data (to save on network bandwidth). I need to use the JSON API as handling pagination is so much easier, but I can't see a way to select specific columns. Is there a way to do this, or is it a feature request? Thanks!",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1326/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925406964,MDU6SXNzdWU5MjU0MDY5NjQ=,1382,Datasette with Glitch - is it possible to use CSV with ISO-8859-1 encoding?,23701514,closed,0,,,1,2021-06-19T14:37:20Z,2021-06-20T00:21:02Z,2021-06-20T00:20:06Z,NONE,,"Hi Please, I used Remix on Glitch to create a project on Glitch and uploaded a CSV But it's a CSV with ISO-8859-1 encoding (https://en.wikipedia.org/wiki/ISO/IEC_8859-1) Is it possible for me to change the encoding to correctly visualize the data? Example: https://emphasized-carpal-pillow.glitch.me/data/Emendas Best",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1382/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1028115674,I_kwDOBm6k_c49R8za,1493,`--get '/:memory:.json?sql=select+3*5'` error with datasette 0.59,1580956,closed,0,,,1,2021-10-16T18:22:22Z,2021-10-19T04:39:11Z,2021-10-19T04:39:11Z,NONE,,"👋 trying to upgrade the formula to use the latest release, but runs into some regression test issue with `--get` command. My QQ is does this `datasette --get '/:memory:.json?sql=select+3*5'` supposed to return 15? Thanks! relates to https://github.com/Homebrew/homebrew-core/pull/87369",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1493/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1072106103,I_kwDOBm6k_c4_5wp3,1542,feature request: order and dependency of plugins (that use js),33631,open,0,,,1,2021-12-06T12:40:45Z,2021-12-15T17:47:08Z,,NONE,,"I have been playing with datasette for the last couple of weeks and it is great! I am a big fan of `datasette-cluster-map` and wanted to enhance it a bit with a what I would call a sub-plugin. I basically want to add more controls to the map that cluster map provides. I have been looking into its code and how the plugin management works, but it seems what I am trying to do is not doable without hacks in js. Basically what would like to have is a way to say load my plugin after the plugins I depend on have been loaded and rendered. There seems to be no prior art where plugins have these dependencies on the js level so I was wondering if that could be added or if it exists how to do it. Basically what I want to do is: my-awesome-plugin has a dependency on datastte-cluster-map. Whenever datasette cluster map has finished rendering on page load, call my plugin, but no earlier. To make that work datasette probably needs some total order in which way plugins are loaded intialized. Since I am new to datastte, I may be missing something obvious, so please let me know if the above makes no sense.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1542/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1076057610,I_kwDOBm6k_c5AI1YK,1546,validating the sql,50336793,closed,0,,,1,2021-12-09T21:35:57Z,2021-12-18T02:05:17Z,2021-12-18T02:05:16Z,NONE,,Could someone tell me that part of the code is responsible for validating the sql that guarantees that only a table can be read,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1546/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1170497629,I_kwDOBm6k_c5FxGBd,1662,[feature request] Publish to fully static website,32609395,closed,0,,,1,2022-03-16T03:32:28Z,2022-03-19T00:42:23Z,2022-03-19T00:42:23Z,NONE,,"It seems currently all datasette publish requires a real backend server which is able to query the database and send results back to the frontend. There are a few projects to on-demand download a portion of data from the database from a sqlite lite database url, and present it directly to the user. These methods leverages web assembly under the hood. I think datasette is a perfect use case for this technology. Below are a few examples of querying sqlite database from frontend directly. * [Using sqlite3 as a notekeeping document graph with automatic reference indexing](https://epilys.github.io/bibliothecula/notekeeping.html) * [Hosting SQLite databases on Github Pages - (or any static file hoster) - phiresky's blog](https://phiresky.github.io/blog/2021/hosting-sqlite-databases-on-github-pages/) * [Static torrent website with peer-to-peer queries over BitTorrent on 2M records](https://boredcaveman.xyz/post/0x2_static-torrent-website-p2p-queries.html)",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1662/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1173828092,PR_kwDOBm6k_c40q1eP,1665,Pin setup-gcloud to v0 instead of master,408570,closed,0,,,1,2022-03-18T17:17:22Z,2022-03-23T19:31:10Z,2022-03-23T17:55:39Z,NONE,simonw/datasette/pulls/1665,"setup-gcloud will be updating the branch name from master to main in a future release. Even though GitHub will establish redirects, this will break any GitHub Actions workflows that pin to master. This PR updates your GitHub Actions workflows to pin to v0, which is the recommended best practice.",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1665/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1251710928,I_kwDOBm6k_c5Km5fQ,1751,Add scrollbars to table presentation in default layout,408765,closed,0,,,1,2022-05-28T19:44:57Z,2022-05-28T19:52:17Z,2022-05-28T19:52:17Z,NONE,,"(As you will be able to tell from the terminology I use, I am not a frontend guy, but I hope you will understand.) When a table is wide and needs horizontal scrolling to see the columns towards the end, the user needs to scroll horizontally. However, since the container for the HTML table (`div` with class `table-wrapper`) isn't limited by the window size, I first need to vertically scroll near to the bottom of the page in order to scroll horizontally. Then I can scroll back up again. This isn't very user friendly. Instead, I think it would make sense to constrain the table's size (when necessary), so that the vertical and horizontal scrollbars either always are visible or at least not far out of reach. I understand that I could provide my own template and / or CSS, but I think it would probably make sense to adjust the default in this regard.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1751/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1337541526,I_kwDOBm6k_c5PuUOW,1780,`facet_time_limit_ms` and `sql_time_limit_ms` overlap?,53165,open,0,,,1,2022-08-12T17:55:37Z,2022-08-15T23:50:08Z,,NONE,,"I needed more than the default 200ms to facet a specific column in a database I was working with, so I ran `datasette` with `--setting facet_time_limit_ms 30000` — definitely overkill! But it still didn't work; it took a moment to realize I also needed to up my `sql_time_limit_ms` to something larger too. I'm happy to submit a PR that documents this behavior if it's helpful. Or, if there's a code change we'd like to make (like making sure `sql_time_limit_ms` is always set to the larger of itself and `facet_time_limit_ms`), happy to do that too. Apologies if I missed this somewhere in the docs. And: thanks. I'm really enjoying the simple, effective tooling datasette gives me out of the box for exploring my databases!",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1780/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1347717749,I_kwDOBm6k_c5QVIp1,1791,Updating metadata.json on Datasette for MacOS,1780782,open,0,,,1,2022-08-23T10:41:16Z,2022-08-23T13:29:51Z,,NONE,,"I've installed Datasette for Mac as per [the documentation](https://docs.datasette.io/en/stable/installation.html#datasette-desktop-for-mac) and it's working great! However, I'm not sure how to go about adding something like ""[Canned Queries](https://docs.datasette.io/en/stable/sql_queries.html#canned-queries)"" or utilising other advanced features or settings by manipulating the `metadata.json` or `settings.json` files. I can view these files from the Datasette App from the top right ""burger"" menu but it only shows the contents of the file with no way to edit or change it. Am I missing something? Where can I update the `metadata.json` file using the MacOS App? PS: This is a fantastic tool! Thanks so much for all the effort and especially adding a bunch of different ways to get started quickly!",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1791/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1351949898,PR_kwDOBm6k_c492dPw,1793,Added a useful resource,111973926,closed,0,,,1,2022-08-26T08:41:26Z,2022-09-06T00:41:25Z,2022-09-06T00:41:24Z,NONE,simonw/datasette/pulls/1793,"Have added a useful resource about the types of databases in SQL i.e SQLite, PostgreSQL, MySQL &, etc from the scaler topics. ---- :books: Documentation preview :books:: https://datasette--1793.org.readthedocs.build/en/1793/ ",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1793/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1387712501,I_kwDOBm6k_c5Sts_1,1824,Convert &_hide_sql=1 to #_hide_sql,562352,open,0,,,1,2022-09-27T12:53:31Z,2022-10-05T12:56:27Z,,NONE,,"Hiding the SQL textarea with `&_hide_sql=1` enforces a page reload, which can take several seconds and use server resource (which is annoying for big database or complex queries). It could probably be done with a few lines of Javascript (I'm going to see if I can do that).",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1824/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1501900064,I_kwDOBm6k_c5ZhS0g,1966,Broken link to live demo in Getting started docs,7551922,closed,0,,,1,2022-12-18T13:17:00Z,2022-12-31T19:15:19Z,2022-12-31T19:15:10Z,NONE,,The link in [Play with a live demo in Getting started](https://github.com/simonw/datasette/blob/main/docs/getting_started.rst#play-with-a-live-demo) to [https://fivethirtyeight.datasettes.com/fivethirtyeight](https://fivethirtyeight.datasettes.com/fivethirtyeight) is broken and the datasette is no longer working (maybe due to the end of the free tier).,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1966/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1536851861,I_kwDOBm6k_c5bmn-V,1994,Stuck on loading screen,10913053,open,0,,,1,2023-01-17T18:33:49Z,2023-01-23T08:21:08Z,,NONE,,"Can’t actually open it! Downloaded today from the releases tab Running macOS13.1 ``` bin/python3.9 --version Python 3.9.6 Took 83ms bin/python3.9 --version Python 3.9.6 Took 113ms bin/pip install datasette>=0.59 datasette-app-support>=0.11.6 datasette-vega>=0.6.2 datasette-cluster-map>=0.17.1 datasette-pretty-json>=0.2.1 datasette-edit-schema>=0.4 datasette-configure-fts>=1.1 datasette-leaflet>=0.2.2 --disable-pip-version-check Requirement already satisfied: datasette>=0.59 in lib/python3.9/site-packages (0.63) Requirement already satisfied: datasette-app-support>=0.11.6 in lib/python3.9/site-packages (0.11.6) Requirement already satisfied: datasette-vega>=0.6.2 in lib/python3.9/site-packages (0.6.2) Requirement already satisfied: datasette-cluster-map>=0.17.1 in lib/python3.9/site-packages (0.17.2) Requirement already satisfied: datasette-pretty-json>=0.2.1 in lib/python3.9/site-packages (0.2.2) Requirement already satisfied: datasette-edit-schema>=0.4 in lib/python3.9/site-packages (0.5.1) Requirement already satisfied: datasette-configure-fts>=1.1 in lib/python3.9/site-packages (1.1) Requirement already satisfied: datasette-leaflet>=0.2.2 in lib/python3.9/site-packages (0.2.2) Requirement already satisfied: click>=7.1.1 in lib/python3.9/site-packages (from datasette>=0.59) (8.1.3) Requirement already satisfied: hupper>=1.9 in lib/python3.9/site-packages (from datasette>=0.59) (1.10.3) Requirement already satisfied: pint>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.20.1) Requirement already satisfied: PyYAML>=5.3 in lib/python3.9/site-packages (from datasette>=0.59) (6.0) Requirement already satisfied: httpx>=0.20 in lib/python3.9/site-packages (from datasette>=0.59) (0.23.0) Requirement already satisfied: aiofiles>=0.4 in lib/python3.9/site-packages (from datasette>=0.59) (22.1.0) Requirement already satisfied: asgi-csrf>=0.9 in lib/python3.9/site-packages (from datasette>=0.59) (0.9) Requirement already satisfied: asgiref>=3.2.10 in lib/python3.9/site-packages (from datasette>=0.59) (3.5.2) Requirement already satisfied: uvicorn>=0.11 in lib/python3.9/site-packages (from datasette>=0.59) (0.19.0) Requirement already satisfied: itsdangerous>=1.1 in lib/python3.9/site-packages (from datasette>=0.59) (2.1.2) Requirement already satisfied: click-default-group-wheel>=1.2.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.2.2) Requirement already satisfied: janus>=0.6.2 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0) Requirement already satisfied: pluggy>=1.0 in lib/python3.9/site-packages (from datasette>=0.59) (1.0.0) Requirement already satisfied: Jinja2>=2.10.3 in lib/python3.9/site-packages (from datasette>=0.59) (3.1.2) Requirement already satisfied: mergedeep>=1.1.1 in lib/python3.9/site-packages (from datasette>=0.59) (1.3.4) Requirement already satisfied: sqlite-utils in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (3.30) Requirement already satisfied: packaging in lib/python3.9/site-packages (from datasette-app-support>=0.11.6) (21.3) Requirement already satisfied: python-multipart in lib/python3.9/site-packages (from asgi-csrf>=0.9->datasette>=0.59) (0.0.5) Requirement already satisfied: httpcore<0.16.0,>=0.15.0 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (0.15.0) Requirement already satisfied: certifi in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (2022.9.24) Requirement already satisfied: rfc3986[idna2008]<2,>=1.3 in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.5.0) Requirement already satisfied: sniffio in lib/python3.9/site-packages (from httpx>=0.20->datasette>=0.59) (1.3.0) Requirement already satisfied: h11<0.13,>=0.11 in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (0.12.0) Requirement already satisfied: anyio==3.* in lib/python3.9/site-packages (from httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.6.2) Requirement already satisfied: idna>=2.8 in lib/python3.9/site-packages (from anyio==3.*->httpcore<0.16.0,>=0.15.0->httpx>=0.20->datasette>=0.59) (3.4) Requirement already satisfied: typing-extensions>=3.7.4.3 in lib/python3.9/site-packages (from janus>=0.6.2->datasette>=0.59) (4.4.0) Requirement already satisfied: MarkupSafe>=2.0 in lib/python3.9/site-packages (from Jinja2>=2.10.3->datasette>=0.59) (2.1.1) Requirement already satisfied: tabulate in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (0.9.0) Requirement already satisfied: python-dateutil in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (2.8.2) Requirement already satisfied: sqlite-fts4 in lib/python3.9/site-packages (from sqlite-utils->datasette-app-support>=0.11.6) (1.0.3) Requirement already satisfied: pyparsing!=3.0.5,>=2.0.2 in lib/python3.9/site-packages (from packaging->datasette-app-support>=0.11.6) (3.0.9) Requirement already satisfied: six>=1.5 in lib/python3.9/site-packages (from python-dateutil->sqlite-utils->datasette-app-support>=0.11.6) (1.16.0) Took 784ms ``` STUCK",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1994/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1571207083,I_kwDOBm6k_c5dprer,2016,Database metadata fields like description are not available in the index page template's context,9993,open,0,,3268330,1,2023-02-05T02:25:53Z,2023-02-05T22:56:43Z,,NONE,,"When looping through `databases` in the index.html template, I'd like to print the description of each database alongside its name. But it appears that isn't passed in from the view, unless I'm missing it. It would be great to have that.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2016/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1578609658,I_kwDOBm6k_c5eF6v6,2022,Error 500 - not clear the cause,1667631,closed,0,,,1,2023-02-09T20:57:17Z,2023-02-09T21:13:50Z,2023-02-09T21:13:50Z,NONE,,"On the database that I have sent via linkedIn, datasette works great, but the following URL gives a 500 error. http://127.0.0.1:8001/literature/authors_papers?authorId=100550354 The cause of the error is not apparent. Is this expected behaviour? David",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2022/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1665053646,I_kwDOBm6k_c5jPrPO,2059,"""Deceptive site ahead"" alert on Heroku deployment",1186275,open,0,,,1,2023-04-12T18:34:51Z,2023-04-13T01:13:01Z,,NONE,,"I deployed a fairly basic instance of Datasette (`datasette-auth-passwords` is the only plugin) using Heroku. The deployed URL now gives a ""Deceptive site ahead"" warning to users. Is there way around this? Maybe a way to add ownership verification [through Google's search console](https://search.google.com/search-console/welcome)? ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2059/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1690765434,I_kwDOBm6k_c5kxwh6,2067,Litestream-restored db: errors on 3.11 and 3.10.8; but works on py3.10.7 and 3.10.6,39538958,open,0,,,1,2023-05-01T12:42:28Z,2023-05-03T00:16:03Z,,NONE,,"Hi! Wondering if this issue is limited to my local system or if it affects others as well. It seems like 3.11 errors out on a ""litestream-restored"" database. On further investigation, it also appears to conk out on 3.10.8 but works on 3.10.7 and 3.10.6. To demo issue I created a test database, replicated it to an aws s3 bucket, then restored the same under various .pyenv-versioned shells where I test whether I can read the database via the sqlite3 cli. ```sh # create new shell with 3.11.3 litestream restore -o data/db.sqlite s3://mytestbucketxx/db sqlite3 data/db.sqlite # SQLite version 3.41.2 2023-03-22 11:56:21 # Enter "".help"" for usage hints. # sqlite> .tables # _litestream_lock _litestream_seq movie # sqlite> ``` However this get me an `OperationalError` when reading via datasette:
Error on 3.11.3 and 3.10.8 ```sh datasette data/db.sqlite ``` ```console /tester/.venv/lib/python3.11/site-packages/pkg_resources/__init__.py:121: DeprecationWarning: pkg_resources is deprecated as an API warnings.warn(""pkg_resources is deprecated as an API"", DeprecationWarning) Traceback (most recent call last): File ""/tester/.venv/bin/datasette"", line 8, in sys.exit(cli()) ^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) ^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1657, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/cli.py"", line 143, in wrapped return fn(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/cli.py"", line 615, in serve asyncio.get_event_loop().run_until_complete(check_databases(ds)) File ""/Users/mv/.pyenv/versions/3.11.3/lib/python3.11/asyncio/base_events.py"", line 653, in run_until_complete return future.result() ^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/cli.py"", line 660, in check_databases await database.execute_fn(check_connection) File ""/tester/.venv/lib/python3.11/site-packages/datasette/database.py"", line 213, in execute_fn return await asyncio.get_event_loop().run_in_executor( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/mv/.pyenv/versions/3.11.3/lib/python3.11/concurrent/futures/thread.py"", line 58, in run result = self.fn(*self.args, **self.kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/database.py"", line 211, in in_thread return fn(conn) ^^^^^^^^ File ""/tester/.venv/lib/python3.11/site-packages/datasette/utils/__init__.py"", line 951, in check_connection for r in conn.execute( ^^^^^^^^^^^^^ sqlite3.OperationalError: unable to open database file ```
Works on 3.10.7, 3.10.6 ```sh # create new shell with 3.10.7 / 3.10.6 litestream restore -o data/db.sqlite s3://mytestbucketxx/db datasette data/db.sqlite # ... # INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) ```
In both scenarios, the only dependencies were the pinned python version and the latest Datasette version 0.64.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2067/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1698865182,I_kwDOBm6k_c5lQqAe,2069,[BUG] Cannot insert new data to deployed instance,31861128,open,0,,,1,2023-05-07T02:59:42Z,2023-05-07T03:17:35Z,,NONE,,"## Summary Recently, I deployed an instance of datasette to Vercel with the following plugins: - datasette-auth-tokens - datasette-insert With the above plugins, I was able to insert new data to local sqlite db. However, when it comes to the deployment on Vercel, things behave differently. I observed some errors from the logs console on Vercel: ```console File ""/var/task/datasette/database.py"", line 179, in _execute_writes conn = self.connect(write=True) File ""/var/task/datasette/database.py"", line 93, in connect assert not (write and not self.is_mutable) AssertionError ``` I think it is a potential bug. ## Reproduce
metadata.json
```json { ""plugins"": { ""datasette-insert"": { ""allow"": { ""id"": ""*"" } }, ""datasette-auth-tokens"": { ""tokens"": [ { ""token"": { ""$env"": ""INSERT_TOKEN"" }, ""actor"": { ""id"": ""repeater"" } } ], ""param"": ""_auth_token"" } } } ```
commands
```bash # deploy datasette publish vercel remote.db \ --project=repeater-bot-sqlite \ --metadata metadata.json \ --install datasette-auth-tokens \ --install datasette-insert \ --vercel-json=vercel.json # test insert cat fixtures/dogs.json | curl --request POST -d @- -H ""Authorization: Bearer "" \ 'https://repeater-bot-sqlite.vercel.app/-/insert/remote/dogs?pk=id' ```
logs
```console Traceback (most recent call last): File ""/var/task/datasette/app.py"", line 1354, in route_path response = await view(request, send) File ""/var/task/datasette/app.py"", line 1500, in async_view_fn response = await async_call_with_supported_arguments( File ""/var/task/datasette/utils/__init__.py"", line 1005, in async_call_with_supported_arguments return await fn(*call_with) File ""/var/task/datasette_insert/__init__.py"", line 14, in insert_or_upsert response = await insert_or_upsert_implementation(request, datasette) File ""/var/task/datasette_insert/__init__.py"", line 91, in insert_or_upsert_implementation table_count = await db.execute_write_fn(write_in_thread, block=True) File ""/var/task/datasette/database.py"", line 167, in execute_write_fn raise result File ""/var/task/datasette/database.py"", line 179, in _execute_writes conn = self.connect(write=True) File ""/var/task/datasette/database.py"", line 93, in connect assert not (write and not self.is_mutable) AssertionError ```
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2069/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 2029908157,I_kwDOBm6k_c54_fC9,2214,CSV export fails for some `text` foreign key references,2874,open,0,,,1,2023-12-07T05:04:34Z,2023-12-07T07:36:34Z,,NONE,,"I'm starting this issue without a clear reproduction in case someone else has seen this behavior, and to use the issue as a notebook for research. I'm using Datasette with the [SWITRS](https://iswitrs.chp.ca.gov/) data set, which is a California Highway Patrol collection of traffic incident data from the past decade or so. I receive data from them in CSV and want to work with it in Datasette, then export it to CSV for mapping in Felt.com. Their data makes extensive use of codes for incident column data (`1` for `Monday` and so on), some of it integer codes and some of it letter/text codes. The text codes are sometimes blank or `-`. During import, I'm creating lookup tables for foreign key references to make the Datasette UI presentation of the data easier to read. If I import the data and set up the integer foreign keys, everything works fine, but if I set up the text foreign keys, CSV export starts to fail. The foreign key configuration is as follows: ``` # Some tables use integer ids, like sensible tables do. Let's import them first # since we favor them. for TABLE in DAY_OF_WEEK CHP_SHIFT POPULATION SPECIAL_COND BEAT_TYPE COLLISION_SEVERITY do sqlite-utils create-table records.db $TABLE id integer name text --pk=id sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE id sqlite-utils create-index records.db collisions $TABLE done # *Other* tables use letter keys, like they were raised by WOLVES. Let's put them # at the end of the import queue. for TABLE in WEATHER_1 WEATHER_2 LOCATION_TYPE RAMP_INTERSECTION SIDE_OF_HWY \ PRIMARY_COLL_FACTOR PCF_CODE_OF_VIOL PCF_VIOL_CATEGORY TYPE_OF_COLLISION MVIW \ PED_ACTION ROAD_SURFACE ROAD_COND_1 ROAD_COND_2 LIGHTING CONTROL_DEVICE \ STWD_VEHTYPE_AT_FAULT CHP_VEHTYPE_AT_FAULT PRIMARY_RAMP SECONDARY_RAMP do sqlite-utils create-table records.db $TABLE key text name text --pk=key sqlite-utils insert records.db $TABLE lookup-tables/$TABLE.csv --csv sqlite-utils add-foreign-key records.db collisions $TABLE $TABLE key sqlite-utils create-index records.db collisions $TABLE done ``` You can see the full code and import script here: https://github.com/radical-bike-lobby/switrs-db If I run this code and then hit the CSV export link in the Datasette interface (the simple link or the ""advanced"" dialog), export fails after a small number of CSV rows are written. I am not seeing any detailed error messages but this appears in the logging output: ``` INFO: 127.0.0.1:57885 - ""GET /records/collisions.csv?_facet=PRIMARY_RD&PRIMARY_RD=ASHBY+AV&_labels=on&_size=max HTTP/1.1"" 200 OK Caught this error: ``` (No other output follows `error:` other than a blank line.) I've stared at the rows directly after the error occurs and can't yet see what is causing the problem. I'm going to set up a development environment and see if I get any more detailed error output, and then stare more at some problematic lines to see if I can get a simple reproduction.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2214/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 274160723,MDU6SXNzdWUyNzQxNjA3MjM=,100,TemplateAssertionError: no filter named 'tojson',13304454,closed,0,,,2,2017-11-15T13:43:41Z,2017-11-16T09:25:10Z,2017-11-16T00:14:13Z,NONE,,"A 500 error is raised upon clicking on the name of a table on the homepage, say _http://0.0.0.0:8001/_ to _http://0.0.0.0:8001/test_check-c1f4771/users_ The API part seems to function as intended, though... ``` 2017-11-15 14:33:57 - (sanic)[ERROR]: Traceback (most recent call last): File ""/usr/local/lib/python3.5/dist-packages/sanic/app.py"", line 503, in handle_request response = await response File ""/usr/local/lib/python3.5/dist-packages/datasette/app.py"", line 155, in get return await self.view_get(request, name, hash, **kwargs) File ""/usr/local/lib/python3.5/dist-packages/datasette/app.py"", line 219, in view_get **context, File ""/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py"", line 84, in render return html(self.render_string(template, request, **context)) File ""/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py"", line 81, in render_string return self.env.get_template(template).render(**context) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 812, in get_template return self._load_template(name, self.make_globals(globals)) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 786, in _load_template template = self.loader.load(self, name, globals) File ""/usr/lib/python3/dist-packages/jinja2/loaders.py"", line 125, in load code = environment.compile(source, name, filename) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 565, in compile self.handle_exception(exc_info, source_hint=source_hint) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 754, in handle_exception reraise(exc_type, exc_value, tb) File ""/usr/lib/python3/dist-packages/jinja2/_compat.py"", line 37, in reraise raise value.with_traceback(tb) File ""/usr/local/lib/python3.5/dist-packages/datasette/templates/table.html"", line 29, in template
params = {{ query.params|tojson(4) }}
File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 515, in _generate return generate(source, self, name, filename, defer_init=defer_init) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 62, in generate generator.visit(node) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 849, in visit_Template self.blockvisit(block.body, block_frame) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1172, in visit_If self.blockvisit(node.body, if_frame) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1353, in visit_Output self.visit(argument, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1565, in visit_Filter self.fail('no filter named %r' % node.name, node.lineno) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 427, in fail raise TemplateAssertionError(msg, lineno, self.name, self.filename) jinja2.exceptions.TemplateAssertionError: no filter named 'tojson' 2017-11-15 14:33:57 - (network)[INFO][127.0.0.1:41316]: GET http://0.0.0.0:8001/test_check-c1f4771/users 500 144 2017-11-15 14:33:57 - (network)[INFO][127.0.0.1:41316]: GET http://0.0.0.0:8001/favicon.ico 200 0 ```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/100/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 281110295,MDU6SXNzdWUyODExMTAyOTU=,173,I18n and L10n support,50138,open,0,,,2,2017-12-11T17:49:58Z,2021-04-26T12:10:01Z,,NONE,,It would be less geeky and more user friendly if the display strings in the filter menu and possibly other parts could be localized.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/173/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 330826972,MDU6SXNzdWUzMzA4MjY5NzI=,308,"Support extra Heroku apps:create options - region, space, team",78156,open,0,,,2,2018-06-08T23:08:33Z,2018-09-21T14:09:28Z,,NONE,,"It would be useful to document how to pass Heroku CLI options on `datasette publish`, e.g. `--region eu`.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/308/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 339095976,MDU6SXNzdWUzMzkwOTU5NzY=,334,extra_options not passed to heroku publisher,719357,closed,0,,,2,2018-07-06T23:26:12Z,2018-07-24T04:53:21Z,2018-07-10T01:46:04Z,NONE,,"I might be wrong but I was not able to publish to `heroku` with `--extra-options`, I think `extra_options` is not being used in this function [here](https://github.com/simonw/datasette/blob/master/datasette/utils.py#L369). Any help appreciated! ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/334/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 400511206,MDU6SXNzdWU0MDA1MTEyMDY=,403,How does persistence work?,1794527,closed,0,,,2,2019-01-17T23:41:57Z,2019-01-19T05:47:55Z,2019-01-18T06:51:14Z,NONE,,I was under the impression that now.sh is for stateless microservices. So where are these SQLite databases stored and when do they get created and destroyed?,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/403/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 411257981,MDU6SXNzdWU0MTEyNTc5ODE=,412,Linked Data(sette),43340,open,0,,,2,2019-02-18T00:38:14Z,2019-03-19T10:09:46Z,,NONE,,"I've a radical feature idea (possible first as an extension in order to experiment?): I'd like to link to a remote table from a remote database, e.g. with a function ""linked_datasette()"". So one could do following query: ``` SELECT foo.id, foo.a, remote_party.b FROM foo JOIN linked_datasette(""https://parlgov.datasettes.com/parlgov-b42a2f2"") AS remote_party ON foo.id=remote_party.id ``` This is inspired by SPARQL's SERVICE keyword for remote RDF ""endpoints"". There's a foundation in the SQL Standard called SQL/MED (https://rhaas.blogspot.com/2011/01/why-sqlmed-is-cool.html ). And here's an implementation from me in Postgres FDW to connect another Postgres ""endpoint"": https://pastebin.com/Fz2v64Cz .",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/412/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 448189298,MDU6SXNzdWU0NDgxODkyOTg=,486,Ability to add extra routes and related templates,2181410,closed,0,,,2,2019-05-24T14:04:25Z,2019-05-24T14:43:28Z,2019-05-24T14:43:09Z,NONE,,"Hi Simon Thank for an excellent job! Datasette is such an obviously good idea (once you have that idea!) and so well done. The only thing that I miss, is the ability to add extras routes (with associated jinja2-templates). For most of the datasets, that I would like to publish, I would also like at least a page, that describes the data (semantics, provenance, biases...) and a page explaining our cookie- and privacy-policies (which would allows us to use something like Goggle Analytics). ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/486/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 457201907,MDU6SXNzdWU0NTcyMDE5MDc=,513,Is it possible to publish to Heroku despite slug size being too large?,7936571,closed,0,,,2,2019-06-18T00:12:02Z,2019-06-21T22:35:54Z,2019-06-21T22:35:54Z,NONE,,"I'm trying to push more than 1.5GB worth of SQLite databases -- 535MB compressed -- to Heroku but I get this error when I run the `datasette publish heroku` command. Compiled slug size: 535.5M is too large (max is 500M). Can I publish the databases and make datasette work on Heroku despite the large slug size?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/513/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 494685791,MDU6SXNzdWU0OTQ2ODU3OTE=,574,Improve usage description of --host option,132978,closed,0,,,2,2019-09-17T15:12:12Z,2019-11-01T21:58:17Z,2019-11-01T21:57:54Z,NONE,,"It would be nice if the `--host` option had a clearer description. I tried to get datasette running on an AWS instance and it took a while to realize it was only listening on localhost. So I wanted to make it listen on an non-localhost interface and tried giving a couple of values to `--host` (a host name, then an interface name), but none of them did. In the end I read the source to see that the option is passed to `uvicorn` and looked at the uvicorn docs, which also didn't help. Then I searched the web for ""example running datasette on a host"" which led me to https://github.com/simonw/datasette/issues/514 where I saw someone using `-h 0.0.0.0`. I tried that and it works. That usage could be mentioned somewhere, and might save someone else some time.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/574/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed