id,node_id,number,title,user,state,locked,assignee,milestone,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 1910269679,I_kwDOBm6k_c5x3Gbv,2196,Discord invite link returns 401,1892194,closed,0,,,2,2023-09-24T15:16:54Z,2023-10-13T00:07:08Z,2023-10-12T21:54:54Z,NONE,,"I found the link to the datasette discord channel via [this query](https://github.com/search?q=repo%3Asimonw%2Fdatasette%20discord&type=code). The following video should be self explanatory: https://github.com/simonw/datasette/assets/1892194/8cd33e88-bcaa-41f3-9818-ab4d589c3f02 Link for reference: https://discord.com/invite/ktd74dm5mw",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2196/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1825007061,I_kwDOBm6k_c5sx2XV,2123,datasette serve when invoked with --reload interprets the serve command as a file,79087,open,0,,,2,2023-07-27T19:07:22Z,2023-09-18T13:02:46Z,,NONE,,"When running `datasette serve` with the `--reload` flag, the serve command is picked up as a file argument: ``` $ datasette serve --reload test_db Starting monitor for PID 13574. Error: Invalid value for '[FILES]...': Path 'serve' does not exist. Press ENTER or change a file to reload. ``` If a 'serve' file is created it launches properly (albeit with an empty database called serve): ``` $ touch serve; datasette serve --reload test_db Starting monitor for PID 13628. INFO: Started server process [13628] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) ``` Version (running from HEAD on main): ``` $ datasette --version datasette, version 1.0a2 ``` This issue appears to have existed for awhile as https://github.com/simonw/datasette/issues/1380#issuecomment-953366110 mentions the error in a different context. I'm happy to debug and land a patch if it's welcome.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2123/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1795219865,I_kwDOCGYnMM5rAOGZ,566,`--no-headers` doesn't work on most formats,33625,open,0,,,2,2023-07-09T03:43:36Z,2023-07-09T04:13:35Z,,NONE,,"Version 3.33 ``` sqlite-utils query library.db 'select asin from audible' --fmt plain --no-headers | head -3 asin 0062804006 0062891421 ```",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/566/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1620254998,I_kwDOCGYnMM5gkyEW,532,Show more information when JSON can't be imported with sqlite-utils insert,83080728,closed,0,,,2,2023-03-12T06:41:44Z,2023-05-08T20:32:16Z,2023-05-08T20:32:02Z,NONE,,"I am currently trying to import the [JSON export of my data from Discord](https://support.discord.com/hc/en-us/articles/360004027692-Requesting-a-Copy-of-your-Data), specifically `activity/reporting/events-*.json` ``` sqlite-utils.exe insert test.db reporting events-2023-00000-of-00001.json [###################################-] 99% 00:00:00 Error: Invalid JSON - use --csv for CSV or --tsv for TSV files ``` Please show more information as to *why* this is invalid, if possible. 
I am using version 3.30 with Python 3.10 on Windows 11.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/532/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1515883470,I_kwDOC8tyDs5aWovO,24,DOC: xml.etree.ElementTree.ParseError due to healthkit version 12 ,6231413,open,0,,,2,2023-01-01T23:00:38Z,2023-03-30T10:17:31Z,,NONE,,"Hi @simonw I hope you find this issue ok, the idea is provide some documentation to other users like me about how to solve this problem and save some time. Following the instructions from the `README.md` I've faced this error: ```bash (venv) mgreco@pop-os apple-health master* (23:44|0s) $ healthkit-to-sqlite apple_health_export/export.xml healthkit.db --xml Importing from HealthKit [------------------------------------] 0% Traceback (most recent call last): File ""/home/mgreco/github/mmngreco/apple-health/venv/bin/healthkit-to-sqlite"", line 33, in sys.exit(load_entry_point('healthkit-to-sqlite', 'console_scripts', 'healthkit-to-sqlite')()) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/cli.py"", line 57, in cli convert_xml_to_sqlite(fp, db, progress_callback=bar.update, zipfile=zf) File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/utils.py"", line 25, in convert_xml_to_sqlite for tag, el in find_all_tags( File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/utils.py"", line 12, in find_all_tags for event, el in parser.read_events(): File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/xml/etree/ElementTree.py"", line 1324, in read_events raise event File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/xml/etree/ElementTree.py"", line 1296, in feed self._parser.feed(data) xml.etree.ElementTree.ParseError: syntax error: line 156, column 0 ``` So, after debugging and searching on internet I found this useful link: https://discussions.apple.com/thread/254202523 (etresoft, the real hero). Which basically says that the xml given by the health app (healthkit version 12) has some bugs but fortunately, they can be solved with a couple of commads: 1. Uncompress the zip and move the new folder where `export.xml` is. 1. Create a `patch.txt` with the following content ```diff --- export.xml 2022-09-18 15:17:09.000000000 -0400 +++ export-fixed.xml 2022-09-18 16:37:08.000000000 -0400 @@ -15,6 +15,7 @@ HKCharacteristicTypeIdentifierBiologicalSex CDATA #REQUIRED HKCharacteristicTypeIdentifierBloodType CDATA #REQUIRED HKCharacteristicTypeIdentifierFitzpatrickSkinType CDATA #REQUIRED + HKCharacteristicTypeIdentifierCardioFitnessMedicationsUse CDATA #IMPLIED > - + - + - device CDATA #IMPLIED - - -> ]> ``` 1. Apply the path with the command: `patch < patch.txt` 1. 
Fix endDates with the command `sed 's/startDate/endDate/2' export.xml > export-fixed.xml` 1. Try again `healthkit-to-sqlite export-fixed.xml healthkit.db --xml`",197882382,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1594383280,I_kwDOBm6k_c5fCFuw,2030,How to use Datasette with apache webserver on GCP?,19700859,closed,0,,,2,2023-02-22T03:08:49Z,2023-02-22T21:54:39Z,2023-02-22T21:54:39Z,NONE,,"Hi Simon and Datasette team- I have installed apache2 webserver inside GCP VM using apt. I can see my ""Hello World"" index.html if I use the external IP of this GCP in a browser. However, when I try to run datasette with different combinations of -h and -p, I am still unable to access the webpage. I cannot invest Docker on this VM. Any pointers to use datasette with already existing apache2 webserver on GCP is appreciated. Thanks.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2030/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1579695809,I_kwDOBm6k_c5eKD7B,2023,Error: Invalid setting 'hash_urls' in settings.json in 0.64.1,80409402,closed,0,,,2,2023-02-10T13:35:01Z,2023-02-10T15:40:00Z,2023-02-10T15:39:59Z,NONE,,"On a Debian machine, using datasette 0.64.1 installed with `pip3`, I am getting a `datasette[114272]: Error: Invalid setting 'hash_urls' in settings.json` in `journalctl -xe`. The same settings work on 0.54.1 on another Debian server. This is my `settings.json`: ```json { ""default_page_size"": 200, ""max_returned_rows"": 8000, ""num_sql_threads"": 3, ""sql_time_limit_ms"": 1000, ""default_facet_size"": 30, ""facet_time_limit_ms"": 200, ""facet_suggest_time_limit_ms"": 50, ""hash_urls"": false, ""allow_facet"": true, ""allow_download"": true, ""suggest_facets"": true, ""default_cache_ttl"": 5, ""default_cache_ttl_hashed"": 31536000, ""cache_size_kb"": 0, ""allow_csv_stream"": true, ""max_csv_mb"": 100, ""truncate_cells_html"": 2048, ""force_https_urls"": false, ""template_debug"": false, ""base_url"": ""/pclim/db/"" } ``` This looks ok to me. Would you have any ideas?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2023/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1378495690,I_kwDOBm6k_c5SKizK,1814,Static files not served,4068,closed,0,,,2,2022-09-19T20:38:17Z,2022-09-19T23:35:06Z,2022-09-19T23:34:30Z,NONE,,"Folder structure: ``` bibliography/ bibliography/static-files bibliography/static-files/styles.css bibliography/bibliography.db bibliography/metadata.json bibliography/settings.json ``` ``` $ cat bibliography/settings.json { ""suggest_facets"": false, ""truncate_cells_html"": 1000, ""static"": ""assets:static-files/"" } ``` File `/assets/styles.css` is not found (HTTP 404, `Database not found: assets`). 
Using datasette revision d0737e4de51ce178e556fc011ccb8cc46bbb6359.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1814/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1367835380,I_kwDOCGYnMM5Rh4L0,487,Specify foreign key against compound key in other table,540968,closed,0,,,2,2022-09-09T13:32:09Z,2022-09-11T04:00:44Z,2022-09-11T04:00:44Z,NONE,,"When inserting rows via the library, is it possible to specify a foreign key to a compound primary key? For example, suppose I create a table: ``` db = Database('events.db') db['events'].insert_all([ {'venue': 'Times Square', 'date': '2022-12-31', 'title': 'Rockin New Year Eve'}, {'venue': 'Wembley Stadium', 'date': '2022-06-05', 'title': 'FA Cup'}, {'venue': 'Times Square', 'date': '2021-12-31', 'title': 'Rockin New Year Eve'}, ], pk=('date', 'venue')) ``` And I want to add related data in another table: ``` act = {'name': 'Rick Astley', 'venue': 'Times Square', 'date': '2021-12-31' } db['performers'].insert(act, pk=) ``` Is it possible to specify a value for `pk` that will point to the compound primary key in `events`? SQLite does support it: https://www.sqlite.org/foreignkeys.html#fk_composite",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/487/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1353441389,I_kwDOCGYnMM5Qq-Bt,477,Conda Forge,49702524,closed,0,,,2,2022-08-28T19:03:08Z,2022-09-07T03:46:55Z,2022-09-07T03:46:55Z,NONE,,"Hello! I have successfully put this package on to Conda Forge, and I have extending the invitation for the owner/maintainers of this package to be maintainers on Conda Forge as well. Let me know if you are interested! Thanks. https://github.com/conda-forge/sqlite-utils-feedstock",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/477/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1224112817,I_kwDOCGYnMM5I9nqx,430,Document how to use `PRAGMA temp_store` to avoid errors when running VACUUM against huge databases,9308268,open,0,,,2,2022-05-03T13:33:58Z,2022-06-14T23:26:37Z,,NONE,,"I'm trying to figure out a way to get the `table.extract()` method to complete successfully -- I'm not sure if maybe the cause (and a possible solution) of this on Ubuntu Server 22.04 is to adjust some of the PRAGMA values within SQLite itself ... on another Linux system (PopOS), using this method on this same database appears to work just fine. 
Here's the bit that's causing the error, and the resulting error output: ```python # combine these columns into 1 table ""bib_properties"" : # best_title # bib_level_code # mat_type # material_code # best_author db[""circ_trans""].extract( [""best_title"", ""bib_level_code"", ""mat_type"", ""material_code"", ""best_author""], table=""bib_properties"", fk_column=""bib_properties_id"" ) db[""circ_trans""].extract( [""call_number""], table=""call_number"", fk_column=""call_number_id"", rename={""call_number"": ""value""} ) ``` ```python --------------------------------------------------------------------------- OperationalError Traceback (most recent call last) Input In [17], in () 1 # combine these columns into 1 table ""bib_properties"" : 2 # best_title 3 # bib_level_code 4 # mat_type 5 # material_code 6 # best_author ----> 7 db[""circ_trans""].extract( 8 [""best_title"", ""bib_level_code"", ""mat_type"", ""material_code"", ""best_author""], 9 table=""bib_properties"", 10 fk_column=""bib_properties_id"" 11 ) 13 db[""circ_trans""].extract( 14 [""call_number""], 15 table=""call_number"", 16 fk_column=""call_number_id"", 17 rename={""call_number"": ""value""} 18 ) File ~/jupyter/venv/lib/python3.10/site-packages/sqlite_utils/db.py:1764, in Table.extract(self, columns, table, fk_column, rename) 1761 column_order.append(c.name) 1763 # Drop the unnecessary columns and rename lookup column -> 1764 self.transform( 1765 drop=set(columns), 1766 rename={magic_lookup_column: fk_column}, 1767 column_order=column_order, 1768 ) 1770 # And add the foreign key constraint 1771 self.add_foreign_key(fk_column, table, ""id"") File ~/jupyter/venv/lib/python3.10/site-packages/sqlite_utils/db.py:1526, in Table.transform(self, types, rename, drop, pk, not_null, defaults, drop_foreign_keys, column_order) 1524 with self.db.conn: 1525 for sql in sqls: -> 1526 self.db.execute(sql) 1527 # Run the foreign_key_check before we commit 1528 if pragma_foreign_keys_was_on: File ~/jupyter/venv/lib/python3.10/site-packages/sqlite_utils/db.py:465, in Database.execute(self, sql, parameters) 463 return self.conn.execute(sql, parameters) 464 else: --> 465 return self.conn.execute(sql) OperationalError: database or disk is full ``` This database is about 17G in total size, so I'm assuming the error is coming from the vacuum ... where i'm assuming it's maybe trying to do the temp storage in a location that doesn't have sufficient room. The disk space is more than ample on the host in question (1.8T is free in the directory where the sqlite db resides) The `/tmp` directory however is limited on a smaller disk associated with the OS I'm trying to think if there's a way to set the `PRAGMA temp_store` or maybe if it's `temp_store_directory` that I'm after ... to use the same local directory of where the file is located (maybe this is a property of the version of sqlite on the system?) ```python # SET the temp file store to be a file ... print(db.execute('PRAGMA temp_store').fetchall()) print(db.execute('PRAGMA temp_store=FILE').fetchall()) print(db.execute('PRAGMA temp_store').fetchall()) # the users home directory ... 
print(db.execute(""PRAGMA temp_store_directory='/home/plchuser/'"").fetchall()) print(db.execute(""PRAGMA sqlite3_temp_directory='/home/plchuser/'"").fetchall()) print(db.execute(""PRAGMA temp_store_directory"").fetchall()) print(db.execute(""PRAGMA sqlite3_temp_directory"").fetchall()) ``` ```text [(1,)] [] [(1,)] [] [] [('/home/plchuser/',)] [] ``` Here's the docs on the Temporary File Storage Locations https://www.sqlite.org/tempfiles.html",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/430/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1250161887,I_kwDOCGYnMM5Kg_Tf,438,illegal UTF-16 surrogate,4068,closed,0,,,2,2022-05-26T22:49:52Z,2022-05-27T08:21:53Z,2022-05-27T08:21:53Z,NONE,,"I am trying to insert `https://artsdatabanken.no/Fab2018/api/export/csv` into a SQLite database, but I have an error when using `sqlite-utils`: ``` sqlite-utils insert --csv --delimiter "";"" --encoding=""utf-16-le"" --pk ""Id"" csv fremmedart test.db [------------------------------------] 0% Error: 'utf-16-le' codec can't decode bytes in position 98-99: illegal UTF-16 surrogate The input you provided uses a character encoding other than utf-8. You can fix this by passing the --encoding= option with the encoding of the file. If you do not know the encoding, running 'file filename.csv' may tell you. It's often worth trying: --encoding=latin-1 ``` I tried to convert the file using `iconv -f ""utf-16le"" -t ""utf-8""`, but I still get a similar error (slightly different position): ``` sqlite-utils insert --csv --delimiter "";"" --encoding=utf-8 --pk ""Id"" csv_utf8 fremmedart test.db [------------------------------------] 0% Error: 'utf-8' codec can't decode byte 0xd9 in position 99: invalid continuation byte The input you provided uses a character encoding other than utf-8. You can fix this by passing the --encoding= option with the encoding of the file. If you do not know the encoding, running 'file filename.csv' may tell you. It's often worth trying: --encoding=latin-1 ``` I have no issues reading such file using this Python code: ```python content = open('csv', encoding='utf-16-le').read()) ``` `in2csv` works too.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/438/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1221849746,I_kwDOBm6k_c5I0_KS,1732,Custom page variables aren't decoded,52649,open,0,,,2,2022-04-30T14:55:46Z,2022-05-03T01:50:45Z,,NONE,,"I have a page `templates/filer/{filer_id}.html`. It uses `filer_id` in a `sql()` call to fetch data. With 0.61.1 this no longer works because the spaces in IDs isn't preserved. Instead, the escaped version is passed into the template and the id isn't present in my db. Datasette should unescape the url component before passing them into the template.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1732/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1129052172,I_kwDOBm6k_c5DS_gM,1633,base_url or prefix does not work with _exact match,6613091,open,0,,,2,2022-02-09T21:45:07Z,2022-04-28T09:12:56Z,,NONE,,"When i hit ""Apply"" button to search with ""_exact"" for a column syntax the URL prefix is removed from the url. 
![image](https://user-images.githubusercontent.com/6613091/153293758-0b757d55-5757-4987-992e-9426e69a7956.png) And the result is: ![image](https://user-images.githubusercontent.com/6613091/153294672-87be7809-bb7b-455d-bf1a-41e90bbfa4ae.png) If I add the marked row to url_builder.py it seams to work: ![image](https://user-images.githubusercontent.com/6613091/153295231-bdd52e37-efcf-4b21-9d37-69f182a922f4.png) ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1633/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1171599874,I_kwDOCGYnMM5F1TIC,415,Convert with `--multi` and `--dry-run` flag does not work,3976183,closed,0,,,2,2022-03-16T21:59:46Z,2022-03-21T04:18:24Z,2022-03-21T04:18:24Z,NONE,,"It's not possible to combine `--multi` and `--dry-run` flag in the `convert` command. Let's first create a simple database from JSON string ```console $ echo '[{""foo"": ""abc""}]' | sqlite-utils insert demo.db demo - $ sqlite-utils query demo.db ""SELECT * FROM demo"" [{""foo"": ""abc""}] ``` and then try to convert the ""foo"" column with a static value ""bar"" (see docs [Converting a column into multiple columns](https://sqlite-utils.datasette.io/en/stable/cli.html#converting-a-column-into-multiple-columns)) ```console $ sqlite-utils convert demo.db demo foo '{""foo"": ""bar""}' --multi --dry-run Traceback (most recent call last): File ""/home/dotcs/anaconda3/envs/tools/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/home/dotcs/anaconda3/envs/tools/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 2686, in convert for row in db.conn.execute(sql, where_args).fetchall(): sqlite3.OperationalError: user-defined function raised exception ``` But without the `--dry-run` flag it does work as expected: ```console $ sqlite-utils convert demo.db demo foo '{""foo"": ""bar""}' --multi $ sqlite-utils query demo.db ""SELECT * FROM demo"" [{""foo"": ""bar""}] ``` ```console $ sqlite-utils --version sqlite-utils, version 3.25.1 ```",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/415/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1088816961,I_kwDODEm0Qs5A5gdB,62,KeyError: 'created_at' for private accounts?,6764957,closed,0,,,2,2021-12-26T17:51:51Z,2022-03-12T02:36:32Z,2022-02-24T18:10:18Z,NONE,,"hey Simon! i was running `twitter-to-sqlite user-timeline twitter.db` for [my private alt](https://twitter.com/swyxio) and ran into this error:
![image](https://user-images.githubusercontent.com/6764957/147416165-46b69c30-100a-406f-8534-8612b75547ae.png) ```bash Traceback (most recent call last): File ""/Users/swyx/Work/datasette/env/bin/twitter-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 291, in user_timeline profile = utils.get_profile(db, session, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 133, in get_profile save_users(db, [profile]) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 453, in save_users transform_user(user) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 285, in transform_user user[""created_at""] = parser.parse(user[""created_at""]) KeyError: 'created_at' ```
this looks awfully like #37 but it can't be, because i'm authed into my account and obviously i have perms to read my own account. wonder if there's any diagnostic methods i should apply here? just filing an issue for others to find while i investigate.",206156866,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1148725876,I_kwDOBm6k_c5EeCp0,1640,"Support static assets where file length may change, e.g. logs",57859326,open,0,,,2,2022-02-24T00:34:42Z,2022-03-05T01:19:25Z,,NONE,,"This is a bit of an oxymoron. I am serving a log.txt file for a background process using the Datasette --static CLI. This is useful as I can observe a background process from the web UI to see any errors that occur (instead of spelunking the logs via docker exec/ssh etc). I get this error, which I think is because Datasette assumes that the size of the content does not change (but appending new log lines means the content length changes). ```python Traceback (most recent call last): File ""/usr/local/lib/python3.9/site-packages/datasette/app.py"", line 1181, in route_path response = await view(request, send) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 305, in inner_static await asgi_send_file(send, full_path, chunk_size=chunk_size) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 280, in asgi_send_file await send( File ""/usr/local/lib/python3.9/site-packages/asgi_csrf.py"", line 104, in wrapped_send await send(event) File ""/usr/local/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py"", line 460, in send output = self.conn.send(event) File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 468, in send data_list = self.send_with_data_passthrough(event) File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 501, in send_with_data_passthrough writer(event, data_list.append) File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 58, in __call__ self.send_data(event.data, write) File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 78, in send_data raise LocalProtocolError(""Too much data for declared Content-Length"") h11._util.LocalProtocolError: Too much data for declared Content-Length ERROR: Exception in ASGI application Traceback (most recent call last): File ""/usr/local/lib/python3.9/site-packages/datasette/app.py"", line 1181, in route_path response = await view(request, send) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 305, in inner_static await asgi_send_file(send, full_path, chunk_size=chunk_size) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/asgi.py"", line 280, in asgi_send_file await send( File ""/usr/local/lib/python3.9/site-packages/asgi_csrf.py"", line 104, in wrapped_send await send(event) File ""/usr/local/lib/python3.9/site-packages/uvicorn/protocols/http/h11_impl.py"", line 460, in send output = self.conn.send(event) File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 468, in send data_list = self.send_with_data_passthrough(event) File ""/usr/local/lib/python3.9/site-packages/h11/_connection.py"", line 501, in send_with_data_passthrough writer(event, data_list.append) File ""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 58, in __call__ self.send_data(event.data, write) File 
""/usr/local/lib/python3.9/site-packages/h11/_writers.py"", line 78, in send_data raise LocalProtocolError(""Too much data for declared Content-Length"") h11._util.LocalProtocolError: Too much data for declared Content-Length ``` Thanks, I am finding Datasette very useful.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1640/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1117132741,I_kwDOBm6k_c5ClhfF,1615,Potential simplified publishing mechanism,369053,closed,0,,,2,2022-01-28T08:34:50Z,2022-02-02T07:34:21Z,2022-02-02T07:34:17Z,NONE,,"Hi, Forewarning: this idea is one I've only been thinking about for a while and it's not fully fleshed-out yet. I love Datasette and what it stands for. I was thinking about how we could make it accessible to more people, especially those without access to credit cards required for a lot of hosting options. Or they might not feel comfortable signing up for said services. So I was thinking I might create a service that hosts Datasette instances for folks. I'd probably stick it on AWS Lambda and limit requests to something like n/month to avoid bankrupting myself. If I did build such a hypothetical service, I was thinking I would rely on GitHub Actions to do the heavy lifting. E.g. user `johndoe` creates a repo `my-animals` with a couple of files: `dogs.csv`, `cats.csv` and the following GitHub Actions workflow: ```yaml # .github/workflows/push.yml on: push # this allows the publish action to use OIDC to authenticate johndoe/my-animals permissions: id-token: write contents: read jobs: publish: runs-on: ubuntu-latest steps: - uses: actions/setup-python@v2 - run: pip install sqlite-utils - uses: actions/checkout@v2 - run: | set -eux sqlite-utils create-database animals.db sqlite-utils insert animals.db dogs dogs.csv --csv sqlite-utils insert animals.db cats cats.csv --csv - uses: datasette-hub/publish@v1 with: db: animals.db metadata: meta.yml # this step is helpful for debugging why the # generated sqlite db was rejected - uses: actions/upload-artifact@v2 if: failure() with: path: animals.db retention-days: 1 ``` This would then cause a Datasette instance to be available at `https://johndoe-my-animals.datasette-hub.test/`. It feels like this could significantly reduce the friction to someone being able to go from data set to Datasette. What do you think? Does this address a real need? Or am I perhaps misunderstanding the main friction points? As a bonus: it feels like this would pair well with [git scraping](https://simonwillison.net/2020/Oct/9/git-scraping/).",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1615/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1026794056,I_kwDOCGYnMM49M6JI,331,Mypy error: found module but no type hints or library stubs,53032010,closed,0,,,2,2021-10-14T20:29:50Z,2021-11-14T23:21:08Z,2021-11-14T23:21:08Z,NONE,,"``` Python 3.9.5 mypy 0.910 sqlite-utils 3.17.1 ``` While using sqlite-utils as a library, when I use mypy for static type checking, it throws an error: ``` mypy . 
src/etl.py:5: error: Skipping analyzing ""sqlite_utils"": found module but no type hints or library stubs import sqlite_utils ^ src/etl.py:5: note: See https://mypy.readthedocs.io/en/stable/running_mypy.html#missing-imports test/test_etl.py:4: error: Skipping analyzing ""sqlite_utils"": found module but no type hints or library stubs import sqlite_utils ^ Found 2 errors in 2 files (checked 7 source files) ``` When I add a `py.typed` file to the sqlite-utils package to mark it as PEP 561 compatible, the error goes away. ``` al@nbal ..b/python3.9/site-packages/sqlite_utils (git)-[main] % la total 200 drwx------ 3 al al 4096 Oct 14 22:00 . drwx------ 117 al al 4096 Oct 12 21:12 .. -rw------- 1 al al 64409 Oct 12 21:11 cli.py -rw------- 1 al al 109092 Oct 12 21:11 db.py -rw------- 1 al al 0 Oct 14 22:00 py.typed -rw------- 1 al al 684 Oct 12 21:11 recipes.py -rw------- 1 al al 7988 Oct 12 21:11 utils.py -rw------- 1 al al 113 Oct 12 21:11 __init__.py ``` I would like to suggest adding a `py.typed` file to the repository. See also the mypy docs on creating PEP 561 compatible packages: https://mypy.readthedocs.io/en/stable/installed_packages.html#creating-pep-561-compatible-packages ",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/331/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 951581763,MDU6SXNzdWU5NTE1ODE3NjM=,298,Read lines with JSON object,2172260,closed,0,,,2,2021-07-23T13:28:52Z,2021-08-03T06:50:47Z,2021-08-02T21:55:16Z,NONE,,"I found this posted on HN a while ago and love it -- thank you! As a minor improvement, it would be great to have the ability to parse a file with line-separated JSON objects. Currently the parser obviously requires an array wrapping all these objects.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/298/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919508498,MDU6SXNzdWU5MTk1MDg0OTg=,1375,JSON export dumps JSON fields as TEXT,4068,closed,0,,,2,2021-06-12T09:45:08Z,2021-06-14T09:41:59Z,2021-06-13T15:37:58Z,NONE,,"Hi! When a user tries to export data as JSON, I would expect to see the value of JSON columns represented as JSON instead of being rendered as a string. What do you think?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1375/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 892457208,MDU6SXNzdWU4OTI0NTcyMDg=,1327,Support Unicode characters in metadata.json,20846286,closed,0,,,2,2021-05-15T14:33:58Z,2021-05-24T19:10:21Z,2021-05-24T19:10:21Z,NONE,,"Hello , when I used Burmese (Unicode) characters in metadata.json like below - ![image](https://user-images.githubusercontent.com/20846286/118364978-cba70100-b5c0-11eb-967c-7dc3b62478f2.png) It gave wrong results when I run datasette - ![image](https://user-images.githubusercontent.com/20846286/118365025-fc873600-b5c0-11eb-97ce-19541b8cc6d8.png) It would be great & helpful for us if metadata.json can support in Unicode supported Asian Languages. Thanks & Regards. 
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1327/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 847423559,MDU6SXNzdWU4NDc0MjM1NTk=,253,fixtures.db example error in sql-utils blog post,192568,closed,0,,,2,2021-03-31T22:07:36Z,2021-05-19T03:31:48Z,2021-05-19T03:31:47Z,NONE,,"En route to trying to understand column order transform documentation, I tried the instructions here: https://simonwillison.net/2020/Sep/23/sqlite-advanced-alter-table/ I get a malformed database schema syntax error. ``` $ wget https://latest.datasette.io/fixtures.db --2021-03-31 18:00:23-- https://latest.datasette.io/fixtures.db Resolving latest.datasette.io (latest.datasette.io)... 2607:f8b0:4004:801::2013, 142.250.73.211 Connecting to latest.datasette.io (latest.datasette.io)|2607:f8b0:4004:801::2013|:443... connected. HTTP request sent, awaiting response... 200 OK Length: unspecified [application/octet-stream] Saving to: ‘fixtures.db’ fixtures.db [ <=> ] 260.00K --.-KB/s in 0.1s 2021-03-31 18:00:23 (2.41 MB/s) - ‘fixtures.db’ saved [266240] $ sqlite3 fixtures.db '.schema facetable' Error: malformed database schema (generated_columns) - near ""AS"": syntax error $ sqlite3 fixtures.db SQLite version 3.28.0 2019-04-15 14:49:49 Enter "".help"" for usage hints. sqlite> .schema Error: malformed database schema (generated_columns) - near ""AS"": syntax error ``` ",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/253/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 861622839,MDU6SXNzdWU4NjE2MjI4Mzk=,256,inserting with --nl errors with: sqlite3.OperationalError: table has no column named ,279769,closed,0,,,2,2021-04-19T18:01:03Z,2021-05-19T03:26:54Z,2021-05-19T03:26:54Z,NONE,,"I have a `jsonl` file, it is 10,000 lines long. Inserting from the cli with `sqlite-utils insert db table file --nl --batch-size 10000` fails with this missing column error, even though I'm telling it to use the whole file in the first batch. This seems similar to #18 and #139, but maybe it's unique to `--nl`?",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/256/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 870125126,MDU6SXNzdWU4NzAxMjUxMjY=,1310,I'm creating a plugin to export a spreadsheet file (.ods or .xlsx),3747136,closed,0,,,2,2021-04-28T16:20:11Z,2021-04-30T07:26:11Z,2021-04-30T06:58:46Z,NONE,,"Hi, I have started developing a plugin to export records as a spreadsheet file. It could be ods or xlsx, whatever is easier. 
I have spotted the following packages: - ods files: https://pypi.org/project/odswriter/ - xlsx files: https://openpyxl.readthedocs.io/en/stable/index.html (quite powerful) or https://xlsxwriter.readthedocs.io/ (faster) This is the code I have so far, I test it with the `--plugins-dir` option: ```python from datasette import hookimpl from datasette.utils.asgi import Response import odswriter as ods def render_spreadsheet(rows): with ods.writer(open(""test.ods"",""wb"")) as odsfile: for row in rows: odsfile.writerow([""String"", ""ABCDEF123456"", ""123456""]) return Response(odsfile, content_type=""application/vnd.oasis.opendocument.spreadsheet"", status=200) @hookimpl def register_output_renderer(): return {""extension"": ""ods"", ""render"": render_spreadsheet} ``` I get the following error: ``` Traceback (most recent call last): File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1128, in route_path await response.asgi_send(send) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 339, in asgi_send body = body.encode(""utf-8"") AttributeError: 'ODSWriter' object has no attribute 'encode' ERROR: Exception in ASGI application Traceback (most recent call last): File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1128, in route_path await response.asgi_send(send) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 339, in asgi_send body = body.encode(""utf-8"") AttributeError: 'ODSWriter' object has no attribute 'encode' During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/home/colin/.local/lib/python3.8/site-packages/uvicorn/protocols/http/h11_impl.py"", line 396, in run_asgi result = await app(self.scope, self.receive, self.send) File ""/home/colin/.local/lib/python3.8/site-packages/uvicorn/middleware/proxy_headers.py"", line 45, in __call__ return await self.app(scope, receive, send) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 161, in __call__ await self.app(scope, receive, send) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/tracer.py"", line 75, in __call__ await self.app(scope, receive, send) File ""/home/colin/.local/lib/python3.8/site-packages/asgi_csrf.py"", line 107, in app_wrapped_with_csrf await app(scope, receive, wrapped_send) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1086, in __call__ return await self.route_path(scope, receive, send, path) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1133, in route_path return await self.handle_500(request, send, exception) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/app.py"", line 1267, in handle_500 await asgi_send_html( File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 217, in asgi_send_html await asgi_send( File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 237, in asgi_send await asgi_start(send, status, headers, content_type) File ""/home/colin/.local/lib/python3.8/site-packages/datasette/utils/asgi.py"", line 246, in asgi_start await send( File ""/home/colin/.local/lib/python3.8/site-packages/asgi_csrf.py"", line 103, in wrapped_send await send(event) File ""/home/colin/.local/lib/python3.8/site-packages/uvicorn/protocols/http/h11_impl.py"", line 482, in send raise RuntimeError(msg % message_type) RuntimeError: Expected ASGI message 
'http.response.body', but got 'http.response.start'. ``` I tried with `AsgiFileDownload` like in [DatabaseDownload](https://github.com/simonw/datasette/blob/main/datasette/views/database.py#L150) to deal with the binary nature of the ods file, but the renderer expects a Response: > should be dict or Response However, the `Response` class only supports the following methods, not binary: - html - text - json - redirect How would you suggest me to proceed to have my ods file downloaded? ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1310/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 281110295,MDU6SXNzdWUyODExMTAyOTU=,173,I18n and L10n support,50138,open,0,,,2,2017-12-11T17:49:58Z,2021-04-26T12:10:01Z,,NONE,,It would be less geeky and more user friendly if the display strings in the filter menu and possibly other parts could be localized.,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/173/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 791237799,MDU6SXNzdWU3OTEyMzc3OTk=,1196,Access Denied Error in Windows,2826376,open,0,,,2,2021-01-21T15:40:40Z,2021-04-14T19:28:38Z,,NONE,,"I am trying to publish a db to vercel. But while issuing the below command throwing `Access Denied` error which is leading to `RecursionError: maximum recursion depth exceeded while calling a Python object`. I am using PyCharm and Python 3.9. I have reinstalled both and launched PyCharm as Admin in Windows 10. But still the issue persists. Issued command `datasette publish vercel jmeter.db --project jmeter --install datasette-vega` PS: localhost is working fine.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1196/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 826700095,MDU6SXNzdWU4MjY3MDAwOTU=,1255,Facets timing out but work when filtering,1219001,open,0,,,2,2021-03-09T22:01:39Z,2021-04-02T20:50:08Z,,NONE,,"System info: Windows 10 Datasette 0.55 installed via pip Python 3.8.5 in a conda environment I'm getting the message `These facets timed out` on any faceting operation. However, when I apply a filter, the facets appear in the filtered view. The error returns when the filter is removed. My data only has 38,450 rows.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1255/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 831163537,MDExOlB1bGxSZXF1ZXN0NTkyNTQ4MTAz,1260,Fix: code quality issues,25361949,closed,0,,,2,2021-03-14T13:56:10Z,2021-03-29T00:22:41Z,2021-03-29T00:22:41Z,NONE,simonw/datasette/pulls/1260,"### Description Hi :wave: I work at [DeepSource](https://deepsource.io), I ran DeepSource analysis on the forked copy of this repo and found some interesting [code quality issues](https://deepsource.io/gh/withshubh/datasette/issues/?category=recommended) in the codebase, opening this PR so you can assess if our platform is right and helpful for you. 
### Summary of changes - Replaced ternary syntax with if expression - Removed redundant `None` default - Used `is` to compare type of objects - Iterated dictionary directly - Removed unnecessary lambda expression - Refactored unnecessary `else` / `elif` when `if` block has a `return` statement - Refactored unnecessary `else` / `elif` when `if` block has a `raise` statement - Added .deepsource.toml to continuously analyze and detect code quality issues",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1260/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 836963850,MDU6SXNzdWU4MzY5NjM4NTA=,249,Full text search possibly broken?,36287,closed,0,,,2,2021-03-21T02:03:44Z,2021-03-21T02:43:32Z,2021-03-21T02:43:32Z,NONE,,"I'm not quite sure if this is an issue with sqlite-utils or datasette. **Background** I was previously using sqlite-utils version < 3.6. I have a bunch of csv files that have some data scraped from a website. ``` sqlite-utils create-table mydb.db post \ posted_date text \ url text \ title text \ raw_text text \ --not-null posted_date \ --not-null url \ --pk=url ``` FTS is enabled via `sqlite-utils enable-fts ./mydb.db post title raw_text` Data is loaded to the table via `sqlite-utils insert ./mydb.db post ${filename} --csv` Note that the data contains text in my language Tamil. Loading happens just fine. datasette serves the db file just fine. It recognizes FTS and shows the ""search"" box. However, none of the queries work. Whatever text I supply, it always returns 0 rows. I literally copy paste words from the row listing on the screen and paste it on the search box. Interestingly, only thing I can remember is switching to sqlite-utils 3.6. I had to do this because the prior version had an issue with column size. I have attached one of the csv files that can be loaded to the table. Substitute ""${filename}"" with that file for the sqlite-utils insert command. [posts_20200417-20201231.csv.zip](https://github.com/simonw/sqlite-utils/files/6176697/posts_20200417-20201231.csv.zip) Interestingly, the FTS based search from datasette worked just fine before this version upgrade. That is, the queries returned results. I will try to downgrade just to see if the theory is correct. I appreciate any help here. Thanks. ",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/249/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 617323873,MDU6SXNzdWU2MTczMjM4NzM=,766,Enable wildcard-searches by default,2181410,open,0,,,2,2020-05-13T10:14:48Z,2021-03-05T16:35:21Z,,NONE,,"Hi Simon. It seems that datasette currently has wildcard-searches disabled by default (along with the boolean search-options, NEAR-queries and more, and despite the docs). If I try out the search-url provided in the [docs](https://datasette.readthedocs.io/en/stable/full_text_search.html#the-table-page-and-table-view-api) (https://fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort), it does not handle wildcard-searches, and I'm unable to make it work on my datasette-instance. I would argue that wildcard-searches is such a standard query, that it should be enabled by default. Requiring ""_searchmode=raw"" when using prefix-searches seems unnecessary. Plus: What happens to non-ascii searches when using ""_searchmode=raw""? 
Is the ""escape_fts""-function from datasette.utils ignored? Thanks! /Claus",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/766/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 797159961,MDExOlB1bGxSZXF1ZXN0NTY0MjE1MDEx,225,fix for problem in Table.insert_all on search for columns per chunk of rows,261237,closed,0,,,2,2021-01-29T20:16:07Z,2021-02-14T21:04:13Z,2021-02-14T21:04:13Z,NONE,simonw/sqlite-utils/pulls/225,"Hi, I ran into a problem when trying to create a database from my Apple Healthkit data using [healthkit-to-sqlite](https://github.com/dogsheep/healthkit-to-sqlite). The program crashed because of an invalid insert statement that was generated for table `rDistanceCycling`. The actual problem turned out to be in [sqlite-utils](https://github.com/simonw/sqlite-utils). `Table.insert_all` processes the data to be inserted in chunks of rows and checks for every chunk which columns are used, and it will collect all column names in the variable `all_columns`. The collection of columns is done using a nested list comprehension that is not completely correct. I'm using a Windows machine and had to make a few adjustments to the tests in order to be able to run them because they had a posix dependency. Thanks, kind regards, Frans ``` # this is a (condensed) chunk of data from my Apple healthkit export that caused the problem. # the 3 last items in the chunk have additional keys: metadata_HKMetadataKeySyncVersion and metadata_HKMetadataKeySyncIdentifier chunk = [{'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '7.0.1', 'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:7.0.1>', 'unit': 'km', 'creationDate': '2020-10-10 12:29:09 +0100', 'startDate': '2020-10-10 12:29:06 +0100', 'endDate': '2020-10-10 12:29:07 +0100', 'value': '0.00518016'}, {'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '7.0.1', 'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:7.0.1>', 'unit': 'km', 'creationDate': '2020-10-10 12:29:10 +0100', 'startDate': '2020-10-10 12:29:07 +0100', 'endDate': '2020-10-10 12:29:08 +0100', 'value': '0.00544049'}, {'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '6.2.6', 'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:6.2.6>', 'unit': 'km', 'creationDate': '2020-10-14 05:54:12 +0100', 'startDate': '2020-07-15 16:40:50 +0100', 'endDate': '2020-07-15 16:42:49 +0100', 'value': '0.952092', 'metadata_HKMetadataKeySyncVersion': '1', 'metadata_HKMetadataKeySyncIdentifier': '3:674DBCDB-3FE8-40D1-9FC1-E54A2B413805:616520450.99823:616520569.99360:119'}, {'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '6.2.6', 'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:6.2.6>', 'unit': 'km', 'creationDate': '2020-10-14 05:54:12 +0100', 'startDate': '2020-07-15 16:42:49 +0100', 'endDate': '2020-07-15 16:44:51 +0100', 'value': '0.848983', 'metadata_HKMetadataKeySyncVersion': '1', 'metadata_HKMetadataKeySyncIdentifier': '3:674DBCDB-3FE8-40D1-9FC1-E54A2B413805:616520569.99360:616520691.98826:119'}, {'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '6.2.6', 'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:6.2.6>', 'unit': 'km', 'creationDate': '2020-10-14 05:54:12 +0100', 'startDate': 
'2020-07-15 16:44:51 +0100', 'endDate': '2020-07-15 16:46:50 +0100', 'value': '0.834403', 'metadata_HKMetadataKeySyncVersion': '1', 'metadata_HKMetadataKeySyncIdentifier': '3:674DBCDB-3FE8-40D1-9FC1-E54A2B413805:616520691.98826:616520810.98305:119'}] def all_columns_old(): all_columns = [col for col in chunk[0]] all_columns += [column for record in chunk for column in record if column not in all_columns] return all_columns def all_columns_new(): all_columns = [col for col in chunk[0]] for record in chunk: all_columns += [column for column in record if column not in all_columns] return all_columns if __name__ == '__main__': from pprint import pprint print('problem: ') pprint(all_columns_old()) print('\nfix: ') pprint(all_columns_new()) ``` ",140912432,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/225/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 792297010,MDExOlB1bGxSZXF1ZXN0NTYwMjA0MzA2,224,Add fts offset docs.,37962604,closed,0,,,2,2021-01-22T20:50:58Z,2021-02-14T19:31:06Z,2021-02-14T19:31:06Z,NONE,simonw/sqlite-utils/pulls/224,"The limit can be passed as a string to the query builder to have an offset. I have tested it using the shorthand `limit=f""15, 30""`, the standard syntax should work too.",140912432,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/224/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 792851444,MDU6SXNzdWU3OTI4NTE0NDQ=,11,XML parse error,3613583,closed,0,,,2,2021-01-24T17:38:54Z,2021-02-11T21:18:58Z,2021-02-11T21:18:48Z,NONE,,"I am on Windows 10 using Windows Subsystem for Linux, Python 3.8. I installed evernote-to-sqlite via pipx (in a venv). I tried using enex files from the latest version of Evernote for Windows (10.6.9 which only lets you export 50 notes at a time) and from Legacy Evernote (6.25.2.9198 which lets you export all your notes at once). The enex file from latest evernote gives this error: File ""/usr/lib/python3.8/xml/etree/ElementTree.py"", line 1320, in XML parser.feed(text) xml.etree.ElementTree.ParseError: XML or text declaration not at start of entity: line 2, column 6 The enex file from Legacy Evernote gives this error: File ""/home/david/.local/pipx/venvs/evernote-to-sqlite/lib/python3.8/site-packages/evernote_to_sqlite/utils.py"", line 28, in save_note updated = note.find(""updated"").text AttributeError: 'NoneType' object has no attribute 'text'",303218369,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 801780625,MDU6SXNzdWU4MDE3ODA2MjU=,9,SSL Error,12669260,open,0,,,2,2021-02-05T02:12:56Z,2021-02-07T18:45:04Z,,NONE,,"Here's the error I get when running `pip install pocket-to-sqlite`: ``` Could not fetch URL https://pypi.python.org/simple/pocket-to-sqlite/: There was a problem confirming the ssl certificate: [SSL: TLSV1_ALERT_PROTOCOL_VERSION] tlsv1 alert protocol version (_ssl.c:661) - skipping Could not find a version that satisfies the requirement pocket-to-sqlite (from versions: ) No matching distribution found for pocket-to-sqlite ``` Does this require python 3? 
",213286752,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 796234313,MDU6SXNzdWU3OTYyMzQzMTM=,1210,Immutable Database w/ Canned Queries,525780,closed,0,,,2,2021-01-28T18:08:29Z,2021-02-05T11:30:34Z,2021-02-05T11:30:34Z,NONE,,"I have a database that I only want to read from; when instructing datasette to treat the database as immutable my defined canned queries disappear. Are these two features incompatible or have I hit an unintended bug? Thanks for datasette in any way, it's a joy to use!",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1210/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 743400216,MDU6SXNzdWU3NDM0MDAyMTY=,11,Error thrown: sqlite3.OperationalError: table users has no column named lastName,61791,closed,0,,,2,2020-11-16T01:21:18Z,2021-01-18T04:35:22Z,2021-01-18T04:35:22Z,NONE,,"Just installed `swarm-to-sqlite-0.3.2` and tried according to the docs: ``` Traceback (most recent call last): File ""/usr/local/bin/swarm-to-sqlite"", line 8, in sys.exit(cli()) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/swarm_to_sqlite/cli.py"", line 73, in cli save_checkin(checkin, db) File ""/usr/local/lib/python3.9/site-packages/swarm_to_sqlite/utils.py"", line 82, in save_checkin checkins_table.m2m(""users"", user, m2m_table=""likes"", pk=""id"") File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1914, in m2m id = other_table.insert(record, pk=pk, replace=True).last_pk File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1647, in insert return self.insert_all( File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1765, in insert_all self.insert_chunk( File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1575, in insert_chunk result = self.db.execute(query, params) File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 200, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: table users has no column named lastName ```",205429375,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 769376447,MDU6SXNzdWU3NjkzNzY0NDc=,2,killed by oomkiller on large location-history,231498,open,0,,,2,2020-12-17T00:32:24Z,2020-12-17T00:48:32Z,,NONE,,"memory seems to grow unbounded and is oom-killed after about 20GB memory usage. 
",206649770,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
767561886,MDU6SXNzdWU3Njc1NjE4ODY=,1148,Syntax error with + symbol when deployed to Vercel,765871,closed,0,,,2,2020-12-15T12:53:12Z,2020-12-16T21:57:42Z,2020-12-16T21:51:54Z,NONE,,"Works locally:

```
(ins)[hendry@t14s aws-partners]$ sqlite3 partners.db < test.sql
5|{""href"":""https://partners.amazonaws.com/partners/001E000000NaBI0IAN"",""label"":""Slalom""}
6|{""href"":""https://partners.amazonaws.com/partners/001E000000VHBQIIA5"",""label"":""Accenture""}
7|{""href"":""https://partners.amazonaws.com/partners/001E000000Rp588IAB"",""label"":""Druva""}
8|{""href"":""https://partners.amazonaws.com/partners/001E0000013FeQXIA0"",""label"":""Palo Alto Networks""}
9|{""href"":""https://partners.amazonaws.com/partners/001E000000Rl12lIAB"",""label"":""New Relic""}
10|{""href"":""https://partners.amazonaws.com/partners/001E000000NaBHWIA3"",""label"":""Deloitte""}
11|{""href"":""https://partners.amazonaws.com/partners/001E000000Rp5GDIAZ"",""label"":""MegazoneCloud""}
12|{""href"":""https://partners.amazonaws.com/partners/001E000000NaBHMIA3"",""label"":""iret, Inc.""}
(ins)[hendry@t14s aws-partners]$ cat test.sql
select A.launch_rank, A.partner_info from summary A
INNER JOIN summary B ON A.launch_rank>=B.launch_rank-3 AND A.launch_rank<=B.launch_rank+4
WHERE B.""partner_info"" LIKE '%Palo Alto%'
```

Copy the SQL into https://aws-partners-singapore.vercel.app/partners and get a syntax error:

![image](https://user-images.githubusercontent.com/765871/102217393-78e4f280-3f17-11eb-9e13-ca79ed62ec34.png)
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1148/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
760960559,MDU6SXNzdWU3NjA5NjA1NTk=,205,"sqlite3.OperationalError: near ""("": syntax error",765871,closed,0,,,2,2020-12-10T06:44:40Z,2020-12-10T19:18:22Z,2020-12-10T07:24:23Z,NONE,,"The sqlite version is 3.22.0 2018-01-22 18:45:57 0c55d179733b46d8d0ba4d88e01a25e10677046ee3da1d5b1581e86726f2alt1
sqlite-utils, version 3.0

It fails here: https://github.com/kaihendry/aws-partners-datasette/runs/1528432635?check_suite_focus=true

I'm not sure where the problem is, since it works _fine locally_ on an Archlinux system running 3.34.0 2020-12-01 16:14:00 a26b6597e3ae272231b96f9982c3bcc17ddec2f2b6eb4df06a224b91089fed5b

https://github.com/kaihendry/aws-partners-datasette/blob/main/create-summary-view.sh

Maybe I need to bump up from ubuntu-latest to ?
",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/205/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
747702144,MDU6SXNzdWU3NDc3MDIxNDQ=,1100,Error on OPTIONS request to database,1319404,closed,0,,,2,2020-11-20T18:16:43Z,2020-12-03T00:57:35Z,2020-12-03T00:50:17Z,NONE,,"When I perform an OPTIONS request against a database or table, datasette fails with an internal error. All these tests result in the traceback below.
```
curl -XOPTIONS http://127.0.0.1:8001/test-db
curl -XOPTIONS http://127.0.0.1:8001/test-db/table1
curl -XOPTIONS http://127.0.0.1:8001/test-db/table1\?_search\=test
```

```
Traceback (most recent call last):
  File ""[path-to-python]/site-packages/datasette/app.py"", line 1033, in route_path
    response = await view(request, send)
  File ""[path-to-python]/site-packages/datasette/views/base.py"", line 146, in view
    request, **request.scope[""url_route""][""kwargs""]
  File ""[path-to-python]/site-packages/datasette/views/base.py"", line 118, in dispatch_request
    return await handler(request, *args, **kwargs)
TypeError: object Response can't be used in 'await' expression
```

Making the `options` function in the `DataView` class async fixed it for me.

```python
async def options(self, request, *args, **kwargs):
    r = Response.text(""ok"")
    if self.ds.cors:
        r.headers[""Access-Control-Allow-Origin""] = ""*""
    return r
```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1100/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
707407567,MDU6SXNzdWU3MDc0MDc1Njc=,171,Idea: transitive closure tables for tree structures,649467,closed,0,,,2,2020-09-23T14:17:33Z,2020-10-22T04:38:35Z,2020-10-22T04:07:14Z,NONE,,"I just read that sqlite has a transitive closure table extension using a virtual table in order to represent trees:

https://charlesleifer.com/blog/querying-tree-structures-in-sqlite-using-python-and-the-transitive-closure-extension/

Even without this extension, though, a util to build a transitive closure table would allow trees to be queried easily. Since it relies on self-referential foreign keys, the relationships might even be able to be automatically detected.
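
For illustration, here is a rough sketch of what such a util could do with a plain recursive CTE (table and column names are made up, and this is stdlib `sqlite3`, not an existing sqlite-utils helper):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.executescript('''
    CREATE TABLE nodes (id INTEGER PRIMARY KEY, parent_id INTEGER REFERENCES nodes(id));
    INSERT INTO nodes VALUES (1, NULL), (2, 1), (3, 1), (4, 2);
    CREATE TABLE nodes_closure (ancestor INTEGER, descendant INTEGER, depth INTEGER);
''')

# Expand the parent/child edges into every ancestor/descendant pair
rows = conn.execute('''
    WITH RECURSIVE closure(ancestor, descendant, depth) AS (
        SELECT id, id, 0 FROM nodes
        UNION ALL
        SELECT closure.ancestor, nodes.id, closure.depth + 1
        FROM closure JOIN nodes ON nodes.parent_id = closure.descendant
    )
    SELECT ancestor, descendant, depth FROM closure
''').fetchall()
conn.executemany('INSERT INTO nodes_closure VALUES (?, ?, ?)', rows)

# All descendants of node 1, at any depth, with one flat query
print(conn.execute(
    'SELECT descendant FROM nodes_closure WHERE ancestor = 1 AND depth > 0 ORDER BY descendant'
).fetchall())  # -> [(2,), (3,), (4,)]
```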
",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/171/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 648245071,MDU6SXNzdWU2NDgyNDUwNzE=,8,Error thrown: table photos has no column named hasSticker,18504,closed,0,,,2,2020-06-30T14:54:37Z,2020-10-12T20:35:06Z,2020-10-12T20:25:24Z,NONE,,"While running `swarm-to-sqlite` it throws an error: harper@:~/dogsheep/swarm$ swarm-to-sqlite checkins.db --save=checkins.json Please provide your Foursquare OAuth token: Importing 8127 checkins [#################-------------------] 49% 00:01:52 Traceback (most recent call last): File ""/home/harper/.local/bin/swarm-to-sqlite"", line 11, in sys.exit(cli()) File ""/home/harper/.local/lib/python3.6/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/home/harper/.local/lib/python3.6/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/home/harper/.local/lib/python3.6/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/harper/.local/lib/python3.6/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/home/harper/.local/lib/python3.6/site-packages/swarm_to_sqlite/cli.py"", line 73, in cli save_checkin(checkin, db) File ""/home/harper/.local/lib/python3.6/site-packages/swarm_to_sqlite/utils.py"", line 94, in save_checkin photos_table.insert(photo, replace=True) File ""/home/harper/.local/lib/python3.6/site-packages/sqlite_utils/db.py"", line 963, in insert alter = self.value_or_default(""alter"", alter) File ""/home/harper/.local/lib/python3.6/site-packages/sqlite_utils/db.py"", line 1142, in insert_all def upsert_all( sqlite3.OperationalError: table photos has no column named hasSticker Where should i dig in?",205429375,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 712889459,MDExOlB1bGxSZXF1ZXN0NDk2Mjk4MTgw,986,"Allow facet by primary keys, fixes #985",39452697,closed,0,,,2,2020-10-01T14:18:55Z,2020-10-01T16:51:45Z,2020-10-01T16:51:45Z,NONE,simonw/datasette/pulls/986,"Hello! This PR makes it possible to facet by primary keys. Did I get it right that just removing the condition on UI side is enough? From testing it works fine with primary keys, just as with normal keys. If so, should I also remove unused `data-is-pk`?",107914493,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/986/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 605806386,MDU6SXNzdWU2MDU4MDYzODY=,735,"Error when I click on ""View and edit SQL""",30607,closed,0,,,2,2020-04-23T19:31:32Z,2020-04-28T06:10:20Z,2020-04-27T19:00:30Z,NONE,,"Hi, when I do it [here](https://my-database.now.sh/commissioniComunePalermo/youtube), I have ""unrecognized token: ""["""" error. Is it normal? 
Thank you",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/735/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 539204432,MDU6SXNzdWU1MzkyMDQ0MzI=,70,Implement ON DELETE and ON UPDATE actions for foreign keys,26292069,open,0,,,2,2019-12-17T17:19:10Z,2020-02-27T04:18:53Z,,NONE,,"Hi! I did not find any mention on the library about ON DELETE and ON UPDATE actions for foreign keys. Are those expected to be implemented? If not, it would be a nice thing to include!",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/70/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 562787785,MDU6SXNzdWU1NjI3ODc3ODU=,667,Allow injecting configuration data from plugins,870184,closed,0,,,2,2020-02-10T19:50:15Z,2020-02-12T16:18:22Z,2020-02-12T09:21:22Z,NONE,,"I'm trying to customize datasette as explorer for [CLDF](https://cldf.clld.org) datasets. Such datasets can be converted automatically to SQLite, which then can be fed to datasette, (e.g. https://github.com/cldf/cookbook/blob/master/recipes/datasette/README.md). Part of this customization would be support for the ""special"" data types described in the [CLDF ontology](https://cldf.clld.org/v1.0/terms.rdf). But while rendering of the values can be customized via the `render_cell` hook in a plugin, e.g. custom labels for foreign keys must be specified through the config file. It would be nice to be able to programmatically inject config data from plugins as well.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/667/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 550293770,MDU6SXNzdWU1NTAyOTM3NzA=,658,How do I use the app.css as style sheet?,49656826,open,0,,,2,2020-01-15T16:27:57Z,2020-02-07T00:29:50Z,,NONE,,"Simon, I'm trying to use the app.css (in static folder) as style sheet but the datasette on Heroku simply ignore it! I read everything about customization here and on readthedocs but still can't. Is this possible? Thanks!",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/658/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 518506242,MDU6SXNzdWU1MTg1MDYyNDI=,616,Datasette FTS detection bug,49656826,closed,0,,,2,2019-11-06T14:25:47Z,2019-11-08T15:31:33Z,2019-11-08T02:06:56Z,NONE,,"I'm having a trouble with datasette. I deployed EXACTLY the same project on two different apps on Heroku. Both have databases (not all) with FTS activated but only one detects and works fine. 
You can take a look here:

With search: http://teste-templates.herokuapp.com/amazonia_protege/car
Without search: http://bases.vortex.media/amazonia_protege/car

![teste](https://user-images.githubusercontent.com/49656826/68306310-11a80e00-0088-11ea-8d1c-db3bd3375518.jpg)
",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/616/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
491219910,MDU6SXNzdWU0OTEyMTk5MTA=,61,importing CSV to SQLite as library,17739,closed,0,,,2,2019-09-09T17:12:40Z,2019-11-04T16:25:01Z,2019-11-04T16:25:01Z,NONE,,"CSV can be imported into SQLite when using the CLI, but I don't see documentation for doing the same when using it as a library.",140912432,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/61/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
494685791,MDU6SXNzdWU0OTQ2ODU3OTE=,574,Improve usage description of --host option,132978,closed,0,,,2,2019-09-17T15:12:12Z,2019-11-01T21:58:17Z,2019-11-01T21:57:54Z,NONE,,"It would be nice if the `--host` option had a clearer description. I tried to get datasette running on an AWS instance and it took a while to realize it was only listening on localhost. So I wanted to make it listen on a non-localhost interface and tried giving a couple of values to `--host` (a host name, then an interface name), but none of them worked. In the end I read the source to see that the option is passed to `uvicorn`, and looked at the uvicorn docs, which also didn't help. Then I searched the web for ""example running datasette on a host"", which led me to https://github.com/simonw/datasette/issues/514 where I saw someone using `-h 0.0.0.0`. I tried that and it works. That usage could be mentioned somewhere, and might save someone else some time.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/574/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
510076368,MDU6SXNzdWU1MTAwNzYzNjg=,605,Support queries at the table level,12617395,open,0,,,2,2019-10-21T15:58:30Z,2019-10-30T18:55:37Z,,NONE,,"Per the issue described in [issue #588](https://github.com/simonw/datasette/issues/588), it was determined that queries are not supported at the table level.
Per my last comment in the issue, I'd like to request support for this, as it would help eliminate errors in the event certain tables are not present in the database.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/605/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
504238461,MDU6SXNzdWU1MDQyMzg0NjE=,6,sqlite3.OperationalError: table users has no column named bio,1055831,closed,0,,,2,2019-10-08T19:39:52Z,2019-10-13T05:31:28Z,2019-10-13T05:30:19Z,NONE,,"```
$ github-to-sqlite repos github.db
$ github-to-sqlite starred github.db dazzag24
Traceback (most recent call last):
  File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/bin/github-to-sqlite"", line 10, in <module>
    sys.exit(cli())
  File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 764, in __call__
    return self.main(*args, **kwargs)
  File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 717, in main
    rv = self.invoke(ctx)
  File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 555, in invoke
    return callback(*args, **kwargs)
  File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/cli.py"", line 106, in starred
    utils.save_stars(db, user, stars)
  File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 177, in save_stars
    user_id = save_user(db, user)
  File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 61, in save_user
    return db[""users""].upsert(to_save, pk=""id"").last_pk
  File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 1067, in upsert
    extracts=extracts,
  File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 916, in insert
    extracts=extracts,
  File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 1024, in insert_all
    result = self.db.conn.execute(sql, values)
sqlite3.OperationalError: table users has no column named bio
```

```
$ pipenv graph
github-to-sqlite==0.4
  - requests [required: Any, installed: 2.22.0]
    - certifi [required: >=2017.4.17, installed: 2019.9.11]
    - chardet [required: >=3.0.2,<3.1.0, installed: 3.0.4]
    - idna [required: >=2.5,<2.9, installed: 2.8]
    - urllib3 [required: >=1.21.1,<1.26,!=1.25.1,!=1.25.0, installed: 1.25.6]
  - sqlite-utils [required: ~=1.11, installed: 1.11]
    - click [required: Any, installed: 7.0]
    - click-default-group [required: Any, installed: 1.2.2]
      - click [required: Any, installed: 7.0]
    - tabulate [required: Any, installed: 0.8.5]
Python 3.6.8
```",207052882,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
457201907,MDU6SXNzdWU0NTcyMDE5MDc=,513,Is it possible to publish to Heroku despite slug size being too large?,7936571,closed,0,,,2,2019-06-18T00:12:02Z,2019-06-21T22:35:54Z,2019-06-21T22:35:54Z,NONE,,"I'm trying to push more than 1.5GB worth of SQLite databases -- 535MB compressed -- to Heroku, but I get this error when I run the `datasette publish heroku` command:

Compiled slug size: 535.5M is too large (max is 500M).

Can I publish the databases and make datasette work on Heroku despite the large slug size?",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/513/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
448189298,MDU6SXNzdWU0NDgxODkyOTg=,486,Ability to add extra routes and related templates,2181410,closed,0,,,2,2019-05-24T14:04:25Z,2019-05-24T14:43:28Z,2019-05-24T14:43:09Z,NONE,,"Hi Simon

Thanks for an excellent job! Datasette is such an obviously good idea (once you have that idea!) and so well done. The only thing that I miss is the ability to add extra routes (with associated Jinja2 templates). For most of the datasets that I would like to publish, I would also like at least a page that describes the data (semantics, provenance, biases...) and a page explaining our cookie and privacy policies (which would allow us to use something like Google Analytics).",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/486/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
411257981,MDU6SXNzdWU0MTEyNTc5ODE=,412,Linked Data(sette),43340,open,0,,,2,2019-02-18T00:38:14Z,2019-03-19T10:09:46Z,,NONE,,"I've a radical feature idea (possibly first as an extension in order to experiment?): I'd like to link to a remote table from a remote database, e.g. with a function ""linked_datasette()"". So one could do the following query:

```
SELECT foo.id, foo.a, remote_party.b
FROM foo
JOIN linked_datasette(""https://parlgov.datasettes.com/parlgov-b42a2f2"") AS remote_party
  ON foo.id=remote_party.id
```

This is inspired by SPARQL's SERVICE keyword for remote RDF ""endpoints"". There's a foundation in the SQL Standard called SQL/MED (https://rhaas.blogspot.com/2011/01/why-sqlmed-is-cool.html). And here's an implementation from me in Postgres FDW to connect another Postgres ""endpoint"": https://pastebin.com/Fz2v64Cz",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/412/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
400511206,MDU6SXNzdWU0MDA1MTEyMDY=,403,How does persistence work?,1794527,closed,0,,,2,2019-01-17T23:41:57Z,2019-01-19T05:47:55Z,2019-01-18T06:51:14Z,NONE,,I was under the impression that now.sh is for stateless microservices. So where are these SQLite databases stored and when do they get created and destroyed?,107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/403/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
330826972,MDU6SXNzdWUzMzA4MjY5NzI=,308,"Support extra Heroku apps:create options - region, space, team",78156,open,0,,,2,2018-06-08T23:08:33Z,2018-09-21T14:09:28Z,,NONE,,"It would be useful to document how to pass Heroku CLI options on `datasette publish`, e.g. `--region eu`.
`--region eu`.",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/308/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 339095976,MDU6SXNzdWUzMzkwOTU5NzY=,334,extra_options not passed to heroku publisher,719357,closed,0,,,2,2018-07-06T23:26:12Z,2018-07-24T04:53:21Z,2018-07-10T01:46:04Z,NONE,,"I might be wrong but I was not able to publish to `heroku` with `--extra-options`, I think `extra_options` is not being used in this function [here](https://github.com/simonw/datasette/blob/master/datasette/utils.py#L369). Any help appreciated! ",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/334/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274160723,MDU6SXNzdWUyNzQxNjA3MjM=,100,TemplateAssertionError: no filter named 'tojson',13304454,closed,0,,,2,2017-11-15T13:43:41Z,2017-11-16T09:25:10Z,2017-11-16T00:14:13Z,NONE,,"A 500 error is raised upon clicking on the name of a table on the homepage, say _http://0.0.0.0:8001/_ to _http://0.0.0.0:8001/test_check-c1f4771/users_ The API part seems to function as intended, though... ``` 2017-11-15 14:33:57 - (sanic)[ERROR]: Traceback (most recent call last): File ""/usr/local/lib/python3.5/dist-packages/sanic/app.py"", line 503, in handle_request response = await response File ""/usr/local/lib/python3.5/dist-packages/datasette/app.py"", line 155, in get return await self.view_get(request, name, hash, **kwargs) File ""/usr/local/lib/python3.5/dist-packages/datasette/app.py"", line 219, in view_get **context, File ""/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py"", line 84, in render return html(self.render_string(template, request, **context)) File ""/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py"", line 81, in render_string return self.env.get_template(template).render(**context) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 812, in get_template return self._load_template(name, self.make_globals(globals)) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 786, in _load_template template = self.loader.load(self, name, globals) File ""/usr/lib/python3/dist-packages/jinja2/loaders.py"", line 125, in load code = environment.compile(source, name, filename) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 565, in compile self.handle_exception(exc_info, source_hint=source_hint) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 754, in handle_exception reraise(exc_type, exc_value, tb) File ""/usr/lib/python3/dist-packages/jinja2/_compat.py"", line 37, in reraise raise value.with_traceback(tb) File ""/usr/local/lib/python3.5/dist-packages/datasette/templates/table.html"", line 29, in template
    params = {{ query.params|tojson(4) }}
File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 515, in _generate return generate(source, self, name, filename, defer_init=defer_init) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 62, in generate generator.visit(node) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 849, in visit_Template self.blockvisit(block.body, block_frame) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1172, in visit_If self.blockvisit(node.body, if_frame) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1353, in visit_Output self.visit(argument, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1565, in visit_Filter self.fail('no filter named %r' % node.name, node.lineno) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 427, in fail raise TemplateAssertionError(msg, lineno, self.name, self.filename) jinja2.exceptions.TemplateAssertionError: no filter named 'tojson' 2017-11-15 14:33:57 - (network)[INFO][127.0.0.1:41316]: GET http://0.0.0.0:8001/test_check-c1f4771/users 500 144 2017-11-15 14:33:57 - (network)[INFO][127.0.0.1:41316]: GET http://0.0.0.0:8001/favicon.ico 200 0 ```",107914493,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/100/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed