id,node_id,number,title,user,user_label,state,locked,assignee,assignee_label,milestone,milestone_label,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,repo_label,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 267513424,MDU6SXNzdWUyNjc1MTM0MjQ=,1,Addressable pages for every row in a table,9599,simonw,closed,0,,,2857392,Ship first public release,6,2017-10-23T00:44:16Z,2017-10-24T14:11:04Z,2017-10-24T14:11:03Z,OWNER,," /database-name-7sha256/table-name/compound-pk /database-name-7sha256/table-name/compound-pk.json Tricky part will be figuring out what the private key is - especially since it could be a compound primary key and it might involve different data types.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 347058326,MDExOlB1bGxSZXF1ZXN0MjA1NzcwOTk2,1,Make .indexes compatible with older SQLite versions,9599,simonw,closed,0,,,,,0,2018-08-02T15:17:05Z,2018-08-02T15:17:30Z,2018-08-02T15:17:30Z,OWNER,simonw/sqlite-utils/pulls/1,Older SQLite versions return a different set of columns from the PRAGMA we are using.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 470637068,MDU6SXNzdWU0NzA2MzcwNjg=,1,Use XML Analyser to figure out the structure of the export XML,9599,simonw,closed,0,,,,,1,2019-07-20T05:19:02Z,2019-07-20T05:20:09Z,2019-07-20T05:20:09Z,MEMBER,,https://github.com/simonw/xml_analyser,197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487598042,MDU6SXNzdWU0ODc1OTgwNDI=,1,Implement code to pull checkins from the Foursquare API,9599,simonw,closed,0,,,,,0,2019-08-30T17:40:02Z,2019-08-30T18:23:24Z,2019-08-30T18:23:24Z,MEMBER,,"The tool currently only works with a pre-prepared JSON file of checkins. When called without options, it should prompt the user to paste in a Foursquare OAuth token. The `--token=` option should work too, and should be backed up by an optional environment variable.",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488833136,MDU6SXNzdWU0ODg4MzMxMzY=,1,"Imported followers should go in ""users"", relationships in ""following""",9599,simonw,closed,0,,,,,0,2019-09-03T21:27:37Z,2019-09-04T20:23:04Z,2019-09-04T20:23:04Z,MEMBER,,"Right now `twitter-to-sqlite followers` dumps everything in a `followers` table, and doesn't actually record which account they are following! It should instead save them all in a global `users` table and then set up m2m relationships in a `following` table. This also means it should create a record for the specified user in order to record both sides of each relationship.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493599818,MDU6SXNzdWU0OTM1OTk4MTg=,1,Command for fetching starred repos,9599,simonw,closed,0,,,,,0,2019-09-14T08:36:29Z,2019-09-14T21:30:48Z,2019-09-14T21:30:48Z,MEMBER,,,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 496415321,MDU6SXNzdWU0OTY0MTUzMjE=,1,Figure out some interesting example SQL queries,9599,simonw,open,0,,,,,9,2019-09-20T15:28:07Z,2021-05-03T03:46:23Z,,MEMBER,,My knowledge of genetics has left me short here. I'd love to be able to provide some interesting example SELECT queries - maybe one that spots if you are [likely to have red hair?](https://www.snpedia.com/index.php/Rs1805007),209590345,genome-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 503233021,MDU6SXNzdWU1MDMyMzMwMjE=,1,Use better pagination (and implement progress bar),9599,simonw,closed,0,,,,,4,2019-10-07T04:58:11Z,2020-03-27T22:13:57Z,2020-03-27T22:13:57Z,MEMBER,,"Right now we attempt to load everything at once - which caps out at 5,000 items and is really slow. We can do better by implementing pagination using count and offset.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 504720731,MDU6SXNzdWU1MDQ3MjA3MzE=,1,Add more details on how to request data from google takeout correctly.,1055831,dazzag24,open,0,,,,,0,2019-10-09T15:17:34Z,2019-10-09T15:17:34Z,,NONE,,"The default is to download everything. This can result in an enormous amount of data when you only really need 2 types of data for now: - My Activity - Location History In addition unless you specify that ""My Activity"" is downloaded in JSON format the default is HTML. This then causes the `google-takeout-to-sqlite my-activity takeout.db takeout.zip` command to fail as it only contains html files not json files. Thanks",206649770,google-takeout-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 519979091,MDExOlB1bGxSZXF1ZXN0MzM4NjQ3Mzc4,1,Add parkrun-to-sqlite,1101318,mrw34,closed,0,,,,,0,2019-11-08T12:05:32Z,2020-10-12T00:35:16Z,2020-10-12T00:35:16Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/1,,214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 585526292,MDU6SXNzdWU1ODU1MjYyOTI=,1,Set up full text search,9599,simonw,closed,0,,,,,1,2020-03-21T15:57:35Z,2020-03-21T19:47:46Z,2020-03-21T19:45:52Z,MEMBER,,"Should run against `title` and `text` in `items`, and `about` and `id` in `users`.",248903544,hacker-news-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602533300,MDU6SXNzdWU2MDI1MzMzMDA=,1,Import photo metadata from Apple Photos into SQLite,9599,simonw,open,0,,,5324096,Apple Photos online and securely browsable,8,2020-04-18T19:23:26Z,2020-05-04T02:41:40Z,,MEMBER,,"Faces, albums, locations, that kind of thing.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 689800307,MDU6SXNzdWU2ODk4MDAzMDc=,1,Add an index on the timestamp column,9599,simonw,closed,0,,,,,0,2020-09-01T04:33:37Z,2020-09-01T04:49:23Z,2020-09-01T04:49:23Z,MEMBER,,Since default view will likely be ordered by timestamp descending.,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718934942,MDU6SXNzdWU3MTg5MzQ5NDI=,1,Documentation on how to use this with Datasette,9599,simonw,open,0,,,,,1,2020-10-11T21:56:27Z,2020-10-11T22:14:00Z,,MEMBER,,In particular how to use `datasette-render-images` to see the images.,303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1353411865,I_kwDODEpn8M5Qq20Z,1,Problem with my user,2467,fernand0,open,0,,,,,0,2022-08-28T16:59:37Z,2022-08-28T16:59:37Z,,NONE,,"If I call the program with: inaturalist-to-sqlite inaturalist.db ftricas the program exits with an error: `Importing 36 observations Traceback (most recent call last): File ""/home/ftricas/.pyenv/versions/3.10.6/bin/inaturalist-to-sqlite"", line 8, in sys.exit(cli()) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/inaturalist_to_sqlite/cli.py"", line 51, in cli save_observation(observation, db) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/inaturalist_to_sqlite/utils.py"", line 34, in save_observation db[""observations""] File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 2965, in insert return self.insert_all( File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 3068, in insert_all self.create( File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 1564, in create self.db.create_table( File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 951, in create_table sql = self.create_table_sql( File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 765, in create_table_sql foreign_keys = self.resolve_foreign_keys(name, foreign_keys or []) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 702, in resolve_foreign_keys other_table = table.guess_foreign_table(column) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/sqlite_utils/db.py"", line 2061, in guess_foreign_table raise NoObviousTable( sqlite_utils.db.NoObviousTable: No obvious foreign key table for column 'taxon' - tried ['taxon', 'taxons'] ` If I call the program with your user everything seems to go well and then, I can call the program with my own user without problems. Moreover, I can call the program again with my own user and everything goes well now. Additional info, the command: sqlite-utils tables inaturalist.db shows that the correct name can be 'taxons'. There is another small problem with a warning: warnings.warn(""urllib3 ({}) or chardet ({})/charset_normalizer ({}) doesn't match a supported "" ",206202864,inaturalist-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/inaturalist-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1616347574,I_kwDOJHON9s5gV4G2,1,Initial proof of concept with ChatGPT,9599,simonw,closed,0,,,,,3,2023-03-09T03:44:39Z,2023-03-09T03:51:55Z,2023-03-09T03:51:55Z,MEMBER,,I'm using ChatGPT to figure out enough AppleScript to get at my notes data.,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267513523,MDU6SXNzdWUyNjc1MTM1MjM=,2,Initial proof-of-concept,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-23T00:45:37Z,2017-10-23T01:26:39Z,2017-10-23T00:45:53Z,OWNER,,Implemented in https://github.com/simonw/stateless-datasets/commit/de04d7a854d71003ffcf98028eab976a936c2dba,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 349850687,MDU6SXNzdWUzNDk4NTA2ODc=,2,Mechanism for adding foreign keys to an existing table,9599,simonw,closed,0,,,,,1,2018-08-12T22:50:56Z,2019-02-24T21:34:41Z,2019-02-24T21:34:41Z,OWNER,,"SQLite does not have ALTER TABLE support for adding new foreign keys... but it turns out it's possible to make these changes without having to duplicate the entire table by carefully running `UPDATE sqlite_master SET sql=... WHERE type='table' AND name='X';` Here's how Django does it: https://github.com/django/django/blob/d3449faaa915a08c275b35de01e66a7ef6bdb2dc/django/db/backends/sqlite3/schema.py#L103-L125 And here's the official documentation about this: https://sqlite.org/lang_altertable.html#otheralter (scroll to the very bottom of the page)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470637152,MDU6SXNzdWU0NzA2MzcxNTI=,2,Import workouts,9599,simonw,closed,0,,,,,1,2019-07-20T05:20:21Z,2019-07-20T06:21:41Z,2019-07-20T06:21:41Z,MEMBER,,From #1,197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487598468,MDU6SXNzdWU0ODc1OTg0Njg=,2,--save option to dump checkins to a JSON file on disk,9599,simonw,closed,0,,,,,1,2019-08-30T17:41:06Z,2019-08-31T02:40:21Z,2019-08-31T02:40:21Z,MEMBER,,"This is a complement to the `--load` option - mainly useful for development purposes. (I'll rename `--file` to `--load` as part of this issue).",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488833698,MDU6SXNzdWU0ODg4MzM2OTg=,2,"""twitter-to-sqlite user-timeline"" command for pulling tweets by a specific user",9599,simonw,closed,0,,,,,3,2019-09-03T21:29:12Z,2019-09-04T20:02:11Z,2019-09-04T20:02:11Z,MEMBER,,"Twitter only allows up to 3,200 tweets to be retrieved from https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-user_timeline.html I'm going to do: $ twitter-to-sqlite tweets simonw ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493668862,MDU6SXNzdWU0OTM2Njg4NjI=,2,Extract licenses from repos into a separate table,9599,simonw,closed,0,,,,,0,2019-09-14T21:33:41Z,2019-09-14T21:46:58Z,2019-09-14T21:46:58Z,MEMBER,," ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503234169,MDU6SXNzdWU1MDMyMzQxNjk=,2,Track and use the 'since' value,9599,simonw,closed,0,,,,,3,2019-10-07T05:02:59Z,2020-03-27T22:22:30Z,2020-03-27T22:22:30Z,MEMBER,,"Pocket says: > Whenever possible, you should use the since parameter, or count and and offset parameters when retrieving a user's list. After retrieving the list, you should store the current time (which is provided along with the list response) and pass that in the next request for the list. This way the server only needs to return a small set (changes since that time) instead of the user's entire list every time. At the bottom of https://getpocket.com/developer/docs/v3/retrieve",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 541274681,MDU6SXNzdWU1NDEyNzQ2ODE=,2,Add linkedin-to-sqlite,881925,mnp,open,0,,,,,0,2019-12-21T03:13:40Z,2019-12-21T03:13:40Z,,NONE,,"There is an API available. https://developer.linkedin.com/docs/rest-api# At the minimum, I would think contact list and messages would be of interest.",214746582,dogsheep.github.io,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 602533352,MDU6SXNzdWU2MDI1MzMzNTI=,2,Ability to convert HEIC images to JPEG,9599,simonw,closed,0,,,5324096,Apple Photos online and securely browsable,1,2020-04-18T19:23:43Z,2020-04-28T16:47:21Z,2020-04-28T16:47:21Z,MEMBER,,,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 664793260,MDU6SXNzdWU2NjQ3OTMyNjA=,2,Yak shave,145425,ekg,open,0,,,,,0,2020-07-23T22:04:18Z,2020-07-23T22:04:18Z,,NONE,,"Just a quick note... The 23andme data is not exactly your genome, but a SNP chip of your genome. It's ""some of your genotypes."" Or about 0.1% of your genome. Nice work in any case! It deserves to be liberated!!!!!",209590345,genome-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 689809225,MDU6SXNzdWU2ODk4MDkyMjU=,2,Apply porter stemming,9599,simonw,closed,0,,,,,2,2020-09-01T04:57:55Z,2020-09-01T20:42:00Z,2020-09-01T20:40:24Z,MEMBER,,This can be on by default. You can turn it off for a table in the config file using `stemming: none` - or maybe `tokenize: none` to match the terminology used by SQLite and `sqlite-utils`: https://sqlite-utils.readthedocs.io/en/stable/python-api.html#enabling-full-text-search,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718938046,MDU6SXNzdWU3MTg5MzgwNDY=,2,Convert dates to a better format,9599,simonw,closed,0,,,,,0,2020-10-11T22:12:33Z,2020-10-11T23:15:03Z,2020-10-11T23:15:03Z,MEMBER,,"They currently look like this: https://github.com/dogsheep/evernote-to-sqlite/blob/9d8efd17580f6ddf76745c145d1e69dd24e52b64/tests/test_evernote_to_sqlite.py#L35-L36",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 769376447,MDU6SXNzdWU3NjkzNzY0NDc=,2,killed by oomkiller on large location-history,231498,khimaros,open,0,,,,,2,2020-12-17T00:32:24Z,2020-12-17T00:48:32Z,,NONE,,"memory seems to grow unbounded and is oom-killed after about 20GB memory usage. this is happening while loading a ~1GB uncompressed location history.",206649770,google-takeout-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 952179830,MDU6SXNzdWU5NTIxNzk4MzA=,2,Command for fetching Hacker News threads from the search API,9599,simonw,open,0,,,,,4,2021-07-25T02:00:45Z,2021-07-25T03:12:57Z,,MEMBER,,"I want to be able to fetch every item for a domain, e.g. https://news.ycombinator.com/from?site=simonwillison.net",248903544,hacker-news-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1485017981,I_kwDODEpn8M5Yg5N9,2,table identifications has no column named previous_observation_taxon,520541,heaversm,open,0,,,,,0,2022-12-08T16:47:17Z,2022-12-08T16:47:17Z,,NONE,,"Installed successfully with pip and ran `inaturalist-to-sqlite inaturalist.db simonw` and got the error: ``` sqlite3.OperationalError: table identifications has no column named previous_observation_taxon ```",206202864,inaturalist-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/inaturalist-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1616354999,I_kwDOJHON9s5gV563,2,First working version,9599,simonw,closed,0,,,,,7,2023-03-09T03:53:00Z,2023-03-09T05:10:22Z,2023-03-09T05:10:22Z,MEMBER,,"It's going to shell out to `osascript` as seen in: - #1 I'm going with that option because https://appscript.sourceforge.io/status.html warns against the other potential methods: > Apple eliminated its Mac Automation department in 2016. The future of AppleScript and its related technologies is unclear. Caveat emptor. But `osascript` looks pretty stable to me.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267515678,MDU6SXNzdWUyNjc1MTU2Nzg=,3,"Make individual column valuables addressable, with smart content types",9599,simonw,open,0,,,,,1,2017-10-23T01:11:32Z,2017-12-10T03:11:58Z,,OWNER,,"Some SQLite databases embed images in columns. It would be cool if these had URLs. /database-name-7sha256/table-name/compound-pk/column /database-name-7sha256/table-name/compound-pk/column.json /database-name-7sha256/table-name/compound-pk/column.png /database-name-7sha256/table-name/compound-pk/column.gif /database-name-7sha256/table-name/compound-pk/column.txt The one without an explicit file extension auto-detects the correct extension.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 351845423,MDU6SXNzdWUzNTE4NDU0MjM=,3,Experiment with contentless FTS tables,9599,simonw,closed,0,,,,,1,2018-08-18T19:31:01Z,2019-07-22T20:58:55Z,2019-07-22T20:58:55Z,OWNER,,Could greatly reduce size of resulting database for large datasets: http://cocoamine.net/blog/2015/09/07/contentless-fts4-for-large-immutable-documents/,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470637206,MDU6SXNzdWU0NzA2MzcyMDY=,3,Import ActivitySummary,9599,simonw,closed,0,,,,,0,2019-07-20T05:21:00Z,2019-07-20T05:58:07Z,2019-07-20T05:58:07Z,MEMBER,,"From #1 ```python 'ActivitySummary': {'attr_counts': {'activeEnergyBurned': 980, 'activeEnergyBurnedGoal': 980, 'activeEnergyBurnedUnit': 980, 'appleExerciseTime': 980, 'appleExerciseTimeGoal': 980, 'appleStandHours': 980, 'appleStandHoursGoal': 980, 'dateComponents': 980}, 'child_counts': {}, 'count': 980, 'parent_counts': {'HealthData': 980}}, ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487600595,MDU6SXNzdWU0ODc2MDA1OTU=,3,Option to fetch only checkins more recent than the current max checkin,9599,simonw,closed,0,,,,,4,2019-08-30T17:46:45Z,2019-10-16T20:41:23Z,2019-10-16T20:39:59Z,MEMBER,,"The Foursquare checkins API supports ""return every checkin occurring after this point"" - I can pass it the maximum createdAt date currently stored in the database. This will allow for quick incremental fetches via a cron.",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488833975,MDU6SXNzdWU0ODg4MzM5NzU=,3,Command for running a search and saving tweets for that search,9599,simonw,closed,0,,,,,6,2019-09-03T21:29:56Z,2019-11-04T05:31:56Z,2019-11-04T05:31:16Z,MEMBER,, $ twitter-to-sqlite search dogsheep,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493670426,MDU6SXNzdWU0OTM2NzA0MjY=,3,Command to fetch all repos belonging to a user or organization,9599,simonw,closed,0,,,,,2,2019-09-14T21:54:21Z,2019-09-17T00:17:53Z,2019-09-17T00:17:53Z,MEMBER,,"How about this: $ github-to-sqlite repos simonw",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503243784,MDU6SXNzdWU1MDMyNDM3ODQ=,3,Extract images into separate tables,9599,simonw,open,0,,,,,1,2019-10-07T05:43:01Z,2020-09-01T06:17:45Z,,MEMBER,,"As already done with authors. Slightly harder because images do not have a universally unique ID. Also need to figure out what to do about there being columns for both `image` and `images`. ",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 543717994,MDExOlB1bGxSZXF1ZXN0MzU3OTc0MzI2,3,Add todoist-to-sqlite,706257,bcongdon,closed,0,,,,,0,2019-12-30T04:02:59Z,2020-10-12T00:35:58Z,2020-10-12T00:35:57Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/3,"Really enjoying getting into the dogsheep/datasette ecosystem. I made a downloader for Todoist, and I think/hope others might find this useful",214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 602533481,MDU6SXNzdWU2MDI1MzM0ODE=,3,"Import EXIF data into SQLite - lens used, ISO, aperture etc",9599,simonw,open,0,,,5324096,Apple Photos online and securely browsable,2,2020-04-18T19:24:31Z,2021-10-05T12:38:24Z,,MEMBER,,,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 689810340,MDU6SXNzdWU2ODk4MTAzNDA=,3,"Datasette plugin to provide custom page for running faceted, ranked searches",9599,simonw,closed,0,,,,,3,2020-09-01T05:00:22Z,2020-09-03T21:01:41Z,2020-09-03T21:01:41Z,MEMBER,,"This will be a page at `/-/beta` which renders using a custom template. It will offer a default timeline view plus search and facet by type/date.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718938321,MDU6SXNzdWU3MTg5MzgzMjE=,3,Use a content hash for the note IDs,9599,simonw,closed,0,,,,,0,2020-10-11T22:13:46Z,2020-10-11T23:15:04Z,2020-10-11T23:15:04Z,MEMBER,,"Without a GUID note IDs are pretty ineffective, but using a hash of the contents will at least avoid creating identical duplicates in the future. https://sqlite-utils.readthedocs.io/en/stable/python-api.html#setting-an-id-based-on-the-hash-of-the-row-contents",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 769397742,MDU6SXNzdWU3NjkzOTc3NDI=,3,sqlite-utils error on takeout import,231498,khimaros,open,0,,,,,0,2020-12-17T01:18:48Z,2020-12-17T01:19:04Z,,NONE,,"``` $ google-takeout-to-sqlite my-activity takeout.db /path/to/zip ... sqlite3.OperationalError: no such table: main.my_activity ``` there is no table create in `utils.py`, unlike other importers such as github-to-sqlite additionally, this package and hackernews-to-sqlite have conflicting `sqlite-utils` dep with datasette and dogsheep-beta",206649770,google-takeout-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/3/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 952189173,MDU6SXNzdWU5NTIxODkxNzM=,3,Use HN algolia endpoint to retrieve trees,9599,simonw,open,0,,,,,3,2021-07-25T03:35:27Z,2021-07-25T18:41:17Z,,MEMBER,,"The `trees` command currently has to make a request for every single comment. Algolia have an endpoint that bundles the entire thread together into a single request. `https://hn.algolia.com/api/v1/items/ID` Here's an example that loads quickly, with about 50 comments: https://hn.algolia.com/api/v1/items/27941108 It doesn't appear to use pagination at all - if a thread is big then the response is big. I ran this search to find some stories with more than 1000 comments: https://hn.algolia.com/api/v1/search?tags=story&numericFilters=num_comments%3E=1000 Here's one: https://news.ycombinator.com/item?id=25015967 with 4759 comments. Hitting the API takes 41s and returns 3.7 MB of JSON! ``` wget 'https://hn.algolia.com/api/v1/items/25015967' 0.03s user 0.04s system 0% cpu 41.368 total /tmp % ls -lah 25015967 -rw-r--r-- 1 simon wheel 3.7M Jul 24 20:31 25015967 ```",248903544,hacker-news-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1616422013,I_kwDOJHON9s5gWKR9,3,`apple-notes-to-sqlite --dump` option,9599,simonw,closed,0,,,,,0,2023-03-09T05:05:49Z,2023-03-09T05:06:14Z,2023-03-09T05:06:14Z,MEMBER,,"Option that doesn't write to the database at all, it just outputs all the notes to stdout as newline-delimited JSON.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267515836,MDU6SXNzdWUyNjc1MTU4MzY=,4,Make URLs immutable,9599,simonw,closed,0,,,2857392,Ship first public release,8,2017-10-23T01:13:30Z,2017-10-24T02:38:24Z,2017-10-24T02:38:24Z,OWNER,,"Absolutely everything should have a far-future expires header Part of the URL will be the truncated sha1 hash of the database file itself, calculated at build time",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 403028630,MDExOlB1bGxSZXF1ZXN0MjQ3NTc2OTQy,4,Fts5,9599,simonw,closed,0,,,,,0,2019-01-25T06:54:05Z,2019-01-25T06:54:33Z,2019-01-25T06:54:33Z,OWNER,simonw/sqlite-utils/pulls/4,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 470640505,MDU6SXNzdWU0NzA2NDA1MDU=,4,Import Records,9599,simonw,closed,0,,,,,1,2019-07-20T06:11:20Z,2019-07-20T06:21:41Z,2019-07-20T06:21:41Z,MEMBER,,"From #1: ```python 'Record': {'attr_counts': {'creationDate': 2672233, 'device': 2665111, 'endDate': 2672233, 'sourceName': 2672233, 'sourceVersion': 2671779, 'startDate': 2672233, 'type': 2672233, 'unit': 2650012, 'value': 2672232}, 'child_counts': {'HeartRateVariabilityMetadataList': 2318, 'MetadataEntry': 287974}, 'count': 2672233, 'parent_counts': {'Correlation': 2, 'HealthData': 2672231}}, ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487601121,MDU6SXNzdWU0ODc2MDExMjE=,4,Online tool for getting a Foursquare OAuth token,9599,simonw,closed,0,,,,,1,2019-08-30T17:48:14Z,2019-08-31T18:07:26Z,2019-08-31T18:07:26Z,MEMBER,,"I will link to this from the documentation. See also this conversation on Twitter: https://twitter.com/simonw/status/1166822603023011840 I've decided to go with ""copy and paste in a token"" rather than hooking up a local web server that can have tokens passed to it.",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488835586,MDU6SXNzdWU0ODg4MzU1ODY=,4,Command for importing data from a Twitter Export file,9599,simonw,closed,0,,,,,2,2019-09-03T21:34:13Z,2019-10-11T06:45:02Z,2019-10-11T06:45:02Z,MEMBER,,"Twitter lets you export all of your data as an archive file: https://twitter.com/settings/your_twitter_data A command for importing this data into SQLite would be extremely useful. $ twitter-to-sqlite import twitter.db path-to-archive.zip ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493670730,MDU6SXNzdWU0OTM2NzA3MzA=,4,Command to fetch stargazers for one or more repos,9599,simonw,closed,0,,,,,8,2019-09-14T21:58:22Z,2020-05-02T21:30:27Z,2020-05-02T21:30:27Z,MEMBER,,"Maybe this: $ github-to-sqlite stargazers github.db simonw/datasette It could accept more than one repos. Maybe have options similar to `--sql` in [twitter-to-sqlite](https://github.com/dogsheep/twitter-to-sqlite) so you can e.g. fetch all stargazers for all of the repos you have fetched into the database already (or all of the repos belonging to owner X)",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 558715564,MDExOlB1bGxSZXF1ZXN0MzcwMDI0Njk3,4,Add beeminder-to-sqlite,706257,bcongdon,closed,0,,,,,0,2020-02-02T15:51:36Z,2020-10-12T00:36:16Z,2020-10-12T00:36:16Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/4,,214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 589402939,MDU6SXNzdWU1ODk0MDI5Mzk=,4,"Store authentication information as ""pocket_access_token"" etc",9599,simonw,closed,0,,,,,0,2020-03-27T20:43:22Z,2020-03-27T20:43:59Z,2020-03-27T20:43:59Z,MEMBER,,The `pocket_` prefix will mean that the same `auth.json` file can be used for other Dogsheep tools without Pocket over-riding a value set by some other tool.,213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602533539,MDU6SXNzdWU2MDI1MzM1Mzk=,4,Upload all my photos to a secure S3 bucket,9599,simonw,closed,0,,,5324096,Apple Photos online and securely browsable,14,2020-04-18T19:24:50Z,2020-04-18T21:58:11Z,2020-04-18T21:57:13Z,MEMBER,,"- [x] Create a bucket with bucket credentials - [x] Programmatically upload some recent photos to it (from a notebook) - [x] Turn this into a script",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 689839399,MDU6SXNzdWU2ODk4MzkzOTk=,4,Optimize the FTS table,9599,simonw,closed,0,,,,,1,2020-09-01T05:58:17Z,2020-09-01T06:10:08Z,2020-09-01T06:10:08Z,MEMBER,,,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718938508,MDU6SXNzdWU3MTg5Mzg1MDg=,4,Configure FTS + add an index on the date columns,9599,simonw,closed,0,,,,,2,2020-10-11T22:14:40Z,2020-10-11T23:41:29Z,2020-10-11T23:41:29Z,MEMBER,,"Sort by date descending is likely the most common way of sorting, so that column should be indexed. Also add FTS configuration for both notes and the OCR column on resources.",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 778380836,MDU6SXNzdWU3NzgzODA4MzY=,4,Feature Request: Gmail,203343,Btibert3,open,0,,,,,5,2021-01-04T21:31:09Z,2021-03-04T20:54:44Z,,NONE,,"From takeout, I only exported my Gmail account. Ideally I could parse this into sqlite via this tool.",206649770,google-takeout-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1205867842,I_kwDODtX3eM5H4BVC,4,Retrieve the top-level story for a comment,1755789,telotortium,open,0,,,,,0,2022-04-15T20:25:39Z,2022-04-15T20:25:39Z,,NONE,,"I think that each comment inserted into the database should include a column `onstory` that contains the ID of the story on which the comment was made. This is exactly equivalent to the link after ""on:"" at the top of an HN comment page ([example](https://news.ycombinator.com/item?id=18358028)). We could do this either by directly retrieving the HTML page and using Beautiful Soup to find that link, or alternatively recurse up the tree in the Firebase API using the `parent` field (probably using `functools.lru_cache` in case a person has commented a bunch of times on the same story).",248903544,hacker-news-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1616429236,I_kwDOJHON9s5gWMC0,4,Support incremental updates,9599,simonw,open,0,,,,,2,2023-03-09T05:14:00Z,2023-03-09T18:20:56Z,,MEMBER,,"Running this script can take several hours against a large notes database. Would be neat if it could run against just the notes that have been modified since it last ran. Could pull the max `updated` date and then keep on looping until it finds one modified before then. Problem is I don't actually know what order it iterates over the notes in.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 267516066,MDU6SXNzdWUyNjc1MTYwNjY=,5,Implement sensible query pagination,9599,simonw,closed,0,,,2857392,Ship first public release,3,2017-10-23T01:16:00Z,2017-11-10T20:41:39Z,2017-11-10T20:41:39Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 403396009,MDExOlB1bGxSZXF1ZXN0MjQ3ODYxNDE5,5,Run Travis tests against Python 3.8-dev,9599,simonw,closed,0,,,,,0,2019-01-26T02:30:55Z,2019-01-26T02:37:54Z,2019-01-26T02:37:54Z,OWNER,simonw/sqlite-utils/pulls/5,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 470691622,MDU6SXNzdWU0NzA2OTE2MjI=,5,Add progress bar,9599,simonw,closed,0,,,,,2,2019-07-20T16:29:07Z,2019-07-22T03:30:13Z,2019-07-22T02:49:22Z,MEMBER,,"Showing a progress bar would be nice, using Click. The easiest way to do this would probably be be to hook it up to the length of the compressed content, and update it as this code pushes more XML bytes through the parser: https://github.com/dogsheep/healthkit-to-sqlite/blob/d64299765064501f4efdd9a0b21dbdba9ec4287f/healthkit_to_sqlite/utils.py#L6-L10",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487721884,MDU6SXNzdWU0ODc3MjE4ODQ=,5,Treat Foursquare timestamps as UTC,9599,simonw,closed,0,,,,,0,2019-08-31T02:44:47Z,2019-08-31T02:50:41Z,2019-08-31T02:50:41Z,MEMBER,,"Current test failure is due to timezone differences between my laptop and Circle CI: https://circleci.com/gh/dogsheep/swarm-to-sqlite/3 ``` E Full diff: E - [{'created': '2018-07-01T04:48:19', E ? ^ E + [{'created': '2018-07-01T02:48:19', E ? ^ E 'createdAt': 1530413299, ``` The timestamps I store in `created` should always be UTC.",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488874815,MDU6SXNzdWU0ODg4NzQ4MTU=,5,Write tests that simulate the Twitter API,9599,simonw,open,0,,,,,1,2019-09-03T23:55:35Z,2019-09-03T23:56:28Z,,MEMBER,,I can use betamax for this: https://pypi.org/project/betamax/,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 493671014,MDU6SXNzdWU0OTM2NzEwMTQ=,5,"Add ""incomplete"" boolean to users table for incomplete profiles",9599,simonw,closed,0,,,,,2,2019-09-14T22:01:50Z,2020-03-23T19:23:31Z,2020-03-23T19:23:30Z,MEMBER,,"User profiles that are fetched from e.g. stargazers (#4) are incomplete - they have a login but they don't have name, company etc. Add a `incomplete` boolean flag to the `users` table to record this. Then later I can add a `backfill-users` command which loops through and fetches missing data for those incomplete profiles.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602551638,MDU6SXNzdWU2MDI1NTE2Mzg=,5,photos-to-sqlite s3-auth command,9599,simonw,closed,0,,,,,1,2020-04-18T21:05:25Z,2020-04-18T21:08:44Z,2020-04-18T21:08:44Z,MEMBER,,Modeled on `github-to-sqlite auth` - prompts the user for their S3 credentials and saves them to `auth.json`.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 629473827,MDU6SXNzdWU2Mjk0NzM4Mjc=,5,Set up a demo,26745575,harryvederci,open,0,,,,,1,2020-06-02T19:56:49Z,2020-09-01T06:18:43Z,,NONE,,"First off, thanks for open sourcing this application! This is a suggestion to increase the amount of people that would make use of it: an example in the readme file would help. Currently, users have to clone the app, install it, authorize through pocket, run a command, an then find out if this application does what they hope it does. Another possibility is to add a file `example-output.db`, containing one (mock) Pocket article. Keep up the good work!",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 689847361,MDU6SXNzdWU2ODk4NDczNjE=,5,Add a context column that's not searchable,9599,simonw,closed,0,,,,,1,2020-09-01T06:13:42Z,2020-09-03T18:43:50Z,2020-09-03T18:43:50Z,MEMBER,,"I sometimes like to configure titles that are things like ""Comment on issue X"" or ""Photo in Golden Gate Park"" - these shouldn't be included in the search index but should be stored so they can be displayed to provide context. Add a column for this - probably called `context` - and make it so it can be populated.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718938889,MDU6SXNzdWU3MTg5Mzg4ODk=,5,Figure out how to display images from tags inline in Datasette,9599,simonw,open,0,,,,,6,2020-10-11T22:17:03Z,2020-10-16T20:16:28Z,,MEMBER,,"Relates to #1. Evernote XML looks like this: ```xml
This note includes two images.
The Python logo
The Evernote logo
``` That hash is the md5 we use to store resources. It should be possible to turn these into embedded image tags, especially if done in conjunction with the https://github.com/simonw/datasette-media plugin.",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 723499985,MDExOlB1bGxSZXF1ZXN0NTA1MDc2NDE4,5,Add fitbit-to-sqlite,4632208,mrphil007,open,0,,,,,0,2020-10-16T20:04:05Z,2020-10-16T20:04:05Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/5,,214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 813880401,MDExOlB1bGxSZXF1ZXN0NTc3OTUzNzI3,5,WIP: Add Gmail takeout mbox import,306240,UtahDave,open,0,,,,,25,2021-02-22T21:30:40Z,2021-07-28T07:18:56Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/google-takeout-to-sqlite/pulls/5,"WIP This PR adds the ability to import emails from a Gmail mbox export from Google Takeout. This is my first PR to a datasette/dogsheep repo. I've tested this on my personal Google Takeout mbox with ~520,000 emails going back to 2004. This took around ~20 minutes to process. To provide some feedback on the progress of the import I added the ""rich"" python module. I'm happy to remove that if adding a dependency is discouraged. However, I think it makes a nice addition to give feedback on the progress of a long import. Do we want to log emails that have errors when trying to import them? Dealing with encodings with emails is a bit tricky. I'm very open to feedback on how to deal with those better. As well as any other feedback for improvements.",206649770,google-takeout-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1353418822,PR_kwDODtX3eM497MOV,5,The program fails when the user has no submissions,2467,fernand0,open,0,,,,,0,2022-08-28T17:25:45Z,2022-08-28T17:25:45Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/hacker-news-to-sqlite/pulls/5,"Tested with: hacker-news-to-sqlite user hacker-news.db fernand0 Result: ` Traceback (most recent call last): File ""/home/ftricas/.pyenv/versions/3.10.6/bin/hacker-news-to-sqlite"", line 8, in sys.exit(cli()) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1657, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) File ""/home/ftricas/.pyenv/versions/3.10.6/lib/python3.10/site-packages/hacker_news_to_sqlite/cli.py"", line 27, in user submitted = user.pop(""submitted"", None) or [] AttributeError: 'NoneType' object has no attribute 'pop' ` There is a problem of style with the patch (but not sure what to do) because with the new inicialization ( submitted = []) the part or [] is not needed. Maybe there is a more adequate way of doing this.",248903544,hacker-news-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1616440856,I_kwDOJHON9s5gWO4Y,5,Configure full text search,9599,simonw,open,0,,,,,0,2023-03-09T05:20:46Z,2023-03-09T05:20:46Z,,MEMBER,,"FTS would be useful. Maybe even extract the plain text from the notes to make that index easier to create, rather than creating it against the HTML. Can use the `plaintext` property for that.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 267516329,MDU6SXNzdWUyNjc1MTYzMjk=,6,Better JSON response options,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-23T01:18:47Z,2017-10-24T15:07:58Z,2017-10-24T15:07:58Z,OWNER,,"Default returns this: { “Columns”: [“id”, “name”, “age”], “Rows”: [ [45, “Simon”, 36] ] } .jsono instead returns a list of objects each duplicating the headers in its keys. They both probably share the same pagination mechanism so it might not be a jsono flat list.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 403624090,MDU6SXNzdWU0MDM2MjQwOTA=,6,"""sqlite-utils insert"" should support newline-delimited JSON",9599,simonw,closed,0,,,,,1,2019-01-28T02:00:02Z,2019-01-28T02:17:45Z,2019-01-28T02:17:45Z,OWNER,,"We can already export newline delimited JSON. We should learn to import it as well. The neat thing about importing it is that you can import GBs of data without having to read the whole lot into memory in order to decode the wrapping JSON array. Datasette can export it now: https://github.com/simonw/datasette/issues/405 Demo: https://latest.datasette.io/fixtures/facetable.json?_shape=array&_nl=on It should be possible to do this: $ curl ""https://latest.datasette.io/fixtures/facetable.json?_shape=array&_nl=on"" \ | sqlite-utils insert data.db facetable - --nl ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470856782,MDU6SXNzdWU0NzA4NTY3ODI=,6,Break up records into different tables for each type,9599,simonw,closed,0,,,,,1,2019-07-22T01:54:59Z,2019-07-22T03:28:55Z,2019-07-22T03:28:50Z,MEMBER,,"I don't think there's much benefit to having all of the different record types stored in the same enormous table. Here's what I get when I use `_facet=type`: I'm going to try splitting these up into separate tables - so `HKQuantityTypeIdentifierBodyMassIndex` becomes a table called `rBodyMassIndex` - and see if that's nicer to work with.",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 489419782,MDU6SXNzdWU0ODk0MTk3ODI=,6,Extract extended_entities into a media table,9599,simonw,closed,0,,,,,0,2019-09-04T21:59:10Z,2019-09-04T22:08:01Z,2019-09-04T22:08:01Z,MEMBER,," ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 504238461,MDU6SXNzdWU1MDQyMzg0NjE=,6,sqlite3.OperationalError: table users has no column named bio,1055831,dazzag24,closed,0,,,,,2,2019-10-08T19:39:52Z,2019-10-13T05:31:28Z,2019-10-13T05:30:19Z,NONE,,"``` $ github-to-sqlite repos github.db $ github-to-sqlite starred github.db dazzag24 Traceback (most recent call last): File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/bin/github-to-sqlite"", line 10, in sys.exit(cli()) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/cli.py"", line 106, in starred utils.save_stars(db, user, stars) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 177, in save_stars user_id = save_user(db, user) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 61, in save_user return db[""users""].upsert(to_save, pk=""id"").last_pk File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 1067, in upsert extracts=extracts, File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 916, in insert extracts=extracts, File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 1024, in insert_all result = self.db.conn.execute(sql, values) sqlite3.OperationalError: table users has no column named bio ``` ``` $ pipenv graph github-to-sqlite==0.4 - requests [required: Any, installed: 2.22.0] - certifi [required: >=2017.4.17, installed: 2019.9.11] - chardet [required: >=3.0.2,<3.1.0, installed: 3.0.4] - idna [required: >=2.5,<2.9, installed: 2.8] - urllib3 [required: >=1.21.1,<1.26,!=1.25.1,!=1.25.0, installed: 1.25.6] - sqlite-utils [required: ~=1.11, installed: 1.11] - click [required: Any, installed: 7.0] - click-default-group [required: Any, installed: 1.2.2] - click [required: Any, installed: 7.0] - tabulate [required: Any, installed: 0.8.5] Python 3.6.8 ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 543355051,MDExOlB1bGxSZXF1ZXN0MzU3NjQwMTg2,6,don't break if source is missing,78035,mfa,closed,0,,,,,1,2019-12-29T10:46:47Z,2020-03-28T02:28:11Z,2020-03-28T02:28:11Z,CONTRIBUTOR,dogsheep/swarm-to-sqlite/pulls/6,broke for me. very old checkins in 2010 had no source set.,205429375,swarm-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 602575575,MDU6SXNzdWU2MDI1NzU1NzU=,6,Add progress bar to upload command,9599,simonw,closed,0,,,,,2,2020-04-18T23:32:41Z,2020-04-19T00:15:24Z,2020-04-19T00:15:24Z,MEMBER,,Upload was added in #4 ,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 689848827,MDU6SXNzdWU2ODk4NDg4Mjc=,6,ISO timestamps,9599,simonw,open,0,,,,,0,2020-09-01T06:16:42Z,2020-09-01T06:16:42Z,,MEMBER,,"The `time_added`, `time_updated` and `time_read` columns currently store data like this: September 19, 2019 - 00:30:30 UTC Should use ISO instead, e.g. `2020-07-26T01:05:24+00:00`",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 689850810,MDU6SXNzdWU2ODk4NTA4MTA=,6,Set up a demo instance,9599,simonw,open,0,,,,,0,2020-09-01T06:20:24Z,2020-09-01T06:20:24Z,,MEMBER,,"Once I've got the Datasette plugin to a state where it's worth building a demo: #3 I can use data from my public https://github-to-sqlite.dogsheep.net/ demo plus the Pocket data subset I use for the demo in https://github.com/dogsheep/pocket-to-sqlite/issues/5 - I could pull in the https://dogsheep-photos.dogsheep.net/ photos data too.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 718949182,MDU6SXNzdWU3MTg5NDkxODI=,6,Better handling of OCR data,9599,simonw,closed,0,,,,,2,2020-10-11T23:20:52Z,2020-10-12T00:04:10Z,2020-10-12T00:04:10Z,MEMBER,,"> I haven't done the FTS on OCR yet. I'm going to move that to another ticket because it requires more thought. _Originally posted by @simonw in https://github.com/dogsheep/evernote-to-sqlite/issues/4#issuecomment-706784028_",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 821841046,MDU6SXNzdWU4MjE4NDEwNDY=,6,Upgrade to latest sqlite-utils,9599,simonw,open,0,,,,,1,2021-03-04T07:21:54Z,2021-03-04T07:22:51Z,,MEMBER,,This is pinned to v1 at the moment.,206649770,google-takeout-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 842765105,MDExOlB1bGxSZXF1ZXN0NjAyMjYxMDky,6,Add testres-db tool,1151557,ligurio,closed,0,,,,,1,2021-03-28T15:43:23Z,2022-02-16T05:12:05Z,2022-02-16T05:12:05Z,NONE,dogsheep/dogsheep.github.io/pulls/6,,214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1617602868,I_kwDOJHON9s5gaqk0,6,Character encoding problem,9599,simonw,open,0,,,,,2,2023-03-09T16:44:34Z,2023-04-14T15:22:09Z,,MEMBER,,"I ran against a recent note with this in it: > Or just ""Actions ⚙️ "" And got back: > `Actions ‚öôÔ∏è` Pasting that into https://ftfy.vercel.app/?s=Actions+%E2%80%9A%C3%B6%C3%B4%C3%94%E2%88%8F%C3%A8+ gives this: ```python s = 'Actions â\x80\x9aöôÃ\x94â\x88\x8fè' s = s.encode('latin-1') s = s.decode('utf-8') s = s.encode('macroman') s = s.decode('utf-8') print(s) ``` ",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1641117021,PR_kwDODtX3eM5M66op,6,Add permalink virtual field to items table,1231935,xavdid,open,0,,,,,1,2023-03-26T22:22:38Z,2023-03-29T18:38:52Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/hacker-news-to-sqlite/pulls/6,"I added a virtual column (no storage overhead) to the output that easily links back to the source. It works nicely out of the box with datasette: ![](https://cdn.zappy.app/faf43661d539ee0fee02c0421de22d65.png) I got bit a bit by https://github.com/simonw/sqlite-utils/issues/411, so I went with a manual `table_xinfo` and creating the table via execute. Happy to adjust if that issue moves, but this seems like it works. I also added my best-guess instructions for local development on this package. I'm shooting in the dark, so feel free to replace with how you work on it locally.",248903544,hacker-news-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 267516650,MDU6SXNzdWUyNjc1MTY2NTA=,7,Framework where by every page is JSON plus a template,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-23T01:22:03Z,2017-10-24T02:27:25Z,2017-10-24T02:27:25Z,OWNER,,"Every single page of my interface should be implemented as a function that returns JSON. I can then build my jinja templates on top of the exact data that would be returned by the API version.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 403625674,MDU6SXNzdWU0MDM2MjU2NzQ=,7,.insert_all() should accept a generator and process it efficiently,9599,simonw,closed,0,,,,,3,2019-01-28T02:11:58Z,2019-01-28T06:26:53Z,2019-01-28T06:26:53Z,OWNER,,"Right now you have to load every record into memory before passing the list to `.insert_all()` and friends. If you want to process millions of rows, this is inefficient. Python has generators - we should use them! The only catch here is that part of the magic of `sqlite-utils` is that it guesses the column types and creates the table for you. This code will need to be updated to notice if the table needs creating and, if it does, create it using the first X (where x=1,000 but can be customized) records. If a record outside of those first 1,000 has a rogue column, we can crash with an error. This will free us up to make the `--nl` option added in #6 much more efficient.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 472097220,MDU6SXNzdWU0NzIwOTcyMjA=,7,Script uses a lot of RAM,9599,simonw,closed,0,,,,,3,2019-07-24T06:11:11Z,2019-07-24T06:35:52Z,2019-07-24T06:35:52Z,MEMBER,,"I'm using an XML pull parser which should avoid the need to slurp the whole XML file into memory, but it's not working - the script still uses over 1GB of RAM when it runs according to Activity Monitor. I think this is because I'm still causing the full root element to be incrementally loaded into memory just in case I try and access it later. http://effbot.org/elementtree/iterparse.htm says I should use `elem.clear()` as I go. It also says: > The above pattern has one drawback; it does not clear the root element, so you will end up with a single element with lots of empty child elements. If your files are huge, rather than just large, this might be a problem. To work around this, you need to get your hands on the root element. So I will try that recipe and see if it helps.",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 490798130,MDU6SXNzdWU0OTA3OTgxMzA=,7,users-lookup command for fetching users,9599,simonw,closed,0,,,,,0,2019-09-08T19:47:59Z,2019-09-08T20:32:13Z,2019-09-08T20:32:13Z,MEMBER,,"https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-users-lookup ``` https://api.twitter.com/1.1/users/lookup.json?user_id=783214,6253282 https://api.twitter.com/1.1/users/lookup.json?screen_name=simonw,cleopaws ``` CLI design: ``` $ twitter-to-sqlite users-lookup simonw cleopaws $ twitter-to-sqlite users-lookup 783214 6253282 --ids ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506276893,MDU6SXNzdWU1MDYyNzY4OTM=,7,issue-comments command for importing issue comments,9599,simonw,closed,0,,,,,1,2019-10-13T05:23:58Z,2019-10-14T14:44:12Z,2019-10-13T05:24:30Z,MEMBER,,Using this API: https://developer.github.com/v3/issues/comments/,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 589491711,MDU6SXNzdWU1ODk0OTE3MTE=,7,Upgrade to sqlite-utils 2.x,9599,simonw,closed,0,,,,,0,2020-03-28T02:24:51Z,2020-03-28T02:25:03Z,2020-03-28T02:25:03Z,MEMBER,,,205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602585497,MDU6SXNzdWU2MDI1ODU0OTc=,7,Integrate image content hashing,9599,simonw,open,0,,,,,2,2020-04-19T00:36:58Z,2021-08-26T02:01:01Z,,MEMBER,,To spot duplicate images (where the file content differs such that the sha256 is no longer a match) it would be useful to calculate and store perceptual hashes of some sort.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/7/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",, 691265198,MDU6SXNzdWU2OTEyNjUxOTg=,7,"Mechanism for differentiating between ""by me"" and ""liked by me""",9599,simonw,closed,0,,,,,6,2020-09-02T17:44:37Z,2020-09-02T21:06:28Z,2020-09-02T21:06:28Z,MEMBER,,"Some of the content I'm indexing is by me - photos I've taken, tweets I wrote, commits, comments I posted. Some of it is stuff that I've ""liked"" or ""bookmarked"" in some way - favourited tweets, Pocket articles, starred GitHub repos. It woud be useful to be able to differentiate between the two.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 743297582,MDU6SXNzdWU3NDMyOTc1ODI=,7,evernote-to-sqlite on windows 10 give this error: TypeError: insert() got an unexpected keyword argument 'replace',42387931,martinvanwieringen,closed,0,,,,,1,2020-11-15T16:57:28Z,2021-02-11T22:13:17Z,2021-02-11T22:13:17Z,NONE,,"running evernote-to-sqlite 0.2 on windows 10. Command: evernote-to-sqlite enex evernote.db MyNotes.enex I get the followinng error: File ""C:\Users\marti\AppData\Roaming\Python\Python38\site-packages\evernote_to_sqlite\utils.py"", line 46, in save_note note_id = db[""notes""].insert(row, hash_id=""id"", replace=True, alter=True).last_pk TypeError: insert() got an unexpected keyword argument 'replace' Removing replace=True, Leads to below error: note_id = db[""notes""].insert(row, hash_id=""id"", alter=True).last_pk File ""C:\Users\marti\AppData\Roaming\Python\Python38\site-packages\sqlite_utils\db.py"", line 924, in insert return self.insert_all( File ""C:\Users\marti\AppData\Roaming\Python\Python38\site-packages\sqlite_utils\db.py"", line 1046, in insert_all result = self.db.conn.execute(sql, values) sqlite3.IntegrityError: UNIQUE constraint failed: notes.id",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 750141615,MDExOlB1bGxSZXF1ZXN0NTI2ODQ3ODIz,7,Fixed conflicting CLI flags,8944,tlockney,closed,0,,,,,1,2020-11-24T23:25:12Z,2022-08-21T21:11:56Z,2022-08-21T21:11:56Z,CONTRIBUTOR,dogsheep/pocket-to-sqlite/pulls/7,"The `-a` used for the auth credentials and the shortened form of the `--all` flags were in conflict on the `fetch` command. To be consistent with other `-to-sqlite` libraries in the Dogsheep ecosystem, I removed the shortened form of the `--all` flag.",213286752,pocket-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 925384329,MDExOlB1bGxSZXF1ZXN0NjczODcyOTc0,7,Add instagram-to-sqlite,36654812,gavindsouza,open,0,,,,,0,2021-06-19T12:26:16Z,2021-07-28T07:58:59Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/7,"The tool covers only chat imports at the time of opening this PR but I'm planning to import everything else that I feel inquisitive about ref: https://github.com/gavindsouza/instagram-to-sqlite",214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 930946817,MDU6SXNzdWU5MzA5NDY4MTc=,7,KeyError: 'accuracy' when processing Location History,403152,davidwilemski,open,0,,,,,0,2021-06-27T14:39:43Z,2021-06-27T14:39:43Z,,NONE,,"I'm new to both the dogsheep tools and datasette but have been experimenting a bit the last few days and these are really cool tools! I encountered a problem running my Google location history through this tool running the latest release in a docker container: ``` Traceback (most recent call last): File ""/usr/local/bin/google-takeout-to-sqlite"", line 8, in sys.exit(cli()) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/google_takeout_to_sqlite/cli.py"", line 49, in my_activity utils.save_location_history(db, zf) File ""/usr/local/lib/python3.9/site-packages/google_takeout_to_sqlite/utils.py"", line 27, in save_location_history db[""location_history""].upsert_all( File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1105, in upsert_all return self.insert_all( File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 990, in insert_all chunk = list(chunk) File ""/usr/local/lib/python3.9/site-packages/google_takeout_to_sqlite/utils.py"", line 33, in ""accuracy"": row[""accuracy""], KeyError: 'accuracy' ``` It looks like the tool assumes the `accuracy` key will be in every location history entry. My first attempt at a local patch to get myself going was to convert accessing the `accuracy` key to a `.get` instead to hopefully make the row nullable but I wasn't quite sure what `sqlite_utils` would do there. That did work in that the import happened and so I was going to propose a patch that made that change but in updating the existing test to include an entry with a missing accuracy entry, I noticed the expected type of the field appeared to be changing to a string in the test (and from a quick scan through the sqlite_utils code, probably TEXT in the database). Given this change in column type, it seemed that opening an issue first before proposing a fix seemed warranted. It seems the schema would need to be explicitly specified if you wanted a nullable integer column. Now that I've done a successful import run using my initial fix of calling `.get` on the row dict, I can see with datasette that I only have 7 data points (out of ~250k) that have a null accuracy column. They are all from 2011-2012 in an import that includes points spanning ~2010-2016 so perhaps another approach might be to filter those entries out during import if it really is that infrequent? I'm happy to provide a PR for a fix but figured I'd ask about which direction is preferred first.",206649770,google-takeout-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1617769847,I_kwDOJHON9s5gbTV3,7,Folder support,9599,simonw,closed,0,,,,,6,2023-03-09T18:21:33Z,2023-03-09T20:48:18Z,2023-03-09T20:48:18Z,MEMBER,,Notes can live in folders. These relationships should be exported too.,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267517314,MDU6SXNzdWUyNjc1MTczMTQ=,8,Attempting an INSERT or UPDATE should return a sane error message,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-23T01:28:25Z,2017-10-23T15:28:12Z,2017-10-23T15:28:08Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 403922644,MDU6SXNzdWU0MDM5MjI2NDQ=,8,Problems handling column names containing spaces or - ,82988,psychemedia,closed,0,,,,,3,2019-01-28T17:23:28Z,2019-04-14T15:29:33Z,2019-02-23T21:09:03Z,NONE,,"Irrrespective of whether using column names containing a space or - character is good practice, SQLite does allow it, but `sqlite-utils` throws an error in the following cases: ```python from sqlite_utils import Database dbname = 'test.db' DB = Database(sqlite3.connect(dbname)) import pandas as pd df = pd.DataFrame({'col1':range(3), 'col2':range(3)}) #Convert pandas dataframe to appropriate list/dict format DB['test1'].insert_all( df.to_dict(orient='records') ) #Works fine ``` However: ```python df = pd.DataFrame({'col 1':range(3), 'col2':range(3)}) DB['test1'].insert_all(df.to_dict(orient='records')) ``` throws: ``` --------------------------------------------------------------------------- OperationalError Traceback (most recent call last) in () 1 import pandas as pd 2 df = pd.DataFrame({'col 1':range(3), 'col2':range(3)}) ----> 3 DB['test1'].insert_all(df.to_dict(orient='records')) /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order) 327 jsonify_if_needed(record.get(key, None)) for key in all_columns 328 ) --> 329 result = self.db.conn.execute(sql, values) 330 self.db.conn.commit() 331 self.last_id = result.lastrowid OperationalError: near ""1"": syntax error ``` and: ```python df = pd.DataFrame({'col-1':range(3), 'col2':range(3)}) DB['test1'].upsert_all(df.to_dict(orient='records')) ``` results in: ``` --------------------------------------------------------------------------- OperationalError Traceback (most recent call last) in () 1 import pandas as pd 2 df = pd.DataFrame({'col-1':range(3), 'col2':range(3)}) ----> 3 DB['test1'].insert_all(df.to_dict(orient='records')) /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order) 327 jsonify_if_needed(record.get(key, None)) for key in all_columns 328 ) --> 329 result = self.db.conn.execute(sql, values) 330 self.db.conn.commit() 331 self.last_id = result.lastrowid OperationalError: near ""-"": syntax error ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 472104705,MDExOlB1bGxSZXF1ZXN0MzAwNTgwMjIx,8,Use less RAM,9599,simonw,closed,0,,,,,0,2019-07-24T06:35:01Z,2019-07-24T06:35:52Z,2019-07-24T06:35:52Z,MEMBER,dogsheep/healthkit-to-sqlite/pulls/8,Closes #7,197882382,healthkit-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 490803176,MDU6SXNzdWU0OTA4MDMxNzY=,8,--sql and --attach options for feeding commands from SQL queries,9599,simonw,closed,0,,,,,4,2019-09-08T20:35:49Z,2020-03-20T23:13:01Z,2020-03-20T23:13:01Z,MEMBER,,"Say you want to fetch Twitter profiles for a list of accounts that are stored in another database: $ twitter-to-sqlite users-lookup users.db --attach attending.db \ --sql ""select Twitter from attending.attendes where Twitter is not null"" The SQL query you feed in is expected to return a list of screen names suitable for processing further by the command. Should be supported by all three of: - [x] `twitter-to-sqlite users-lookup` - [x] `twitter-to-sqlite user-timeline` - [x] `twitter-to-sqlite followers` and `friends` The `--attach` option allows other SQLite databases to be attached to the connection. Without it the SQL query will have to read from the single attached database.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516763727,MDExOlB1bGxSZXF1ZXN0MzM1OTgwMjQ2,8,"stargazers command, refs #4",9599,simonw,closed,0,,,,,5,2019-11-03T00:37:36Z,2020-05-02T20:00:27Z,2020-05-02T20:00:26Z,MEMBER,dogsheep/github-to-sqlite/pulls/8,Needs tests. Refs #4.,207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 605147638,MDU6SXNzdWU2MDUxNDc2Mzg=,8,Should I have used MD5 instead of SHA256?,9599,simonw,closed,0,,,,,2,2020-04-23T00:02:08Z,2020-04-23T00:03:35Z,2020-04-23T00:03:35Z,MEMBER,,"https://docs.aws.amazon.com/AmazonS3/latest/API/RESTCommonResponseHeaders.html > Objects created by the PUT Object, POST Object, or Copy operation, or through the AWS Management Console, and are encrypted by SSE-S3 or plaintext, have ETags that are an MD5 digest of their object data. ",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 648245071,MDU6SXNzdWU2NDgyNDUwNzE=,8,Error thrown: table photos has no column named hasSticker,18504,harperreed,closed,0,,,,,2,2020-06-30T14:54:37Z,2020-10-12T20:35:06Z,2020-10-12T20:25:24Z,NONE,,"While running `swarm-to-sqlite` it throws an error: harper@:~/dogsheep/swarm$ swarm-to-sqlite checkins.db --save=checkins.json Please provide your Foursquare OAuth token: Importing 8127 checkins [#################-------------------] 49% 00:01:52 Traceback (most recent call last): File ""/home/harper/.local/bin/swarm-to-sqlite"", line 11, in sys.exit(cli()) File ""/home/harper/.local/lib/python3.6/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/home/harper/.local/lib/python3.6/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/home/harper/.local/lib/python3.6/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/harper/.local/lib/python3.6/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/home/harper/.local/lib/python3.6/site-packages/swarm_to_sqlite/cli.py"", line 73, in cli save_checkin(checkin, db) File ""/home/harper/.local/lib/python3.6/site-packages/swarm_to_sqlite/utils.py"", line 94, in save_checkin photos_table.insert(photo, replace=True) File ""/home/harper/.local/lib/python3.6/site-packages/sqlite_utils/db.py"", line 963, in insert alter = self.value_or_default(""alter"", alter) File ""/home/harper/.local/lib/python3.6/site-packages/sqlite_utils/db.py"", line 1142, in insert_all def upsert_all( sqlite3.OperationalError: table photos has no column named hasSticker Where should i dig in?",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 691369691,MDU6SXNzdWU2OTEzNjk2OTE=,8,Create a view for running faceted searches,9599,simonw,closed,0,,,,,1,2020-09-02T19:44:07Z,2020-09-02T19:50:47Z,2020-09-02T19:50:47Z,MEMBER,,"```sql select search_index_fts.rank, search_index.rowid, search_index.[table], search_index.key, search_index.title, search_index.timestamp, search_index.search_1 from search_index join search_index_fts on search_index.rowid = search_index_fts.rowid order by search_index_fts.rank, search_index.timestamp desc ```",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 748370021,MDExOlB1bGxSZXF1ZXN0NTI1MzcxMDI5,8,"fix import error if note has no ""updated"" element",4028322,mkorosec,closed,0,,,,,0,2020-11-22T22:51:05Z,2021-02-11T22:34:06Z,2021-02-11T22:34:06Z,CONTRIBUTOR,dogsheep/evernote-to-sqlite/pulls/8,"I got the following error when executing evernote-to-sqlite enex evernote.db evernote.enex ``` ... File ""evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File ""evernote_to_sqlite/utils.py"", line 28, in save_note updated = note.find(""updated"").text AttributeError: 'NoneType' object has no attribute 'text' ``` Seems that in some cases the updated element is not added to the note, this is a part of the problematic note: ``` 20201019T074518Z web.clip7 webclipper.evernote ```",303218369,evernote-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 797728929,MDU6SXNzdWU3OTc3Mjg5Mjk=,8,QUESTION: extract full text,417363,darribas,open,0,,,,,0,2021-01-31T14:50:10Z,2021-01-31T14:50:10Z,,NONE,,"This may be solved or a feature already, but I couldn't figure it out, is it possible to extract and store also full text from the saved pages? The same way that Pocket parses the text, it'd be amazing to be able to store (and thus make searchable later) the text. Thank you very much for the project, it's such an amazing idea! ",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 927385540,MDU6SXNzdWU5MjczODU1NDA=,8,any guidance / experience on imessage-to-sqlite ?,2675621,Casyfill,open,0,,,,,0,2021-06-22T15:46:16Z,2021-06-22T15:46:16Z,,NONE,,,214746582,dogsheep.github.io,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 954546309,MDExOlB1bGxSZXF1ZXN0Njk4NDIzNjY3,8,Add Gmail takeout mbox import (v2),28565,maxhawkins,open,0,,,,,7,2021-07-28T07:05:32Z,2023-09-08T01:22:49Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/google-takeout-to-sqlite/pulls/8,"WIP This PR builds on #5 to continue implementing gmail import support. Building on @UtahDave's work, these commits add a few performance and bug fixes: * Decreased memory overhead for import by manually parsing mbox headers. * Fixed error where some messages in the mbox would yield a row with NULL in all columns. I will send more commits to fix any errors I encounter as I run the importer on my personal takeout data.",206649770,google-takeout-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1617823309,I_kwDOJHON9s5gbgZN,8,Increase performance using macnotesapp,41546558,RhetTbull,closed,0,,,,,1,2023-03-09T18:51:05Z,2023-03-14T22:00:22Z,2023-03-14T22:00:21Z,NONE,,"Neat project! You can probably increase performance using my python interface to Notes, [macnotesapp](https://github.com/RhetTbull/macnotesapp), which uses Scripting Bridge and bulk queries for much better performance than AppleScript. Another related project is [PyXA](https://github.com/SKaplanOfficial/PyXA) which uses Scripting Bridge to access Notes (and many other apps) and can return all the notes at once as opposed to calling AppleScript for each note. macnotesapp allows you to access multiple accounts and folders as well. ```python from macnotesapp import NotesApp # NotesApp() provides interface to Notes.app notesapp = NotesApp() # Get list of notes (Note objects for each note) notes = notesapp.notes() note = notes[0] print( note.id, note.account, note.folder, note.name, note.body, note.plaintext, note.password_protected, ) print(note.asdict()) ```",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267517348,MDU6SXNzdWUyNjc1MTczNDg=,9,Initial test suite,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-10-23T01:28:46Z,2017-10-24T05:55:33Z,2017-10-24T05:55:33Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 405801771,MDExOlB1bGxSZXF1ZXN0MjQ5NjgwOTQ0,9,:pencil: Updates my_database.py to my_database.db,50527,jefftriplett,closed,0,,,,,0,2019-02-01T17:35:43Z,2019-02-24T03:55:04Z,2019-02-24T03:55:04Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/9,I noticed that both `.py` and `.db` were used in the docs and assumed you'd prefer `.db`. ,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/9/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 472429048,MDU6SXNzdWU0NzI0MjkwNDg=,9,Too many SQL variables,166463,tholo,closed,0,,,,,4,2019-07-24T18:24:17Z,2019-07-26T10:01:05Z,2019-07-26T10:01:05Z,NONE,,"Decided to try importing my data, and ran into this: ``` Traceback (most recent call last): File ""/Users/tholo/Source/health/bin/healthkit-to-sqlite"", line 10, in sys.exit(cli()) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/cli.py"", line 50, in cli convert_xml_to_sqlite(fp, db, progress_callback=bar.update) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py"", line 41, in convert_xml_to_sqlite write_records(records, db) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py"", line 80, in write_records column_order=[""startDate"", ""endDate"", ""value"", ""unit""], File ""/Users/tholo/Source/health/lib/python3.7/site-packages/sqlite_utils/db.py"", line 911, in insert_all result = self.db.conn.execute(sql, values) sqlite3.OperationalError: too many SQL variables ``` Added some debug output in sqlite_utils/db.py, which resulted in: ``` INSERT INTO [rBodyMassIndex] ([creationDate], [endDate], [metadata_HKWasUserEntered], [metadata_Health Mate App Version], [metadata_Modified Date], [metadata_Withings Link], [metadata_Withings User Identifier], [sourceName], [sourceVersion], [startDate], [unit], [value]) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) ; ``` with the attached data: ``` ['2019-06-27 22:55:10 -0700', '2011-06-22 21:05:53 -0700', '0', '4.4.2', '2011-06-23 04:05:53 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308801953&type=1', '301293', 'Health Mate', '4040200', '2011-06-22 21:05:53 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-06-23 09:36:27 -0700', '0', '4.4.2', '2011-06-23 16:36:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308846987&type=1', '301293', 'Health Mate', '4040200', '2011-06-23 09:36:27 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-06-23 23:54:07 -0700', '0', '4.4.2', '2011-06-24 06:55:19 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308898447&type=1', '301293', 'Health Mate', '4040200', '2011-06-23 23:54:07 -0700', 'count', '30.679', '2019-06-27 22:55:10 -0700', '2011-06-24 09:13:40 -0700', '0', '4.4.2', '2011-06-24 16:14:35 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308932020&type=1', '301293', 'Health Mate', '4040200', '2011-06-24 09:13:40 -0700', 'count', '30.3549', '2019-06-27 22:55:10 -0700', '2011-06-25 08:30:08 -0700', '0', '4.4.2', '2011-06-25 15:30:49 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309015808&type=1', '301293', 'Health Mate', '4040200', '2011-06-25 08:30:08 -0700', 'count', '30.3395', '2019-06-27 22:55:10 -0700', '2011-06-26 07:47:51 -0700', '0', '4.4.2', '2011-06-26 14:48:27 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309099671&type=1', '301293', 'Health Mate', '4040200', '2011-06-26 07:47:51 -0700', 'count', '30.2315', '2019-06-27 22:55:10 -0700', '2011-06-28 08:48:26 -0700', '0', '4.4.2', '2011-06-28 15:49:13 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309276106&type=1', '301293', 'Health Mate', '4040200', '2011-06-28 08:48:26 -0700', 'count', '30.0617', '2019-06-27 22:55:10 -0700', '2011-06-29 09:21:16 -0700', '0', '4.4.2', '2011-06-29 16:21:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309364476&type=1', '301293', 'Health Mate', '4040200', '2011-06-29 09:21:16 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-06-30 08:41:46 -0700', '0', '4.4.2', '2011-06-30 15:42:30 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309448506&type=1', '301293', 'Health Mate', '4040200', '2011-06-30 08:41:46 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2011-07-01 09:05:28 -0700', '0', '4.4.2', '2011-07-01 16:06:24 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309536328&type=1', '301293', 'Health Mate', '4040200', '2011-07-01 09:05:28 -0700', 'count', '29.8611', '2019-06-27 22:55:10 -0700', '2011-07-02 08:58:50 -0700', '0', '4.4.2', '2011-07-02 15:59:40 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309622330&type=1', '301293', 'Health Mate', '4040200', '2011-07-02 08:58:50 -0700', 'count', '29.8765', '2019-06-27 22:55:10 -0700', '2011-07-04 09:33:43 -0700', '0', '4.4.2', '2011-07-04 16:34:19 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309797223&type=1', '301293', 'Health Mate', '4040200', '2011-07-04 09:33:43 -0700', 'count', '30.0309', '2019-06-27 22:55:10 -0700', '2011-07-06 09:40:23 -0700', '0', '4.4.2', '2011-07-06 16:41:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309970423&type=1', '301293', 'Health Mate', '4040200', '2011-07-06 09:40:23 -0700', 'count', '30.1852', '2019-06-27 22:55:10 -0700', '2011-07-08 08:08:48 -0700', '0', '4.4.2', '2011-07-08 15:09:51 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310137728&type=1', '301293', 'Health Mate', '4040200', '2011-07-08 08:08:48 -0700', 'count', '30.0309', '2019-06-27 22:55:10 -0700', '2011-07-09 08:31:05 -0700', '0', '4.4.2', '2011-07-09 15:31:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310225465&type=1', '301293', 'Health Mate', '4040200', '2011-07-09 08:31:05 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-07-10 08:14:36 -0700', '0', '4.4.2', '2011-07-10 15:15:12 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310310876&type=1', '301293', 'Health Mate', '4040200', '2011-07-10 08:14:36 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-07-12 07:55:21 -0700', '0', '4.4.2', '2011-07-12 14:55:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310482521&type=1', '301293', 'Health Mate', '4040200', '2011-07-12 07:55:21 -0700', 'count', '30.108', '2019-06-27 22:55:10 -0700', '2011-07-13 08:48:05 -0700', '0', '4.4.2', '2011-07-13 15:48:42 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310572085&type=1', '301293', 'Health Mate', '4040200', '2011-07-13 08:48:05 -0700', 'count', '30', '2019-06-27 22:55:10 -0700', '2011-07-14 09:05:16 -0700', '0', '4.4.2', '2011-07-14 16:05:57 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310659516&type=1', '301293', 'Health Mate', '4040200', '2011-07-14 09:05:16 -0700', 'count', '29.9074', '2019-06-27 22:55:10 -0700', '2011-07-15 07:09:56 -0700', '0', '4.4.2', '2011-07-15 14:10:35 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310738996&type=1', '301293', 'Health Mate', '4040200', '2011-07-15 07:09:56 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-07-16 09:26:04 -0700', '0', '4.4.2', '2011-07-16 16:26:44 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310833564&type=1', '301293', 'Health Mate', '4040200', '2011-07-16 09:26:04 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-07-17 09:52:59 -0700', '0', '4.4.2', '2011-07-17 16:53:38 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310921579&type=1', '301293', 'Health Mate', '4040200', '2011-07-17 09:52:59 -0700', 'count', '29.8765', '2019-06-27 22:55:10 -0700', '2011-07-19 08:56:16 -0700', '0', '4.4.2', '2011-07-19 15:57:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311090976&type=1', '301293', 'Health Mate', '4040200', '2011-07-19 08:56:16 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-07-21 08:21:20 -0700', '0', '4.4.2', '2011-07-21 15:22:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311261680&type=1', '301293', 'Health Mate', '4040200', '2011-07-21 08:21:20 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-07-23 08:49:56 -0700', '0', '4.4.2', '2011-07-23 15:50:40 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311436196&type=1', '301293', 'Health Mate', '4040200', '2011-07-23 08:49:56 -0700', 'count', '29.7222', '2019-06-27 22:55:10 -0700', '2011-07-24 09:17:35 -0700', '0', '4.4.2', '2011-07-24 16:18:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311524255&type=1', '301293', 'Health Mate', '4040200', '2011-07-24 09:17:35 -0700', 'count', '29.5833', '2019-06-27 22:55:10 -0700', '2011-07-25 07:51:55 -0700', '0', '4.4.2', '2011-07-25 14:52:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311605515&type=1', '301293', 'Health Mate', '4040200', '2011-07-25 07:51:55 -0700', 'count', '29.5525', '2019-06-27 22:55:10 -0700', '2011-08-06 10:04:05 -0700', '0', '4.4.2', '2011-08-06 17:04:47 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312650245&type=1', '301293', 'Health Mate', '4040200', '2011-08-06 10:04:05 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-08-08 07:52:22 -0700', '0', '4.4.2', '2011-08-08 14:53:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312815142&type=1', '301293', 'Health Mate', '4040200', '2011-08-08 07:52:22 -0700', 'count', '29.6605', '2019-06-27 22:55:10 -0700', '2011-08-10 07:57:30 -0700', '0', '4.4.2', '2011-08-10 14:58:12 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312988250&type=1', '301293', 'Health Mate', '4040200', '2011-08-10 07:57:30 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-08-12 07:51:14 -0700', '0', '4.4.2', '2011-08-12 14:51:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1313160674&type=1', '301293', 'Health Mate', '4040200', '2011-08-12 07:51:14 -0700', 'count', '29.6914', '2019-06-27 22:55:10 -0700', '2011-08-13 07:45:28 -0700', '0', '4.4.2', '2011-08-13 14:46:08 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1313246728&type=1', '301293', 'Health Mate', '4040200', '2011-08-13 07:45:28 -0700', 'count', '29.5833', '2019-06-27 22:55:10 -0700', '2011-08-17 09:06:20 -0700', '0', '4.4.2', '2011-08-17 16:07:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1313597180&type=1', '301293', 'Health Mate', '4040200', '2011-08-17 09:06:20 -0700', 'count', '29.5679', '2019-06-27 22:55:10 -0700', '2011-08-22 08:28:08 -0700', '0', '4.4.2', '2011-08-22 15:28:57 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314026888&type=1', '301293', 'Health Mate', '4040200', '2011-08-22 08:28:08 -0700', 'count', '29.9846', '2019-06-27 22:55:10 -0700', '2011-08-25 08:59:30 -0700', '0', '4.4.2', '2011-08-25 16:00:15 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314287970&type=1', '301293', 'Health Mate', '4040200', '2011-08-25 08:59:30 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-08-30 08:13:59 -0700', '0', '4.4.2', '2011-08-30 15:46:08 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314717239&type=1', '301293', 'Health Mate', '4040200', '2011-08-30 08:13:59 -0700', 'count', '29.784', '2019-06-27 22:55:10 -0700', '2011-09-12 08:47:51 -0700', '0', '4.4.2', '2011-09-12 15:48:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1315842471&type=1', '301293', 'Health Mate', '4040200', '2011-09-12 08:47:51 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-09-13 09:17:27 -0700', '0', '4.4.2', '2011-09-13 16:48:30 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1315930647&type=1', '301293', 'Health Mate', '4040200', '2011-09-13 09:17:27 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-10-01 09:12:20 -0700', '0', '4.4.2', '2011-10-01 16:13:00 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1317485540&type=1', '301293', 'Health Mate', '4040200', '2011-10-01 09:12:20 -0700', 'count', '29.8148', '2019-06-27 22:55:10 -0700', '2011-10-11 11:14:11 -0700', '0', '4.4.2', '2011-10-11 18:15:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1318356851&type=1', '301293', 'Health Mate', '4040200', '2011-10-11 11:14:11 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-10-16 09:29:47 -0700', '0', '4.4.2', '2011-10-16 16:30:39 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1318782587&type=1', '301293', 'Health Mate', '4040200', '2011-10-16 09:29:47 -0700', 'count', '29.6914', '2019-06-27 22:55:10 -0700', '2011-10-19 09:21:44 -0700', '0', '4.4.2', '2011-10-19 16:22:25 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1319041304&type=1', '301293', 'Health Mate', '4040200', '2011-10-19 09:21:44 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-10-24 07:04:22 -0700', '0', '4.4.2', '2011-10-24 14:05:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1319465062&type=1', '301293', 'Health Mate', '4040200', '2011-10-24 07:04:22 -0700', 'count', '29.5988', '2019-06-27 22:55:10 -0700', '2011-11-07 09:33:17 -0700', '0', '4.4.2', '2011-11-07 16:33:58 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1320683597&type=1', '301293', 'Health Mate', '4040200', '2011-11-07 09:33:17 -0700', 'count', '29.8611', '2019-06-27 22:55:10 -0700', '2011-11-10 07:59:03 -0700', '0', '4.4.2', '2011-11-10 14:59:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1320937143&type=1', '301293', 'Health Mate', '4040200', '2011-11-10 07:59:03 -0700', 'count', '29.9383', '2019-06-27 22:55:10 -0700', '2011-11-13 09:28:31 -0700', '0', '4.4.2', '2011-11-13 16:29:20 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1321201711&type=1', '301293', 'Health Mate', '4040200', '2011-11-13 09:28:31 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-11-21 08:45:06 -0700', '0', '4.4.2', '2011-11-21 15:46:04 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1321890306&type=1', '301293', 'Health Mate', '4040200', '2011-11-21 08:45:06 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-11-23 09:55:44 -0700', '0', '4.4.2', '2011-11-23 16:56:18 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322067344&type=1', '301293', 'Health Mate', '4040200', '2011-11-23 09:55:44 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2011-11-29 09:50:44 -0700', '0', '4.4.2', '2011-11-29 16:51:31 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322585444&type=1', '301293', 'Health Mate', '4040200', '2011-11-29 09:50:44 -0700', 'count', '30.1698', '2019-06-27 22:55:10 -0700', '2011-11-30 11:13:21 -0700', '0', '4.4.2', '2011-11-30 18:14:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322676801&type=1', '301293', 'Health Mate', '4040200', '2011-11-30 11:13:21 -0700', 'count', '30.0617', '2019-06-27 22:55:10 -0700', '2011-12-04 10:24:36 -0700', '0', '4.4.2', '2011-12-04 17:25:24 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1323019476&type=1', '301293', 'Health Mate', '4040200', '2011-12-04 10:24:36 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-12-10 09:22:18 -0700', '0', '4.4.2', '2011-12-10 16:23:07 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1323534138&type=1', '301293', 'Health Mate', '4040200', '2011-12-10 09:22:18 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-12-26 10:36:42 -0700', '0', '4.4.2', '2011-12-26 17:37:31 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1324921002&type=1', '301293', 'Health Mate', '4040200', '2011-12-26 10:36:42 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2012-01-11 11:24:13 -0700', '0', '4.4.2', '2012-01-11 18:25:04 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326306253&type=1', '301293', 'Health Mate', '4040200', '2012-01-11 11:24:13 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2012-01-15 10:17:09 -0700', '0', '4.4.2', '2012-01-15 17:17:51 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326647829&type=1', '301293', 'Health Mate', '4040200', '2012-01-15 10:17:09 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2012-01-19 09:24:32 -0700', '0', '4.4.2', '2012-01-19 16:25:21 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326990272&type=1', '301293', 'Health Mate', '4040200', '2012-01-19 09:24:32 -0700', 'count', '29.7994', '2019-06-27 22:55:10 -0700', '2012-01-29 10:26:13 -0700', '0', '4.4.2', '2012-01-29 17:26:52 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1327857973&type=1', '301293', 'Health Mate', '4040200', '2012-01-29 10:26:13 -0700', 'count', '30.0154', '2019-06-27 22:55:10 -0700', '2012-02-03 10:13:28 -0700', '0', '4.4.2', '2012-02-03 17:15:01 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1328289208&type=1', '301293', 'Health Mate', '4040200', '2012-02-03 10:13:28 -0700', 'count', '29.8457', '2019-06-27 22:55:10 -0700', '2012-02-12 09:23:01 -0700', '0', '4.4.2', '2012-02-12 16:23:53 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1329063781&type=1', '301293', 'Health Mate', '4040200', '2012-02-12 09:23:01 -0700', 'count', '30.1235', '2019-06-27 22:55:10 -0700', '2012-03-03 09:26:06 -0700', '0', '4.4.2', '2012-03-03 16:26:54 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1330791966&type=1', '301293', 'Health Mate', '4040200', '2012-03-03 09:26:06 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2012-03-11 11:23:15 -0700', '0', '4.4.2', '2012-03-11 18:24:16 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1331490195&type=1', '301293', 'Health Mate', '4040200', '2012-03-11 11:23:15 -0700', 'count', '30.2161', '2019-06-27 22:55:10 -0700', '2012-03-16 09:39:36 -0700', '0', '4.4.2', '2012-03-16 16:40:20 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1331915976&type=1', '301293', 'Health Mate', '4040200', '2012-03-16 09:39:36 -0700', 'count', '30.2778', '2019-06-27 22:55:10 -0700', '2012-03-21 08:33:07 -0700', '0', '4.4.2', '2012-03-21 15:34:00 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1332343987&type=1', '301293', 'Health Mate', '4040200', '2012-03-21 08:33:07 -0700', 'count', '30.1389', '2019-06-27 22:55:10 -0700', '2012-04-11 08:49:34 -0700', '0', '4.4.2', '2012-04-11 15:50:18 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1334159374&type=1', '301293', 'Health Mate', '4040200', '2012-04-11 08:49:34 -0700', 'count', '30.0154', '2019-06-27 22:55:10 -0700', '2012-04-13 08:32:06 -0700', '0', '4.4.2', '2012-04-13 15:32:49 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1334331126&type=1', '301293', 'Health Mate', '4040200', '2012-04-13 08:32:06 -0700', 'count', '29.9383', '2019-06-27 22:55:10 -0700', '2012-04-20 08:21:38 -0700', '0', '4.4.2', '2012-04-20 15:52:45 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1334935298&type=1', '301293', 'Health Mate', '4040200', '2012-04-20 08:21:38 -0700', 'count', '30.2006', '2019-06-27 22:55:10 -0700', '2012-04-25 09:00:01 -0700', '0', '4.4.2', '2012-04-25 16:00:42 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1335369601&type=1', '301293', 'Health Mate', '4040200', '2012-04-25 09:00:01 -0700', 'count', '30.2006', '2019-06-27 22:55:10 -0700', '2012-05-04 11:10:18 -0700', '0', '4.4.2', '2012-05-04 18:10:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1336155018&type=1', '301293', 'Health Mate', '4040200', '2012-05-04 11:10:18 -0700', 'count', '30.4321', '2019-06-27 22:55:10 -0700', '2012-05-12 09:35:00 -0700', '0', '4.4.2', '2012-05-12 16:35:43 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1336840500&type=1', '301293', 'Health Mate', '4040200', '2012-05-12 09:35:00 -0700', 'count', '30.1235', '2019-06-27 22:55:10 -0700', '2012-05-22 09:27:53 -0700', '0', '4.4.2', '2012-05-22 16:28:37 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1337704073&type=1', '301293', 'Health Mate', '4040200', '2012-05-22 09:27:53 -0700', 'count', '30.4167', '2019-06-27 22:55:10 -0700', '2012-05-31 09:23:16 -0700', '0', '4.4.2', '2012-05-31 16:24:04 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1338481396&type=1', '301293', 'Health Mate', '4040200', '2012-05-31 09:23:16 -0700', 'count', '30.2006', '2019-06-27 22:55:10 -0700', '2012-06-08 09:29:07 -0700', '0', '4.4.2', '2012-06-08 16:29:52 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1339172947&type=1', '301293', 'Health Mate', '4040200', '2012-06-08 09:29:07 -0700', 'count', '30.5247', '2019-06-27 22:55:10 -0700', '2012-06-21 08:07:33 -0700', '0', '4.4.2', '2012-06-21 15:08:20 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1340291253&type=1', '301293', 'Health Mate', '4040200', '2012-06-21 08:07:33 -0700', 'count', '30.5864', '2019-06-27 22:55:10 -0700', '2012-08-08 10:02:22 -0700', '0', '4.4.2', '2012-08-08 17:03:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1344445342&type=1', '301293', 'Health Mate', '4040200', '2012-08-08 10:02:22 -0700', 'count', '30.6636', '2019-06-27 22:55:10 -0700', '2012-08-17 09:11:32 -0700', '0', '4.4.2', '2012-08-17 16:42:05 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1345219892&type=1', '301293', 'Health Mate', '4040200', '2012-08-17 09:11:32 -0700', 'count', '30.8796', '2019-06-27 22:55:10 -0700', '2012-09-10 08:27:21 -0700', '0', '4.4.2', '2012-09-10 15:28:07 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1347290841&type=1', '301293', 'Health Mate', '4040200', '2012-09-10 08:27:21 -0700', 'count', '31.034', '2019-06-27 22:55:10 -0700', '2012-09-17 08:35:33 -0700', '0', '4.4.2', '2012-09-17 15:35:33 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1347896133&type=1', '301293', 'Health Mate', '4040200', '2012-09-17 08:35:33 -0700', 'count', '30.7099', '2019-06-27 22:55:10 -0700', '2012-09-26 08:59:46 -0700', '0', '4.4.2', '2012-09-26 16:13:18 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1348675186&type=1', '301293', 'Health Mate', '4040200', '2012-09-26 08:59:46 -0700', 'count', '30.679', '2019-06-27 22:55:10 -0700', '2012-10-18 08:51:16 -0700', '0', '4.4.2', '2012-10-18 15:51:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1350575476&type=1', '301293', 'Health Mate', '4040200', '2012-10-18 08:51:16 -0700', 'count', '30.7716', '2019-06-27 22:55:10 -0700', '2012-11-15 08:54:57 -0700', '0', '4.4.2', '2012-11-15 15:55:58 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1352994897&type=1', '301293', 'Health Mate', '4040200', '2012-11-15 08:54:57 -0700', 'count', '31.0802', '2019-06-27 22:55:10 -0700', '2012-12-17 09:13:40 -0700', '0', '4.4.2', '2012-12-17 16:20:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1355760820&type=1', '301293', 'Health Mate', '4040200', '2012-12-17 09:13:40 -0700', 'count', '29.784', '2019-06-27 22:55:10 -0700', '2012-12-19 11:09:55 -0700', '0', '4.4.2', '2012-12-19 18:10:37 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1355940595&type=1', '301293', 'Health Mate', '4040200', '2012-12-19 11:09:55 -0700', 'count', '29.6914', '2019-06-27 22:55:10 -0700', '2012-12-25 10:37:41 -0700', '0', '4.4.2', '2012-12-25 17:38:25 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1356457061&type=1', '301293', 'Health Mate', '4040200', '2012-12-25 10:37:41 -0700', 'count', '29.8765', '2019-06-27 22:55:10 -0700', '2013-01-01 10:44:02 -0700', '0', '4.4.2', '2013-01-01 17:44:46 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1357062242&type=1', '301293', 'Health Mate', '4040200', '2013-01-01 10:44:02 -0700', 'count', '30.0772', '2019-06-27 22:55:10 -0700', '2013-01-15 09:10:46 -0700', '0', '4.4.2', '2013-01-15 16:11:28 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1358266246&type=1', '301293', 'Health Mate', '4040200', '2013-01-15 09:10:46 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2013-01-20 11:03:39 -0700', '0', '4.4.2', '2013-01-20 18:04:22 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1358705019&type=1', '301293', 'Health Mate', '4040200', '2013-01-20 11:03:39 -0700', 'count', '30.108', '2019-06-27 22:55:10 -0700', '2013-01-30 08:56:30 -0700', '0', '4.4.2', '2013-01-30 15:57:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1359561390&type=1', '301293', 'Health Mate', '4040200', '2013-01-30 08:56:30 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2013-02-04 11:02:35 -0700', '0', '4.4.2', '2013-02-04 18:03:25 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1360000955&type=1', '301293', 'Health Mate', '4040200', '2013-02-04 11:02:35 -0700', 'count', '29.8148', '2019-06-27 22:55:10 -0700', '2013-02-07 09:07:06 -0700', '0', '4.4.2', '2013-02-07 16:07:49 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1360253226&type=1', '301293', 'Health Mate', '4040200', '2013-02-07 09:07:06 -0700', 'count', '30.1389', '2019-06-27 22:55:10 -0700', '2013-02-19 08:49:57 -0700', '0', '4.4.2', '2013-02-19 15:50:39 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1361288997&type=1', '301293', 'Health Mate', '4040200', '2013-02-19 08:49:57 -0700', 'count', '30.1235', '2019-06-27 22:55:10 -0700', '2013-03-02 11:20:54 -0700', '0', '4.4.2', '2013-03-02 18:21:38 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1362248454&type=1', '301293', 'Health Mate', '4040200', '2013-03-02 11:20:54 -0700', 'count', '30', '2019-06-27 22:55:10 -0700', '2013-04-23 08:05:30 -0700', '0', '4.4.2', '2013-04-23 15:06:59 +0000', 'withings-bd2://timeline/measure?user """""" id=301293&date=1366729530&type=1', '301293', 'Health Mate', '4040200', '2013-04-23 08:05:30 -0700', 'count', '30.5247', '2019-06-27 22:55:10 -0700', '2013-05-09 09:49:18 -0700', '0', '4.4.2', '2013-05-09 16:50:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1368118158&type=1', '301293', 'Health Mate', '4040200', '2013-05-09 09:49:18 -0700', 'count', '30.4167', '2019-06-27 22:55:10 -0700', '2013-06-09 09:28:47 -0700', '0', '4.4.2', '2013-06-09 16:29:30 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1370795327&type=1', '301293', 'Health Mate', '4040200', '2013-06-09 09:28:47 -0700', 'count', '30.8333', '2019-06-27 22:55:10 -0700', '2013-07-09 08:00:17 -0700', '0', '4.4.2', '2013-07-09 15:01:00 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1373382017&type=1', '301293', 'Health Mate', '4040200', '2013-07-09 08:00:17 -0700', 'count', '30.8179', '2019-06-27 22:55:10 -0700', '2013-07-28 09:16:55 -0700', '0', '4.4.2', '2013-07-28 16:17:39 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1375028215&type=1', '301293', 'Health Mate', '4040200', '2013-07-28 09:16:55 -0700', 'count', '30.5556', '2019-06-27 22:55:10 -0700', '2013-09-13 09:22:19 -0700', '0', '4.4.2', '2013-09-13 16:23:08 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1379089339&type=1', '301293', 'Health Mate', '4040200', '2013-09-13 09:22:19 -0700', 'count', '30.9568', '2019-06-27 22:55:10 -0700', '2013-09-24 08:08:23 -0700', '0', '4.4.2', '2013-09-24 15:09:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1380035303&type=1', '301293', 'Health Mate', '4040200', '2013-09-24 08:08:23 -0700', 'count', '31.4352', '2019-06-27 22:55:10 -0700', '2013-10-01 08:15:13 -0700', '0', '4.4.2', '2013-10-01 15:15:57 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1380640513&type=1', '301293', 'Health Mate', '4040200', '2013-10-01 08:15:13 -0700', 'count', '31.2037', '2019-06-27 22:55:10 -0700', '2013-10-23 09:31:25 -0700', '0', '4.4.2', '2013-10-23 16:32:13 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1382545885&type=1', '301293', 'Health Mate', '4040200', '2013-10-23 09:31:25 -0700', 'count', '31.8056'] ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 491791152,MDU6SXNzdWU0OTE3OTExNTI=,9,followers-ids and friends-ids subcommands,9599,simonw,closed,0,,,,,1,2019-09-10T16:58:15Z,2019-09-10T17:36:55Z,2019-09-10T17:36:55Z,MEMBER,,"These will import follower and friendship IDs into the following tables, using these APIs: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-followers-ids https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-ids",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516769276,MDU6SXNzdWU1MTY3NjkyNzY=,9,Commands do not work without an auth.json file,9599,simonw,closed,0,,,,,0,2019-11-03T01:54:28Z,2019-11-11T05:30:48Z,2019-11-11T05:30:48Z,MEMBER,,"`auth.json` is meant to be optional. If it's not provided, the tool should make heavily rate-limited unauthenticated requests. ``` $ github-to-sqlite repos .data/repos.db simonw Usage: github-to-sqlite repos [OPTIONS] DB_PATH [USERNAME] Try ""github-to-sqlite repos --help"" for help. Error: Invalid value for ""-a"" / ""--auth"": File ""auth.json"" does not exist. ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 605938063,MDU6SXNzdWU2MDU5MzgwNjM=,9,"upload command should be resumable, should only upload photos not already uploaded",9599,simonw,closed,0,,,,,2,2020-04-23T23:31:08Z,2020-04-23T23:39:14Z,2020-04-23T23:39:14Z,MEMBER,,Follow on from #4. ,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 673602857,MDU6SXNzdWU2NzM2MDI4NTc=,9,Define a view that displays photos correctly,9599,simonw,open,0,,,,,0,2020-08-05T14:53:39Z,2020-08-05T14:53:39Z,,MEMBER,,"The `photos` table stores data like this: id | createdAt | source | prefix | suffix | width | height | visibility | created ▲ | user -- | -- | -- | -- | -- | -- | -- | -- | -- | -- 5e12c9708506bc000840262a | January 06, 2020 - 05:45:20 UTC | Swarm for iOS 1 | https://fastly.4sqi.net/img/general/ | /15889193_AXxGk4I1nbzUZuyYqObgbXdJNyEHiwj6AUDq0tPZWtw.jpg | 1920 | 1440 | public | 2020-01-06T05:45:20 | 15889193 The photo URL can be derived from those pieces - define a SQL view which does that (using `datasette-json-html` to display the pictures)",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 691521965,MDU6SXNzdWU2OTE1MjE5NjU=,9,Mechanism for defining custom display of results,9599,simonw,closed,0,,,,,8,2020-09-03T00:14:07Z,2020-09-03T21:12:14Z,2020-09-03T21:09:55Z,MEMBER,,Part of #3 - in particular I want to make sure my photos are displayed with a thumbnail.,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 748372469,MDU6SXNzdWU3NDgzNzI0Njk=,9,ParseError: undefined entity š,4028322,mkorosec,closed,0,,,,,1,2020-11-22T23:04:35Z,2021-02-11T22:10:55Z,2021-02-11T22:10:55Z,CONTRIBUTOR,,"I encountered a parse error if the enex file contained š or   Run command: evernote-to-sqlite enex evernote.db evernote.enex ``` Traceback (most recent call last): ... File ""evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File ""evernote_to_sqlite/utils.py"", line 35, in save_note content = ET.tostring(ET.fromstring(content_xml)).decode(""utf-8"") File ""/usr/lib/python3.8/xml/etree/ElementTree.py"", line 1320, in XML parser.feed(text) xml.etree.ElementTree.ParseError: undefined entity š: line 3, column 35 ``` Workaround: ``` sed -i 's/š//g' evernote.enex sed -i 's/ //g' evernote.enex ```",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 801780625,MDU6SXNzdWU4MDE3ODA2MjU=,9,SSL Error,12669260,jfeiwell,open,0,,,,,2,2021-02-05T02:12:56Z,2021-02-07T18:45:04Z,,NONE,,"Here's the error I get when running `pip install pocket-to-sqlite`: ``` Could not fetch URL https://pypi.python.org/simple/pocket-to-sqlite/: There was a problem confirming the ssl certificate: [SSL: TLSV1_ALERT_PROTOCOL_VERSION] tlsv1 alert protocol version (_ssl.c:661) - skipping Could not find a version that satisfies the requirement pocket-to-sqlite (from versions: ) No matching distribution found for pocket-to-sqlite ``` Does this require python 3? ",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1046887492,PR_kwDODFE5qs4uMsMJ,9,Removed space from filename My Activity.json,91880982,widadmogral,open,0,,,,,0,2021-11-08T00:04:31Z,2021-11-08T00:04:31Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/google-takeout-to-sqlite/pulls/9,"File name from google takeout has no space. The code only runs without error if filename is ""MyActivity.json"" and not ""My Activity.json"". Is it a new change by Google?",206649770,google-takeout-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1617938730,I_kwDOJHON9s5gb8kq,9,"Default to just storing plaintext, store HTML if `--html` is passed",9599,simonw,open,0,,,,,0,2023-03-09T20:19:06Z,2023-03-09T20:19:06Z,,MEMBER,,"The full `body` version of the notes can get HUGE, due to embedded images. It turns out for my own purposes I'm usually happy with just the `plaintext` version. I'm tempted to say you don't get HTML unless you pass a `--html` option.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 267517381,MDU6SXNzdWUyNjc1MTczODE=,10,Set up Travis,9599,simonw,closed,0,,,2859414,v1 stretch goals,1,2017-10-23T01:29:07Z,2017-11-04T23:48:57Z,2017-11-04T23:48:57Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 411066700,MDU6SXNzdWU0MTEwNjY3MDA=,10,Error in upsert if column named 'order',82988,psychemedia,closed,0,,,,,1,2019-02-16T12:05:18Z,2019-02-24T16:55:38Z,2019-02-24T16:55:37Z,NONE,,"The following works fine: ``` connX = sqlite3.connect('DELME.db', timeout=10) dfX=pd.DataFrame({'col1':range(3),'col2':range(3)}) DBX = Database(connX) DBX['test'].upsert_all(dfX.to_dict(orient='records')) ``` But if a column is named `order`: ``` connX = sqlite3.connect('DELME.db', timeout=10) dfX=pd.DataFrame({'order':range(3),'col2':range(3)}) DBX = Database(connX) DBX['test'].upsert_all(dfX.to_dict(orient='records')) ``` it throws an error: ``` --------------------------------------------------------------------------- OperationalError Traceback (most recent call last) in 3 dfX=pd.DataFrame({'order':range(3),'col2':range(3)}) 4 DBX = Database(connX) ----> 5 DBX['test'].upsert_all(dfX.to_dict(orient='records')) /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in upsert_all(self, records, pk, foreign_keys, column_order) 347 foreign_keys=foreign_keys, 348 upsert=True, --> 349 column_order=column_order, 350 ) 351 /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order) 327 jsonify_if_needed(record.get(key, None)) for key in all_columns 328 ) --> 329 result = self.db.conn.execute(sql, values) 330 self.db.conn.commit() 331 self.last_id = result.lastrowid OperationalError: near ""order"": syntax error ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 492297930,MDU6SXNzdWU0OTIyOTc5MzA=,10,Rethink progress bars for various commands,9599,simonw,closed,0,,,,,5,2019-09-11T15:06:47Z,2020-04-01T03:45:48Z,2020-04-01T03:45:48Z,MEMBER,,"Progress bars and the `--silent` option are implemented inconsistently across commands at the moment. This is made more challenging by the fact that for many operations the total length is not known. https://click.palletsprojects.com/en/7.x/api/#click.progressbar",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516967682,MDU6SXNzdWU1MTY5Njc2ODI=,10,Add this repos_starred view,9599,simonw,closed,0,,,,,3,2019-11-04T05:44:38Z,2020-05-02T16:37:36Z,2020-05-02T16:37:36Z,MEMBER,,"```sql create view repos_starred as select stars.starred_at, users.login, repos.* from repos join stars on repos.id = stars.repo join users on repos.owner = users.id order by starred_at desc; ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 519038979,MDU6SXNzdWU1MTkwMzg5Nzk=,10,Failed to import workout points,9599,simonw,closed,0,,,,,4,2019-11-07T04:50:22Z,2019-11-08T01:18:37Z,2019-11-08T01:18:37Z,MEMBER,,"I just ran the script and it failed to import any `workout_points`, though it did import `workouts`.",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 606028272,MDU6SXNzdWU2MDYwMjgyNzI=,10,Speed up hashing step using threads,9599,simonw,closed,0,,,,,0,2020-04-24T04:20:08Z,2020-04-24T04:32:35Z,2020-04-24T04:32:35Z,MEMBER,,"This TODO from the code: https://github.com/dogsheep/photos-to-sqlite/blob/2e7f2c67cc18b02c75bb64992a05b0196e507252/photos_to_sqlite/cli.py#L82-L90",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 691557547,MDU6SXNzdWU2OTE1NTc1NDc=,10,Category 3: received,9599,simonw,closed,0,,,,,1,2020-09-03T01:40:36Z,2020-09-03T17:38:51Z,2020-09-03T17:38:51Z,MEMBER,,"A category for things that were sent to me: DMs, emails etc. Follows #7.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 719637258,MDExOlB1bGxSZXF1ZXN0NTAxNzkxNjYz,10,Update utils.py to fix sqlite3.OperationalError,29426418,mattiaborsoi,closed,0,,,,,1,2020-10-12T20:17:53Z,2020-10-12T20:25:10Z,2020-10-12T20:25:09Z,CONTRIBUTOR,dogsheep/swarm-to-sqlite/pulls/10,"Fixes the errors: - sqlite3.OperationalError: table posts has no column named text - sqlite3.OperationalError: table photos has no column named hasSticker That will cause sqlite-utils to notice if there's a missing column and add it. As recommended by @simonw",205429375,swarm-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 770712149,MDExOlB1bGxSZXF1ZXN0NTQyNDA2OTEw,10,BugFix for encoding and not update info.,1277270,riverzhou,closed,0,,,,,1,2020-12-18T08:58:54Z,2021-02-11T22:37:56Z,2021-02-11T22:37:56Z,NONE,dogsheep/evernote-to-sqlite/pulls/10,"Bugfix 1: Traceback (most recent call last): File ""d:\anaconda3\lib\runpy.py"", line 194, in _run_module_as_main return _run_code(code, main_globals, None, File ""d:\anaconda3\lib\runpy.py"", line 87, in _run_code exec(code, run_globals) File ""D:\Anaconda3\Scripts\evernote-to-sqlite.exe\__main__.py"", line 7, in File ""d:\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__ File ""d:\anaconda3\lib\site-packages\click\core.py"", line 782, in main rv = self.invoke(ctx) File ""d:\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) return ctx.invoke(self.callback, **ctx.params) File ""d:\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke return callback(*args, **kwargs) File ""d:\anaconda3\lib\site-packages\evernote_to_sqlite\cli.py"", line 30, in enex for tag, note in find_all_tags(fp, [""note""], progress_callback=bar.update): File ""d:\anaconda3\lib\site-packages\evernote_to_sqlite\utils.py"", line 11, in find_all_tags chunk = fp.read(1024 * 1024) UnicodeDecodeError: 'gbk' codec can't decode byte 0xa4 in position 383: illegal multibyte sequence Bugfix 2: Traceback (most recent call last): File ""D:\Anaconda3\Scripts\evernote-to-sqlite-script.py"", line 33, in sys.exit(load_entry_point('evernote-to-sqlite==0.3', 'console_scripts', 'evernote-to-sqlite')()) File ""D:\Anaconda3\lib\site-packages\click\core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""D:\Anaconda3\lib\site-packages\click\core.py"", line 782, in main rv = self.invoke(ctx) File ""D:\Anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""D:\Anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""D:\Anaconda3\lib\site-packages\click\core.py"", line 610, in invoke return callback(*args, **kwargs) File ""D:\Anaconda3\lib\site-packages\evernote_to_sqlite-0.3-py3.8.egg\evernote_to_sqlite\cli.py"", line 31, in enex File ""D:\Anaconda3\lib\site-packages\evernote_to_sqlite-0.3-py3.8.egg\evernote_to_sqlite\utils.py"", line 28, in save_note AttributeError: 'NoneType' object has no attribute 'text'",303218369,evernote-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1123393829,I_kwDODFE5qs5C9aEl,10,sqlite3.OperationalError: no such table: main.my_activity,69208826,glxblt14,open,0,,,,,1,2022-02-03T17:59:29Z,2022-03-20T02:38:07Z,,NONE,,"Hello, When i run the command `google-takeout-to-sqlite my-activity db.db takeout-20220203T174446Z-001.zip`, i get this error : ``` Traceback (most recent call last): File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\runpy.py"", line 197, in _run_module_as_main return _run_code(code, main_globals, None, File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\runpy.py"", line 87, in _run_code exec(code, run_globals) File ""C:\Users\julie\AppData\Local\Programs\Python\Python39-32\Scripts\google-takeout-to-sqlite.exe\__main__.py"", line 7, in File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 1053, in main rv = self.invoke(ctx) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\click\core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\google_takeout_to_sqlite\cli.py"", line 31, in my_activity utils.save_my_activity(db, zf) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\google_takeout_to_sqlite\utils.py"", line 19, in save_my_activity db[""my_activity""].create_index([""time""]) File ""c:\users\julie\appdata\local\programs\python\python39-32\lib\site-packages\sqlite_utils\db.py"", line 629, in create_index self.db.conn.execute(sql) sqlite3.OperationalError: no such table: main.my_activity ``` Thank you for your help Sorry for my bad English EDIT: i used the json format",206649770,google-takeout-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1246826792,I_kwDODLZ_YM5KUREo,10,"When running `auth` command, don't overwrite an existing auth.json file",11887,ashanan,closed,0,,,,,3,2022-05-24T16:42:20Z,2022-09-07T15:07:38Z,2022-08-22T16:17:19Z,NONE,,"Ran the `auth` command in the same directory I'd previously set up an auth.json file for `twitter-to-sqlite` and it was completely overwritten. Not the biggest issue, but still unexpected. Ideally, for me, the keys would just be added to the existing file, but getting a warning and a chance to back out would be a good solution as well.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617962395,I_kwDOJHON9s5gcCWb,10,Include schema in README,9599,simonw,closed,0,,,,,0,2023-03-09T20:38:59Z,2023-03-09T20:48:18Z,2023-03-09T20:48:18Z,MEMBER,,As seen in other tools like https://github.com/simonw/git-history,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267522549,MDU6SXNzdWUyNjc1MjI1NDk=,11,Code that generates compile-time properties about the database ,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-23T02:18:24Z,2017-10-23T16:04:23Z,2017-10-23T16:04:23Z,OWNER,,"At a minimum this will include: * sha hash of each database file * list of tables with row counts for each database file",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 413740684,MDU6SXNzdWU0MTM3NDA2ODQ=,11,Detect numpy types when creating tables,9599,simonw,closed,0,,,,,2,2019-02-23T21:09:35Z,2019-02-24T04:02:20Z,2019-02-24T04:02:20Z,OWNER,,Inspired by #8,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503045221,MDU6SXNzdWU1MDMwNDUyMjE=,11,Commands for recording real-time tweets from the streaming API,9599,simonw,closed,0,,,,,1,2019-10-06T03:09:30Z,2019-10-06T04:54:17Z,2019-10-06T04:48:31Z,MEMBER,,"https://developer.twitter.com/en/docs/tweets/filter-realtime/api-reference/post-statuses-filter We can support tracking keywords and following specific users.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520521843,MDU6SXNzdWU1MjA1MjE4NDM=,11,Command to fetch releases,9599,simonw,closed,0,,,,,0,2019-11-09T22:23:30Z,2019-11-09T22:57:00Z,2019-11-09T22:57:00Z,MEMBER,,"https://developer.github.com/v3/repos/releases/#list-releases-for-a-repository `GET /repos/:owner/:repo/releases`",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 606032950,MDU6SXNzdWU2MDYwMzI5NTA=,11,Try running S3 uploads in a thread pool,9599,simonw,closed,0,,,,,0,2020-04-24T04:34:31Z,2020-04-24T16:45:41Z,2020-04-24T16:45:41Z,MEMBER,,"Since #10 provided such a speedup, can the same thing be done for the actual uploads? http://ls.pwd.io/2013/06/parallel-s3-uploads-using-boto-and-threads-in-python/ suggests it can really help performance.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 692125110,MDU6SXNzdWU2OTIxMjUxMTA=,11,Public / Private mechanism,9599,simonw,closed,0,,,,,1,2020-09-03T16:47:03Z,2020-09-03T17:33:52Z,2020-09-03T17:33:52Z,MEMBER,,"Some of the data in Dogsheep is stuff that was written publicly - tweets, blog posts, GitHub commits to public repos. Some of it is private data - emails, photos, direct messages, Swarm checkins, commits to private repos. Being able to filter for just one or the other (or both) would be useful. Especially when giving demos!",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 723838331,MDU6SXNzdWU3MjM4MzgzMzE=,11,export.xml file name varies with different language settings,572,jarib,closed,0,,,,,7,2020-10-17T20:07:18Z,2020-10-17T21:39:15Z,2020-10-17T21:14:10Z,NONE,,"The XML file exported from my phone has a Norwegian file name – `eksport.xml` 🙄 I can work around this by unpacking the zip and using `--xml`, but then I lose the workout points. Perhaps this could be solved by `--localized-xml eksport.xml`? Alternatively just fall back to the first XML file in the root folder of the zip. ",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 743400216,MDU6SXNzdWU3NDM0MDAyMTY=,11,Error thrown: sqlite3.OperationalError: table users has no column named lastName,61791,beaugunderson,closed,0,,,,,2,2020-11-16T01:21:18Z,2021-01-18T04:35:22Z,2021-01-18T04:35:22Z,NONE,,"Just installed `swarm-to-sqlite-0.3.2` and tried according to the docs: ``` Traceback (most recent call last): File ""/usr/local/bin/swarm-to-sqlite"", line 8, in sys.exit(cli()) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/swarm_to_sqlite/cli.py"", line 73, in cli save_checkin(checkin, db) File ""/usr/local/lib/python3.9/site-packages/swarm_to_sqlite/utils.py"", line 82, in save_checkin checkins_table.m2m(""users"", user, m2m_table=""likes"", pk=""id"") File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1914, in m2m id = other_table.insert(record, pk=pk, replace=True).last_pk File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1647, in insert return self.insert_all( File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1765, in insert_all self.insert_chunk( File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1575, in insert_chunk result = self.db.execute(query, params) File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 200, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: table users has no column named lastName ```",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 792851444,MDU6SXNzdWU3OTI4NTE0NDQ=,11,XML parse error,3613583,dskrad,closed,0,,,,,2,2021-01-24T17:38:54Z,2021-02-11T21:18:58Z,2021-02-11T21:18:48Z,NONE,,"I am on Windows 10 using Windows Subsystem for Linux, Python 3.8. I installed evernote-to-sqlite via pipx (in a venv). I tried using enex files from the latest version of Evernote for Windows (10.6.9 which only lets you export 50 notes at a time) and from Legacy Evernote (6.25.2.9198 which lets you export all your notes at once). The enex file from latest evernote gives this error: File ""/usr/lib/python3.8/xml/etree/ElementTree.py"", line 1320, in XML parser.feed(text) xml.etree.ElementTree.ParseError: XML or text declaration not at start of entity: line 2, column 6 The enex file from Legacy Evernote gives this error: File ""/home/david/.local/pipx/venvs/evernote-to-sqlite/lib/python3.8/site-packages/evernote_to_sqlite/utils.py"", line 28, in save_note updated = note.find(""updated"").text AttributeError: 'NoneType' object has no attribute 'text'",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1250287607,PR_kwDODFE5qs44jvRV,11,Update README.md,11887,ashanan,open,0,,,,,0,2022-05-27T03:13:59Z,2022-05-27T03:13:59Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/google-takeout-to-sqlite/pulls/11,Fix typo,206649770,google-takeout-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/11/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1345452427,I_kwDODLZ_YM5QMfmL,11,"-a option is used for ""--auth"" and for ""--all""",2467,fernand0,closed,0,,,,,3,2022-08-21T10:50:48Z,2022-08-21T21:11:57Z,2022-08-21T21:11:57Z,NONE,,"I'm not sure which option is best, instead of -a -all.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1618130434,I_kwDOJHON9s5gcrYC,11,Implement a SQL view to make it easier to query files in a nested folder,9599,simonw,open,0,,,,,3,2023-03-09T23:19:28Z,2023-03-09T23:24:01Z,,MEMBER,,"Working with nested data in SQL is tricky, can I make it easier with a view or canned query?",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 267523511,MDU6SXNzdWUyNjc1MjM1MTE=,12,Make it so you can override templates,9599,simonw,closed,0,,,2949431,Custom templates edition,1,2017-10-23T02:25:35Z,2017-11-30T16:42:46Z,2017-11-30T16:38:34Z,OWNER,,"The app will ship with default templates but, just like with the Django admin, you will be able to override them using either explicit configuration settings or just by dropping in templates with certain file names. Template inheritance should work here, both allowing you to override just the base template and allowing you to customize tiny bits of others.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 413778585,MDExOlB1bGxSZXF1ZXN0MjU1NjU4MTEy,12,"Support for numpy types, closes #11",9599,simonw,closed,0,,,,,0,2019-02-24T03:57:32Z,2019-02-24T04:02:20Z,2019-02-24T04:02:20Z,OWNER,simonw/sqlite-utils/pulls/12,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 503053800,MDU6SXNzdWU1MDMwNTM4MDA=,12,"Extract ""source"" into a separate lookup table",9599,simonw,closed,0,,,,,3,2019-10-06T05:17:23Z,2019-10-17T15:49:24Z,2019-10-17T15:49:24Z,MEMBER,,"It's pretty bulky and ugly at the moment: ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520756546,MDU6SXNzdWU1MjA3NTY1NDY=,12,Add this view for seeing new releases,9599,simonw,closed,0,,,,,5,2019-11-11T06:00:12Z,2020-05-02T18:58:18Z,2020-05-02T18:58:17Z,MEMBER,,"```sql CREATE VIEW recent_releases AS select json_object(""label"", repos.full_name, ""href"", repos.html_url) as repo, json_object( ""href"", releases.html_url, ""label"", releases.name ) as release, substr(releases.published_at, 0, 11) as date, releases.body as body_markdown, releases.published_at from releases join repos on repos.id = releases.repo order by releases.published_at desc ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 606033104,MDU6SXNzdWU2MDYwMzMxMDQ=,12,"If less than 500MB, show size in MB not GB",9599,simonw,open,0,,,,,1,2020-04-24T04:35:01Z,2020-04-24T04:35:25Z,,MEMBER,,"Just saw this: ``` Uploading 0.05 GB ```",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 692202408,MDU6SXNzdWU2OTIyMDI0MDg=,12,Idea: maps and GeoJSON support,9599,simonw,open,0,,,,,0,2020-09-03T18:47:10Z,2020-09-04T01:45:03Z,,MEMBER,,"It would be cool if the `display_sql` could return a column populated with GeoJSON which would the automatically be displayed on a map in the results (or maybe default JS would look for a `class=""geojson""` element output by the `display` template) - ala https://github.com/simonw/datasette-leaflet-geojson Then I could render workout routes on a map, or Swarm checkin points.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 727848625,MDU6SXNzdWU3Mjc4NDg2MjU=,12,"Some workout columns should be float, not text",9599,simonw,open,0,,,,,4,2020-10-23T02:47:02Z,2022-06-23T04:35:02Z,,MEMBER,,"Columns `duration`, `totalDistance` and `totalEnergyBurned` should be converted to float. https://github.com/dogsheep/healthkit-to-sqlite/blob/71e36e1cf034b96de2a8e6652265d782d3fdf63b/healthkit_to_sqlite/utils.py#L50-L57",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 892383270,MDExOlB1bGxSZXF1ZXN0NjQ1MTAwODQ4,12,Recovering of malformed ENEX file,8431437,engdan77,open,0,,,,,0,2021-05-15T07:49:31Z,2021-05-15T19:57:50Z,,FIRST_TIMER,dogsheep/evernote-to-sqlite/pulls/12,"Hey .. Awesome work developing this project, that I found very useful to me and saved me some work.. Thanks.. :) Some background to this PR... I've been searching around for a tool allowing me to transforming my personal collection of Evernote notes to a format easier to search and potentially easier import to future services. Now I discovered problem processing my large data ~5GB using the existing source using Pythons builtin xml-parser that unfortunately was unable to succeed without exception breaking the process. My first attempt I tried to adapt to more robust lxml package allowing huge data and with ""recover"", but even if it worked better it also failed processing the whole data. Even using the memory efficient etree.iterparse() it also unfortunately got into trouble. And with no luck finding any other libraries successfully parsing this enormous file I instead chose to build a ""hugexmlparser"" module that allows parsing this huge file using yield (on a byte-to-byte-level) and allows you to set a maximum size for to cater for potential malformed or undesirable large attachments to export, should succeed covering potential exceptions. Some cases found where the parses discover malformed XML within so also in those cases try to save as much as possible by escaping (to be dealt at a later stage, better than nothing), and if a missing end before new (malformed?) it would add this after encounter a new start-tag. The code for the recovery process is a bit rough and for certain room for refactoring, but at the moment is seem to achieve what I wanted. Now with the above we pass this a minor changed version of save_note_recovery() assure the existing works. Also adding this as a new recover-enex command to click and kept the original options. A couple of new tests was added as well to check against using this command. Now this currently works to me, but thought I might share a PR in such as you find use for this yourself or found useful to others finding this repository. As a second step .. When the time allows it would have been nice to also be able to easily export from SQLite to formatted HTML/MD and attachments saved... but that might perhaps be better a separate project ... or if you or someone else have something that might shared to save some trouble, I would be interested ;-) ",303218369,evernote-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 951817328,MDU6SXNzdWU5NTE4MTczMjg=,12,403 when getting token,285352,treyhunner,open,0,,,,,1,2021-07-23T18:43:26Z,2021-10-12T18:31:57Z,,NONE,,"I tried to use https://your-foursquare-oauth-token.glitch.me/ to get my Swarm auth token and got a 403 after I clicked the Allow button: ![image](https://user-images.githubusercontent.com/285352/126826478-60e53614-263d-40bb-9f1d-c1a676644eb0.png) I'm not sure if this is the right repo to report this in",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1557599877,I_kwDODFE5qs5c1xaF,12,location history changes,14809320,gerardrbentley,open,0,,,,,0,2023-01-26T03:57:25Z,2023-01-26T03:57:25Z,,NONE,,"not sure if each download is unique, but I had to change some things to work with the takeout zip I made 2023-01-25 filename changed from ""Location History.json"" to ""Records.json"" `""timestampMs""` is not present, `""timestamp""` is roughly iso timestamp ```py def get_timestamp_ms(raw_timestamp): try: return datetime.datetime.strptime(raw_timestamp, ""%Y-%m-%dT%H:%M:%SZ"").timestamp() except ValueError: return datetime.datetime.strptime(raw_timestamp, ""%Y-%m-%dT%H:%M:%S.%fZ"").timestamp() def save_location_history(db, zf): location_history = json.load( zf.open(""Takeout/Location History/Records.json"") ) db[""location_history""].upsert_all( ( { ""id"": id_for_location_history(row), ""latitude"": row[""latitudeE7""] / 1e7, ""longitude"": row[""longitudeE7""] / 1e7, ""accuracy"": row[""accuracy""], ""timestampMs"": get_timestamp_ms(row[""timestamp""]), ""when"": row[""timestamp""], } for row in location_history[""locations""] ), pk=""id"", ) def id_for_location_history(row): # We want an ID that is unique but can be sorted by in # date order - so we use the isoformat date + the first # 6 characters of a hash of the JSON first_six = hashlib.sha1( json.dumps(row, separators=("","", "":""), sort_keys=True).encode(""utf8"") ).hexdigest()[:6] return ""{}-{}"".format( row['timestamp'], first_six, ) ``` example locations from mine ```json { ""latitudeE7"": 427220206, ""longitudeE7"": -923423972, ""accuracy"": 10, ""deviceTag"": -1312429967, ""deviceDesignation"": ""PRIMARY"", ""timestamp"": ""2019-01-08T23:31:50.867Z"" } ``` ```json { ""latitudeE7"": 427011317, ""longitudeE7"": -923448300, ""accuracy"": 5, ""deviceTag"": -1312429967, ""deviceDesignation"": ""PRIMARY"", ""timestamp"": ""2019-01-08T23:33:53Z"" }, ```",206649770,google-takeout-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/12/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 2}",, 1650981564,I_kwDOJHON9s5iZ_q8,12,Error running pytest,14314871,amlestin,open,0,,,,,0,2023-04-02T15:02:36Z,2023-04-02T15:07:10Z,,NONE,,"`______________________________________________________ ERROR collecting tests/test_apple_notes_to_sqlite.py _______________________________________________________ ImportError while importing test module '/Users/lol/development/apple-notes-to-sqlite/tests/test_apple_notes_to_sqlite.py'. Hint: make sure your test modules/packages have valid Python names. Traceback: /opt/homebrew/Cellar/python@3.9/3.9.16/Frameworks/Python.framework/Versions/3.9/lib/python3.9/importlib/__init__.py:127: in import_module return _bootstrap._gcd_import(name[level:], package, level) tests/test_apple_notes_to_sqlite.py:2: in from apple_notes_to_sqlite.cli import cli, COUNT_SCRIPT, FOLDERS_SCRIPT E ModuleNotFoundError: No module named 'apple_notes_to_sqlite'` Solution: This is likely a PYTHONPATH issue due to having pytest installed both globally and in the venv. We can guarantee the tests run by adding the current directory to sys.path automatically using `python -m pytest` The alternative is to activate the venv, install pytest, deactivate, then activate the venv again (https://stackoverflow.com/questions/35045038/how-do-i-use-pytest-with-virtualenv)",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1795187493,I_kwDODLZ_YM5rAGMl,12,Switch to pyproject.toml,9599,simonw,closed,0,,,,,2,2023-07-09T01:06:56Z,2023-07-09T01:19:43Z,2023-07-09T01:19:42Z,MEMBER,,First of my CLI tools to use https://til.simonwillison.net/python/pyproject,213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267542338,MDU6SXNzdWUyNjc1NDIzMzg=,13,Add a syntax highlighting SQL editor,9599,simonw,closed,0,,,,,1,2017-10-23T05:03:33Z,2017-11-15T02:04:51Z,2017-11-15T02:04:51Z,OWNER,,https://ace.c9.io/#nav=embedding looks like a good option,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 413779210,MDU6SXNzdWU0MTM3NzkyMTA=,13,Ability to automatically create IDs from content hash of row,9599,simonw,closed,0,,,,,1,2019-02-24T04:07:08Z,2019-02-24T04:36:48Z,2019-02-24T04:36:48Z,OWNER,,"Sometimes when you are importing data the underlying source provides records without IDs that can be uniquely identified by their contents. A utility mechanism for calculating a sha1 hash of the contents and using that as a unique ID would be useful.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503085013,MDU6SXNzdWU1MDMwODUwMTM=,13,statuses-lookup command,9599,simonw,closed,0,,,,,1,2019-10-06T11:00:20Z,2019-10-07T00:33:49Z,2019-10-07T00:31:44Z,MEMBER,,"For bulk retrieving tweets by their ID. https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-lookup Rate limit is 900/15 minutes (1 call per second) but each call can pull up to 100 IDs, so we can pull 6,000 per minute. Should support `--SQL` and `--attach` #8 ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 521275281,MDU6SXNzdWU1MjEyNzUyODE=,13,Set up a live demo Datasette instance,9599,simonw,closed,0,,,5225818,1.0,9,2019-11-12T01:27:02Z,2020-03-24T00:03:26Z,2020-03-24T00:03:25Z,MEMBER,,"I deployed https://github-to-sqlite-releases-j7hipcg4aq-uc.a.run.app/ by running this: ``` #!/bin/bash # Fetch repos for simonw and dogsheep github-to-sqlite repos github.db simonw dogsheep -a auth.json # Fetch releases for the repos tagged 'datasette-io' sqlite-utils github.db "" select full_name from repos where rowid in ( select repos.rowid from repos, json_each(repos.topics) j where j.value = 'datasette-io' )"" --csv --no-headers | while read repo; do github-to-sqlite releases \ github.db $(echo $repo | tr -d '\r') \ -a auth.json; sleep 2; done; ``` And then deploying using this: ``` $ datasette publish cloudrun github.db \ --title ""github-to-sqlite releases demo"" \ --about_url=""https://github.com/simonw/github-to-sqlite"" \ --about='github-to-sqlite' \ --install=datasette-render-markdown \ --install=datasette-json-html \ --service=github-to-sqlite-releases ``` This should happen automatically for every release. I can run it once a day in Circle CI to keep the demo database up-to-date.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 607888367,MDU6SXNzdWU2MDc4ODgzNjc=,13,Also upload movie files,9599,simonw,open,0,,,,,2,2020-04-27T22:11:25Z,2020-04-28T00:39:45Z,,MEMBER,,"The `upload` command currently only handles static images: https://github.com/dogsheep/photos-to-sqlite/blob/d939455af00e07866686457ee2fcb9b2d1b7194e/photos_to_sqlite/utils.py#L26-L33 Need to cover movies taken by my phone and DSLR too.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 692386625,MDU6SXNzdWU2OTIzODY2MjU=,13,Support advanced FTS queries,9599,simonw,closed,0,,,,,1,2020-09-03T21:29:56Z,2020-09-03T21:40:51Z,2020-09-03T21:40:51Z,MEMBER,,`simon willison NOT screenshot` for example.,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 743071410,MDExOlB1bGxSZXF1ZXN0NTIxMDU0NjEy,13,SQLite does not have case sensitive columns,1689944,tomaskrehlik,open,0,,,,,1,2020-11-14T20:12:32Z,2021-08-24T13:28:26Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/healthkit-to-sqlite/pulls/13,"This solves a weird issue when there is record with metadata key that is only different in letter cases. See the test for details.",197882382,healthkit-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 978743426,MDU6SXNzdWU5Nzg3NDM0MjY=,13,xml.etree.ElementTree.ParseError: not well-formed (invalid token),9599,simonw,closed,0,,,,,4,2021-08-25T05:48:21Z,2021-08-26T18:45:13Z,2021-08-26T18:45:13Z,MEMBER,,"Got this error today: ``` (evernote-to-sqlite) /tmp % evernote-to-sqlite enex evernote.db simonwillison\'s\ notebook.enex Importing from ENEX [######------------------------------] 17% Traceback (most recent call last): File ""/Users/simon/.local/bin/evernote-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/evernote_to_sqlite/utils.py"", line 36, in save_note content = ET.tostring(ET.fromstring(content_xml)).decode(""utf-8"") File ""/usr/local/Cellar/python@3.9/3.9.6/Frameworks/Python.framework/Versions/3.9/lib/python3.9/xml/etree/ElementTree.py"", line 1347, in XML parser.feed(text) xml.etree.ElementTree.ParseError: not well-formed (invalid token): line 2, column 132 ```",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1373210675,I_kwDODD6af85R2Ygz,13,fails before generating views. ERR: table sqlite_master may not be modified,116795,pax,open,0,,,,,4,2022-09-14T15:41:50Z,2023-04-11T03:46:17Z,,NONE,,"generates checkins.db but seems to fail before generating views note: it worked on an Ubuntu WSL but fails on macOS 12.5.1 later edit: I suspect this is a problem with my local set-up, `dogsheep-beta index` also throws the same error full error: Importing 2591 checkins [###################################-] 98% 00:00:00 Traceback (most recent call last): File ""/Users/pax/devbox/envAll/bin/swarm-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/swarm_to_sqlite/cli.py"", line 77, in cli ensure_foreign_keys(db) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/swarm_to_sqlite/utils.py"", line 145, in ensure_foreign_keys db[fk.table].add_foreign_key(fk.column, fk.other_table, fk.other_column) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/sqlite_utils/db.py"", line 2123, in add_foreign_key self.db.add_foreign_keys([(self.name, column, other_table, other_column)]) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1086, in add_foreign_keys cursor.execute( sqlite3.OperationalError: table sqlite_master may not be modified",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1650984552,PR_kwDOJHON9s5NbyYN,13,use universal command,14314871,amlestin,open,0,,,,,0,2023-04-02T15:10:54Z,2023-04-02T15:37:34Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/apple-notes-to-sqlite/pulls/13,,611552758,apple-notes-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1884499674,PR_kwDODFE5qs5ZtYMc,13,"use poetry for packages, asdf for versioning, and gh actions for ci",150855,iloveitaly,open,0,,,,,0,2023-09-06T17:59:16Z,2023-09-06T17:59:16Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/google-takeout-to-sqlite/pulls/13,"- build: use poetry for package management, asdf for python version - build: cleanup poetry config, add keywords, ignore dist - ci: migrate circleci to gh actions - fix: dup method definition ",206649770,google-takeout-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 267707940,MDU6SXNzdWUyNjc3MDc5NDA=,14,Datasette Plugins,9599,simonw,closed,0,,,,,22,2017-10-23T15:15:28Z,2019-05-13T18:58:20Z,2019-05-13T18:58:19Z,OWNER,,"It would be neat if additional functionality could be opted-in to the system in the form of easy-to-add plugins, hosted as separate packages. First example: a Google Analytics plugin, which adds GA tracking code with your tracking ID to the web interface for your dataset. This may be an opportunity to experiment with entry points: http://amir.rachum.com/blog/2017/07/28/python-entry-points/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 413842611,MDU6SXNzdWU0MTM4NDI2MTE=,14,Utilities for adding indexes,9599,simonw,closed,0,,,,,3,2019-02-24T16:57:28Z,2019-02-24T19:11:28Z,2019-02-24T19:11:28Z,OWNER,,"Both in the Python API and the CLI tool. For the CLI tool this should work: $ sqlite-utils create-index mydb.db mytable col1 col2 This will create a compound index across col1 and col2. The name of the index will be automatically chosen unless you use the `--name=...` option. Support a `--unique` option too.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503244410,MDU6SXNzdWU1MDMyNDQ0MTA=,14,"When importing favorites, record which user favorited them",9599,simonw,closed,0,,,,,0,2019-10-07T05:45:11Z,2019-10-14T03:30:25Z,2019-10-14T03:30:25Z,MEMBER,,"This code currently just dumps them into the `tweets` table without recording who it was who had favorited them. https://github.com/dogsheep/twitter-to-sqlite/blob/436a170d74ec70903d1b4ca430c2c6b6435cdfcc/twitter_to_sqlite/cli.py#L152-L157",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 530491074,MDU6SXNzdWU1MzA0OTEwNzQ=,14,Command for importing events,9599,simonw,open,0,,,,,3,2019-11-29T21:28:58Z,2020-04-14T19:38:34Z,,MEMBER,,"Eg from https://api.github.com/users/simonw/events Docs here: https://developer.github.com/v3/activity/events/#list-events-performed-by-a-user",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 608512747,MDU6SXNzdWU2MDg1MTI3NDc=,14,Annotate photos using the Google Cloud Vision API,9599,simonw,open,0,,,,,5,2020-04-28T18:09:03Z,2020-04-28T18:19:06Z,,MEMBER,,"It can detect faces, run OCR, do image labeling (it knows what a lemur is!) and do object localization where it identifies objects and returns bounding polygons for them.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/14/reactions"", ""total_count"": 3, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",, 693318095,MDU6SXNzdWU2OTMzMTgwOTU=,14,On FTS exception rerun the query with quoting,9599,simonw,closed,0,,,,,0,2020-09-04T15:44:18Z,2020-09-05T16:23:01Z,2020-09-05T16:23:01Z,MEMBER,,"Searching for eg `#dogfest` currently throws an FTS exception - but I want to support advanced FTS query tricks as seen in #13. https://dogsheep.simonwillison.net/-/beta?q=%23dogfest > fts5: syntax error near ""#"" Idea: catch that error and re-run the query with FTS escaping applied! ",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 771608692,MDU6SXNzdWU3NzE2MDg2OTI=,14,UNIQUE constraint failed: workouts.id,1234956,n8henrie,open,0,,,,,5,2020-12-20T15:11:20Z,2023-07-10T14:46:52Z,,NONE,,"I'm getting an error on my initial attempt to import data: ```console $ healthkit-to-sqlite 20201119\ healthkit\ export.zip healthkit.db Importing from HealthKit [###################################-] 98% 00:00:01 Traceback (most recent call last): File ""venv/bin/healthkit-to-sqlite"", line 8, in sys.exit(cli()) File ""venv/lib/python3.9/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""venv/lib/python3.9/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""venv/lib/python3.9/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""venv/lib/python3.9/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""venv/lib/python3.9/site-packages/healthkit_to_sqlite/cli.py"", line 57, in cli convert_xml_to_sqlite(fp, db, progress_callback=bar.update, zipfile=zf) File ""venv/lib/python3.9/site-packages/healthkit_to_sqlite/utils.py"", line 34, in convert_xml_to_sqlite workout_to_db(el, db, zipfile) File ""venv/lib/python3.9/site-packages/healthkit_to_sqlite/utils.py"", line 57, in workout_to_db pk = db[""workouts""].insert(record, alter=True, hash_id=""id"").last_pk File ""venv/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1660, in insert return self.insert_all( File ""venv/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1778, in insert_all self.insert_chunk( File ""venv/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1588, in insert_chunk result = self.db.execute(query, params) File ""venv/lib/python3.9/site-packages/sqlite_utils/db.py"", line 213, in execute return self.conn.execute(sql, parameters) sqlite3.IntegrityError: UNIQUE constraint failed: workouts.id ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 986829194,MDU6SXNzdWU5ODY4MjkxOTQ=,14,xml.etree.ElementTree.Parse Error - mismatched tag,46968,step21,open,0,,,,,1,2021-09-02T14:46:36Z,2021-09-02T14:53:11Z,,NONE,,"This is an error message I get upon parsing the enex file of my Inbox. Please find the full error message below. Any hints welcome. ``` Importing from ENEX [##################------------------] 50% 00:00:50 Traceback (most recent call last): File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/bin/evernote-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/evernote_to_sqlite/cli.py"", line 30, in enex for tag, note in find_all_tags(fp, [""note""], progress_callback=bar.update): File ""/Users/utopist/.virtualenvs/evernote-to-sqlite-Og2PIW3Y/lib/python3.9/site-packages/evernote_to_sqlite/utils.py"", line 17, in find_all_tags for event, el in parser.read_events(): File ""/usr/local/Cellar/python@3.9/3.9.6/Frameworks/Python.framework/Versions/3.9/lib/python3.9/xml/etree/ElementTree.py"", line 1329, in read_events raise event File ""/usr/local/Cellar/python@3.9/3.9.6/Frameworks/Python.framework/Versions/3.9/lib/python3.9/xml/etree/ElementTree.py"", line 1301, in feed self._parser.feed(data) xml.etree.ElementTree.ParseError: mismatched tag: line 6837961, column 2 ``` ",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1393330070,PR_kwDODD6af84__DNJ,14,Photo links,6782721,redmanmale,open,0,,,,,0,2022-10-01T09:44:15Z,2022-11-18T17:10:49Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/swarm-to-sqlite/pulls/14,"* add to `checkin_details` view new column for a calculated photo links * supported multiple links split by newline * create `events` table if there's no events in the history to avoid SQL errors Fixes #9.",205429375,swarm-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1880968405,PR_kwDOJHON9s5ZhYny,14,fix: fix the problem of Chinese character garbling,2698003,barretlee,open,0,,,,,0,2023-09-04T23:48:28Z,2023-09-04T23:48:28Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/apple-notes-to-sqlite/pulls/14,"1. The code uses two different ways of writing encoding formats, `mac_roman` and `macroman`. It is uncertain whether there are any typo errors. 2. When there are Chinese characters in the content, exporting it results in garbled code. Changing it to `utf8` can fix the issue.",611552758,apple-notes-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 267713226,MDU6SXNzdWUyNjc3MTMyMjY=,15,Support multiple databases,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-23T15:29:51Z,2017-10-24T02:01:38Z,2017-10-24T02:01:38Z,OWNER,,"I'm going to loop through every database file in the app root directory and bundle all of them. Each one will be accessible at /databasename Note this is without the file extension, and we will disallow multiple files with the same name but different extensions. Supported extensions to start with will be `.db` and `.sqlite` and `.sqlite3`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 413857257,MDU6SXNzdWU0MTM4NTcyNTc=,15,Ability to add columns to tables,9599,simonw,closed,0,,,,,0,2019-02-24T19:20:51Z,2019-02-24T20:04:40Z,2019-02-24T20:04:40Z,OWNER,,"Makes sense to do this before foreign keys in #2 Python: db[""table""].add_column(""new_column"", int) CLI: $ sqlite-utils add-column table new_column INTEGER ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 505666744,MDExOlB1bGxSZXF1ZXN0MzI3MDUxNjcz,15,"twitter-to-sqlite import command, refs #4",9599,simonw,closed,0,,,,,0,2019-10-11T06:37:14Z,2019-10-11T06:45:01Z,2019-10-11T06:45:01Z,MEMBER,dogsheep/twitter-to-sqlite/pulls/15,,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 544571092,MDU6SXNzdWU1NDQ1NzEwOTI=,15,Assets table with downloads,2029,garethr,closed,0,,,5225818,1.0,4,2020-01-02T13:05:28Z,2020-03-28T12:17:01Z,2020-03-23T19:17:32Z,NONE,,"The `releases` command extracts the releases table, but data about the individual assets are locked up in the JSON document in the `assets` field. My main interest is in individual and aggregate download counts. I was wondering if creating a new table with a record per asset may be useful? If so I'm happy to send a PR when I get a moment. Do you have opinions about that simply being part of the `releases` command or would you prefer a separate command as well?",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612151767,MDU6SXNzdWU2MTIxNTE3Njc=,15,Expose scores from ZCOMPUTEDASSETATTRIBUTES,9599,simonw,closed,0,,,,,7,2020-05-04T20:36:07Z,2020-12-20T04:44:22Z,2020-05-05T00:11:45Z,MEMBER,,"The Apple Photos database has a `ZCOMPUTEDASSETATTRIBUTES` that looks absurdly interesting... it has calculated scores for every photo: ",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 694136490,MDU6SXNzdWU2OTQxMzY0OTA=,15,Add a bunch of config examples,9599,simonw,open,0,,,,,1,2020-09-05T17:58:43Z,2020-09-18T23:17:39Z,,MEMBER,,I can bring these over from my personal Dogsheep.,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 793907673,MDExOlB1bGxSZXF1ZXN0NTYxNTEyNTAz,15,added try / except to write_records ,9857779,ryancheley,open,0,,,,,0,2021-01-26T03:56:21Z,2021-01-26T03:56:21Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/healthkit-to-sqlite/pulls/15,"to keep the data write from failing if it came across an error during processing. In particular when trying to convert my HealthKit zip file (and that of my wife's) it would consistently error out with the following: ``` db.py 1709 insert_chunk result = self.db.execute(query, params) db.py 226 execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: too many SQL variables --------------------------------------------------------------------------------------------------------------------------------------------------------------------- db.py 1709 insert_chunk result = self.db.execute(query, params) db.py 226 execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: too many SQL variables --------------------------------------------------------------------------------------------------------------------------------------------------------------------- db.py 1709 insert_chunk result = self.db.execute(query, params) db.py 226 execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: table rBodyMass has no column named metadata_HKWasUserEntered --------------------------------------------------------------------------------------------------------------------------------------------------------------------- healthkit-to-sqlite 8 sys.exit(cli()) core.py 829 __call__ return self.main(*args, **kwargs) core.py 782 main rv = self.invoke(ctx) core.py 1066 invoke return ctx.invoke(self.callback, **ctx.params) core.py 610 invoke return callback(*args, **kwargs) cli.py 57 cli convert_xml_to_sqlite(fp, db, progress_callback=bar.update, zipfile=zf) utils.py 42 convert_xml_to_sqlite write_records(records, db) utils.py 143 write_records db[table].insert_all( db.py 1899 insert_all self.insert_chunk( db.py 1720 insert_chunk self.insert_chunk( db.py 1720 insert_chunk self.insert_chunk( db.py 1714 insert_chunk result = self.db.execute(query, params) db.py 226 execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: table rBodyMass has no column named metadata_HKWasUserEntered ``` Adding the try / except in the `write_records` seems to fix that issue. ",197882382,healthkit-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1042759769,PR_kwDOEhK-wc4uAJb9,15,include note tags in the export,436138,d-rep,open,0,,,,,0,2021-11-02T20:04:31Z,2021-11-02T20:04:31Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/evernote-to-sqlite/pulls/15,"When parsing the Evernote `` elements, the script will now also parse any nested `` elements, writing them out into a separate sqlite table. Here is an example of how to query the data after the script has run: ``` select notes.*, (select group_concat(tag) from notes_tags where notes_tags.note_id=notes.id) as tags from notes; ``` My .enex source file is 3+ years old so I am assuming the structure hasn't changed. Interestingly, my _notebook names_ show up in the _tags_ list where the tag name is prefixed with `notebook_`, so this could maybe help work around the first limitation mentioned in the [evernote-to-sqlite blog post](https://simonwillison.net/2020/Oct/16/building-evernote-sqlite-exporter/). ",303218369,evernote-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1661617056,I_kwDODD6af85jCkOg,15,ambiguous column name: createdAt - on checkin_details view,9599,simonw,closed,0,,,,,0,2023-04-11T01:07:47Z,2023-04-11T03:16:37Z,2023-04-11T03:16:37Z,MEMBER,,"It looks like Swarm changed their schema and now both `venues` and `checkins` have `createdAt` fields. Which breaks this view: https://github.com/dogsheep/swarm-to-sqlite/blob/719b6e96a016d0ca8b316d3bed9c2a7a0cb499ee/swarm_to_sqlite/utils.py#L171-L188",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267726219,MDU6SXNzdWUyNjc3MjYyMTk=,16,Default HTML/CSS needs to look reasonable and be responsive,9599,simonw,closed,0,,,2857392,Ship first public release,6,2017-10-23T16:05:22Z,2017-11-11T20:19:07Z,2017-11-11T20:19:07Z,OWNER,,"Version one should have the following characteristics: - Looks OK - Works great on mobile - Loads extremely fast - No JavaScript! At least not in v1.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 413867537,MDU6SXNzdWU0MTM4Njc1Mzc=,16,add_column() should support REFERENCES {other_table}({other_column}),9599,simonw,closed,0,,,,,4,2019-02-24T21:00:45Z,2019-05-29T05:17:59Z,2019-05-29T04:56:18Z,OWNER,,Related to #2 ,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 505673645,MDU6SXNzdWU1MDU2NzM2NDU=,16,Do a better job with archived direct message threads,9599,simonw,open,0,,,,,0,2019-10-11T06:55:21Z,2019-10-11T06:55:27Z,,MEMBER,,https://github.com/dogsheep/twitter-to-sqlite/blob/fb2698086d766e0333a55bb73435e7283feeb438/twitter_to_sqlite/archive.py#L98-L99,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 546051181,MDU6SXNzdWU1NDYwNTExODE=,16,Exception running first command: IndexError: list index out of range,15092,jayvdb,closed,0,,,,,4,2020-01-07T03:01:58Z,2020-04-14T18:37:21Z,2020-04-14T18:37:21Z,NONE,,"Exception running first command without an existing db or auth. ```py > mkdir ~/.github/coala > /usr/bin/github-to-sqlite repos ~/.github/coala coala Traceback (most recent call last): File ""/usr/bin/github-to-sqlite"", line 11, in load_entry_point('github-to-sqlite==0.6', 'console_scripts', 'github-to-sqlite')() File ""/usr/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/usr/lib/python3.7/site-packages/github_to_sqlite/cli.py"", line 163, in repos utils.save_repo(db, repo) File ""/usr/lib/python3.7/site-packages/github_to_sqlite/utils.py"", line 120, in save_repo to_save[""owner""] = save_user(db, to_save[""owner""]) File ""/usr/lib/python3.7/site-packages/github_to_sqlite/utils.py"", line 61, in save_user return db[""users""].upsert(to_save, pk=""id"", alter=True).last_pk File ""/usr/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1135, in upsert extracts=extracts, File ""/usr/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1162, in upsert_all upsert=True, File ""/usr/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1105, in insert_all row = list(self.rows_where(""rowid = ?"", [self.last_rowid]))[0] IndexError: list index out of range ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612287234,MDU6SXNzdWU2MTIyODcyMzQ=,16,"Import machine-learning detected labels (dog, llama etc) from Apple Photos",9599,simonw,open,0,,,,,13,2020-05-05T02:45:43Z,2020-05-05T05:38:16Z,,MEMBER,,"Follow-on from #1. Apple Photos runs some very sophisticated machine learning on-device to figure out if photos are of dogs, llamas and so on. I really want to extract those labels out into my own database.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/16/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 1, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 694493566,MDU6SXNzdWU2OTQ0OTM1NjY=,16,Timeline view,9599,simonw,open,0,,,,,3,2020-09-06T19:13:58Z,2020-09-21T02:42:29Z,,MEMBER,,Ability to browse (and facet) by date.,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 830901133,MDExOlB1bGxSZXF1ZXN0NTkyMzY0MjU1,16,"Add a fallback ID, print if no ID found",1234956,n8henrie,open,0,,,,,0,2021-03-13T13:38:29Z,2021-03-13T14:44:04Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/healthkit-to-sqlite/pulls/16,"Fixes https://github.com/dogsheep/healthkit-to-sqlite/issues/14 ",197882382,healthkit-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1943259395,I_kwDOEhK-wc5z08kD,16, time data '2014-11-21T11:44:12.000Z' does not match format '%Y%m%dT%H%M%SZ',3746270,linonetwo,open,0,,,,,0,2023-10-14T13:24:39Z,2023-10-14T13:24:39Z,,NONE,," ``` evernote-to-sqlite enex evernote.db ./我的笔记.enex Importing from ENEX [#####-------------------------------] 14% Traceback (most recent call last): File ""/usr/local/bin/evernote-to-sqlite"", line 8, in sys.exit(cli()) ^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1157, in __call__ return self.main(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1078, in main rv = self.invoke(ctx) ^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1688, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 1434, in invoke return ctx.invoke(self.callback, **ctx.params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/click/core.py"", line 783, in invoke return __callback(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File ""/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/utils.py"", line 46, in save_note ""created"": convert_datetime(created), ^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/lib/python3.11/site-packages/evernote_to_sqlite/utils.py"", line 111, in convert_datetime return datetime.datetime.strptime(s, ""%Y%m%dT%H%M%SZ"").isoformat() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/_strptime.py"", line 568, in _strptime_datetime tt, fraction, gmtoff_fraction = _strptime(data_string, format) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/Cellar/python@3.11/3.11.5/Frameworks/Python.framework/Versions/3.11/lib/python3.11/_strptime.py"", line 349, in _strptime raise ValueError(""time data %r does not match format %r"" % ValueError: time data '2014-11-21T11:44:12.000Z' does not match format '%Y%m%dT%H%M%SZ' ``` enex is exported by evernote mac client ",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 267732005,MDU6SXNzdWUyNjc3MzIwMDU=,17,"In development mode, should still pick up new .db files",9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-23T16:22:40Z,2017-10-24T02:26:48Z,2017-10-24T02:26:47Z,OWNER,,Follow on from #11 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 413868452,MDU6SXNzdWU0MTM4Njg0NTI=,17,Improve and document foreign_keys=... argument to insert/create/etc,9599,simonw,closed,0,,,,,7,2019-02-24T21:09:11Z,2019-02-24T23:45:48Z,2019-02-24T23:45:48Z,OWNER,,"The `foreign_keys=` argument to `table.insert_all()` and friends can be used to specify foreign key relationships that should be created. It is not yet documented. It also requires you to specify the SQLite type of each column, even though this can be detected by introspecting the referenced table: cols = [c for c in self.db[other_table].columns if c.name == other_column] cols[0].type Relates to #2 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 505674949,MDU6SXNzdWU1MDU2NzQ5NDk=,17,import command should empty all archive-* tables first,9599,simonw,closed,0,,,,,2,2019-10-11T06:58:43Z,2019-10-11T15:40:08Z,2019-10-11T15:40:08Z,MEMBER,,Can have a CLI option for NOT doing that.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 578883725,MDU6SXNzdWU1Nzg4ODM3MjU=,17,Command for importing commits,9599,simonw,closed,0,,,,,2,2020-03-10T21:55:12Z,2020-03-11T02:47:37Z,2020-03-11T02:47:37Z,MEMBER,,Using this API: https://api.github.com/repos/dogsheep/github-to-sqlite/commits,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612860531,MDU6SXNzdWU2MTI4NjA1MzE=,17,Only install osxphotos if running on macOS,9599,simonw,closed,0,,,,,3,2020-05-05T20:03:26Z,2020-05-05T20:20:05Z,2020-05-05T20:11:23Z,MEMBER,,The build is broken right now because you can't `pip install osxphotos` on Ubuntu.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 694500679,MDU6SXNzdWU2OTQ1MDA2Nzk=,17,"Rename ""table"" to ""type""",9599,simonw,closed,0,,,,,2,2020-09-06T19:34:41Z,2020-09-09T03:03:22Z,2020-09-09T03:03:22Z,MEMBER,,"I think ""table"" is the wrong name for the concept I'm using it for here. Two reasons: firstly, `table` is a reserved word in SQLite. More importantly, it turns out there's not a direct mapping from tables to types of search result. In particular, for GitHub I ended up having two different ""tables"" of repositories - one for repos created by me, another for repos that I have starred.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 836063389,MDU6SXNzdWU4MzYwNjMzODk=,17,Datetime columns are not properly formatted to be recognizes as datetime,1234956,n8henrie,open,0,,,,,0,2021-03-19T14:33:04Z,2021-03-19T14:33:04Z,,NONE,," Currently, the datetimes are formatted in a way that is not recognized by datasette-vega for plotting with a `Date/time` type for the axis. For example, if you have datasette running locally with `datasette-vega` installed and have a database that includes resting heart rate: ``` http://localhost:8001/healthkit/rRestingHeartRate#g.mark=line&g.x_column=startDate&g.x_type=temporal&g.y_column=value&g.y_type=quantitative ``` The plot is blank unless you choose `Label` as the type for the date data. The `startDate` (and `creationDate` and `endDate`) columns appear like: `2019-11-14 18:22:18 -0700` If instead the format for this column is changed slightly: `2019-11-14T18:22:18-07:00` they are recognized as proper dates and the charting works as expected. I have a PR that addresses this issue, will submit shortly.",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 267739593,MDU6SXNzdWUyNjc3Mzk1OTM=,18,See if I can get a websockets interface working,9599,simonw,closed,0,,,,,1,2017-10-23T16:46:41Z,2021-01-04T20:05:52Z,2021-01-04T20:05:48Z,OWNER,,"Since I am already running on Sanic, how hard would it be to add a websocket ebdpoint that lets you talk to sqlite interactively? Could this be used to efficiently support streaming in answers to giant queries?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 413871266,MDU6SXNzdWU0MTM4NzEyNjY=,18,.insert/.upsert/.insert_all/.upsert_all should add missing columns,9599,simonw,closed,0,,,4348046,1.0,2,2019-02-24T21:36:11Z,2019-05-25T00:42:11Z,2019-05-25T00:42:11Z,OWNER,,"This is a larger change, but it would be incredibly useful: if you attempt to insert or update a document with a field that does not currently exist in the underlying table, sqlite-utils should add the appropriate column for you.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 505928530,MDU6SXNzdWU1MDU5Mjg1MzA=,18,Command to import home-timeline,9599,simonw,closed,0,,,,,4,2019-10-11T15:47:54Z,2019-10-11T16:51:33Z,2019-10-11T16:51:12Z,MEMBER,,"Feature request: https://twitter.com/johankj/status/1182563563136868352 > Would it be possible to save all tweets in my timeline from the last X days? I would love to see how big a percentage some users are of my daily timeline as a metric on whether I should unfollow them/move them to a list.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585411547,MDU6SXNzdWU1ODU0MTE1NDc=,18,Commits in GitHub API can have null author,9599,simonw,closed,0,,,5225818,1.0,8,2020-03-21T02:20:56Z,2020-03-23T20:44:49Z,2020-03-23T20:44:26Z,MEMBER,,"``` Traceback (most recent call last): File ""/home/ubuntu/datasette-venv/bin/github-to-sqlite"", line 8, in sys.exit(cli()) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/github_to_sqlite/cli.py"", line 235, in commits utils.save_commits(db, commits, repo_full[""id""]) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 290, in save_commits commit_to_insert[""author""] = save_user(db, commit[""author""]) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 54, in save_user for key, value in user.items() AttributeError: 'NoneType' object has no attribute 'items' ``` Got this running the `commits` command from cron.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612860758,MDU6SXNzdWU2MTI4NjA3NTg=,18,Switch CI solution to GitHub Actions with a macOS runner,9599,simonw,open,0,,,,,1,2020-05-05T20:03:50Z,2020-05-05T23:49:18Z,,MEMBER,,Refs #17.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 695553522,MDU6SXNzdWU2OTU1NTM1MjI=,18,Deleted records stay in the search index,9599,simonw,open,0,,,,,2,2020-09-08T05:14:23Z,2020-09-08T05:15:51Z,,MEMBER,,"Here's why: https://github.com/dogsheep/dogsheep-beta/blob/24f7898d41a39218058f174c75ba62f7c0fcfff6/dogsheep_beta/utils.py#L44-L53 That should probably do `DELETE FROM index1.search_index WHERE [table] = ?` first.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 836064851,MDExOlB1bGxSZXF1ZXN0NTk2NjI3Nzgw,18,Add datetime parsing,1234956,n8henrie,open,0,,,,,0,2021-03-19T14:34:22Z,2021-03-19T14:34:22Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/healthkit-to-sqlite/pulls/18,"Parses the datetime columns so they are subsequently properly recognized as datetime. Fixes https://github.com/dogsheep/healthkit-to-sqlite/issues/17 ",197882382,healthkit-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 267741262,MDU6SXNzdWUyNjc3NDEyNjI=,19,Efficient url for downloading the raw database file,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-23T16:52:17Z,2017-10-25T15:21:16Z,2017-10-25T15:19:37Z,OWNER,,Use Sanic support for steaming large files http://sanic.readthedocs.io/en/latest/sanic/response.html#file-streaming,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 432217625,MDU6SXNzdWU0MzIyMTc2MjU=,19,Incorrect help text for enable-fts command,9599,simonw,closed,0,,,4348046,1.0,0,2019-04-11T19:46:44Z,2019-05-25T00:44:31Z,2019-05-25T00:44:31Z,OWNER,,"I clearly copied-and-pasted this from the `tables` command without updating it: https://github.com/simonw/sqlite-utils/blob/0b1af42ead3b3902347951180b3364ce1942da6e/sqlite_utils/cli.py#L216-L222",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506087267,MDU6SXNzdWU1MDYwODcyNjc=,19,since_id support for home-timeline,9599,simonw,closed,0,,,,,3,2019-10-11T22:48:24Z,2019-10-16T19:13:06Z,2019-10-16T19:12:46Z,MEMBER,,Currently every time you run `home-timeline` we pull all 800 available tweets. We should offer to support `since_id` (which can be provided or can be pulled directly from the database) in order to work more efficiently if this command is executed e.g. on a cron.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585850715,MDU6SXNzdWU1ODU4NTA3MTU=,19,"Enable full-text search for more stuff (like commits, issues and issue_comments)",9599,simonw,closed,0,,,5225818,1.0,2,2020-03-23T00:19:56Z,2020-03-23T19:06:39Z,2020-03-23T19:06:39Z,MEMBER,,Currently FTS is only enabled for repos and releases.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 613002220,MDU6SXNzdWU2MTMwMDIyMjA=,19,apple-photos command should work even if upload has not run,9599,simonw,closed,0,,,,,1,2020-05-06T02:02:25Z,2020-05-19T20:59:59Z,2020-05-19T20:59:59Z,MEMBER,,"I want people to be able to query their Apple Photos metadata without having to first run `upload` to upload all of their files to their own S3 bucket. To do this I can have `apple-photos` calculate SHA256 hashes of each photo if the `uploads` table does not yet exist (or does not contain that photo).",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 695556681,MDU6SXNzdWU2OTU1NTY2ODE=,19,Figure out incremental re-indexing,9599,simonw,open,0,,,,,2,2020-09-08T05:23:31Z,2020-09-08T05:27:07Z,,MEMBER,,As tables get bigger reindexing everything on a schedule (essentially recreating the entire index from scratch) will start to become a performance bottleneck.,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 975158266,MDU6SXNzdWU5NzUxNTgyNjY=,19,table activity_summary has no column named appleMoveTime,9599,simonw,closed,0,,,,,0,2021-08-20T00:46:44Z,2021-08-20T00:54:34Z,2021-08-20T00:54:34Z,MEMBER,,"Got this error today against a fresh export: table activity_summary has no column named appleMoveTime ",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267759136,MDU6SXNzdWUyNjc3NTkxMzY=,20,Config file with support for defining canned queries,9599,simonw,closed,0,9599,simonw,2949431,Custom templates edition,9,2017-10-23T17:53:06Z,2017-12-05T19:05:35Z,2017-12-05T17:44:09Z,OWNER,,"Probably using YAML because then we get support for multiline strings: bats: db: bats.sqlite3 name: ""Bat sightings"" queries: specific_row: | select * from Bats where a = 1; ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 432727685,MDU6SXNzdWU0MzI3Mjc2ODU=,20,JSON column values get extraneously quoted ,649467,mhalle,closed,0,,,4348046,1.0,1,2019-04-12T20:15:30Z,2019-05-25T00:57:19Z,2019-05-25T00:57:19Z,NONE,,"If the input to `sqlite-utils insert` includes a column that is a JSON array or object, `sqlite-utils query` will introduce an extra level of quoting on output: ``` # echo '[{""key"": [""one"", ""two"", ""three""]}]' | sqlite-utils insert t.db t - # sqlite-utils t.db 'select * from t' [{""key"": ""[\""one\"", \""two\"", \""three\""]""}] # sqlite3 t.db 'select * from t' [""one"", ""two"", ""three""] ``` This might require an imperfect solution, since sqlite3 doesn't have a JSON type. Perhaps fields that start with `[""` or `{""` and end with `""]` or `""}` could be detected, with a flag to turn off that behavior for weird text fields (or vice versa).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506268945,MDU6SXNzdWU1MDYyNjg5NDU=,20,--since support for various commands for refresh-by-cron,9599,simonw,closed,0,,,,,3,2019-10-13T03:40:46Z,2019-10-21T03:32:04Z,2019-10-16T19:26:11Z,MEMBER,,"I want to run a cron that updates my Twitter database every X minutes. It should be able to retrieve the following without needing to paginate through everything: - [x] Tweets I have tweeted - [x] My home timeline (see #19) - [x] Tweets I have favourited It would be nice if this could be standardized across all commands as a `--since` option.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586454513,MDU6SXNzdWU1ODY0NTQ1MTM=,20,Upgrade to sqlite-utils 2.x,9599,simonw,closed,0,,,5225818,1.0,0,2020-03-23T19:17:58Z,2020-03-23T19:22:52Z,2020-03-23T19:22:52Z,MEMBER,,,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 613006393,MDU6SXNzdWU2MTMwMDYzOTM=,20,Ability to serve thumbnailed Apple Photo from its place on disk,9599,simonw,closed,0,,,,,10,2020-05-06T02:17:50Z,2020-05-25T20:14:22Z,2020-05-25T20:09:41Z,MEMBER,,"A custom Datasette plugin that can be run locally on a Mac laptop which knows how to serve photos such that they can be seen in the browser. _Originally posted by @simonw in https://github.com/dogsheep/photos-to-sqlite/issues/19#issuecomment-624406285_",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 697162939,MDU6SXNzdWU2OTcxNjI5Mzk=,20,Add more tags so people can find your project.,7902810,ran88dom99,open,0,,,,,0,2020-09-09T21:14:09Z,2020-09-09T21:14:09Z,,NONE,,"quantified-self habit-tracking google-fit time-tracking wearables quantifiedself for example",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/20/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 1, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 975166271,MDU6SXNzdWU5NzUxNjYyNzE=,20,Add index on workout_points.date,9599,simonw,open,0,,,,,2,2021-08-20T01:08:04Z,2021-08-20T01:12:48Z,,MEMBER,,"Sorting that by date makes sense for seeing most recent points, and my DB has 2.5m points in so it's an expensive sort!",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 267769034,MDU6SXNzdWUyNjc3NjkwMzQ=,21,Use Sanic configuration mechanism ,9599,simonw,closed,0,,,2859414,v1 stretch goals,1,2017-10-23T18:25:14Z,2017-11-10T20:45:42Z,2017-11-10T20:45:42Z,OWNER,,http://sanic.readthedocs.io/en/latest/sanic/config.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 448391492,MDU6SXNzdWU0NDgzOTE0OTI=,21,Option to ignore inserts if primary key exists already,9599,simonw,closed,0,,,,,3,2019-05-25T00:17:12Z,2019-05-29T05:09:01Z,2019-05-29T04:18:26Z,OWNER,,"> I've just noticed that SQLite lets you IGNORE inserts that collide with a pre-existing key. This can be quite handy if you have a dataset that keeps changing in part, and you don't want to upsert and replace pre-existing PK rows but you do want to ignore collisions to existing PK rows. > > Do `sqlite_utils` support such (cavalier!) behaviour? _Originally posted by @psychemedia in https://github.com/simonw/sqlite-utils/issues/18#issuecomment-480621924_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506432572,MDU6SXNzdWU1MDY0MzI1NzI=,21,Fix & escapes in tweet text,9599,simonw,closed,0,,,,,1,2019-10-14T03:37:28Z,2019-10-15T18:48:16Z,2019-10-15T18:48:16Z,MEMBER,," Shouldn't be storing `&` here.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586561727,MDU6SXNzdWU1ODY1NjE3Mjc=,21,Turn GitHub API errors into exceptions,9599,simonw,closed,0,,,5225818,1.0,2,2020-03-23T22:37:24Z,2020-03-23T23:48:23Z,2020-03-23T23:48:22Z,MEMBER,,"This would have really helped in debugging the mess in #13. Running with this `auth.json` is a useful demo: ```json {""github_personal_token"": """"} ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 615474990,MDU6SXNzdWU2MTU0NzQ5OTA=,21,bpylist.archiver.CircularReference: archive has a cycle with uid(13),9599,simonw,closed,0,,,,,11,2020-05-10T20:58:06Z,2020-12-19T07:44:49Z,2020-05-10T21:57:13Z,MEMBER,,"``` % python -i $(which photos-to-sqlite) apple-photos photos.db Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/photoinfo.py"", line 611, in place return self._place # pylint: disable=access-member-before-definition AttributeError: 'PhotoInfo' object has no attribute '_place' During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/bin/photos-to-sqlite"", line 11, in load_entry_point('photos-to-sqlite', 'console_scripts', 'photos-to-sqlite')() File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/photos-to-sqlite/photos_to_sqlite/cli.py"", line 249, in apple_photos photo_row = osxphoto_to_row(sha256, photo) File ""/Users/simon/Dropbox/Development/photos-to-sqlite/photos_to_sqlite/utils.py"", line 91, in osxphoto_to_row place = photo.place File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/photoinfo.py"", line 614, in place self._place = PlaceInfo5(self._info[""reverse_geolocation""]) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py"", line 505, in __init__ self._plrevgeoloc = archiver.unarchive(revgeoloc_bplist) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 16, in unarchive return Unarchive(plist).top_object() File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 256, in top_object return self.decode_object(self.top_uid) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 247, in decode_object obj = klass.decode_archive(ArchivedObject(raw_obj, self)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py"", line 126, in decode_archive mapItem = archive.decode(""mapItem"") File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 140, in decode return self._unarchiver.decode_key(self._object, key) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 216, in decode_key return self.decode_object(val) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 247, in decode_object obj = klass.decode_archive(ArchivedObject(raw_obj, self)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py"", line 180, in decode_archive sortedPlaceInfos = archive.decode(""sortedPlaceInfos"") File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 140, in decode return self._unarchiver.decode_key(self._object, key) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 216, in decode_key return self.decode_object(val) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 247, in decode_object obj = klass.decode_archive(ArchivedObject(raw_obj, self)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 112, in decode_archive return [archive._decode_index(index) for index in uids] File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 112, in return [archive._decode_index(index) for index in uids] File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 137, in _decode_index return self._unarchiver.decode_object(index) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 247, in decode_object obj = klass.decode_archive(ArchivedObject(raw_obj, self)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py"", line 217, in decode_archive placeType = archive.decode(""placeType"") File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 140, in decode return self._unarchiver.decode_key(self._object, key) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 216, in decode_key return self.decode_object(val) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 227, in decode_object raise CircularReference(index) bpylist.archiver.CircularReference: archive has a cycle with uid(13) ``` In the debugger I traced this back to: ``` 178 @staticmethod 179 def decode_archive(archive): 180 -> sortedPlaceInfos = archive.decode(""sortedPlaceInfos"") 181 finalPlaceInfos = archive.decode(""finalPlaceInfos"") 182 return PLRevGeoMapItem(sortedPlaceInfos, finalPlaceInfos) ```",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 703951918,MDU6SXNzdWU3MDM5NTE5MTg=,21,Option to sort search results by date,9599,simonw,closed,0,,,,,0,2020-09-17T22:32:39Z,2020-09-17T22:55:35Z,2020-09-17T22:55:35Z,MEMBER,,"Sometimes I want to sort by date, not by relevance.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 977128935,MDU6SXNzdWU5NzcxMjg5MzU=,21,Duplicate Column,32016596,FabianHertwig,open,0,,,,,1,2021-08-23T15:00:44Z,2021-08-23T17:00:59Z,,NONE,,"Hey, thank you for this repo! When I try to convert my export, I get a multiple column error. Here is the stack trace: ```sh (.venv) (base) computer:bodyweight_app user$ healthkit-to-sqlite ./data/Health_export.zip ./data/healthkit.db Importing from HealthKit [###############################-----] 87% 00:00:22 Traceback (most recent call last): File ""/MyProject/.venv/bin/healthkit-to-sqlite"", line 10, in sys.exit(cli()) File ""/MyProject/.venv/lib/python3.7/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/MyProject/.venv/lib/python3.7/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/MyProject/.venv/lib/python3.7/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/MyProject/.venv/lib/python3.7/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/MyProject/.venv/lib/python3.7/site-packages/healthkit_to_sqlite/cli.py"", line 57, in cli convert_xml_to_sqlite(fp, db, progress_callback=bar.update, zipfile=zf) File ""/MyProject/.venv/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py"", line 41, in convert_xml_to_sqlite write_records(records, db) File ""/MyProject/.venv/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py"", line 146, in write_records batch_size=50, File ""/MyProject/.venv/lib/python3.7/site-packages/sqlite_utils/db.py"", line 2579, in insert_all extracts=extracts, File ""/MyProject/.venv/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1246, in create extracts=extracts, File ""/MyProject/.venv/lib/python3.7/site-packages/sqlite_utils/db.py"", line 767, in create_table self.execute(sql) File ""/MyProject/.venv/lib/python3.7/site-packages/sqlite_utils/db.py"", line 421, in execute return self.conn.execute(sql) sqlite3.OperationalError: duplicate column name: metadata_Meal ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/21/reactions"", ""total_count"": 5, ""+1"": 5, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 267769431,MDU6SXNzdWUyNjc3Njk0MzE=,22,Refactor to use class based views ,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-23T18:26:22Z,2019-05-27T20:05:56Z,2017-10-24T02:25:53Z,OWNER,,http://sanic.readthedocs.io/en/latest/sanic/class_based_views.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 448395665,MDU6SXNzdWU0NDgzOTU2NjU=,22,Release notes for 1.0,9599,simonw,closed,0,,,4348046,1.0,2,2019-05-25T00:58:03Z,2019-05-25T01:18:27Z,2019-05-25T01:06:52Z,OWNER,,https://github.com/simonw/sqlite-utils/compare/0.14...251e473,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508024032,MDU6SXNzdWU1MDgwMjQwMzI=,22,Ability to import from uncompressed archive or from specific files,9599,simonw,closed,0,,,,,0,2019-10-16T18:31:57Z,2019-10-16T18:53:36Z,2019-10-16T18:53:36Z,MEMBER,,"Currently you can only import like this: $ twitter-to-sqlite import path-to-twitter.zip It would be useful if you could import from a folder that was decompressed from that zip: $ twitter-to-sqlite import path-to-twitter/ AND from individual files within that folder - since that would allow you to e.g. selectively import certain files: $ twitter-to-sqlite import path-to-twitter/favorites.js path-to-twitter/tweets.js",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586567379,MDU6SXNzdWU1ODY1NjczNzk=,22,Handle empty git repositories,9599,simonw,closed,0,,,,,0,2020-03-23T22:49:48Z,2020-03-23T23:13:11Z,2020-03-23T23:13:11Z,MEMBER,,"Got this error: ``` github_to_sqlite.utils.GitHubError: {'message': 'Git Repository is empty.', 'documentation_url': 'https://developer.github.com/v3/repos/commits/#list-commits-on-a-repository'} ``` From https://api.github.com/repos/dogsheep/beta/commits",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 615626118,MDU6SXNzdWU2MTU2MjYxMTg=,22,Try out ExifReader,9599,simonw,open,0,,,,,4,2020-05-11T06:32:13Z,2020-05-14T05:59:53Z,,MEMBER,,"https://pypi.org/project/ExifReader/ New fork that should be able to handle EXIF in HEIC files. Forked here: https://github.com/ianare/exif-py/issues/102#issuecomment-626376522 Refs #3 ",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 703962917,MDU6SXNzdWU3MDM5NjI5MTc=,22,Bug: UI says sorted by relevance in timeline view,9599,simonw,closed,0,,,,,0,2020-09-17T23:02:07Z,2020-09-17T23:13:14Z,2020-09-17T23:13:14Z,MEMBER,,"In regular timeline view sort defaults to newest, not relevance - so this UI is incorrect: ",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 978086284,MDExOlB1bGxSZXF1ZXN0NzE4NzM0MTkx,22,Make sure that case-insensitive column names are unique,32016596,FabianHertwig,open,0,,,,,1,2021-08-24T13:13:38Z,2021-08-24T13:26:20Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/healthkit-to-sqlite/pulls/22,"This closes #21. When there are metadata entries with the same case insensitive string, then there is an error when trying to create a new column for that metadata entry in the database table, because a column with that case insensitive name already exists. ```xml ``` The code added in this PR checks if a key already exists in a record and if so adds a number at its end. The resulting column names look like the example below then. Interestingly, the column names viewed with Datasette are not case insensitive. ```text startDate, endDate, value, unit, sourceName, sourceVersion, creationDate, metadata_meal, metadata_Meal_2, metadata_Mahlzeit ``` ",197882382,healthkit-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 267788884,MDU6SXNzdWUyNjc3ODg4ODQ=,23,Support Django-style filters in querystring arguments,9599,simonw,closed,0,,,2857392,Ship first public release,6,2017-10-23T19:29:42Z,2017-10-25T04:23:03Z,2017-10-25T04:23:02Z,OWNER,,"e.g /database/table?name__contains=Simon&age__gte=4 Same format as Django: double underscore as the split. If you need to match against a column that happens to contain a double underscore in its official name, do this: /database/table?weird__column__exact=Simon __exact is the default operation if none is supplied.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 449565204,MDU6SXNzdWU0NDk1NjUyMDQ=,23,Syntactic sugar for creating m2m records,9599,simonw,closed,0,,,,,10,2019-05-29T02:17:48Z,2019-08-04T03:54:58Z,2019-08-04T03:37:34Z,OWNER,,Python library only. What would be a syntactically pleasant way of creating a m2m record?,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508190730,MDU6SXNzdWU1MDgxOTA3MzA=,23,Extremely simple migration system,9599,simonw,closed,0,,,,,2,2019-10-17T02:13:57Z,2019-10-17T16:57:17Z,2019-10-17T16:57:17Z,MEMBER,,"Needed for #12. This is going to be an incredibly simple version of the Django migration system. * A `migrations` table, keeping track of which migrations were applied (and when) * A `migrate()` function which applies any pending migrations * A `MIGRATIONS` constant which is a list of functions to be applied The function names will be detected and used as the names of the migrations. Every time you run the CLI tool it will call the `migrate()` function before doing anything else. Needs to take into account that there might be no tables at all. As such, migration functions should sanity check that the tables they are going to work on actually exist.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586595839,MDU6SXNzdWU1ODY1OTU4Mzk=,23,Release 1.0,9599,simonw,closed,0,,,5225818,1.0,1,2020-03-24T00:03:55Z,2020-03-24T00:15:50Z,2020-03-24T00:15:50Z,MEMBER,,Need to compile release notes.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 621280529,MDU6SXNzdWU2MjEyODA1Mjk=,23,create-subset command for creating a publishable subset of a photos database,9599,simonw,closed,0,,,,,1,2020-05-19T20:58:20Z,2020-05-19T22:32:48Z,2020-05-19T22:32:37Z,MEMBER,,"I want to share a subset of my photos, without sharing everything. Idea: $ photos-to-sqlite create-subset photos.db public.db ""select sha256 from ... where ..."" So the command takes a SQL query that returns sha256 hashes, then creates a new file called `public.db` containing just the data corresponding to those photos.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 703970713,MDU6SXNzdWU3MDM5NzA3MTM=,23,Sort option should persist between multiple searches,9599,simonw,closed,0,,,,,0,2020-09-17T23:21:26Z,2020-09-18T22:39:12Z,2020-09-18T22:39:12Z,MEMBER,,Following #21 ,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1515717718,PR_kwDOC8tyDs5Gc-VH,23,Include workout statistics,2129,badboy,open,0,,,,,0,2023-01-01T17:29:57Z,2023-01-01T17:29:57Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/healthkit-to-sqlite/pulls/23,"Not sure when this changed (iOS 16 maybe?), but the `WorkoutStatistics` now has a whole bunch of information about workouts, e.g. for runs it contains the distance (as a `` element). Adding it as another column at leat allows me to pull these out (using SQLite's JSON support). I'm running with this patch on my own data now.",197882382,healthkit-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 267828746,MDU6SXNzdWUyNjc4Mjg3NDY=,24,Implement full URL design,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-10-23T21:49:05Z,2017-10-24T14:12:00Z,2017-10-24T14:12:00Z,OWNER,,"Full URL design: /database-name /database-name.json /database-name-7sha256 /database-name-7sha256.json /database-name/table-name /database-name/table-name.json /database-name-7sha256/table-name /database-name-7sha256/table-name.json /database-name-7sha256/table-name/compound-pk /database-name-7sha256/table-name/compound-pk.json ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 449818897,MDU6SXNzdWU0NDk4MTg4OTc=,24,Additional Column Constraints?,98555,IgnoredAmbience,closed,0,,,,,6,2019-05-29T13:47:03Z,2019-06-13T06:47:17Z,2019-06-13T06:30:26Z,NONE,,"I'm looking to import data from XML with a pre-defined schema that maps fairly closely to a relational database. In particular, it has explicit annotations for when fields are required, optional, or when a default value should be inferred. Would there be value in adding the ability to define `NOT NULL` and `DEFAULT` column constraints to sqlite-utils?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508553387,MDExOlB1bGxSZXF1ZXN0MzI5MzI0MzY4,24,Tweet source extraction and new migration system,9599,simonw,closed,0,,,,,0,2019-10-17T15:24:56Z,2019-10-17T15:49:29Z,2019-10-17T15:49:24Z,MEMBER,dogsheep/twitter-to-sqlite/pulls/24,Closes #12 and #23,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 599776345,MDU6SXNzdWU1OTk3NzYzNDU=,24,Feature idea: github-to-sqlite everything ...,9599,simonw,open,0,,,,,0,2020-04-14T18:34:00Z,2020-04-14T18:34:00Z,,MEMBER,,"At the moment if you want to pull all your repos, issues, issues comments etc you have to do it with a sequence of separate commands. Consider adding a `everything` or `all` command which fetches everything that the tool knows how to fetch, and is designed to be run on a cron in a way that fetches just new stuff each time.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/24/reactions"", ""total_count"": 7, ""+1"": 7, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 621323348,MDU6SXNzdWU2MjEzMjMzNDg=,24,Configurable URL for images,9599,simonw,open,0,,,,,1,2020-05-19T22:25:56Z,2020-05-20T06:00:29Z,,MEMBER,,"This is hard-coded at the moment, which is bad: https://github.com/dogsheep/photos-to-sqlite/blob/d5d69b9019703c47bc251444838578dd752801e2/photos_to_sqlite/cli.py#L269-L272",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 703970814,MDU6SXNzdWU3MDM5NzA4MTQ=,24,"the JSON object must be str, bytes or bytearray, not 'Undefined'",9599,simonw,closed,0,,,,,8,2020-09-17T23:21:41Z,2020-09-18T22:33:32Z,2020-09-18T22:33:32Z,MEMBER,,Got this on a search results page.,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1515883470,I_kwDOC8tyDs5aWovO,24,DOC: xml.etree.ElementTree.ParseError due to healthkit version 12 ,6231413,mmngreco,open,0,,,,,2,2023-01-01T23:00:38Z,2023-03-30T10:17:31Z,,NONE,,"Hi @simonw I hope you find this issue ok, the idea is provide some documentation to other users like me about how to solve this problem and save some time. Following the instructions from the `README.md` I've faced this error: ```bash (venv) mgreco@pop-os apple-health master* (23:44|0s) $ healthkit-to-sqlite apple_health_export/export.xml healthkit.db --xml Importing from HealthKit [------------------------------------] 0% Traceback (most recent call last): File ""/home/mgreco/github/mmngreco/apple-health/venv/bin/healthkit-to-sqlite"", line 33, in sys.exit(load_entry_point('healthkit-to-sqlite', 'console_scripts', 'healthkit-to-sqlite')()) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/cli.py"", line 57, in cli convert_xml_to_sqlite(fp, db, progress_callback=bar.update, zipfile=zf) File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/utils.py"", line 25, in convert_xml_to_sqlite for tag, el in find_all_tags( File ""/home/mgreco/github/mmngreco/apple-health/.deps/healthkit-to-sqlite/healthkit_to_sqlite/utils.py"", line 12, in find_all_tags for event, el in parser.read_events(): File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/xml/etree/ElementTree.py"", line 1324, in read_events raise event File ""/home/mgreco/github/mmngreco/apple-health/venv/lib/python3.10/xml/etree/ElementTree.py"", line 1296, in feed self._parser.feed(data) xml.etree.ElementTree.ParseError: syntax error: line 156, column 0 ``` So, after debugging and searching on internet I found this useful link: https://discussions.apple.com/thread/254202523 (etresoft, the real hero). Which basically says that the xml given by the health app (healthkit version 12) has some bugs but fortunately, they can be solved with a couple of commads: 1. Uncompress the zip and move the new folder where `export.xml` is. 1. Create a `patch.txt` with the following content ```diff --- export.xml 2022-09-18 15:17:09.000000000 -0400 +++ export-fixed.xml 2022-09-18 16:37:08.000000000 -0400 @@ -15,6 +15,7 @@ HKCharacteristicTypeIdentifierBiologicalSex CDATA #REQUIRED HKCharacteristicTypeIdentifierBloodType CDATA #REQUIRED HKCharacteristicTypeIdentifierFitzpatrickSkinType CDATA #REQUIRED + HKCharacteristicTypeIdentifierCardioFitnessMedicationsUse CDATA #IMPLIED > - + - + - device CDATA #IMPLIED - - -> ]> ``` 1. Apply the path with the command: `patch < patch.txt` 1. Fix endDates with the command `sed 's/startDate/endDate/2' export.xml > export-fixed.xml` 1. Try again `healthkit-to-sqlite export-fixed.xml healthkit.db --xml`",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 267857622,MDU6SXNzdWUyNjc4NTc2MjI=,25,Endpoint that returns SQL ready to be piped into DB,9599,simonw,closed,0,,,,,2,2017-10-24T00:19:26Z,2017-11-15T05:11:12Z,2017-11-15T05:11:11Z,OWNER,,It would be cool if I could figure out a way to generate both the create table statements and the inserts for an individual table or the entire database and then stream them down to the client.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 449848803,MDU6SXNzdWU0NDk4NDg4MDM=,25,"Allow .insert(..., foreign_keys=()) to auto-detect table and primary key",9599,simonw,closed,0,,,,,4,2019-05-29T14:39:22Z,2019-06-13T05:32:32Z,2019-06-13T05:32:32Z,OWNER,,"The `foreign_keys=` argument currently takes a list of triples: ```python db[""usages""].insert_all( usages_to_insert, foreign_keys=( (""line_id"", ""lines"", ""id""), (""definition_id"", ""definitions"", ""id""), ), ) ``` As of #16 we have a mechanism for detecting the primary key column (the third item in this triple) - we should use that here too, so foreign keys can be optionally defined as a list of pairs.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508578780,MDU6SXNzdWU1MDg1Nzg3ODA=,25,Ensure migrations don't accidentally create foreign key twice,9599,simonw,closed,0,,,,,2,2019-10-17T16:08:50Z,2019-10-17T16:56:47Z,2019-10-17T16:56:47Z,MEMBER,,"Is it possible for these lines to run against a database table that already has these foreign keys? https://github.com/dogsheep/twitter-to-sqlite/blob/c9295233f219c446fa2085cace987067488a31b9/twitter_to_sqlite/migrations.py#L21-L22",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601265023,MDU6SXNzdWU2MDEyNjUwMjM=,25,Improvements to demo instance,9599,simonw,closed,0,,,,,1,2020-04-16T17:26:55Z,2020-04-16T18:07:12Z,2020-04-16T18:07:12Z,MEMBER,,- [x] Demo should pull issue-comments as well,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 621332242,MDU6SXNzdWU2MjEzMzIyNDI=,25,Create a public demo,9599,simonw,closed,0,,,,,5,2020-05-19T22:47:20Z,2020-05-21T22:26:16Z,2020-05-20T05:54:18Z,MEMBER,,"So I can show people what this does, using some of my photos.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 704685890,MDU6SXNzdWU3MDQ2ODU4OTA=,25,template_debug mechanism,9599,simonw,closed,0,,,,,2,2020-09-18T22:11:09Z,2020-09-18T22:12:21Z,2020-09-18T22:12:03Z,MEMBER,,"> I'd prefer it if errors in these template fragments were displayed as errors inline where the fragment should have been inserted, rather than 500ing the whole page - especially since the template fragments are user-provided and could have all kinds of odd errors in them which should be as easy to debug as possible. _Originally posted by @simonw in https://github.com/dogsheep/dogsheep-beta/issues/24#issuecomment-694554584_",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267861210,MDU6SXNzdWUyNjc4NjEyMTA=,26,Command line tool for uploading one or more DBs to Now,9599,simonw,closed,0,,,2857392,Ship first public release,3,2017-10-24T00:43:10Z,2017-11-11T07:25:30Z,2017-11-11T07:25:30Z,OWNER,,"Uploading files appears to be undocumented, but I found it in their code here: https://github.com/zeit/now-cli/blob/0ca7d1fe44ebdf460b64fdc38ba543b8e295ac40/src/providers/sh/util/index.js#L291",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/26/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 455486286,MDU6SXNzdWU0NTU0ODYyODY=,26,Mechanism for turning nested JSON into foreign keys / many-to-many,9599,simonw,open,0,,,,,14,2019-06-13T00:52:06Z,2022-06-29T23:35:29Z,,OWNER,,"The GitHub JSON APIs have a really interesting convention with respect to related objects. Consider https://api.github.com/repos/simonw/sqlite-utils/issues - here's a truncated subset: ```json { ""id"": 449818897, ""node_id"": ""MDU6SXNzdWU0NDk4MTg4OTc="", ""number"": 24, ""title"": ""Additional Column Constraints?"", ""user"": { ""login"": ""IgnoredAmbience"", ""id"": 98555, ""node_id"": ""MDQ6VXNlcjk4NTU1"", ""avatar_url"": ""https://avatars0.githubusercontent.com/u/98555?v=4"", ""gravatar_id"": """" }, ""labels"": [ { ""id"": 993377884, ""node_id"": ""MDU6TGFiZWw5OTMzNzc4ODQ="", ""url"": ""https://api.github.com/repos/simonw/sqlite-utils/labels/enhancement"", ""name"": ""enhancement"", ""color"": ""a2eeef"", ""default"": true } ], ""state"": ""open"" } ``` The `user` column lists a complete user. The `labels` column has a list of labels. Since both user and label have populated `id` field this is actually enough information for us to create records for them AND set up the corresponding foreign key (for user) and m2m relationships (for labels). It would be really neat if `sqlite-utils` had some kind of mechanism for correctly processing these kind of patterns. Thanks to `jq` there's not much need for extra customization of the shape here - if we support a narrowly defined structure users can use `jq` to reshape arbitrary JSON to match.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/26/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 513074501,MDU6SXNzdWU1MTMwNzQ1MDE=,26,Command for importing mentions timeline,9599,simonw,closed,0,,,,,1,2019-10-28T03:14:27Z,2019-10-30T02:36:13Z,2019-10-30T02:20:47Z,MEMBER,,"https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-mentions_timeline Almost identical to home-timeline #18 but it uses `https://api.twitter.com/1.1/statuses/mentions_timeline.json` instead.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/26/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601271612,MDU6SXNzdWU2MDEyNzE2MTI=,26,Topics are missing from repositories,9599,simonw,closed,0,,,,,2,2020-04-16T17:36:32Z,2020-04-16T17:41:11Z,2020-04-16T17:41:11Z,MEMBER,,"I'm sure this used to work, but right now repositories are fetched without their topics. https://developer.github.com/v3/repos/ says you need to send a custom `Accept` header of `application/vnd.github.mercy-preview+json` to get topics.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/26/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 621444763,MDU6SXNzdWU2MjE0NDQ3NjM=,26,Rename project to dogsheep-photos,9599,simonw,closed,0,,,,,8,2020-05-20T04:12:34Z,2020-05-20T04:31:02Z,2020-05-20T04:30:40Z,MEMBER,,`photos-to-sqlite` doesn't really capture the full scope of this project anymore.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/26/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 705215230,MDU6SXNzdWU3MDUyMTUyMzA=,26,Pagination,9599,simonw,open,0,,,,,7,2020-09-21T00:14:37Z,2020-09-21T02:55:54Z,,MEMBER,,Useful for #16 (timeline view) since you can now filter to just the items on a specific day - but if there are more than 50 items you can't see them all.,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/26/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 267886330,MDU6SXNzdWUyNjc4ODYzMzA=,27,Ability to plot a simple graph,9599,simonw,closed,0,,,,,3,2017-10-24T03:34:59Z,2018-07-10T17:52:41Z,2018-07-10T17:52:41Z,OWNER,,"Might be as simple as: pick he type of chart (bar, line) and then pick the column for the X axis and the column for the Y axis. Maybe also allow a pie chart. It’s up to the user to come up with SQL that gets the right values.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 455496504,MDU6SXNzdWU0NTU0OTY1MDQ=,27,sqlite-utils create-table command,9599,simonw,closed,0,,,,,8,2019-06-13T01:43:30Z,2020-05-03T15:26:15Z,2020-05-03T15:26:15Z,OWNER,,"Spun off from #24 - it would be useful if CLI users could create new tables (with explicit column types, not null rules and defaults) without having to insert an example record. - [x] Get it working - [x] Support `--pk` - [x] Support `--not-null` - [x] Support `--default` - [x] Support `--fk colname othertable othercol` - [x] Support `--replace` and `--ignore` - [x] Documentation",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 514459062,MDU6SXNzdWU1MTQ0NTkwNjI=,27,retweets-of-me command,9599,simonw,closed,0,,,,,4,2019-10-30T07:43:01Z,2019-11-03T01:12:58Z,2019-11-03T01:12:58Z,MEMBER,,https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-retweets_of_me,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601330277,MDU6SXNzdWU2MDEzMzAyNzc=,27,Repos have a big blob of JSON in the organization column,9599,simonw,closed,0,,,,,5,2020-04-16T18:43:14Z,2020-04-18T00:19:16Z,2020-04-18T00:18:52Z,MEMBER,,"e.g. https://github-to-sqlite.dogsheep.net/github/repos ![github__repos__11_rows_where_sorted_by_updated_at_descending](https://user-images.githubusercontent.com/9599/79494124-5640b980-7fd7-11ea-99a2-17ffbd82f9ce.png) This appears to be obsolete because the `owner` column already links to that record, albeit in the `users` table with `type` set to `Organization`: https://github-to-sqlite.dogsheep.net/github/users/53015001",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 621486115,MDU6SXNzdWU2MjE0ODYxMTU=,27,photos_with_apple_metadata view should include labels,9599,simonw,open,0,,,,,0,2020-05-20T06:06:17Z,2020-05-20T06:06:17Z,,MEMBER,,"https://dogsheep-photos.dogsheep.net/public/photos_with_apple_metadata?place_city=New+Orleans&_facet=place_city&_facet_array=albums&_facet_array=persons Here's one way to add that: ```sql select rowid, photo, ( select json_group_array( json_object( 'label', normalized_string, 'href', '/photos/labelled?_hide_sql=1&label=' || normalized_string ) ) from labels where labels.uuid = photos_with_apple_metadata.uuid ) as labels, date, ```",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 709789634,MDU6SXNzdWU3MDk3ODk2MzQ=,27,Sort order is not persisted by facet filter links,9599,simonw,open,0,,,,,0,2020-09-27T18:22:07Z,2020-09-27T18:22:07Z,,MEMBER,,A link to `/-/beta?category=1×tamp__date=2018-08-01&q=swedish` should be to `/-/beta?category=1×tamp__date=2018-08-01&q=swedish&sort=newest`,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 267886865,MDU6SXNzdWUyNjc4ODY4NjU=,28,/database?sql= should redirect correctly,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-24T03:38:44Z,2017-10-24T23:54:30Z,2017-10-24T23:54:30Z,OWNER,,Needs to redirect to the location with the hash while retaining the query string. This should also work with the .json extension.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 455996809,MDU6SXNzdWU0NTU5OTY4MDk=,28,"Rearrange the docs by area, not CLI vs Python",9599,simonw,closed,0,,,,,1,2019-06-13T23:33:35Z,2019-07-15T02:37:20Z,2019-07-15T02:37:20Z,OWNER,,"The docs for eg inserting data should live on the same page, rather than being split across the API and CLI pages.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 515658861,MDU6SXNzdWU1MTU2NTg4NjE=,28,Add indexes to followers table,9599,simonw,closed,0,,,,,1,2019-10-31T18:40:22Z,2019-11-09T20:15:42Z,2019-11-09T20:11:48Z,MEMBER,,`select follower_id from following where followed_id = 12497` takes over a second for me at the moment.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601333634,MDU6SXNzdWU2MDEzMzM2MzQ=,28,Pull repository contributors,9599,simonw,closed,0,,,,,3,2020-04-16T18:46:40Z,2020-04-18T15:05:10Z,2020-04-18T15:05:10Z,MEMBER,,"https://developer.github.com/v3/repos/#list-contributors `GET /repos/:owner/:repo/contributors` Not sure if this should be a separate command or should be part of the existing `repos` command. I'm leaning towards a new `contributors` command.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 624490929,MDU6SXNzdWU2MjQ0OTA5Mjk=,28,Invalid SQL no such table: main.uploads,41439,dmd,open,0,,,,,1,2020-05-25T21:25:39Z,2020-12-24T22:26:22Z,,NONE,,"http://127.0.0.1:8001/photos/photos_with_apple_metadata gives ""Invalid SQL no such table: main.uploads""",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 723861683,MDU6SXNzdWU3MjM4NjE2ODM=,28,Switch to using datasette.client,9599,simonw,closed,0,,,,,1,2020-10-17T22:42:26Z,2020-10-17T23:00:47Z,2020-10-17T23:00:47Z,MEMBER,,"`datasette.client` is designed for this kind of thing, to replace this code: https://github.com/dogsheep/dogsheep-beta/blob/bed9df2b3ef68189e2e445427721a28f4e9b4887/dogsheep_beta/__init__.py#L223-L232",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268050821,MDU6SXNzdWUyNjgwNTA4MjE=,29,Handle bytestring records encoding to JSON,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-24T14:18:45Z,2017-10-24T14:59:00Z,2017-10-24T14:58:47Z,OWNER,,"http://localhost:8006/northwind-40d049b/Categories.json 500s right now The string representation of one of the values looks like this: b""\x15\x1c/\x00\x02\x00 This is a bytestring from the database which cannot be naively converted to a unicode string.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 458941203,MDU6SXNzdWU0NTg5NDEyMDM=,29,Prevent accidental add-foreign-key with invalid column,9599,simonw,closed,0,,,,,0,2019-06-20T23:57:24Z,2019-06-20T23:58:26Z,2019-06-20T23:58:26Z,OWNER,,"You can corrupt your database by running: $ sqlite-utils add-foreign-key my.db table non_existent_column other_table other_column ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 518725064,MDU6SXNzdWU1MTg3MjUwNjQ=,29,`import` command fails on empty files,21148,jacobian,closed,0,,,,,4,2019-11-06T20:34:26Z,2019-11-09T20:33:38Z,2019-11-09T19:36:36Z,CONTRIBUTOR,,"If a file in the export is empty (in my case it was `account-suspensions.js`), `twitter-to-sqlite import` fails: ``` $ twitter-to-sqlite import twitter.db ~/Downloads/twitter-2019-11-06-926f4f3be4b3b1fcb1aa387c40cd14f7c8aaf9bbcdb2d78ac14d9989add501bb.zip Traceback (most recent call last): File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/bin/twitter-to-sqlite"", line 10, in sys.exit(cli()) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py"", line 627, in import_ archive.import_from_file(db, filename, content) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/archive.py"", line 224, in import_from_file db[table_name].upsert_all(rows, hash_id=""pk"") File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1113, in upsert_all extracts=extracts, File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/sqlite_utils/db.py"", line 980, in insert_all first_record = next(records) StopIteration ``` This appears to be because `db.upsert_all` is called with no rows -- I think? I hacked around this by modifying `import_from_file` to have an `if rows:` clause: ``` for table, rows in to_insert.items(): if rows: table_name = ""archive_{}"".format(table.replace(""-"", ""_"")) ... ``` I'm happy to work up a real PR if that's the right approach, but I'm not sure it is.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 603617013,MDU6SXNzdWU2MDM2MTcwMTM=,29,Milestones should have foreign key to creator and repo,9599,simonw,closed,0,,,,,1,2020-04-21T00:20:44Z,2020-04-21T00:43:58Z,2020-04-21T00:43:58Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/milestones Creator is an integer but not a foreign key to users Repo is missing entirely!",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 638375985,MDExOlB1bGxSZXF1ZXN0NDM0MTYyMzE2,29,Fixed bug in SQL query for photo scores,41546558,RhetTbull,closed,0,,,,,1,2020-06-14T15:39:22Z,2020-12-04T22:32:36Z,2020-12-04T22:32:27Z,CONTRIBUTOR,dogsheep/dogsheep-photos/pulls/29,"The join on ZCOMPUTEDASSETATTRIBUTES used the wrong columns. In most of the Photos database tables, table.ZASSET joins with ZGENERICASSET.Z_PK",256834907,dogsheep-photos,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 724759588,MDU6SXNzdWU3MjQ3NTk1ODg=,29,Add search highlighting snippets,9599,simonw,open,0,,,,,5,2020-10-19T16:00:48Z,2021-08-26T20:23:11Z,,MEMBER,,Like on https://til.simonwillison.net/til/search?q=Snippet,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/29/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",, 268078453,MDU6SXNzdWUyNjgwNzg0NTM=,30,Do something neat with foreign keys,9599,simonw,closed,0,,,,,1,2017-10-24T15:29:29Z,2017-11-14T18:29:08Z,2017-11-14T18:29:01Z,OWNER,,"https://www.sqlite.org/pragma.html#pragma_foreign_key_list SQLite has robust support for introspecting foreign keys. I could use that to automatically link to the corresponding record from my tables.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 461215118,MDU6SXNzdWU0NjEyMTUxMTg=,30,Option to open database in read-only mode,9599,simonw,closed,0,,,,,1,2019-06-26T22:50:38Z,2020-05-11T19:17:17Z,2020-05-11T19:17:17Z,OWNER,,Would this make it 100% safe to run reads against a database file that is being written to by another process?,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 518739697,MDU6SXNzdWU1MTg3Mzk2OTc=,30,`followers` fails because `transform_user` is called twice,21148,jacobian,closed,0,,,,,2,2019-11-06T20:44:52Z,2019-11-09T20:15:28Z,2019-11-09T19:55:52Z,CONTRIBUTOR,,"Trying to run `twitter-to-sqlite followers` errors out: ``` Traceback (most recent call last): File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/bin/twitter-to-sqlite"", line 10, in sys.exit(cli()) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py"", line 130, in followers go(bar.update) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py"", line 116, in go utils.save_users(db, [profile]) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/utils.py"", line 302, in save_users transform_user(user) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/utils.py"", line 181, in transform_user user[""created_at""] = parser.parse(user[""created_at""]) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 1374, in parse return DEFAULTPARSER.parse(timestr, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 646, in parse res, skipped_tokens = self._parse(timestr, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 725, in _parse l = _timelex.split(timestr) # Splits the timestr into tokens File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 207, in split return list(cls(s)) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 76, in __init__ '{itype}'.format(itype=instream.__class__.__name__)) TypeError: Parser must be a string or character stream, not datetime ``` This appears to be because https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L111 calls `transform_user`, and then https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L116 calls `transform_user` again, which fails because the user is already transformed. I was able to work around this by commenting out https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L116. Shall I work up a patch for that, or is there a better approach?",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 603618244,MDU6SXNzdWU2MDM2MTgyNDQ=,30,Issues milestone column is the wrong type,9599,simonw,closed,0,,,,,2,2020-04-21T00:24:34Z,2020-04-21T00:45:23Z,2020-04-21T00:36:22Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/issues?milestone=2857392 ![2A4C1185-2434-4F29-9EA0-3246E2F03F77](https://user-images.githubusercontent.com/9599/79811760-b7e08b00-832b-11ea-9ad7-684a6ae097a6.jpeg) It is TEXT when it should be an INTEGER - which is why the foreign key label is not correctly displayed.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 655974395,MDExOlB1bGxSZXF1ZXN0NDQ4MzU1Njgw,30,Handle empty bucket on first upload. Allow specifying the endpoint_url for services other than S3 (like b2 and digitalocean spaces),110038,scanner,open,0,,,,,0,2020-07-13T16:15:26Z,2020-07-13T16:15:26Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/dogsheep-photos/pulls/30,"Finally got around to trying dogsheep-photos but I want to use backblaze's b2 service instead of AWS S3. Had to add a way to optionally specify the endpoint_url to connect to. Then with the bucket being empty the initial key retrieval would fail. Probably a better way to see that the bucket is empty than doing a test inside the paginator loop. Also probably a better way to specify the endpoint_url as we get and test for it twice using the same code in two different places but did not want to spend too much time worrying about it.",256834907,dogsheep-photos,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 769282206,MDU6SXNzdWU3NjkyODIyMDY=,30,Upgrade to sqlite-utils 3.0 (tests are failing),9599,simonw,closed,0,,,,,0,2020-12-16T21:25:15Z,2020-12-16T21:27:11Z,2020-12-16T21:27:10Z,MEMBER,,"``` results = beta_db[""search_index""].search(""run"") if use_porter: assert results == [ ( ""dogs.db/dogs"", ""1"", ""Cleo"", ""2020-08-22 04:41:33"", 1, 0, ""running"", None, None, ) ] else: > assert results == [] E assert == [] E + E -[] E Full diff: E - [] E + ``` This was caused by a backwards incompatible change in sqlite-utils 3.0: https://sqlite-utils.readthedocs.io/en/stable/changelog.html#v3-0",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268087542,MDU6SXNzdWUyNjgwODc1NDI=,31,Idea: colour scheme based on sha256 of db,9599,simonw,closed,0,,,2859414,v1 stretch goals,1,2017-10-24T15:52:38Z,2018-05-28T18:10:45Z,2017-11-09T14:14:59Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 461237618,MDU6SXNzdWU0NjEyMzc2MTg=,31,Mechanism for adding multiple foreign key constraints at once,9599,simonw,closed,0,,,,,0,2019-06-27T00:04:30Z,2019-06-29T06:27:40Z,2019-06-29T06:27:40Z,OWNER,,"Needed by [db-to-sqlite](https://github.com/simonw/db-to-sqlite). It currently works by collecting all of the foreign key relationships it can find and then applying them at the end of the process. The problem is, the `add_foreign_key()` method looks like this: https://github.com/simonw/sqlite-utils/blob/86bd2bba689e25f09551d611ccfbee1e069e5b66/sqlite_utils/db.py#L498-L516 That means it's doing a full `VACUUM` for every single relationship it sets up - and if you have hundreds of foreign key relationships in your database this can take hours. I think the right solution is to have a `.add_foreign_keys(list_of_args)` method which does the bulk operation and then a single `VACUUM`. `.add_foreign_key(...)` can then call the bulk action with a single list item.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520508502,MDU6SXNzdWU1MjA1MDg1MDI=,31,"""friends"" command (similar to ""followers"")",9599,simonw,closed,0,,,,,2,2019-11-09T20:20:20Z,2022-09-20T05:05:03Z,2020-02-07T07:03:28Z,MEMBER,,"Current list of commands: ``` followers Save followers for specified user (defaults to... followers-ids Populate followers table with IDs of account followers friends-ids Populate followers table with IDs of account friends ``` Obvious omission here is `friends`, which would be powered by `https://api.twitter.com/1.1/friends/list.json`: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-list",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 603624862,MDU6SXNzdWU2MDM2MjQ4NjI=,31,Issue and milestone should have foreign key to repo,9599,simonw,closed,0,,,,,3,2020-04-21T00:46:24Z,2020-04-22T01:20:19Z,2020-04-22T01:20:19Z,MEMBER,,"Currently the `repo` column on those tables is a string `simonw/datasette` rather than an ID referencing a row in `repos`. _Originally posted by @simonw in https://github.com/dogsheep/github-to-sqlite/issues/29#issuecomment-616883275_",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 771316301,MDU6SXNzdWU3NzEzMTYzMDE=,31,"Searching for ""github-to-sqlite"" throws an error",9599,simonw,closed,0,,,,,4,2020-12-19T06:07:20Z,2020-12-19T06:18:07Z,2020-12-19T06:18:07Z,MEMBER,,"https://datasette.io/-/beta?q=github-to-sqlite&sort=relevance&type=blog.db%2Fentries - ""no such column: to""",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 771511344,MDExOlB1bGxSZXF1ZXN0NTQzMDE1ODI1,31,Update for Big Sur,41546558,RhetTbull,open,0,,,,,7,2020-12-20T04:36:45Z,2023-08-08T15:52:52Z,,CONTRIBUTOR,dogsheep/dogsheep-photos/pulls/31,Refactored out the SQL for extracting aesthetic scores to use osxphotos -- adds compatbility for Big Sur via osxphotos which has been updated for new table names in Big Sur. Have not yet refactored the SQL for extracting labels which is still compatible with Big Sur.,256834907,dogsheep-photos,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/31/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 268106803,MDU6SXNzdWUyNjgxMDY4MDM=,32,Try running SQLite queries in a separate thread,9599,simonw,closed,0,,,2859414,v1 stretch goals,1,2017-10-24T16:48:42Z,2017-11-09T14:05:56Z,2017-11-09T14:05:56Z,OWNER,,"https://pymotw.com/3/asyncio/executors.html Would be good to have some actual benchmarks so I can evaluate if this is worth it or not.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 462094937,MDExOlB1bGxSZXF1ZXN0MjkyODc5MjA0,32,db.add_foreign_keys() method,9599,simonw,closed,0,,,,,1,2019-06-28T15:40:33Z,2019-06-29T06:27:39Z,2019-06-29T06:27:39Z,OWNER,simonw/sqlite-utils/pulls/32,"Refs #31. Still TODO: - [x] Unit tests - [x] Documentation",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 561454071,MDU6SXNzdWU1NjE0NTQwNzE=,32,"Documentation for "" favorites"" command",9599,simonw,closed,0,,,,,0,2020-02-07T06:50:11Z,2020-02-07T06:59:10Z,2020-02-07T06:59:10Z,MEMBER,,"It looks like I forgot to document this one in the README. https://github.com/dogsheep/twitter-to-sqlite/blob/6ebd482619bd94180e54bb7b56549c413077d329/twitter_to_sqlite/cli.py#L183-L194",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 604222295,MDU6SXNzdWU2MDQyMjIyOTU=,32,Issue comments don't appear to populate issues foreign key,9599,simonw,closed,0,,,,,3,2020-04-21T19:17:32Z,2020-04-22T01:17:44Z,2020-04-22T01:17:44Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github?sql=select+html_url%2C+id%2C+issue+from+issue_comments+order+by+updated_at+desc+limit+101 ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 803333769,MDU6SXNzdWU4MDMzMzM3Njk=,32,KeyError: 'Contents' on running upload,11855322,robmarkcole,open,0,,,,,3,2021-02-08T08:36:37Z,2021-07-22T06:40:25Z,,NONE,,"Following the readme, on big sur, and having entered my auth creds via `dogsheep-photos s3-auth`: ``` (venv) (base) Robins-MacBook:datasette robin$ dogsheep-photos upload photos.db ~/Pictures/Photos\ /Users/robin/Pictures/Library.photoslibrary --dry-run Fetching existing keys from S3... Traceback (most recent call last): File ""/Users/robin/datasette/venv/bin/dogsheep-photos"", line 8, in sys.exit(cli()) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/dogsheep_photos/cli.py"", line 96, in upload key.split(""."")[0] for key in get_all_keys(client, creds[""photos_s3_bucket""]) File ""/Users/robin/datasette/venv/lib/python3.8/site-packages/dogsheep_photos/utils.py"", line 46, in get_all_keys for row in page[""Contents""]: KeyError: 'Contents' ``` Possibly since the bucket is in `EU (London) eu-west-2` and this into is not requested?",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 836923194,MDU6SXNzdWU4MzY5MjMxOTQ=,32,JSON API for search results,9599,simonw,open,0,,,,,0,2021-03-20T22:21:36Z,2021-03-20T22:21:36Z,,MEMBER,,Refs https://github.com/simonw/datasette/issues/878,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 268110769,MDU6SXNzdWUyNjgxMTA3Njk=,33,Use locust for benchmarking and load tests,9599,simonw,open,0,,,,,0,2017-10-24T17:00:09Z,2017-12-10T03:12:16Z,,OWNER,,"https://github.com/locustio/locust Needed for #32 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 462423839,MDU6SXNzdWU0NjI0MjM4Mzk=,33,index_foreign_keys / index-foreign-keys utilities,9599,simonw,closed,0,,,,,2,2019-06-30T16:42:03Z,2019-06-30T23:54:11Z,2019-06-30T23:50:55Z,OWNER,,"Sometimes it's good to have indices on all columns that are foreign keys, to allow for efficient reverse lookups. This would be a useful utility: $ sqlite-utils index-foreign-keys database.db ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 561469252,MDExOlB1bGxSZXF1ZXN0MzcyMjczNjA4,33,Upgrade to sqlite-utils 2.2.1,9599,simonw,closed,0,,,,,1,2020-02-07T07:32:12Z,2020-03-20T19:21:42Z,2020-03-20T19:21:41Z,MEMBER,dogsheep/twitter-to-sqlite/pulls/33,,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 609950090,MDU6SXNzdWU2MDk5NTAwOTA=,33,Fall back to authentication via ENV,2029,garethr,closed,0,,,,,4,2020-04-30T12:58:14Z,2020-05-02T18:46:10Z,2020-05-02T18:45:37Z,NONE,,"Would you accept a PR that falls back to looking for an environment variable for the GitHub token? Specifically a change here: https://github.com/dogsheep/github-to-sqlite/blob/c34d5a18bfc41fa08755ba3d5cf9fe09ff204238/github_to_sqlite/cli.py#L271 I'd like to use `github-to-sqlite` in a GitHub Action workflow and this would be simpler than trying to fill out the prompt or generate a file with sensitive content. Wanted to check first, I'm happy to submit a PR with tests and updates to the docs. ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 803338729,MDU6SXNzdWU4MDMzMzg3Mjk=,33,photo-to-sqlite: command not found,11855322,robmarkcole,open,0,,,,,4,2021-02-08T08:42:57Z,2021-02-12T15:00:44Z,,NONE,,"Having installed in a venv I get: ``` (venv) (base) Robins-MacBook:datasette robin$ photo-to-sqlite apple-photos photos.db -bash: photo-to-sqlite: command not found ```",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 919733213,MDU6SXNzdWU5MTk3MzMyMTM=,33,Searching for whitespace throws an error,9599,simonw,closed,0,,,,,0,2021-06-13T06:57:57Z,2021-06-13T14:36:39Z,2021-06-13T14:36:39Z,MEMBER,,"https://datasette.io/-/beta?q=+ returns a 500 > fts5: syntax error near """"",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268176505,MDU6SXNzdWUyNjgxNzY1MDU=,34,Support CSV export with a .csv extension,9599,simonw,closed,0,,,,,1,2017-10-24T20:34:43Z,2021-06-17T18:14:48Z,2018-05-28T20:45:34Z,OWNER,,"Maybe do this using streaming with multiple pagination SQL queries so we can support arbritrarily large exports. How would this work against a view which doesn’t have an obvious efficient pagination mechanism? Maybe limit views to up to 1000 exported records? Relates to #5 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/34/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 462423972,MDExOlB1bGxSZXF1ZXN0MjkzMTE3MTgz,34,sqlite-utils index-foreign-keys / db.index_foreign_keys(),9599,simonw,closed,0,,,,,0,2019-06-30T16:43:40Z,2019-06-30T23:50:55Z,2019-06-30T23:50:55Z,OWNER,simonw/sqlite-utils/pulls/34,"Refs #33 - [x] `sqlite-utils index-foreign-keys` command - [x] `db.index_foreign_keys()` method - [x] unit tests - [x] documentation",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 585266763,MDU6SXNzdWU1ODUyNjY3NjM=,34,IndexError running user-timeline command,9599,simonw,closed,0,,,,,2,2020-03-20T18:54:08Z,2020-03-20T19:20:52Z,2020-03-20T19:20:37Z,MEMBER,,"``` $ twitter-to-sqlite user-timeline data.db --screen_name Allen_Joines Traceback (most recent call last): File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/bin/twitter-to-sqlite"", line 11, in load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')() File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 256, in user_timeline utils.save_tweets(db, chunk) File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 289, in save_tweets db[""users""].upsert(user, pk=""id"", alter=True) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1128, in upsert conversions=conversions, File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1157, in upsert_all upsert=True, File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1096, in insert_all row = list(self.rows_where(""rowid = ?"", [self.last_rowid]))[0] IndexError: list index out of range ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610408908,MDU6SXNzdWU2MTA0MDg5MDg=,34,Command for retrieving dependents for a repo,9599,simonw,closed,0,,,,,6,2020-04-30T21:47:51Z,2020-05-03T15:53:01Z,2020-05-03T15:53:01Z,MEMBER,,"I really, really want to start grabbing this data: https://github.com/simonw/datasette/network/dependents",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 830283447,MDU6SXNzdWU4MzAyODM0NDc=,34,bucket name,6213,dsisnero,open,0,,,,,0,2021-03-12T16:40:57Z,2021-03-12T16:40:57Z,,NONE,,I followed the instructions to setup credentials but I am getting a invalid bucket name. Can you put a sample auth.json file in the base that shows the correct format for this? Thanks,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 983221851,MDU6SXNzdWU5ODMyMjE4NTE=,34,Data folder as index command parameter,1223625,humrochagf,open,0,,,,,0,2021-08-30T21:29:33Z,2021-08-30T21:29:33Z,,NONE,,"Hi, First of all, thank you for this wonderful project :smile: I started to use dogsheep to make my personal data searchable, and by using the project I noticed an issue with the index command. It always expects you are running it from the root folder from where the data is located, so I got some errors while trying to make it work on my setup. I separate all databases inside a `data` folder (I published my setup to be easier to follow: https://github.com/humrochagf/my-dogsheep) Before, I configured `dogsheep.yml` to add the data folder to its path like this: ```yml data/twitter.db: tweets: sql: |- ... ``` And running the index command like this: ``` dogsheep-beta index data/dogsheep.db dogsheep.yml ``` It worked to the normal search feature with no problem this way, but when I started adding `display_sql` rules the app started to crash, because at datasette `get_database` it was looking for `data/twitter` and it only had a db called `twitter` there. So my workaround to that was to cd into the data folder and run the indexer. You can check the way I'm doing it at this line of the makefile: https://github.com/humrochagf/my-dogsheep/blob/main/makefile#L3 It works but it would be nice to have an option to pass the path where the data is located to the index function.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 462430920,MDU6SXNzdWU0NjI0MzA5MjA=,35,table.update(...) method,9599,simonw,closed,0,,,,,2,2019-06-30T18:06:15Z,2019-07-28T15:43:52Z,2019-07-28T15:43:52Z,OWNER,,"Spun off from #23 - this method will allow a user to update a specific row. Currently the only way to do that it is to call `.upsert({full record})` with the primary key field matching an existing record - but this does not support partial updates. ```python db[""events""].update(3, {""name"": ""Renamed""}) ``` This method only works on an existing table, so there's no need for a `pk=""id""` specifier - it can detect the primary key by looking at the table. If the primary key is compound the first argument can be a tuple: ```python db[""events_venues""].update((3, 2), {""custom_label"": ""Label""}) ``` The method can be called without the second dictionary argument. Doing this selects the row specified by the primary key (throwing an error if it does not exist) and remembers it so that chained operations can be carried out - see proposal in https://github.com/simonw/sqlite-utils/issues/23#issuecomment-507055345 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/35/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585282212,MDU6SXNzdWU1ODUyODIyMTI=,35,twitter-to-sqlite user-timeline [screen_names] --sql / --attach,9599,simonw,closed,0,,,,,5,2020-03-20T19:26:07Z,2020-03-20T20:17:00Z,2020-03-20T20:16:35Z,MEMBER,,Split from #8.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/35/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610511450,MDU6SXNzdWU2MTA1MTE0NTA=,35,Create index on issue_comments(user) and other foreign keys,9599,simonw,closed,0,,,,,3,2020-05-01T02:06:56Z,2020-05-02T18:26:24Z,2020-05-02T18:26:24Z,MEMBER,,"``` create index issue_comments_user on issue_comments(user) ``` I'm sure there are other user columns that could benefit from an index.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/35/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 842695374,MDU6SXNzdWU4NDI2OTUzNzQ=,35,Support to annotate photos on other than macOS OSes,1151557,ligurio,open,0,,,,,1,2021-03-28T09:01:25Z,2021-04-05T07:37:57Z,,NONE,,dogsheep-photos allows to annotate photos using Apple Photo's db. It would be nice to have such ability on other OSes too. For example using trained local model or using Google Vision API (see #14).,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/35/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 987985935,MDExOlB1bGxSZXF1ZXN0NzI2OTkwNjgw,35,Support for Datasette's --base-url setting,2670795,brandonrobertz,open,0,,,,,0,2021-09-03T17:47:45Z,2021-09-03T17:47:45Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/dogsheep-beta/pulls/35,This makes it so you can use Dogsheep if you're using Datasette with the `--base-url /some-path/` setting.,197431109,dogsheep-beta,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/35/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 268262480,MDU6SXNzdWUyNjgyNjI0ODA=,36,"date, year, month and day querystring lookups",9599,simonw,closed,0,,,,,3,2017-10-25T04:23:45Z,2018-05-28T17:30:53Z,2018-05-28T17:30:53Z,OWNER,,"- [ ] `?timestamp___date=2017-07-17` - return every item where the timestamp falls on that date - [ ] `?timestamp___year=2017` - return every item where the timestamp falls within 2017 - [ ] `?timestamp___month=1` - return every item where the month component is January - [ ] `?timestamp___day=10` - return every item where the day-of-the-month component is 10 Follow on from #23 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 462817589,MDU6SXNzdWU0NjI4MTc1ODk=,36,Support compound primary keys,9599,simonw,closed,0,,,,,0,2019-07-01T17:00:07Z,2019-07-15T04:28:52Z,2019-07-15T04:28:52Z,OWNER,,"This should work: ```python table = db[""dog_breeds""].insert({ ""dog_id"": 1, ""breed_id"": 2 }, pk=(""dog_id"", ""breed_id"")) ``` Needed for m2m work in #23",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585306847,MDU6SXNzdWU1ODUzMDY4NDc=,36,twitter-to-sqlite followers/friends --sql / --attach,9599,simonw,closed,0,,,,,0,2020-03-20T20:20:33Z,2020-03-20T23:12:38Z,2020-03-20T23:12:38Z,MEMBER,,"Split from #8. The `friends` and `followers` commands don't yet support `--sql` and `--attach`. (`friends-ids` and `followers-ids` do though).",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610842926,MDU6SXNzdWU2MTA4NDI5MjY=,36,Add view for better display of dependent repos,9599,simonw,closed,0,,,,,2,2020-05-01T16:33:44Z,2020-05-02T16:50:31Z,2020-05-02T16:30:11Z,MEMBER,,"```sql select repos.full_name as repo, 'https://github.com/' || repos2.full_name as dependent, repos2.created_at as dependent_repo_created, repos2.updated_at as dependent_repo_updated, repos2.stargazers_count as dependent_repo_stars, repos2.watchers_count as dependent_repo_watchers from dependents join repos as repos2 on dependents.dependent = repos2.id join repos on dependents.repo = repos.id order by repos2.created_at desc ``` https://dogsheep.simonwillison.net/github?sql=select%0D%0A++repos.full_name+as+repo%2C%0D%0A++%27https%3A%2F%2Fgithub.com%2F%27+%7C%7C+repos2.full_name+as+dependent%2C%0D%0A++repos2.created_at+as+dependent_repo_created%2C%0D%0A++repos2.updated_at+as+dependent_repo_updated%2C%0D%0A++repos2.stargazers_count+as+dependent_repo_stars%2C%0D%0A++repos2.watchers_count+as+dependent_repo_watchers%0D%0Afrom%0D%0A++dependents%0D%0A++join+repos+as+repos2+on+dependents.dependent+%3D+repos2.id%0D%0A++join+repos+on+dependents.repo+%3D+repos.id%0D%0Aorder+by%0D%0A++repos2.created_at+desc",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 988493790,MDExOlB1bGxSZXF1ZXN0NzI3MzkwODM1,36,Correct naming of tool in readme,2129,badboy,open,0,,,,,1,2021-09-05T12:05:40Z,2022-01-06T16:04:46Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/dogsheep-photos/pulls/36,,256834907,dogsheep-photos,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/36/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1751214236,I_kwDOC8SPRc5oYWic,36,Getting sqlite_master may not be modified when creating dogsheep index,8711912,khushmeeet,open,0,,,,,0,2023-06-11T03:21:53Z,2023-06-11T03:21:53Z,,NONE,,"When creating a `dogsheep` index from `config.yml` file on pocket.db (created using pocket-to-sqlite), I am getting this error ``` Traceback (most recent call last): File ""/Users/khushmeeet/.pyenv/versions/3.11.2/bin/dogsheep-beta"", line 8, in sys.exit(cli()) ^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) ^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 1657, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/dogsheep_beta/cli.py"", line 36, in index run_indexer( File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/dogsheep_beta/utils.py"", line 32, in run_indexer ensure_table_and_indexes(db, tokenize) File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/dogsheep_beta/utils.py"", line 91, in ensure_table_and_indexes table.add_foreign_key(*fk) File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/sqlite_utils/db.py"", line 2155, in add_foreign_key self.db.add_foreign_keys([(self.name, column, other_table, other_column)]) File ""/Users/khushmeeet/.pyenv/versions/3.11.2/lib/python3.11/site-packages/sqlite_utils/db.py"", line 1116, in add_foreign_keys cursor.execute( sqlite3.OperationalError: table sqlite_master may not be modified ``` Command I ran to get this error ``` dogsheep-beta index pocket.db config.yml ``` Dogsheep version ``` dogsheep-beta, version 0.10.2 ``` Python version ``` Python 3.11.2 ```",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 268453968,MDU6SXNzdWUyNjg0NTM5Njg=,37,Ability to serialize massive JSON without blocking event loop,9599,simonw,closed,0,,,,,2,2017-10-25T15:58:03Z,2020-05-30T17:29:20Z,2020-05-30T17:29:20Z,OWNER,,"We run the risk of someone attempting a select statement that returns thousands of rows and hence takes several seconds just to JSON encode the response, effectively blocking the event loop and pausing all other traffic. The Twisted community have a solution for this, can we adapt that in some way? http://as.ynchrono.us/2010/06/asynchronous-json_18.html?m=1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 465815372,MDU6SXNzdWU0NjU4MTUzNzI=,37,Experiment with type hints,9599,simonw,closed,0,,,,,6,2019-07-09T14:30:34Z,2021-08-18T21:48:57Z,2021-08-18T21:48:57Z,OWNER,,"Since it's designed to be used in Jupyter or for rapid prototyping in an IDE (and it's still pretty small) `sqlite-utils` feels like a great candidate for me to finally try out Python type hints. https://veekaybee.github.io/2019/07/08/python-type-hints/ is good. It suggests the mypy docs for getting started: https://mypy.readthedocs.io/en/latest/existing_code.html plus this tutorial: https://pymbook.readthedocs.io/en/latest/typehinting.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/37/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585353598,MDU6SXNzdWU1ODUzNTM1OTg=,37,"Handle ""User not found"" error",9599,simonw,closed,0,,,,,3,2020-03-20T22:14:32Z,2020-04-17T23:43:46Z,2020-04-17T23:43:46Z,MEMBER,,"While running `user-timeline` I got this bug (because a screen name I asked for didn't exist): ``` File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 185, in transform_user user[""created_at""] = parser.parse(user[""created_at""]) KeyError: 'created_at' >>> import pdb >>> pdb.pm() > /Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py(185)transform_user() -> user[""created_at""] = parser.parse(user[""created_at""]) (Pdb) user {'errors': [{'code': 50, 'message': 'User not found.'}]} ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610843136,MDU6SXNzdWU2MTA4NDMxMzY=,37,Mechanism for creating views if they don't yet exist,9599,simonw,closed,0,,,,,3,2020-05-01T16:34:10Z,2020-05-02T16:19:47Z,2020-05-02T16:19:31Z,MEMBER,,Needed for #36 #10 #12 ,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1293698966,PR_kwDOD079W84600uh,37,Fix former command name in readme,578773,DanLipsitt,open,0,,,,,0,2022-07-05T02:09:13Z,2022-07-05T02:09:13Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/dogsheep-photos/pulls/37,Looks like a previous commit missed a `photo-to-sqlite`→ `dogsheep-photos` replacement.,256834907,dogsheep-photos,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/37/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1817281557,I_kwDOC8SPRc5sUYQV,37,cannot use jinja filters in display?,10352819,rprimet,closed,0,,,,,1,2023-07-23T20:09:54Z,2023-07-23T20:18:27Z,2023-07-23T20:18:26Z,NONE,,"Hi, I'm trying to have a display function in Dogsheep's `config.yml` that includes something like this: ```

{{ display.title }} (source)

{{ display.snippet|safe }}

``` Unfortunately, rendering fails with a message 'urls is undefined'. The same happens if I'm trying to build a row URL manually, using filters like `quote_plus` (as my keys are URLs). Any hints? Thanks!",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268462768,MDU6SXNzdWUyNjg0NjI3Njg=,38,Experiment with patterns for concurrent long running queries,9599,simonw,closed,0,,,,,5,2017-10-25T16:23:42Z,2018-05-28T20:47:31Z,2018-05-28T20:47:31Z,OWNER,,I want to understand how the system could perform under load with many concurrent long-running queries. Can we serve these without blocking the event loop?,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 467862459,MDExOlB1bGxSZXF1ZXN0Mjk3NDEyNDY0,38,table.update() method,9599,simonw,closed,0,,,,,2,2019-07-14T17:03:49Z,2019-07-28T15:43:51Z,2019-07-28T15:43:51Z,OWNER,simonw/sqlite-utils/pulls/38,"Refs #35 Still to do: - [x] Unit tests - [x] Switch to using `.get()` - [x] Better exceptions, plus unit tests for what happens if pk does not exist - [x] Documentation - [x] Ensure compound primary keys work properly - [x] `alter=True` support",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 585359363,MDU6SXNzdWU1ODUzNTkzNjM=,38,Screen name display for user-timeline is uneven,9599,simonw,closed,0,,,,,1,2020-03-20T22:30:23Z,2020-03-20T22:37:17Z,2020-03-20T22:37:17Z,MEMBER,,"``` CDPHE [####################################] 67 CHFSKy [####################################] 3216 DHSWI [####################################] 41 DPHHSMT [####################################] 742 Delaware_DHSS [####################################] 3231 DhhsNevada [####################################] 639 ``` I could format them to match the length of the longest screen name instead.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611284481,MDU6SXNzdWU2MTEyODQ0ODE=,38,[Feature Request] Support Repo Name in Search 🥺,5779832,zzeleznick,closed,0,,,,,4,2020-05-02T22:08:51Z,2020-05-03T02:34:32Z,2020-05-02T23:15:11Z,NONE,,"## Description Per your [v2.2 release tweet](https://twitter.com/simonw/status/1256700238099693568) I played with the demo, but the output did not match my expectations. ## Expected Behavior Expected a search query for ""twitter"" contained within the `repo` column to return non-zero results. ## Actual Behavior 😭 [0 rows where repo contains ""twitter"" sorted by starred_at descending](https://github-to-sqlite.dogsheep.net/github/stars?repo__contains=twitter&_sort_desc=starred_at) ## Best Explanation Per the table schema (see appendix) `repo` is of type `INTEGER` which built from `repo_id` and does not expose the repo name in search. ## Desired Behavior Given that searching for ""206156866"" is less intuitive than ""twitter"", it would be great to support this via extending the search capabilities or by adding an additional column. ✅ 104 rows where repo contains ""twitter"" ❌ [104 rows where repo contains ""206156866"" sorted by starred_at descending](https://github-to-sqlite.dogsheep.net/github/stars?repo__contains=206156866&_sort_desc=starred_at) ## Appendix ``` CREATE TABLE [stars] ( [user] INTEGER REFERENCES [users]([id]), [repo] INTEGER REFERENCES [repos]([id]), [starred_at] TEXT, PRIMARY KEY ([user], [repo]) ); CREATE INDEX [idx_stars_repo] ON [stars] ([repo]); CREATE INDEX [idx_stars_user] ON [stars] ([user]); ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1827427757,PR_kwDOD079W85WtUKG,38,photos-to-sql not found?,319473,coldclimate,closed,0,,,,,2,2023-07-29T09:59:42Z,2023-07-29T10:01:27Z,2023-07-29T10:01:23Z,FIRST_TIME_CONTRIBUTOR,dogsheep/dogsheep-photos/pulls/38,"I wonder if `photos-to-sql` is an old name for `dogsheep-photos`, because I can't find it anywhere. I can't actually get this command to work (`sqlite3.OperationalError: no such table: attached.ZGENERICASSET` thrown) but I don't think that's related",256834907,dogsheep-photos,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1888477283,I_kwDOC8SPRc5wj-Bj,38,Run `rebuild_fts` after building the index,9599,simonw,open,0,,,,,0,2023-09-08T23:17:45Z,2023-09-08T23:17:45Z,,MEMBER,,"In: - https://github.com/simonw/datasette.io/issues/152#issuecomment-1712323347 This turned out to be the fix: ```bash dogsheep-beta index dogsheep-index.db templates/dogsheep-beta.yml sqlite-utils rebuild-fts dogsheep-index.db ```",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 268469569,MDU6SXNzdWUyNjg0Njk1Njk=,39,Protect against malicious SQL that causes damage even though our DB is immutable,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-10-25T16:44:27Z,2021-08-17T23:52:07Z,2017-11-05T02:53:47Z,OWNER,,"I’m currently operating under the assumption that it’s safe to allow arbitrary SQL statements because we are dealing with an immutable database. But this might not be the case - there are some pretty weird SQLite language extensions (ATTACH, PRAGMA etc) and I’m not certain they cannot be used to break things in a way that would affect future requests to the API. Solution: provide a “safe mode” option which disables the ?sql= mechanism. This still leaves the URL filter lookups, so I need to make sure that those are “safe”. In the future I may also implement a whitelist option where datasets can be configured to only allow specific filters against specific columns.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 467864071,MDU6SXNzdWU0Njc4NjQwNzE=,39,table.get(...) method,9599,simonw,closed,0,,,,,0,2019-07-14T17:20:51Z,2019-07-15T04:28:53Z,2019-07-15T04:28:53Z,OWNER,,"Utility method for fetching a record by its primary key. Accepts a single value (for primary key / rowid tables) or a list/tuple of values (for compound primary keys, refs #36). Raises a `NotFoundError` if the record cannot be found.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 590666760,MDU6SXNzdWU1OTA2NjY3NjA=,39,--since feature can be confused by retweets,9599,simonw,closed,0,,,,,11,2020-03-30T23:25:33Z,2020-04-01T03:45:16Z,2020-04-01T03:45:16Z,MEMBER,,"If you run `twitter-to-sqlite user-timeline ... --since` it's supposed to fetch Tweets those specific users tweeted since last time the command was run. It does this by seeking out the max ID of their previous tweets: https://github.com/dogsheep/twitter-to-sqlite/blob/810cb2af5a175837204389fd7f4b5721f8b325ab/twitter_to_sqlite/cli.py#L305-L311 BUT... this has a nasty flaw: if another account had retweeted one of their recent tweets the retweeted-tweet will have been loaded into the database - so we may treat that as the most recent since ID and miss a bunch of their tweets!",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 613777056,MDU6SXNzdWU2MTM3NzcwNTY=,39,issues foreign key to repo isn't working,9599,simonw,closed,0,,,,,1,2020-05-07T05:11:48Z,2020-08-18T14:24:46Z,2020-08-18T14:23:56Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/issues?_facet=repo If the foreign key was working those would be repository names. From the schema at the bottom of the page: ``` [repo] TEXT, ``` That's the wrong type and not a foreign key.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1827436260,PR_kwDOD079W85WtVyk,39,Missing option in datasette instructions,319473,coldclimate,open,0,,,,,0,2023-07-29T10:34:48Z,2023-07-29T10:34:48Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/dogsheep-photos/pulls/39,Gotta tell it where to look,256834907,dogsheep-photos,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 268470572,MDU6SXNzdWUyNjg0NzA1NzI=,40,Implement command-line tool interface,9599,simonw,closed,0,,,2857392,Ship first public release,11,2017-10-25T16:47:15Z,2017-11-11T07:27:33Z,2017-11-11T07:27:33Z,OWNER,,"The first version needs to take one or more file names or URLs, then generate and deploy an app to Now. It will assume you already have the now command installed and configured.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 467928674,MDExOlB1bGxSZXF1ZXN0Mjk3NDU5Nzk3,40,.get() method plus support for compound primary keys,9599,simonw,closed,0,,,,,1,2019-07-15T03:43:13Z,2019-07-15T04:28:57Z,2019-07-15T04:28:52Z,OWNER,simonw/sqlite-utils/pulls/40,"- [x] Tests for the `NotFoundError` exception - [x] Documentation for `.get()` method - [x] Support `--pk` multiple times to define CLI compound primary keys - [x] Documentation for compound primary keys",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 590669793,MDU6SXNzdWU1OTA2Njk3OTM=,40,Feature: record history of follower counts,9599,simonw,closed,0,,,,,5,2020-03-30T23:32:28Z,2020-04-01T04:13:05Z,2020-04-01T04:13:05Z,MEMBER,,"We currently over-write the follower count every time we import a tweet (when we import that user profile again): https://github.com/dogsheep/twitter-to-sqlite/blob/810cb2af5a175837204389fd7f4b5721f8b325ab/twitter_to_sqlite/utils.py#L293-L294 It would be neat if we noticed if that user's follower count (and maybe other counts?) had changed since we last saved them and recorded that change in a separate history table. This would be an inexpensive way of building up rough charts of follower count over time.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 637899539,MDU6SXNzdWU2Mzc4OTk1Mzk=,40,Demo deploy is broken,9599,simonw,closed,0,,,,,2,2020-06-12T17:20:17Z,2020-06-12T18:06:48Z,2020-06-12T18:06:48Z,MEMBER,,"https://github.com/dogsheep/github-to-sqlite/runs/766180404?check_suite_focus=true ``` The following NEW packages will be installed: sqlite3 0 upgraded, 1 newly installed, 0 to remove and 11 not upgraded. Need to get 752 kB of archives. After this operation, 2482 kB of additional disk space will be used. Ign:1 http://azure.archive.ubuntu.com/ubuntu bionic-updates/main amd64 sqlite3 amd64 3.22.0-1ubuntu0.3 Err:1 http://security.ubuntu.com/ubuntu bionic-updates/main amd64 sqlite3 amd64 3.22.0-1ubuntu0.3 404 Not Found [IP: 52.177.174.250 80] E: Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/s/sqlite3/sqlite3_3.22.0-1ubuntu0.3_amd64.deb 404 Not Found [IP: 52.177.174.250 80] E: Unable to fetch some archives, maybe run apt-get update or try with --fix-missing? ##[error]Process completed with exit code 100. ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1871935751,I_kwDOD079W85vk3kH,40, ImportError: cannot import name 'formatargspec' from 'inspect',36752421,hosslikw,closed,0,,,,,0,2023-08-29T15:36:31Z,2023-08-31T03:18:07Z,2023-08-31T03:18:06Z,NONE,,"I get the following error when running ""pip3 install dogsheep-photos"" "" from inspect import ismethod, isclass, formatargspec ImportError: cannot import name 'formatargspec' from 'inspect' (/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/inspect.py). Did you mean: 'formatargvalues'?"" Python 3.12.0rc1 sqlite 3.43.0 datasette, version 0.64.3",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268590777,MDU6SXNzdWUyNjg1OTA3Nzc=,41,Homepage should show summary of databases,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-26T00:18:11Z,2017-10-27T04:05:35Z,2017-10-27T04:05:35Z,OWNER,,"I sch database should have a name, optional description, download link and a summary of the tables Flights.db Flights and suchlike blah. URL? License? 577373 rows across 14 tables airports, routes, airlines... Title of the homepage is derived from the databases or can be manually overridden e. “Datasets of Flights, NHS, Blah...” - or if only one database just the title of that.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/41/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470131537,MDU6SXNzdWU0NzAxMzE1Mzc=,41,sqlite-utils insert --tsv option,9599,simonw,closed,0,,,,,0,2019-07-19T04:27:21Z,2019-07-19T04:50:47Z,2019-07-19T04:50:47Z,OWNER,,"Right now we only support ingesting CSV, but sometimes interesting data is released as TSV. https://www.washingtonpost.com/national/2019/07/18/how-download-use-dea-pain-pills-database/ for example.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/41/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 591613579,MDU6SXNzdWU1OTE2MTM1Nzk=,41,"Bug: recorded a since_id for None, None",9599,simonw,closed,0,,,,,0,2020-04-01T04:29:43Z,2020-04-01T04:31:11Z,2020-04-01T04:31:11Z,MEMBER,,"This shouldn't happen in the `since_ids` table (relates to #39): ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/41/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 651159727,MDU6SXNzdWU2NTExNTk3Mjc=,41,Demo is failing to deploy,9599,simonw,closed,0,,,,,7,2020-07-05T22:40:33Z,2020-07-06T01:07:03Z,2020-07-06T01:07:02Z,MEMBER,,"https://github.com/dogsheep/github-to-sqlite/runs/837714622?check_suite_focus=true ``` Creating Revision.........................................................................................................................................failed Deployment failed ERROR: (gcloud.run.deploy) Cloud Run error: Container failed to start. Failed to start and then listen on the port defined by the PORT environment variable. Logs for this revision might contain more information. Traceback (most recent call last): File ""/opt/hostedtoolcache/Python/3.8.3/x64/bin/datasette"", line 8, in sys.exit(cli()) File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/datasette/publish/cloudrun.py"", line 138, in cloudrun check_call( File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/subprocess.py"", line 364, in check_call raise CalledProcessError(retcode, cmd) subprocess.CalledProcessError: Command 'gcloud run deploy --allow-unauthenticated --platform=managed --image gcr.io/datasette-222320/datasette github-to-sqlite' returned non-zero exit status 1. ##[error]Process completed with exit code 1. ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/41/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268591332,MDU6SXNzdWUyNjg1OTEzMzI=,42,Homepage UI for editing metadata file,9599,simonw,closed,0,,,,,4,2017-10-26T00:22:03Z,2017-12-10T03:02:14Z,2017-12-10T03:02:14Z,OWNER,,"Since we are going to have a metadata file which sets the title/description/etc for each database, why not allow you to run the app in —dev mode which makes the homepage into a WYSIWYG editor that can save to that file format.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/42/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470345929,MDU6SXNzdWU0NzAzNDU5Mjk=,42,"table.extract(...) method and ""sqlite-utils extract"" command",9599,simonw,closed,0,,,5897911,2.20,21,2019-07-19T14:09:36Z,2020-09-22T23:39:31Z,2020-09-22T23:37:49Z,OWNER,,"One of my favourite features of [csvs-to-sqlite](https://github.com/simonw/csvs-to-sqlite) is that it can ""extract"" columns into a separate lookup table - for example: csvs-to-sqlite big_csv_file.csv -c country output.db This will turn the `country` column in the resulting table into a integer foreign key against a new `country` table. You can see an example of what that looks like here: https://san-francisco.datasettes.com/registered-business-locations-3d50679/Business+Corridor was extracted from https://san-francisco.datasettes.com/registered-business-locations-3d50679/Registered_Business_Locations_-_San_Francisco?Business%20Corridor=1 I'd like to have the same capability in `sqlite-utils` - but with the ability to run it against an existing SQLite table rather than just against a CSV.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/42/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602173589,MDU6SXNzdWU2MDIxNzM1ODk=,42,Error running user-timeline with --sql and --ids together,9599,simonw,closed,0,,,,,0,2020-04-17T19:02:06Z,2020-04-17T23:34:40Z,2020-04-17T23:34:40Z,MEMBER,,"``` $ twitter-to-sqlite user-timeline tweets.db --sql='select id from users' --ids Traceback (most recent call last): File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/bin/twitter-to-sqlite"", line 11, in load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')() File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 284, in user_timeline ""@{:"" + str(max(len(identifier) for identifier in identifiers)) + ""}"" File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 284, in ""@{:"" + str(max(len(identifier) for identifier in identifiers)) + ""}"" TypeError: object of type 'int' has no len() ``` But this DID work - casting to strings: ``` $ twitter-to-sqlite user-timeline tweets.db --sql='select """" || id from users' --ids ... this worked ... ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/42/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 654405302,MDU6SXNzdWU2NTQ0MDUzMDI=,42,Option for importing just specific repos,9599,simonw,closed,0,,,,,0,2020-07-09T23:20:15Z,2020-07-09T23:25:35Z,2020-07-09T23:25:35Z,MEMBER,,"For if you know which specific repos you care about, as opposed to loading everything owned by the authenticated user. github-to-sqlite repos specific.db -r simonw/datasette -r simonw/github-contents ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/42/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268592894,MDU6SXNzdWUyNjg1OTI4OTQ=,43,"While running, server should spot new db files added to its directory ",9599,simonw,closed,0,,,2859414,v1 stretch goals,1,2017-10-26T00:32:37Z,2017-11-14T08:25:53Z,2017-11-14T08:25:37Z,OWNER,,"Maybe in each request it checks the time and if 5s has elapsed since t last scanned the directory it scans it again This would allow people with dedicated hosting to run the app there and just upload new datasets whenever they want. It would also be very convenient for development.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/43/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470691999,MDU6SXNzdWU0NzA2OTE5OTk=,43,.add_column() doesn't match indentation of initial creation,9599,simonw,closed,0,,,,,3,2019-07-20T16:33:10Z,2019-07-23T13:09:11Z,2019-07-23T13:09:05Z,OWNER,,"I spotted a table which was created once and then had columns added to it and the formatted SQL looks like this: ```sql CREATE TABLE [records] ( [type] TEXT, [sourceName] TEXT, [sourceVersion] TEXT, [unit] TEXT, [creationDate] TEXT, [startDate] TEXT, [endDate] TEXT, [value] TEXT, [metadata_Health Mate App Version] TEXT, [metadata_Withings User Identifier] TEXT, [metadata_Modified Date] TEXT, [metadata_Withings Link] TEXT, [metadata_HKWasUserEntered] TEXT , [device] TEXT, [metadata_HKMetadataKeyHeartRateMotionContext] TEXT, [metadata_HKDeviceManufacturerName] TEXT, [metadata_HKMetadataKeySyncVersion] TEXT, [metadata_HKMetadataKeySyncIdentifier] TEXT, [metadata_HKSwimmingStrokeStyle] TEXT, [metadata_HKVO2MaxTestType] TEXT, [metadata_HKTimeZone] TEXT, [metadata_Average HR] TEXT, [metadata_Recharge] TEXT, [metadata_Lights] TEXT, [metadata_Asleep] TEXT, [metadata_Rating] TEXT, [metadata_Energy Threshold] TEXT, [metadata_Deep Sleep] TEXT, [metadata_Nap] TEXT, [metadata_Edit Slots] TEXT, [metadata_Tags] TEXT, [metadata_Daytime HR] TEXT) ``` It would be nice if the columns that were added later matched the indentation of the initial columns.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/43/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602176870,MDU6SXNzdWU2MDIxNzY4NzA=,43,"""twitter-to-sqlite lists"" command for retrieving a user's owned lists",9599,simonw,closed,0,,,,,1,2020-04-17T19:08:59Z,2020-04-17T23:48:28Z,2020-04-17T23:30:39Z,MEMBER,,"https://developer.twitter.com/en/docs/accounts-and-users/create-manage-lists/api-reference/get-lists-ownerships `https://api.twitter.com/1.1/lists/ownerships.json `",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/43/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 660355904,MDU6SXNzdWU2NjAzNTU5MDQ=,43,github-to-sqlite tags command for fetching tags,9599,simonw,closed,0,,,,,4,2020-07-18T20:14:12Z,2020-07-18T23:05:56Z,2020-07-18T21:52:15Z,MEMBER,,Fetches paginated data from https://api.github.com/repos/simonw/datasette/tags,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/43/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 269731374,MDU6SXNzdWUyNjk3MzEzNzQ=,44,?_group_count=country - return counts by specific column(s),9599,simonw,closed,0,,,,,7,2017-10-30T19:50:32Z,2018-04-26T15:09:58Z,2018-04-26T15:09:58Z,OWNER,,"Imagine if this: https://stateless-datasets-jykibytogk.now.sh/flights-07d1283/airports.jsono?country__contains=gu&_group_count=country Turned into this: https://stateless-datasets-jykibytogk.now.sh/flights-07d1283?sql=select%20country,%20count(*)%20as%20group_count_country%20from%20airports%20where%20country%20like%20%27%gu%%27%20group%20by%20country%20order%20by%20group_count_country%20desc This would involve introducing a new precedent of query string arguments that start with an _ having special meanings. While we're at it, could try adding _fields=x,y,z Tasks: - [x] Get initial version working - [ ] Refactor code to not just ""pretend to be a view"" - [ ] Get foreign key relationships expanded",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/44/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 471628483,MDU6SXNzdWU0NzE2Mjg0ODM=,44,Utilities for building lookup tables,9599,simonw,closed,0,,,,,2,2019-07-23T10:59:58Z,2019-07-23T13:07:01Z,2019-07-23T13:07:01Z,OWNER,,"While building https://github.com/dogsheep/healthkit-to-sqlite I found a need for a neat mechanism for easily building lookup tables - tables where each unique value in a column is replaced by a foreign key to a separate table. csvs-to-sqlite currently creates those with its ""extract"" mechanism - but that's written as custom code against Pandas. I'd like to eventually replace Pandas with sqlite-utils there. See also #42 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/44/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602181581,MDU6SXNzdWU2MDIxODE1ODE=,44,"tweet[""source""] can be an empty string",9599,simonw,closed,0,,,,,0,2020-04-17T19:18:26Z,2020-04-17T22:01:44Z,2020-04-17T22:01:44Z,MEMBER,,"Got this excepion: ``` File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 641, in extract_and_save_source details = m.groupdict() AttributeError: 'NoneType' object has no attribute 'groupdict' ``` I traced it back to this tweet: https://twitter.com/osder/status/578712651393576960 ``` (Pdb) source_re re.compile('.*?)"".*?>(?P.*?)') (Pdb) locals()['source'] '' (Pdb) u > /Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py(393)save_tweets() -> tweet[""source""] = extract_and_save_source(db, tweet[""source""]) (Pdb) tweet {'created_at': '2015-03-20T00:20:22+00:00', 'id': 578712651393576960, 'full_text': '@osder', 'truncated': False, 'display_text_range': [0, 6], 'source': '', 'in_reply_to_status_id': 578712521382715392, 'in_reply_to_user_id': 1545741, 'in_reply_to_screen_name': 'osder', 'geo': None, 'coordinates': None, 'place': None, 'contributors': None, 'is_quote_status': False, 'retweet_count': 0, 'favorite_count': 0, 'favorited': False, 'retweeted': False, 'lang': 'und', 'user': 1545741} ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/44/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 660413281,MDU6SXNzdWU2NjA0MTMyODE=,44,Rename tags.repo_id column to tags.repo,9599,simonw,closed,0,,,,,0,2020-07-18T22:13:46Z,2020-07-18T22:15:12Z,2020-07-18T22:15:12Z,MEMBER,,"For improved consistency with other tables. https://observablehq.com/@simonw/datasette-table-diagram ![datasette-table-diagram(1)](https://user-images.githubusercontent.com/9599/87862843-3cca4900-c909-11ea-9c76-58b3f4aca43f.png) ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/44/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 271242824,MDU6SXNzdWUyNzEyNDI4MjQ=,45,Run SQLite operations in a thread pool,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-05T02:27:12Z,2017-11-05T02:27:34Z,2017-11-05T02:27:33Z,OWNER,,"Let's run SQLite operations in threads, so we don't end up blocking our core event loop. These articles are helpful: * https://pymotw.com/3/asyncio/executors.html * https://marlinux.wordpress.com/2017/05/19/python-3-6-asyncio-sqlalchemy/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/45/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 471684708,MDExOlB1bGxSZXF1ZXN0MzAwMjg2NTM1,45,"Implemented table.lookup(...), closes #44",9599,simonw,closed,0,,,,,0,2019-07-23T13:03:30Z,2019-07-23T13:07:00Z,2019-07-23T13:07:00Z,OWNER,simonw/sqlite-utils/pulls/45,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/45/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 602619330,MDU6SXNzdWU2MDI2MTkzMzA=,45,Use raise_for_status() everywhere,9599,simonw,open,0,,,,,1,2020-04-19T04:38:28Z,2020-04-19T04:39:22Z,,MEMBER,,"I keep seeing errors which I think are caused by authentication or rate limit problems but which appear to be unexpected JSON responses - presumably because they are actually an error message. Recent example: https://github.com/simonw/jsk-fellows-on-twitter/runs/598892575 Using `response.raise_for_status()` everywhere will make these errors less confusing.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/45/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 660429601,MDU6SXNzdWU2NjA0Mjk2MDE=,45,Fix the demo - it breaks because of the tags table change,9599,simonw,closed,0,,,,,5,2020-07-18T22:49:32Z,2020-07-18T23:03:14Z,2020-07-18T23:03:13Z,MEMBER,,"https://github.com/dogsheep/github-to-sqlite/runs/885773677 ``` File ""/home/runner/work/github-to-sqlite/github-to-sqlite/github_to_sqlite/utils.py"", line 476, in save_tags db[""tags""].insert_all( File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1145, in insert_all result = self.db.conn.execute(query, params) sqlite3.OperationalError: table tags has no column named repo ``` That's because I changed the name in #44. I thought this would be safe since no-one else could possibly be using this yet (it hadn't shipped in a release) but turns out I broke my demo!",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/45/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 271301468,MDU6SXNzdWUyNzEzMDE0Njg=,46,Dockerfile should build more recent SQLite with FTS5 and spatialite support,9599,simonw,closed,0,,,,,13,2017-11-05T18:16:22Z,2017-11-17T14:32:12Z,2017-11-17T14:32:12Z,OWNER,,"The SQLite bundled with Python 3 doesn't support the FTS5 search extension. It would be nice if the SQLite built by our Dockerfile could support as many modern SQLite features as possible. https://web.archive.org/web/20170212034155/http://charlesleifer.com/blog/using-the-sqlite-json1-and-fts5-extensions-with-python/ has instructions on building a more recent SQLite and the pysqlite package. Our Dockerfile could carry out an updated version of this process.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/46/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 471780443,MDU6SXNzdWU0NzE3ODA0NDM=,46,extracts= option for insert/update/etc,9599,simonw,closed,0,,,,,3,2019-07-23T15:55:46Z,2020-03-01T16:53:40Z,2019-07-23T17:00:44Z,OWNER,,"Relates to #42 and #44. I want the ability to extract values out into lookup tables during bulk insert/upsert operations. `db.insert_all(rows, extracts=[""species""])` - creates species table for values in the species column `db.insert_all(rows, extracts={""species"": ""Species""})` - as above but the new table is called `Species`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/46/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610284471,MDU6SXNzdWU2MTAyODQ0NzE=,46,Error running 'search' for the first time,9599,simonw,closed,0,,,,,0,2020-04-30T18:11:20Z,2020-04-30T18:11:58Z,2020-04-30T18:11:58Z,MEMBER,,"``` % twitter-to-sqlite search infodemic.db '#infodemic' Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/bin/twitter-to-sqlite"", line 11, in load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')() File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 867, in search for tweet in tweets: File ""/Users/simon/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 165, in fetch_timeline [since_type_id, since_key], sqlite3.OperationalError: no such table: since_ids ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/46/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 664485022,MDU6SXNzdWU2NjQ0ODUwMjI=,46,Feature: pull request reviews and comments,1326704,bhrutledge,open,0,,,,,6,2020-07-23T13:43:45Z,2022-12-20T14:40:15Z,,NONE,,"Hi there! I saw your [presentation at Boston Python](https://www.meetup.com/bostonpython/events/271887195). I'm already a light user of Datasette (thank you!), but wasn't aware of this project. I've been working on a ""pull request dashboard"" to get a comprehensive view of the state of open PR's, esp. related to reviews (i.e., pending, approved, changes requested). Currently it's a CLI command, but I thought a Datasette UI might be fun. I see that PR's are available from the `issues` command, but I don't see reviews anywhere. From the [API docs](https://docs.github.com/en/rest/reference/pulls#reviews), it looks like there are separate endpoints for those (as well as pull requests in general). What do you think about adding that? Would you accept a PR? Any sense of the level of effort?",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/46/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 271831408,MDU6SXNzdWUyNzE4MzE0MDg=,47,Create neat example database,9599,simonw,closed,0,,,,,5,2017-11-07T13:29:38Z,2017-11-14T03:08:13Z,2017-11-14T03:08:13Z,OWNER,,How about data from open elections eg https://github.com/openelections/openelections-data-ca?files=1,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/47/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 471797101,MDExOlB1bGxSZXF1ZXN0MzAwMzc3NTk5,47,extracts= table parameter,9599,simonw,closed,0,,,,,0,2019-07-23T16:30:29Z,2019-07-23T17:00:43Z,2019-07-23T17:00:43Z,OWNER,simonw/sqlite-utils/pulls/47,Still needs docs. Refs #46,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/47/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 639542974,MDU6SXNzdWU2Mzk1NDI5NzQ=,47,Fall back to FTS4 if FTS5 is not available,73579,hpk42,open,0,,,,,3,2020-06-16T10:11:23Z,2020-06-17T20:13:48Z,,NONE,,"got this with version 0.21.1 from pypi. twitter-to-sqlite auth worked but then ""twitter-to-sqlite user-timeline USER.db"" produced a tracekback ending in ""no such module: FTS5"". ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/47/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 681086659,MDU6SXNzdWU2ODEwODY2NTk=,47,emojis command,9599,simonw,closed,0,,,,,1,2020-08-18T14:26:26Z,2020-08-18T14:52:13Z,2020-08-18T14:52:13Z,MEMBER,,For fun - it can import https://api.github.com/emojis - maybe with an option to fetch the binary representations in addition to the URLs.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/47/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 272391665,MDU6SXNzdWUyNzIzOTE2NjU=,48,Switch to ujson,9599,simonw,closed,0,,,,,4,2017-11-08T23:50:29Z,2019-06-24T06:57:54Z,2019-06-24T06:57:43Z,OWNER,,"ujson is already a dependency of Sanic, and should be quite a bit faster.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/48/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 471818939,MDU6SXNzdWU0NzE4MTg5Mzk=,48,"Jupyter notebook demo of the library, launchable on Binder",9599,simonw,closed,0,,,,,2,2019-07-23T17:05:05Z,2022-01-26T02:08:46Z,2022-01-26T02:08:39Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/48/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 663976976,MDU6SXNzdWU2NjM5NzY5NzY=,48,Add a table of contents to the README,9599,simonw,closed,0,,,,,3,2020-07-22T18:54:33Z,2020-07-23T17:46:07Z,2020-07-22T19:03:02Z,MEMBER,,Using https://github.com/jonschlinkert/markdown-toc,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/48/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 681228542,MDExOlB1bGxSZXF1ZXN0NDY5NjUxNzMy,48,Add pull requests,755825,adamjonas,closed,0,,,,,2,2020-08-18T17:58:44Z,2020-11-29T23:51:09Z,2020-11-29T23:51:09Z,CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/48,"ref #46 Issues don't have merge information on them, which means that PRs need to be pulled separately. Did my best to mimic the API of issues.",207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/48/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 272661336,MDU6SXNzdWUyNzI2NjEzMzY=,49,Pick a name,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-11-09T17:56:17Z,2017-11-10T18:33:22Z,2017-11-10T18:33:22Z,OWNER,,"Options so far: * immutabase * datasite * sqlstatic * dbserve * sqlserve Terms to play with: * immutable * sqlite * dataset * json * static * serve",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/49/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 472115381,MDU6SXNzdWU0NzIxMTUzODE=,49,extracts= should support multiple-column extracts,9599,simonw,open,0,,,,,10,2019-07-24T07:06:41Z,2020-10-16T19:18:19Z,,OWNER,,"Lookup tables can be constructed on compound columns, but the `extracts=` option doesn't currently support that. Right now extracts can be defined in two ways: ```python # Extract these columns into tables with the same name: dogs = db.table(""dogs"", extracts=[""breed"", ""most_recent_trophy""]) # Same as above but with custom table names: dogs = db.table(""dogs"", extracts={""breed"": ""Breeds"", ""most_recent_trophy"": ""Trophies""}) ``` Need some kind of syntax for much more complicated extractions, like when two columns (say ""source"" and ""source_version"") are extracted into a single table.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/49/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 681575714,MDExOlB1bGxSZXF1ZXN0NDY5OTQ0OTk5,49,"Document the use of --stop_after with favorites, refs #20",370930,mikepqr,closed,0,,,,,1,2020-08-19T06:10:52Z,2021-08-20T00:02:11Z,2021-08-20T00:02:11Z,CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/49,(I discovered this trawling the issues for how to use --since with favorites),206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/49/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 703216044,MDU6SXNzdWU3MDMyMTYwNDQ=,49,Feature: gists and starred gists,9599,simonw,open,0,,,,,0,2020-09-17T02:30:52Z,2020-09-17T02:30:52Z,,MEMBER,,https://developer.github.com/v3/gists/#list-starred-gists,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/49/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 272694136,MDU6SXNzdWUyNzI2OTQxMzY=,50,Unit tests against application itself,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-09T19:31:49Z,2017-11-11T22:23:22Z,2017-11-11T22:23:22Z,OWNER,,"Use Sanic’s testing mechanism. Test should create a temporary SQLite database file on disk by executing sql that is stored in the test themselves. For the moment we can just test the JSON API more thoroughly and just sanity check that the HTML output doesn’t throw any errors.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/50/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 473083260,MDU6SXNzdWU0NzMwODMyNjA=,50,"""Too many SQL variables"" on large inserts",9599,simonw,closed,0,,,,,4,2019-07-25T21:43:31Z,2022-11-04T14:38:36Z,2019-07-28T11:59:33Z,OWNER,,"Reported here: https://github.com/dogsheep/healthkit-to-sqlite/issues/9 It looks like there's a default limit of 999 variables - we need to be smart about that, maybe dynamically lower the batch size based on the number of columns.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/50/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 698791218,MDU6SXNzdWU2OTg3OTEyMTg=,50,"favorites --stop_after=N stops after min(N, 200)",370930,mikepqr,open,0,,,,,2,2020-09-11T03:38:14Z,2020-09-13T05:11:14Z,,CONTRIBUTOR,,"For any number greater than 200, `favorites --stop_after` stops after getting 200 tweets, e.g. ``` $ twitter-to-sqlite favorites tweets.db --stop_after=300 Importing favorites [####################################] 199 $ ``` I don't _think_ this is a limitation of the API (if you omit `--stop_after` you get some very large number, possibly all of them), so I _think_ this is a bug.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/50/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 703218756,MDU6SXNzdWU3MDMyMTg3NTY=,50,Commands for making authenticated API calls,9599,simonw,open,0,,,,,7,2020-09-17T02:39:07Z,2020-10-19T05:01:29Z,,MEMBER,,"Similar to `twitter-to-sqlite fetch`, see https://github.com/dogsheep/twitter-to-sqlite/issues/51",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/50/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 272735257,MDU6SXNzdWUyNzI3MzUyNTc=,51,Make a proper README,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-09T21:46:07Z,2017-11-13T18:44:23Z,2017-11-13T18:44:23Z,OWNER,,Include instructions on building a local Docker container - currently detailed here: https://gist.github.com/simonw/0ea5c960608c2d876e4637a5e48aa95d (those instructions don't work now that we have removed the Dockerfile in favour of a template generated by `datasette publish`),107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/51/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 473733752,MDExOlB1bGxSZXF1ZXN0MzAxODI0MDk3,51,"Fix for too many SQL variables, closes #50",9599,simonw,closed,0,,,,,1,2019-07-28T11:30:30Z,2019-07-28T11:59:32Z,2019-07-28T11:59:32Z,OWNER,simonw/sqlite-utils/pulls/51,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/51/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 703218448,MDU6SXNzdWU3MDMyMTg0NDg=,51,Documentation for twitter-to-sqlite fetch,9599,simonw,open,0,,,,,0,2020-09-17T02:38:10Z,2020-09-17T02:38:10Z,,MEMBER,,"It's mentioned in passing in the README but it deserves its own section: ``` $ twitter-to-sqlite fetch \ ""https://api.twitter.com/1.1/account/verify_credentials.json"" \ | grep '""id""' | head -n 1 ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/51/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 703246031,MDU6SXNzdWU3MDMyNDYwMzE=,51,github-to-sqlite should handle rate limits better,9599,simonw,open,0,,,,,4,2020-09-17T04:01:50Z,2022-10-14T16:34:07Z,,MEMBER,,From #50 - right now it will crash with an error of it hits the rate limit. Since the rate limit information (including reset time) is available in the headers it could automatically sleep and try again instead.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/51/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 273026602,MDU6SXNzdWUyNzMwMjY2MDI=,52,Solution for temporarily uploading DB so it can be built by docker,9599,simonw,closed,0,,,,,2,2017-11-10T18:55:25Z,2017-12-10T03:02:57Z,2017-12-10T03:02:57Z,OWNER,,For the `datasette publish` command I ideally need a way of uploading the specified DB to somewhere temporary on the internet so that when the Dockerfile is built by the final hosting location it can download that database as part of the build process.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/52/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 476413293,MDU6SXNzdWU0NzY0MTMyOTM=,52,Throws error if .insert_all() / .upsert_all() called with empty list,9599,simonw,closed,0,,,,,1,2019-08-03T04:09:00Z,2019-11-07T04:32:39Z,2019-11-07T04:32:39Z,OWNER,,See also https://github.com/simonw/db-to-sqlite/issues/18,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/52/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 724264574,MDU6SXNzdWU3MjQyNjQ1NzQ=,52,Option to fetch README and/or HTML-rendered README for repos,9599,simonw,closed,0,,,,,0,2020-10-19T05:10:24Z,2020-10-19T05:33:42Z,2020-10-19T05:33:42Z,MEMBER,,"I'm thinking: github-to-sqlite repos ... --readme # Populates readme column with raw text github-to-sqlite repos ... --readme-html # Populates readme_html column with raw HTML https://developer.github.com/v3/repos/contents/#get-a-repository-readme",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/52/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 745393298,MDU6SXNzdWU3NDUzOTMyOTg=,52,Discussion: Adding support for fetching only fresh tweets,4169772,fatihky,closed,0,,,,,1,2020-11-18T07:01:48Z,2020-11-18T07:12:45Z,2020-11-18T07:12:45Z,NONE,,I think it'd be very useful if this tool has an option like `--incremental` to fetch only newer tweets. This way operations could complete very fast in sequential runs. I'd want to try to implement this feature if it seems OK for this tool's purpose. ,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/52/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273054652,MDU6SXNzdWUyNzMwNTQ2NTI=,53,Implement a better database index page,9599,simonw,closed,0,,,2857392,Ship first public release,3,2017-11-10T20:47:36Z,2017-11-12T21:19:33Z,2017-11-12T01:50:27Z,OWNER,,"This view isn't great. I should do a better job of separating out tables from views and indexes, showing the count of rows in each table, and maybe move the SQL to the individual table pages. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/53/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 476436920,MDExOlB1bGxSZXF1ZXN0MzAzOTkwNjgz,53,Work in progress: m2m() method for creating many-to-many records,9599,simonw,closed,0,,,,,0,2019-08-03T10:03:56Z,2019-08-04T03:38:10Z,2019-08-04T03:37:33Z,OWNER,simonw/sqlite-utils/pulls/53,"- [x] `table.insert({""name"": ""Barry""}).m2m(""tags"", lookup={""tag"": ""Coworker""})` - [x] Explicit table name `.m2m(""humans"", ..., m2m_table=""relationships"")` - [x] Automatically use an existing m2m table if a single obvious candidate exists (a table with two foreign keys in the correct directions) - [x] Require the explicit `m2m_table=` argument if multiple candidates for the m2m table exist - [x] Documentation Refs #23",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/53/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 753000405,MDU6SXNzdWU3NTMwMDA0MDU=,53,Command for fetching file contents,9599,simonw,open,0,,,,,1,2020-11-29T20:31:04Z,2020-11-30T00:36:09Z,,MEMBER,,"Something like this: github-to-sqlite files github.db simonw/datasette This would fetch all files from the `main` branch into a `files` table. Additional options could handle things like pulling files from a branch or tag, or just pulling files that match a specific glob or that exist in a specific directory.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/53/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 771324837,MDU6SXNzdWU3NzEzMjQ4Mzc=,53,--since support for favorites,27,anotherjesse,closed,0,,,,,1,2020-12-19T07:08:23Z,2020-12-19T07:47:11Z,2020-12-19T07:47:11Z,NONE,,"Having support for `--since` for updating your favorites would be ideal as the api is both slow and it only returns ~3k most recent favorites. https://twittercommunity.com/t/cant-get-all-favorite-tweets-by-rest-api/22007/3 The api seems to take an optional `since_id` parameter - https://developer.twitter.com/en/docs/twitter-api/v1/tweets/post-and-engage/api-reference/get-favorites-list",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/53/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273121803,MDU6SXNzdWUyNzMxMjE4MDM=,54,Views should not attempt to link to records / use rowids,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-11T05:44:54Z,2017-11-12T21:29:42Z,2017-11-12T21:29:33Z,OWNER,,"http://localhost:8001/parlgov-development-25f9855/view_variable ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/54/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 480961330,MDU6SXNzdWU0ODA5NjEzMzA=,54,"Ability to list views, and to access db[""view_name""].rows / rows_where / etc",20264,ftrain,closed,0,,,,,5,2019-08-15T02:00:28Z,2019-08-23T12:41:09Z,2019-08-23T12:20:15Z,NONE,,"The docs show me how to create a view via `db.create_view()` but I can't seem to get back to that view post-creation; if I query it as a table it returns `None`, and it doesn't appear in the table listing, even though querying the view works fine from inside the sqlite3 command-line. It'd be great to have the view as a pseudo-table, or if the python/sqlite3 module makes that hard to pull off (I couldn't figure it out), to have that edge-case documented next to the `db.create_view()` docs.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/54/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 753026003,MDU6SXNzdWU3NTMwMjYwMDM=,54,github-to-sqlite workflows command,9599,simonw,closed,0,,,,,3,2020-11-29T21:56:42Z,2020-11-29T22:08:46Z,2020-11-29T21:57:17Z,MEMBER,,"A command that fetches the YAML workflows for different repos, parses them and stores them in relational tables would be really useful for maintaining larger numbers of workflows.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/54/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 779088071,MDU6SXNzdWU3NzkwODgwNzE=,54,Archive import appears to be broken on recent exports,21148,jacobian,open,0,,,,,5,2021-01-05T14:18:01Z,2023-01-04T11:06:55Z,,CONTRIBUTOR,,"I requested a Twitter export yesterday, and unfortunately they seem to have changed it such that `twitter-to-sqlite import` can't handle it anymore 😢 So far I've ran into two issues. The first was easy to work around, but the second will take more investigation. If I can find the time I'll keep working on it and update this issue accordingly. The issues (so far): ### 1. Data seems to have moved to a `data/` subdirectory Running `twitter-to-sqlite import` on the raw zip file reports a bunch of ""not yet implemented"" errors, and then exits without actually importing anything: ``` ❯ twitter-to-sqlite import tarchive.db twitter.zip ... data/manifest: not yet implemented data/account-creation-ip: not yet implemented data/account-suspension: not yet implemented ... (dozens of more lines like this, including critical stuff like data/tweets) ... ``` (`tarchive.db` now exists, but is empty) Workaround: unpack the zip file, and run `twitter-to-sqlite import tarchive.db path/to/archive/data` That gets further, but: ### 2. Some schema(s?) have changed At least, the `blocks` schema seems different now: ``` ❯ twitter-to-sqlite import tarchive.db archive/data direct-messages-group: not yet implemented branch-links: not yet implemented periscope-expired-broadcasts: not yet implemented direct-messages: not yet implemented mute: not yet implemented Traceback (most recent call last): File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/bin/twitter-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/twitter_to_sqlite/cli.py"", line 772, in import_ archive.import_from_file(db, filepath.name, open(filepath, ""rb"").read()) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/twitter_to_sqlite/archive.py"", line 215, in import_from_file to_insert = transformer(data) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/twitter_to_sqlite/archive.py"", line 115, in lists_member return {""lists-member"": _list_from_common(data)} File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jacobian-dogsheep-4AXaN4tu-py3.8/lib/python3.8/site-packages/twitter_to_sqlite/archive.py"", line 200, in _list_from_common for url in block[""userListInfo""][""urls""]: KeyError: 'urls' ``` That's as far as I got before I needed to work on something else. I'll report back if I get further!",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/54/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 273127117,MDU6SXNzdWUyNzMxMjcxMTc=,55,Ship first version to PyPI,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-11T07:38:48Z,2017-11-13T21:19:43Z,2017-11-13T21:19:43Z,OWNER,,"Just before doing this, update the Dockerfile template to `pip install datasette` https://github.com/simonw/datasette/blob/65e350ca2a4845c25752a62c16ba58cfe2c14b9b/datasette/utils.py#L125",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/55/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 481887482,MDExOlB1bGxSZXF1ZXN0MzA4MjkyNDQ3,55,Ability to introspect and run queries against views,9599,simonw,closed,0,,,,,1,2019-08-17T13:40:56Z,2019-08-23T12:19:42Z,2019-08-23T12:19:42Z,OWNER,simonw/sqlite-utils/pulls/55,See #54 ,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/55/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 753026388,MDU6SXNzdWU3NTMwMjYzODg=,55,github-to-sqlite workflows does not correctly replace existing records,9599,simonw,closed,0,,,,,0,2020-11-29T21:58:43Z,2020-11-29T23:48:50Z,2020-11-29T23:48:50Z,MEMBER,,Following #54 - see this TODO: https://github.com/dogsheep/github-to-sqlite/blob/1b23ce11953f9f59c0161ea1f99188b55b5ea11c/github_to_sqlite/utils.py#L700,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/55/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 779211940,MDExOlB1bGxSZXF1ZXN0NTQ5MjA0MDYz,55,Fix archive imports,21148,jacobian,closed,0,,,,,2,2021-01-05T15:54:48Z,2021-08-20T00:02:49Z,2021-08-20T00:02:49Z,CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/55,This fixes the issues discussed in #54,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/55/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273127443,MDU6SXNzdWUyNzMxMjc0NDM=,56,Easy way to block search engine crawling in robots.txt,9599,simonw,closed,0,,,,,1,2017-11-11T07:46:07Z,2018-05-28T20:50:25Z,2018-05-28T20:50:24Z,OWNER,,For people who don't want their datasets to be crawled by search engines.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/56/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487847945,MDExOlB1bGxSZXF1ZXN0MzEzMDA3NDgz,56,Escape the table name in populate_fts and search.,49260,amjith,closed,0,,,,,2,2019-09-01T06:29:05Z,2019-09-02T17:23:21Z,2019-09-02T17:23:21Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/56,"The table names weren't escaped using double quotes in the populate_fts method. Reproducible case: ``` >>> import sqlite_utils >>> db = sqlite_utils.Database(""abc.db"") >>> db[""http://example.com""].insert_all([ ... {""id"": 1, ""age"": 4, ""name"": ""Cleo""}, ... {""id"": 2, ""age"": 2, ""name"": ""Pancakes""} ... ], pk=""id"") >>> db[""http://example.com""].enable_fts([""name""]) Traceback (most recent call last): File """", line 1, in db[""http://example.com""].enable_fts([""name""]) File ""/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/sqlite_utils/db.py"", l ine 705, in enable_fts self.populate_fts(columns) File ""/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/sqlite_utils/db.py"", l ine 715, in populate_fts self.db.conn.executescript(sql) sqlite3.OperationalError: unrecognized token: "":"" >>> ```",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/56/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 753122082,MDU6SXNzdWU3NTMxMjIwODI=,56,Link to example tables from the README,9599,simonw,closed,0,,,,,0,2020-11-30T04:01:51Z,2020-11-30T04:10:27Z,2020-11-30T04:10:27Z,MEMBER,,Would help demonstrate how the tool works.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/56/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 796736607,MDU6SXNzdWU3OTY3MzY2MDc=,56,Not all quoted statuses get fetched?,42315895,gsajko,closed,0,,,,,3,2021-01-29T09:48:44Z,2021-02-03T10:36:36Z,2021-02-03T10:36:36Z,NONE,," ![image](https://user-images.githubusercontent.com/42315895/106259325-5f75dc80-621f-11eb-8311-db8f2fe2a257.png) In my database I have 13300 quote tweets, but eta 3600 have `quoted_status` empty. I fetched some of them using `https://api.twitter.com/1.1/statuses/show.json?id=xx` and they did have ids of quoted tweets.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273127694,MDU6SXNzdWUyNzMxMjc2OTQ=,57,Ship a Docker image of the whole thing,9599,simonw,closed,0,,,,,7,2017-11-11T07:51:28Z,2018-06-28T04:01:51Z,2018-06-28T04:01:38Z,OWNER,,"The generated Docker images can then just inherit from that. This will speed up deploys as no need to `pip install` anything. - [x] Ship that image to Docker Hub - [ ] Update the generated Dockerfile to use it",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/57/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487987958,MDExOlB1bGxSZXF1ZXN0MzEzMTA1NjM0,57,Add triggers while enabling FTS,49260,amjith,closed,0,,,,,4,2019-09-02T04:23:40Z,2019-09-03T01:03:59Z,2019-09-02T23:42:29Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/57,"This adds the option for a user to set up triggers in the database to keep their FTS table in sync with the parent table. Ref: https://sqlite.org/fts5.html#external_content_and_contentless_tables I would prefer to make the creation of triggers the default behavior, but that will break existing usage where people have been calling `populate_fts` after inserting new rows. I am happy to make changes to the PR as you see fit. ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/57/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 758944006,MDU6SXNzdWU3NTg5NDQwMDY=,57,--readme throws 404 error if README does not exist in repo,9599,simonw,closed,0,,,,,0,2020-12-07T23:58:49Z,2020-12-16T18:17:54Z,2020-12-16T18:17:54Z,MEMBER,,It should fail silently (populate the column with a null) instead.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/57/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 907645813,MDU6SXNzdWU5MDc2NDU4MTM=,57,"Error: Use either --since or --since_id, not both",42904,rubenv,closed,0,,,,,6,2021-05-31T18:11:04Z,2021-08-20T00:01:31Z,2021-08-20T00:01:31Z,CONTRIBUTOR,,"I'm using the following command: ``` twitter-to-sqlite user-timeline -a twitter-auth.json twitter/tweets.db --since ``` Which gives the following error: ``` Error: Use either --since or --since_id, not both ``` Running without `--since`. ``` Traceback (most recent call last): File ""/usr/local/bin/twitter-to-sqlite"", line 8, in sys.exit(cli()) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 317, in user_timeline for tweet in bar: File ""/usr/local/lib/python3.9/site-packages/click/_termui_impl.py"", line 328, in generator for rv in self.iter: File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 234, in fetch_user_timeline yield from fetch_timeline( File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 202, in fetch_timeline raise Exception(str(tweets[""errors""])) Exception: [{'code': 44, 'message': 'since_id parameter is invalid.'}] ``` ``` Python 3.9.5 twitter-to-sqlite, version 0.21.3 ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/57/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273128608,MDU6SXNzdWUyNzMxMjg2MDg=,58,"publish command should detect if ""now"" is installed",9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-11T08:10:17Z,2017-11-11T16:00:07Z,2017-11-11T16:00:07Z,OWNER,,"If now is not installed, it should tell you where to get it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/58/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488293926,MDU6SXNzdWU0ODgyOTM5MjY=,58,Support enabling FTS on views,49260,amjith,closed,0,,,,,1,2019-09-02T18:56:36Z,2020-10-16T18:39:36Z,2020-10-16T18:39:31Z,CONTRIBUTOR,,"Right now enable_fts() is only implemented for Table(). Technically sqlite supports enabling fts on views. But it requires deeper thought since views don't have `rowid` and the current implementation of enable_fts() relies on the presence of `rowid` column. It is possible to provide an alternative rowid using the `content_rowid` option to the FTS5() function. Ref: https://sqlite.org/fts5.html#fts5_table_creation_and_initialization > The ""content_rowid"" option, used to set the rowid field of an external content table. This will further complicate `enable_fts()` function by adding an extra argument. I'm wondering if that is outside the scope of this tool or should I work on that feature and send a PR? ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/58/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 769150394,MDU6SXNzdWU3NjkxNTAzOTQ=,58,Readme HTML has broken internal links,9599,simonw,closed,0,,,,,2,2020-12-16T17:58:11Z,2020-12-16T19:20:14Z,2020-12-16T19:20:14Z,MEMBER,,"From https://github.com/simonw/datasette.io/issues/46 ```html
  • Filtering tables
  • ...

    Filtering tables

    ``` So this is a bug in GitHub's API, but we need to work around it.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/58/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 984939366,MDU6SXNzdWU5ODQ5MzkzNjY=,58,"Error: Use either --since or --since_id, not both - still broken",42904,rubenv,closed,0,,,,,1,2021-09-01T09:45:28Z,2021-09-21T17:37:41Z,2021-09-21T17:37:41Z,CONTRIBUTOR,,"Hi Simon, It appears the fix for #57 doesn't fix things for me: ``` $ twitter-to-sqlite --version twitter-to-sqlite, version 0.21.4 $ python --version Python 3.9.6 ``` ``` $ twitter-to-sqlite home-timeline -a twitter-auth.json twitter/timeline.db --since Importing tweets Error: Use either --since or --since_id, not both ``` Is there any way I can help debug this?",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/58/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273157085,MDU6SXNzdWUyNzMxNTcwODU=,59,datasette publish hyper,9599,simonw,closed,0,,,,,4,2017-11-11T16:27:26Z,2019-05-13T19:01:00Z,2019-05-13T19:00:44Z,OWNER,,"This is a bit tricky, because unlike Now there doesn't seem to be a way to tell Hyper to ""build this Dockerfile and deploy the resulting image"". They expect you to build a container and publish it to a registry instead. https://docs.hyper.sh/Reference/CLI/load.html allows you to publish an image directly from a tarball, but that still leaves the challenge of creating that image. The nice thing about the Now integration is that you don't need to have Docker installed on your local machine.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/59/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488338965,MDU6SXNzdWU0ODgzMzg5NjU=,59,Ability to introspect triggers,9599,simonw,closed,0,,,,,0,2019-09-02T23:47:16Z,2019-09-03T01:52:36Z,2019-09-03T00:09:42Z,OWNER,,"Now that we're creating triggers (thanks to @amjith in #57) it would be neat if we could introspect them too. I'm thinking: `db.triggers` - lists all triggers for the database `db[""tablename""].triggers` - lists triggers for that table The underlying query for this is `select * from sqlite_master where type = 'trigger'` I'll return the trigger information in a new namedtuple, similar to how Indexes and ForeignKeys work.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/59/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 771872303,MDExOlB1bGxSZXF1ZXN0NTQzMjQ2NTM1,59,Remove unneeded exists=True for -a/--auth flag.,631242,frosencrantz,closed,0,,,,,3,2020-12-21T06:03:55Z,2021-05-22T14:06:19Z,2021-05-19T16:08:12Z,CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/59,The file does not need to exist when using an environment variable.,207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/59/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 984942782,MDExOlB1bGxSZXF1ZXN0NzI0MzE3NjUw,59,"Fix for since_id bug, closes #58",42904,rubenv,closed,0,,,,,1,2021-09-01T09:49:09Z,2021-09-21T17:37:40Z,2021-09-21T17:37:40Z,CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/59,Fixes remaining instances of this bug,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/59/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273163905,MDU6SXNzdWUyNzMxNjM5MDU=,60,Rethink how metadata is generated and stored,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-11T18:01:28Z,2017-11-11T20:12:17Z,2017-11-11T20:12:16Z,OWNER,,"I broke the existing mechanism in 407795b61217205625f2d4e084afbf69f1db781b In order to get unit tests for the sanic app working. I think i should ditch the build-metadata.json cache file entirely and calculate the SHA hashes on startup. Not sure what to do about the table row counts.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/60/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488341021,MDExOlB1bGxSZXF1ZXN0MzEzMzgzMzE3,60,db.triggers and table.triggers introspection,9599,simonw,closed,0,,,,,0,2019-09-03T00:04:32Z,2019-09-03T00:09:42Z,2019-09-03T00:09:42Z,OWNER,simonw/sqlite-utils/pulls/60,Closes #59,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/60/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 797097140,MDU6SXNzdWU3OTcwOTcxNDA=,60,Use Data from SQLite in other commands,22578954,daniel-butler,open,0,,,,,3,2021-01-29T18:35:52Z,2021-02-12T18:29:43Z,,CONTRIBUTOR,,"As a total beginner here how could you access data from the sqlite table to run other commands. What I am thinking is I want to get all the repos in an organization then using the repo list pull all the commit messages for each repo. I love this project by the way!",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/60/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1063982712,I_kwDODEm0Qs4_axZ4,60,Execution on Windows,1733616,bernard01,open,0,,,,,1,2021-11-26T00:24:34Z,2022-10-14T16:58:27Z,,NONE,,"My installation on Windows using pip has been successful. I have Python 3.6. How do I run twitter-to-sqlite? I cannot even figure out how ""auth"" is a command. I have python on my path: C:\prog\python\Python36;C:\prog\python\Python36\Scripts Where should the commands be executed, and where are the files created? Could some basics please be added to the documentation to get beginners started?",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/60/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 273173116,MDU6SXNzdWUyNzMxNzMxMTY=,61,Common header and footer,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-11T20:20:08Z,2017-11-11T20:37:19Z,2017-11-11T20:37:19Z,OWNER,,"Split from #16 - [x] A link to the homepage from some kind of navigation bar in the header - [x] link to github.com/simonw/datasette in the footer - [x] Slightly better titles (maybe ditch the visited link colours for titles only? should keep those for primary key links)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/61/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 491219910,MDU6SXNzdWU0OTEyMTk5MTA=,61,importing CSV to SQLite as library,17739,witeshadow,closed,0,,,,,2,2019-09-09T17:12:40Z,2019-11-04T16:25:01Z,2019-11-04T16:25:01Z,NONE,,"CSV can be imported to SQLite when used CLI, but I don't see documentation for when using as library. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/61/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 797108702,MDExOlB1bGxSZXF1ZXN0NTY0MTcyMTQw,61,fixing typo in get cli help text,22578954,daniel-butler,closed,0,,,,,1,2021-01-29T18:57:04Z,2021-05-19T16:07:09Z,2021-05-19T16:07:09Z,CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/61,,207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/61/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1077560091,I_kwDODEm0Qs5AOkMb,61,"Data Pull fails for ""Essential"" level access to the Twitter API (for Documentation)",57161638,jmnickerson05,open,0,,,,,1,2021-12-11T14:59:41Z,2022-10-31T14:47:58Z,,NONE,,"Per Twitter documentation: https://developer.twitter.com/en/docs/twitter-api/getting-started/about-twitter-api#v2-access-leve This isn't any fault of twitter-to-sqlite of course, but it should probably be documented as a side-note. ![image](https://user-images.githubusercontent.com/57161638/145681272-8c85b3b9-be95-44ff-9760-1bafa4917ce2.png) And this is how I'm surfacing the message from utils.py: ![image](https://user-images.githubusercontent.com/57161638/145681005-2776c0ad-9822-4461-b43a-450ab2e828eb.png) ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/61/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 273174397,MDU6SXNzdWUyNzMxNzQzOTc=,62,Link to .json and .jsono versions on various pages,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-11T20:37:47Z,2017-11-11T22:41:06Z,2017-11-11T22:41:06Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/62/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 500783373,MDU6SXNzdWU1MDA3ODMzNzM=,62,[enhancement] Method to delete a row in python,4454869,Sergeileduc,closed,0,,,,,5,2019-10-01T09:45:47Z,2019-11-04T16:30:34Z,2019-11-04T16:18:18Z,NONE,,"Hi ! Thanks for the lib ! Obviously, every possible sql queries won't have a dedicated method. But I was thinking : a method to delete a row (I'm terrible with names, maybe `delete_where()` or something, would be useful. I have a Database, with primary key. For the moment, I use : ```Python3 db.conn.execute(f""DELETE FROM table WHERE key = {key_id}"") db.conn.commit() ``` to delete a row I don't need anymore, giving his primary key. Works like a charm. Just an idea : ```Python3 table.delete_where_pkey({'key': key_id}) ``` or something (I know, I'm terrible at naming methods...). Pros : well, no need to write SQL query. Cons : WHERE normally allows to do many more things (operators =, <>, >, <, BETWEEN), not to mention AND, OR, etc... Method is maybe to specific, and/or a pain to render more flexible. Again, just a thought. Writing his own sql works too, so... Thanks again. See yah.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/62/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 797784080,MDU6SXNzdWU3OTc3ODQwODA=,62,Stargazers and workflows commands always require an auth file when using GITHUB_TOKEN ,631242,frosencrantz,open,0,,,,,0,2021-01-31T18:56:05Z,2021-01-31T18:56:05Z,,CONTRIBUTOR,,"Requested fix in https://github.com/dogsheep/github-to-sqlite/pull/59 The stargazers and workflows commands always require an auth file, even when using a `GITHUB_TOKEN`. Other commands don't require the auth file. ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/62/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1088816961,I_kwDODEm0Qs5A5gdB,62,KeyError: 'created_at' for private accounts?,6764957,swyxio,closed,0,,,,,2,2021-12-26T17:51:51Z,2022-03-12T02:36:32Z,2022-02-24T18:10:18Z,NONE,,"hey Simon! i was running `twitter-to-sqlite user-timeline twitter.db` for [my private alt](https://twitter.com/swyxio) and ran into this error:
    ![image](https://user-images.githubusercontent.com/6764957/147416165-46b69c30-100a-406f-8534-8612b75547ae.png) ```bash Traceback (most recent call last): File ""/Users/swyx/Work/datasette/env/bin/twitter-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 291, in user_timeline profile = utils.get_profile(db, session, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 133, in get_profile save_users(db, [profile]) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 453, in save_users transform_user(user) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 285, in transform_user user[""created_at""] = parser.parse(user[""created_at""]) KeyError: 'created_at' ```
    this looks awfully like #37 but it can't be, because i'm authed into my account and obviously i have perms to read my own account. wonder if there's any diagnostic methods i should apply here? just filing an issue for others to find while i investigate.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273174447,MDU6SXNzdWUyNzMxNzQ0NDc=,63,Review design of JSON output,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-11T20:38:33Z,2017-11-11T22:20:17Z,2017-11-11T22:20:17Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/63/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 517241040,MDU6SXNzdWU1MTcyNDEwNDA=,63,ensure_index() method,9599,simonw,closed,0,,,,,1,2019-11-04T15:51:22Z,2019-11-04T16:20:36Z,2019-11-04T16:20:35Z,OWNER,,"```python db[""table""].ensure_index([""col1"", ""col2""]) ``` This will do the following: - if the specified table or column does not exist, do nothing - if they exist and already have an index, do nothing - otherwise, create the index I want this for tools like [twitter-to-sqlite search](https://github.com/dogsheep/twitter-to-sqlite/blob/801c0c2daf17d8abce9dcb5d8d610410e7e25dbe/README.md#running-searches) where the `search_runs` table may or not have been created yet but, if it IS created, I want to put an index on the `hash` column.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/63/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 897212458,MDU6SXNzdWU4OTcyMTI0NTg=,63,Ability to fetch commits from branches other than the default,9599,simonw,open,0,,,,,0,2021-05-20T17:58:08Z,2021-05-20T17:58:08Z,,MEMBER,,This tool is currently almost entirely ignorant of the concept of branches. One example: you can't retrieve commits from any branch other than the default (usually main).,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/63/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1091850530,I_kwDODEm0Qs5BFFEi,63,Import archive error 'withheld_in_countries',521097,pauloxnet,open,0,,,,,0,2022-01-01T16:58:59Z,2022-01-01T16:58:59Z,,NONE,,"Importing the twitter archive I received this error: ```bash $ twitter-to-sqlite import archive.db twitter-2021-12-31-.zip birdwatch-note-rating: not yet implemented birdwatch-note: not yet implemented branch-links: not yet implemented community-tweet: not yet implemented contact: not yet implemented device-token: not yet implemented direct-message-mute: not yet implemented mute: not yet implemented periscope-account-information: not yet implemented periscope-ban-information: not yet implemented periscope-broadcast-metadata: not yet implemented periscope-comments-made-by-user: not yet implemented periscope-expired-broadcasts: not yet implemented periscope-followers: not yet implemented periscope-profile-description: not yet implemented professional-data: not yet implemented protected-history: not yet implemented reply-prompt: not yet implemented screen-name-change: not yet implemented smartblock: not yet implemented spaces-metadata: not yet implemented sso: not yet implemented Traceback (most recent call last): File ""/home/paulox/.virtualenvs/dogsheep/bin/twitter-to-sqlite"", line 8, in sys.exit(cli()) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 759, in import_ archive.import_from_file(db, filename, content) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/twitter_to_sqlite/archive.py"", line 246, in import_from_file db[table_name].insert_all(rows, pk=pk, replace=True) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2625, in insert_all self.insert_chunk( File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py"", line 2406, in insert_chunk result = self.db.execute(query, params) File ""/home/paulox/.virtualenvs/dogsheep/lib/python3.9/site-packages/sqlite_utils/db.py"", line 422, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: table archive_tweet has no column named withheld_in_countries ``` I found only a single tweet with the key `withheld_in_countries` in `tweet.js` that seems the problems: ```JSON [ { ""tweet"" : { ""retweeted"" : false, ""source"" : ""Twitter for Android"", ""entities"" : { ""hashtags"" : [ { ""text"" : ""NowOnAndroid"", ""indices"" : [ ""64"", ""77"" ] } ], ""symbols"" : [ ], ""user_mentions"" : [ { ""name"" : ""Periscope"", ""screen_name"" : ""PeriscopeCo"", ""indices"" : [ ""3"", ""15"" ], ""id_str"" : ""1111111111"", ""id"" : ""222222222"" } ], ""urls"" : [ { ""url"" : ""https://t.co/xxxxxxxxx"", ""expanded_url"" : ""https://vine.co/v/xxxxxxxxx"", ""display_url"" : ""vine.co/v/xxxxxxxxxx"", ""indices"" : [ ""78"", ""101"" ] } ] }, ""display_text_range"" : [ ""0"", ""101"" ], ""favorite_count"" : ""0"", ""id_str"" : ""1111111111111111111111"", ""truncated"" : false, ""retweet_count"" : ""0"", ""withheld_in_countries"" : [ ""TR"" ], ""id"" : ""000000000000000000"", ""possibly_sensitive"" : false, ""created_at"" : ""Fri Aug 14 06:04:03 +0000 2015"", ""favorited"" : false, ""full_text"" : ""RT @periscopeco: Travel the world. LIVE. The Global Map is here #NowOnAndroid https://t.co/NZXdsPWROk"", ""lang"" : ""en"" } } ] ``` I solved the error removing the key from the `tweet.js` but I'm reporting this error to improve the project.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/63/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 273181020,MDU6SXNzdWUyNzMxODEwMjA=,64,Support for ?field__isnull=1 or similar,9599,simonw,closed,0,,,,,1,2017-11-11T22:26:52Z,2017-11-17T14:38:21Z,2017-11-17T14:38:21Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/64/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 519032008,MDExOlB1bGxSZXF1ZXN0MzM3ODQ3NTcz,64,test_insert_upsert_all_empty_list,9599,simonw,closed,0,,,,,0,2019-11-07T04:24:45Z,2019-11-07T04:32:38Z,2019-11-07T04:32:38Z,OWNER,simonw/sqlite-utils/pulls/64,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/64/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 920636216,MDU6SXNzdWU5MjA2MzYyMTY=,64,"feature: support ""events""",231498,khimaros,open,0,,,,,5,2021-06-14T17:42:49Z,2021-06-15T00:48:37Z,,NONE,,"the GitHub API provides the ability to fetch all events for a given user, organization, or repository: https://docs.github.com/en/rest/reference/activity#list-events-for-the-authenticated-user this would allow users to export all of the issue comments, new issues, etc. that they created. something which is currently missing from the GitHub takeout exports.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/64/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1097332098,I_kwDODEm0Qs5BZ_WC,64,Include all entities for tweets,111631,max,open,0,,,,,0,2022-01-09T23:35:28Z,2022-01-09T23:35:28Z,,NONE,,"Per our conversation [on Twitter](https://twitter.com/mschoening/status/1480312477246054401): It would be neat if all entities (including URLs) were captured. This way you can ensure, that URLs are parsed out exactly the same way Twitter parses URLs – we all know parsing URLs with a regex ain't fun. Right now, I believe the tool filters out all entities that are not of type `media`.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/64/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 273191608,MDU6SXNzdWUyNzMxOTE2MDg=,65,Re-implement ?sql= mode,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-12T01:47:17Z,2017-11-12T02:36:37Z,2017-11-12T02:35:42Z,OWNER,,"Here's the code I removed: async def data(self, request, name, hash): sql = 'select * from sqlite_master' custom_sql = False params = {} if request.args.get('sql'): params = request.raw_args sql = params.pop('sql') validate_sql_select(sql) custom_sql = True rows = await self.execute(name, sql, params) columns = [r[0] for r in rows.description] return { 'database': name, 'rows': rows, 'columns': columns, 'query': { 'sql': sql, 'params': params, } }, { 'database_hash': hash, 'custom_sql': custom_sql, } ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/65/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 519039316,MDExOlB1bGxSZXF1ZXN0MzM3ODUzMzk0,65,Release 1.12.1,9599,simonw,closed,0,,,,,0,2019-11-07T04:51:29Z,2019-11-07T04:58:48Z,2019-11-07T04:58:47Z,OWNER,simonw/sqlite-utils/pulls/65,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/65/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 923270900,MDExOlB1bGxSZXF1ZXN0NjcyMDUzODEx,65,basic support for events,231498,khimaros,open,0,,,,,2,2021-06-17T00:51:30Z,2022-10-03T22:35:03Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/65,"a quick first pass at implementing the feature requested in https://github.com/dogsheep/github-to-sqlite/issues/64 testing instructions: ``` $ github-to-sqlite events events.db user/khimaros ``` if the specified user is the authenticated user, it will also include private events. caveat: pagination appears to be broken (i don't see `next` in the response JSON from GitHub)",207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/65/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1160327106,PR_kwDODEm0Qs4z_V3w,65,"Update Twitter dev link, clarify apps vs projects",2657547,rixx,open,0,,,,,0,2022-03-05T11:56:08Z,2022-03-05T11:56:08Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/65,"Twitter pushes you heavily towards v2 projects instead of v1 apps – I know the README mentions v1 API compatibility at the top, but I still nearly got turned around here.",206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/65/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273191806,MDU6SXNzdWUyNzMxOTE4MDY=,66,Show table SQL on table page,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-12T01:51:23Z,2017-11-12T21:17:29Z,2017-11-12T21:17:29Z,OWNER,,"Let's do the SQL for the table you are looking at, plus SQL for any indexes that mention that table. The page for a view should show the SQL for that view.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/66/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 521868864,MDU6SXNzdWU1MjE4Njg4NjQ=,66,"The "".upsert()"" method is misnamed",9599,simonw,closed,0,,,,,15,2019-11-12T23:48:28Z,2019-12-31T01:30:21Z,2019-12-31T01:30:20Z,OWNER,,"This thread here is illuminating: https://stackoverflow.com/questions/3634984/insert-if-not-exists-else-update The term `UPSERT` in SQLite has a specific meaning as-of 3.24.0 (2018-06-04): https://www.sqlite.org/lang_UPSERT.html It means ""behave as an UPDATE or a no-op if the INSERT would violate a uniqueness constraint"". The syntax in 3.24.0+ looks like this (confusingly it does not use the term ""upsert""): ```sql INSERT INTO phonebook(name,phonenumber) VALUES('Alice','704-555-1212') ON CONFLICT(name) DO UPDATE SET phonenumber=excluded.phonenumber ``` Here's the problem: the `sqlite-utils` `.upsert()` and `.upsert_all()` methods don't do this. They use the following SQL: ```sql INSERT OR REPLACE INTO [{table}] ({columns}) VALUES {rows}; ``` If the record already exists, it will be entirely replaced by a new record - as opposed to updating any specified fields but leaving existing fields as they are (the behaviour of ""upsert"" in SQLite itself).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/66/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 975161924,MDExOlB1bGxSZXF1ZXN0NzE2MzU3OTgy,66,Add --merged-by flag to pull-requests sub command,30531572,sarcasticadmin,open,0,,,,,1,2021-08-20T00:57:55Z,2021-09-28T21:50:31Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/66,"## Description Proposing a solution to the API limitation for `merged_by` in pull_requests. Specifically the following called out in the readme: ``` Note that the merged_by column on the pull_requests table will only be populated for pull requests that are loaded using the --pull-request option - the GitHub API does not return this field for pull requests that are loaded in bulk. ``` This approach might cause larger repos to hit rate limits called out in https://github.com/dogsheep/github-to-sqlite/issues/51 but seems to work well in the repos I tested and included below. ## Old Behavior - Had to list out the pull-requests individually via multiple `--pull-request` flags ## New Behavior - `--merged-by` flag for getting 'merge_by' information out of pull-requests without having to specify individual PR numbers. # Testing Picking some repo that has more than one merger (datasette only has 1 😉 ) ``` $ github-to-sqlite pull-requests ./github.db opnsense/tools --merged-by $ echo ""select id, url, merged_by from pull_requests;"" | sqlite3 ./github.db 83533612|https://github.com/opnsense/tools/pull/39|1915288 102632885|https://github.com/opnsense/tools/pull/43|1915288 149114810|https://github.com/opnsense/tools/pull/57|1915288 160394495|https://github.com/opnsense/tools/pull/64|1915288 163308408|https://github.com/opnsense/tools/pull/67|1915288 169723264|https://github.com/opnsense/tools/pull/69|1915288 171381422|https://github.com/opnsense/tools/pull/72|1915288 179938195|https://github.com/opnsense/tools/pull/77|1915288 196233824|https://github.com/opnsense/tools/pull/82|1915288 215289964|https://github.com/opnsense/tools/pull/93| 219696100|https://github.com/opnsense/tools/pull/97|1915288 223664843|https://github.com/opnsense/tools/pull/99| 228446172|https://github.com/opnsense/tools/pull/103|1915288 238930434|https://github.com/opnsense/tools/pull/110|1915288 255507110|https://github.com/opnsense/tools/pull/119|1915288 255980675|https://github.com/opnsense/tools/pull/120|1915288 261906770|https://github.com/opnsense/tools/pull/125| 263800503|https://github.com/opnsense/tools/pull/127|1915288 264038685|https://github.com/opnsense/tools/pull/128|1915288 264696704|https://github.com/opnsense/tools/pull/129|1915288 266660547|https://github.com/opnsense/tools/pull/130|1915288 273120409|https://github.com/opnsense/tools/pull/133|1915288 274370803|https://github.com/opnsense/tools/pull/135| 276600629|https://github.com/opnsense/tools/pull/139| 277303655|https://github.com/opnsense/tools/pull/141|1915288 293033714|https://github.com/opnsense/tools/pull/145| 294827649|https://github.com/opnsense/tools/pull/146| 295140008|https://github.com/opnsense/tools/pull/147|1915288 305690829|https://github.com/opnsense/tools/pull/150|9783985 307077931|https://github.com/opnsense/tools/pull/152|1915288 321782100|https://github.com/opnsense/tools/pull/155| 337265672|https://github.com/opnsense/tools/pull/160| 337267484|https://github.com/opnsense/tools/pull/161|1915288 368251763|https://github.com/opnsense/tools/pull/169| 428262505|https://github.com/opnsense/tools/pull/181| 437557011|https://github.com/opnsense/tools/pull/182|1915288 447079893|https://github.com/opnsense/tools/pull/185| 461822092|https://github.com/opnsense/tools/pull/191| 463290142|https://github.com/opnsense/tools/pull/193|1915288 470112962|https://github.com/opnsense/tools/pull/194|1915288 472644649|https://github.com/opnsense/tools/pull/195|1915288 488696898|https://github.com/opnsense/tools/pull/198| 513289902|https://github.com/opnsense/tools/pull/201| 522530265|https://github.com/opnsense/tools/pull/203| 564443347|https://github.com/opnsense/tools/pull/213| 597579516|https://github.com/opnsense/tools/pull/220|1915288 602860357|https://github.com/opnsense/tools/pull/221|1915288 608744738|https://github.com/opnsense/tools/pull/222|1915288 623279673|https://github.com/opnsense/tools/pull/228|1915288 664656182|https://github.com/opnsense/tools/pull/233| 664781786|https://github.com/opnsense/tools/pull/234|1915288 670683636|https://github.com/opnsense/tools/pull/235|1915288 683150764|https://github.com/opnsense/tools/pull/237| 685016233|https://github.com/opnsense/tools/pull/238| 687099825|https://github.com/opnsense/tools/pull/239|1915288 715705652|https://github.com/opnsense/tools/pull/244|1915288 715721248|https://github.com/opnsense/tools/pull/245|1915288 ``` `userid` are now present for those PRs that were merged. Without the flag the `merged_by` behavior remains missing as expected when get PRs bulk: ``` $ github-to-sqlite pull-requests ./github.db opnsense/tools $ echo ""select id, url, merged_by from pull_requests;"" | sqlite3 ./github.db 83533612|https://github.com/opnsense/tools/pull/39| 102632885|https://github.com/opnsense/tools/pull/43| 149114810|https://github.com/opnsense/tools/pull/57| 160394495|https://github.com/opnsense/tools/pull/64| 163308408|https://github.com/opnsense/tools/pull/67| 169723264|https://github.com/opnsense/tools/pull/69| 171381422|https://github.com/opnsense/tools/pull/72| 179938195|https://github.com/opnsense/tools/pull/77| 196233824|https://github.com/opnsense/tools/pull/82| 215289964|https://github.com/opnsense/tools/pull/93| 219696100|https://github.com/opnsense/tools/pull/97| 223664843|https://github.com/opnsense/tools/pull/99| 228446172|https://github.com/opnsense/tools/pull/103| 238930434|https://github.com/opnsense/tools/pull/110| 255507110|https://github.com/opnsense/tools/pull/119| 255980675|https://github.com/opnsense/tools/pull/120| 261906770|https://github.com/opnsense/tools/pull/125| 263800503|https://github.com/opnsense/tools/pull/127| 264038685|https://github.com/opnsense/tools/pull/128| 264696704|https://github.com/opnsense/tools/pull/129| 266660547|https://github.com/opnsense/tools/pull/130| 273120409|https://github.com/opnsense/tools/pull/133| 274370803|https://github.com/opnsense/tools/pull/135| 276600629|https://github.com/opnsense/tools/pull/139| 277303655|https://github.com/opnsense/tools/pull/141| 293033714|https://github.com/opnsense/tools/pull/145| 294827649|https://github.com/opnsense/tools/pull/146| 295140008|https://github.com/opnsense/tools/pull/147| 305690829|https://github.com/opnsense/tools/pull/150| 307077931|https://github.com/opnsense/tools/pull/152| 321782100|https://github.com/opnsense/tools/pull/155| 337265672|https://github.com/opnsense/tools/pull/160| 337267484|https://github.com/opnsense/tools/pull/161| 368251763|https://github.com/opnsense/tools/pull/169| 428262505|https://github.com/opnsense/tools/pull/181| 437557011|https://github.com/opnsense/tools/pull/182| 447079893|https://github.com/opnsense/tools/pull/185| 461822092|https://github.com/opnsense/tools/pull/191| 463290142|https://github.com/opnsense/tools/pull/193| 470112962|https://github.com/opnsense/tools/pull/194| 472644649|https://github.com/opnsense/tools/pull/195| 488696898|https://github.com/opnsense/tools/pull/198| 513289902|https://github.com/opnsense/tools/pull/201| 522530265|https://github.com/opnsense/tools/pull/203| 564443347|https://github.com/opnsense/tools/pull/213| 597579516|https://github.com/opnsense/tools/pull/220| 602860357|https://github.com/opnsense/tools/pull/221| 608744738|https://github.com/opnsense/tools/pull/222| 623279673|https://github.com/opnsense/tools/pull/228| 664656182|https://github.com/opnsense/tools/pull/233| 664781786|https://github.com/opnsense/tools/pull/234| 670683636|https://github.com/opnsense/tools/pull/235| 683150764|https://github.com/opnsense/tools/pull/237| 685016233|https://github.com/opnsense/tools/pull/238| 687099825|https://github.com/opnsense/tools/pull/239| 715705652|https://github.com/opnsense/tools/pull/244| 715721248|https://github.com/opnsense/tools/pull/245| ``` Individual PRs passed via `--pull-request` flag behaves as expected (unchanged): ``` $ github-to-sqlite pull-requests ./github.db opnsense/tools --pull-request 39 --pull-request 237 $ echo ""select id, url, merged_by from pull_requests;"" | sqlite3 ./github.db 83533612|https://github.com/opnsense/tools/pull/39|1915288 683150764|https://github.com/opnsense/tools/pull/237| ``` > Picking 1 PR that has a merged_by (39) and one that does not (237)",207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/66/reactions"", ""total_count"": 3, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",0, 1244082183,PR_kwDODEm0Qs44PPLy,66,Ageinfo workaround,11887,ashanan,open,0,,,,,0,2022-05-21T21:08:29Z,2022-05-21T21:09:16Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/66,"I'm not sure if this is due to a new format or just because my ageinfo file is blank, but trying to import an archive would crash when it got to that file. This PR adds a guard clause in the `ageinfo` transformer and sets a default value that doesn't throw an exception. Seems likely to be the same issue mentioned by danp in https://github.com/dogsheep/twitter-to-sqlite/issues/54, my ageinfo file looks the same. Added that same ageinfo file to the test archive as well to help confirm my workaround didn't break anything. Let me know if you want any changes!",206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/66/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273192789,MDU6SXNzdWUyNzMxOTI3ODk=,67,Command that builds a local docker container,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-12T02:13:29Z,2017-11-13T16:17:52Z,2017-11-13T16:17:52Z,OWNER,,Be nice to indicate that this isn't just for Now. Shouldn't be too hard either.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/67/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 529376481,MDExOlB1bGxSZXF1ZXN0MzQ2MjY0OTI2,67,Run tests against 3.5 too,9599,simonw,closed,0,,,,,2,2019-11-27T14:20:35Z,2019-12-31T01:29:44Z,2019-12-31T01:29:43Z,OWNER,simonw/sqlite-utils/pulls/67,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/67/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 981690086,MDExOlB1bGxSZXF1ZXN0NzIxNjg2NzIx,67,Replacing step ID key with step_id,16374374,jshcmpbll,open,0,,,,,0,2021-08-28T01:26:41Z,2021-08-28T01:27:00Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/67,"Workflows that have an `id` in any step result in the following error when running `workflows`: e.g.`github-to-sqlite workflows github.db nixos/nixpkgs` ```Traceback (most recent call last): File ""/usr/local/bin/github-to-sqlite"", line 8, in sys.exit(cli()) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 1668, in invoke```Traceback (most recent call last): File ""/usr/local/bin/github-to-sqlite"", line 8, in sys.exit(cli()) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.8/dist-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/usr/local/lib/python3.8/dist-packages/github_to_sqlite/cli.py"", line 601, in workflows utils.save_workflow(db, repo_id, filename, content) File ""/usr/local/lib/python3.8/dist-packages/github_to_sqlite/utils.py"", line 865, in save_workflow db[""steps""].insert_all( File ""/usr/local/lib/python3.8/dist-packages/sqlite_utils/db.py"", line 2596, in insert_all self.insert_chunk( File ""/usr/local/lib/python3.8/dist-packages/sqlite_utils/db.py"", line 2378, in insert_chunk result = self.db.execute(query, params) File ""/usr/local/lib/python3.8/dist-packages/sqlite_utils/db.py"", line 419, in execute return self.conn.execute(sql, parameters) sqlite3.IntegrityError: datatype mismatch ``` - [Information about the ID key in a step for GHA](https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#jobsjob_idstepsid) - [An example workflow from a public repo](https://github.com/NixOS/nixpkgs/blob/b4cc66827745e525ce7bb54659845ac89788a597/.github/workflows/direct-push.yml#L16) # Changes I'm proposing that the key for `id` in step is replaced with `step_id` so that it no longer interferes with the table `id` for tracking the record. Special thanks to @sarcasticadmin @egiffen and @ruebenramirez for helping a bit on this 😄 ",207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/67/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1513237712,PR_kwDODEm0Qs5GUoG_,67,Add support for app-only bearer tokens,26161409,sometimes-i-send-pull-requests,open,0,,,,,0,2022-12-28T23:31:20Z,2022-12-28T23:31:20Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/67,"Previously, twitter-to-sqlite only supported OAuth1 authentication, and the token must be on behalf of a user. However, Twitter also supports application-only bearer tokens, documented here: https://developer.twitter.com/en/docs/authentication/oauth-2-0/bearer-tokens This PR adds support to twitter-to-sqlite for using application-only bearer tokens. To use, the auth.json file just needs to contain a ""bearer_token"" key instead of ""api_key"", ""api_secret_key"", etc.",206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/67/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273247186,MDU6SXNzdWUyNzMyNDcxODY=,68,Support for title/source/license metadata,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-11-12T17:04:21Z,2017-12-04T04:55:43Z,2017-11-13T15:26:11Z,OWNER,,"I've decided this is important for launch: I want to set a precedent for people citing, licensing and documenting their datasets. Not sure how best to go about supporting this. I'd like to allow for the following data to be optionally attached to any given database: - Title - Description, potentially in markdown? - Original source URL - License I'd also like the ability to attach descriptions to individual tables - and maybe even to table columns? The question then becomes: how should this information be stored. A few options: - In the SQLite database itself, in a specially named table. Problem here is that this means having to modify SQLite databases before publishing them. - In a separate SQLite database that can be published alongside the databases we are publishing. - In a JSON file. This is neat, but JSON files are not a great editing experience once you start including multiple lines (e.g. a markdown description). - In a YAML file. This is a better format for multi-line descriptions, but still isn't a great editing experience. Whatever the format, it can be made much more usable by offering a web-based editing UI for populating it (a special mode the server can be run in).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/68/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 531583658,MDU6SXNzdWU1MzE1ODM2NTg=,68,Add support for porter stemming in FTS,9599,simonw,closed,0,,,,,1,2019-12-02T22:35:52Z,2020-09-20T04:25:53Z,2020-09-20T04:25:47Z,OWNER,,FTS5 can have porter stemming enabled.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/68/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1013506559,PR_kwDODFdgUs4skaNS,68,Add support for retrieving teams / members,68329,philwills,open,0,,,,,0,2021-10-01T15:55:02Z,2021-10-01T15:59:53Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/68,Adds a method for retrieving all the teams within an organisation and all the members in those teams. The latter is stored as a join table `team_members` beteween `teams` and `users`.,207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/68/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1513237982,PR_kwDODEm0Qs5GUoKL,68,Archive: Import mute table,26161409,sometimes-i-send-pull-requests,open,0,,,,,0,2022-12-28T23:32:06Z,2022-12-28T23:32:06Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/68,,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/68/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273248366,MDU6SXNzdWUyNzMyNDgzNjY=,69,Enforce pagination (or at least limits) for arbitrary custom SQL,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-11-12T17:21:33Z,2017-11-13T20:32:47Z,2017-11-13T19:35:47Z,OWNER,,"It's way too easy to accidentally trigger a page that returns 100,000 rows at the moment. I need to use the LIMIT clause on views and custom SQL - I can support pagination ""next"" links using offset as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 534507142,MDU6SXNzdWU1MzQ1MDcxNDI=,69,Feature request: enable extensions loading,30607,aborruso,closed,0,,,,,3,2019-12-08T08:06:25Z,2022-02-05T00:04:25Z,2020-10-16T18:42:49Z,NONE,,"Hi, it would be great to add a parameter that enables the load of a sqlite extension you need. Something like ""-ext modspatialite"". In this way your great tool would be even more comfortable and powerful. Thank you very much",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1071071397,I_kwDODFdgUs4_10Cl,69,View that combines issues and issue comments,9599,simonw,open,0,,,,,1,2021-12-04T00:34:33Z,2021-12-04T00:34:52Z,,MEMBER,,I want to see a reverse chronologically ordered interface onto both issues and comments - essentially a unified log of comments and issues opened across one or multiple projects.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1513238152,PR_kwDODEm0Qs5GUoMM,69,Archive: Import new tweets table name,26161409,sometimes-i-send-pull-requests,open,0,,,,,0,2022-12-28T23:32:44Z,2022-12-28T23:32:44Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/69,"Given the code here, it seems like in the past this file was named ""tweet.js"". In recent exports, it's named ""tweets.js"". The archive importer needs to be modified to take this into account. Existing logic is reused for importing this table. (However, the resulting table name will be different, matching the different file name -- archive_tweets, rather than archive_tweet).",206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273267081,MDU6SXNzdWUyNzMyNjcwODE=,70,Paginate views using OFFSET/LIMIT,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-12T21:30:29Z,2017-11-13T21:11:01Z,2017-11-13T21:11:01Z,OWNER,,"As with #69 these should obey a maximum offset setting, which can be over-ridden.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/70/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 539204432,MDU6SXNzdWU1MzkyMDQ0MzI=,70,Implement ON DELETE and ON UPDATE actions for foreign keys,26292069,LucasElArruda,open,0,,,,,2,2019-12-17T17:19:10Z,2020-02-27T04:18:53Z,,NONE,,"Hi! I did not find any mention on the library about ON DELETE and ON UPDATE actions for foreign keys. Are those expected to be implemented? If not, it would be a nice thing to include!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/70/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1149402080,PR_kwDODFdgUs4zaUta,70,scrape-dependents: enable paging through package menu option if present,36061055,stanbiryukov,open,0,,,,,0,2022-02-24T15:07:25Z,2022-02-24T15:07:25Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/70,Some repos organize network dependents by a Package toggle. This PR adds the ability to page through those options and scrape underlying dependents.,207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/70/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1513238314,PR_kwDODEm0Qs5GUoN6,70,Archive: Import Twitter Circle data,26161409,sometimes-i-send-pull-requests,open,0,,,,,0,2022-12-28T23:33:09Z,2022-12-28T23:33:09Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/70,,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/70/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273278840,MDU6SXNzdWUyNzMyNzg4NDA=,71,Set up some example datasets on a Cloudflare-backed domain,9599,simonw,closed,0,,,2857392,Ship first public release,10,2017-11-13T00:06:30Z,2017-11-13T02:09:34Z,2017-11-13T02:09:34Z,OWNER,,"To better demonstrate the caching and HTTP/2 features, I'd like to go live with some demos that are hosted behind Cloudflare. - [x] Redirect https://datasettes.com/ and https://www.datasettes.com/ to https://github.com/simonw/datasette - [x] Have `now domain add -e datasettes.com` run without errors (hopefully just a matter of waiting for the DNS to update) - [x] Alias an example dataset hosted on Now on a datasettes.com subdomain - [x] Confirm that HTTP caching and HTTP/2 redirect pushing works as expected - this may require another page rule",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/71/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 542814756,MDU6SXNzdWU1NDI4MTQ3NTY=,71,Tests are failing due to missing FTS5,9599,simonw,closed,0,,,,,3,2019-12-27T09:41:16Z,2019-12-27T09:49:37Z,2019-12-27T09:49:37Z,OWNER,,"https://travis-ci.com/simonw/sqlite-utils/jobs/268436167 This is a recent change: 2 months ago they worked fine. I'm not sure what changed here. Maybe something to do with https://launchpad.net/~jonathonf/+archive/ubuntu/backports ?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/71/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1177059481,I_kwDODFdgUs5GKICZ,71,Store commit parents,64686,carltongibson,closed,0,,,,,0,2022-03-22T17:06:48Z,2022-04-22T12:44:04Z,2022-04-22T12:44:04Z,NONE,,"Hi @simonw 👋 Currently, stored commit data doesn't quite give me the information I'm needing... Committer date and author date are not 100% reliable for dividing a commit history up by release or branch. A PR created before a release but merged after can have earlier dates… — this can be quite frustrating if you're trying to pin down commits for a release: _It should be there!_, but then isn't. (This gets worse using release branches.) Would you be open to adding the `sha` of a `parent` of a commit to the commit table? (As an FK? 🤔 — likely not feasible.) It's part of the [response body](https://docs.github.com/en/rest/reference/commits#get-a-commit): ``` ""parents"": [ { ""url"": ""https://api.github.com/repos/octocat/Hello-World/commits/6dcb09b5b57875f334f61aebed695e2e4193db5e"", ""sha"": ""6dcb09b5b57875f334f61aebed695e2e4193db5e"" } ], ``` I think this list should only have a single entry. (🤔 — not sure why it's a list then...) With this it would be possible to build/reconstruct a chain of commits from the history, that I don't **think** is available as yet (unless you know a better way). It is certainly possible to get sequential lists of commits out of git directly, so the same would be possible combining tools, but wondering if a single tool could do it. What do you think? Thanks! 🏅 ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/71/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1513238455,PR_kwDODEm0Qs5GUoPm,71,"Archive: Fix ""ni devices"" typo in importer",26161409,sometimes-i-send-pull-requests,open,0,,,,,0,2022-12-28T23:33:31Z,2022-12-28T23:33:31Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/71,,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/71/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273283166,MDU6SXNzdWUyNzMyODMxNjY=,72,publish command should take an optional --name argument,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-13T00:59:35Z,2017-11-13T02:12:27Z,2017-11-13T02:12:27Z,OWNER,,"To set the directory name so that now will inherit it as the name of the app. Defaults to datasette ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/72/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 543738004,MDExOlB1bGxSZXF1ZXN0MzU3OTkyNTg4,72,Fixed implementation of upsert,9599,simonw,closed,0,,,,,0,2019-12-30T05:08:05Z,2019-12-30T05:29:24Z,2019-12-30T05:29:24Z,OWNER,simonw/sqlite-utils/pulls/72,Refs #66,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/72/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1211283427,I_kwDODFdgUs5IMrfj,72,feature: display progress bar when downloading multi-page responses,9020979,hydrosquall,open,0,,,,,1,2022-04-21T16:37:12Z,2022-04-21T17:29:31Z,,NONE,,"## Motivation For a long running command (longer than 1 minute) for a big table (like pull requests or commits), it can be tricky to know if the script is still running, or if a rate limit/error was encountered We know how many pages there are, so it may be possible to indicate how many remain. https://github.com/dogsheep/github-to-sqlite/blob/a6e237f75a4b86963d91dcb5c9582e3a1b3349d6/github_to_sqlite/utils.py#L367 ## Resources - Using the existing Click API: - https://click.palletsprojects.com/en/5.x/utils/#showing-progress-bars - Loading spinner: https://github.com/pavdmyt/yaspin - Progress bar: https://github.com/tqdm/tqdm",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/72/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1524431805,I_kwDODEm0Qs5a3Pu9,72,"Import thread, including self- and others' replies",601708,mcint,open,0,,,,,0,2023-01-08T09:51:06Z,2023-01-08T09:51:06Z,,NONE,,"statuses-lookup, home-timeline, mentions (only for auth'ed user) don't cover this. `twitter-to-sqlite fetch-thread tw-group1.db 1234123412341234` twitter-to-sqlite focuses on archiving users, but does not easily support archiving conversations or community activity. For reference, this is [implemented in twarc](https://sourcegraph.com/github.com/DocNow/twarc/-/blob/twarc/client.py?L708-766&subtree=true), using a search, optionally recursively. Other research suggests that this formerly, or currently, requires a [search query](https://stackoverflow.com/a/30480103/1020467), use of [undocumented `related_results` api](https://stackoverflow.com/a/9419346/1020467), or with requested inclusion of [newer conversation_id](https://stackoverflow.com/a/68115718/1020467) with subsequent query. ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/72/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 273296178,MDU6SXNzdWUyNzMyOTYxNzg=,73,_nocache=1 query string option for use with sort-by-random,9599,simonw,closed,0,,,,,2,2017-11-13T02:57:10Z,2018-05-28T17:25:15Z,2018-05-28T17:25:15Z,OWNER,,The one place where we wouldn’t want cdching is if we have something which uses sort by random to return random items. We can offer a _nocache=1 querystring argument to support this.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/73/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 545407916,MDU6SXNzdWU1NDU0MDc5MTY=,73,upsert_all() throws issue when upserting to empty table,82988,psychemedia,closed,0,,,,,6,2020-01-05T11:58:57Z,2020-01-31T14:21:09Z,2020-01-05T17:20:18Z,NONE,,"If I try to add a list of `dict`s to an empty table using `upsert_all`, I get an error: ```python import sqlite3 from sqlite_utils import Database import pandas as pd conx = sqlite3.connect(':memory') cx = conx.cursor() cx.executescript('CREATE TABLE ""test"" (""Col1"" TEXT);') q=""SELECT * FROM test;"" pd.read_sql(q, conx) #shows empty table db = Database(conx) db['test'].upsert_all([{'Col1':'a'},{'Col1':'b'}]) --------------------------------------------------------------------------- TypeError Traceback (most recent call last) in 1 db = Database(conx) ----> 2 db['test'].upsert_all([{'Col1':'a'},{'Col1':'b'}]) /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in upsert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, alter, extracts) 1157 alter=alter, 1158 extracts=extracts, -> 1159 upsert=True, 1160 ) 1161 /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, alter, ignore, replace, extracts, upsert) 1040 sql = ""INSERT OR IGNORE INTO [{table}]({pks}) VALUES({pk_placeholders});"".format( 1041 table=self.name, -> 1042 pks="", "".join([""[{}]"".format(p) for p in pks]), 1043 pk_placeholders="", "".join([""?"" for p in pks]), 1044 ) TypeError: 'NoneType' object is not iterable ``` A hacky workaround in use is: ```python try: db['test'].upsert_all([{'Col1':'a'},{'Col1':'b'}]) except: db['test'].insert_all([{'Col1':'a'},{'Col1':'b'}]) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/73/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1261884917,PR_kwDODFdgUs45K1L3,73,Fixing 'NoneType' object has no attribute 'items',1224205,empjustine,closed,0,,,,,1,2022-06-06T13:58:11Z,2022-07-18T19:40:12Z,2022-07-18T19:40:12Z,CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/73,"Under some conditions, GitHub caches removed starred repositories and ends up leaving dangling `None` user references. Traceback (most recent call last): File ""/home/dogsheep/dogsheep/github-to-sqlite/bin/github-to-sqlite"", line 8, in sys.exit(cli()) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py"", line 1657, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/github_to_sqlite/cli.py"", line 181, in starred utils.save_stars(db, user, stars) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/github_to_sqlite/utils.py"", line 494, in save_stars repo_id = save_repo(db, repo) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/github_to_sqlite/utils.py"", line 308, in save_repo to_save[""owner""] = save_user(db, to_save[""owner""]) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/github_to_sqlite/utils.py"", line 229, in save_user for key, value in user.items() AttributeError: 'NoneType' object has no attribute 'items'",207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/73/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1816830546,I_kwDODEm0Qs5sSqJS,73,Twitter v1 API shutdown,6341745,david-perez,open,0,,,,,0,2023-07-22T16:57:41Z,2023-07-22T16:57:41Z,,NONE,,"I've been using this project reliably over the past two years to periodically download my liked tweets, but unfortunately since 19th July I get: ``` [2023-07-19 21:00:04.937536] File ""/home/pi/code/liked-tweets/lib/python3.7/site-packages/twitter_to_sqlite/utils.py"", line 202, in fetch_timeline [2023-07-19 21:00:04.937606] raise Exception(str(tweets[""errors""])) [2023-07-19 21:00:04.937678] Exception: [{'message': 'You currently have access to a subset of Twitter API v2 endpoints and limited v1.1 endpoints (e.g. media post, oauth) only. If you need access to this endpoint, you may need a different access level. You can learn more here: https://developer.twitter.com/en/portal/product', 'code': 453}] ``` It appears like Twitter has now shut down their v1 endpoints, which is rather gracious of them, considering they [announced they'd be deprecated on 29th April](https://twittercommunity.com/t/reminder-to-migrate-to-the-new-free-basic-or-enterprise-plans-of-the-twitter-api/189737). Unfortunately [retrieving likes using the v2 API](https://developer.twitter.com/en/docs/twitter-api/tweets/likes/introduction) is not part of their [free plan](https://developer.twitter.com/en/portal/products). In fact, with the free plan one can only post and delete tweets and retrieve information about oneself. So I'm afraid this is the end of this very nice project. It was very useful, thank you! ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/73/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 1}",, 273296684,MDU6SXNzdWUyNzMyOTY2ODQ=,74,Send a 302 redirect to the new hash for hits to old hashes,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-13T03:00:59Z,2017-11-13T18:49:59Z,2017-11-13T18:49:59Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/74/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 546073980,MDU6SXNzdWU1NDYwNzM5ODA=,74,Test failures on openSUSE 15.1: AssertionError: Explicit other_table and other_column,15092,jayvdb,open,0,,,,,3,2020-01-07T04:35:50Z,2020-01-12T07:21:17Z,,CONTRIBUTOR,,"openSUSE 15.1 is using python 3.6.5 and click-7.0 , however it has test failures while openSUSE Tumbleweed on py37 passes. Most fail on the cli exit code like ```py [ 74s] =================================== FAILURES =================================== [ 74s] _________________________________ test_tables __________________________________ [ 74s] [ 74s] db_path = '/tmp/pytest-of-abuild/pytest-0/test_tables0/test.db' [ 74s] [ 74s] def test_tables(db_path): [ 74s] result = CliRunner().invoke(cli.cli, [""tables"", db_path]) [ 74s] > assert '[{""table"": ""Gosh""},\n {""table"": ""Gosh2""}]' == result.output.strip() [ 74s] E assert '[{""table"": ""...e"": ""Gosh2""}]' == '' [ 74s] E - [{""table"": ""Gosh""}, [ 74s] E - {""table"": ""Gosh2""}] [ 74s] [ 74s] tests/test_cli.py:28: AssertionError ``` packaging project at https://build.opensuse.org/package/show/home:jayvdb:py-new/python-sqlite-utils I'll keep digging into this after I have github-to-sqlite working on Tumbleweed, as I'll need openSUSE Leap 15.1 working before I can submit this into the main python repo.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/74/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1308461063,I_kwDODFdgUs5N_YgH,74,500 error in github-to-sqlite demo,9599,simonw,closed,0,,,,,5,2022-07-18T19:39:32Z,2022-07-18T21:16:18Z,2022-07-18T21:14:22Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/issue_comments throws a 500: > `cannot import name 'etree' from 'markdown.util' (/usr/local/lib/python3.8/site-packages/markdown/util.py)` https://console.cloud.google.com/run/detail/us-central1/github-to-sqlite/metrics?project=datasette-222320 suggests this started happening 3 days ago.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/74/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273509159,MDU6SXNzdWUyNzM1MDkxNTk=,75,Add --cors argument to serve,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-13T17:16:19Z,2017-11-13T18:17:52Z,2017-11-13T18:17:52Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/75/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 546078359,MDExOlB1bGxSZXF1ZXN0MzU5ODIyNzcz,75,Explicitly include tests and docs in sdist,15092,jayvdb,closed,0,,,,,1,2020-01-07T04:53:20Z,2020-01-31T00:21:27Z,2020-01-31T00:21:27Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/75,Also exclude 'tests' from runtime installation.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/75/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1363244199,I_kwDODFdgUs5RQXSn,75,Fetch repos doesn't support organisations,2757699,OverkillGuy,open,0,,,,,0,2022-09-06T12:55:06Z,2022-09-06T12:55:06Z,,NONE,,"Say I want to get all my Github Org's repos info, for data analysis. Not just the public repos, but also the private/internal repos. The endpoints are different for organisation, and this tool doesn't take it into account: https://github.com/dogsheep/github-to-sqlite/blob/ace13ec3d98090d99bd71871c286a4a612c96a50/github_to_sqlite/utils.py#L453 https://github.com/dogsheep/github-to-sqlite/blob/ace13ec3d98090d99bd71871c286a4a612c96a50/github_to_sqlite/utils.py#L455 The endpoints for organisation repos is instead ([source](https://docs.github.com/en/rest/repos/repos#list-organization-repositories)): `url = ""https://api.github.com/orgs/{}/repos"".format(username)` Let's add support for organisations repo scraping.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/75/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 273510781,MDU6SXNzdWUyNzM1MTA3ODE=,76,publish should have required argument specifying publisher,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-13T17:21:26Z,2017-11-13T18:41:01Z,2017-11-13T18:41:01Z,OWNER,,Initially the only argument will be “now” - but “hyper” can be added in the future,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/76/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 549287310,MDU6SXNzdWU1NDkyODczMTA=,76,order_by mechanism,10501166,metab0t,closed,0,,,,,4,2020-01-14T02:06:03Z,2020-04-16T06:23:29Z,2020-04-16T03:13:06Z,NONE,,"In some cases, I want to iterate rows in a table with `ORDER BY` clause. It would be nice to have a `rows_order_by` function similar to `rows_where`. In a more general case, `rows_filter` function might be added to allow more customized filtering to iterate rows.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/76/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1363280254,PR_kwDODFdgUs4-cIa_,76,Add organization support to repos command,2757699,OverkillGuy,open,0,,,,,1,2022-09-06T13:21:42Z,2022-09-06T13:59:08Z,,FIRST_TIME_CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/76,"New --organization flag to signify all given ""usernames"" are private orgs. Adapts API URL to the organization path instead. Not the best implementation, but a first draft to talk around Fixes #75 (badly, no tests, overly vague, untested)",207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/76/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273537940,MDU6SXNzdWUyNzM1Mzc5NDA=,77,Add Travis CI badge to README,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-13T18:52:25Z,2017-11-13T21:24:15Z,2017-11-13T21:24:15Z,OWNER,,"Also fix this newline issue: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/77/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 557825032,MDU6SXNzdWU1NTc4MjUwMzI=,77,Ability to insert data that is transformed by a SQL function,9599,simonw,closed,0,,,,,2,2020-01-30T23:45:55Z,2022-02-05T00:04:25Z,2020-01-31T00:24:32Z,OWNER,,"I want to be able to run the equivalent of this SQL insert: ```python # Convert to ""Well Known Text"" format wkt = shape(geojson['geometry']).wkt # Insert and commit the record conn.execute(""INSERT INTO places (id, name, geom) VALUES(null, ?, GeomFromText(?, 4326))"", ( ""Wales"", wkt )) conn.commit() ``` From the Datasette SpatiaLite docs: https://datasette.readthedocs.io/en/stable/spatialite.html To do this, I need a way of telling `sqlite-utils` that a specific column should be wrapped in `GeomFromText(?, 4326)`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/77/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1410548368,I_kwDODFdgUs5UE0KQ,77,Feature: Support GitHub discussions,631242,frosencrantz,open,0,,,,,0,2022-10-16T16:53:38Z,2022-10-16T16:53:38Z,,CONTRIBUTOR,,"Hi @simonw I've been a happy user of this tool. Thank you for writing it and sharing it. I wanted to suggest a feature request to support Discussions. For example the VisiData project has discussions https://github.com/saulpw/visidata/discussions , and it would be useful if there was a way to pull that data into the database. However, I'm not offering a pull request.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/77/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 273554949,MDU6SXNzdWUyNzM1NTQ5NDk=,78,Rename after to next and provide a next_url,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-13T19:48:31Z,2017-11-13T20:35:03Z,2017-11-13T20:35:03Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/78/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 557830332,MDExOlB1bGxSZXF1ZXN0MzY5MzQ4MDg0,78,"New conversions= feature, refs #77",9599,simonw,closed,0,,,,,0,2020-01-31T00:02:33Z,2020-09-22T07:48:29Z,2020-01-31T00:24:31Z,OWNER,simonw/sqlite-utils/pulls/78,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/78/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1505411725,I_kwDODFdgUs5ZusKN,78,self-hosted or corp github enterprise,549431,ebdavison,open,0,,,,,0,2022-12-20T22:51:45Z,2022-12-20T22:51:45Z,,NONE,,"We use github enterprise at work and I would like to use this tool to pull info from that site rather than the public github.com instance. Is there an option for this? If not, can one be added for a custom repo URL?",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/78/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 273569068,MDU6SXNzdWUyNzM1NjkwNjg=,79,Add more detailed API documentation to the README,9599,simonw,closed,0,,,,,3,2017-11-13T20:36:21Z,2018-05-28T17:24:48Z,2018-05-28T17:24:48Z,OWNER,,"Need to document: - [ ] The ?column__gt=4 style filter arguments for tables - [ ] The ?sql= API, and how named parameters work - [ ] How API pagination works - [ ] How redirects and cache headers work",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/79/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 557842245,MDU6SXNzdWU1NTc4NDIyNDU=,79,Helper methods for working with SpatiaLite,9599,simonw,closed,0,,,,,8,2020-01-31T00:39:19Z,2022-02-05T00:04:25Z,2022-02-04T05:55:11Z,OWNER,,"As demonstrated by this piece of documentation, using SpatiaLite with sqlite-utils requires a fair bit of boilerplate: https://github.com/simonw/sqlite-utils/blob/f7289174e66ae4d91d57de94bbd9d09fabf7aff4/docs/python-api.rst#L880-L909",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/79/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1570375808,I_kwDODFdgUs5dmgiA,79,Deploy demo job is failing due to rate limit,9599,simonw,open,0,,,,,2,2023-02-03T20:05:01Z,2023-12-08T14:50:15Z,,MEMBER,,https://github.com/dogsheep/github-to-sqlite/actions/runs/4080058087/jobs/7032116511,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/79/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 273569477,MDU6SXNzdWUyNzM1Njk0Nzc=,80,Deploy final versions of fivethirtyeight and parlgov datasets (with view pagination),9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-13T20:37:46Z,2017-11-13T22:09:46Z,2017-11-13T22:09:46Z,OWNER,,Final versions should be deployed using the first released version of datasette.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/80/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 557892819,MDExOlB1bGxSZXF1ZXN0MzY5Mzk0MDQz,80,on_create mechanism for after table creation,9599,simonw,closed,0,,,,,5,2020-01-31T03:38:48Z,2020-01-31T05:08:04Z,2020-01-31T05:08:04Z,OWNER,simonw/sqlite-utils/pulls/80,"I need this for `geojson-to-sqlite`, in particular https://github.com/simonw/geojson-to-sqlite/issues/6",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/80/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273595473,MDExOlB1bGxSZXF1ZXN0MTUyMzYwNzQw,81,:fire: Removes DS_Store,50527,jefftriplett,closed,0,,,,,2,2017-11-13T22:07:52Z,2017-11-14T02:24:54Z,2017-11-13T22:16:55Z,CONTRIBUTOR,simonw/datasette/pulls/81,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/81/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 558600274,MDU6SXNzdWU1NTg2MDAyNzQ=,81,"Remove .detect_column_types() from table, make it a documented API",9599,simonw,closed,0,,,,,4,2020-02-01T21:25:54Z,2020-02-01T21:55:35Z,2020-02-01T21:55:35Z,OWNER,,"I used it in `geojson-to-sqlite` here: https://github.com/simonw/geojson-to-sqlite/blob/f10e44264712dd59ae7dfa2e6fd5a904b682fb33/geojson_to_sqlite/utils.py#L45-L50 It would make more sense for this method to live on the Database rather than the Table - or even to exist as a separate utility method entirely. Then it should be documented.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/81/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273596159,MDU6SXNzdWUyNzM1OTYxNTk=,82,Post a blog entry announcing it to the world,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-13T22:10:35Z,2017-11-14T01:46:10Z,2017-11-14T01:46:10Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/82/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 559197745,MDU6SXNzdWU1NTkxOTc3NDU=,82,Tutorial command no longer works,10350886,petey284,closed,0,,,,,3,2020-02-03T16:36:11Z,2020-02-27T04:16:43Z,2020-02-27T04:16:30Z,NONE,,"Issue with command on [tutorial](https://simonwillison.net/2019/Feb/25/sqlite-utils/) on Simon's site. The following command no longer works, and breaks with the previous too many variables error: #50 ``` cmd > curl ""https://data.nasa.gov/resource/y77d-th95.json"" | \ sqlite-utils insert meteorites.db meteorites - --pk=id ``` Output: ``` cmd Traceback (most recent call last): File ""continuum\miniconda3\envs\main\lib\runpy.py"", line 193, in _run_module_as_main ""__main__"", mod_spec) File ""continuum\miniconda3\envs\main\lib\runpy.py"", line 85, in _run_code exec(code, run_globals) File ""Continuum\miniconda3\envs\main\Scripts\sqlite-utils.exe\__main__.py"", line 9, in File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 717, in main rv = self.invoke(ctx) File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 555, in invoke return callback(*args, **kwargs) File ""continuum\miniconda3\envs\main\lib\site-packages\sqlite_utils\cli.py"", line 434, in insert default=default, File ""continuum\miniconda3\envs\main\lib\site-packages\sqlite_utils\cli.py"", line 384, in insert_upsert_implementation docs, pk=pk, batch_size=batch_size, alter=alter, **extra_kwargs File ""continuum\miniconda3\envs\main\lib\site-packages\sqlite_utils\db.py"", line 1081, in insert_all result = self.db.conn.execute(query, params) sqlite3.OperationalError: too many SQL variables ``` My thought is that maybe the dataset grew over the last few years and so didn't run into this issue before. No error when I reduce the count of entries to 83. Once the number of entries hits 84 the command fails. // This passes ``` cmd type meteorite_83.txt | sqlite-utils insert meteorites.db meteorites - --pk=id ``` // But this fails ``` cmd type meteorite_84.txt | sqlite-utils insert meteorites.db meteorites - --pk=id ``` A potential fix might be to chunk the incoming data? I can work on a PR if pointed in right direction. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/82/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273626815,MDU6SXNzdWUyNzM2MjY4MTU=,83,Individual row view is broken,9599,simonw,closed,0,,,,,0,2017-11-14T00:29:11Z,2017-11-14T00:45:34Z,2017-11-14T00:45:34Z,OWNER,,"https://parlgov.datasettes.com/parlgov-25f9855/viewcalc_parliament_composition/18 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/83/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 559374410,MDU6SXNzdWU1NTkzNzQ0MTA=,83,"Make db[""table""].exists a documented API",9599,simonw,closed,0,,,,,1,2020-02-03T22:31:44Z,2020-02-08T23:58:35Z,2020-02-08T23:56:23Z,OWNER,,Right now it's a static thing which might get out-of-sync with the database. It should probably be a live check. Maybe call it `.exists()` instead?,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/83/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273660425,MDU6SXNzdWUyNzM2NjA0MjU=,84,datasette package --metadata does not work with a relative path,9599,simonw,closed,0,,,,,0,2017-11-14T04:00:50Z,2017-11-15T05:18:35Z,2017-11-15T05:18:35Z,OWNER,," $ datasette package ~/parlgov-db/parlgov.db --metadata=~/parlgov-db/parlgov.json Usage: datasette package [OPTIONS] FILES... Error: Invalid value for ""-m"" / ""--metadata"": Could not open file: ~/parlgov-db/parlgov.json: No such file or directory simonw-07542:~ simonw$ cd ~/parlgov-db/ simonw-07542:parlgov-db simonw$ datasette package ~/parlgov-db/parlgov.db --metadata=parlgov.json Sending build context to Docker daemon 4.46MB Step 1/7 : FROM python:3 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/84/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 561460274,MDU6SXNzdWU1NjE0NjAyNzQ=,84,.upsert() with hash_id throws error,9599,simonw,closed,0,,,,,0,2020-02-07T07:08:19Z,2020-02-07T07:17:11Z,2020-02-07T07:17:11Z,OWNER,,"```python db[table_name].upsert_all(rows, hash_id=""pk"") ``` This throws an error: `PrimaryKeyRequired('upsert() requires a pk')` The problem is, if you try this: ```python db[table_name].upsert_all(rows, hash_id=""pk"", pk=""pk"") ``` You get this error: `AssertionError('Use either pk= or hash_id=')` `hash_id=` should imply that `pk=` that column.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/84/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273678673,MDU6SXNzdWUyNzM2Nzg2NzM=,85,Detect foreign keys and use them to link HTML pages together,9599,simonw,closed,0,,,2919870,Foreign key edition,6,2017-11-14T06:12:05Z,2017-11-19T06:08:19Z,2017-11-19T06:08:19Z,OWNER,,"https://stackoverflow.com/a/44430157/6083 documents the PRAGMA needed to extract foreign key references for a table. At a minimum we can link column values known to be foreign keys to the corresponding row page. We could try to summarize the linked row in some way too - somehow extracting a sensible link title, maybe based on additional configuration in the metadata.json file. Still todo: - [x] Fix it to csvs-to-sqlite refactoring command correctly creates primary key on generated tables - [x] Ship new csvs-to-sqlite with refactoring command - [x] Refactor column logic to be more predictable in our templates (the rowid special case) - [x] Mechanism by which table metadata can specify the ""label"" column for a table - [x] Automatically set the label column as the first column that isn't a primary key (falling back on primary key) - [x] Code which runs a ""select id, label from table where id in (...)"" query as part of the tableview and populates a lookup dictionary - [x] Modify templates to use values from that lookup dictionary",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/85/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 562911863,MDU6SXNzdWU1NjI5MTE4NjM=,85,Create index doesn't work for columns containing spaces,9599,simonw,closed,0,,,,,1,2020-02-11T00:34:46Z,2020-02-11T05:13:20Z,2020-02-11T05:13:20Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/85/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273703829,MDU6SXNzdWUyNzM3MDM4Mjk=,86,Filter UI on table page,9599,simonw,closed,0,,,2919870,Foreign key edition,10,2017-11-14T08:22:43Z,2017-11-23T20:34:32Z,2017-11-23T20:34:32Z,OWNER,,A UI for building up simple table queries by adding additional filter rules that get executed as query parameters in the URL.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/86/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 564579430,MDU6SXNzdWU1NjQ1Nzk0MzA=,86,Problem with square bracket in CSV column name,8149512,foscoj,closed,0,,,,,7,2020-02-13T10:19:57Z,2020-02-27T04:16:08Z,2020-02-27T04:16:07Z,NONE,,"testing some data from european power information (entsoe.eu), the title of the csv contains square brackets. as I am playing with glitch, sqlite-utils are used for creating the db. Traceback (most recent call last): File ""/app/.local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/app/.local/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/app/.local/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/app/.local/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/app/.local/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/app/.local/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/app/.local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 434, in insert default=default, File ""/app/.local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 384, in insert_upsert_implementation docs, pk=pk, batch_size=batch_size, alter=alter, **extra_kwargs File ""/app/.local/lib/python3.7/site-packages/sqlite_utils/db.py"", line 997, in insert_all extracts=extracts, File ""/app/.local/lib/python3.7/site-packages/sqlite_utils/db.py"", line 618, in create extracts=extracts, File ""/app/.local/lib/python3.7/site-packages/sqlite_utils/db.py"", line 310, in create_table self.conn.execute(sql) sqlite3.OperationalError: unrecognized token: ""]"" entsoe_2016.csv renamed to txt for uploading compatibility [entsoe_2016.txt](https://github.com/simonw/sqlite-utils/files/4197688/entsoe_2016.txt) code is remixed directly from your https://glitch.com/edit/#!/datasette-csvs repo ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/86/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273709194,MDU6SXNzdWUyNzM3MDkxOTQ=,87,Configure Travis to release new tags to PyPI,9599,simonw,closed,0,,,,,1,2017-11-14T08:44:08Z,2018-07-10T17:49:13Z,2018-07-10T17:49:12Z,OWNER,,https://docs.travis-ci.com/user/deployment/pypi/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/87/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 565837965,MDU6SXNzdWU1NjU4Mzc5NjU=,87,Should detect collections.OrderedDict as a regular dictionary,9599,simonw,closed,0,,,,,2,2020-02-16T02:06:34Z,2020-02-16T02:20:59Z,2020-02-16T02:20:59Z,OWNER,,"``` File ""...python3.7/site-packages/sqlite_utils/db.py"", line 292, in create_table column_type=COLUMN_TYPE_MAPPING[column_type], KeyError: ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/87/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273775212,MDU6SXNzdWUyNzM3NzUyMTI=,88,Add NHS England Hospitals example to wiki,15543,tomdyson,closed,0,,,,,4,2017-11-14T12:29:10Z,2021-03-22T23:46:36Z,2017-11-14T22:54:06Z,CONTRIBUTOR,,"https://nhs-england-hospitals.now.sh and an associated map visualisation: http://run.plnkr.co/preview/cj9zlf1qc0003414y90ajkwpk/ Datasette is wonderful! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/88/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 571805300,MDU6SXNzdWU1NzE4MDUzMDA=,88,"table.disable_fts() method and ""sqlite-utils disable-fts ..."" command",9599,simonw,closed,0,,,,,5,2020-02-27T04:00:50Z,2020-02-27T04:40:44Z,2020-02-27T04:40:44Z,OWNER,,This would make it easier to iterate on the FTS configuration for a database without having to wipe and recreate the database each time.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/88/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273816720,MDExOlB1bGxSZXF1ZXN0MTUyNTIyNzYy,89,SQL syntax highlighting with CodeMirror,15543,tomdyson,closed,0,,,,,1,2017-11-14T14:43:33Z,2017-11-15T02:03:01Z,2017-11-15T02:03:01Z,CONTRIBUTOR,simonw/datasette/pulls/89,"Addresses #13 Future enhancements could include autocompletion of table and column names, e.g. with ```javascript extraKeys: {""Ctrl-Space"": ""autocomplete""}, hintOptions: {tables: { users: [""name"", ""score"", ""birthDate""], countries: [""name"", ""population"", ""size""] }} ``` (see https://codemirror.net/doc/manual.html#addon_sql-hint and source at http://codemirror.net/mode/sql/)",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/89/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 573578548,MDU6SXNzdWU1NzM1Nzg1NDg=,89,Ability to customize columns used by extracts= feature,9599,simonw,open,0,,,,,3,2020-03-01T16:54:48Z,2020-10-16T19:17:50Z,,OWNER,,"@simonw any thoughts on allow extracts to specify the lookup column name? If I'm understanding the documentation right, `.lookup()` allows you to define the ""value"" column (the documentation uses name), but when you use `extracts` keyword as part of `.insert()`, `.upsert()` etc. the lookup must be done against a column named ""value"". I have an existing lookup table that I've populated with columns ""id"" and ""name"" as opposed to ""id"" and ""value"", and seems I can't use `extracts=`, unless I'm missing something... Initial thought on how to do this would be to allow the dictionary value to be a tuple of table name column pair... so: ``` table = db.table(""trees"", extracts={""species_id"": (""Species"", ""name""}) ``` I haven't dug too much into the existing code yet, but does this make sense? Worth doing? _Originally posted by @chrishas35 in https://github.com/simonw/sqlite-utils/issues/46#issuecomment-592999503_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/89/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 273846123,MDU6SXNzdWUyNzM4NDYxMjM=,90,datasette publish heroku,9599,simonw,closed,0,,,,,8,2017-11-14T16:01:39Z,2017-12-10T03:06:34Z,2017-12-10T03:05:48Z,OWNER,,"Heroku has Docker container support so this should not be too hard: https://devcenter.heroku.com/articles/container-registry-and-runtime See also #59 This should work exactly like the existing “datasette publish now....” command except it would be “datasette publish heroku...”",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/90/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 573740712,MDU6SXNzdWU1NzM3NDA3MTI=,90,Cannot .enable_fts() for columns with spaces in their names,9599,simonw,closed,0,,,,,0,2020-03-02T06:06:03Z,2020-03-02T06:10:49Z,2020-03-02T06:10:49Z,OWNER,,"``` import sqlite_utils db = sqlite_utils.Database(memory=True) db[""test""].insert({""space in name"": ""hello""}) db[""test""].enable_fts([""space in name""]) --------------------------------------------------------------------------- OperationalError Traceback (most recent call last) in ----> 1 db['test'].enable_fts([""space in name""]) /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in enable_fts(self, columns, fts_version, create_triggers) 755 ) 756 self.db.conn.executescript(sql) --> 757 self.populate_fts(columns) 758 759 if create_triggers: /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in populate_fts(self, columns) 787 table=self.name, columns="", "".join(columns) 788 ) --> 789 self.db.conn.executescript(sql) 790 return self 791 OperationalError: near ""in"": syntax error ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/90/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273878873,MDU6SXNzdWUyNzM4Nzg4NzM=,91,"Option to serve databases from a different prefix, serve regular content elsewhere",9599,simonw,closed,0,,,,,1,2017-11-14T17:32:46Z,2017-12-10T03:07:58Z,2017-12-10T03:07:53Z,OWNER,,"It would be useful if the databases themselves could be served from a prefix e.g. datasette serve mydb.db --path-prefix=db Now my database is at `http://localhost:8001/db/mydb-23423` This would free up the rest of the URL namespace for other things. Maybe we could have an option to serve static content from a known folder e.g. datasette serve mydb.db --path-prefix=db --root-content=~/my-project/static Now a hit to `http://localhost:8001/news/` serves content from `~/my-project/static/news/index.html` This would make it trivial to package up entire HTML/CSS/JS apps with one or more underlying SQLite databases. Running without `--cors` would be fine here because any JS apps would be hosted on the same origin.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/91/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 577302229,MDU6SXNzdWU1NzczMDIyMjk=,91,Enable ordering FTS results by rank,416374,gfrmin,closed,0,,,6079500,3.0,1,2020-03-07T08:43:51Z,2020-11-06T23:53:26Z,2020-11-06T23:53:25Z,NONE,,According to https://www.sqlite.org/fts5.html (not sure about FTS4) results can be sorted by relevance. At the moment results are returned by default by `rowid`. Perhaps a flag can be added to the `search` method?,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/91/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273895344,MDU6SXNzdWUyNzM4OTUzNDQ=,92,Add --license --license_url --source --source_url --title arguments to datasette publish,9599,simonw,closed,0,,,,,0,2017-11-14T18:27:07Z,2017-11-15T05:04:41Z,2017-11-15T05:04:41Z,OWNER,,"I keep on using the `echo '{""source"": ""...""}' | datasette publish now --metadata=-` pattern, which suggests it makes sense for us to support these as optional arguments. https://gist.github.com/simonw/9f8bf23b37a42d7628c4dcc4bba10253",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/92/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 581339961,MDU6SXNzdWU1ODEzMzk5NjE=,92,.columns_dict doesn't work for all possible column types,9599,simonw,closed,0,,,,,7,2020-03-14T19:30:35Z,2020-03-15T18:37:43Z,2020-03-14T20:04:14Z,OWNER,,"Got this error: ``` File "".../python3.7/site-packages/sqlite_utils/db.py"", line 462, in for column in self.columns KeyError: 'REAL' ``` `.columns_dict` uses `REVERSE_COLUMN_TYPE_MAPPING`: https://github.com/simonw/sqlite-utils/blob/43f1c6ab4e3a6b76531fb6f5447adb83d26f3971/sqlite_utils/db.py#L457-L463 `REVERSE_COLUMN_TYPE_MAPPING` defines `FLOAT` not `REAL`A https://github.com/simonw/sqlite-utils/blob/43f1c6ab4e3a6b76531fb6f5447adb83d26f3971/sqlite_utils/db.py#L68-L74",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/92/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273944952,MDU6SXNzdWUyNzM5NDQ5NTI=,93,Package as standalone binary,67420,atomotic,closed,0,,,,,18,2017-11-14T21:14:07Z,2021-11-21T07:00:23Z,2021-11-21T07:00:23Z,NONE,,"hint: more than the docker image a standalone and multiplatform binary (containing the app and the database) could be simpler to distribute. i would like to investigate the possibility to package everything with [pyinstaller](http://www.pyinstaller.org/) adding the database as a [data file](https://pythonhosted.org/PyInstaller/spec-files.html#adding-data-files)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/93/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 581795570,MDU6SXNzdWU1ODE3OTU1NzA=,93,Support more string values for types in .add_column(),9599,simonw,open,0,,,,,0,2020-03-15T19:32:49Z,2020-09-24T20:36:46Z,,OWNER,,"https://sqlite-utils.readthedocs.io/en/2.4.2/python-api.html#adding-columns says: > SQLite types you can specify are ""TEXT"", ""INTEGER"", ""FLOAT"" or ""BLOB"". As discovered in #92 this isn't the right list of values. I should expand this to match https://www.sqlite.org/datatype3.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/93/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 273961179,MDExOlB1bGxSZXF1ZXN0MTUyNjMxNTcw,94,Initial add simple prod ready Dockerfile refs #57,247192,macropin,closed,0,,,,,1,2017-11-14T22:09:09Z,2017-11-15T03:08:04Z,2017-11-15T03:08:04Z,CONTRIBUTOR,simonw/datasette/pulls/94,"Multi-stage build based off official python:3.6-slim Example usage: ``` docker run --rm -t -i -p 9000:8001 -v $(pwd)/db:/db datasette datasette serve /db/chinook.db ```",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/94/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 586477757,MDU6SXNzdWU1ODY0Nzc3NTc=,94,"If column data is a mixture of integers and nulls, detected type should be INTEGER",9599,simonw,closed,0,,,,,0,2020-03-23T19:51:46Z,2020-03-23T19:57:10Z,2020-03-23T19:57:10Z,OWNER,,It looks like detected type for that case is TEXT at the moment.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/94/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273998513,MDU6SXNzdWUyNzM5OTg1MTM=,95,Allow shorter time limits to be set using a ?_sql_time_limit_ms =20 query string limit,9599,simonw,closed,0,,,,,1,2017-11-15T01:02:16Z,2017-11-15T02:56:13Z,2017-11-15T02:56:13Z,OWNER,,This cannot be greater than the configured time limit.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/95/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586486367,MDU6SXNzdWU1ODY0ODYzNjc=,95,Columns with only null values are no longer created in the database,9599,simonw,closed,0,,,,,0,2020-03-23T20:07:42Z,2020-03-23T20:31:15Z,2020-03-23T20:31:15Z,OWNER,,"Bug introduced in #94, and released in `2.4.3`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/95/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274001453,MDU6SXNzdWUyNzQwMDE0NTM=,96,UI for editing named parameters,9599,simonw,closed,0,,,,,3,2017-11-15T01:19:21Z,2017-11-16T01:45:51Z,2017-11-16T01:33:38Z,OWNER,,"On any page displaying a custom query that includes named parameters, we should show HTML form fields for editing those parameters. Eg the breed parameter on https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+name%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Animal+name%22%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalName%22%29+as+name+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+AnimalBreed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5BMitcham-dog-registrations-2015%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_NAME%22%29+as+name+from+%5Bburnside-dog-registrations-2015%5D+where+DOG_BREED+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Animal_Name%22%29+as+name+from+%5Bcity-of-playford-2015-dog-registration%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where%22Breed+Description%22+like+%3Abreed%0D%0A%0D%0A%29+group+by+name+order+by+n+desc%3B&breed=pug",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/96/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 589801352,MDExOlB1bGxSZXF1ZXN0Mzk1MjU4Njg3,96,Add type conversion for Panda's Timestamp,32605365,b0b5h4rp13,closed,0,,,,,2,2020-03-29T14:13:09Z,2020-03-31T04:40:49Z,2020-03-31T04:40:48Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/96,"Add type conversion for Panda's Timestamp, if Panda library is present in system (thanks for this project, I was about to do the same thing from scratch)",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/96/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274022950,MDU6SXNzdWUyNzQwMjI5NTA=,97,Link to JSON for the list of tables ,9599,simonw,closed,0,,,,,3,2017-11-15T03:29:05Z,2018-05-29T18:51:35Z,2018-05-28T20:57:21Z,OWNER,,https://twitter.com/yschimke/status/930606210855854080,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/97/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 593751293,MDU6SXNzdWU1OTM3NTEyOTM=,97,"Adding a ""recreate"" flag to the `Database` constructor",1448859,betatim,closed,0,,,,,4,2020-04-04T05:41:10Z,2020-04-15T14:29:31Z,2020-04-13T03:52:29Z,NONE,,"I have a [script](https://github.com/betatim/binder-datasette/blob/master/create-db.ipynb) that imports data into a sqlite DB. When I re-run that script I'd like to remove the existing sqlite DB, instead of adding to it. The pragmatic answer is to add the check and file deletion to my script. However I thought it would be easy and useful for others to add a `recreate=True` flag to `db = sqlite_utils.Database(""binder-launches.db"")`. After taking a look at the code for it I am not so sure any more. This is because the connection string could be a URL (or ""connection string"") like `""file:///tmp/foo.db""`. I don't know what the equivalent of `os.path.exists()` is for a connection string or how to detect that something is a connection string and raise an error ""can't use recreate=True and conn_string at the same time"". Does anyone have an idea/suggestion where to start investigating?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/97/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274023417,MDU6SXNzdWUyNzQwMjM0MTc=,98,Default to 127.0.0.1 not 0.0.0.0,9599,simonw,closed,0,,,,,0,2017-11-15T03:31:55Z,2017-11-15T05:08:54Z,2017-11-15T05:08:54Z,OWNER,,https://twitter.com/yschimke/status/930606210855854080,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/98/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 597671518,MDU6SXNzdWU1OTc2NzE1MTg=,98,"Only set .last_rowid and .last_pk for single update/inserts, not for .insert_all()/.upsert_all() with multiple records",9599,simonw,closed,0,,,,,7,2020-04-10T03:19:40Z,2021-09-28T04:38:44Z,2020-04-13T03:29:15Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/98/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274023625,MDU6SXNzdWUyNzQwMjM2MjU=,99,Start a change log,9599,simonw,closed,0,,,,,0,2017-11-15T03:33:21Z,2017-11-16T15:12:46Z,2017-11-16T15:12:45Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/99/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 598640234,MDU6SXNzdWU1OTg2NDAyMzQ=,99,.upsert_all() should maybe error if dictionaries passed to it do not have the same keys,9599,simonw,closed,0,,,,,2,2020-04-13T03:02:25Z,2020-04-13T03:05:20Z,2020-04-13T03:05:04Z,OWNER,,"While investigating #98 I stumbled across this: ``` def test_upsert_compound_primary_key(fresh_db): table = fresh_db[""table""] table.upsert_all( [ {""species"": ""dog"", ""id"": 1, ""name"": ""Cleo"", ""age"": 4}, {""species"": ""cat"", ""id"": 1, ""name"": ""Catbag""}, ], pk=(""species"", ""id""), ) table.upsert_all( [ {""species"": ""dog"", ""id"": 1, ""age"": 5}, {""species"": ""dog"", ""id"": 2, ""name"": ""New Dog"", ""age"": 1}, ], pk=(""species"", ""id""), ) > assert [ {""species"": ""dog"", ""id"": 1, ""name"": ""Cleo"", ""age"": 5}, {""species"": ""cat"", ""id"": 1, ""name"": ""Catbag"", ""age"": None}, {""species"": ""dog"", ""id"": 2, ""name"": ""New Dog"", ""age"": 1}, ] == list(table.rows) E AssertionError: assert [{'age': 5, '...cies': 'dog'}] == [{'age': 5, '...cies': 'dog'}] E At index 0 diff: {'species': 'dog', 'id': 1, 'name': 'Cleo', 'age': 5} != {'species': 'dog', 'id': 1, 'name': None, 'age': 5} E Full diff: E - [{'age': 5, 'id': 1, 'name': 'Cleo', 'species': 'dog'}, E ? ^^^ -- E + [{'age': 5, 'id': 1, 'name': None, 'species': 'dog'}, E ? ^^^ E {'age': None, 'id': 1, 'name': 'Catbag', 'species': 'cat'}, E {'age': 1, 'id': 2, 'name': 'New Dog', 'species': 'dog'}] ``` If you run `.upsert_all()` with multiple dictionaries it doesn't quite have the effect you might expect.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/99/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274160723,MDU6SXNzdWUyNzQxNjA3MjM=,100,TemplateAssertionError: no filter named 'tojson',13304454,coisnepe,closed,0,,,,,2,2017-11-15T13:43:41Z,2017-11-16T09:25:10Z,2017-11-16T00:14:13Z,NONE,,"A 500 error is raised upon clicking on the name of a table on the homepage, say _http://0.0.0.0:8001/_ to _http://0.0.0.0:8001/test_check-c1f4771/users_ The API part seems to function as intended, though... ``` 2017-11-15 14:33:57 - (sanic)[ERROR]: Traceback (most recent call last): File ""/usr/local/lib/python3.5/dist-packages/sanic/app.py"", line 503, in handle_request response = await response File ""/usr/local/lib/python3.5/dist-packages/datasette/app.py"", line 155, in get return await self.view_get(request, name, hash, **kwargs) File ""/usr/local/lib/python3.5/dist-packages/datasette/app.py"", line 219, in view_get **context, File ""/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py"", line 84, in render return html(self.render_string(template, request, **context)) File ""/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py"", line 81, in render_string return self.env.get_template(template).render(**context) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 812, in get_template return self._load_template(name, self.make_globals(globals)) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 786, in _load_template template = self.loader.load(self, name, globals) File ""/usr/lib/python3/dist-packages/jinja2/loaders.py"", line 125, in load code = environment.compile(source, name, filename) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 565, in compile self.handle_exception(exc_info, source_hint=source_hint) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 754, in handle_exception reraise(exc_type, exc_value, tb) File ""/usr/lib/python3/dist-packages/jinja2/_compat.py"", line 37, in reraise raise value.with_traceback(tb) File ""/usr/local/lib/python3.5/dist-packages/datasette/templates/table.html"", line 29, in template
    params = {{ query.params|tojson(4) }}
    File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 515, in _generate return generate(source, self, name, filename, defer_init=defer_init) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 62, in generate generator.visit(node) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 849, in visit_Template self.blockvisit(block.body, block_frame) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1172, in visit_If self.blockvisit(node.body, if_frame) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1353, in visit_Output self.visit(argument, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1565, in visit_Filter self.fail('no filter named %r' % node.name, node.lineno) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 427, in fail raise TemplateAssertionError(msg, lineno, self.name, self.filename) jinja2.exceptions.TemplateAssertionError: no filter named 'tojson' 2017-11-15 14:33:57 - (network)[INFO][127.0.0.1:41316]: GET http://0.0.0.0:8001/test_check-c1f4771/users 500 144 2017-11-15 14:33:57 - (network)[INFO][127.0.0.1:41316]: GET http://0.0.0.0:8001/favicon.ico 200 0 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/100/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601358649,MDU6SXNzdWU2MDEzNTg2NDk=,100,"Mechanism for forcing column-type, over-riding auto-detection",9599,simonw,closed,0,,,,,3,2020-04-16T19:12:52Z,2020-04-17T23:53:32Z,2020-04-17T23:53:32Z,OWNER,,"As seen in https://github.com/dogsheep/github-to-sqlite/issues/27#issuecomment-614843406 - there's a problem where you insert a record with a `None` value for a column and that column is created as `TEXT` - but actually you intended it to be an `INT` (as later examples will demonstrate). Some kind of mechanism for over-riding the detected types of columns would be useful here.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/100/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274161964,MDU6SXNzdWUyNzQxNjE5NjQ=,101,TemplateAssertionError: no filter named 'tojson',450244,eaubin,closed,0,,,,,1,2017-11-15T13:47:32Z,2017-11-15T13:48:55Z,2017-11-15T13:48:55Z,NONE,,"I get an exception clicking on the table link: ``` 2017-11-15 08:40:10 - (sanic)[ERROR]: Traceback (most recent call last): File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic/app.py"", line 503, in handle_request response = await response File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/app.py"", line 155, in get return await self.view_get(request, name, hash, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/app.py"", line 219, in view_get **context, File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic_jinja2/__init__.py"", line 84, in render return html(self.render_string(template, request, **context)) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic_jinja2/__init__.py"", line 81, in render_string return self.env.get_template(template).render(**context) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 812, in get_template return self._load_template(name, self.make_globals(globals)) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 786, in _load_template template = self.loader.load(self, name, globals) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/loaders.py"", line 125, in load code = environment.compile(source, name, filename) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 565, in compile self.handle_exception(exc_info, source_hint=source_hint) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 754, in handle_exception reraise(exc_type, exc_value, tb) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/_compat.py"", line 37, in reraise raise value.with_traceback(tb) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/templates/table.html"", line 29, in template
    params = {{ query.params|tojson(4) }}
    File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 515, in _generate return generate(source, self, name, filename, defer_init=defer_init) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 62, in generate generator.visit(node) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 849, in visit_Template self.blockvisit(block.body, block_frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 1172, in visit_If self.blockvisit(node.body, if_frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 1353, in visit_Output self.visit(argument, frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 1565, in visit_Filter self.fail('no filter named %r' % node.name, node.lineno) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 427, in fail raise TemplateAssertionError(msg, lineno, self.name, self.filename) jinja2.exceptions.TemplateAssertionError: no filter named 'tojson' ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/101/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601392318,MDU6SXNzdWU2MDEzOTIzMTg=,101,README should include an example of CLI data insertion,9599,simonw,closed,0,,,,,0,2020-04-16T19:45:37Z,2020-04-17T23:59:49Z,2020-04-17T23:59:49Z,OWNER,,Maybe using `curl` from the GitHub API.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/101/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274264175,MDU6SXNzdWUyNzQyNjQxNzU=,102,datasette publish elasticbeanstalk,9599,simonw,closed,0,,,,,1,2017-11-15T18:48:31Z,2021-01-04T20:13:20Z,2021-01-04T20:13:19Z,OWNER,,"It looks like Elastic Beanstalk is the most convenient way to deploy a docker container to AWS without first deploying a cluster. https://aws.amazon.com/blogs/devops/dockerizing-a-python-web-app/ looks helpful. We would need to automate the deployment with Boto: http://boto3.readthedocs.io/en/latest/reference/services/elasticbeanstalk.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/102/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602569315,MDU6SXNzdWU2MDI1NjkzMTU=,102,Can't store an array or dictionary containing a bytes value,9599,simonw,closed,0,,,,,0,2020-04-18T22:49:21Z,2020-05-01T20:45:45Z,2020-05-01T20:45:45Z,OWNER,,"``` In [1]: import sqlite_utils In [2]: db = sqlite_utils.Database(memory=True) In [3]: db[""t""].insert({""id"": 1, ""data"": {""foo"": b""bytes""}}) --------------------------------------------------------------------------- TypeError Traceback (most recent call last) in ----> 1 db[""t""].insert({""id"": 1, ""data"": {""foo"": b""bytes""}}) ~/Dropbox/Development/sqlite-utils/sqlite_utils/db.py in insert(self, record, pk, foreign_keys, column_order, not_null, defaults, hash_id, alter, ignore, replace, extracts, conversions, columns) 950 extracts=extracts, 951 conversions=conversions, --> 952 columns=columns, 953 ) 954 ~/Dropbox/Development/sqlite-utils/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, alter, ignore, replace, extracts, conversions, columns, upsert) 1052 for key in all_columns: 1053 value = jsonify_if_needed( -> 1054 record.get(key, None if key != hash_id else _hash(record)) 1055 ) 1056 if key in extracts: ~/Dropbox/Development/sqlite-utils/sqlite_utils/db.py in jsonify_if_needed(value) 1318 def jsonify_if_needed(value): 1319 if isinstance(value, (dict, list, tuple)): -> 1320 return json.dumps(value) 1321 elif isinstance(value, (datetime.time, datetime.date, datetime.datetime)): 1322 return value.isoformat() /usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/__init__.py in dumps(obj, skipkeys, ensure_ascii, check_circular, allow_nan, cls, indent, separators, default, sort_keys, **kw) 229 cls is None and indent is None and separators is None and 230 default is None and not sort_keys and not kw): --> 231 return _default_encoder.encode(obj) 232 if cls is None: 233 cls = JSONEncoder /usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/encoder.py in encode(self, o) 197 # exceptions aren't as detailed. The list call should be roughly 198 # equivalent to the PySequence_Fast that ''.join() would do. --> 199 chunks = self.iterencode(o, _one_shot=True) 200 if not isinstance(chunks, (list, tuple)): 201 chunks = list(chunks) /usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/encoder.py in iterencode(self, o, _one_shot) 255 self.key_separator, self.item_separator, self.sort_keys, 256 self.skipkeys, _one_shot) --> 257 return _iterencode(o, 0) 258 259 def _make_iterencode(markers, _default, _encoder, _indent, _floatstr, /usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/encoder.py in default(self, o) 177 178 """""" --> 179 raise TypeError(f'Object of type {o.__class__.__name__} ' 180 f'is not JSON serializable') 181 TypeError: Object of type bytes is not JSON serializable ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/102/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274265878,MDU6SXNzdWUyNzQyNjU4Nzg=,103,datasette publish appengine,9599,simonw,closed,0,,,,,1,2017-11-15T18:54:18Z,2021-01-04T20:05:14Z,2021-01-04T20:05:14Z,OWNER,,"Similar approach to Heroku, discussed in #90 Looks like this could be pretty easy: https://cloud.google.com/appengine/docs/flexible/python/quickstart",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/103/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610517472,MDU6SXNzdWU2MTA1MTc0NzI=,103,sqlite3.OperationalError: too many SQL variables in insert_all when using rows with varying numbers of columns,32605365,b0b5h4rp13,closed,0,,,,,8,2020-05-01T02:26:14Z,2020-05-14T00:18:57Z,2020-05-14T00:18:57Z,CONTRIBUTOR,,"If using insert_all to put in 1000 rows of data with varying number of columns, it comes up with this message `sqlite3.OperationalError: too many SQL variables` if the number of columns is larger in later records (past the first row) I've reduced `SQLITE_MAX_VARS` by 100 to 899 at the top of `db.py` to add wiggle room, so that if the column count increases it wont go past SQLite's batch limit as calculated by this line of code based on the count of the first row's dict keys batch_size = max(1, min(batch_size, SQLITE_MAX_VARS // num_columns))",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/103/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274284246,MDExOlB1bGxSZXF1ZXN0MTUyODcwMDMw,104,[WIP] Add publish to heroku support,21148,jacobian,closed,0,,,,,6,2017-11-15T19:56:22Z,2017-11-21T20:55:05Z,2017-11-21T20:55:05Z,CONTRIBUTOR,simonw/datasette/pulls/104," Refs #90 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 610853393,MDU6SXNzdWU2MTA4NTMzOTM=,104,"--schema option to ""sqlite-utils tables""",9599,simonw,closed,0,,,,,0,2020-05-01T16:55:49Z,2020-05-01T17:12:37Z,2020-05-01T17:12:37Z,OWNER,,Adds output showing the table schema.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274314940,MDU6SXNzdWUyNzQzMTQ5NDA=,105,Consider data-package as a format for metadata,9599,simonw,closed,0,,,,,4,2017-11-15T21:43:34Z,2017-11-20T19:50:53Z,2017-11-20T19:50:53Z,OWNER,,http://frictionlessdata.io/specs/data-package/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610853576,MDU6SXNzdWU2MTA4NTM1NzY=,105,"""sqlite-utils views"" command",9599,simonw,closed,0,,,,,1,2020-05-01T16:56:11Z,2020-05-01T20:40:07Z,2020-05-01T20:38:36Z,OWNER,,Similar to `sqlite-utils tables`. See also #104.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274315193,MDU6SXNzdWUyNzQzMTUxOTM=,106,Document how pagination works,9599,simonw,closed,0,,,,,1,2017-11-15T21:44:32Z,2019-06-24T06:42:33Z,2019-06-24T06:42:33Z,OWNER,,I made a start at that in this comment: https://news.ycombinator.com/item?id=15691926,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611216862,MDU6SXNzdWU2MTEyMTY4NjI=,106,"create_view(..., ignore=True, replace=True) parameters",9599,simonw,closed,0,,,,,1,2020-05-02T15:45:21Z,2020-05-02T16:04:51Z,2020-05-02T16:02:10Z,OWNER,,"Two new parameters which specify what should happen if the view already exists. I want this for https://github.com/dogsheep/github-to-sqlite/issues/37 Here's the current `create_view()` implementation: https://github.com/simonw/sqlite-utils/blob/b4d953d3ccef28bb81cea40ca165a647b59971fa/sqlite_utils/db.py#L325-L332 `ignore=True` will not do anything if the view exists already. `replace=True` will drop and redefine the view - but only if its SQL definition differs, otherwise it will be left alone.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274343647,MDExOlB1bGxSZXF1ZXN0MTUyOTE0NDgw,107,add support for ?field__isnull=1,3433657,raynae,closed,0,,,,,4,2017-11-15T23:36:36Z,2017-11-17T15:12:29Z,2017-11-17T13:29:22Z,CONTRIBUTOR,simonw/datasette/pulls/107,Is this what you had in mind for [this issue](https://github.com/simonw/datasette/issues/64)?,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/107/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 611222968,MDU6SXNzdWU2MTEyMjI5Njg=,107,sqlite-utils create-view CLI command,9599,simonw,closed,0,,,,,2,2020-05-02T16:15:13Z,2020-05-03T15:36:58Z,2020-05-03T15:36:37Z,OWNER,,Can go with #27 - `sqlite-utils create-table`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/107/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274374317,MDU6SXNzdWUyNzQzNzQzMTc=,108,"Include version in python code, output in template",9599,simonw,closed,0,,,,,0,2017-11-16T02:32:40Z,2017-11-16T15:30:04Z,2017-11-16T15:30:04Z,OWNER,,It would be useful if I could tell which version of datasette was running on a site. Embed version number and include it in maybe a tooltip on the “powered by datasette” link,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/108/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611326701,MDU6SXNzdWU2MTEzMjY3MDE=,108,Documentation unit tests for CLI commands,9599,simonw,closed,0,,,,,2,2020-05-03T03:58:42Z,2020-05-03T04:13:57Z,2020-05-03T04:13:57Z,OWNER,,Have a test that ensures all CLI commands are documented.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/108/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274378301,MDU6SXNzdWUyNzQzNzgzMDE=,109,Set up readthedocs,9599,simonw,closed,0,,,,,1,2017-11-16T02:58:01Z,2017-11-16T16:53:26Z,2017-11-16T16:13:56Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/109/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612658444,MDU6SXNzdWU2MTI2NTg0NDQ=,109,"table.create_index(..., ignore=True)",9599,simonw,closed,0,,,,,1,2020-05-05T14:44:21Z,2020-05-05T14:46:53Z,2020-05-05T14:46:53Z,OWNER,,Option to silently do nothing if the index already exists.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/109/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274578142,MDU6SXNzdWUyNzQ1NzgxNDI=,110,Add --load-extension option to datasette for loading extra SQLite extensions,9599,simonw,closed,0,,,,,2,2017-11-16T16:26:19Z,2017-11-16T18:38:30Z,2017-11-16T16:58:50Z,OWNER,,"This would allow users with extra SQLite extensions installed (like spatialite) to load them at runtime. Inspired by this comment: https://github.com/simonw/datasette/issues/46#issuecomment-344810525",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/110/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 613755043,MDU6SXNzdWU2MTM3NTUwNDM=,110,Support decimal.Decimal type,134771,dvhthomas,closed,0,,,,,6,2020-05-07T03:57:19Z,2020-05-11T01:58:20Z,2020-05-11T01:50:11Z,NONE,,"Decimal types in Postgres cause a failure in db.py data type selection --- I have a Django app using a MoneyField, which uses a `numeric(14,0)` data type in Postgres (https://www.postgresql.org/docs/9.3/datatype-numeric.html). When attempting to export that table I get the following error: ```bash $ db-to-sqlite --table isaweb_proposal ""postgres://connection"" test.db .... column_type=COLUMN_TYPE_MAPPING[column_type], KeyError: ``` Looking at `sql_utils.db.py` at 292-ish it's clear that there is no matching type for what I assume SQLAlchemy interprets as Python decimal.Decimal. From the [SQLite docs](https://www.sqlite.org/datatype3.html#affinity_name_examples) it looks like DECIMAL in other DBs are considered numeric. I'm not quite sure if it's as simple as adding a data type to that list or if there are repercussions beyond it. Thanks for a great tool!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/110/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274615452,MDU6SXNzdWUyNzQ2MTU0NTI=,111,Add “updated” to metadata,9599,simonw,open,0,,,,,12,2017-11-16T18:22:20Z,2021-09-21T22:48:27Z,,OWNER,,"To give an indication as to when the data was last updated. This should be a field in the metadata that is then shown on the index page and in the footer, if it is set. Also support setting it using an option to “datasette publish” and “datasette package” - which can either be a string or can be the magic string “today” to set it to today’s date: datasette publish file.db --updated=today",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/111/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 615477131,MDU6SXNzdWU2MTU0NzcxMzE=,111,sqlite-utils drop-table and drop-view commands,9599,simonw,closed,0,,,,,2,2020-05-10T21:10:42Z,2020-05-11T01:58:36Z,2020-05-11T00:44:26Z,OWNER,,Would be useful to be able to drop views and tables from the CLI.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/111/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274617240,MDU6SXNzdWUyNzQ2MTcyNDA=,112,Allow --load-extension to be set via environment variables,9599,simonw,closed,0,,,,,1,2017-11-16T18:28:31Z,2017-11-17T14:19:23Z,2017-11-17T14:17:27Z,OWNER,,"This will make it easier to package up datasette in a Docker container with a bunch of pre-compiled extensions without the user having to remember to include all of the options every time. Click has a mechanism for this: http://click.pocoo.org/5/options/#multiple-values-from-environment-values",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/112/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 616271236,MDU6SXNzdWU2MTYyNzEyMzY=,112,"add_foreign_key(...., ignore=True)",9599,simonw,closed,0,,,5896742,2.19,4,2020-05-12T00:24:00Z,2020-09-20T22:17:34Z,2020-09-20T22:17:34Z,OWNER,,"When using this library I often find myself wanting to ""add this foreign key, but only if it doesn't exist yet"". The `ignore=True` parameter is increasingly being used for this else where in the library (e.g. in `create_view()`).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/112/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274662378,MDU6SXNzdWUyNzQ2NjIzNzg=,113,Fix the   bug on the database custom SQL query view,9599,simonw,closed,0,,,2919870,Foreign key edition,0,2017-11-16T21:01:26Z,2017-11-17T15:40:52Z,2017-11-17T15:40:52Z,OWNER,,"https://sf-film-locations.now.sh/sf-film-locations-57704b7?sql=select+*+from+Film_Locations_in_San_Francisco This is the bug I fixed in 01e0c3fa18cd0dd7970e208790ffd683a420c924 - but I only fixed it in one place.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/113/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 621286870,MDU6SXNzdWU2MjEyODY4NzA=,113,Syntactic sugar for ATTACH DATABASE,9599,simonw,closed,0,,,,,2,2020-05-19T21:10:00Z,2021-02-19T05:09:12Z,2021-02-19T04:56:36Z,OWNER,,"https://www.sqlite.org/lang_attach.html Maybe something like this: ```python db.attach(""other_db"", ""other_db.db"") ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/113/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274733145,MDExOlB1bGxSZXF1ZXN0MTUzMjAxOTQ1,114,"Add spatialite, switch to debian and local build",54999,ingenieroariel,closed,0,,,,,1,2017-11-17T02:37:09Z,2017-11-17T03:50:52Z,2017-11-17T03:50:52Z,CONTRIBUTOR,simonw/datasette/pulls/114,"Improves the Dockerfile to support spatial datasets, work with the local datasette code (Friendly with git tags and Dockerhub) and moves to slim debian, a small image easy to extend via apt packages for sqlite.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 621989740,MDU6SXNzdWU2MjE5ODk3NDA=,114,table.transform() method for advanced alter table,9599,simonw,closed,0,,,5897911,2.20,26,2020-05-20T18:20:46Z,2020-09-22T07:51:37Z,2020-09-22T04:20:02Z,OWNER,,"SQLite's `ALTER TABLE` can only do the following: * Rename a table * Rename a column * Add a column Notably, it cannot drop columns - so tricks like ""add a float version of this text column, populate it, then drop the old one and rename"" won't work. The docs here https://www.sqlite.org/lang_altertable.html#making_other_kinds_of_table_schema_changes describe a way of implementing full alters safely within a transaction, but it's fiddly. 1. Create new table 2. Copy data 3. Drop old table 4. Rename new into old It would be great if `sqlite-utils` provided an abstraction to help make these kinds of changes safely.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274877366,MDExOlB1bGxSZXF1ZXN0MTUzMzA2ODgy,115,Add keyboard shortcut to execute SQL query,198537,rgieseke,closed,0,,,,,1,2017-11-17T14:13:33Z,2017-11-17T15:16:34Z,2017-11-17T14:22:56Z,CONTRIBUTOR,simonw/datasette/pulls/115,"Very cool tool, thanks a lot! This PR adds a `Shift-Enter` short cut to execute the SQL query. I used CodeMirrors keyboard handling.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/115/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 637889964,MDU6SXNzdWU2Mzc4ODk5NjQ=,115,Ability to execute insert/update statements with the CLI,9599,simonw,closed,0,,,,,1,2020-06-12T17:01:17Z,2020-06-12T17:51:11Z,2020-06-12T17:41:10Z,OWNER,,"``` $ sqlite-utils github.db ""update stars set starred_at = ''"" Traceback (most recent call last): File ""/Users/simon/.local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/Users/simon/.local/pipx/venvs/sqlite-utils/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/pipx/venvs/sqlite-utils/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/simon/.local/pipx/venvs/sqlite-utils/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/pipx/venvs/sqlite-utils/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/pipx/venvs/sqlite-utils/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/simon/.local/pipx/venvs/sqlite-utils/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 673, in query headers = [c[0] for c in cursor.description] TypeError: 'NoneType' object is not iterable ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/115/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274884209,MDU6SXNzdWUyNzQ4ODQyMDk=,116,Add documentation section about SQLite extensions,9599,simonw,closed,0,,,,,1,2017-11-17T14:36:30Z,2018-05-28T17:23:42Z,2018-05-28T17:23:41Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/116/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 644122661,MDU6SXNzdWU2NDQxMjI2NjE=,116,Documentation for table.pks introspection property,9599,simonw,closed,0,,,,,2,2020-06-23T20:27:24Z,2020-06-23T21:21:33Z,2020-06-23T21:03:14Z,OWNER,,https://github.com/simonw/sqlite-utils/blob/4d9a3204361d956440307a57bd18c829a15861db/sqlite_utils/db.py#L535-L540,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/116/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274900388,MDExOlB1bGxSZXF1ZXN0MTUzMzI0MzAx,117,Don't prevent tabbing to `Run SQL` button,198537,rgieseke,closed,0,,,,,1,2017-11-17T15:27:50Z,2017-11-19T20:30:24Z,2017-11-18T00:53:43Z,CONTRIBUTOR,simonw/datasette/pulls/117,"Mentioned in #115 Here you go!",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/117/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 644161221,MDU6SXNzdWU2NDQxNjEyMjE=,117,Support for compound (composite) foreign keys,9599,simonw,open,0,,,,,3,2020-06-23T21:33:42Z,2020-06-23T21:40:31Z,,OWNER,,"It turns out SQLite supports composite foreign keys: https://www.sqlite.org/foreignkeys.html#fk_composite Their example looks like this: ```sql CREATE TABLE album( albumartist TEXT, albumname TEXT, albumcover BINARY, PRIMARY KEY(albumartist, albumname) ); CREATE TABLE song( songid INTEGER, songartist TEXT, songalbum TEXT, songname TEXT, FOREIGN KEY(songartist, songalbum) REFERENCES album(albumartist, albumname) ); ``` Here's what that looks like in sqlite-utils: ``` In [1]: import sqlite_utils In [2]: import sqlite3 In [3]: conn = sqlite3.connect("":memory:"") In [4]: conn Out[4]: In [5]: conn.executescript("""""" ...: CREATE TABLE album( ...: albumartist TEXT, ...: albumname TEXT, ...: albumcover BINARY, ...: PRIMARY KEY(albumartist, albumname) ...: ); ...: ...: CREATE TABLE song( ...: songid INTEGER, ...: songartist TEXT, ...: songalbum TEXT, ...: songname TEXT, ...: FOREIGN KEY(songartist, songalbum) REFERENCES album(albumartist, albumname) ...: ); ...: """""") Out[5]: In [6]: db = sqlite_utils.Database(conn) In [7]: db.tables Out[7]: [
    ,
    ] In [8]: db.tables[0].foreign_keys Out[8]: [] In [9]: db.tables[1].foreign_keys Out[9]: [ForeignKey(table='song', column='songartist', other_table='album', other_column='albumartist'), ForeignKey(table='song', column='songalbum', other_table='album', other_column='albumname')] ``` The table appears to have two separate foreign keys, when actually it has a single compound composite foreign key.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/117/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 275048699,MDExOlB1bGxSZXF1ZXN0MTUzNDMyMDQ1,118,Foreign key information on row and table pages,9599,simonw,closed,0,,,,,0,2017-11-18T03:13:27Z,2017-11-18T03:15:57Z,2017-11-18T03:15:50Z,OWNER,simonw/datasette/pulls/118,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/118/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 651844316,MDExOlB1bGxSZXF1ZXN0NDQ1MDIzMzI2,118,Add insert --truncate option,79913,tsibley,closed,0,,,,,9,2020-07-06T21:58:40Z,2020-07-08T17:26:21Z,2020-07-08T17:26:21Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/118," Deletes all rows in the table (if it exists) before inserting new rows. SQLite doesn't implement a TRUNCATE TABLE statement but does optimize an unqualified DELETE FROM. This can be handy if you want to refresh the entire contents of a table but a) don't have a PK (so can't use --replace), b) don't want the table to disappear (even briefly) for other connections, and c) have to handle records that used to exist being deleted. Ideally the replacement of rows would appear instantaneous to other connections by putting the DELETE + INSERT in a transaction, but this is very difficult without breaking other code as the current transaction handling is inconsistent and non-systematic. There exists the possibility for the DELETE to succeed but the INSERT to fail, leaving an empty table. This is not much worse, however, than the current possibility of one chunked INSERT succeeding and being committed while the next chunked INSERT fails, leaving a partially complete operation.",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/118/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 275082158,MDU6SXNzdWUyNzUwODIxNTg=,119,"Build an ""export this data to google sheets"" plugin",9599,simonw,closed,0,,,,,1,2017-11-18T14:14:51Z,2020-06-04T18:46:40Z,2020-06-04T18:46:39Z,OWNER,,"Inspired by https://github.com/kren1/tosheets It should be a plug-in because I'd like to keep all interactions with proprietary / non-open-source software encapsulated in plugins rather than shipped as part of core.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/119/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 652700770,MDU6SXNzdWU2NTI3MDA3NzA=,119,Ability to remove a foreign key,9599,simonw,closed,0,,,,,3,2020-07-07T22:31:37Z,2020-09-24T20:36:59Z,2020-09-24T20:36:59Z,OWNER,,Useful if you add one but make a mistake and need to undo it without recreating the database from scratch.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/119/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275087397,MDU6SXNzdWUyNzUwODczOTc=,120,Plugin that adds an authentication layer of some sort,9599,simonw,closed,0,,,,,4,2017-11-18T15:39:13Z,2020-03-16T18:48:06Z,2020-03-16T18:48:06Z,OWNER,,"Would allow people who want to host private data to do so. .sh ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/120/reactions"", ""total_count"": 7, ""+1"": 5, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 2, ""rocket"": 0, ""eyes"": 0}",,completed 652816158,MDExOlB1bGxSZXF1ZXN0NDQ1ODMzOTA4,120,Fix query command's support for DML,79913,tsibley,closed,0,,,,,1,2020-07-08T01:36:34Z,2020-07-08T05:14:04Z,2020-07-08T05:14:04Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/120,See commit messages for details. I ran into this while investigating another feature/issue.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/120/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 275089535,MDU6SXNzdWUyNzUwODk1MzU=,121,?_json=foo&_json=bar query string argument ,9599,simonw,closed,0,,,,,4,2017-11-18T16:09:55Z,2018-05-31T13:48:12Z,2018-05-28T18:11:51Z,OWNER,,"Causes the specified columns in the output to be treated as JSON, and returned deserialized in the .json or .jsono response. This will be particularly powerful when combined with https://sqlite.org/json1.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/121/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 652961907,MDU6SXNzdWU2NTI5NjE5MDc=,121,Improved (and better documented) support for transactions,9599,simonw,open,0,,,,,3,2020-07-08T04:56:51Z,2020-09-24T20:36:46Z,,OWNER,,"_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/pull/118#issuecomment-655283393_ We should put some thought into how this library supports and encourages smart use of transactions.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/121/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 275092453,MDU6SXNzdWUyNzUwOTI0NTM=,122,"Redesign JSON output, ditch jsono, offer variants controlled by parameter instead",9599,simonw,closed,0,,,,,5,2017-11-18T16:52:28Z,2018-04-08T14:54:09Z,2018-04-08T14:54:09Z,OWNER,,"I want to support three variants for the rows output: * a list of lists, with a columns key saying what they are * a list of dictionaries * a single dictionary where the keys are the primary keys of the rows and the values are the row dictionaries themselves I also want to make the various bits of metadata opt-in - so you don't get the SQL statement unless you ask for it. These output options should be controlled by query string arguments. I will set the .jsono URL to redirect to .json with the corresponding options. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/122/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 665700495,MDU6SXNzdWU2NjU3MDA0OTU=,122,CLI utility for inserting binary files into SQLite,9599,simonw,closed,0,,,,,10,2020-07-26T03:27:39Z,2020-07-27T07:10:41Z,2020-07-27T07:09:03Z,OWNER,,"SQLite BLOB columns can store entire binary files. The challenge is inserting them, since they don't neatly fit into JSON objects. It would be great if the `sqlite-utils` CLI had a trick for helping with this. Inspired by https://github.com/simonw/datasette-media/issues/14",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/122/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275125561,MDU6SXNzdWUyNzUxMjU1NjE=,123,Datasette serve should accept paths/URLs to CSVs and other file formats,9599,simonw,open,0,,,,,9,2017-11-19T02:05:48Z,2021-07-19T00:04:32Z,,OWNER,,"This would remove the csvs-to-sqlite step which I end up using for almost everything. I'm hesitant to introduce pandas as a required dependency though since it require compiling numpy. Could build it so this option is only available if you have pandas installed.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/123/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",, 665701216,MDU6SXNzdWU2NjU3MDEyMTY=,123,--raw option for outputting binary content,9599,simonw,closed,0,,,,,0,2020-07-26T03:35:39Z,2020-07-26T16:44:11Z,2020-07-26T16:44:11Z,OWNER,,"Related to the `insert-files` work in #122 - it should be easy to get binary data back out of the database again. One way to do that could be: sqlite-utils files.db ""select content from files where key = 'foo.jpg'"" --raw The `--raw` option would cause just the contents of the first column to be output directly to stdout.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/123/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275125805,MDU6SXNzdWUyNzUxMjU4MDU=,124,Option to open readonly but not immutable,9599,simonw,closed,0,,,,,5,2017-11-19T02:11:03Z,2019-06-24T06:43:46Z,2019-06-24T06:43:46Z,OWNER,,Immutable assumes no other process can modify the file. An option to open reqdonly instead would enable other processes to update the file in place.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/124/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 665802405,MDU6SXNzdWU2NjU4MDI0MDU=,124,sqlite-utils query should support named parameters,9599,simonw,closed,0,,,,,1,2020-07-26T15:25:10Z,2020-07-30T22:57:51Z,2020-07-27T03:53:58Z,OWNER,,"To help out with escaping - so you can run this: sqlite-utils query ""insert into foo (blah) values (:blah)"" --param blah `something here`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/124/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275135393,MDU6SXNzdWUyNzUxMzUzOTM=,125,Plot rows on a map with Leaflet and Leaflet.markercluster,9599,simonw,closed,0,,,,,2,2017-11-19T06:05:05Z,2018-04-26T15:14:31Z,2018-04-26T15:14:31Z,OWNER,,"https://github.com/Leaflet/Leaflet.markercluster would allow us to paginate-load in an enormous set of rows with latitude/longitude points, e.g. https://australian-dunnies.now.sh/ Here's a demo of it loading 50,000 markers: https://leaflet.github.io/Leaflet.markercluster/example/marker-clustering-realworld.50000.html - and it looks like it's easy to support progress bars for if we were iteratively loading 1,000 markers at a time using datasette pagination.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/125/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 665817570,MDU6SXNzdWU2NjU4MTc1NzA=,125,"Output binary columns in ""sqlite-utils query"" JSON",9599,simonw,closed,0,,,,,4,2020-07-26T16:47:02Z,2020-07-27T00:49:41Z,2020-07-27T00:48:45Z,OWNER,,You get an error if you try to run a query that returns data from a BLOB.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/125/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275135535,MDU6SXNzdWUyNzUxMzU1MzU=,126,Blog entry announcing foreign key support,9599,simonw,closed,0,,,2919870,Foreign key edition,1,2017-11-19T06:09:06Z,2017-11-30T16:49:24Z,2017-11-30T16:49:24Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/126/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 665819048,MDU6SXNzdWU2NjU4MTkwNDg=,126,Ability to insert binary data on the CLI using JSON,9599,simonw,closed,0,,,,,2,2020-07-26T16:54:14Z,2020-07-27T04:00:33Z,2020-07-27T03:59:45Z,OWNER,,"> I could solve round tripping (at least a bit) by allowing insert to be run with a flag that says ""these columns are base64 encoded, store the decoded data in a BLOB"". > > That would solve inserting binary data using JSON too. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/125#issuecomment-664012247_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/126/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275135719,MDU6SXNzdWUyNzUxMzU3MTk=,127,"Filtered tables should show count of all matching rows, if fast enough",9599,simonw,closed,0,,,2919870,Foreign key edition,2,2017-11-19T06:13:29Z,2017-11-24T22:02:01Z,2017-11-24T22:02:01Z,OWNER,,"Relates to #86. If you are viewing a filtered page e.g. https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9/bob-ross%2Felements-by-episode?CLOUDS=1 we should show the count of matching rows. Since this could be an expensive operation, we will run it with a strict time limit (maybe 50ms). If the time limit is exceeded we will display ""many"" instead, perhaps? Maybe even link to a count(*) query that would get the full 1000ms time limit which the user can click on if they like (that could even Ajax-in the result).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/127/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 666040390,MDU6SXNzdWU2NjYwNDAzOTA=,127,Ability to insert files piped to insert-files stdin,9599,simonw,closed,0,,,,,3,2020-07-27T07:09:33Z,2020-07-30T03:08:52Z,2020-07-30T03:08:18Z,OWNER,,"> Inserting files by piping them in should work - but since a filename cannot be derived this will need a `--name blah.gif` option. > > cat blah.gif | sqlite-utils insert-files files.db files - --name=blah.gif > _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/122#issuecomment-664128071_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/127/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275159710,MDU6SXNzdWUyNzUxNTk3MTA=,128,"Every visualization should have an ""embed"" button",9599,simonw,open,0,,,,,0,2017-11-19T13:38:13Z,2019-05-13T18:33:51Z,,OWNER,,"At least for the first round of visualizations, any time you construct one using the UI the result should include an ""embed this"" button that returns source code to copy and paste These examples should use unpkg.com (or similarl) urls with SRI hashes, eg https://www.srihash.org - and should load data from the datasette JSON API.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/128/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 666639051,MDU6SXNzdWU2NjY2MzkwNTE=,128,Support UUID and memoryview types,9599,simonw,closed,0,,,,,1,2020-07-27T23:08:34Z,2020-07-30T01:10:43Z,2020-07-30T01:10:43Z,OWNER,,`psycopg2` can return data from PostgreSQL as `uuid.UUID` or `memoryview` objects. These should to be supported by `sqlite-utils` - mainly for https://github.com/simonw/db-to-sqlite,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/128/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275164558,MDU6SXNzdWUyNzUxNjQ1NTg=,129,Hide FTS-created tables by default on the database index page,9599,simonw,closed,0,,,,,2,2017-11-19T14:50:42Z,2017-11-22T20:22:02Z,2017-11-22T20:19:04Z,OWNER,,"SQLite databases that use FTS include a number of automatically generated tables, e.g.: https://sf-trees-search.now.sh/sf-trees-search-a899b92 Of these, only the `Street_Tree_List` table is actually relevant to the user. We can detect which tables are FTS tables by first finding the virtual tables: sqlite> .headers on sqlite> select * from sqlite_master where rootpage = 0; type|name|tbl_name|rootpage|sql table|Search|Search|0|CREATE VIRTUAL TABLE ""Street_Tree_List_fts"" USING FTS4 (""qAddress"", ""qCaretaker"", ""qSpecies"") Then parsing the above to figure out which ones are USING FTS? - then assume that any table which starts with that `Street_Tree_List_fts` prefix was created to support search: sqlite> select * from sqlite_master where type='table' and tbl_name like 'Street_Tree_List_fts%'; type|name|tbl_name|rootpage|sql table|Search_content|Search_content|10355|CREATE TABLE 'Street_Tree_List_fts_content'(docid INTEGER PRIMARY KEY, 'c0qAddress', 'c1qCaretaker', 'c2qSpecies') table|Search_segments|Search_segments|10356|CREATE TABLE 'Street_Tree_List_fts_segments'(blockid INTEGER PRIMARY KEY, block BLOB) table|Search_segdir|Search_segdir|10357|CREATE TABLE 'Street_Tree_List_fts_segdir'(level INTEGER,idx INTEGER,start_block INTEGER,leaves_end_block INTEGER,end_block INTEGER,root BLOB,PRIMARY KEY(level, idx)) table|Search_docsize|Search_docsize|10359|CREATE TABLE 'Street_Tree_List_fts_docsize'(docid INTEGER PRIMARY KEY, size BLOB) table|Search_stat|Search_stat|10360|CREATE TABLE 'Street_Tree_List_fts_stat'(id INTEGER PRIMARY KEY, value BLOB) We won't hide these completely - instead, we'll default the database index view to not showing them with a message that says ""5 hidden tables"" and support ?_hidden=1 to display them.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/129/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 668308777,MDU6SXNzdWU2NjgzMDg3Nzc=,129,"""insert-files --sqlar"" for creating SQLite archives",9599,simonw,closed,0,,,,,2,2020-07-30T02:28:29Z,2020-07-30T22:41:01Z,2020-07-30T22:40:55Z,OWNER,,"A `--sqlar` option could cause `insert-files` to behave in the same way as SQLite's own sqlar mechanism. https://www.sqlite.org/sqlar.html and https://sqlite.org/sqlar/doc/trunk/README.md",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/129/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275166078,MDU6SXNzdWUyNzUxNjYwNzg=,130,"Rename ""datasette build"" to ""datasette inspect""",9599,simonw,closed,0,,,,,0,2017-11-19T15:08:02Z,2017-12-07T16:57:58Z,2017-12-07T16:57:58Z,OWNER,,"This command introspects the databases and writes out a JSON summary. I think I'd like to use `datasette build` for something more interesting, potentially duplicating functionality from https://github.com/simonw/csvs-to-sqlite Since the internal method that does this is called `ds.inspect()` that seems like a reasonable replacement name for the command.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/130/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 671130371,MDU6SXNzdWU2NzExMzAzNzE=,130,Support tokenize option for FTS,9599,simonw,closed,0,,,,,3,2020-08-01T19:27:22Z,2020-08-01T20:51:28Z,2020-08-01T20:51:14Z,OWNER,,"FTS5 supports things like porter stemming using a `tokenize=` option: https://www.sqlite.org/fts5.html#tokenizers Something like this in code: ``` CREATE VIRTUAL TABLE [{table}_fts] USING {fts_version} ( {columns}, tokenize='porter', content=[{table}] ); ``` I tried this out just now and it worked exactly as expected. So... `db[table].enable_fts(...) should accept a 'tokenize=` argument, and `sqlite-utils enable-fts ...` should support a `--tokenize` option.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/130/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275166669,MDU6SXNzdWUyNzUxNjY2Njk=,131,UI support for running FTS searches,9599,simonw,closed,0,,,,,3,2017-11-19T15:16:20Z,2017-11-19T17:18:05Z,2017-11-19T17:00:12Z,OWNER,,"Here's an example query that searches all FTS indexed columns in a table: https://sf-trees-search.now.sh/sf-trees-search-a899b92?sql=select+*+from+Street_Tree_List+where+rowid+in+%28select+rowid+from+Street_Tree_List_fts+where+Street_Tree_List_fts+match+%27grove+london+dpw%27%29%0D%0A And here's a query that searches a specific column: https://sf-trees-search.now.sh/sf-trees-search-a899b92?sql=select+*+from+Street_Tree_List+where+rowid+in+%28select+rowid+from+Street_Tree_List_fts+where+qSpecies+match+%27london%27%29%0D%0A If we detect that a table has FTS enabled (which we can do by looking for it as a content table reference in another FTS table's create definition) we should add a search box to the table page which constructs this query - maybe using `?_search=XXX` in the query string? To support search against specified columns, we can do `?_search__ qSpecies=London`. - not necessary, see comment below. - [x] Detect if a table has a FTS index defined against it as a content= parameter - [x] Decide what to do if there is more than one FTS index (maybe just pick the first one?) - [x] Add the `?_search=` query string argument - [x] Add the UI",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/131/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 675753042,MDU6SXNzdWU2NzU3NTMwNDI=,131,sqlite-utils insert: options for column types,9599,simonw,open,0,,,,,5,2020-08-09T18:59:11Z,2022-03-15T13:21:42Z,,OWNER,,"The `insert` command currently results in string types for every column - at least when used against CSV or TSV inputs. It would be useful if you could do the following: - automatically detects the column types based on eg the first 1000 records - explicitly state the rule for specific columns `--detect-types` could work for the former - or it could do that by default and allow opt-out using `--no-detect-types` For specific columns maybe this: sqlite-utils insert db.db images images.tsv \ --tsv \ -c id int \ -c score float",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/131/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 275175929,MDU6SXNzdWUyNzUxNzU5Mjk=,132,Row view is not currently expanding foreign keys,9599,simonw,closed,0,,,2919870,Foreign key edition,1,2017-11-19T17:24:25Z,2017-11-23T21:51:51Z,2017-11-23T21:51:30Z,OWNER,,Eg https://sf-trees.now.sh/sf-trees-ebc2ad9/Street_Tree_List/1,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/132/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 675839512,MDU6SXNzdWU2NzU4Mzk1MTI=,132,Features for enabling and disabling WAL mode,9599,simonw,closed,0,,,,,5,2020-08-10T03:25:44Z,2020-08-10T18:59:35Z,2020-08-10T18:59:35Z,OWNER,,I finally figured out how to enable WAL - turns out it's a property of the database file itself: https://github.com/simonw/til/blob/master/sqlite/enabling-wal-mode.md,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/132/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275176006,MDU6SXNzdWUyNzUxNzYwMDY=,133,"If view is filtered, search should apply within those filtered rows",9599,simonw,closed,0,,,2919870,Foreign key edition,3,2017-11-19T17:25:36Z,2017-11-24T22:30:32Z,2017-11-24T22:30:15Z,OWNER,,Eg on https://sf-trees.now.sh/sf-trees-ebc2ad9/Street_Tree_List?qSpecies=1,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/133/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 677839979,MDU6SXNzdWU2Nzc4Mzk5Nzk=,133,Release a sdist to PyPI,9599,simonw,closed,0,,,,,1,2020-08-12T16:55:09Z,2020-08-12T17:05:06Z,2020-08-12T17:05:06Z,OWNER,,https://pypi.org/project/sqlite-utils/#files currently just has a wheel. I need this to package for homebrew: https://github.com/simonw/homebrew-datasette/issues/10,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/133/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275176094,MDU6SXNzdWUyNzUxNzYwOTQ=,134,Filtered table view should show a count,9599,simonw,closed,0,,,2919870,Foreign key edition,1,2017-11-19T17:26:53Z,2017-11-19T18:10:49Z,2017-11-19T18:10:49Z,OWNER,,Let's do the thing where we attempt to show an accurate count if it can be done in less than 50ms,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/134/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 683804172,MDU6SXNzdWU2ODM4MDQxNzI=,134,--load-extension option for sqlite-utils query,9599,simonw,closed,0,,,,,4,2020-08-21T20:12:42Z,2020-08-21T21:06:26Z,2020-08-21T20:54:19Z,OWNER,,"I got this error: ``` % sqlite-utils calands.db 'create table superunits_with_maps_view_concrete as select * from superunits_with_maps_view' Traceback (most recent call last): ... cursor = db.conn.execute(sql, dict(param)) sqlite3.OperationalError: no such function: AsGeoJSON ``` A `--load-extension=/usr/local/lib/mod_spatialite.dylib` option (imitating the same option for Datasette) would help.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/134/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275179724,MDU6SXNzdWUyNzUxNzk3MjQ=,135,?_search=x should work if used directly against a FTS virtual table,9599,simonw,closed,0,,,2949431,Custom templates edition,3,2017-11-19T18:17:53Z,2017-12-07T04:54:41Z,2017-12-07T04:54:41Z,OWNER,,e.g. https://sf-trees.now.sh/sf-trees-ebc2ad9/Street_Tree_List_fts?_search=grove should work,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/135/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 683805434,MDU6SXNzdWU2ODM4MDU0MzQ=,135,Code for finding SpatiaLite in the usual locations,9599,simonw,closed,0,,,,,3,2020-08-21T20:15:34Z,2022-02-05T00:04:26Z,2020-08-21T20:30:13Z,OWNER,,"I built this for `shapefile-to-sqlite` but it would be useful in `sqlite-utils` too: https://github.com/simonw/shapefile-to-sqlite/blob/e754d0747ca2facf9a7433e2d5d15a6a37a9cf6e/shapefile_to_sqlite/utils.py#L16-L19 ```python SPATIALITE_PATHS = ( ""/usr/lib/x86_64-linux-gnu/mod_spatialite.so"", ""/usr/local/lib/mod_spatialite.dylib"", ) ``` https://github.com/simonw/shapefile-to-sqlite/blob/e754d0747ca2facf9a7433e2d5d15a6a37a9cf6e/shapefile_to_sqlite/utils.py#L105-L109 ```python def find_spatialite(): for path in SPATIALITE_PATHS: if os.path.exists(path): return path return None ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/135/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275228834,MDU6SXNzdWUyNzUyMjg4MzQ=,136,"""Reformat SQL"" button next to SQL editor textarea",9599,simonw,closed,0,,,,,0,2017-11-20T03:42:19Z,2019-10-14T03:46:13Z,2019-10-14T03:46:13Z,OWNER,,"Can use this: https://github.com/zeroturnaround/sql-formatter https://zeroturnaround.github.io/sql-formatter/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/136/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 683812642,MDU6SXNzdWU2ODM4MTI2NDI=,136,--load-extension=spatialite shortcut option,9599,simonw,closed,0,,,,,3,2020-08-21T20:31:25Z,2022-02-05T00:04:26Z,2020-10-16T19:14:32Z,OWNER,,In conjunction with #135 - this would do the same thing as `--load-extension=path-to-spatialite` (see #134),140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/136/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275415799,MDU6SXNzdWUyNzU0MTU3OTk=,137,Ability to combine multiple SQL queries on a single graph,9599,simonw,open,0,,,,,1,2017-11-20T16:26:57Z,2019-05-13T18:33:51Z,,OWNER,,This would make visualizations significantly more powerful. The interesting challenge will be around the URL design. It would be useful to be able to combine either multiple explicit SQL queries or multiple queries based on the filter string parameters passed to one or more table views.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/137/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 683830416,MDU6SXNzdWU2ODM4MzA0MTY=,137,--load-extension for other sqlite-utils commands,9599,simonw,closed,0,,,,,1,2020-08-21T21:12:56Z,2020-10-16T19:14:32Z,2020-10-16T19:14:32Z,OWNER,,"e.g. for this: ``` calands-datasette % sqlite-utils tables calands.db --counts [{""table"": ""spatial_ref_sys"", ""count"": 4924}, {""table"": ""spatialite_history"", ""count"": 14}, {""table"": ""sqlite_sequence"", ""count"": 1}, {""table"": ""geometry_columns"", ""count"": 2}, {""table"": ""spatial_ref_sys_aux"", ""count"": 4873}, {""table"": ""views_geometry_columns"", ""count"": 0}, {""table"": ""virts_geometry_columns"", ""count"": 0}, {""table"": ""geometry_columns_statistics"", ""count"": 2}, {""table"": ""views_geometry_columns_statistics"", ""count"": 0}, {""table"": ""virts_geometry_columns_statistics"", ""count"": 0}, {""table"": ""geometry_columns_field_infos"", ""count"": 0}, {""table"": ""views_geometry_columns_field_infos"", ""count"": 0}, {""table"": ""virts_geometry_columns_field_infos"", ""count"": 0}, {""table"": ""geometry_columns_time"", ""count"": 2}, {""table"": ""geometry_columns_auth"", ""count"": 2}, {""table"": ""views_geometry_columns_auth"", ""count"": 0}, {""table"": ""virts_geometry_columns_auth"", ""count"": 0}, Traceback (most recent call last): File ""/usr/local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 143, in tables for line in output_rows(_iter(), headers, nl, arrays, json_cols): File ""/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 922, in output_rows for row, next_row in itertools.zip_longest(current_iter, next_iter): File ""/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 123, in _iter row.append(db[name].count) File ""/usr/local/Cellar/sqlite-utils/2.15.1/libexec/lib/python3.8/site-packages/sqlite_utils/db.py"", line 458, in count return self.db.conn.execute( sqlite3.OperationalError: no such module: VirtualSpatialIndex ``` The `tables` command could take `--load-extension` too - as could `rows` and other similar commands. Follow-on from #134 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/137/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275476839,MDU6SXNzdWUyNzU0NzY4Mzk=,138,"Per-database and per-table metadata, probably using data-package",9599,simonw,closed,0,,,,,1,2017-11-20T19:50:10Z,2017-12-10T03:08:36Z,2017-12-10T03:08:26Z,OWNER,,"Ability to annotate databases and tables with extra metadata describing their purpose, providing source and licensing information and describing individual columns. http://frictionlessdata.io/specs/data-package/ looks like a great format for this, see #105 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/138/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 684118950,MDU6SXNzdWU2ODQxMTg5NTA=,138,extracts= doesn't configure foreign keys,9599,simonw,closed,0,,,,,2,2020-08-23T05:21:15Z,2020-09-24T22:47:01Z,2020-09-24T22:46:52Z,OWNER,,"In using `extracts=` for `shapefiles-to-sqlite` in https://github.com/simonw/shapefile-to-sqlite/issues/9 I've run into a couple of pretty serious flaws: - The columns in the original table are still `TEXT` even when the foreign key they are supposed to reference is an `INTEGER` - which means Datasette foreign key features don't actually work - Those foreign key relationships aren't setup automatically - creating them is left as an exercise for the developer",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/138/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275493851,MDU6SXNzdWUyNzU0OTM4NTE=,139,Build a visualization plugin for Vega,9599,simonw,closed,0,,,,,2,2017-11-20T20:47:41Z,2018-07-10T17:48:18Z,2018-07-10T17:48:18Z,OWNER,,"https://vega.github.io/vega/examples/population-pyramid/ for example looks pretty easy to hook up to Datasette. Depends on #14 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/139/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 686978131,MDU6SXNzdWU2ODY5NzgxMzE=,139,"insert_all(..., alter=True) should work for new columns introduced after the first 100 records",96218,simonwiles,closed,0,,,,,7,2020-08-27T06:25:25Z,2020-08-28T22:48:51Z,2020-08-28T22:30:14Z,CONTRIBUTOR,,"Is there a way to make `.insert_all()` work properly when new columns are introduced outside the first 100 records (with or without the `alter=True` argument)? I'm using `.insert_all()` to bulk insert ~3-4k records at a time and it is common for records to need to introduce new columns. However, if new columns are introduced after the first 100 records, `sqlite_utils` doesn't even raise the `OperationalError: table ... has no column named ...` exception; it just silently drops the extra data and moves on. It took me a while to find this little snippet in the [documentation for `.insert_all()`](https://sqlite-utils.readthedocs.io/en/stable/python-api.html#bulk-inserts) (it's not mentioned under [Adding columns automatically on insert/update](https://sqlite-utils.readthedocs.io/en/stable/python-api.html#bulk-inserts)): > The column types used in the CREATE TABLE statement are automatically derived from the types of data in that first batch of rows. **_Any additional or missing columns in subsequent batches will be ignored._** I tried changing the `batch_size` argument to the total number of records, but it seems only to effect the number of rows that are committed at a time, and has no influence on this problem. Is there a way around this that you would suggest? It seems like it should raise an exception at least.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/139/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275755475,MDU6SXNzdWUyNzU3NTU0NzU=,140,Heatmap visualization plugin,9599,simonw,open,0,,,,,2,2017-11-21T15:34:23Z,2019-05-13T18:33:51Z,,OWNER,,Could use https://github.com/scottbedard/svelte-heatmap,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/140/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 688351054,MDU6SXNzdWU2ODgzNTEwNTQ=,140,Idea: insert-files mechanism for adding extra columns with fixed values,9599,simonw,open,0,,,,,1,2020-08-28T20:57:36Z,2022-03-20T19:45:45Z,,OWNER,,"Say for example you want to populate a `file_type` column with the value `gif`. That could work like this: ``` sqlite-utils insert-files gifs.db images *.gif \ -c path -c md5 -c last_modified:mtime \ -c file_type:text:gif --pk=path ``` So a column defined as a `text` column with a value that follows a second colon.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/140/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 275814941,MDU6SXNzdWUyNzU4MTQ5NDE=,141,datasette publish can fail if /tmp is on a different device,21148,jacobian,closed,0,,,2949431,Custom templates edition,5,2017-11-21T18:28:05Z,2020-04-29T03:27:54Z,2017-12-08T16:06:36Z,CONTRIBUTOR,,"`datasette publish` uses hard links to avoid copying the db into a tmp directory. This can fail if `/tmp` is on another device, because hardlinks can't cross devices. You'll see something like this: ``` $ datasette publish heroku whatever.db ... OSError: [Errno 18] Invalid cross-device link: '/mnt/c/Users/jacob/c/datasette/whatever.db' -> '/tmp/tmpvxq2yof6/whatever.db' ``` [In my case this is failing because I'm on a Windows machine, using WSL, so my code's on a different virtual filesystem from the Linux subsystem, Because Reasons.] I'm not sure if it's possible to detect this (can you figure out which device `/tmp` is on?), or what the fallback should be (soft link? copy?).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/141/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 688352145,MDU6SXNzdWU2ODgzNTIxNDU=,141,insert-files support for compressed values,9599,simonw,open,0,,,,,0,2020-08-28T20:59:46Z,2020-09-24T20:36:08Z,,OWNER,,"The `sqlar` format supports this, it would be useful if `insert-files` could support this too. https://www.sqlite.org/sqlar.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/141/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 275917760,MDU6SXNzdWUyNzU5MTc3NjA=,142,Show extra instructions with the interrupted,9599,simonw,closed,0,,,,,3,2017-11-22T01:44:29Z,2018-05-28T21:25:06Z,2018-05-28T21:24:35Z,OWNER,,"When you are using Datasette locally for ad-hoc analysis it can be frustrating to hit the time limit. If you start it with the correct command line arguments you can disable that time limit. So how about we tell you how to do that anytime you hit the interrupted error provided you are accessing it from localhost.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/142/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 688386219,MDExOlB1bGxSZXF1ZXN0NDc1NjY1OTg0,142,"insert_all(..., alter=True) should work for new columns introduced after the first 100 records",96218,simonwiles,closed,0,,,,,3,2020-08-28T22:22:57Z,2020-08-30T07:28:23Z,2020-08-28T22:30:14Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/142,Closes #139.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/142/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 275939188,MDU6SXNzdWUyNzU5MzkxODg=,143,"Mechanism for ""suggested visualizations""",9599,simonw,closed,0,,,,,1,2017-11-22T04:10:25Z,2018-07-10T17:48:34Z,2018-07-10T17:48:34Z,OWNER,," Each visualization should have a way of deciding if it might be appropriate for the current view of data. We can then offer a ""suggested visualizations"" prompt which shows previews.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/143/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 688389933,MDU6SXNzdWU2ODgzODk5MzM=,143,Move to GitHub Actions CI,9599,simonw,closed,0,,,,,1,2020-08-28T22:34:11Z,2020-08-28T22:41:35Z,2020-08-28T22:41:35Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/143/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276091279,MDU6SXNzdWUyNzYwOTEyNzk=,144,apsw as alternative sqlite3 binding (for full text search),649467,mhalle,closed,0,,,,,3,2017-11-22T14:40:39Z,2018-05-28T21:29:42Z,2018-05-28T21:29:42Z,NONE,,"Hey there, Have you considered providing apsw support as an alternative to stock python sqlite3? I use apsw because it keeps up with sqlite3 and is straightforward to bring in extensions like FTS5. FTS really accelerates the kind of searching often done by web clients. I may be able to help (it shouldn't be much code), but there are a couple of stylistic questions that come up when supporting an optional package. Also, apsw is tricky in that it doesn't have a pypi package (author says limitations in providing options to setup.py). ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/144/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 688395275,MDU6SXNzdWU2ODgzOTUyNzU=,144,Run some tests against numpy,9599,simonw,closed,0,,,,,2,2020-08-28T22:53:00Z,2020-08-28T22:57:05Z,2020-08-28T22:57:04Z,OWNER,,"Accidentally removed in #143: https://github.com/simonw/sqlite-utils/blob/d7d3f962861ef32c5ead8f514c8756f5b6f7c4a0/.travis.yml#L18-L19",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/144/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276192732,MDExOlB1bGxSZXF1ZXN0MTU0MjQ2ODE2,145,Fix pytest version conflict,9599,simonw,closed,0,,,,,0,2017-11-22T20:15:34Z,2017-11-22T20:17:54Z,2017-11-22T20:17:52Z,OWNER,simonw/datasette/pulls/145,"https://travis-ci.org/simonw/datasette/jobs/305929426 pkg_resources.VersionConflict: (pytest 3.2.1 (/home/travis/virtualenv/python3.5.3/lib/python3.5/site-packages), Requirement.parse('pytest==3.2.3'))",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/145/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 688659182,MDU6SXNzdWU2ODg2NTkxODI=,145,Bug when first record contains fewer columns than subsequent records,96218,simonwiles,closed,0,,,,,2,2020-08-30T05:44:44Z,2020-09-08T23:21:23Z,2020-09-08T23:21:23Z,CONTRIBUTOR,,"`insert_all()` selects the maximum batch size based on the number of fields in the first record. If the first record has fewer fields than subsequent records (and `alter=True` is passed), this can result in SQL statements with more than the maximum permitted number of host parameters. This situation is perhaps unlikely to occur, but could happen if the first record had, say, 10 columns, such that `batch_size` (based on `SQLITE_MAX_VARIABLE_NUMBER = 999`) would be 99. If the next 98 rows had 11 columns, the resulting SQL statement for the first batch would have `10 * 1 + 11 * 98 = 1088` host parameters (and subsequent batches, if the data were consistent from thereon out, would have `99 * 11 = 1089`). I suspect that this bug is masked somewhat by the fact that while: > [`SQLITE_MAX_VARIABLE_NUMBER`](https://www.sqlite.org/limits.html#max_variable_number) ... defaults to 999 for SQLite versions prior to 3.32.0 (2020-05-22) or 32766 for SQLite versions after 3.32.0. it is common that it is increased at compile time. Debian-based systems, for example, seem to ship with a version of sqlite compiled with `SQLITE_MAX_VARIABLE_NUMBER` set to 250,000, and I believe this is the case for homebrew installations too. A test for this issue might look like this: ```python def test_columns_not_in_first_record_should_not_cause_batch_to_be_too_large(fresh_db): # sqlite on homebrew and Debian/Ubuntu etc. is typically compiled with # SQLITE_MAX_VARIABLE_NUMBER set to 250,000, so we need to exceed this value to # trigger the error on these systems. THRESHOLD = 250000 extra_columns = 1 + (THRESHOLD - 1) // 99 records = [ {""c0"": ""first record""}, # one column in first record -> batch_size = 100 # fill out the batch with 99 records with enough columns to exceed THRESHOLD *[ dict([(""c{}"".format(i), j) for i in range(extra_columns)]) for j in range(99) ] ] try: fresh_db[""too_many_columns""].insert_all(records, alter=True) except sqlite3.OperationalError: raise ``` The best solution, I think, is simply to process all the records when determining columns, column types, and the batch size. In my tests this doesn't seem to be particularly costly at all, and cuts out a lot of complications (including obviating my implementation of #139 at #142). I'll raise a PR for your consideration. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/145/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276455748,MDU6SXNzdWUyNzY0NTU3NDg=,146,datasette publish gcloud,9599,simonw,closed,0,,,,,2,2017-11-23T18:55:03Z,2019-06-24T06:48:20Z,2019-06-24T06:48:20Z,OWNER,,"See also #103 It looks like you can start a Google Cloud VM with a ""docker container"" option - and the Google Cloud Registry is easy to push containers to. So it would be feasible to have `datasette publish gcloud ...` automatically build a container, push it to GCR, then start a new VM instance with it: https://cloud.google.com/container-registry/docs/pushing-and-pulling ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/146/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 688668680,MDExOlB1bGxSZXF1ZXN0NDc1ODc0NDkz,146,Handle case where subsequent records (after first batch) include extra columns,96218,simonwiles,closed,0,,,,,5,2020-08-30T07:13:58Z,2020-09-08T23:20:37Z,2020-09-08T23:20:37Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/146,"Addresses #145. I think this should do the job. If it meets with your approval I'll update this PR to include an update to the documentation -- I came across this bug while preparing a PR to update the documentation around `batch_size` in any event.",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/146/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 276476670,MDU6SXNzdWUyNzY0NzY2NzA=,147,Tidy up design of the header of the table page,9599,simonw,closed,0,,,2919870,Foreign key edition,1,2017-11-23T21:52:58Z,2017-11-24T22:02:46Z,2017-11-24T22:02:46Z,OWNER,,"This is a bit messy: Depends on #127 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/147/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 688670158,MDU6SXNzdWU2ODg2NzAxNTg=,147,SQLITE_MAX_VARS maybe hard-coded too low,96218,simonwiles,open,0,,,,,7,2020-08-30T07:26:45Z,2021-02-15T21:27:55Z,,CONTRIBUTOR,,"I came across this while about to open an issue and PR against the documentation for `batch_size`, which is a bit incomplete. As mentioned in #145, while: > [`SQLITE_MAX_VARIABLE_NUMBER`](https://www.sqlite.org/limits.html#max_variable_number) ... defaults to 999 for SQLite versions prior to 3.32.0 (2020-05-22) or 32766 for SQLite versions after 3.32.0. it is common that it is increased at compile time. Debian-based systems, for example, seem to ship with a version of sqlite compiled with SQLITE_MAX_VARIABLE_NUMBER set to 250,000, and I believe this is the case for homebrew installations too. In working to understand what `batch_size` was actually doing and why, I realized that by setting `SQLITE_MAX_VARS` in `db.py` to match the value my sqlite was compiled with (I'm on Debian), I was able to decrease the time to `insert_all()` my test data set (~128k records across 7 tables) from ~26.5s to ~3.5s. Given that this about .05% of my total dataset, this is time I am keen to save... Unfortunately, it seems that `sqlite3` in the python standard library doesn't expose the `get_limit()` C API (even though `pysqlite` used to), so it's hard to know what value sqlite has been compiled with (note that this could mean, I suppose, that it's less than 999, and even hardcoding `SQLITE_MAX_VARS` to the conservative default might not be adequate. It can also be lowered -- but not raised -- at runtime). The best I could come up with is `echo """" | sqlite3 -cmd "".limits variable_number""` (only available in `sqlite >= 2015-05-07 (3.8.10)`). Obviously this couldn't be relied upon in `sqlite_utils`, but I wonder what your opinion would be about exposing `SQLITE_MAX_VARS` as a user-configurable parameter (with suitable ""here be dragons"" warnings)? I'm going to go ahead and monkey-patch it for my purposes in any event, but it seems like it might be worth considering.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/147/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 276477888,MDU6SXNzdWUyNzY0Nzc4ODg=,148,Need a != filter,9599,simonw,closed,0,,,2919870,Foreign key edition,0,2017-11-23T22:05:22Z,2017-11-23T22:10:02Z,2017-11-23T22:10:01Z,OWNER,,https://datasette-demos.now.sh/sf-trees-ebc2ad9/Street_Tree_List?qCareAssistant=1 shows trees managed by FUF - but how about trees that are NOT managed by FUF?,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/148/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 695276328,MDU6SXNzdWU2OTUyNzYzMjg=,148,More attractive indentation of created FTS table schema,9599,simonw,closed,0,,,,,1,2020-09-07T16:49:30Z,2020-09-07T18:12:50Z,2020-09-07T18:12:50Z,OWNER,,"On https://github-to-sqlite.dogsheep.net/github/licenses_fts the create table SQL is displayed as: ```sql CREATE VIRTUAL TABLE [licenses_fts] USING FTS5 ( [name], content=[licenses] ); ``` It would be more aesthetically pleasing if it looked like this: ```sql CREATE VIRTUAL TABLE [licenses_fts] USING FTS5 ( [name], content=[licenses] ); ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/148/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276704127,MDU6SXNzdWUyNzY3MDQxMjc=,149,Update custom SQL results to match new table view header,9599,simonw,closed,0,,,2919870,Foreign key edition,1,2017-11-24T22:03:59Z,2017-11-24T22:42:10Z,2017-11-24T22:42:09Z,OWNER,,"Follow-on from #147 - the custom SQL results page should more closely match the design of the table view, which now looks like this: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/149/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 695319258,MDU6SXNzdWU2OTUzMTkyNTg=,149,"FTS table with 7 rows has _fts_docsize table with 9,141 rows",9599,simonw,closed,0,,,,,10,2020-09-07T18:06:16Z,2020-09-07T21:16:34Z,2020-09-07T21:16:34Z,OWNER,,"I'm seeing a weird issue with some of the SQLite databases that I am using with the FTS5 module. I have a database with a `licenses` table that contains 7 rows: The FTS table also has 7 rows: Somehow the accompanying `licenses_fts_docsize` shadow table now has 9,141 rows in it! And `licenses_fts_data` has 41 rows - should I expect that to have 7 rows? I have a hunch that it might be a problem with the triggers. These are the triggers that are updating that FTS table: | type | name | tbl_name | rootpage | sql | | --- | --- | --- | --- | --- | | trigger | licenses_ai | licenses | 0 | `CREATE TRIGGER [licenses_ai] AFTER INSERT ON [licenses] BEGIN INSERT INTO [licenses_fts] (rowid, [name]) VALUES (new.rowid, new.[name]); END` | | trigger | licenses_ad | licenses | 0 | `CREATE TRIGGER [licenses_ad] AFTER DELETE ON [licenses] BEGIN INSERT INTO [licenses_fts] ([licenses_fts], rowid, [name]) VALUES('delete', old.rowid, old.[name]); END` | | trigger | licenses_au | licenses | 0 | `CREATE TRIGGER [licenses_au] AFTER UPDATE ON [licenses] BEGIN INSERT INTO [licenses_fts] ([licenses_fts], rowid, [name]) VALUES('delete', old.rowid, old.[name]); INSERT INTO [licenses_fts] (rowid, [name]) VALUES (new.rowid, new.[name]); END` |",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/149/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276704327,MDU6SXNzdWUyNzY3MDQzMjc=,150,_group_count= feature improvements,9599,simonw,closed,0,,,,,3,2017-11-24T22:06:18Z,2018-05-28T16:41:28Z,2018-05-28T16:41:28Z,OWNER,,"- [ ] The ""apply filters"" form should keep you on the _group_count= page - [ ] Foreign key references should be expand - [ ] Page title should reflect the view you are on",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/150/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 695359607,MDU6SXNzdWU2OTUzNTk2MDc=,150,Feature for tracing SQL queries,9599,simonw,closed,0,,,,,0,2020-09-07T19:43:08Z,2020-09-07T21:57:01Z,2020-09-07T21:57:01Z,OWNER,,"Debugging `sqlite-utils` when something weird happens (e.g. #149) can be a bit tricky since it runs a bunch of different SQL statements behind the scenes. An optional ""tracing"" mechanism for seeing what SQL is being executed would be useful.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/150/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276718605,MDU6SXNzdWUyNzY3MTg2MDU=,151,Set up a pattern portfolio,9599,simonw,closed,0,,,,,2,2017-11-25T02:09:49Z,2020-07-02T00:13:24Z,2020-05-03T03:13:16Z,OWNER,,"https://www.slideshare.net/nataliedowne/practical-maintainable-css/75 This will be a single page that demonstrates all of the different CSS styles and classes available to Datasette.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/151/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 695360889,MDExOlB1bGxSZXF1ZXN0NDgxNjE2NzA0,151,Tracer mechanism for seeing underlying SQL,9599,simonw,closed,0,,,,,0,2020-09-07T19:46:43Z,2020-09-07T21:57:00Z,2020-09-07T21:57:00Z,OWNER,simonw/sqlite-utils/pulls/151,"Refs #150. Needs tests and documentation, including for the new `db.execute()` and `db.executescript()` methods.",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/151/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 276765070,MDU6SXNzdWUyNzY3NjUwNzA=,152,Incorrect display of rows page for tables with a primary key,9599,simonw,closed,0,,,2949431,Custom templates edition,0,2017-11-25T17:29:54Z,2017-12-07T05:23:20Z,2017-12-07T05:23:19Z,OWNER,,"This is a regression. Here's the old version: And here's the new, broken one: https://parlgov-xtxlddmtiz.now.sh/parlgov-25f9855/party_family/1 The JSON output is the same for both - it's only the HTML representation that exhibits the bug.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/152/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 695376054,MDU6SXNzdWU2OTUzNzYwNTQ=,152,Turn on recursive_triggers by default,9599,simonw,closed,0,,,,,2,2020-09-07T20:26:36Z,2020-09-07T21:17:48Z,2020-09-07T20:45:14Z,OWNER,,"https://www.sqlite.org/pragma.html#pragma_recursive_triggers says: > Prior to SQLite [version 3.6.18](https://www.sqlite.org/releaselog/3_6_18.html) (2009-09-11), recursive triggers were not supported. The behavior of SQLite was always as if this pragma was set to OFF. Support for recursive triggers was added in version 3.6.18 but was initially turned OFF by default, for compatibility. Recursive triggers may be turned on by default in future versions of SQLite. So I think the fix for the complex issue in #149 is to turn on `recursive_triggers` globally by default for `sqlite-utils`. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688499924_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/152/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276842536,MDU6SXNzdWUyNzY4NDI1MzY=,153,Ability to customize presentation of specific columns in HTML view,20264,ftrain,closed,0,,,2949431,Custom templates edition,14,2017-11-26T17:46:11Z,2017-12-10T02:08:45Z,2017-12-07T06:17:33Z,NONE,,"This ties into https://github.com/simonw/datasette/issues/3 in some ways. It would be great to have some adaptability in the HTML views and to specific some columns as displaying in certain ways. - [x] 1. **Auto-parsing URIs into in-browser links.** Why? Lots of public data around cultural commons stuff links to a specific URL. This would be a great utility to turn on at the command line, just parse everything for URLs. Maybe they need to be underlined or represented in a different way than internal URLs. - [x] 2. **Ability to identify a column as plain/preformatted text.** Why? Was trying to import the Enron emails, the body collapses. Hard to read. These fields also tend to screw up the ability to scan a table view. If you knew it was text the system could set an `overflow` property on the relevant CSS, so you could still scan. - [x] 3. **Ability to identify a column as HTML.** Why? I want to spider some stuff and drop sections into SQLite, and just keep them as HTML.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/153/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 695377804,MDU6SXNzdWU2OTUzNzc4MDQ=,153,table.optimize() should delete junk rows from *_fts_docsize,9599,simonw,closed,0,,,,,3,2020-09-07T20:31:09Z,2020-09-24T20:35:46Z,2020-09-07T21:16:33Z,OWNER,,"> The second challenge here is cleaning up all of those junk rows in existing `*_fts_docsize` tables. Doing that just to the demo database from https://github-to-sqlite.dogsheep.net/github.db dropped its size from 22MB to 16MB! Here's the SQL: > ```sql > DELETE FROM [licenses_fts_docsize] WHERE id NOT IN ( > SELECT rowid FROM [licenses_fts]); > ``` > I can do that as part of the existing `table.optimize()` method, which optimizes FTS tables. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688501064_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/153/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276873891,MDU6SXNzdWUyNzY4NzM4OTE=,154,Datasette CSS should include content hash in the URL,9599,simonw,closed,0,,,2949431,Custom templates edition,3,2017-11-27T00:57:36Z,2017-12-09T03:10:23Z,2017-12-09T03:10:22Z,OWNER,,"When I deployed the latest version of datasette to https://fivethirtyeight.datasettes.com/ I noticed I was getting served stale CSS since it had been cached. Including the sha of he contents in its URL should fix that. I can calculate this on server start.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/154/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 695441530,MDU6SXNzdWU2OTU0NDE1MzA=,154,OperationalError: cannot change into wal mode from within a transaction,9599,simonw,open,0,,,,,2,2020-09-07T23:42:44Z,2020-09-07T23:47:10Z,,OWNER,,"I'm getting this error when running: sqlite-utils enable-wal beta.db `OperationalError: cannot change into wal mode from within a transaction` I'm worried that maybe that's because of this new code from #152: https://github.com/simonw/sqlite-utils/blob/deb2eb013ff85bbc828ebc244a9654f0d9c3139e/sqlite_utils/db.py#L128-L129",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/154/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 277589569,MDU6SXNzdWUyNzc1ODk1Njk=,155,A primary key column that has foreign key restriction associated won't rendering label column,388154,wsxiaoys,closed,0,,,2949431,Custom templates edition,4,2017-11-29T00:40:02Z,2017-12-07T05:39:53Z,2017-12-07T05:39:53Z,NONE,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/155/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 696045581,MDU6SXNzdWU2OTYwNDU1ODE=,155,rebuild-fts command and table.rebuild_fts() method,9599,simonw,closed,0,,,,,2,2020-09-08T17:19:26Z,2020-09-24T20:35:46Z,2020-09-08T23:16:10Z,OWNER,,"https://sqlite.org/forum/forumpost/fa777fff86 > Easiest thing would be to run a 'rebuild' to rebuild the FTS index from scratch based on the contents of the content table. i.e. > > INSERT INTO licenses_fts(licenses_fts) VALUES('rebuild'); > > https://www.sqlite.org/fts5.html#the_rebuild_command",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/155/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 278189708,MDU6SXNzdWUyNzgxODk3MDg=,156,Document CSS hooks and custom templates,9599,simonw,closed,0,,,2949431,Custom templates edition,1,2017-11-30T16:43:15Z,2017-11-30T17:11:34Z,2017-11-30T17:10:58Z,OWNER,,Documentation currently lives in commit messages on https://github.com/simonw/datasette/commit/8ab3a169d42d096f2c7979c6d3d7746618d30f0b and 3cd06729f457d690603b6060dc552b535517ab09,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/156/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 697030843,MDExOlB1bGxSZXF1ZXN0NDgzMDI3NTg3,156,Typos in tests,96218,simonwiles,closed,0,,,,,1,2020-09-09T18:00:58Z,2020-09-09T18:24:50Z,2020-09-09T18:21:23Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/156,"One of these is my fault, and the other is one I just happened to come across. They're harmless, but might as well be fixed.",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/156/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 278190321,MDU6SXNzdWUyNzgxOTAzMjE=,157,"Teach ""datasette publish"" about custom template directories",9599,simonw,closed,0,,,2949431,Custom templates edition,1,2017-11-30T16:44:57Z,2020-01-15T16:05:13Z,2017-12-09T18:28:54Z,OWNER,,"The following command should copy the custom templates into the deployment and ensure `datasette serve` correctly serves them: datasette publish now mydb.db --template-dir=custom-templates/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/157/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 697179806,MDU6SXNzdWU2OTcxNzk4MDY=,157,sqlite-utils add-foreign-keys command,9599,simonw,closed,0,,,5896742,2.19,2,2020-09-09T21:44:30Z,2020-09-24T20:34:50Z,2020-09-20T20:14:30Z,OWNER,,"Like `add-foreign-key` but can do multiple foreign keys at once. Inspired by https://github.com/simonw/calands-datasette/blob/99de39dd80a906f5c1f16724467b0cd55ba4ef36/build.sh which does this: ``` sqlite-utils add-foreign-key calands.db units_with_maps ACCESS_TYP sqlite-utils add-foreign-key calands.db units_with_maps AGNCY_NAME sqlite-utils add-foreign-key calands.db units_with_maps AGNCY_LEV sqlite-utils add-foreign-key calands.db units_with_maps AGNCY_TYP sqlite-utils add-foreign-key calands.db units_with_maps LAYER sqlite-utils add-foreign-key calands.db units_with_maps MNG_AGENCY sqlite-utils add-foreign-key calands.db units_with_maps MNG_AG_LEV sqlite-utils add-foreign-key calands.db units_with_maps MNG_AG_TYP sqlite-utils add-foreign-key calands.db units_with_maps COUNTY sqlite-utils add-foreign-key calands.db units_with_maps DES_TP ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/157/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 278190981,MDU6SXNzdWUyNzgxOTA5ODE=,158,Ensure default templates are designed to be extended,9599,simonw,closed,0,,,2949431,Custom templates edition,1,2017-11-30T16:46:41Z,2017-12-07T05:41:09Z,2017-12-07T05:41:08Z,OWNER,,"Since custom templates can do `{% extends ""default:table.html"" %}` the default templates should include sensible named `{% block %}` components designed to support common extension patterns. Since we already support `{{ super() }}` we may not have much if anything to add here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/158/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 697203800,MDExOlB1bGxSZXF1ZXN0NDgzMTc1NTA5,158,Fix accidental mega long line in docs,167319,tomviner,closed,0,,,,,1,2020-09-09T22:31:23Z,2020-09-16T06:21:43Z,2020-09-16T06:21:43Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/158,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/158/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 278191223,MDU6SXNzdWUyNzgxOTEyMjM=,159,Come up with an elegant mechanism for per-row template customization,9599,simonw,closed,0,,,2949431,Custom templates edition,0,2017-11-30T16:47:26Z,2017-12-07T06:12:27Z,2017-12-07T06:12:26Z,OWNER,,It would be nice if customizing the display of an individual row in a custom table template was as simple as possible - refs #153 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/159/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 702386948,MDU6SXNzdWU3MDIzODY5NDg=,159,.delete_where() does not auto-commit (unlike .insert() or .upsert()),11712349,spdkils,open,0,,,,,9,2020-09-16T01:55:52Z,2023-04-01T17:21:05Z,,NONE,,"When you use the delete_where() function on a table, it never commits.... Is that intentional?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/159/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 278208011,MDU6SXNzdWUyNzgyMDgwMTE=,160,Ability to bundle and serve additional static files,9599,simonw,closed,0,,,2949431,Custom templates edition,8,2017-11-30T17:37:51Z,2019-02-02T00:58:20Z,2017-12-09T18:29:11Z,OWNER,,"Since we now have custom templates, we should support including custom static files with them as well. Maybe something like this: datasette mydb.db --template-dir=templates/ --static-dir=static/ This should also be supported by datasette publish - see also #157 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/160/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 705190723,MDU6SXNzdWU3MDUxOTA3MjM=,160,"table.enable_fts(..., replace=True)",9599,simonw,closed,0,,,5896742,2.19,1,2020-09-20T21:36:23Z,2020-09-24T20:35:47Z,2020-09-20T22:05:51Z,OWNER,,"I noticed that https://til.simonwillison.net/ search doesn't use porter stemming. I'd like to add that, but since [the build script](https://github.com/simonw/til/blob/9d3f0fca30e94df3970df52b0447907a077e4673/build_database.py) always operates on an existing database (to avoid re-rendering markdown and re-building image thumbnails) I'd like it to only add porter stemming if it's not there already. So I'd like to be able to say ""set up FTS to look like this, and fix it if it doesn't"". I think the neatest way to do that is with a `replace=True` argument to `.enable_fts()`, for consistency with `def .create_view(self, name, sql, replace=True)`. So the `replace=True` argument would check and see if the configured FTS exists already with the correct options (columns, stemming, triggers) - and if any of those are incorrect it would call `.disable_fts()` and then create a new FTS configuration with the correct options. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/160/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 278814220,MDU6SXNzdWUyNzg4MTQyMjA=,161,Support WITH query ,388154,wsxiaoys,closed,0,,,,,4,2017-12-03T20:00:40Z,2017-12-08T06:18:12Z,2017-12-04T04:52:41Z,NONE,,"Currently datasettle failed with error message: Statement must begin with SELECT Example query ```sql WITH RECURSIVE cnt(x) AS ( SELECT 1 UNION ALL SELECT x+1 FROM cnt LIMIT 1000000 ) SELECT x FROM cnt; ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/161/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 705975133,MDExOlB1bGxSZXF1ZXN0NDkwNjA3OTQ5,161,table.transform() method,9599,simonw,closed,0,,,5897911,2.20,13,2020-09-21T23:16:59Z,2020-09-22T07:48:24Z,2020-09-22T04:20:02Z,OWNER,simonw/sqlite-utils/pulls/161,"Refs #114 - [x] Ability to change the primary key - [x] Support for changing default value for columns - [x] Support for changing `NOT NULL` status of columns - [x] Support for copying existing foreign keys and removing them - Support for `conversions=` parameter - [x] Detailed documentation - [x] `PRAGMA foreign_keys` stuff",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/161/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 279199916,MDU6SXNzdWUyNzkxOTk5MTY=,162,Link should not show up in the column selection dropdowns,9599,simonw,closed,0,,,2949431,Custom templates edition,0,2017-12-05T00:19:04Z,2017-12-07T05:05:58Z,2017-12-07T05:05:58Z,OWNER,,"e.g. on https://san-francisco.datasettes.com/food-trucks-921342f/Applicant ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/162/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 705995722,MDU6SXNzdWU3MDU5OTU3MjI=,162,A decorator for registering custom SQL functions,9599,simonw,closed,0,,,,,2,2020-09-22T00:18:32Z,2020-09-22T00:40:44Z,2020-09-22T00:32:17Z,OWNER,,"Syntactic sugar for `db.conn.create_function` - it would work something like this: ```python db = sqlite_utils.Database(""mydb.db"") @db.register_function def scramble(text): chars = list(text) random.shuffle(chars) return """".join(chars) ``` The decorator would inspect the function to find its name and arity (number of arguments). Having run the above you could then do: ```python db.execute(""select scramble('hello')"").fetchall() ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/162/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 279547886,MDU6SXNzdWUyNzk1NDc4ODY=,163,Document the querystring argument for setting a different time limit,9599,simonw,closed,0,,,,,2,2017-12-05T22:05:08Z,2021-03-23T02:44:33Z,2017-12-06T15:06:57Z,OWNER,,"http://datasette.readthedocs.io/en/latest/sql_queries.html#query-limits Need to explain why this is useful too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/163/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 706001517,MDU6SXNzdWU3MDYwMDE1MTc=,163,Idea: conversions= could take Python functions,9599,simonw,open,0,,,,,4,2020-09-22T00:37:12Z,2021-12-20T00:56:52Z,,OWNER,,"Right now you use `conversions=` like this: ```python db[""example""].insert({ ""name"": ""The Bigfoot Discovery Museum"" }, conversions={""name"": ""upper(?)""}) ``` How about if you could optionally provide a Python function (or a lambda) like this? ```python db[""example""].insert({ ""name"": ""The Bigfoot Discovery Museum"" }, conversions={""name"": lambda s: s.upper()}) ``` This would work by creating a random name for that function, registering it (similar to #162), executing the SQL and then un-registering the custom function at the end.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/163/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 280013907,MDU6SXNzdWUyODAwMTM5MDc=,164,datasette skeleton command for kick-starting database and table metadata,9599,simonw,closed,0,,,2949431,Custom templates edition,3,2017-12-07T06:13:28Z,2021-03-23T02:45:12Z,2017-12-07T06:20:45Z,OWNER,,Generates an example `metadata.json` file populated with all of the databases and tables inspected from the specified databases.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/164/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 706017416,MDU6SXNzdWU3MDYwMTc0MTY=,164,sqlite-utils transform sub-command,9599,simonw,closed,0,,,5897911,2.20,4,2020-09-22T01:32:20Z,2020-09-24T20:34:50Z,2020-09-22T07:48:05Z,OWNER,,The `.transform()` method in #114 warrants an equivalent CLI tool.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/164/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 280014287,MDU6SXNzdWUyODAwMTQyODc=,165,metadata.json support for per-database and per-table information,9599,simonw,closed,0,,,2949431,Custom templates edition,2,2017-12-07T06:15:34Z,2017-12-07T16:48:34Z,2017-12-07T16:47:29Z,OWNER,,"Every database and every table should be able to support the following optional metadata: title description description_html license license_url source source_url If `description_html` is provided it over-rides `description` and will be displayed unescaped.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/165/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 706091046,MDU6SXNzdWU3MDYwOTEwNDY=,165,Make .transform() a keyword arguments only function,9599,simonw,closed,0,,,5897911,2.20,0,2020-09-22T05:37:29Z,2020-09-24T20:35:47Z,2020-09-22T06:39:12Z,OWNER,,And rename the first argument from `columns=` to `types=`,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/165/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 280023225,MDU6SXNzdWUyODAwMjMyMjU=,166,Documentation for metadata.json and datasette skeleton,9599,simonw,closed,0,,,2949431,Custom templates edition,1,2017-12-07T07:02:52Z,2017-12-07T17:20:35Z,2017-12-07T17:20:25Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/166/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 706092617,MDExOlB1bGxSZXF1ZXN0NDkwNzAzMTcz,166,Keyword only arguments for transform(),9599,simonw,closed,0,,,,,0,2020-09-22T05:41:44Z,2020-09-22T06:39:11Z,2020-09-22T06:39:11Z,OWNER,simonw/sqlite-utils/pulls/166,Refs #165,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/166/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 280315352,MDU6SXNzdWUyODAzMTUzNTI=,167,Nasty bug: last column not being correctly displayed,9599,simonw,closed,0,,,2949431,Custom templates edition,6,2017-12-07T23:23:46Z,2017-12-10T01:00:21Z,2017-12-10T01:00:20Z,OWNER,,"e.g. https://datasette-bwnojrhmmg.now.sh/dk3-bde9a9a/dk?source__contains=http ![2017-12-07 at 3 22 pm](https://user-images.githubusercontent.com/9599/33743613-7ee97be0-db62-11e7-8e81-9b9ec69d93f0.png) The JSON output shows that the column is there, but is being displayed incorrectly: https://datasette-bwnojrhmmg.now.sh/dk3-bde9a9a/dk.jsono?source__contains=http ![2017-12-07 at 3 23 pm](https://user-images.githubusercontent.com/9599/33743645-9489b302-db62-11e7-898b-72e812e8855d.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/167/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 706098005,MDU6SXNzdWU3MDYwOTgwMDU=,167,Review the foreign key pragma stuff,9599,simonw,closed,0,,,5897911,2.20,1,2020-09-22T05:55:20Z,2020-09-23T00:13:02Z,2020-09-23T00:13:02Z,OWNER,,"> It is not possible to enable or disable foreign key constraints in the middle of a multi-statement transaction (when SQLite is not in autocommit mode). Attempting to do so does not return an error; it simply has no effect. https://sqlite.org/foreignkeys.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/167/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 280662866,MDExOlB1bGxSZXF1ZXN0MTU3MzY1ODEx,168,Upgrade to Sanic 0.7.0,9599,simonw,closed,0,,,,,1,2017-12-09T01:25:08Z,2017-12-09T03:00:34Z,2017-12-09T03:00:34Z,OWNER,simonw/datasette/pulls/168,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/168/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 706167456,MDU6SXNzdWU3MDYxNjc0NTY=,168,Automate (as much as possible) updates published to Homebrew,9599,simonw,closed,0,,,,,2,2020-09-22T08:08:37Z,2020-11-09T07:43:30Z,2020-11-09T07:43:30Z,OWNER,,I'd like to get new `sqlite-utils` (and Datasette) releases submitted to Homebrew as painlessly as possible.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/168/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 280744309,MDU6SXNzdWUyODA3NDQzMDk=,169,Release v0.14 with templates and static files features,9599,simonw,closed,0,,,2949431,Custom templates edition,1,2017-12-09T18:52:48Z,2017-12-10T02:04:56Z,2017-12-10T02:04:56Z,OWNER,,"Everything in this milestone https://github.com/simonw/datasette/milestone/6 - plus various other fixes: https://github.com/simonw/datasette/compare/0.13...6bdfcf60760c27e29ff34692d06e62b36aeecc56 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/169/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 706757891,MDU6SXNzdWU3MDY3NTc4OTE=,169,"Progress bar for ""sqlite-utils extract""",9599,simonw,closed,0,,,5897911,2.20,0,2020-09-22T23:40:21Z,2020-09-24T20:34:50Z,2020-09-23T00:02:40Z,OWNER,,"> Since these operations could take a long time against large tables, it would be neat if there was a progress bar option for the CLI command. > > The operations are full table scans so calculating progress shouldn't be too difficult. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/42#issuecomment-513246831_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/169/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 280745470,MDU6SXNzdWUyODA3NDU0NzA=,170,Custom template for named canned query,9599,simonw,closed,0,,,2949431,Custom templates edition,3,2017-12-09T19:07:51Z,2017-12-09T21:35:30Z,2017-12-09T21:34:52Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/170/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 706768798,MDU6SXNzdWU3MDY3Njg3OTg=,170,Release notes for 2.20,9599,simonw,closed,0,,,5897911,2.20,1,2020-09-23T00:13:22Z,2020-09-23T00:31:25Z,2020-09-23T00:31:25Z,OWNER,,https://github.com/simonw/sqlite-utils/compare/2.19...b8e004,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/170/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 280745746,MDU6SXNzdWUyODA3NDU3NDY=,171,HTML comments specifying custom templates for page,9599,simonw,closed,0,,,2949431,Custom templates edition,1,2017-12-09T19:11:13Z,2017-12-09T21:50:50Z,2017-12-09T21:48:03Z,OWNER,," This would make the custom templating system self-documenting, and save people from having to figure out the right template names for customizing specific pages.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/171/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 707407567,MDU6SXNzdWU3MDc0MDc1Njc=,171,Idea: transitive closure tables for tree structures,649467,mhalle,closed,0,,,,,2,2020-09-23T14:17:33Z,2020-10-22T04:38:35Z,2020-10-22T04:07:14Z,NONE,,"I just read that sqlite has a transitive closure table extension using a virtual table in order to represent trees: https://charlesleifer.com/blog/querying-tree-structures-in-sqlite-using-python-and-the-transitive-closure-extension/ Even without this extension, though, a util to build a transitive closure table would allow trees to be queried easily. Since it relies on self-referential foreign keys, the relationships might even be able to be automatically detected. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/171/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 280896290,MDU6SXNzdWUyODA4OTYyOTA=,172,Show size of .db file next to download link,9599,simonw,closed,0,,,,,1,2017-12-11T05:12:46Z,2019-02-06T05:09:06Z,2019-02-06T05:00:36Z,OWNER,,"Size in bytes should be calculated by datasette inspect. Template should display it in KB or MB or GB",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/172/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 707427200,MDU6SXNzdWU3MDc0MjcyMDA=,172,Improve performance of extract operations,9599,simonw,closed,0,,,,,9,2020-09-23T14:40:50Z,2020-09-24T15:43:57Z,2020-09-24T15:43:57Z,OWNER,,"This command took about 12 minutes (against a 150MB file with 680,000 rows in it): ``` sqlite-utils extract salaries.db salaries \ 'Organization Group Code' 'Organization Group' \ --table 'organization_groups' \ --fk-column 'organization_group_id' \ --rename 'Organization Group Code' code \ --rename 'Organization Group' name ``` I'm pretty confident we can do better than that.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/172/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 281110295,MDU6SXNzdWUyODExMTAyOTU=,173,I18n and L10n support,50138,janimo,open,0,,,,,2,2017-12-11T17:49:58Z,2021-04-26T12:10:01Z,,NONE,,It would be less geeky and more user friendly if the display strings in the filter menu and possibly other parts could be localized.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/173/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 707478649,MDU6SXNzdWU3MDc0Nzg2NDk=,173,Progress bar for sqlite-utils insert,9599,simonw,closed,0,,,,,6,2020-09-23T15:43:56Z,2021-11-01T08:42:24Z,2020-10-27T18:16:04Z,OWNER,,"It would be nice if `sqlite-utils insert` had a progress bar, for when it's churning through huge CSV files.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/173/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 281197863,MDU6SXNzdWUyODExOTc4NjM=,174,License/Source in footer should inherit from top level,9599,simonw,closed,0,,,,,1,2017-12-11T23:01:35Z,2018-08-11T17:46:51Z,2018-08-11T17:46:51Z,OWNER,,"The footer on https://vice-police-shootings.now.sh/vice-bc7c892/ViceNews_FullOISData does not show license and source information... but that Datasette has that information, it's just defined at the top level: https://vice-police-shootings.now.sh/ The footer for a row/table page should fall back on information for the database, and if there is none for the database it should fall back on the top-level metadata instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/174/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 707944044,MDExOlB1bGxSZXF1ZXN0NDkyMjU3NDA1,174,"Much, much faster extract() implementation",9599,simonw,closed,0,,,,,7,2020-09-24T07:52:31Z,2020-09-24T15:44:00Z,2020-09-24T15:43:56Z,OWNER,simonw/sqlite-utils/pulls/174,Takes my test down from ten minutes to four seconds. Refs #172.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/174/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 282971961,MDU6SXNzdWUyODI5NzE5NjE=,175,"Add project topic ""automatic-api""",3179832,dbohdan,closed,0,,,,,1,2017-12-18T18:09:17Z,2017-12-21T18:33:55Z,2017-12-21T18:33:55Z,NONE,,"Hi there! Could you add the ~~tag~~ topic `automatic-api` to your repository? I am [making a list](https://github.com/dbohdan/automatic-api) of all projects that automatically expose APIs to databases. (Your Show HN made me do it. :-) I knew about PostgREST and PostGraphQL, but it took adding Datasette to sell me on the concept.) They will be easier to discover if there is a standard GitHub tag, and `automatic-api` seems as good a candidate as any. Two projects [already use it](https://github.com/topics/automatic-api).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/175/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 708261775,MDU6SXNzdWU3MDgyNjE3NzU=,175,Add docs for .transform(column_order=),9599,simonw,closed,0,,,,,3,2020-09-24T15:19:04Z,2020-09-24T20:35:48Z,2020-09-24T16:00:56Z,OWNER,,"> Need to update docs for `.transform()` now that `column_order=` is available. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/pull/174#discussion_r494403327_ Maybe also add this as an option to `sqlite-utils transform` - since reordering columns is actually a pretty nice capability.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/175/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 285168503,MDU6SXNzdWUyODUxNjg1MDM=,176,Add GraphQL endpoint,173848,yozlet,open,0,,,,,8,2017-12-29T23:21:01Z,2020-04-21T14:16:24Z,,NONE,,Would make it much easier to build React & similar frontends. Maybe with https://github.com/graphql-python/sanic-graphql ?,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/176/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 708293114,MDU6SXNzdWU3MDgyOTMxMTQ=,176,sqlite-utils transform column order option,9599,simonw,closed,0,,,,,2,2020-09-24T16:01:21Z,2020-09-24T20:34:51Z,2020-09-24T16:11:59Z,OWNER,,Split from #175,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/176/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 286938589,MDU6SXNzdWUyODY5Mzg1ODk=,177,Publishing to Heroku - metadata file not uploaded?,82988,psychemedia,closed,0,,,,,0,2018-01-09T01:04:31Z,2018-01-25T16:45:32Z,2018-01-25T16:45:32Z,CONTRIBUTOR,,"Trying to run *datasette* (version 0.14) on Heroku with a `metadata.json` doesn't seem to be picking up the `metadata.json` file? On a Mac with dodgy `tar` support: ``` ▸ Couldn't detect GNU tar. Builds could fail due to decompression errors ▸ See ▸ https://devcenter.heroku.com/articles/platform-api-deploying-slugs#create-slug-archive ▸ Please install it, or specify the '--tar' option ▸ Falling back to node's built-in compressor ``` Could that be causing the issue? Also, I'm not seeing custom query links anywhere obvious when I run the metadata file with a local *datasette* server? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/177/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 708301810,MDU6SXNzdWU3MDgzMDE4MTA=,177,Simplify .transform(drop_foreign_keys=) and sqlite-transform --drop-foreign-key,9599,simonw,closed,0,,,,,1,2020-09-24T16:13:50Z,2020-09-24T20:35:03Z,2020-09-24T16:19:13Z,OWNER,,"These both currently require you to provide three strings, for `column`, `other_table`, `other_column`. Just providing `column` should be enough information.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/177/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 287240246,MDExOlB1bGxSZXF1ZXN0MTYxOTgyNzEx,178,"If metadata exists, add it to heroku launch command",82988,psychemedia,closed,0,,,,,1,2018-01-09T21:42:21Z,2018-01-15T09:42:46Z,2018-01-14T21:05:16Z,CONTRIBUTOR,simonw/datasette/pulls/178,"The heroku build does seem to make use of any provided `metadata.json` file. Add the `--metadata` switch to the Heroku web launch command if a `metadata.json` file is available. Addresses: https://github.com/simonw/datasette/issues/177",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/178/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 709043182,MDExOlB1bGxSZXF1ZXN0NDkzMTYyNzY3,178,Update README.md,19921,shakeel,closed,0,,,,,1,2020-09-25T15:52:11Z,2020-10-01T14:18:30Z,2020-09-30T20:29:28Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/178,"The `sqlite-utils insert releases.db releases - --pk` is missing the pk field name, added ` ""id""` to fix it.",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/178/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 288438570,MDU6SXNzdWUyODg0Mzg1NzA=,179,More metadata options for template authors ,9599,simonw,open,0,,,,,2,2018-01-14T20:51:04Z,2019-05-13T18:33:33Z,,OWNER,,See this thread on Twitter: https://twitter.com/simonw/status/952637152797458432,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/179/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 709577625,MDU6SXNzdWU3MDk1Nzc2MjU=,179,sqlite-utils transform/insert --detect-types,9599,simonw,closed,0,,,,,4,2020-09-26T17:28:55Z,2021-06-19T03:36:16Z,2021-06-19T03:36:05Z,OWNER,,"Idea from https://github.com/simonw/datasette-edit-tables/issues/13 - provide Python utility methods and accompanying CLI options for detecting the likely types of TEXT columns. So if you have a text column that actually contained exclusively integer string values, it can let you know and let you run transform against it.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/179/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 289375133,MDExOlB1bGxSZXF1ZXN0MTYzNTIzOTc2,180,make html title more readable in query template,56477,ryanpitts,closed,0,,,,,0,2018-01-17T18:56:03Z,2018-04-03T16:03:38Z,2018-04-03T15:24:05Z,CONTRIBUTOR,simonw/datasette/pulls/180,tiny tweak to make this easier to visually parse—I think it matches your style in other templates,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/180/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 709861194,MDU6SXNzdWU3MDk4NjExOTQ=,180,Try running some tests using Hypothesis,9599,simonw,closed,0,,,,,1,2020-09-28T01:11:30Z,2020-10-19T04:51:55Z,2020-10-19T04:51:55Z,OWNER,,Inspired by this Twitter conversation: https://twitter.com/simonw/status/1310386009465479168,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/180/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 289425975,MDExOlB1bGxSZXF1ZXN0MTYzNTYxODMw,181,"add ""format sql"" button to query page, uses sql-formatter",1957344,bsmithgall,closed,0,,,,,7,2018-01-17T21:50:04Z,2019-11-11T03:08:25Z,2019-11-11T03:08:25Z,NONE,simonw/datasette/pulls/181,"Cool project! This fixes #136 using the suggested [sql formatter](https://github.com/zeroturnaround/sql-formatter) library. I included the minified version in the bundle and added the relevant scripts to the codemirror includes instead of adding new files, though I could also add new files. I wanted to keep it all together, since the result of the format needs access to the editor in order to properly update the codemirror instance.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/181/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 709920027,MDU6SXNzdWU3MDk5MjAwMjc=,181,"pk=[""id""] should have same effect as pk=""id""",9599,simonw,closed,0,,,,,1,2020-09-28T04:28:07Z,2020-10-14T21:59:47Z,2020-10-14T21:59:47Z,OWNER,,"``` In [11]: db['one'].insert({""id"": 1, ""name"": ""oentuh""}, pk=""id"") Out[11]:
    In [12]: db['two'].insert({""id"": 1, ""name"": ""oentuh""}, pk=[""id""]) Out[12]:
    In [13]: db['one'].schema Out[13]: 'CREATE TABLE [one] (\n [id] INTEGER PRIMARY KEY,\n [name] TEXT\n)' In [14]: db['two'].schema Out[14]: 'CREATE TABLE [two] (\n [id] INTEGER,\n [name] TEXT\n)' ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/181/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 291451116,MDExOlB1bGxSZXF1ZXN0MTY1MDI5ODA3,182,Add db filesize next to download link,3433657,raynae,closed,0,,,,,0,2018-01-25T04:58:56Z,2019-03-22T13:50:57Z,2019-02-06T04:59:38Z,CONTRIBUTOR,simonw/datasette/pulls/182,"Took a stab at #172, will this do the trick?",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/182/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 711649325,MDU6SXNzdWU3MTE2NDkzMjU=,182,"Better handling of encodings other than utf-8 for ""sqlite-utils insert""",765871,kaihendry,closed,0,,,,,5,2020-09-30T05:43:48Z,2020-10-16T17:20:41Z,2020-10-16T17:18:52Z,NONE,,"Makefile: ``` data.db: curl -O http://maps.natalian.org/data.txt go run csv-write.go > data.csv sqlite-utils insert data.db travels data.csv --csv clean: rm data* ``` [csv-write.go](https://gist.github.com/kaihendry/dff2442de20d73f900026d13bf7a11d9) Error message is: ``` sqlite-utils insert data.db travels data.csv --csv Traceback (most recent call last): File ""/home/hendry/.local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/home/hendry/.local/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/home/hendry/.local/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/home/hendry/.local/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/hendry/.local/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/hendry/.local/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/home/hendry/.local/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 614, in insert insert_upsert_implementation( File ""/home/hendry/.local/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 553, in insert_upsert_implementation headers = next(reader) File ""/usr/lib/python3.8/codecs.py"", line 322, in decode (result, consumed) = self._buffer_decode(data, self.errors, final) UnicodeDecodeError: 'utf-8' codec can't decode byte 0xe3 in position 1234: invalid continuation byte make: *** [Makefile:4: data.db] Error 1 [hendry@t14s datasette-map]$ sqlite-utils --version sqlite-utils, version 2.19 ``` Little bit surprised if Go is spewing out bad Unicode, but I'm not sure how to grok `position 1234`.. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/182/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 291639118,MDU6SXNzdWUyOTE2MzkxMTg=,183,Custom Queries - escaping strings,82988,psychemedia,closed,0,,,,,2,2018-01-25T16:49:13Z,2019-06-24T06:45:07Z,2019-06-24T06:45:07Z,CONTRIBUTOR,,"If a SQLite table column name contains spaces, they are usually referred to in double quotes: `SELECT * FROM mytable WHERE ""gappy column name""=""my value"";` In the JSON metadata file, this is passed by escaping the double quotes: `""queries"": {""my query"": ""SELECT * FROM mytable WHERE \""gappy column name\""=\""my value\"";""}` When specifying a custom query in `metadata.json` using double quotes, these are then rendered in the *datasette* query box using single quotes: `SELECT * FROM mytable WHERE 'gappy column name'='my value';` which does not work. Alternatively, a valid custom query can be passed using backticks (\`) to quote the column name and single (unescaped) quotes for the matched value: ``""queries"": {""my query"": ""SELECT * FROM mytable WHERE `gappy column name`='my value';""}`` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/183/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 712316959,MDU6SXNzdWU3MTIzMTY5NTk=,183,Try out GitHub code scanning,9599,simonw,closed,0,,,,,1,2020-09-30T22:16:14Z,2020-09-30T22:23:44Z,2020-09-30T22:23:44Z,OWNER,,https://github.blog/2020-09-30-code-scanning-is-now-available/,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/183/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 292011379,MDU6SXNzdWUyOTIwMTEzNzk=,184,500 from missing table name,222245,carlmjohnson,closed,0,,,,,4,2018-01-26T19:46:45Z,2019-05-21T16:17:29Z,2018-04-13T18:18:59Z,NONE,,"https://github.com/simonw/datasette/blob/56623e48da5412b25fb39cc26b9c743b684dd968/datasette/app.py#L517-L519 throws an error if it gets an empty list back. Simplest solution is to write a helper func that just says ```python result = list(await self.execute(name, sql, params) if result: return result[0][0] ``` and use it anywhere `[0][0]` is now.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/184/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 716955793,MDExOlB1bGxSZXF1ZXN0NDk5NjAzMzU5,184,Test against Python 3.9,9599,simonw,closed,0,,,,,0,2020-10-08T01:37:05Z,2020-10-08T01:44:06Z,2020-10-08T01:44:06Z,OWNER,simonw/sqlite-utils/pulls/184,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/184/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 299760684,MDU6SXNzdWUyOTk3NjA2ODQ=,185,Metadata should be a nested arbitrary KV store,222245,carlmjohnson,open,0,,,,,12,2018-02-23T16:02:07Z,2019-05-13T18:33:33Z,,NONE,,"I started using the metadata feature and was surprised to find that values are not inherited from the root object down to specific databases and tables. This makes metadata much less useful and requires a lot of pointless duplication. Ideally, metadata should allow arbitrary key-value pairs, and there should be a way of accessing metadata either in an inherited or non-inherited manner. Something like `metadata.page.key` vs. `metadata.this.key` might work as an interface.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/185/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 718952107,MDU6SXNzdWU3MTg5NTIxMDc=,185,Use db[table] consistently in documentation,9599,simonw,closed,0,,,,,0,2020-10-11T23:39:12Z,2020-10-12T00:13:41Z,2020-10-12T00:13:41Z,OWNER,,"The Python docs have a bunch of examples like this: https://sqlite-utils.readthedocs.io/en/stable/python-api.html ```python dogs.enable_fts([""name"", ""twitter""], create_triggers=True) ``` This would be easier for people to understand if it looked like this instead: ```python db[""dogs""].enable_fts([""name"", ""twitter""], create_triggers=True) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/185/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 306811513,MDU6SXNzdWUzMDY4MTE1MTM=,186,proposal new option to disable user agents cache,47107,stefanocudini,closed,0,,,,,3,2018-03-20T10:42:20Z,2018-03-21T09:07:22Z,2018-03-21T01:28:31Z,NONE,,"I think it would be very useful for debugging an option of adding headers to http replies ``` Cache-Control: no-cache ``` especially in the html output",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/186/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 722816436,MDU6SXNzdWU3MjI4MTY0MzY=,186,.extract() shouldn't extract null values,9599,simonw,open,0,,,,,7,2020-10-16T02:41:08Z,2021-08-12T12:32:14Z,,OWNER,,"This almost works, but it creates a rogue `type` record with a value of None. ``` In [1]: import sqlite_utils In [2]: db = sqlite_utils.Database(memory=True) In [5]: db[""creatures""].insert_all([ {""id"": 1, ""name"": ""Simon"", ""type"": None}, {""id"": 2, ""name"": ""Natalie"", ""type"": None}, {""id"": 3, ""name"": ""Cleo"", ""type"": ""dog""}], pk=""id"") Out[5]:
    In [7]: db[""creatures""].extract(""type"") Out[7]:
    In [8]: list(db[""creatures""].rows) Out[8]: [{'id': 1, 'name': 'Simon', 'type_id': None}, {'id': 2, 'name': 'Natalie', 'type_id': None}, {'id': 3, 'name': 'Cleo', 'type_id': 2}] In [9]: db[""type""] Out[9]:
    In [10]: list(db[""type""].rows) Out[10]: [{'id': 1, 'type': None}, {'id': 2, 'type': 'dog'}] ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/186/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 309033998,MDU6SXNzdWUzMDkwMzM5OTg=,187,Windows installation error,11855322,robmarkcole,closed,0,,,,,7,2018-03-27T16:04:37Z,2019-06-15T21:44:23Z,2019-06-15T21:44:23Z,NONE,,"On attempting install on a Win 7 PC with py 3.6.2 (Anaconda dist) I get the error: ``` Collecting uvloop>=0.5.3 (from Sanic==0.7.0->datasette) Downloading uvloop-0.9.1.tar.gz (1.8MB) 100% |¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦¦| 1.8MB 12.8MB/s Complete output from command python setup.py egg_info: Traceback (most recent call last): File """", line 1, in File ""C:\Users\RCole\AppData\Local\Temp\pip-build-juakfqt8\uvloop\setup.py "", line 10, in raise RuntimeError('uvloop does not support Windows at the moment') RuntimeError: uvloop does not support Windows at the moment ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/187/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 723460107,MDU6SXNzdWU3MjM0NjAxMDc=,187,Maybe: Utility method / CLI tool for initializing SpatiaLite,9599,simonw,closed,0,,,,,2,2020-10-16T19:04:03Z,2022-02-05T00:04:26Z,2020-10-16T19:15:13Z,OWNER,,"> I think this should initialize SpatiaLite against the current database if it has not been initialized already. > > Relevant code: https://github.com/simonw/shapefile-to-sqlite/blob/e754d0747ca2facf9a7433e2d5d15a6a37a9cf6e/shapefile_to_sqlite/utils.py#L112-L126",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/187/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 309047460,MDU6SXNzdWUzMDkwNDc0NjA=,188,Ability to bundle metadata and templates inside the SQLite file,9599,simonw,open,0,,,,,4,2018-03-27T16:42:07Z,2020-12-04T17:18:34Z,,OWNER,,"One of the nicest qualities of SQLite as a data format is that you get a single file which you can then backup or share with other people. Datasette breaks this a little once you start including custom metadata.json or template files and CSS. It would be cool if there was an optional mechanism for baking that extra configuration into the SQLite file itself. That way entire datasette mini-applications (including canned queries and custom HTML and CSS) could be constructed as single .db files. Since datasette configuration is all file-based, one way to achieve that would be to support a ""datasette_files"" table which, if present is used to search for file contents by path. This is inline with the philosophy described by https://www.sqlite.org/appfileformat.html ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/188/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 723708310,MDU6SXNzdWU3MjM3MDgzMTA=,188,About loading spatialite,30607,aborruso,closed,0,,,,,1,2020-10-17T08:47:02Z,2022-02-05T00:04:26Z,2020-10-17T08:52:58Z,NONE,,"Hi @simonw , If I run ``` sqlite3 .load /usr/local/lib/mod_spatialite.so select spatialite_version(); ``` I have `5.0.0`. ![image](https://user-images.githubusercontent.com/30607/96332706-d8cd3300-1065-11eb-906b-daf99963198e.png) If I run ``` sqlite-utils :memory: ""select spatialite_version()"" --load-extension=spatialite ``` I have ``` Traceback (most recent call last): File ""/home/aborruso/.local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/home/aborruso/.local/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/home/aborruso/.local/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/home/aborruso/.local/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/aborruso/.local/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/aborruso/.local/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/home/aborruso/.local/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 936, in query _load_extensions(db, load_extension) File ""/home/aborruso/.local/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 1326, in _load_extensions db.conn.load_extension(ext) TypeError: argument 1 must be str, not None ``` How to load properly spatialite extension in sqlite-utils? Thank you very muc",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/188/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 309471814,MDU6SXNzdWUzMDk0NzE4MTQ=,189,Ability to sort (and paginate) by column,9599,simonw,closed,0,9599,simonw,,,31,2018-03-28T18:04:51Z,2018-04-15T18:54:22Z,2018-04-09T05:16:02Z,OWNER,,"As requested in https://github.com/simonw/datasette/issues/185#issuecomment-376614973 I've previously avoided this for performance reasons: sort-by-column on a column without an index is likely to perform badly for hundreds of thousands of rows. That's not a good enough reason to avoid the feature entirely though. A few options: * Allow sort-by-column by default, give users the option to disable it for specific tables/columns * Disallow sort-by-column by default, give users option (probably in `metadata.json`) to enable it for specific tables/columns * Automatically detect if a column either has an index on it OR a table has less than X rows in it We already have the mechanism in place to cut off SQL queries that take more than X seconds, so if someone DOES try to sort by a column that's too expensive it won't actually hurt anything - but it would be nice to not show people a ""sort"" option which is guaranteed to throw a timeout error. The vast majority of datasette usage that I've seen so far is on smaller datasets where the performance penalties of sort-by-column are extremely unlikely to show up. ---- Still left to do: - [x] UI that shows which sort order is currently being applied (in HTML and in JSON) - [x] UI for applying a sort order (with rel=nofollow to avoid Google crawling it) - [x] Sort column names should be escaped correctly in generated SQL - [x] Validation that the selected sort order is a valid column - [x] Throw error if user attempts to apply _sort AND _sort_desc at the same time - [x] Ability to disable sorting (or sort only for specific columns) in metadata.json - [x] Fix ""201 rows where sorted by sortable_with_nulls "" bug ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/189/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 729818242,MDExOlB1bGxSZXF1ZXN0NTEwMjM1OTA5,189,Allow iterables other than Lists in m2m records,35681,adamwolf,closed,0,,,,,3,2020-10-26T18:47:44Z,2020-10-27T16:28:37Z,2020-10-27T16:24:21Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/189,"I was playing around with sqlite-utils, creating a Roam Research dogsheep-style importer for Datasette, and ran into a slight snag. I wanted to use a generator to add an order column in an importer. It looked something like: ``` def order_generator(iterable, attr=None): if attr is None: attr = ""order"" order: int = 0 for i in iterable: i[attr] = order order += 1 yield i ``` When I used this with `insert_all` and other things, it worked fine--but it didn't work as the `records` argument to `m2m`. I dug into it, and sqlite-utils is explicitly checking if the records argument is a list or a tuple. I flipped the check upside down, and now it checks if the argument is a mapping. If it's a mapping, it wraps it in a list, otherwise it leaves it alone. (I get that it might not really make sense to put the order column on the second table. I changed my import schema a bit, and no longer have a real example, but maybe this change still makes sense.) The automated tests still pass, but I did not add any new ones. Let me know what you think! I'm really loving Datasette and its ecosystem; thanks for everything!",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/189/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 309558826,MDU6SXNzdWUzMDk1NTg4MjY=,190,Keyset pagination doesn't work correctly for compound primary keys,9599,simonw,closed,0,,,,,7,2018-03-28T22:45:06Z,2018-03-30T06:31:15Z,2018-03-30T06:26:28Z,OWNER,,"Consider https://datasette-issue-190-compound-pks.now.sh/compound-pks-9aafe8f/compound_primary_key ![2018-03-28 at 3 47 pm](https://user-images.githubusercontent.com/9599/38060388-56da86dc-329f-11e8-9f20-5576153ad55c.png) The next= link is to `d,v`: https://datasette-issue-190-compound-pks.now.sh/compound-pks-9aafe8f/compound_primary_key?_next=d%2Cv But that page starts with: ![2018-03-28 at 3 48 pm](https://user-images.githubusercontent.com/9599/38060402-6b0f5984-329f-11e8-85b8-44a666c4ee71.png) The next key in the sequence should be `d,w`. Also we should return the full a-z of the ones that start with the letter e - in this example we only return `e-w`, `e-x`, `e-y` and `e-z`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/190/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 730693696,MDExOlB1bGxSZXF1ZXN0NTEwOTU2MTM0,190,Progress bar for sqlite-utils insert command,9599,simonw,closed,0,,,,,0,2020-10-27T18:08:53Z,2020-10-27T18:16:03Z,2020-10-27T18:16:03Z,OWNER,simonw/sqlite-utils/pulls/190,Refs #173,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/190/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 310533258,MDU6SXNzdWUzMTA1MzMyNTg=,191,Figure out how to bundle a more up-to-date SQLite,9599,simonw,closed,0,,,,,6,2018-04-02T16:33:25Z,2018-07-10T17:46:13Z,2018-07-10T17:46:13Z,OWNER,,"The version of SQLite that ships with Python 3 is a bit limited - it doesn't support row values for example https://www.sqlite.org/rowvalue.html Figure out how to bundle a more recent SQLite engine with datasette. We need to figure out two cases: * Bundling a recent version in a Dockerfile build. I expect this to be quite easy. * Making a more recent version available to people hacking around in Mac OS X. I have no idea how to start on this. I want it working on Mac OS X too because I don't want to force Docker as a dependency for anyone who just want to hack around with Datasette a little and run the test suite.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/191/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 731740458,MDU6SXNzdWU3MzE3NDA0NTg=,191,Idea: @db.register_function(deterministic=True),9599,simonw,closed,0,,,,,2,2020-10-28T19:45:18Z,2020-10-28T21:31:06Z,2020-10-28T21:31:06Z,OWNER,,"Python 3.8 added a `deterministic` parameter to `db.create_function()`: https://docs.python.org/3/library/sqlite3.html#sqlite3.Connection.create_function `sqlite-utils` could expose this in the decorator, only actually applying it if the Python version supports it (using feature detection) - since nothing will break if it's not applied. https://sqlite-utils.readthedocs.io/en/stable/python-api.html#registering-custom-sql-functions ```python db = Database(memory=True) @db.register_function(deterministic=True) def reverse_string(s): return """".join(reversed(list(s))) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/191/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 310850458,MDExOlB1bGxSZXF1ZXN0MTc5MTA4OTYx,192,New ?_shape=objects/object/lists param for JSON API,9599,simonw,closed,0,,,,,0,2018-04-03T14:02:58Z,2018-04-03T14:53:00Z,2018-04-03T14:52:55Z,OWNER,simonw/datasette/pulls/192,Refs #122,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/192/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 735532751,MDU6SXNzdWU3MzU1MzI3NTE=,192,sqlite-utils search command,9599,simonw,closed,0,,,6079500,3.0,9,2020-11-03T18:07:59Z,2020-11-08T17:07:01Z,2020-11-08T17:07:01Z,OWNER,,A command that knows how to run a search against a FTS enabled table and return results ranked by relevance.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/192/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 310882100,MDU6SXNzdWUzMTA4ODIxMDA=,193,Cleaner mechanism for handling custom errors,9599,simonw,closed,0,,,,,3,2018-04-03T15:19:13Z,2018-04-13T18:18:59Z,2018-04-13T18:18:59Z,OWNER,,"This code is pretty messy: https://github.com/simonw/datasette/blob/0abd3abacb309a2bd5913a7a2df4e9256585b1bb/datasette/app.py#L245-L265 Instead, it would be nice if I could raise an exception that would be converted into the appropriate JSON or HTML error message, with a corresponding HTTP code.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/193/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 735648209,MDU6SXNzdWU3MzU2NDgyMDk=,193,--tsv output format option,9599,simonw,closed,0,,,6079500,3.0,0,2020-11-03T21:31:18Z,2020-11-07T00:09:52Z,2020-11-07T00:09:52Z,OWNER,,"We already support `--csv` for output, and the `insert` command accepts `--tsv`. The output format options should accept `--tsv` too.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/193/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 312312125,MDU6SXNzdWUzMTIzMTIxMjU=,194,Rename table_rows and filtered_table_rows to have _count suffix,9599,simonw,closed,0,,,,,2,2018-04-08T14:53:37Z,2018-04-09T05:25:22Z,2018-04-09T05:25:22Z,OWNER,,"These fields represent counts of items: ""table_rows"": 131, ""filtered_table_rows"": 8, But the names make it sound like they might be arrays full of rows. Adding a `_count` suffix would make this more clear: ""table_rows_count"": 131, ""filtered_table_rows_count"": 8, ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/194/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 735650864,MDU6SXNzdWU3MzU2NTA4NjQ=,194,3.0 release with some minor breaking changes,9599,simonw,closed,0,,,6079500,3.0,3,2020-11-03T21:36:31Z,2020-11-08T17:19:35Z,2020-11-08T17:19:34Z,OWNER,,"While working on search (#192) I've spotted a few small changes I would like to make that would break backwards compatibility in minor ways, hence requiring a 3.x release. `db[table].search()` - I would like this to default to sorting by rank Also I'd like to free up the `-c` and `-f` options for other purposes from the standard output formats here: https://github.com/simonw/sqlite-utils/blob/43eae8b193d362f2b292df73e087ed6f10838144/sqlite_utils/cli.py#L48-L58 I'd like `-f` to be used to indicate a full-text search column during an insert and `-c` to indicate a column (so you can specify which columns you want to output).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/194/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 312313496,MDU6SXNzdWUzMTIzMTM0OTY=,195,"Run pks_for_table in inspect, executing once at build time rather than constantly",9599,simonw,closed,0,,,,,3,2018-04-08T15:12:40Z,2018-04-10T00:54:43Z,2018-04-10T00:54:43Z,OWNER,,"Right now several Datasette views call the `await self.pks_for_table(...)` method to figure out what primary keys are set for a specific table. This executes a `PRAGMA table_info` SQL query. It would be faster and more efficient to execute this query for each table as part of the `inspect()` method.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/195/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 735663855,MDExOlB1bGxSZXF1ZXN0NTE1MDE0ODgz,195,table.search() improvements plus sqlite-utils search command,9599,simonw,closed,0,,,,,3,2020-11-03T22:02:08Z,2020-11-06T18:30:49Z,2020-11-06T18:30:42Z,OWNER,simonw/sqlite-utils/pulls/195,Refs #192. Still needs tests.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/195/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 312355154,MDExOlB1bGxSZXF1ZXN0MTgwMTg4Mzk3,196,_sort= and _sort_desc= parameters to table view,9599,simonw,closed,0,,,,,0,2018-04-09T00:07:21Z,2018-04-09T05:10:29Z,2018-04-09T05:10:23Z,OWNER,simonw/datasette/pulls/196,See #189 ,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/196/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 736520310,MDU6SXNzdWU3MzY1MjAzMTA=,196,Introspect if table is FTS4 or FTS5,9599,simonw,closed,0,,,,,19,2020-11-05T00:45:50Z,2020-11-05T03:54:07Z,2020-11-05T03:54:07Z,OWNER,,"> I want `.search()` to work against both FTS5 and FTS4 tables - but sort by rank should only work for FTS5. > > This means I need to be able to introspect and tell if a table is FTS4 or FTS5. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/192#issuecomment-722054264_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/196/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 312395790,MDU6SXNzdWUzMTIzOTU3OTA=,197,Ability to sort by more than one column,9599,simonw,open,0,,,,,0,2018-04-09T05:13:30Z,2018-07-10T17:45:37Z,,OWNER,,"Split off from #189. I'd like to support ""sort by X descending, then by Y ascending if there are dupes for X"" as well. Suggested syntax for that: ?_sort_desc=X&_sort=Y we currently only allow one argument to be sent. We should allow as many arguments as there are columns, for example: ?_sort=department&_sort_desc=precinct&_sort=age&_sort_desc=size",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/197/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 737153927,MDU6SXNzdWU3MzcxNTM5Mjc=,197,Rethink how table.search() method works,9599,simonw,closed,0,,,6079500,3.0,5,2020-11-05T18:04:34Z,2020-11-08T17:07:37Z,2020-11-08T17:07:37Z,OWNER,,"I need to improve this method to help build `sqlite-utils search` in #192 (PR is #195). The challenge is deciding how it should handle sorting by relevance - especially since that is easy in FTS5 but not at all easy in FTS4. > Latest test failure: > ``` > 114 -> assert [(""racoons are biting trash pandas"", ""USA"", ""bar"")] == table.search( > 115 ""bite"", order=""rowid"" > 116 ) > 117 > 118 > 119 def test_optimize_fts(fresh_db): > (Pdb) table.search(""bite"") > [(2, 'racoons are biting trash pandas', 'USA', 'bar', -9.641434262948206e-07)] > ``` > The problem here is that the `table.search()` method now behaves differently for FTS4 v.s. FTS5 tables. > > With FTS4 you get back just the table columns. > > With FTS5 you also get back the `rowid` as the first column and the `rank` score as the last column. > > This is weird. It also makes me question whether having `.search()` return a list of tuples is the right API design. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/pull/195#issuecomment-722542895_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/197/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 312396095,MDU6SXNzdWUzMTIzOTYwOTU=,198,Ability to sort with nulls last,9599,simonw,open,0,,,,,0,2018-04-09T05:15:40Z,2018-07-10T17:45:37Z,,OWNER,,"Split off from #189 Here's how to do that in SQL: https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=select+rowid%2C+*+from+%5Bnfl-wide-receivers%2Fadvanced-historical%5D%0D%0Aorder+by+case+when+career_ranypa+is+null+then+1+else+0+end%2C+career_ranypa%2C+rowid order by case when career_ranypa is null then 1 else 0 end, career_ranypa",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/198/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 737476423,MDU6SXNzdWU3Mzc0NzY0MjM=,198,Support order by relevance against FTS4,9599,simonw,closed,0,,,,,6,2020-11-06T05:36:31Z,2020-11-06T18:30:44Z,2020-11-06T18:30:44Z,OWNER,,"For #192 and #197 I've decided I want to be able to order by relevance in FTS4 as well as FTS5. This means I need to port over my work on bm25() from https://github.com/simonw/sqlite-fts4 (since I don't want to add a full dependency).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/198/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 312620566,MDU6SXNzdWUzMTI2MjA1NjY=,199,Ability to apply sort on mobile in portrait mode,9599,simonw,closed,0,,,,,4,2018-04-09T17:35:04Z,2018-04-10T00:37:53Z,2018-04-10T00:34:38Z,OWNER,,"Missed this in #189... on mobile in portrait mode we hide the column headers, which means you can't click them to sort! You can sort in landscape mode at least. Need to come up with an alternative sort UI for portrait on mobile.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/199/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 737855731,MDU6SXNzdWU3Mzc4NTU3MzE=,199,"@db.register_function(..., replace=False) to avoid double-registering custom functions",9599,simonw,closed,0,,,,,1,2020-11-06T15:39:21Z,2020-11-06T18:30:44Z,2020-11-06T18:30:44Z,OWNER,,"I'd like a mechanism to optionally avoid registering a custom function if it has already been registered. SQLite doesn't seem to offer a way to introspect registered custom functions so I'll need to track what has already been registered in `sqlite-utils` instead. > Should I register the custom `rank_bm25` SQLite function for every connection, or should I register it against the connection just the first time the user attempts an FTS4 search? I think I'd rather register it only if it is needed. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/198#issuecomment-723145383_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/199/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 313494458,MDExOlB1bGxSZXF1ZXN0MTgxMDMzMDI0,200,Hide Spatialite system tables,45057,russss,closed,0,,,,,3,2018-04-11T21:26:58Z,2018-04-12T21:34:48Z,2018-04-12T21:34:48Z,CONTRIBUTOR,simonw/datasette/pulls/200,They were getting on my nerves.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/200/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 738115165,MDU6SXNzdWU3MzgxMTUxNjU=,200,sqlite-utils rows -c option,9599,simonw,closed,0,,,6079500,3.0,1,2020-11-07T00:22:12Z,2020-11-07T00:28:48Z,2020-11-07T00:28:47Z,OWNER,,To let you specify the exact columns you want. Based on the `-c` option to `sqlite-utils search` in #192.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/200/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 313512748,MDU6SXNzdWUzMTM1MTI3NDg=,201,Support explain select / explain query plan select,9599,simonw,closed,0,,,,,1,2018-04-11T22:41:26Z,2018-04-13T21:17:14Z,2018-04-12T21:32:52Z,OWNER,,See https://www.sqlite.org/eqp.html and https://www.sqlite.org/lang_explain.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/201/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 738128913,MDU6SXNzdWU3MzgxMjg5MTM=,201,.search(columns=) and sqlite-utils search -c ... bug,9599,simonw,closed,0,,,6079500,3.0,1,2020-11-07T01:27:26Z,2020-11-08T16:54:15Z,2020-11-08T16:54:15Z,OWNER,,"Both `table.search(columns=)` and the `sqlite-utils search -c` option do not work as expected - they always return both the `rowid` and the `rank` columns even if those have not been requested. This should be fixed before the 3.0 non-alpha release.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/201/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 313785206,MDExOlB1bGxSZXF1ZXN0MTgxMjQ3NTY4,202,Raise 404 on nonexistent table URLs,45057,russss,closed,0,,,,,2,2018-04-12T15:47:06Z,2018-04-13T19:22:56Z,2018-04-13T18:19:15Z,CONTRIBUTOR,simonw/datasette/pulls/202,"Currently they just 500. Also cleaned the logic up a bit, I hope I didn't miss anything. This is issue #184.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/202/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 738514367,MDU6SXNzdWU3Mzg1MTQzNjc=,202,sqlite-utils insert -f colname - for configuring full-text search,9599,simonw,closed,0,,,,,2,2020-11-08T17:30:09Z,2021-01-03T05:00:36Z,2021-01-03T05:00:27Z,OWNER,,"A mechanism for specifying columns that should be configured for full-text search as part of the initial data import: sqlite-utils insert mydb.db articles articles.csv --csv -f title -f body",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/202/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 313837303,MDU6SXNzdWUzMTM4MzczMDM=,203,Support for units,45057,russss,closed,0,,,,,10,2018-04-12T18:24:28Z,2018-04-16T21:59:17Z,2018-04-16T21:59:17Z,CONTRIBUTOR,,"It would be nice to be able to attach a unit to a column in the metadata, and have it rendered with that unit (and SI prefix) when it's displayed. It would also be nice to support entering the prefixes in variables when querying. With my radio licensing app I've put all frequencies in Hz. It's easy enough to special-case the row rendering to add the SI prefixes, but it's pretty unusable when querying by that field.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/203/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 743384829,MDExOlB1bGxSZXF1ZXN0NTIxMjg3OTk0,203,changes to allow for compound foreign keys,1049910,drkane,open,0,,,,,7,2020-11-16T00:30:10Z,2023-01-25T18:47:18Z,,FIRST_TIME_CONTRIBUTOR,simonw/sqlite-utils/pulls/203,"Add support for compound foreign keys, as per issue #117 Not sure if this is the right approach. In particular I'm unsure about: - the new `ForeignKey` class, which replaces the namedtuple in order to ensure that `column` and `other_column` are forced into tuples. The class does the job, but doesn't feel very elegant. - I haven't rewritten `guess_foreign_table` to take account of multiple columns, so it just checks for the first column in the foreign key definition. This isn't ideal. - I haven't added any ability to the CLI to add compound foreign keys, it's only in the python API at the moment. The PR also contains a minor related change that columns and tables are always quoted in foreign key definitions.",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/203/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314256802,MDExOlB1bGxSZXF1ZXN0MTgxNjAwOTI2,204,Initial units support,45057,russss,closed,0,,,,,0,2018-04-13T21:32:49Z,2018-04-14T09:44:33Z,2018-04-14T03:32:54Z,CONTRIBUTOR,simonw/datasette/pulls/204,"Add support for specifying units for a column in metadata.json and rendering them on display using [pint](https://pint.readthedocs.io/en/latest/). Example table metadata: ```json ""license_frequency"": { ""units"": { ""frequency"": ""Hz"", ""channel_width"": ""Hz"", ""height"": ""m"", ""antenna_height"": ""m"", ""azimuth"": ""degrees"" } } ``` [Example result](https://wtr-api.herokuapp.com/wtr-663ea99/license_frequency/1) This works surprisingly well! I'd like to add support for using units when querying but this is PR is pretty usable as-is. (Pint doesn't seem to support decibels though - it thinks they're decibytes - which is an annoying omission.) (ref ticket #203)",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/204/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 752888228,MDExOlB1bGxSZXF1ZXN0NTI5MDkwNTYw,204,use jsonify_if_need for sql updates,78035,mfa,closed,0,,,,,1,2020-11-29T10:49:00Z,2020-12-08T17:49:42Z,2020-12-08T17:49:42Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/204,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/204/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314319372,MDExOlB1bGxSZXF1ZXN0MTgxNjQyMTE0,205,Support filtering with units and more,45057,russss,closed,0,,,,,3,2018-04-14T10:47:51Z,2018-04-14T15:24:04Z,2018-04-14T15:24:04Z,CONTRIBUTOR,simonw/datasette/pulls/205,"The first commit: * Adds units to exported JSON * Adds units key to metadata skeleton * Adds some docs for units The second commit adds filtering by units by the first method I mentioned in #203: ![image](https://user-images.githubusercontent.com/45057/38767463-7193be16-3fd9-11e8-8a5f-ac4159415c6d.png) [Try it here](https://wtr-api.herokuapp.com/wtr-663ea99/license_frequency?frequency__gt=50GHz&height__lt=50ft). I think it integrates pretty neatly. The third commit adds support for registering custom units with Pint from metadata.json. Probably pretty niche, but I need decibels!",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/205/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 760960559,MDU6SXNzdWU3NjA5NjA1NTk=,205,"sqlite3.OperationalError: near ""("": syntax error",765871,kaihendry,closed,0,,,,,2,2020-12-10T06:44:40Z,2020-12-10T19:18:22Z,2020-12-10T07:24:23Z,NONE,,"The sqlite version is 3.22.0 2018-01-22 18:45:57 0c55d179733b46d8d0ba4d88e01a25e10677046ee3da1d5b1581e86726f2alt1 sqlite-utils, version 3.0 It fails here: https://github.com/kaihendry/aws-partners-datasette/runs/1528432635?check_suite_focus=true I'm not sure where the problem is, since it works _fine locally_ on Archlinux system running 3.34.0 2020-12-01 16:14:00 a26b6597e3ae272231b96f9982c3bcc17ddec2f2b6eb4df06a224b91089fed5b https://github.com/kaihendry/aws-partners-datasette/blob/main/create-summary-view.sh Maybe I need to bump up from ubuntu-latest to ? ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/205/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314323977,MDExOlB1bGxSZXF1ZXN0MTgxNjQ0ODA1,206,Fix sqlite error when loading rows with no incoming FKs,45057,russss,closed,0,,,,,0,2018-04-14T12:08:17Z,2018-04-14T14:32:42Z,2018-04-14T14:24:25Z,CONTRIBUTOR,simonw/datasette/pulls/206,"This fixes `ERROR: conn=, sql = 'select ', params = {'id': '1'}` caused by an invalid query loading incoming FKs when none exist. The error was ignored due to async but it still got printed to the console.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/206/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 761915790,MDU6SXNzdWU3NjE5MTU3OTA=,206,sqlite-utils should suggest --csv if JSON parsing fails,9599,simonw,closed,0,,,,,4,2020-12-11T05:17:56Z,2021-10-30T15:52:17Z,2021-01-03T18:42:22Z,OWNER,,"``` ~ % gsutil cat gs://ossf-criticality-score/python_top_200.csv | sqlite-utils insert /tmp/crit.db crit - ... File ""/usr/local/Cellar/python@3.9/3.9.0_3/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/decoder.py"", line 337, in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end()) File ""/usr/local/Cellar/python@3.9/3.9.0_3/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/decoder.py"", line 355, in raw_decode raise JSONDecodeError(""Expecting value"", s, err.value) from None json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) ``` A nicer error message here would be one that says the JSON is invalid but suggests that maybe you could try `--csv`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/206/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314329002,MDExOlB1bGxSZXF1ZXN0MTgxNjQ3NzE3,207,Link foreign keys which don't have labels,45057,russss,closed,0,,,,,1,2018-04-14T13:27:14Z,2018-04-14T15:00:00Z,2018-04-14T15:00:00Z,CONTRIBUTOR,simonw/datasette/pulls/207,"This renders unlabeled FKs as simple links. I can't see why this would cause any major problems. ![image](https://user-images.githubusercontent.com/45057/38768722-ea15a000-3fef-11e8-8664-ffd7aa4894ea.png) Also includes bonus fixes for two minor issues: * In foreign key link hrefs the primary key was escaped using HTML escaping rather than URL escaping. This broke some non-integer PKs. * Print tracebacks to console when handling 500 errors.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/207/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 763283616,MDU6SXNzdWU3NjMyODM2MTY=,207,sqlite-utils analyze-tables command,9599,simonw,closed,0,,,,,4,2020-12-12T04:33:12Z,2020-12-13T07:25:23Z,2020-12-13T07:20:13Z,OWNER,,"A command which analyzes a table (potentially taking quite a while if the table is large) and outputs information for each column - things like: - How many unique values does this column have? - How many null rows? - How many blank rows? (defined as empty string) - What are the 10 most common values? - What are the 10 least common values? The command can output this information to the terminal, but it should also provide an option for writing the information to a database table so it can be explored later.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/207/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314340944,MDExOlB1bGxSZXF1ZXN0MTgxNjU0ODM5,208,Return HTTP 405 on InvalidUsage rather than 500,45057,russss,closed,0,,,,,0,2018-04-14T16:12:50Z,2018-04-14T18:00:39Z,2018-04-14T18:00:39Z,CONTRIBUTOR,simonw/datasette/pulls/208,"This also stops it filling up the logs. This happens for HEAD requests at the moment - which perhaps should be handled better, but that's a different issue.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/208/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 763320133,MDExOlB1bGxSZXF1ZXN0NTM3NzkxNjc1,208,sqlite-utils analyze-tables command and table.analyze_column() method,9599,simonw,closed,0,,,,,6,2020-12-12T05:27:49Z,2020-12-13T07:20:16Z,2020-12-13T07:20:12Z,OWNER,simonw/sqlite-utils/pulls/208,"Refs #207 - [x] Improve design of CLI output - [x] Truncate long values in least/most common - [x] Add a `-c` column selection option - [x] Tests - [x] Documentation",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/208/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314455877,MDExOlB1bGxSZXF1ZXN0MTgxNzIzMzAz,209, Don't duplicate simple primary keys in the link column,45057,russss,closed,0,,,,,6,2018-04-15T21:56:15Z,2018-04-18T08:40:37Z,2018-04-18T01:13:04Z,CONTRIBUTOR,simonw/datasette/pulls/209,"When there's a simple (single-column) primary key, it looks weird to duplicate it in the link column. This change removes the second PK column and treats the link column as if it were the PK column from a header/sorting perspective. This might make it a bit more difficult to tell what the link for the row is, I'm not sure yet. I feel like the alternative is to change the link column to just have the text ""view"" or something, instead of repeating the PK. (I doubt it makes much more sense with compound PKs.) Bonus change in this PR: fix urlencoding of links in the displayed HTML. Before: ![image](https://user-images.githubusercontent.com/45057/38783830-e2ababb4-40ff-11e8-97fb-25e286a8c920.png) After: ![image](https://user-images.githubusercontent.com/45057/38783835-ebf6b48e-40ff-11e8-8c47-6a864cf21ccc.png)",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/209/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 766156875,MDU6SXNzdWU3NjYxNTY4NzU=,209,Test failure with sqlite 3.34 in test_cli.py::test_optimize,191622,meatcar,closed,0,,,,,1,2020-12-14T08:58:18Z,2021-01-01T23:52:46Z,2021-01-01T23:52:46Z,CONTRIBUTOR,,"pytest output: ``` ... ============================== short test summary info =============================== FAILED tests/test_cli.py::test_optimize[tables0] - assert 1662976 < 1662976 FAILED tests/test_cli.py::test_optimize[tables1] - assert 1667072 < 1662976 ===================== 2 failed, 538 passed, 3 skipped in 34.32s ====================== ``` Came across this while packaging `sqlite-utils` for NixOS, but it can be recreated it using the `alpine:edge` docker image as well as follows: ``` docker run --rm -it alpine:edge /bin/sh # apk update && apk add git sqlite python3 gcc python3-dev musl-dev && python3 -m ensurepip # git clone https://github.com/simonw/sqlite-utils.git # cd sqlite-utils/ # pip3 install -e .[test] # pytest ``` This definitely works on sqlite v3.33.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/209/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314469126,MDExOlB1bGxSZXF1ZXN0MTgxNzMxOTU2,210,"Start of the plugin system, based on pluggy",9599,simonw,closed,0,,,,,0,2018-04-16T00:51:30Z,2018-04-16T00:56:16Z,2018-04-16T00:56:16Z,OWNER,simonw/datasette/pulls/210,Refs #14,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/210/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 767685961,MDU6SXNzdWU3Njc2ODU5NjE=,210,Support of RData files,23739126,PeterBailey,closed,0,,,,,1,2020-12-15T15:04:14Z,2021-01-02T00:02:40Z,2021-01-02T00:02:40Z,NONE,,"Hi Simon, Would be great if you could ingest RData files! I could do this in a few lines of code but I am too lazy - sorry! Peter",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/210/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314471743,MDU6SXNzdWUzMTQ0NzE3NDM=,211,Load plugins from a `--plugins-dir=plugins/` directory,9599,simonw,closed,0,,,,,6,2018-04-16T01:17:43Z,2018-04-16T05:22:02Z,2018-04-16T05:22:02Z,OWNER,,"In #14 and 33c7c53ff87c2 I've added working support for setuptools entry_points plugins. These can be installed from PyPI using `pip install ...`. I imagine some projects will benefit from being able to add plugins without first publishing them to PyPI. Datasette already supports [loading custom templates](http://datasette.readthedocs.io/en/latest/custom_templates.html#custom-templates) like so: datasette serve --template-dir=mytemplates/ mydb.db I propose an additional option, `--plugins-dir=` which specifies a directory full of `blah.py` files which will be loaded into Datasette when the application server starts. datasette serve --plugins-dir=myplugins/ mydb.db This will also need to be supported by `datasette publish` as those Python files should be copied up as part of the deployment.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/211/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 777386465,MDU6SXNzdWU3NzczODY0NjU=,211,table.triggers_dict introspection property,9599,simonw,closed,0,,,,,0,2021-01-02T02:04:00Z,2021-01-02T02:10:10Z,2021-01-02T02:10:10Z,OWNER,,"`table.triggers` currently returns a list of `Trigger` values. A `table.triggers_dict` property could behave like `columns_dict`, returning a dictionary mapping trigger names to their SQL definitions for that table.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/211/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314504812,MDExOlB1bGxSZXF1ZXN0MTgxNzU1MjIw,212,New --plugins-dir=plugins/ option,9599,simonw,closed,0,,,,,0,2018-04-16T05:19:28Z,2018-04-16T05:22:18Z,2018-04-16T05:22:01Z,OWNER,simonw/datasette/pulls/212,Refs #211,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/212/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 777392020,MDU6SXNzdWU3NzczOTIwMjA=,212,Mechanism for maintaining cache of table counts using triggers,9599,simonw,closed,0,,,,,1,2021-01-02T02:58:53Z,2021-01-02T21:40:27Z,2021-01-02T21:40:27Z,OWNER,,"Counting all of the rows in a large table is expensive - this is one of the main causes of performance problems in Datasette when running against large databases. Carefully constructed SQL triggers could be used to maintain accurate cached counts for a table, by incrementing and decrementing a counter every time a row is inserted or deleted. `sqlite-utils` already has a mechanism for creating triggers for FTS - the `table.enable_fts(..., create_triggers=True)` method. How about a similar mechanism for setting up triggers to maintain a count of table rows?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/212/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314506033,MDU6SXNzdWUzMTQ1MDYwMzM=,213,Documentation for plugins system,9599,simonw,closed,0,,,,,0,2018-04-16T05:27:07Z,2018-04-16T15:12:48Z,2018-04-16T15:12:48Z,OWNER,,"Documentation for #14 - how to write plugins, how to ship plugins to PyPI and how to use the `--plugins-dir` option added in #211 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/213/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 777529979,MDU6SXNzdWU3Nzc1Mjk5Nzk=,213,db.enable_counts() method,9599,simonw,closed,0,,,,,2,2021-01-02T21:44:55Z,2021-01-02T22:04:02Z,2021-01-02T22:04:02Z,OWNER,,"Following #212 it would be useful if there was a utility method for enabling counts for ALL tables in a database: ```python db.enable_counts() ``` Open question: should this setup triggers for virtual tables such as FTS tables? Could that break things? ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/213/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314506446,MDU6SXNzdWUzMTQ1MDY0NDY=,214,Ability for plugins to define extra JavaScript and CSS,9599,simonw,closed,0,,,,,6,2018-04-16T05:29:34Z,2020-09-30T20:36:11Z,2018-04-18T03:13:03Z,OWNER,,"This can hook in to the existing `extra_css_urls` and `extra_js_urls` mechanism: https://github.com/simonw/datasette/blob/b2955d9065ea019500c7d072bcd9d49d1967f051/datasette/app.py#L304-L305 The plugins should be able to bundle their own assets though, so it will also have to integrate with the `/static/` static mounts mechanism somehow: https://github.com/simonw/datasette/blob/b2955d9065ea019500c7d072bcd9d49d1967f051/datasette/app.py#L1255-L1257 Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/214/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 777530107,MDU6SXNzdWU3Nzc1MzAxMDc=,214,sqlite-utils enable-counts command,9599,simonw,closed,0,,,,,0,2021-01-02T21:45:48Z,2021-01-03T04:26:44Z,2021-01-03T04:26:44Z,OWNER,,"The CLI version of #212 and #213. # Enable counts for all tables: sqlite-utils enable-counts data.db # Enable counts for specific tables: sqlite-utils enable-counts data.db table1 table2",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/214/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314506669,MDU6SXNzdWUzMTQ1MDY2Njk=,215,Allow plugins to define additional URL routes and views,9599,simonw,closed,0,,,5512395,Datasette 0.44,14,2018-04-16T05:31:09Z,2020-06-09T03:14:32Z,2020-06-09T03:12:08Z,OWNER,,"Might be as simple as having plugins get passed the `app` after the other routes have been defined: https://github.com/simonw/datasette/blob/b2955d9065ea019500c7d072bcd9d49d1967f051/datasette/app.py#L1270-L1274 Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/215/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 777535402,MDU6SXNzdWU3Nzc1MzU0MDI=,215,Use _counts to speed up counts,9599,simonw,closed,0,,,,,9,2021-01-02T22:30:17Z,2021-01-03T20:19:40Z,2021-01-03T20:19:40Z,OWNER,,"Utility mechanism for taking advantage of the new `_counts` table from #212 would be nice. These can trigger automatically if the `_counts` table exists, but since `sqlite-utils` needs to work against any existing database there should be a way of opting out of this optimization.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/215/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314665147,MDU6SXNzdWUzMTQ2NjUxNDc=,216,Bug: Sort by column with NULL in next_page URL,222245,carlmjohnson,closed,0,,,,,15,2018-04-16T14:03:18Z,2018-04-17T01:45:24Z,2018-04-17T01:45:24Z,NONE,,"Copy-pasting from https://github.com/simonw/datasette/issues/189#issuecomment-381429213, since that issue is closed: I think I found a bug. I tried to sort by middle initial in my salaries set, and many middle initials are null. The `next_url` gets set by Datasette to: http://localhost:8001/salaries-d3a5631/2017+Maryland+state+salaries?_next=None%2C391&_sort=middle_initial But then None is interpreted literally and it tries to find a name with the middle initial ""None"" and ends up skipping ahead to O on page 2. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/216/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 777540352,MDU6SXNzdWU3Nzc1NDAzNTI=,216,database.triggers_dict introspection property,9599,simonw,closed,0,,,,,1,2021-01-02T23:13:00Z,2021-01-03T04:27:14Z,2021-01-03T04:25:36Z,OWNER,,Following #211,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/216/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314725342,MDU6SXNzdWUzMTQ3MjUzNDI=,217,Plugin support for datasette publish,9599,simonw,closed,0,,,,,1,2018-04-16T16:17:14Z,2018-07-26T05:33:39Z,2018-07-26T05:16:00Z,OWNER,,"It should be possible to support additional deployment options by writing a plugin (see #59). As part of this, rewrite the Heroku and Now publishers to be implemented as plugins (they will still ship with datasette by default). Maybe `datasette package` should be changed to being part of publish instead, `datasette publish docker` perhaps? Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/217/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 777543336,MDU6SXNzdWU3Nzc1NDMzMzY=,217,Rename .escape() to .quote(),9599,simonw,closed,0,,,,,2,2021-01-02T23:40:52Z,2021-01-03T04:27:38Z,2021-01-03T04:15:23Z,OWNER,,"`.quote()` is a better name because it reflects that the method adds quotes around the value. This method has never been documented so I'm going to rename it without a major version bump. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/217/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314771615,MDU6SXNzdWUzMTQ3NzE2MTU=,218,"Support custom unit display in order to handle ""$10,000""",9599,simonw,open,0,,,,,0,2018-04-16T18:39:31Z,2018-07-10T17:45:38Z,,OWNER,,"I tried to get Datasette to display `$10,000` using the new units support but we currently only display units as a suffix: https://github.com/simonw/datasette/blob/10a34f995c70daa37a8a2aa02c3135a4b023a24c/datasette/app.py#L563-L572 It would be neat if there was a mechanism for specifying a custom unit display - maybe something like this: ``` { ""custom_units"": { ""us_dollar"": { ""unit"": ""us_dollar = [] = $"", ""format"": ""${:,}"" } } } ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/218/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 777560474,MDU6SXNzdWU3Nzc1NjA0NzQ=,218,"""sqlite-utils triggers"" command",9599,simonw,closed,0,,,,,1,2021-01-03T02:34:50Z,2021-01-03T03:49:51Z,2021-01-03T03:03:35Z,OWNER,,"A command to list the triggers in the database. sqlite-utils triggers my.db Can optionally take one or more tables: sqlite-utils triggers my.db table1 table2",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/218/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314834783,MDU6SXNzdWUzMTQ4MzQ3ODM=,219,Expose units in the JSON API?,45057,russss,open,0,,,,,0,2018-04-16T22:04:25Z,2018-04-16T22:04:25Z,,CONTRIBUTOR,,"From #203: it would be nice for the JSON API to (optionally) return columns rendered with units in them - if, for example, you're consuming the JSON to render the rows on a map. I'm not entirely sure how useful this will be though - at the moment my map queries are custom SQL queries (a few have joins in, the rest might be fetching large amounts of data so it makes sense to limit columns fetched). Perhaps the SQL function is a better approach in general.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/219/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 777707544,MDU6SXNzdWU3Nzc3MDc1NDQ=,219,reset_counts() method and command,9599,simonw,closed,0,,,,,4,2021-01-03T20:08:28Z,2021-01-03T20:59:37Z,2021-01-03T20:59:37Z,OWNER,,"> Thought: maybe there should be a `.reset_counts()` method too, for if the table gets out of date with the triggers. > > One way that could happen is if a table is dropped and recreated - the counts in the `_counts` table would likely no longer match the number of rows in that table. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753545757_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/219/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314847571,MDU6SXNzdWUzMTQ4NDc1NzE=,220,Investigate syntactic sugar for plugins,9599,simonw,closed,0,,,,,2,2018-04-16T23:01:39Z,2020-06-11T21:50:06Z,2020-06-11T21:49:55Z,OWNER,,"Suggested by @andrewhayward on Twitter: https://twitter.com/arhayward/status/986015118965268480?s=21 > Have you considered a basic abstraction on top of that, for standard hook features? ``` @sql_function random_integer(a,b): return random.randint(a,b) @template_filter uppercase(str): return str.upper() ``` Maybe `from datasette.plugins import template_filter`? Would have to work out how to get this to play well with pluggy",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/220/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 783778672,MDU6SXNzdWU3ODM3Nzg2NzI=,220,Better error message for *_fts methods against views,649467,mhalle,closed,0,,,,,3,2021-01-11T23:24:00Z,2021-02-22T20:44:51Z,2021-02-14T22:34:26Z,NONE,,"enable_fts and its related methods only work on tables, not views. Could those methods and possibly others move up to the Queryable superclass? ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/220/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 315142414,MDU6SXNzdWUzMTUxNDI0MTQ=,221,Allow plugins to add new cli sub commands ,9599,simonw,closed,0,,,,,3,2018-04-17T16:40:13Z,2021-01-04T20:12:14Z,2021-01-04T20:12:14Z,OWNER,,I could then test this out by having https://github.com/simonw/csvs-to-sqlite register itself as a plugin,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/221/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 783910901,MDU6SXNzdWU3ODM5MTA5MDE=,221,.add_missing_columns() does not take case insensitivity into account,9599,simonw,closed,0,,,,,0,2021-01-12T05:01:00Z,2021-01-12T23:17:33Z,2021-01-12T23:17:33Z,OWNER,,SQLite columns are case insensitive - but the `.add_missing_columns()` method doesn't know that. This means that it can crash if it identifies a column that is a case-insensitive duplicate of an existing column. https://github.com/simonw/sqlite-utils/blob/4cc82fd0bccc9d2eeb3510beb4e691d7da099f84/sqlite_utils/db.py#L1974-L1980,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/221/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 315316214,MDExOlB1bGxSZXF1ZXN0MTgyMzU3NjEz,222,Fix for plugins in Python 3.5,9599,simonw,closed,0,,,,,0,2018-04-18T03:21:01Z,2018-04-18T04:26:50Z,2018-04-18T03:24:21Z,OWNER,simonw/datasette/pulls/222,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/222/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 787900412,MDU6SXNzdWU3ODc5MDA0MTI=,222,.m2m() should accept alter=True parameter,9599,simonw,closed,0,,,,,0,2021-01-18T04:15:43Z,2021-01-18T04:26:10Z,2021-01-18T04:26:10Z,OWNER,,Needed by https://github.com/dogsheep/swarm-to-sqlite/issues/11,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/222/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 315327860,MDU6SXNzdWUzMTUzMjc4NjA=,223,datasette publish --install=name-of-plugin,9599,simonw,closed,0,,,,,3,2018-04-18T04:33:59Z,2018-04-18T14:56:17Z,2018-04-18T14:56:17Z,OWNER,,Mechanism for causing datasette publish and datasette package to install one or more additional plugins using `pip install` - refs #14 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/223/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 788527932,MDU6SXNzdWU3ODg1Mjc5MzI=,223,--delimiter option for CSV import,9599,simonw,closed,0,,,,,2,2021-01-18T20:25:03Z,2021-02-06T01:39:47Z,2021-02-06T01:34:54Z,OWNER,,"https://bruxellesdata.opendatasoft.com/explore/dataset/dog-toilets/export/?location=12,50.85802,4.38054 says: > CSV uses semicolon (;) as a separator. Would be useful to be able to do this: sqlite-utils insert places.db places places.csv --delimiter ';' `--delimiter` could imply `--csv`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/223/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 315517578,MDU6SXNzdWUzMTU1MTc1Nzg=,224,Ability for plugins to bundle templates,9599,simonw,closed,0,,,,,1,2018-04-18T14:57:53Z,2018-04-19T05:50:36Z,2018-04-19T05:50:36Z,OWNER,,"Plugins should be able to bundle templates. The Datasette template loader should then consult those plugins first when loading a template. Jinja2 has a `PackageLoader` class that can help with this: http://jinja.pocoo.org/docs/2.10/api/#jinja2.PackageLoader Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/224/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 792297010,MDExOlB1bGxSZXF1ZXN0NTYwMjA0MzA2,224,Add fts offset docs.,37962604,polyrand,closed,0,,,,,2,2021-01-22T20:50:58Z,2021-02-14T19:31:06Z,2021-02-14T19:31:06Z,NONE,simonw/sqlite-utils/pulls/224,"The limit can be passed as a string to the query builder to have an offset. I have tested it using the shorthand `limit=f""15, 30""`, the standard syntax should work too.",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/224/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 315548495,MDU6SXNzdWUzMTU1NDg0OTU=,225,/-/(inspect|metadata|plugins)(.json)? introspection,9599,simonw,closed,0,,,,,0,2018-04-18T16:14:58Z,2018-04-19T05:25:33Z,2018-04-19T05:25:33Z,OWNER,,"3 pages (and accompanying .json endpoints) for viewing: * the metadata.json that datasette was loaded with * the output of ds.inspect() * a list of installed plugins, detected by pluggy",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/225/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 797159961,MDExOlB1bGxSZXF1ZXN0NTY0MjE1MDEx,225,fix for problem in Table.insert_all on search for columns per chunk of rows,261237,nieuwenhoven,closed,0,,,,,2,2021-01-29T20:16:07Z,2021-02-14T21:04:13Z,2021-02-14T21:04:13Z,NONE,simonw/sqlite-utils/pulls/225,"Hi, I ran into a problem when trying to create a database from my Apple Healthkit data using [healthkit-to-sqlite](https://github.com/dogsheep/healthkit-to-sqlite). The program crashed because of an invalid insert statement that was generated for table `rDistanceCycling`. The actual problem turned out to be in [sqlite-utils](https://github.com/simonw/sqlite-utils). `Table.insert_all` processes the data to be inserted in chunks of rows and checks for every chunk which columns are used, and it will collect all column names in the variable `all_columns`. The collection of columns is done using a nested list comprehension that is not completely correct. I'm using a Windows machine and had to make a few adjustments to the tests in order to be able to run them because they had a posix dependency. Thanks, kind regards, Frans ``` # this is a (condensed) chunk of data from my Apple healthkit export that caused the problem. # the 3 last items in the chunk have additional keys: metadata_HKMetadataKeySyncVersion and metadata_HKMetadataKeySyncIdentifier chunk = [{'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '7.0.1', 'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:7.0.1>', 'unit': 'km', 'creationDate': '2020-10-10 12:29:09 +0100', 'startDate': '2020-10-10 12:29:06 +0100', 'endDate': '2020-10-10 12:29:07 +0100', 'value': '0.00518016'}, {'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '7.0.1', 'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:7.0.1>', 'unit': 'km', 'creationDate': '2020-10-10 12:29:10 +0100', 'startDate': '2020-10-10 12:29:07 +0100', 'endDate': '2020-10-10 12:29:08 +0100', 'value': '0.00544049'}, {'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '6.2.6', 'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:6.2.6>', 'unit': 'km', 'creationDate': '2020-10-14 05:54:12 +0100', 'startDate': '2020-07-15 16:40:50 +0100', 'endDate': '2020-07-15 16:42:49 +0100', 'value': '0.952092', 'metadata_HKMetadataKeySyncVersion': '1', 'metadata_HKMetadataKeySyncIdentifier': '3:674DBCDB-3FE8-40D1-9FC1-E54A2B413805:616520450.99823:616520569.99360:119'}, {'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '6.2.6', 'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:6.2.6>', 'unit': 'km', 'creationDate': '2020-10-14 05:54:12 +0100', 'startDate': '2020-07-15 16:42:49 +0100', 'endDate': '2020-07-15 16:44:51 +0100', 'value': '0.848983', 'metadata_HKMetadataKeySyncVersion': '1', 'metadata_HKMetadataKeySyncIdentifier': '3:674DBCDB-3FE8-40D1-9FC1-E54A2B413805:616520569.99360:616520691.98826:119'}, {'sourceName': 'AppleÂ\xa0Watch van Frans', 'sourceVersion': '6.2.6', 'device': '<, name:Apple Watch, manufacturer:Apple Inc., model:Watch, hardware:Watch3,4, software:6.2.6>', 'unit': 'km', 'creationDate': '2020-10-14 05:54:12 +0100', 'startDate': '2020-07-15 16:44:51 +0100', 'endDate': '2020-07-15 16:46:50 +0100', 'value': '0.834403', 'metadata_HKMetadataKeySyncVersion': '1', 'metadata_HKMetadataKeySyncIdentifier': '3:674DBCDB-3FE8-40D1-9FC1-E54A2B413805:616520691.98826:616520810.98305:119'}] def all_columns_old(): all_columns = [col for col in chunk[0]] all_columns += [column for record in chunk for column in record if column not in all_columns] return all_columns def all_columns_new(): all_columns = [col for col in chunk[0]] for record in chunk: all_columns += [column for column in record if column not in all_columns] return all_columns if __name__ == '__main__': from pprint import pprint print('problem: ') pprint(all_columns_old()) print('\nfix: ') pprint(all_columns_new()) ``` ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/225/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 315738696,MDU6SXNzdWUzMTU3Mzg2OTY=,226,Unit tests for installable plugins,9599,simonw,closed,0,,,,,2,2018-04-19T06:05:32Z,2020-11-24T19:52:51Z,2020-11-24T19:52:46Z,OWNER,,"I'd like more thorough unit test coverage of the plugins mechanism - in particular for installable plugins. I think I can do this while still having the code live in the same repo, by creating a subdirectory in tests/example_plugin with its own setup.py and then running `python setup.py install` as part of the test runner. I imagine I will need to bump the version number every time I change the plugin in case someone runs the test again in the same virtual environment. If that doesn't work I can instead ship a datasette-plugins-tests two to PyPI and add that as a tests_require dependency. Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/226/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 802583450,MDU6SXNzdWU4MDI1ODM0NTA=,226,3.4 release is broken - includes a rogue line,9599,simonw,closed,0,,,,,0,2021-02-06T02:08:01Z,2021-02-06T02:10:26Z,2021-02-06T02:10:26Z,OWNER,,"I started seeing weird errors, caused by this line: https://github.com/simonw/sqlite-utils/blob/f8010ca78fed8c5fca6cde19658ec09fdd468420/sqlite_utils/cli.py#L1-L3 That was added by accident in 1b666f9315d4ea6bb332b2e75e48480c26100199 I'm surprised the tests didn't catch this!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/226/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 315960272,MDU6SXNzdWUzMTU5NjAyNzI=,227,prepare_context() plugin hook,9599,simonw,closed,0,,,,,8,2018-04-19T16:55:26Z,2020-03-24T22:19:54Z,2020-03-24T22:19:54Z,OWNER,,This would be called with the context dictionary before each template is rendered. It would have the opportunity to modify that context.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/227/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 807174161,MDU6SXNzdWU4MDcxNzQxNjE=,227,Error reading csv files with large column data,295329,camallen,closed,0,,,,,4,2021-02-12T11:51:47Z,2021-02-16T11:48:03Z,2021-02-14T21:17:19Z,NONE,,"*Feel free to close this issue - I mostly added it for reference for future folks that run into this :)* I have a CSV file with one column that has very long strings. When i try to import this file via the `insert` command I get the following error: ``` sqlite-utils insert database.db table_name file_with_large_column.csv Traceback (most recent call last): File ""/usr/local/bin/sqlite-utils"", line 10, in sys.exit(cli()) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/usr/local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 774, in insert default=default, File ""/usr/local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 705, in insert_upsert_implementation docs, pk=pk, batch_size=batch_size, alter=alter, **extra_kwargs File ""/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1852, in insert_all first_record = next(records) File ""/usr/local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 703, in docs = (decode_base64_values(doc) for doc in docs) File ""/usr/local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 681, in docs = (dict(zip(headers, row)) for row in reader) _csv.Error: field larger than field limit (131072) ``` Built with the docker image `datasetteproject/datasette:0.54` with the following versions: ``` # sqlite-utils --version sqlite-utils, version 3.4.1 # datasette --version datasette, version 0.54 ``` It appears this is a [known issue](https://stackoverflow.com/a/54517228/2761423) reading in csv files in python and [doesn't look to be modifiable](https://github.com/python/cpython/blob/ea46579067fd2d4e164d6605719ffec690c4d621/Modules/_csv.c#L1685) through system / env vars (i may be very wrong on this). Noting that using sqlite3 `import` command work without error (not using the python csv reader) ``` sqlite3 database.db sqlite> .mode csv sqlite> .import file_with_large_column.csv table_name ``` Sadly I couldn't see an easy way around this while using the cli as it appears this value needs to be changed in python code. FWIW I've switched to using https://datasette.io/tools/csvs-to-sqlite for importing csv data and it's working well. Finally, I'm loving https://datasette.io/ thank you very much for an amazing tool and data ecosytem 🙇‍♀️ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/227/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316031566,MDU6SXNzdWUzMTYwMzE1NjY=,228,"If spatialite detected, mark idx_XXX_Geometry tables as hidden",9599,simonw,closed,0,,,,,1,2018-04-19T20:37:24Z,2018-04-26T03:25:39Z,2018-04-26T03:25:39Z,OWNER,,"https://timezones-api.now.sh/timezones-faf26d0 ![2018-04-19 at 1 36 pm](https://user-images.githubusercontent.com/9599/39016906-a5acbb3e-43d6-11e8-9a31-814ff1d0022e.png) Need to update this logic: https://github.com/simonw/datasette/blob/e2750c7cc0585adaa8c866be611089e62961ee35/datasette/app.py#L1276-L1288",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/228/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 807437089,MDU6SXNzdWU4MDc0MzcwODk=,228,--no-headers option for CSV and TSV,9599,simonw,closed,0,,,,,10,2021-02-12T17:56:51Z,2021-12-26T07:01:31Z,2021-02-14T22:25:17Z,OWNER,,"https://bl.iro.bl.uk/work/ns/3037474a-761c-456d-a00c-9ef3c6773f4c has a fascinating CSV file that doesn't have a header row - it starts like this: ```csv Computation and measurement of turbulent flow through idealized turbine blade passages,,""Loizou, Panos A."",https://isni.org/isni/0000000136122593,,University of Manchester,https://isni.org/isni/0000000121662407,1989,Thesis (Ph.D.),,Physical Sciences,,,https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.232781, ""Prolactin and growth hormone secretion in normal, hyperprolactinaemic and acromegalic man"",,""Prescott, R. W. G."",https://isni.org/isni/0000000134992122,,University of Newcastle upon Tyne,https://isni.org/isni/0000000104627212,1983,Thesis (Ph.D.),,Biological Sciences,,,https://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.232784, ``` It would be useful if `sqlite-utils insert ... --csv` had a mechanism for importing files like this one.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/228/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316123256,MDU6SXNzdWUzMTYxMjMyNTY=,229,Table view should support ?_size=400 parameter,9599,simonw,closed,0,,,,,1,2018-04-20T04:23:18Z,2018-04-26T04:49:46Z,2018-04-26T04:48:32Z,OWNER,,Allows callers to request more rows at once. The limit will still be `max_returned_rows` (defaults to 1000).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/229/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 807817197,MDU6SXNzdWU4MDc4MTcxOTc=,229,Hitting `_csv.Error: field larger than field limit (131072)`,631242,frosencrantz,closed,0,,,,,3,2021-02-13T19:52:44Z,2021-02-14T21:33:33Z,2021-02-14T21:33:33Z,NONE,,"I have a csv file where one of the fields is so large it is throwing an exception with this error and stops loading: ``` _csv.Error: field larger than field limit (131072) ``` The stack trace occurs here: https://github.com/simonw/sqlite-utils/blob/3.1/sqlite_utils/cli.py#L633 There is a way to handle this that helps: https://stackoverflow.com/questions/15063936/csv-error-field-larger-than-field-limit-131072 One issue I had with this problem was sqlite-utils only provides limited context as to where the problem line is. There is the progress bar, but that is by percent rather than by line number. It would have been helpful if it could have provided a line number. Also, it would have been useful if it had allowed the loading to continue with later lines. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/229/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316128955,MDU6SXNzdWUzMTYxMjg5NTU=,230,Setting page size AND max returned rows to 1000 doesn't seem to work,9599,simonw,closed,0,,,,,1,2018-04-20T05:05:11Z,2018-04-26T04:04:25Z,2018-04-26T04:04:25Z,OWNER,,"It appears that if the two settings are the same Datasette fails to return any results, probably because of the trick where we try to fetch 1001 rows so we know if there's a next page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/230/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 808008305,MDU6SXNzdWU4MDgwMDgzMDU=,230,--sniff option for sniffing delimiters,9599,simonw,closed,0,,,,,8,2021-02-14T17:43:54Z,2021-02-14T21:15:33Z,2021-02-14T19:24:32Z,OWNER,,"> I just spotted that `csv.Sniffer` in the Python standard library has a `.has_header(sample)` method which detects if the first row appears to be a header or not, which is interesting. https://docs.python.org/3/library/csv.html#csv.Sniffer _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/228#issuecomment-778812050_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/230/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316323336,MDU6SXNzdWUzMTYzMjMzMzY=,231,metadata.json support for plugin configuration options,9599,simonw,closed,0,,,,,4,2018-04-20T15:58:47Z,2019-05-13T18:56:21Z,2019-05-13T18:56:21Z,OWNER,,"My [datasette-cluster-map](https://github.com/simonw/datasette-cluster-map) plugin currently works by detecting `latitude` and `longitude` columns. I'd like to be able to configure it to look for different column names. One way to do this could be to support optional plugin configuration as part of `metadata.json`. Something like this: { ""title"": ""Polar Bear Ear Tags, 2009-2011"", ""source"": ""USGS Alaska Science Center, Polar Bear Research Program"", ""source_url"": ""https://alaska.usgs.gov/products/data.php?dataid=130"", ""plugins"": { ""datasette_cluster_map"": { ""latitude_columns"": [ ""latitude"", ""Capture Latitude"" ], ""longitude_columns"": [ ""longitude"", ""Capture Longitude"" ] } } } These settings should be supported at the root level or at the individual database or table level. They could also be exposed in the https://datasette-cluster-map-demo.now.sh/-/plugins debug tool. Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/231/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 808028757,MDU6SXNzdWU4MDgwMjg3NTc=,231,"limit=X, offset=Y parameters for more Python methods",9599,simonw,closed,0,,,,,2,2021-02-14T19:31:23Z,2021-02-14T20:03:08Z,2021-02-14T20:03:08Z,OWNER,,"> I'm going to add a `offset=` parameter to support this case. Thanks for the suggestion! _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/224#issuecomment-778828495_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/231/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316365426,MDExOlB1bGxSZXF1ZXN0MTgzMTM1NjA0,232,Fix a typo,45281,lsb,closed,0,,,,,1,2018-04-20T18:20:04Z,2018-04-21T00:19:08Z,2018-04-21T00:19:08Z,CONTRIBUTOR,simonw/datasette/pulls/232,It looks like this was the only instance of it: https://github.com/simonw/datasette/search?utf8=%E2%9C%93&q=SOLite&type=,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/232/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 808036774,MDU6SXNzdWU4MDgwMzY3NzQ=,232,Run tests against Windows in GitHub Actions,9599,simonw,closed,0,,,,,0,2021-02-14T20:09:45Z,2021-02-14T20:39:55Z,2021-02-14T20:39:55Z,OWNER,,"> I'm going to try and get the test suite to run in Windows on GitHub Actions. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/225#issuecomment-778834504_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/232/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316444720,MDU6SXNzdWUzMTY0NDQ3MjA=,233,Option to expose expanded foreign keys in JSON/CSV,9599,simonw,closed,0,,,,,11,2018-04-21T00:18:25Z,2018-06-16T22:26:21Z,2018-06-16T22:20:14Z,OWNER,,"https://datasette-cluster-map-demo.datasettes.com/sf-trees-02c8ef1/Street_Tree_List?qCareAssistant=1 ![f36b87c0-478e-4d55-9a5f-ad37df0b47cb](https://user-images.githubusercontent.com/9599/39078411-bb3e4f88-44be-11e8-9d0c-d22324793c77.png) It would be nice if the info bubbles there could expose more than just the IDs, and if the title showed the expanded name of the selected qCareAssistant.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/233/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 808037010,MDExOlB1bGxSZXF1ZXN0NTczMTQ3MTY4,233,"Run tests against Ubuntu, macOS and Windows",9599,simonw,closed,0,,,,,0,2021-02-14T20:11:02Z,2021-02-14T20:39:54Z,2021-02-14T20:39:54Z,OWNER,simonw/sqlite-utils/pulls/233,Refs #232,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/233/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 316526433,MDU6SXNzdWUzMTY1MjY0MzM=,234,label_column option in metadata.json,9599,simonw,closed,0,,,,,3,2018-04-21T21:19:08Z,2018-04-22T20:47:12Z,2018-04-22T20:47:12Z,OWNER,,"Currently the column used for displaying a foreign key relationship is automatically detected by `inspect()` by looking for tables that have a primary key column and one other column. This doesn't work for tables with more than two columns. Let's allow the table section in `metadata.json` to optionally define a `label_column` which, if present, will be used for those displays.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/234/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 808046597,MDU6SXNzdWU4MDgwNDY1OTc=,234,.insert_all() fails if subsequent chunks contain additional columns,9599,simonw,closed,0,,,,,1,2021-02-14T21:01:51Z,2021-02-14T21:03:40Z,2021-02-14T21:03:40Z,OWNER,,Reported by @nieuwenhoven in #225 along with a proposed fix.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/234/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316621102,MDU6SXNzdWUzMTY2MjExMDI=,235,Add limit on the size in KB of data returned from a single query,9599,simonw,open,0,,,,,2,2018-04-22T23:01:15Z,2018-04-24T00:30:02Z,,OWNER,,"Datasette limits the number of rows returned to 1,000 and limits the time spent executing a SQL query to 1000ms - and both of these limits can be customized. It does not have a limit on the size of the response returned. It's possible to compose maliciously large SQL responses in a small number of rows using mechanisms like the `group_concat()` aggregate function. It would be good to avoid malicious SQL creating 100MB+ responses and potentially crashing the server. I think the easiest place to implement that is here: https://github.com/simonw/datasette/blob/f3f42957128c1e7ece584d45d9167f2ac003a3b8/datasette/app.py#L175-L190 Currently we use `cursor.fetchmany()` to fetch up to 1,001 rows at once. Instead, we could switch to iterating through `cursor.fetchone()` (or just using `for row in cursor`) and keeping a running tally of the size of the response as we go - maybe just using `rough_response_size += len(str(row))`. If that goes above a certain threshold we can terminate the response with an error, like we do with timelimits. The bigger challenge here is understanding how well this approach works and what impact it will have on overall Datasette performance. I think I need #33 for this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/235/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 810618495,MDU6SXNzdWU4MTA2MTg0OTU=,235,Extract columns cannot create foreign key relation: sqlite3.OperationalError: table sqlite_master may not be modified,6913891,kristomi,closed,0,,,,,18,2021-02-17T23:33:23Z,2023-06-26T01:47:01Z,2023-06-25T23:25:53Z,NONE,,"Thanks for what seems like a truly great suite of libraries. I wanted to try out Datasette, but never got more than half way through your YouTube video with the SF tree dataset. Whenever I try to extract a column, I get a `sqlite3.OperationalError: table sqlite_master may not be modified` error from Python. This snippet reproduces the error on my system, Python 3.9.1 and sqlite-utils 3.5 on an M1 Macbook Pro running in rosetta mode: ``` curl ""https://data.nasa.gov/resource/y77d-th95.json"" | \ sqlite-utils insert meteorites.db meteorites - --pk=id sqlite-utils extract meteorites.db meteorites recclass ``` I have tried googling the problem, but all I've found is that this *might* be a problem with the sqlite3 database running in defensive mode, but I definitely can't know for sure. Does the problem seem familiar to you? ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/235/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 317001500,MDU6SXNzdWUzMTcwMDE1MDA=,236,datasette publish lambda plugin,9599,simonw,open,0,,,,,11,2018-04-23T22:10:30Z,2023-03-12T14:04:15Z,,OWNER,,"Refs #217 - create a publish plugin that can deploy to AWS Lambda. https://docs.aws.amazon.com/lambda/latest/dg/limits.html says lambda packages can be up to 50 MB, so this would only work with smaller databases (the command can check the filesize before attempting to package and deploy it). Lambdas do get a 512 MB `/tmp` directory too, so for larger databases the function could start and then download up to 512MB from an S3 bucket - so the plugin could take an optional S3 bucket to write to and know how to upload the `.db` file there and then have the lambda download it on startup.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/236/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 811680502,MDU6SXNzdWU4MTE2ODA1MDI=,236,--attach command line option for attaching extra databases,9599,simonw,closed,0,,,,,1,2021-02-19T04:38:30Z,2021-02-19T05:10:41Z,2021-02-19T05:08:43Z,OWNER,,"This will enable cross-database joins, as seen in https://github.com/simonw/datasette/issues/283 Also refs #113",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/236/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 317475156,MDU6SXNzdWUzMTc0NzUxNTY=,237,Support for ?_search_colname=blah searches,9599,simonw,closed,0,,,,,2,2018-04-25T04:29:53Z,2018-05-05T22:56:42Z,2018-05-05T22:33:23Z,OWNER,,"Right now the `_search=` argument searches across all fields in a full-text index, for example: https://san-francisco.datasettes.com/sf-film-locations-84594a7/Film_Locations_in_San_Francisco?_search=justin SQLite FTS also supports searches within a specified field, for example: https://san-francisco.datasettes.com/sf-film-locations-84594a7?sql=select+rowid%2C+*+from+Film_Locations_in_San_Francisco+where+rowid+in+%28select+rowid+from+%5BFilm_Locations_in_San_Francisco_fts%5D+where+%5BLocations%5D+match+%3Asearch%29+order+by+rowid+limit+101&search=justin ``` select rowid, * from Film_Locations_in_San_Francisco where rowid in ( select rowid from [Film_Locations_in_San_Francisco_fts] where [Locations] match :search ) order by rowid limit 101 ``` The `_search=` parameter could be extended to support this using `_search_colname=`. This should also be able to support columns with spaces and special characters in their names, something like this: `_search_Column%20With%20Spaces=foo` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/237/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 815554385,MDU6SXNzdWU4MTU1NTQzODU=,237,"db[""my_table""].drop(ignore=True) parameter, plus sqlite-utils drop-table --ignore and drop-view --ignore",649467,mhalle,closed,0,,,,,3,2021-02-24T14:55:06Z,2021-02-25T17:11:41Z,2021-02-25T17:11:41Z,NONE,,"When I'm generating a derived table in python, I often drop the table and create it from scratch. However, the first time I generate the table, it doesn't exist, so the drop raises an exception. That means more boilerplate. I was going to submit a pull request that adds an ""if_exists"" option to the `drop` method of tables and views. However, for a utility like sqlite_utils, perhaps the ""IF EXISTS"" SQL semantics is what you want most of the time, and thus should be the default. What do you think?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/237/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 317714268,MDU6SXNzdWUzMTc3MTQyNjg=,238,External metadata.json,9599,simonw,closed,0,,,,,3,2018-04-25T17:02:30Z,2019-06-24T06:52:55Z,2019-06-24T06:52:45Z,OWNER,,"A frustration I'm having with https://register-of-members-interests.datasettes.com/ is that I keep coming up with new canned queries but I don't want to redeploy the whole thing just to add them to `metadata.json` Maybe Datasette could optionally take a `--metadata-url` option which causes it to load from a URL instead and occasionally check for updates.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/238/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 816523763,MDU6SXNzdWU4MTY1MjM3NjM=,238,.add_foreign_key() corrupts database if column contains a space,9599,simonw,closed,0,,,,,1,2021-02-25T15:07:20Z,2021-02-25T16:54:02Z,2021-02-25T16:54:02Z,OWNER,,"I ran this: db[""Reports""].add_foreign_key(""Reported by ID"", ""Reporters"", ""id"") And got this: ``` ~/jupyter-venv/lib/python3.9/site-packages/sqlite_utils/db.py in add_foreign_keys(self, foreign_keys) 616 # Have to VACUUM outside the transaction to ensure .foreign_keys property 617 # can see the newly created foreign key. --> 618 self.vacuum() 619 620 def index_foreign_keys(self): ~/jupyter-venv/lib/python3.9/site-packages/sqlite_utils/db.py in vacuum(self) 629 630 def vacuum(self): --> 631 self.execute(""VACUUM;"") 632 633 ~/jupyter-venv/lib/python3.9/site-packages/sqlite_utils/db.py in execute(self, sql, parameters) 234 return self.conn.execute(sql, parameters) 235 else: --> 236 return self.conn.execute(sql) 237 238 def executescript(self, sql): DatabaseError: database disk image is malformed ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/238/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 317760361,MDU6SXNzdWUzMTc3NjAzNjE=,239,Support for hidden tables in metadata.json,9599,simonw,closed,0,,,,,2,2018-04-25T19:21:17Z,2018-04-26T03:45:12Z,2018-04-26T03:43:10Z,OWNER,,"Since we already have a hidden feature, let's expose it more to our users ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/239/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 816526538,MDU6SXNzdWU4MTY1MjY1Mzg=,239,sqlite-utils extract could handle nested objects,9599,simonw,open,0,,,,,16,2021-02-25T15:10:28Z,2022-09-03T23:46:02Z,,OWNER,,"Imagine a table (imported from a nested JSON file) where one of the columns contains values that look like this: {""email"": ""anonymous@noreply.airtable.com"", ""id"": ""usrROSHARE0000000"", ""name"": ""Anonymous""} The `sqlite-utils extract` command already uses single text values in a column to populate a new table. It would not be much of a stretch for it to be able to use JSON instead, including specifying which of those values should be used as the primary key in the new table.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/239/reactions"", ""total_count"": 6, ""+1"": 5, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 317900587,MDU6SXNzdWUzMTc5MDA1ODc=,240,FTS table detection should be part of .inspect(),9599,simonw,closed,0,,,,,0,2018-04-26T06:58:10Z,2018-04-29T00:04:44Z,2018-04-29T00:04:44Z,OWNER,,"The code that detects if specific tables have a corresponding FTS column is currently called from TableView - it should instead be handled as part of `.inspect()`. This will make it easier to build other features that need to behave differently depending on whether a table can be searched, e.g. an autocomplete widget for selecting filters from foreign key tables. Current code: https://github.com/simonw/datasette/blob/f188ceaa2a3a5b2eab83425ad0f00cb0d364e24a/datasette/app.py#L728-L733",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/240/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 816560819,MDU6SXNzdWU4MTY1NjA4MTk=,240,table.pks_and_rows_where() method returning primary keys along with the rows,9599,simonw,closed,0,,,,,7,2021-02-25T15:49:28Z,2021-02-25T16:39:23Z,2021-02-25T16:28:23Z,OWNER,,"*Original title: Easier way to update a row returned from .rows* Here's a surprisingly hard problem I ran into while trying to implement #239 - given a row returned by `db[table].rows` how can you update that row? The problem is that the `db[table].update(...)` method requires a primary key. But if you have a row from the `db[table].rows` iterator it might not even contain the primary key - provided the table is a `rowid` table. Instead, currently, you need to introspect the table and, if `rowid` is a primary key, explicitly include that in the `select=` argument to `table.rows_where(...)` - otherwise it will not be returned. A utility mechanism to make this easier would be very welcome.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/240/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 318490133,MDU6SXNzdWUzMTg0OTAxMzM=,241,Default datasette logging format should be JSON,9599,simonw,open,0,,,,,0,2018-04-27T17:32:48Z,2018-07-10T17:45:40Z,,OWNER,,"Structured logs are better. Datasette should default to outputting it's HTTP access log lines as newline delimited JSON instead of the Sanic default format it uses at the moment. For improved greppability these logs should have keys ordered in a consistent way. Python's JSON module can do this with ordered dictionaries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/241/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 816601354,MDExOlB1bGxSZXF1ZXN0NTgwMjM1NDI3,241,Extract expand - work in progress,9599,simonw,open,0,,,,,0,2021-02-25T16:36:38Z,2021-02-25T16:36:38Z,,OWNER,simonw/sqlite-utils/pulls/241,Refs #239. Still needs documentation and CLI implementation.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/241/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 318692953,MDU6SXNzdWUzMTg2OTI5NTM=,242,Rename ?_sql_time_limit_ms= to ?_timelimit=,9599,simonw,closed,0,,,,,0,2018-04-29T06:11:35Z,2018-05-02T00:20:42Z,2018-05-02T00:20:42Z,OWNER,,It's a bit of a mouthful at the moment.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/242/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 817989436,MDU6SXNzdWU4MTc5ODk0MzY=,242,Async support,25778,eyeseast,open,0,,,,,13,2021-02-27T18:29:38Z,2021-10-28T14:37:56Z,,CONTRIBUTOR,,"Following our conversation last week, want to note this here before I forget. I've had a couple situations where I'd like to do a bunch of updates in an async event loop, but I run into SQLite's issues with concurrent writes. This feels like something sqlite-utils could help with. PeeWee ORM has a [SQLite write queue](http://docs.peewee-orm.com/en/latest/peewee/playhouse.html#sqliteq) that might be a good model. It's using threads or gevent, but I _think_ that approach would translate well enough to asyncio. Happy to help with this, too.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/242/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 318737808,MDU6SXNzdWUzMTg3Mzc4MDg=,243,--spatialite option for datasette publish commands,9599,simonw,closed,0,,,,,2,2018-04-29T18:19:32Z,2018-05-31T14:17:53Z,2018-05-31T14:17:53Z,OWNER,,Performs the necessary incantations to install Spatialite on Zeit Now or Heroku and sets the corresponding environment variable to ensure the module is correctly loaded by datasette serve.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/243/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 818684978,MDU6SXNzdWU4MTg2ODQ5Nzg=,243,How can i use this utils to deal with fts on column meta of tables ?,27874014,svjack,open,0,,,,,0,2021-03-01T09:45:05Z,2021-03-01T09:45:05Z,,NONE,,"Thank you to release this bravo project. When i use this project on multi table db, I want to implement convenient search on column name from different tables. I want to develop a meta table to save the meta data of different columns of different tables and search on this meta table to get rows from the data table (which the meta table describes) does this project provide some simple function on it ? You can think a have a knowledge graph about the table in the db, and i save this knowledge graph into the db with fts enabled.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/243/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 318738000,MDU6SXNzdWUzMTg3MzgwMDA=,244,/-/versions page,9599,simonw,closed,0,,,,,1,2018-04-29T18:22:15Z,2018-05-03T14:13:49Z,2018-05-03T14:09:53Z,OWNER,,"Displays the current version of: * datasette * Python * SQLite * Spatialite (if available) Installed plugin versions should be shown on /-/plugins",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/244/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 820468864,MDExOlB1bGxSZXF1ZXN0NTgzNDA3OTg5,244,Typo in upsert example,387669,j-e-d,closed,0,,,,,1,2021-03-02T23:14:14Z,2021-05-19T02:58:21Z,2021-05-19T02:58:21Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/244,Remove extra `[`,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/244/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 319358200,MDU6SXNzdWUzMTkzNTgyMDA=,245,?_shape=array option,9599,simonw,closed,0,,,,,1,2018-05-01T23:11:07Z,2018-05-03T14:14:33Z,2018-05-02T00:12:20Z,OWNER,,"Some tools (`pandas.DataFrame(...)` for example) are happiest when you give them a raw array of JSON objects. `?_shape=array` should do just that While I'm at it, rename the default `?_shape=lists` to instead be called `?shape=arrays` And validate that `_shape` is a valid option And have `?_shape=object` return the object at the root level rather than nested in `.rows` to better match the behavior of `?_shape=array`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/245/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 830803173,MDExOlB1bGxSZXF1ZXN0NTkyMjg5MzI0,245,Correct some typos,1076745,dbready,closed,0,,,,,1,2021-03-13T04:26:56Z,2021-05-19T02:58:04Z,2021-05-19T02:58:04Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/245,Noticed a typo in the docs and followed that up with a spellcheck. Had to bite my tongue at some of the British spellings.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/245/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 319371036,MDExOlB1bGxSZXF1ZXN0MTg1MzA3NDA3,246,?_shape=array and _timelimit=,9599,simonw,closed,0,,,,,0,2018-05-02T00:18:54Z,2018-05-02T00:20:41Z,2018-05-02T00:20:40Z,OWNER,simonw/datasette/pulls/246,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/246/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 831751367,MDU6SXNzdWU4MzE3NTEzNjc=,246,Escaping FTS search strings,16001974,DeNeutoy,closed,0,,,,,4,2021-03-15T12:15:09Z,2021-08-18T18:57:13Z,2021-08-18T18:43:12Z,CONTRIBUTOR,," Thanks for the excellent library, it's very nice to use! I've been building some in memory search functionality for a data annotation tool i'm making, and I got tripped up a little bit with escaping the full text search queries. First I tried using `db.quote(q)`, which doesn't work, because sqlite FTS has it's own (separate)[ query syntax](https://www2.sqlite.org/fts5.html#full_text_query_syntax). You can see this happening here also: http://search-24ways.herokuapp.com/24ways-f8f455f/articles?_search=acces%2A I got around this by aggressively escaping quotes inside the query string like this: ```python quoted = q.replace('""', '""""') quoted = f'""{quoted}""' print(quoted) results = db[""data""].search(quoted, columns=[""id""]) return [x[""id""] for x in results] ``` This works in the sense it doesn't crash, but it also removes access to the search query syntax. Given the well specified definition, it might be possible for sqlite-utils to provide a `db.quote_query(q)` which would intelligently escape a query whilst leaving the syntax intact. This would be very nice! ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/246/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 319449852,MDU6SXNzdWUzMTk0NDk4NTI=,247,SQLite code decoupled from Datasette,11912854,jsancho-gpl,open,0,,,,,1,2018-05-02T08:03:28Z,2018-05-21T15:29:31Z,,NONE,,"I'm working on the possibility of use Datasette with other file formats that aren't SQLite, like files with [PyTables](https://github.com/PyTables/PyTables) format. In order to accomplish that, I've started [a fork for decoupling the code related with SQLite](https://github.com/jsancho-gpl/datasette/tree/feature/db-type-plugin) and putting it in an external connector to allow future connectors for a lot of file formats. It'd be nice if you could look at it and suggest improvements for a possible PR.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/247/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 832687563,MDExOlB1bGxSZXF1ZXN0NTkzODA1ODA0,247,FTS quote functionality from datasette,16001974,DeNeutoy,closed,0,,,,,2,2021-03-16T11:17:34Z,2021-08-18T18:43:12Z,2021-08-18T18:43:12Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/247,"Addresses #246 - this is a bit of a kludge because it doesn't actually *validate* the FTS string, just makes sure that it will not crash when executed, but I figured that building a query parser is a bit out of the scope of sqlite-utils and if you actually want to use the query language, you probably need to parse that yourself. ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/247/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 319954545,MDU6SXNzdWUzMTk5NTQ1NDU=,248,/-/plugins should show version of each installed plugin,9599,simonw,closed,0,,,,,2,2018-05-03T14:50:45Z,2018-05-04T18:25:40Z,2018-05-04T18:05:04Z,OWNER,,"Refs #244 https://stackoverflow.com/questions/20180543/how-to-check-version-of-python-modules ``` >>> import pkg_resources >>> pkg_resources.get_distribution('datasette_cluster_map').version '0.4' ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/248/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 836829560,MDU6SXNzdWU4MzY4Mjk1NjA=,248,support for Apache Arrow / parquet files I/O,649467,mhalle,open,0,,,,,1,2021-03-20T14:59:30Z,2021-10-28T23:46:48Z,,NONE,,"I just started looking at Apache Arrow using pyarrow for import and export of tabular datasets, and it looks quite compelling. It might be worth looking at for sqlite-utils and/or datasette. As a test, I took a random jsonl data dump of a dataset I have with floats, strings, and ints and converted it to arrow's parquet format using the naive `pyarrow.parquet.write_file()` command, which has automatic type inferrence. It compressed down to 7% of the original size. Conversion of a 26MB JSON file and serializing it to parquet was eyeblink instantaneous. Parquet files are portable and can be directly imported into pandas and other analytics software. The only hangup is the automatic type inference of the naive reader. It's great for general laziness and for parsing JSON columns (it correctly interpreted a table of mine with a JSON array). However, I did get an exception for a string column where most entries looked integer-like but had a couple values that weren't -- the reader tried to coerce all of them for some reason, even though the JSON type is string. Since the writer optionally takes a schema, it shouldn't be too hard to grab the sqlite header types. With some additional hinting, you might get datetime columns and JSON, which are native Arrow types. Somewhat tangentially, someone even wrote an sqlite vfs extension for Parquet: https://cldellow.com/2018/06/22/sqlite-parquet-vtable.html ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/248/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 320090329,MDU6SXNzdWUzMjAwOTAzMjk=,249,?_size=max argument ,9599,simonw,closed,0,,,,,1,2018-05-03T21:42:04Z,2018-05-04T18:26:30Z,2018-05-04T18:05:04Z,OWNER,,"For plugins that want to load the most data allowable, having `?_size=max` would be useful.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/249/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 836963850,MDU6SXNzdWU4MzY5NjM4NTA=,249,Full text search possibly broken?,36287,prabhur,closed,0,,,,,2,2021-03-21T02:03:44Z,2021-03-21T02:43:32Z,2021-03-21T02:43:32Z,NONE,,"I'm not quite sure if this is an issue with sqlite-utils or datasette. **Background** I was previously using sqlite-utils version < 3.6. I have a bunch of csv files that have some data scraped from a website. ``` sqlite-utils create-table mydb.db post \ posted_date text \ url text \ title text \ raw_text text \ --not-null posted_date \ --not-null url \ --pk=url ``` FTS is enabled via `sqlite-utils enable-fts ./mydb.db post title raw_text` Data is loaded to the table via `sqlite-utils insert ./mydb.db post ${filename} --csv` Note that the data contains text in my language Tamil. Loading happens just fine. datasette serves the db file just fine. It recognizes FTS and shows the ""search"" box. However, none of the queries work. Whatever text I supply, it always returns 0 rows. I literally copy paste words from the row listing on the screen and paste it on the search box. Interestingly, only thing I can remember is switching to sqlite-utils 3.6. I had to do this because the prior version had an issue with column size. I have attached one of the csv files that can be loaded to the table. Substitute ""${filename}"" with that file for the sqlite-utils insert command. [posts_20200417-20201231.csv.zip](https://github.com/simonw/sqlite-utils/files/6176697/posts_20200417-20201231.csv.zip) Interestingly, the FTS based search from datasette worked just fine before this version upgrade. That is, the queries returned results. I will try to downgrade just to see if the theory is correct. I appreciate any help here. Thanks. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/249/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 320132682,MDU6SXNzdWUzMjAxMzI2ODI=,250,Setup some issue templates,9599,simonw,open,0,,,,,0,2018-05-04T01:49:07Z,2018-05-04T01:49:07Z,,OWNER,,"https://twitter.com/left_pad/status/99216385740464537 I like the idea of using these to help people understand some of the ways I want to use issues.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/250/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 838148087,MDU6SXNzdWU4MzgxNDgwODc=,250,Handle byte order marks (BOMs) in CSV files,9599,simonw,closed,0,,,,,3,2021-03-22T22:13:18Z,2021-05-29T05:34:21Z,2021-05-29T05:34:21Z,OWNER,,I often find `sqlite-utils insert ... --csv` creates a first column with a weird character at the start of it - which it turns out is the UTF-8 BOM. Fix that.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/250/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 320592643,MDU6SXNzdWUzMjA1OTI2NDM=,251,"Explore ""distinct values for column"" in inspect()",9599,simonw,closed,0,,,,,4,2018-05-06T13:27:24Z,2018-05-14T22:47:55Z,2018-05-14T22:47:55Z,OWNER,,"A lot of datasets have columns which have a small number of possible values in them - this one for example: https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=select+distinct+category+from+%5Binconvenient-sequel%2Fratings%5D%3B Detecting these could be interesting as part of `.inspect()`, since it would allow for various UI enhancements like autocomplete / select box filters for those columns. The problem is detecting them efficiently. `.inspect()` shouldn't spend 5 minutes churning through columns on giant tables trying to determine if they have a small collection of unique values.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/251/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 841377702,MDU6SXNzdWU4NDEzNzc3MDI=,251,"""sqlite-utils convert"" command to replace the separate ""sqlite-transform"" tool",9599,simonw,closed,0,,,,,15,2021-03-25T22:36:36Z,2021-08-02T22:39:46Z,2021-08-02T04:47:40Z,OWNER,,"See https://github.com/simonw/sqlite-transform/issues/11 - I built a separate `sqlite-transform` tool a while ago that uses the word ""transform"" to means something entirely different from `sqlite-utils transform` - I'd like to resolve this by merging the two tools.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/251/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 321624016,MDU6SXNzdWUzMjE2MjQwMTY=,252,/-/versions should report the FTS version supported by SQLite,9599,simonw,closed,0,,,,,0,2018-05-09T15:43:47Z,2018-05-11T13:19:52Z,2018-05-11T13:19:52Z,OWNER,,I can copy this function from `csvs-to-sqlite`: https://github.com/simonw/csvs-to-sqlite/blob/dccbf65b37bc9eed50e9edb80a42f257e93edb1f/csvs_to_sqlite/utils.py#L283-L293,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/252/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 842062949,MDU6SXNzdWU4NDIwNjI5NDk=,252,Support json-line files,279769,rathboma,closed,0,,,,,1,2021-03-26T15:19:39Z,2021-03-26T15:21:38Z,2021-03-26T15:21:38Z,NONE,,"It's common for many processes to create flat files where each line is a JSON object. So the file isn't a json array. Many tools (like jq) support this natively, it'd be great for sqlite-utils to also!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/252/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 321631020,MDU6SXNzdWUzMjE2MzEwMjA=,253,Documentation explaining how to use SQLite FTS with Datasette,9599,simonw,closed,0,,,,,1,2018-05-09T16:02:08Z,2018-05-12T12:09:02Z,2018-05-12T12:06:51Z,OWNER,,"In particular how to work with https://www.sqlite.org/fts3.html#_external_content_fts4_tables_ - which Datasette can automatically detect and use to add a search UI to your page. Examples of basic search setup like this: ``` CREATE VIRTUAL TABLE ""interests_fts"" USING FTS4 (name, content=""interests""); INSERT INTO ""interests_fts"" (rowid, name) SELECT rowid, name FROM interests; ``` And complex join-based search setup like this: ``` CREATE VIRTUAL TABLE ""interests_fts"" USING FTS4 (name, category, member, content=""interests""); INSERT INTO ""interests_fts"" (rowid, name, category, member) SELECT interests.rowid, interests.name, interest_categories.name, members.name FROM interests JOIN interest_categories ON interests.category_id = interest_categories.id JOIN members ON interests.member_id = members.id; ``` Also mention how `csvs-to-sqlite` can be used to do this easily. This will benefit from #252 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/253/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 847423559,MDU6SXNzdWU4NDc0MjM1NTk=,253,fixtures.db example error in sql-utils blog post,192568,mroswell,closed,0,,,,,2,2021-03-31T22:07:36Z,2021-05-19T03:31:48Z,2021-05-19T03:31:47Z,NONE,,"En route to trying to understand column order transform documentation, I tried the instructions here: https://simonwillison.net/2020/Sep/23/sqlite-advanced-alter-table/ I get a malformed database schema syntax error. ``` $ wget https://latest.datasette.io/fixtures.db --2021-03-31 18:00:23-- https://latest.datasette.io/fixtures.db Resolving latest.datasette.io (latest.datasette.io)... 2607:f8b0:4004:801::2013, 142.250.73.211 Connecting to latest.datasette.io (latest.datasette.io)|2607:f8b0:4004:801::2013|:443... connected. HTTP request sent, awaiting response... 200 OK Length: unspecified [application/octet-stream] Saving to: ‘fixtures.db’ fixtures.db [ <=> ] 260.00K --.-KB/s in 0.1s 2021-03-31 18:00:23 (2.41 MB/s) - ‘fixtures.db’ saved [266240] $ sqlite3 fixtures.db '.schema facetable' Error: malformed database schema (generated_columns) - near ""AS"": syntax error $ sqlite3 fixtures.db SQLite version 3.28.0 2019-04-15 14:49:49 Enter "".help"" for usage hints. sqlite> .schema Error: malformed database schema (generated_columns) - near ""AS"": syntax error ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/253/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 322283067,MDU6SXNzdWUzMjIyODMwNjc=,254,Escaping named parameters in canned queries,247131,philroche,closed,0,,,,,4,2018-05-11T12:43:30Z,2020-05-10T14:54:14Z,2020-05-10T14:54:13Z,NONE,,"Thank you very much for this project. I have created some canned queries but some of the filters include a colon eg. ""com.ubuntu.cloud:server:18.04:amd64"". When saved these colons are parsed as named parameters. Is there a way to escape colons in a canned query?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/254/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 857280617,MDExOlB1bGxSZXF1ZXN0NjE0NzI3MDM2,254,Fix incorrect create-table cli description,1935268,robjwells,closed,0,,,,,1,2021-04-13T20:03:15Z,2021-05-19T04:43:46Z,2021-05-19T02:57:26Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/254,The description for `create-table` was duplicated from `create-index`.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/254/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 322477187,MDU6SXNzdWUzMjI0NzcxODc=,255,Facets,9599,simonw,closed,0,,,,,16,2018-05-12T03:00:07Z,2019-05-29T21:39:12Z,2018-05-16T15:32:12Z,OWNER,,"Ability to display facets and facet counts on the table view. Facets can be specified in the URL with `?_facet=column&_facet=othercolumn` or the default facets for a table can be set using a new `""facets"": [...]` property in `metadata.json` - [x] Implement `?_facet=` - [x] Implement `metadata.json` `facets` key - [x] Design for how facets should be presented - [x] Facets should be able to toggle off as well as on - [x] Expand labels for facets that are foreign keys - [x] Suggest potential facets (if we can do so within a tight time limit) - [x] Documentation",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/255/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 858501079,MDU6SXNzdWU4NTg1MDEwNzk=,255,transform --help should tell you the available types,9599,simonw,closed,0,,,,,0,2021-04-15T05:24:48Z,2021-05-29T03:55:52Z,2021-05-29T03:55:52Z,OWNER,,"``` Usage: sqlite-utils transform [OPTIONS] PATH TABLE Transform a table beyond the capabilities of ALTER TABLE Options: --type ... Change column type to X ``` This should specify that the possible types are 'INTEGER', 'TEXT', 'FLOAT', 'BLOB'.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/255/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 322551723,MDU6SXNzdWUzMjI1NTE3MjM=,256,Break up app.py into separate view modules,9599,simonw,closed,0,,,,,1,2018-05-12T23:56:33Z,2018-05-14T03:05:37Z,2018-05-14T03:05:37Z,OWNER,,"`views/table.py` and `views/database.py` and `views/utils.py` as a starting point. Likewise, create `tests/test_views_table.py` and `tests/test_views_database.py` - these will contain both HTML and API test for those views.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/256/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 861622839,MDU6SXNzdWU4NjE2MjI4Mzk=,256,inserting with --nl errors with: sqlite3.OperationalError: table
    has no column named ,279769,rathboma,closed,0,,,,,2,2021-04-19T18:01:03Z,2021-05-19T03:26:54Z,2021-05-19T03:26:54Z,NONE,,"I have a `jsonl` file, it is 10,000 lines long. Inserting from the cli with `sqlite-utils insert db table file --nl --batch-size 10000` fails with this missing column error, even though I'm telling it to use the whole file in the first batch. This seems similar to #18 and #139, but maybe it's unique to `--nl`?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/256/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 322591993,MDExOlB1bGxSZXF1ZXN0MTg3NjY4ODkw,257,Refactor views,9599,simonw,closed,0,,,,,5,2018-05-13T13:00:50Z,2018-05-14T03:04:25Z,2018-05-14T03:04:24Z,OWNER,simonw/datasette/pulls/257,"* Split out view classes from main `app.py` * Run [black](https://github.com/ambv/black) against resulting code to apply opinionated source code formatting * Run [isort](https://github.com/timothycrosley/isort) to re-order my imports Refs #256 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/257/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 868188068,MDU6SXNzdWU4NjgxODgwNjg=,257,"Insert from JSON containing strings with non-ascii characters are escaped as unicode for lists, tuples, dicts.",6586811,dylan-wu,closed,0,,,,,0,2021-04-26T20:46:25Z,2021-05-19T02:57:05Z,2021-05-19T02:57:05Z,CONTRIBUTOR,,"JSON Test File (test.json): ```json [ { ""id"": 123, ""text"": ""FR Théâtre"" }, { ""id"": 223, ""text"": [ ""FR Théâtre"" ] } ] ``` Command to import: ```bash sqlite-utils insert test.db text test.json --pk=id ``` Resulting table view from datasette: ![image](https://user-images.githubusercontent.com/6586811/116147833-cdf2fb00-a6a5-11eb-8412-0aae81b6e6dd.png) Original, db.py line 2225: ```python return json.dumps(value, default=repr) ``` Fix, db.py line 2225: ```python return json.dumps(value, default=repr, ensure_ascii=False) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/257/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 322741659,MDExOlB1bGxSZXF1ZXN0MTg3NzcwMzQ1,258,Add new metadata key persistent_urls which removes the hash from all database urls,247131,philroche,closed,0,,,,,3,2018-05-14T09:39:18Z,2018-05-21T07:38:15Z,2018-05-21T07:38:15Z,NONE,simonw/datasette/pulls/258,"Add new metadata key ""persistent_urls"" which removes the hash from all database urls when set to ""true"" This PR is just to gauge if this, or something like it, is something you would consider merging? I understand the reason why the substring of the hash is included in the url but there are some use cases where the urls should persist across deployments. For bookmarks for example or for scripts that use the JSON API. This is the initial commit for this feature. Tests and documentation updates to follow.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/258/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 868191959,MDExOlB1bGxSZXF1ZXN0NjIzNzU1NzIz,258,Fixing insert from JSON containing strings with non-ascii characters …,6586811,dylan-wu,closed,0,,,,,1,2021-04-26T20:50:00Z,2021-05-19T02:47:44Z,2021-05-19T02:47:44Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/258,"…are escaped aps unicode for lists, tuples, dicts Fix of #257 ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/258/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 322787470,MDU6SXNzdWUzMjI3ODc0NzA=,259,inspect() should detect many-to-many relationships,9599,simonw,closed,0,,,,,6,2018-05-14T12:03:58Z,2019-05-23T03:55:37Z,2019-05-23T03:55:37Z,OWNER,,"Relates to #255 - in particular supporting facets across M2M relationships. It should be possible for `.inspect()` to notice when a table has two foreign keys to two different tables, and assume that this means there is a M2M relationship between those tables. When rendering a table with a m2m relationship we could display the first X associated records as a comma separated list of hyperlinks in a new column on the table view, with a column name derived from the table on the other side. Since SQLite doesn't have RANK or an equivalent of https://www.xaprb.com/blog/2006/12/02/how-to-number-rows-in-mysql/ this would be implemented as N+1 queries (one query per cell that we want to display an m2m summary). This should be OK in SQLite: https://sqlite.org/np1queryprob.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/259/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 894948100,MDU6SXNzdWU4OTQ5NDgxMDA=,259,Suggest the --alter option if a new column cannot be added,9599,simonw,closed,0,,,,,1,2021-05-19T03:17:38Z,2021-05-19T03:27:33Z,2021-05-19T03:26:26Z,OWNER,,Refs #256.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/259/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323223872,MDU6SXNzdWUzMjMyMjM4NzI=,260,Validate metadata.json on startup,9599,simonw,open,0,,,,,7,2018-05-15T13:42:56Z,2023-06-21T12:51:22Z,,OWNER,,"It's easy to misspell the name of a database or table and then be puzzled when the metadata settings silently fail. To avoid this, let's sanity check the provided metadata.json on startup and quit with a useful error message if we find any obvious mistakes.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/260/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 906330187,MDU6SXNzdWU5MDYzMzAxODc=,260,Support creating descending order indexes,9599,simonw,closed,0,,,,,12,2021-05-29T03:42:59Z,2021-05-29T05:01:39Z,2021-05-29T05:01:39Z,OWNER,,"SQLite lets you create indexes in reverse order, which can have a surprisingly big impact on performance, see https://github.com/simonw/covid-19-datasette/issues/27 I tried doing this using `sqlite-utils` like so, but it's didn't work: ```python db[""ny_times_us_counties""].create_index([""date desc""]) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/260/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323459939,MDExOlB1bGxSZXF1ZXN0MTg4MzEyNDEx,261,Facets improvements plus suggested facets,9599,simonw,closed,0,,,,,0,2018-05-16T03:52:39Z,2018-05-16T15:27:26Z,2018-05-16T15:27:25Z,OWNER,simonw/datasette/pulls/261,Refs #255,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/261/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 906345899,MDU6SXNzdWU5MDYzNDU4OTk=,261,`table.xindexes` using `PRAGMA index_xinfo(table)`,9599,simonw,closed,0,,,,,5,2021-05-29T04:23:48Z,2021-06-03T03:54:14Z,2021-06-03T03:51:32Z,OWNER,,"> `PRAGMA index_xinfo(table)` DOES return that data: > ``` > (Pdb) [c[0] for c in fresh_db.execute(""PRAGMA > index_xinfo('idx_dogs_age_name')"").description] > ['seqno', 'cid', 'name', 'desc', 'coll', 'key'] > (Pdb) fresh_db.execute(""PRAGMA index_xinfo('idx_dogs_age_name')"").fetchall() > [(0, 2, 'age', 1, 'BINARY', 1), (1, 0, 'name', 0, 'BINARY', 1), (2, -1, None, 0, 'BINARY', 0)] > ``` > See https://sqlite.org/pragma.html#pragma_index_xinfo > > Example output: https://covid-19.datasettes.com/covid?sql=select+*+from+pragma_index_xinfo%28%27idx_ny_times_us_counties_date%27%29 _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/260#issuecomment-850766552_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/261/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323658641,MDU6SXNzdWUzMjM2NTg2NDE=,262,Add ?_extra= mechanism for requesting extra properties in JSON,9599,simonw,open,0,,,3268330,Datasette 1.0,27,2018-05-16T14:55:42Z,2023-03-29T06:22:22Z,,OWNER,,"Datasette views currently work by creating a set of data that should be returned as JSON, then defining an additional, optional `template_data()` function which is called if the view is being rendered as HTML. This `template_data()` function calculates extra template context variables which are necessary for the HTML view but should not be included in the JSON. Example of how that is used today: https://github.com/simonw/datasette/blob/2b79f2bdeb1efa86e0756e741292d625f91cb93d/datasette/views/table.py#L672-L704 With features like Facets in #255 I'm beginning to want to move more items into the `template_data()` - in the case of facets it's the `suggested_facets` array. This saves that feature from being calculated (involving several SQL queries) for the JSON case where it is unlikely to be used. But... as an API user, I want to still optionally be able to access that information. Solution: Add a `?_extra=suggested_facets&_extra=table_metadata` argument which can be used to optionally request additional blocks to be added to the JSON API. Then redefine as many of the current `template_data()` features as extra arguments instead, and teach Datasette to return certain extras by default when rendering templates. This could allow the JSON representation to be slimmed down further (removing e.g. the `table_definition` and `view_definition` keys) while still making that information available to API users who need it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/262/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 906355849,MDExOlB1bGxSZXF1ZXN0NjU3MzczNzI2,262,Ability to add descending order indexes,9599,simonw,closed,0,,,,,0,2021-05-29T04:51:04Z,2021-05-29T05:01:42Z,2021-05-29T05:01:39Z,OWNER,simonw/sqlite-utils/pulls/262,Refs #260,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/262/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 323671577,MDU6SXNzdWUzMjM2NzE1Nzc=,263,Facets should not execute for ?shape=array|object,9599,simonw,closed,0,,,,,3,2018-05-16T15:26:13Z,2021-06-02T02:54:34Z,2021-06-02T02:54:34Z,OWNER,,Split off from #255 - there's no point executing the facet SQL for the `?_shape=array` and `?_shape=object` API responses.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/263/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 906356331,MDU6SXNzdWU5MDYzNTYzMzE=,263,`sqlite-utils indexes` command,9599,simonw,closed,0,,,,,6,2021-05-29T04:52:34Z,2021-06-03T04:34:38Z,2021-06-03T04:34:38Z,OWNER,,"While working on #260 I realized there's no command to show indexes in a database, even though there is one for showing tables and one for triggers. I should implement #261 first.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/263/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323673899,MDU6SXNzdWUzMjM2NzM4OTk=,264,Make it possible to customize various facet settings,9599,simonw,closed,0,,,,,1,2018-05-16T15:31:34Z,2018-05-18T06:18:00Z,2018-05-18T05:11:52Z,OWNER,,"The new Facets implementation from #255 includes several hard-coded settings which should be made configurable somehow: Number of rows to return in a facet (maybe this should also be an option that can be set via quersytring argument, e.g. `?_facet=qSpecies:40`): https://github.com/simonw/datasette/blob/9959a9e4deec8e3e178f919e8b494214d5faa7fd/datasette/views/table.py#L539 Time limit for executing a facet: https://github.com/simonw/datasette/blob/9959a9e4deec8e3e178f919e8b494214d5faa7fd/datasette/views/table.py#L559-L562 Maximum unique values returned in order for a column to be suggested as a facet: https://github.com/simonw/datasette/blob/9959a9e4deec8e3e178f919e8b494214d5faa7fd/datasette/views/table.py#L646-L647 Time limit for calculating if a column should be a suggested facet: https://github.com/simonw/datasette/blob/9959a9e4deec8e3e178f919e8b494214d5faa7fd/datasette/views/table.py#L664-L667 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/264/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 907642546,MDU6SXNzdWU5MDc2NDI1NDY=,264,"Supporting additional output formats, like GeoJSON",25778,eyeseast,closed,0,,,,,3,2021-05-31T18:03:32Z,2021-06-03T05:12:21Z,2021-06-03T05:12:21Z,CONTRIBUTOR,,"I have a project going where it would be useful to do some spatial processing in SQLite (instead of large files) and then output GeoJSON. So my workflow would be something like this: 1. Read Shapefiles, GeoJSON, CSVs into a SQLite database 2. Join, filter, prune as needed 3. Export GeoJSON for just the stuff I need at that moment, while still having a database of things that will be useful later I'm wondering if this is worth adding to SQLite-utils itself (GeoJSON, at least), or if it's better to make a counterpart to the ecosystem of `*-to-sqlite` tools, say a suite of `sqlite-to-*` things. Or would it be crazy to have a plugin system?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/264/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323677499,MDU6SXNzdWUzMjM2Nzc0OTk=,265,Add links to example Datasette instances to appropiate places in docs,9599,simonw,closed,0,,,,,5,2018-05-16T15:40:20Z,2018-06-18T15:52:15Z,2018-06-18T15:52:15Z,OWNER,,"Links to working examples would really help, especially on these pages: * http://datasette.readthedocs.io/en/latest/json_api.html * http://datasette.readthedocs.io/en/latest/sql_queries.html * http://datasette.readthedocs.io/en/latest/facets.html * http://datasette.readthedocs.io/en/latest/full_text_search.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/265/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 907795562,MDU6SXNzdWU5MDc3OTU1NjI=,265,Using enable_fts before search term,36287,prabhur,open,0,,,,,1,2021-06-01T01:43:34Z,2023-04-01T17:27:18Z,,NONE,,"Many thanks for the sqlite-utils suite of utilities. Has made my life much much easier. I used this to create a table and enable FTS. All works fine. The datasette utility detects FTS and shows a text box. Searching for a term using that interface works well. However, when I start to use features by following https://www.sqlite.org/fts5.html section **""3. Full-text Query Syntax""** I seem to run into issues that I suspect is due to `escape_fts` wrapper function. As an example, if i search for the term `""^குகை"" `on the text box in datasette it produces 140 results. However, when i tweak the query produced by datasette to not use ""escape_fts"" it produces 5 results. Similarly, when I try to restrict the search to a single column in FTS using a spec like `{title : ^குகை}` it returns no rows. The same thing pulls results when used without `escape_fts`. The text in the table is in Tamil language and the search term is a Tamil word. ``` ... where posts_fts match escape_fts(:search) ``` vs ``` ... where posts_fts match (:search) ``` Any ideas why? How can I get the benefits of both escaping as well as utilizing different facets of providing / controlling search terms? Thanks.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/265/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 323681589,MDU6SXNzdWUzMjM2ODE1ODk=,266,Export to CSV,9599,simonw,closed,0,,,,,27,2018-05-16T15:50:24Z,2021-06-17T18:14:24Z,2018-06-18T06:05:25Z,OWNER,,Datasette needs to be able to export data to CSV.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/266/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 913135723,MDU6SXNzdWU5MTMxMzU3MjM=,266,"Add some types, enforce with mypy",9599,simonw,closed,0,,,,,3,2021-06-07T06:05:56Z,2021-08-18T22:25:38Z,2021-08-18T22:25:38Z,OWNER,,"A good starting point would be adding type information to the members of these named tuples and the introspection methods that return them: https://github.com/simonw/sqlite-utils/blob/9dff7a38831d471b1dff16d40d89eb5c3b4e84d6/sqlite_utils/db.py#L51-L75",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/266/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323716411,MDU6SXNzdWUzMjM3MTY0MTE=,267,"Documentation for URL hashing, redirects and cache policy",9599,simonw,closed,0,,,,,3,2018-05-16T17:29:01Z,2019-06-24T06:41:02Z,2019-06-24T06:41:02Z,OWNER,,See my comments on #258 for a starting point,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/267/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 915421499,MDU6SXNzdWU5MTU0MjE0OTk=,267,row.update() or row.pk,12721157,Gravitar64,open,0,,,,,4,2021-06-08T19:56:00Z,2021-06-22T17:27:27Z,,NONE,,"Hi, fantastic framework for working with Sqlite3 databases!!! I tried to update spezific rows in a table and used for row in db[tablename]: newValue = row[""counter""] * row[""prize""] row.update({""Fieldname"": newValue}) print(row) This updates the value in the printet row, but not in the database. So I switched to db[tablename].update(id, {""Filedname"": newValue}) This works fine. But row.update would be nicer, because no need for the id (its that row), no need for the tablename and the db (all defined in the for row ... loop). Thx ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/267/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 323718842,MDU6SXNzdWUzMjM3MTg4NDI=,268,Mechanism for ranking results from SQLite full-text search,9599,simonw,open,0,,,,,12,2018-05-16T17:36:40Z,2022-01-13T22:21:28Z,,OWNER,,This isn't particularly straight-forward - all the more reason for Datasette to implement it for you. This article is helpful: http://charlesleifer.com/blog/using-sqlite-full-text-search-with-python/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/268/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 919181559,MDU6SXNzdWU5MTkxODE1NTk=,268,db.schema property and sqlite-utils schema command,9599,simonw,closed,0,,,,,4,2021-06-11T20:25:47Z,2021-06-11T20:51:56Z,2021-06-11T20:51:56Z,OWNER,,"`table.schema` returns the schema for a table. `db.schema` should return the schema for the whole databes. Can do this using `select sql from sqlite_master where sql is not null`: https://latest.datasette.io/fixtures?sql=select+sql+from+sqlite_master+where+sql+is+not+null",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/268/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323726888,MDU6SXNzdWUzMjM3MjY4ODg=,269,"If a facet fails due to timing out, let the user know somehow",9599,simonw,closed,0,,,,,0,2018-05-16T18:01:47Z,2018-05-18T06:11:46Z,2018-05-18T06:11:46Z,OWNER,,Refs #255 - right now facets fail silently if the user requested them but they take longer than 200ms to calculate - see also #264,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/269/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919250621,MDU6SXNzdWU5MTkyNTA2MjE=,269,bool type not supported,4068,frafra,closed,0,,,,,3,2021-06-11T22:00:36Z,2021-06-15T01:34:10Z,2021-06-15T01:34:10Z,NONE,,"Hi! Thank you for sharing this very nice tool :) It would be nice to have support for more types, like `bool`: it is not possible to convert to boolean at the moment. My suggestion would be to handle it as `bool(int(value))`, like csvkit does.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/269/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323830051,MDU6SXNzdWUzMjM4MzAwNTE=,270,--limit= CLI option for setting limits,9599,simonw,closed,0,,,,,1,2018-05-17T00:14:24Z,2018-05-18T06:19:31Z,2018-05-18T06:16:39Z,OWNER,,"#264 calls for four new datasette limit options, on top of the two existing ones: * `--max_returned_rows` * `--sql_time_limit_ms` These are already clogging up `datasette serve --help` a bit. How about this syntax instead? datasette --limit max_returned_rows:100 \ --limit facet_timeout_ms:500 demo.db Then we can add as many new user over-rideable limits as we like without clogging up `--help` too much - though it would be good to have a way of optionally listings their documentation as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/270/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919314806,MDU6SXNzdWU5MTkzMTQ4MDY=,270,Cannot set type JSON,4068,frafra,closed,0,,,,,4,2021-06-11T23:53:22Z,2021-06-16T17:34:49Z,2021-06-16T15:47:06Z,NONE,,"It would be great if the column type could be set to JSON. That would not be different from handling a regular string. It would be something like `repr(value)` and it would work with both JSON and CSV inputs, no matter if `value` is a real list or just a string representing a list.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/270/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324162476,MDU6SXNzdWUzMjQxNjI0NzY=,271,Mechanism for automatically picking up changes when on-disk .db file changes,9599,simonw,closed,0,,,,,4,2018-05-17T19:53:15Z,2019-01-10T21:35:18Z,2019-01-10T21:35:18Z,OWNER,,"It would be useful if Datasette could spot when a SQLite database file changes on disk and restart itself (hence re-running .inspect() and picking up the new content hash). Ideally this could happen in an atomic way so no requests get dropped during the switch-over. This may not play well with SQLite opening databases in immutable mode. Research required.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/271/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919702451,MDU6SXNzdWU5MTk3MDI0NTE=,271,table.upsert_all() fails if input has a single column that should be a primary key,9599,simonw,closed,0,,,,,1,2021-06-13T02:50:27Z,2021-06-13T02:57:29Z,2021-06-13T02:57:29Z,OWNER,,"This works: ```pycon >>> db['foo'].insert_all([{""name"": ""hello""}], pk=""name"")
    ``` But this fails: ``` >>> db['foo3'].upsert_all([{""name"": ""hello""}], pk=""name"") Traceback (most recent call last): File """", line 1, in File ""/Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1837, in upsert_all return self.insert_all( File ""/Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1778, in insert_all self.insert_chunk( File ""/Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1588, in insert_chunk result = self.db.execute(query, params) File ""/Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py"", line 213, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: near ""WHERE"": syntax error ``` With the debugger: ``` >>> import pdb; pdb.pm() > /Users/simon/.local/share/virtualenvs/datasette.io-TK86ygSO/lib/python3.9/site-packages/sqlite_utils/db.py(213)execute() -> return self.conn.execute(sql, parameters) (Pdb) print(sql, parameters) UPDATE [foo3] SET WHERE [name] = ? ['hello'] ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/271/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324188953,MDU6SXNzdWUzMjQxODg5NTM=,272,Port Datasette to ASGI,9599,simonw,closed,0,9599,simonw,3268330,Datasette 1.0,42,2018-05-17T21:16:32Z,2019-06-24T04:54:15Z,2019-06-24T03:33:06Z,OWNER,,"Datasette doesn't take much advantage of Sanic, and I'm increasingly having to work around parts of it because of idiosyncrasies that are specific to Datasette - caring about the exact order of querystring arguments for example. Since Datasette is GET-only our needs from a web framework are actually pretty slim. This becomes more important as I expand the plugins #14 framework. Am I sure I want the plugin ecosystem to depend on a Sanic if I might move away from it in the future? If Datasette wasn't all about async/await I would use WSGI, but today it makes more sense to use ASGI. I'd like to be confident that switching to ASGI would still give me the excellent performance that Sanic provides. https://github.com/django/asgiref/blob/master/specs/asgi.rst",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/272/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 921878733,MDU6SXNzdWU5MjE4Nzg3MzM=,272,"Idea: import CSV to memory, run SQL, export in a single command",9599,simonw,closed,0,,,,,22,2021-06-15T23:02:48Z,2021-06-19T23:36:48Z,2021-06-18T15:05:03Z,OWNER,,"I quite often load a CSV file into a SQLite DB, then do stuff with it (like export results back out again as a new CSV) without any intention of keeping the CSV file around afterwards. What if `sqlite-utils` could do this for me? Something like this: sqlite-utils --csv blah.csv --csv baz.csv ""select * from blah join baz ..."" ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/272/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324451322,MDU6SXNzdWUzMjQ0NTEzMjI=,273,Figure out a way to have /-/version return current git commit hash,9599,simonw,closed,0,,,,,2,2018-05-18T15:16:56Z,2018-05-22T19:35:22Z,2018-05-22T19:35:22Z,OWNER,,"https://fivethirtyeight.datasettes.com/-/versions reports Datasette version `0.21` This isn't actually correct. The deploy script for that site actually deploys current master using `https://github.com/simonw/datasette/archive/master.zip`: https://github.com/simonw/fivethirtyeight-datasette/blob/66b4b0dfedd7237bc8c02d3e26d905bca7b84069/Dockerfile#L9 Ideally this would show the current commit hash, but I'm not at all sure if it's possible to derive that from `pip install https://github.com/simonw/datasette/archive/master.zip`. Is there another mechanism that could be used to reliably `pip install` current master but still provide access to the most recent commit hash?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/273/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 922099793,MDExOlB1bGxSZXF1ZXN0NjcxMDE0NzUx,273,sqlite-utils memory command for directly querying CSV/JSON data,9599,simonw,closed,0,,,,,8,2021-06-16T05:04:58Z,2021-06-18T15:01:17Z,2021-06-18T15:00:52Z,OWNER,simonw/sqlite-utils/pulls/273,"Refs #272. Initial implementation only does CSV data, still needs: - [x] Implement `--save` - [x] Add `--dump` to the documentation - [x] Add `--attach` example to the documentation - [x] Replace `:memory:` in documentation",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/273/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 324652142,MDU6SXNzdWUzMjQ2NTIxNDI=,274,"Rename --limit to --config, add --help-config",9599,simonw,closed,0,,,,,2,2018-05-19T18:57:42Z,2018-05-20T17:04:55Z,2018-05-20T17:04:11Z,OWNER,,"#270 introduced `--limit` but on further thought it should be called `--config` instead. `--page_size` should becomes `--config default_page_size:1000` Add `--help-config` to show full help showing all config settings.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/274/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 922832113,MDU6SXNzdWU5MjI4MzIxMTM=,274,sqlite-utils dump my.db command,9599,simonw,closed,0,,,,,0,2021-06-16T16:30:14Z,2021-06-16T23:51:54Z,2021-06-16T23:51:54Z,OWNER,,"Inspired by the `--dump` mechanism I added to `sqlite-utils memory` here: https://github.com/simonw/sqlite-utils/issues/272#issuecomment-862018937 > Can use `.iterdump()` to implement this: https://docs.python.org/3/library/sqlite3.html#sqlite3.Connection.iterdump > > Maybe instead (or as-well-as) offer `--dump` which dumps out the SQL from that.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/274/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324720095,MDU6SXNzdWUzMjQ3MjAwOTU=,275,"""config"" section in metadata.json (root, database and table level)",9599,simonw,closed,0,,,,,3,2018-05-20T16:02:28Z,2023-08-23T01:28:37Z,2023-08-23T01:28:37Z,OWNER,,"Split off from #274 Metadata should an optional `""config""` section at root, table or database level. The TableView and RowView and DatabaseView and BaseView classes could all have a `.config(""key"")` method which knows how to resolve the hierarchy of configs. This will allow individual tables (or databases) to set their own config settings for things like `sql_time_limit_ms`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/275/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 922955697,MDU6SXNzdWU5MjI5NTU2OTc=,275,Enable code coverage,9599,simonw,closed,0,,,,,1,2021-06-16T18:33:49Z,2021-06-17T00:12:12Z,2021-06-17T00:12:12Z,OWNER,,"https://app.codecov.io/gh/simonw/sqlite-utils Same mechanism as Datasette. Need to copy across the token from that page and add an equivalent of this workflow: https://github.com/simonw/datasette/blob/main/.github/workflows/test-coverage.yml",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/275/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324835838,MDU6SXNzdWUzMjQ4MzU4Mzg=,276,Handle spatialite geometry columns better,45057,russss,closed,0,,,,,21,2018-05-21T08:46:55Z,2022-03-21T22:22:20Z,2022-03-21T22:22:20Z,CONTRIBUTOR,,"I'd like to see spatialite geometry columns rendered more sensibly - at the moment they come through as well-known-binary unless you use custom SQL, and WKB isn't of much use to anyone on the web. In HTML: they should be shown either as simple lat/long (if it's just a point, for example), or as a sensible placeholder if they're more complex geometries. In JSON: they should be GeoJSON geometries, (which means they can be automatically fed into a leaflet map with no further messing around). In CSV: they should be WKT. I briefly wondered if this should go into a plugin, but I suspect it needs hooking in at a deeper level than the plugin architecture will support any time soon.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/276/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 923602693,MDU6SXNzdWU5MjM2MDI2OTM=,276,support small help flag -h,601708,mcint,closed,0,,,,,0,2021-06-17T07:59:31Z,2021-06-18T14:56:59Z,2021-06-18T14:56:59Z,CONTRIBUTOR,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/276/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324836533,MDExOlB1bGxSZXF1ZXN0MTg5MzE4NDUz,277,Refactor inspect logic,45057,russss,closed,0,,,,,2,2018-05-21T08:49:31Z,2018-05-22T16:07:24Z,2018-05-22T14:03:07Z,CONTRIBUTOR,simonw/datasette/pulls/277,"This pulls the logic for inspect out into a new file which makes it a bit easier to understand. This was going to be the first part of an implementation for #276, but it seems like that might take a while so I'm going to PR a few bits of refactoring individually.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/277/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 923612361,MDExOlB1bGxSZXF1ZXN0NjcyMzU5NjA5,277,add -h support closes #276,601708,mcint,closed,0,,,,,2,2021-06-17T08:08:26Z,2021-06-18T14:56:59Z,2021-06-18T14:56:59Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/277,This appears to be the [canonical solution](https://click.palletsprojects.com/en/7.x/documentation/#help-parameter-customization).,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/277/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 325294102,MDU6SXNzdWUzMjUyOTQxMDI=,278,Build smallest possible Docker image with Datasette plus recent SQLite (with json1) plus Spatialite 4.4.0,9599,simonw,closed,0,,,,,3,2018-05-22T13:28:40Z,2018-05-23T17:43:36Z,2018-05-23T17:43:36Z,OWNER,,"A Dockerfile that does the following: * Bundles Datasette master * Python 3.6 most recent version (or 3.7 if it has been released) * SQLite 3.23.1 (or most recent release) such that ""import sqite3"" in Python gets that version. Ideally with the json1 module baked in by default, but having it loadable as an optional module is fine too * SpatiaLite 4.4.0-RC0 (or most recent version) such that it can be loaded as an optional module * Uses multi-stage builds to stay as small as possible Note that the current ""release"" of SpatiaLite is 4.3.0 which is missing key features like https://www.gaia-gis.it/fossil/libspatialite/wiki?name=KNN - 4.4.0 probably needs to be compiled from source. I don't know the best way to get a current SQLite version bundled for Python 3. Maybe https://github.com/coleifer/pysqlite3 ?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/278/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 923697888,MDU6SXNzdWU5MjM2OTc4ODg=,278,"Support db as first parameter before subcommand, or as environment variable",601708,mcint,closed,0,,,,,3,2021-06-17T09:26:29Z,2021-06-20T22:39:57Z,2021-06-18T15:43:19Z,CONTRIBUTOR,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/278/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 325352370,MDExOlB1bGxSZXF1ZXN0MTg5NzA3Mzc0,279,Add version number support with Versioneer,198537,rgieseke,closed,0,,,,,4,2018-05-22T15:39:45Z,2018-05-22T19:35:23Z,2018-05-22T19:35:22Z,CONTRIBUTOR,simonw/datasette/pulls/279,"I think that's all for getting Versioneer support, I've been happily using it in a couple of projects ... ``` In [2]: datasette.__version__ Out[2]: '0.22+3.g6e12445' ``` Repo: https://github.com/warner/python-versioneer Versioneer Licence: Public Domain (CC0-1.0) Closes #273 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/279/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 924990677,MDU6SXNzdWU5MjQ5OTA2Nzc=,279,sqlite-utils memory should handle TSV and JSON in addition to CSV,9599,simonw,closed,0,,,,,7,2021-06-18T15:02:54Z,2021-06-19T03:11:59Z,2021-06-19T03:11:59Z,OWNER,,"- Use sniff to detect CSV or TSV (if `:tsv` or `:csv` was not specified) and delimiters Follow-on from #272",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/279/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 325373747,MDExOlB1bGxSZXF1ZXN0MTg5NzIzNzE2,280,Build Dockerfile with recent Sqlite + Spatialite,565628,r4vi,closed,0,,,,,10,2018-05-22T16:33:50Z,2018-06-28T11:26:23Z,2018-05-23T17:43:35Z,CONTRIBUTOR,simonw/datasette/pulls/280,"This solves #278 without bloating the Dockerfile too much, the image size is now 495MB (original was ~240MB) but it could be reduced significantly if we only copied the output of the compilation of spatialite and friends to /usr/local/lib, instead of the entirety of it however that will take more time. In the python code change references to `import sqlite3` to `import pysqlite3` and it should use the compiled version of sqlite3.23.1. You don't need to try/except because pysqlite3 falls back to builtin sqlite3 if there is no compiled version. ```bash $ docker run --rm -it datasette spatialite SpatiaLite version ..: 4.4.0-RC0 Supported Extensions: - 'VirtualShape' [direct Shapefile access] - 'VirtualDbf' [direct DBF access] - 'VirtualXL' [direct XLS access] - 'VirtualText' [direct CSV/TXT access] - 'VirtualNetwork' [Dijkstra shortest path] - 'RTree' [Spatial Index - R*Tree] - 'MbrCache' [Spatial Index - MBR cache] - 'VirtualSpatialIndex' [R*Tree metahandler] - 'VirtualElementary' [ElemGeoms metahandler] - 'VirtualKNN' [K-Nearest Neighbors metahandler] - 'VirtualXPath' [XML Path Language - XPath] - 'VirtualFDO' [FDO-OGR interoperability] - 'VirtualGPKG' [OGC GeoPackage interoperability] - 'VirtualBBox' [BoundingBox tables] - 'SpatiaLite' [Spatial SQL - OGC] PROJ.4 version ......: Rel. 4.9.3, 15 August 2016 GEOS version ........: 3.5.1-CAPI-1.9.1 r4246 TARGET CPU ..........: x86_64-linux-gnu the SPATIAL_REF_SYS table already contains some row(s) SQLite version ......: 3.23.1 Enter "".help"" for instructions SQLite version 3.23.1 2018-04-10 17:39:29 Enter "".help"" for instructions Enter SQL statements terminated with a "";"" spatialite> ``` ```bash $ docker run --rm -it datasette python -c ""import pysqlite3; print(pysqlite3.sqlite_version)"" 3.23.1 ```",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/280/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 924991194,MDU6SXNzdWU5MjQ5OTExOTQ=,280,Add --encoding option to sqlite-utils memory,9599,simonw,closed,0,,,,,0,2021-06-18T15:03:32Z,2021-06-18T15:29:46Z,2021-06-18T15:29:46Z,OWNER,,Follow-on from #272 - this will work like `--encoding` on `sqlite-utils insert` and will affect all CSV files processed by `sqlite-utils memory`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/280/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 325553991,MDExOlB1bGxSZXF1ZXN0MTg5ODYwMDUy,281,Reduces image size using Alpine + Multistage (re: #278),487897,iMerica,closed,0,,,,,1,2018-05-23T05:27:05Z,2018-05-26T02:10:38Z,2018-05-26T02:10:38Z,NONE,simonw/datasette/pulls/281,"Hey Simon! I got the image size down from 256MB to 110MB. Seems to be working okay, but you might want to test it a bit more. Example output of `docker run --rm -it datasette` ``` Serve! files=() on port 8001 [2018-05-23 05:23:08 +0000] [1] [INFO] Goin' Fast @ http://127.0.0.1:8001 [2018-05-23 05:23:08 +0000] [1] [INFO] Starting worker [1] ``` Related: https://github.com/simonw/datasette/issues/278 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/281/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 924992318,MDU6SXNzdWU5MjQ5OTIzMTg=,281,Mechanism for explicitly stating CSV or JSON or TSV for sqlite-utils memory,9599,simonw,closed,0,,,,,1,2021-06-18T15:04:53Z,2021-06-19T03:11:59Z,2021-06-19T03:11:59Z,OWNER,,"- Implement `filename.json:json` and `-:nl` and suchlike options for specifying the format rather than guessing it - see https://github.com/simonw/sqlite-utils/issues/272#issuecomment-861985944 Follows #272",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/281/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 325705981,MDU6SXNzdWUzMjU3MDU5ODE=,282,Faceting breaks pagination,9599,simonw,closed,0,,,,,1,2018-05-23T13:29:47Z,2018-05-23T13:53:39Z,2018-05-23T13:42:07Z,OWNER,,"e.g. on https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3/nba-elo%2Fnbaallelo?_facet=lg_id#facet-lg_id - click the ""next page"" link: https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3/nba-elo%2Fnbaallelo?_facet=lg_id&_next=100 Invalid SQL: near ""and"": syntax error",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/282/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925305186,MDU6SXNzdWU5MjUzMDUxODY=,282,Automatic type detection for CSV data,9599,simonw,closed,0,,,,,4,2021-06-19T03:33:21Z,2021-06-19T04:42:03Z,2021-06-19T04:38:00Z,OWNER,,"I've touched on this before in #179 - but now that I've added `sqlite-utils memory` this is much more important - because unlike with `sqlite-utils insert` the in-memory command doesn't give you the opportunity to fix any types you imported from CSV, so queries like `select * from stdin where age > 3` are never going to work correctly against these temporary in-memory tables. Teaching `sqlite-utils insert` to detect types for columns in a CSV file would be a backwards-compatibility breaking change. Teaching `sqlite-utils memory` that trick would not be, since it hasn't been included in a release yet. It's a little inconsistent, but I'm going to have `sqlite-utils memory` default to detecting types while `sqlite-utils insert` does not. In each case this can be controlled by a new command-line option: cat file.csv | sqlite-utils memory - --no-detect-types To opt-in for `sqlite-utils insert`: cat file.csv | sqlite-utils insert blah.db blah - --detect-types I'll have short options for these too: `-n` for `--no-detect-types` and `-d` for `--detect-types`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/282/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",,completed 325958506,MDU6SXNzdWUzMjU5NTg1MDY=,283,Support cross-database joins,9599,simonw,closed,0,,,,,26,2018-05-24T04:18:39Z,2021-06-06T09:40:18Z,2021-02-18T22:16:46Z,OWNER,,"SQLite has the ability to attach multiple databases to a single connection and then run joins across multiple databases. Since Datasette supports more than one database, this would make a pretty neat feature.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/283/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925319214,MDU6SXNzdWU5MjUzMTkyMTQ=,283,memory: Shouldn't detect types for JSON,9599,simonw,closed,0,,,,,1,2021-06-19T05:17:35Z,2021-06-19T14:52:48Z,2021-06-19T14:52:48Z,OWNER,,"https://github.com/simonw/sqlite-utils/blob/ec5174ed40fa283cb06f25ee0c0136297ec313ae/sqlite_utils/cli.py#L1244-L1251 This runs against JSON as well as CSV/TSV - which isn't necessary and In fact throws errors if there is any nested data.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/283/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326182814,MDU6SXNzdWUzMjYxODI4MTQ=,284,Ability to enable/disable specific features via --config,9599,simonw,closed,0,,,,,5,2018-05-24T15:47:56Z,2018-05-25T06:05:02Z,2018-05-25T05:51:09Z,OWNER,,"`--config` settings from #274 can currently only be integers. I'd like them to be available as boooeans too. Then we can use them to have that are turned on by default but can be turned off. First features to get this treatment: - [x] `allow_sql` - whether or not the `?sql=` parameter is allowed and form is displayed - [X] `allow_facet` - is `?_facet=` allowed or do we only run facets defined in `metadata.json` - [X] `allow_download` - do we let users download the full SQLite database file? - [X] `suggest_facets` - do we attempt to calculate suggested facets? Refs #275 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/284/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925320167,MDU6SXNzdWU5MjUzMjAxNjc=,284,.transform(types=) turns rowid into a concrete column,9599,simonw,closed,0,,,,,5,2021-06-19T05:25:27Z,2021-06-19T15:28:30Z,2021-06-19T15:28:30Z,OWNER,,"Noticed this in the tests for `sqlite-utils memory` in #282 - is it possible to fix this? https://github.com/simonw/sqlite-utils/commit/ec5174ed40fa283cb06f25ee0c0136297ec313ae",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/284/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326189744,MDU6SXNzdWUzMjYxODk3NDQ=,285,num_threads and cache_max_age should be --config options,9599,simonw,closed,0,,,,,2,2018-05-24T16:04:51Z,2018-05-27T00:53:35Z,2018-05-27T00:43:33Z,OWNER,,"https://github.com/simonw/datasette/blob/58b5a37dbbf13868a46bcbb284509434e66eca25/datasette/app.py#L106 And https://github.com/simonw/datasette/blob/58b5a37dbbf13868a46bcbb284509434e66eca25/datasette/views/base.py#L325 Refs #275 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/285/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925410305,MDU6SXNzdWU5MjU0MTAzMDU=,285,Introspection property for telling if a table is a rowid table,9599,simonw,closed,0,,,,,7,2021-06-19T14:56:16Z,2021-06-19T15:12:33Z,2021-06-19T15:12:33Z,OWNER,,_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/284#issuecomment-864416785_,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/285/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326599525,MDU6SXNzdWUzMjY1OTk1MjU=,286,Database hash should include current datasette version,9599,simonw,open,0,,,,,2,2018-05-25T17:03:42Z,2018-05-25T17:07:36Z,,OWNER,,"Right now deploying a new version of datasette doesn't invalidate existing URLs, so users may still see a cached copy of the old templates. We can fix this by including the current datasette version in the input to the hash function (which currently just the database file contents).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/286/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 925487946,MDU6SXNzdWU5MjU0ODc5NDY=,286,Add installation instructions,9599,simonw,closed,0,,,,,1,2021-06-19T23:55:36Z,2021-06-20T18:47:13Z,2021-06-20T18:47:13Z,OWNER,,"`pip install sqlite-utils`, `pipx install sqlite-utils` and `brew install sqlite-utils`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/286/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326617744,MDU6SXNzdWUzMjY2MTc3NDQ=,287,?_shape=arrayfirst,9599,simonw,closed,0,,,,,1,2018-05-25T18:11:03Z,2018-05-27T00:32:53Z,2018-05-27T00:32:29Z,OWNER,,Return an array of single items (the first item in each row returned from the SQL query).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/287/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925544070,MDU6SXNzdWU5MjU1NDQwNzA=,287,Update rowid examples in the docs,9599,simonw,closed,0,,,,,0,2021-06-20T08:03:00Z,2021-06-20T18:26:21Z,2021-06-20T18:26:21Z,OWNER,,Changed in #284 - a couple of examples need updating on https://github.com/simonw/sqlite-utils/blob/3.10/docs/cli.rst.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/287/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326767626,MDU6SXNzdWUzMjY3Njc2MjY=,288,Support multiple filters of the same type,9599,simonw,closed,0,,,,,3,2018-05-26T21:13:12Z,2019-04-15T23:45:04Z,2019-04-15T23:44:26Z,OWNER,,"This should work for example: https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3/biopics%2Fbiopics?year_release__not=2014&year_release__not=2015",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/288/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925545468,MDU6SXNzdWU5MjU1NDU0Njg=,288,sqlite-utils memory blah.json --schema,9599,simonw,closed,0,,,,,0,2021-06-20T08:10:40Z,2021-06-20T18:26:21Z,2021-06-20T18:26:21Z,OWNER,,Like `--dump` but only outputs the schema - useful for understanding what you are about to run queries against.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/288/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326768188,MDU6SXNzdWUzMjY3NjgxODg=,289,?_ttl= parameter to control caching,9599,simonw,closed,0,,,,,3,2018-05-26T21:22:55Z,2018-05-26T22:22:47Z,2018-05-26T22:17:48Z,OWNER,,"This would allow clients to specify the max-age caching header that should be returned with the query. Most important this will allow caching to be completely urned off for specific queries using `?_ttl=0`. Sending 0 should cause a `Cache-Control: no-cache` header to be returned.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/289/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 925677191,MDU6SXNzdWU5MjU2NzcxOTE=,289,Mypy fixes for rows_from_file(),857609,adamchainz,closed,0,,,,,3,2021-06-20T20:34:59Z,2021-06-22T18:44:36Z,2021-06-22T18:13:26Z,NONE,,"Following https://github.com/simonw/sqlite-utils/issues/279#issuecomment-864328927 You had two mypy errors. The first: > sqlite_utils/utils.py:157: error: Argument 1 to ""BufferedReader"" has incompatible type ""BinaryIO""; expected ""RawIOBase"" Looking at the `BufferedReader` docs, it seems to expect a `RawIOBase`, and this [has been copied into typeshed](https://github.com/python/typeshed/blob/9ec2f8712480c57353cea097a65d75a2c4ec1846/stdlib/io.pyi#L100). There may be scope to change how `BufferedReader` is documented and typed upstream, but for now it wouldn't be too bad to use a `typing.cast()`: ``` # Detect the format, then call this recursively buffered = io.BufferedReader( cast(io.RawIOBase, fp), # Undocumented BufferedReader support for BinaryIO buffer_size=4096, ) ``` The second error seems to be flagging a legitimate bug in your code: > sqlite_utils/utils.py:163: error: Argument 1 to ""decode"" of ""bytes"" has incompatible type ""Optional[str]""; expected ""str"" From your type hints, `encoding` may be `None`. In the CSV format block, you use `encoding or ""utf-8-sig""` to set a default, maybe that's desirable in this case too? ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/289/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326778161,MDU6SXNzdWUzMjY3NzgxNjE=,290,Consider increasing the default for num_sql_threads (currently 3),9599,simonw,open,0,,,,,0,2018-05-27T00:52:41Z,2018-05-27T00:52:41Z,,OWNER,,"I ran a very rough micro-benchmark on the new `num_sql_threads` config option (added in #285) datasette --config num_sql_threads:1 fivethirtyeight.db Then ab -n 100 -c 10 'http://127.0.0.1:8011/fivethirtyeight-2628db9/twitter-ratio%2Fsenators' | Number of threads | Requests/second | |---|---| | 1 | 4.57 | | 3 | 9.77 | | 10 | 13.53 | | 20 | 15.24 | 50 | 8.21 | This was on my early 2018 OS X laptop. Need to benchmark in other common environments before making a decision on changing the default. That said, the default of 3 was a number I plucked out of thin air.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/290/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 926777310,MDU6SXNzdWU5MjY3NzczMTA=,290,`db.query()` method (renamed `db.execute_returning_dicts()`),9599,simonw,closed,0,,,,,6,2021-06-22T03:03:54Z,2021-06-24T23:17:38Z,2021-06-24T22:54:43Z,OWNER,,"Most of this library deals with lists of Python dictionaries - `.insert_all()`, `.rows`, `.rows_where()`, `.search()`. The `db.execute()` method is the only thing that returns a `sqlite3` cursor. There is a clumsily named `db.execute_returning_dicts(sql)` method but it's not currently mentioned in the documentation. It needs a better name, and needs to be properly documented.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/290/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326783670,MDU6SXNzdWUzMjY3ODM2NzA=,291,Avoid plugins accidentally loading dependencies twice,9599,simonw,closed,0,,,,,3,2018-05-27T03:15:21Z,2020-09-30T20:36:12Z,2018-05-28T20:42:02Z,OWNER,,Plugins that include JavaScript files risk loading the same code twice. In particular: I want to build a second plugin that uses the Leaflet mapping library (the first was [datasette-cluster-map](https://pypi.org/project/datasette-cluster-map/)). But I don't want the two plugins to load duplicate copies of Leaflet.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/291/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 927766296,MDU6SXNzdWU5Mjc3NjYyOTY=,291,Adopt flake8,9599,simonw,closed,0,,,,,2,2021-06-23T01:19:37Z,2021-06-24T17:50:27Z,2021-06-24T17:50:27Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/291/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326800219,MDU6SXNzdWUzMjY4MDAyMTk=,292,Mechanism for customizing the SQL used to select specific columns in the table view,9599,simonw,closed,0,,,,,15,2018-05-27T09:05:52Z,2021-05-27T04:25:01Z,2021-05-27T04:25:01Z,OWNER,,"Some columns don't make a lot of sense in their default representation - binary blobs such as SpatiaLite geometries for example, or lengthy columns that really should be truncated somehow. We may also find that there are tables where we don't want to show all of the columns - so a mechanism to select a subset of columns would be nice. I think there are two features here: * the ability to request a subset of columns on the table view * the ability to override the SQL for a specific column and/or add extra columns - `AsGeoJSON(Geometry)` for example Both features should be available via both querystring arguments and in `metadata.json` The querystring argument for custom SQL should only work if `allow_sql` config is turned on. Refs #276",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/292/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 927789811,MDU6SXNzdWU5Mjc3ODk4MTE=,292,Add contributing documentation,9599,simonw,closed,0,,,,,0,2021-06-23T02:13:05Z,2021-06-25T17:53:51Z,2021-06-25T17:53:51Z,OWNER,,Like https://docs.datasette.io/en/latest/contributing.html (but simpler) - should cover how to run `black` and `flake8` and `mypy` and how to run the tests.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/292/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326987229,MDExOlB1bGxSZXF1ZXN0MTkwOTAxNDI5,293,Support for external database connectors,11912854,jsancho-gpl,closed,0,,,,,1,2018-05-28T11:02:45Z,2018-09-11T14:32:45Z,2018-09-11T14:32:45Z,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/293,"I think it would be nice that Datasette could work with other file formats that aren't SQLite, like files with PyTables format. I've tried to accomplish that using external connectors published with entry points. These external connectors must have a structure similar to the structure [PyTables Datasette connector](https://github.com/PyTables/datasette-pytables) has.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/293/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 929748885,MDExOlB1bGxSZXF1ZXN0Njc3NTU0OTI5,293,Test against Python 3.10-dev,9599,simonw,closed,0,,,,,3,2021-06-25T01:40:39Z,2021-10-13T21:49:33Z,2021-10-13T21:49:33Z,OWNER,simonw/sqlite-utils/pulls/293,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/293/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 327365110,MDU6SXNzdWUzMjczNjUxMTA=,294,inspect should record column types,9599,simonw,open,0,,,,,7,2018-05-29T15:10:41Z,2019-06-28T16:45:28Z,,OWNER,,"For each table we want to know the columns, their order and what type they are. I'm going to break with SQLite defaults a little on this one and allow datasette to define additional types - to start with just a `geometry` type for columns that are detected as SpatiaLite geometries. Possible JSON design: ""columns"": [{ ""name"": ""title"", ""type"": ""text"" }, ...] Refs #276",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/294/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 931752773,MDU6SXNzdWU5MzE3NTI3NzM=,294,Add a `sqlite-utils memory` example to the README,9599,simonw,closed,0,,,,,0,2021-06-28T16:35:59Z,2021-08-18T21:40:03Z,2021-08-18T21:40:03Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/294/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 327383759,MDU6SXNzdWUzMjczODM3NTk=,295,Extract unit tests for inspect out to test_inspect.py,9599,simonw,closed,0,,,,,2,2018-05-29T15:55:04Z,2019-05-11T21:40:32Z,2019-05-11T21:40:32Z,OWNER,,"Right now they are bundled up as API unit tests for a relatively unimportant endpoint. They should be their own thing. Blocks #294",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/295/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 934123448,MDU6SXNzdWU5MzQxMjM0NDg=,295,Insert with --tsv and --no-headers give error about --nl arguments,7288187,davidscotson,closed,0,,,,,1,2021-06-30T21:01:01Z,2021-08-18T20:19:04Z,2021-08-18T20:18:57Z,NONE,,"Not quite sure if this is a bug, or just an assumption I made but I thought `--tsv` and `--no-headers` would work together when inserting from a file, and currently they seem not to (sqlite-utils, version 3.12, installed on Mac OS X via brew) Instead it says: `Error: Use just one of --nl, --csv or --tsv` As if it has interpreted the --no-headers as --nl. The --help does specifically say CSV: `--no-headers CSV file has no header row` And this heading in the documentation also only refers to CSV, but the text does mention TSV in passing, and I'd generally expect them to behave the same in most cases. https://sqlite-utils.datasette.io/en/stable/cli.html#csv-files-without-a-header-row",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/295/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 327395270,MDU6SXNzdWUzMjczOTUyNzA=,296,Per-database and per-table /-/ URL namespace,9599,simonw,open,0,,,,,3,2018-05-29T16:23:13Z,2019-06-28T16:46:34Z,,OWNER,,"Initially this will be for subsets of `/-/inspect` and `/-/metadata` but it will also give us a URL namespace for future features like `/-/facet` (expanded list of a specific facet, linked to from `...`) and `/-/graph` To start: * `/dbname/-/inspect` * `/dbname/-/metadata` * `/dbname/tablename/-/inspect` * `/dbname/tablename/-/metadata` This means we will no longer allow databases or tables to have the name `""-""` - I think that's OK We will continue to support rows with a primary key of `""-""` at the following URL: * `/dbname/tablename/-`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/296/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 944326512,MDU6SXNzdWU5NDQzMjY1MTI=,296,"`table.search(..., quote=True)` parameter and `sqlite-utils search --quote` option",32427188,deafmute1,closed,0,,,,,6,2021-07-14T11:26:47Z,2021-08-18T20:13:12Z,2021-08-18T20:10:48Z,NONE,,"Hi, Recently got this error: ``` Traceback (most recent call last): File """", line 1, in File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/__init__.py"", line 38, in start(""/home/ethan/git/music-metadata-indexer/sample"", ""/home/ethan/git/music-metadata-indexer/test.db"") File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/__init__.py"", line 23, in start scanner.build_database() File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/scan.py"", line 79, in build_database _import_song(self.db, Path(dirpath).joinpath(f), self.logger) File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/scan.py"", line 23, in _import_song db.add_song(filepath) File ""/home/ethan/git/music-metadata-indexer/src/mmindexer/index.py"", line 166, in add_song for match in self.search(""albums"", album): File ""/home/ethan/git/music-metadata-indexer/env/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1625, in search cursor = self.db.execute( File ""/home/ethan/git/music-metadata-indexer/env/lib/python3.9/site-packages/sqlite_utils/db.py"", line 243, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: fts5: syntax error near ""."" ``` So, the error seems to suggest there was a ""."" character somewhere in the SQL command that was causing the error. I did a little digging and found this in the docs: https://www.sqlite.org/fts5.html#fts5_strings. ""."" is one of the many prohibited characters. My solution was to just strip these out of the query using this line `query = query.translate({e: None for e in itertools.chain(range(0,26), range(27, 48), range(58,65), range(91,95), [96], range(123,128))})` Perhaps this could be included into the `table.search()` function? ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/296/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 327420945,MDU6SXNzdWUzMjc0MjA5NDU=,297,datasette publish Dockerfile should use python:3.6-slim-stretch,9599,simonw,closed,0,,,,,1,2018-05-29T17:40:08Z,2018-05-31T14:44:37Z,2018-05-31T14:44:37Z,OWNER,,"Right now the Dockerfile generated by `datasette package` and `datasette publish` uses this: https://github.com/simonw/datasette/blob/b0a95da96386ddf99816911e08df86178ffa9a89/datasette/utils.py#L269 This appears to result in a SQLite version of `3.8.7.1` - https://parlgov.datasettes.com/-/versions ``` ""sqlite"": { ""extensions"": {}, ""fts_versions"": [ ""FTS4"", ""FTS3"" ], ""version"": ""3.8.7.1"" } ``` Meanwhile, https://fivethirtyeight.datasettes.com/-/versions is deployed with this Dockerfile https://github.com/simonw/fivethirtyeight-datasette/blob/0849901cae06e957fe04892cd4033bdcd1fcf966/Dockerfile which uses `FROM python:3.6-slim-stretch` and results in the following version report: ``` ""sqlite"": { ""extensions"": { ""json1"": null }, ""fts_versions"": [ ""FTS5"", ""FTS4"", ""FTS3"" ], ""version"": ""3.16.2"" } ``` So not only do we get a more recent SQLite (including https://www.sqlite.org/rowvalue.html added in 3.15) but we also get `FTS5` and `json1` as well. Refs #191 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/297/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 944846776,MDU6SXNzdWU5NDQ4NDY3NzY=,297,Option for importing CSV data using the SQLite .import mechanism,9599,simonw,open,0,,,,,23,2021-07-14T22:36:41Z,2023-09-22T20:49:52Z,,OWNER,,"As seen in https://til.simonwillison.net/sqlite/import-csv - `.mode csv` and then `.import school.csv schools` is hugely faster than importing via `sqlite-utils insert` and doing the work in Python - but it can only be implemented by shelling out to the `sqlite3` CLI tool, it's not functionality that is exposed to the Python `sqlite3` module. An option to use this would be useful - maybe something like this: sqlite-utils insert blah.db blah blah.csv --fast",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/297/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 327459829,MDU6SXNzdWUzMjc0NTk4Mjk=,298,URLify URLs in results from custom SQL statements / views,9599,simonw,closed,0,,,,,2,2018-05-29T19:41:07Z,2018-07-24T04:53:20Z,2018-07-24T03:56:50Z,OWNER,,"Consider this custom query: https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3?sql=select+user%2C+%28%27https%3A%2F%2Ftwitter.com%2F%27+%7C%7C+user%29+as+user_url%2C+created_at%2C+text%2C+url+from+%5Btwitter-ratio%2Fsenators%5D+limit+10%3B ```select user, ('https://twitter.com/' || user) as user_url, created_at, text, url from [twitter-ratio/senators] limit 10;``` ![2018-05-29 at 12 38 pm](https://user-images.githubusercontent.com/9599/40681177-44a36d5c-633d-11e8-935b-c49dad7ac682.png) It would be nice if these URLs were turned into links, as happens on the table view page: https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3/twitter-ratio%2Fsenators ![2018-05-29 at 12 39 pm](https://user-images.githubusercontent.com/9599/40681206-5c69c47c-633d-11e8-9f3a-08899f8659b8.png) This currently does not happen because the table view render logic takes a different path through `display_columns_and_rows()` which includes this bit: https://github.com/simonw/datasette/blob/b0a95da96386ddf99816911e08df86178ffa9a89/datasette/views/table.py#L195-L202",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/298/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 951581763,MDU6SXNzdWU5NTE1ODE3NjM=,298,Read lines with JSON object,2172260,qqilihq,closed,0,,,,,2,2021-07-23T13:28:52Z,2021-08-03T06:50:47Z,2021-08-02T21:55:16Z,NONE,,"I found this posted on HN a while ago and love it -- thank you! As a minor improvement, it would be great to have the ability to parse a file with line-separated JSON objects. Currently the parser obviously requires an array wrapping all these objects.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/298/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 327461381,MDU6SXNzdWUzMjc0NjEzODE=,299,Documentation covering ALL datasette URLs,9599,simonw,closed,0,,,,,1,2018-05-29T19:46:15Z,2018-07-28T04:24:05Z,2018-07-28T04:22:30Z,OWNER,,"Relates to #296. We need a single page of the docs listing all of the URL patterns Datasette responds to, also detailing which templates are used to render them and linking to examples of the JSON they output when called with `.json`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/299/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 952154468,MDU6SXNzdWU5NTIxNTQ0Njg=,299,Ability to see just specific table schemas with `sqlite-utils schema`,9599,simonw,closed,0,,,,,1,2021-07-24T22:00:05Z,2021-07-24T22:12:01Z,2021-07-24T22:08:46Z,OWNER,,"It currently accepts no arguments. Allowing for optional arguments specifying tables would be useful: sqlite-utils schema fixtures.db facetable searchable ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/299/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 327541975,MDU6SXNzdWUzMjc1NDE5NzU=,300,Hide sort select box on larger screens,9599,simonw,closed,0,,,,,0,2018-05-30T01:34:59Z,2018-05-31T14:43:13Z,2018-05-31T14:43:13Z,OWNER,,"I'm larger screens you can sort by clicking column headers, so no need to show the select box (which was added for the small screen layout that doesn't show headers)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/300/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 956832836,MDU6SXNzdWU5NTY4MzI4MzY=,300,Returning underlying cause for User Defined Functions ,71236,wsargent,closed,0,,,,,1,2021-07-30T15:08:21Z,2021-08-02T21:53:50Z,2021-08-02T21:53:50Z,NONE,,"The sqlite3 client takes user defined functions and replaces the text with ""user-defined function raised exception`"" so it's not apparent what's gone wrong: ``` Unexpected error: user-defined function raised exception ``` As mentioned in https://code.djangoproject.com/ticket/29500 and https://stackoverflow.com/questions/45824209/how-to-get-an-error-kind-from-sqlite-create-function/45834923#45834923 the workaround for this is to enable callback tracebacks: ``` sqlite3.enable_callback_tracebacks(True) ``` It would be nice if https://sqlite-utils.datasette.io/en/stable/python-api.html#registering-custom-sql-functions either included a reference to `enable_callback_tracebacks` or if registering a user defined function set this flag automatically.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/300/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 328155946,MDU6SXNzdWUzMjgxNTU5NDY=,301,"--spatialite option for ""datasette publish heroku""",9599,simonw,open,0,,,,,1,2018-05-31T14:13:09Z,2022-01-20T21:28:50Z,,OWNER,,Split off from #243. Need to figure out how to install and configure SpatiaLite on Heroku.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/301/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 957383814,MDU6SXNzdWU5NTczODM4MTQ=,301,insert-files should get a --silent option,9599,simonw,closed,0,,,,,0,2021-08-01T04:11:03Z,2021-08-02T19:12:21Z,2021-08-02T19:12:21Z,OWNER,,"The new `sqlite-utils convert` command I'm adding in #251 will have a `--silent` option for turning off the progress bars. The only other command that has progress bars right now is `insert-files` so it should get this option too, for consistency.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/301/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 328171513,MDU6SXNzdWUzMjgxNzE1MTM=,302,test-2.3.sqlite database filename throws a 404,9599,simonw,closed,0,,,3439337,0.23.1,2,2018-05-31T14:50:58Z,2018-06-21T15:21:17Z,2018-06-21T15:21:16Z,OWNER,,"The following almost works: datasette test-2.3.sqlite http://127.0.0.1:8001test-2.3-c88bc35/HighWays loads OK, but http://127.0.0.1:8001test-2.3-c88bc35 throws a 404: ![2018-05-31 at 7 50 am](https://user-images.githubusercontent.com/9599/40789434-447ae934-64a7-11e8-9a07-4eeba87147d5.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/302/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957529248,MDU6SXNzdWU5NTc1MjkyNDg=,302,Python library version of `sqlite-utils convert`,9599,simonw,closed,0,9599,simonw,,,1,2021-08-01T16:11:02Z,2021-08-02T04:47:40Z,2021-08-02T04:47:40Z,OWNER,,"Spin off from #251. The ability to execute Python functions to convert and split columns should be part of the library too, not just the CLI.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/302/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 328172521,MDU6SXNzdWUzMjgxNzI1MjE=,303,Support table names ending with .json or .csv,9599,simonw,closed,0,,,,,4,2018-05-31T14:53:23Z,2018-06-15T06:55:50Z,2018-06-15T06:55:50Z,OWNER,,"This is needed for #266 - if a table name ends with `.json` or `.csv` right now our URL pattern matching will do the wrong thing. We should be smarter about this. This does mean we will have some URLs that look like this: http://localhost:8001/dbname/weird.json - returning HTML, not JSON http://localhost:8001/dbname/weird.json.json - returning JSON http://localhost:8001/dbname/weird.json.csv - returning CSV ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/303/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957536983,MDExOlB1bGxSZXF1ZXN0NzAwOTQ0NjQ0,303,sqlite-utils convert command and db[table].convert(...) method,9599,simonw,closed,0,,,,,1,2021-08-01T16:52:42Z,2021-08-02T04:47:42Z,2021-08-02T04:47:39Z,OWNER,simonw/sqlite-utils/pulls/303,"Refs #251, #302. - [x] Get recipes working - [x] Document recipes - [x] Implement `db[table].convert(...)` method - [x] Add tests for recipes that use the new Python method - [x] Implement `db[table].convert(..., multi=True)` mechanism - [x] Documentation for `db[table].convert(...)` - [x] Refactor `sqlite-utils convert` to use the new method",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/303/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 328229224,MDU6SXNzdWUzMjgyMjkyMjQ=,304,Ability to configure SQLite cache_size,9599,simonw,closed,0,,,,,3,2018-05-31T17:28:07Z,2018-06-04T16:13:32Z,2018-06-04T16:03:19Z,OWNER,,"See https://www.sqlite.org/pragma.html#pragma_cache_size Let's call the config setting `cache_size_kb` to emphasize that we're using the negative option. Note this warning: perhaps we should raise an error if you try to use this setting against a SQLite version prior to 3.7.10 > If the argument N is positive then the suggested cache size is set to N. If the argument N is negative, then the number of cache pages is adjusted to use approximately abs(N*1024) bytes of memory. Backwards compatibility note: The behavior of cache_size with a negative N was different in prior to version 3.7.10 (2012-01-16). In version 3.7.9 and earlier, the number of pages in the cache was set to the absolute value of N.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/304/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957731178,MDU6SXNzdWU5NTc3MzExNzg=,304,"`table.convert(..., where=)` and `sqlite-utils convert ... --where=`",9599,simonw,closed,0,,,,,3,2021-08-02T04:27:23Z,2021-08-02T19:00:00Z,2021-08-02T18:58:10Z,OWNER,,"For applying the conversion to a subset of rows selected using the where clause. Should also take optional arguments, as seen in `db[""dogs""].delete_where(""age < ?"", [3])`. Follows #302 and #251. This was originally https://github.com/simonw/sqlite-transform/issues/9",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/304/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 329147284,MDU6SXNzdWUzMjkxNDcyODQ=,305,Add contributor guidelines to docs,9599,simonw,closed,0,,,,,2,2018-06-04T17:25:30Z,2019-06-24T06:40:19Z,2019-06-24T06:40:19Z,OWNER,,https://channels.readthedocs.io/en/latest/contributing.html is a nice example of this done well.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/305/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 957741820,MDU6SXNzdWU5NTc3NDE4MjA=,305,Python: need a way to execute a count with an extra where clause,9599,simonw,closed,0,,,,,1,2021-08-02T04:52:02Z,2021-08-02T05:08:22Z,2021-08-02T05:08:22Z,OWNER,,I need this for #304. I'll probably add this to the `.execute_count()` method as `where=` and `where_args=`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/305/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 329661905,MDU6SXNzdWUzMjk2NjE5MDU=,306,Custom URL routing with independent tests,9599,simonw,closed,0,,,,,5,2018-06-05T23:40:08Z,2018-06-07T15:29:28Z,2018-06-07T15:29:28Z,OWNER,,"The more I think about #303 the more I feel that Datasette's URL routing needs go beyond Django-style regex matching. If we go custom, tests should live in `test_routing.py`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/306/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 958516743,MDU6SXNzdWU5NTg1MTY3NDM=,306,Configure sphinx.ext.extlinks for issues,9599,simonw,closed,0,,,,,2,2021-08-02T21:19:19Z,2021-08-02T21:39:34Z,2021-08-02T21:29:22Z,OWNER,,As seen in Datasette: https://github.com/simonw/datasette/issues/1227,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/306/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 330323860,MDExOlB1bGxSZXF1ZXN0MTkzMzYxMzQx,307,"Initial sketch of custom URL routing, refs #306",9599,simonw,closed,0,,,,,1,2018-06-07T15:26:48Z,2018-06-07T15:29:54Z,2018-06-07T15:29:41Z,OWNER,simonw/datasette/pulls/307,See #306 for background on this.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/307/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 959305209,MDU6SXNzdWU5NTkzMDUyMDk=,307,codespell to spell check documentation,9599,simonw,closed,0,,,,,0,2021-08-03T16:48:19Z,2021-08-03T16:48:53Z,2021-08-03T16:48:53Z,OWNER,,As seen in https://github.com/simonw/datasette/issues/1417 and https://til.simonwillison.net/python/codespell,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/307/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 330826972,MDU6SXNzdWUzMzA4MjY5NzI=,308,"Support extra Heroku apps:create options - region, space, team",78156,annapowellsmith,open,0,,,,,2,2018-06-08T23:08:33Z,2018-09-21T14:09:28Z,,NONE,,"It would be useful to document how to pass Heroku CLI options on `datasette publish`, e.g. `--region eu`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/308/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 961008507,MDU6SXNzdWU5NjEwMDg1MDc=,308,Add an interactive tutorial as a Jupyter notebook,9599,simonw,open,0,,,,,2,2021-08-04T20:34:22Z,2021-08-04T21:30:59Z,,OWNER,,Can show people how to open this up in Binder.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/308/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 331343824,MDU6SXNzdWUzMzEzNDM4MjQ=,309,On 404s with a trailing slash redirect to that page without a trailing slash,9599,simonw,closed,0,,,3439337,0.23.1,2,2018-06-11T20:46:49Z,2018-06-21T15:22:02Z,2018-06-21T15:13:15Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/309/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 963897111,MDU6SXNzdWU5NjM4OTcxMTE=,309,"sqlite-utils insert errors should show SQL and parameters, if possible",16622642,scaleoutsean,closed,0,,,,,6,2021-08-09T11:24:14Z,2021-08-09T23:40:29Z,2021-08-09T22:25:58Z,NONE,,"I've tried several approaches, but this is the current one: ```sh echo $json-line | sqlite-utils insert json.db jsontable --truncate --alter --detect-types - ``` In all cases, I get this error: ```sh OverflowError: Python int too large to convert to SQLite INTEGER Traceback (most recent call last): File ""/home/sean/.local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/usr/lib/python3/dist-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/usr/lib/python3/dist-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/usr/lib/python3/dist-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/lib/python3/dist-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/lib/python3/dist-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 841, in insert insert_upsert_implementation( File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/cli.py"", line 780, in insert_upsert_implementation db[table].insert_all( File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/db.py"", line 2145, in insert_all self.insert_chunk( File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1957, in insert_chunk result = self.db.execute(query, params) File ""/home/sean/.local/lib/python3.8/site-packages/sqlite_utils/db.py"", line 257, in execute return self.conn.execute(sql, parameters) ``` I googled the error and checked SO answers and advice, all good. I changed my JSON file to not use integers so I no longer get this error. Of course, that makes using the database a bit harder, so I also tried to solve the problem by modifying DB structure (while using integers in JSON). If change all `INTEGER` Data Types to something else (`STRING`, `TEXT`) and try to import again using `--truncate`, I still get this error. I suppose I should tell sqlite-utils which columns should use non-INTEGER Data Type rather than rely on it to check my SQL table configuration. If that is the case, can this error be a bit more specific for easier troubleshooting - maybe tell us which which record caused the problem when that error is thrown? My table has 60+ columns, many of which use 64-bit integers (not all records are large or known in advance), so while I can modify JSON to use strings instead of integers, it decreases usability and finding out which records have values for which SQLite integers aren't sufficient requires some work (I'm thinking about parsing all integers with `jq` and sorting output by length to identify those columns, but I'd prefer if sqlite-utils could tell me that). My environment: - Python 3.8.10 - sqlite-utils 3.14 - pandas 1.3.1 - numpy 1.21.1 - sqlite-fts4 1.0.1 - sqlite 3.31.1-4ubuntu0.2 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/309/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 332830309,MDU6SXNzdWUzMzI4MzAzMDk=,310,datasette publish now is broken in master,9599,simonw,closed,0,,,,,0,2018-06-15T16:01:14Z,2018-06-16T16:29:50Z,2018-06-16T16:29:50Z,OWNER,,"``` > gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/usr/local/include/python3.6m -c httptools/parser/parser.c -o build/temp.linux-x86_64-3.6/httptools/parser/parser.o -O2 > unable to execute 'gcc': No such file or directory > error: command 'gcc' failed with exit status 1 > > ---------------------------------------- > Command ""/usr/local/bin/python -u -c ""import setuptools, tokenize;__file__='/tmp/pip-install-s73273rj/httptools/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))"" install --record /tmp/pip-record-yha7dxqq/install-record.txt --single-version-externally-managed --compile"" failed with error code 1 in /tmp/pip-install-s73273rj/httptools/ ``` Turns out the `python-slim` base image I introduced in b18e4515855c3f1eeca3dfcccdbb6df05869084a doesn't include gcc: https://github.com/docker-library/python/issues/60#issuecomment-134322383",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/310/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 964400482,MDU6SXNzdWU5NjQ0MDA0ODI=,310,`sqlite-utils insert --flatten` option to flatten nested JSON,9599,simonw,closed,0,,,,,3,2021-08-09T21:23:08Z,2021-10-16T13:54:56Z,2021-08-09T21:44:06Z,OWNER,,"I had to do this with a `jq` recipe today: https://til.simonwillison.net/cloudrun/tailing-cloud-run-request-logs ``` cat log.json | jq -c '[leaf_paths as $path | { ""key"": $path | join(""_""), ""value"": getpath($path) }] | from_entries' \ | sqlite-utils insert /tmp/logs.db logs - --nl --alter --batch-size 1 ``` That was to turn something like this: ```json { ""httpRequest"": { ""latency"": ""0.112114537s"", ""requestMethod"": ""GET"", ""requestSize"": ""534"", ""status"": 200, }, ""insertId"": ""6111722f000b5b4c4d4071e2"", ""labels"": { ""service"": ""datasette-io"" } } ``` Into this instead: ```json { ""httpRequest_latency"": ""0.112114537s"", ""httpRequest_requestMethod"": ""GET"", ""httpRequest_requestSize"": ""534"", ""httpRequest_status"": 200, ""insertId"": ""6111722f000b5b4c4d4071e2"", ""labels_service"": ""datasette-io"" } ``` I have to do this often enough that I think it should be an option, `--flatten` - so I can do this instead: ``` cat log.json | sqlite-utils insert /tmp/logs.db logs - --flatten ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/310/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 332998752,MDExOlB1bGxSZXF1ZXN0MTk1MzM5MTEx,311,"?_labels=1 to expand foreign keys (in csv and json), refs #233",9599,simonw,closed,0,,,,,2,2018-06-16T16:31:12Z,2018-06-16T22:20:31Z,2018-06-16T22:20:31Z,OWNER,simonw/datasette/pulls/311,"Output looks something like this: { ""rowid"": 233, ""TreeID"": 121240, ""qLegalStatus"": { ""value"" 2, ""label"": ""Private"" } ""qSpecies"": { ""value"": 16, ""label"": ""Sycamore"" } ""qAddress"": ""91 Commonwealth Ave"", ... }",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/311/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 965102534,MDU6SXNzdWU5NjUxMDI1MzQ=,311,Add reference documentation generated from docstrings,9599,simonw,closed,0,,,,,4,2021-08-10T16:04:00Z,2021-08-11T12:03:50Z,2021-08-11T12:03:50Z,OWNER,,"Using https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html I'm not a big fan of this kind of documentation because it so often comes in place of narrative documentation - but the library has great narrative documentation now, so the reference documentation can link to it in places. This will also encourage me to add good docstrings everywhere, useful for IDEs and suchlike.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/311/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 333000163,MDU6SXNzdWUzMzMwMDAxNjM=,312,"HTML, CSV and JSON views should support ?_col=&_col=",9599,simonw,closed,0,,,,,1,2018-06-16T16:53:35Z,2021-06-17T18:14:24Z,2018-06-16T17:00:12Z,OWNER,,To support whitelisting columns to display.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/312/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 965143346,MDExOlB1bGxSZXF1ZXN0NzA3NDkwNzg5,312,Add reference page to documentation using Sphinx autodoc,9599,simonw,closed,0,,,,,10,2021-08-10T16:59:17Z,2021-08-10T23:09:32Z,2021-08-10T23:09:28Z,OWNER,simonw/sqlite-utils/pulls/312,Refs #311.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/312/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 333086005,MDU6SXNzdWUzMzMwODYwMDU=,313,Deploy demo of Datasette on every commit that passes tests,9599,simonw,closed,0,,,,,6,2018-06-17T19:19:12Z,2018-06-17T21:52:58Z,2018-06-17T21:52:58Z,OWNER,,We can use Travis CI and Zeit Now to ensure there is always a live demo of current master. We can ship archived demos for releases as well.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/313/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 965166058,MDU6SXNzdWU5NjUxNjYwNTg=,313,`.add_foreign_keys()` doesn't reject being called with a View,9599,simonw,closed,0,,,,,0,2021-08-10T17:22:17Z,2021-08-10T17:25:34Z,2021-08-10T17:25:34Z,OWNER,,"Spotted this bug using `mypy` while working on #311 / #312! ``` % mypy sqlite_utils sqlite_utils/db.py:725: error: Item ""View"" of ""Union[Table, View]"" has no attribute ""foreign_keys"" Found 1 error in 1 file (checked 5 source files) ``` Refers to this code: https://github.com/simonw/sqlite-utils/blob/c11ff89894727270d4a9eb554d3a006f5b0d8d9d/sqlite_utils/db.py#L710-L720 It's a bug! We run some checks earlier but none of them ensure that it's a view: https://github.com/simonw/sqlite-utils/blob/c11ff89894727270d4a9eb554d3a006f5b0d8d9d/sqlite_utils/db.py#L697-L709",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/313/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 333096176,MDU6SXNzdWUzMzMwOTYxNzY=,314,HTML table does not correctly display entirely blank rows,9599,simonw,closed,0,,,3439337,0.23.1,1,2018-06-17T21:58:06Z,2018-06-21T16:04:59Z,2018-06-21T15:26:26Z,OWNER,,"https://958b75c.datasette.io/fixtures-35b6eb6/simple_view ![2018-06-17 at 2 56 pm](https://user-images.githubusercontent.com/9599/41512541-b52e90be-723e-11e8-95c9-7d091738d5cc.png) https://958b75c.datasette.io/fixtures-35b6eb6/simple_view.json shows the underlying data: ``` ""rows"": [ [ ""hello"", ""HELLO"" ], [ ""world"", ""WORLD"" ], [ """", """" ] ] ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/314/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 965210966,MDU6SXNzdWU5NjUyMTA5NjY=,314,Type signatures for `.create_table()` and `.create_table_sql()` and `.create()` and `Table.__init__`,9599,simonw,closed,0,,,,,2,2021-08-10T18:03:59Z,2021-08-18T22:25:21Z,2021-08-18T22:25:21Z,OWNER,,"> Adding type signatures to `create_table()` and `.create_table_sql()` is a bit too involved, I'll do that in a separate issue. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/312#issuecomment-896200682_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/314/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 333120982,MDExOlB1bGxSZXF1ZXN0MTk1NDEzMjQx,315,Streaming mode for downloading all rows as a CSV,9599,simonw,closed,0,,,,,0,2018-06-18T03:06:59Z,2018-06-18T03:29:13Z,2018-06-18T03:21:02Z,OWNER,simonw/datasette/pulls/315,Refs #266,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/315/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 965440017,MDU6SXNzdWU5NjU0NDAwMTc=,315,`.delete_where()` returns `[]` when it should return self,9599,simonw,closed,0,,,,,1,2021-08-10T21:54:55Z,2021-08-10T23:09:29Z,2021-08-10T23:09:29Z,OWNER,,"If the table doesn't exist it should still return `self`, not `[]`: https://github.com/simonw/sqlite-utils/blob/ee469e3122d6f5973ec2584c1580d930daca2e7c/sqlite_utils/db.py#L1676-L1683 Spotted with `mypy` while working on #312.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/315/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 333238932,MDU6SXNzdWUzMzMyMzg5MzI=,316,datasette inspect takes a very long time on large dbs,132230,gavinband,closed,0,,,,,5,2018-06-18T11:56:27Z,2019-05-11T18:26:25Z,2019-05-11T18:26:25Z,NONE,,"Hi, I want to expose data in a very large sqlite database (~600Gb) to the web. I have used datasette with success on smaller test databases with the same schema - it works very well (thanks!). However, using the full db, both `datasette inspect` and `datasette serve` seem to hang or pause for a very long time (tens of minutes) on startup. Is this expected behaviour? (I noticed that the output of `datasette inspect` includes row counts for each table. Simply counting the rows in this db will take a long time (tens of millions of rows across each of ~10 tables), so I wondered if this is the source of the problem.) Any help on a workaround would be appreciated. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/316/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 970320615,MDU6SXNzdWU5NzAzMjA2MTU=,316,Fix visible backticks on reference page,9599,simonw,closed,0,,,,,1,2021-08-13T11:37:46Z,2021-08-14T05:12:23Z,2021-08-14T05:10:48Z,OWNER,,"https://sqlite-utils.datasette.io/en/latest/reference.html Search for backtick to reveal various minor markup bugs.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/316/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 333326107,MDU6SXNzdWUzMzMzMjYxMDc=,317,Travis CI fails to upload new releases to PyPI,9599,simonw,closed,0,,,3439337,0.23.1,2,2018-06-18T15:44:26Z,2018-06-21T15:45:47Z,2018-06-21T15:45:47Z,OWNER,,"https://travis-ci.org/simonw/datasette/jobs/393684139 ``` ... removing build/bdist.linux-x86_64/wheel Uploading distributions to https://upload.pypi.org/legacy/ Uploading datasette-0.23-py3-none-any.whl 100%|██████████| 201k/201k [00:00<00:00, 1.02MB/s] HTTPError: 403 Client Error: Invalid or non-existent authentication information. for url: https://upload.pypi.org/legacy/ ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/317/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 972827346,MDU6SXNzdWU5NzI4MjczNDY=,317,Link to a better example on docs index,9599,simonw,closed,0,,,,,1,2021-08-17T15:43:40Z,2021-08-18T18:31:43Z,2021-08-18T18:31:43Z,OWNER,,https://github.com/simonw/sqlite-utils/blob/7a19822ac9ee24be2fbb4c2326a0bf2f3d2d9c4d/docs/index.rst#L39 Is a very old example,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/317/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 334148669,MDU6SXNzdWUzMzQxNDg2Njk=,318,Facets with value of 0 displayed incorrectly,9599,simonw,closed,0,,,3439337,0.23.1,1,2018-06-20T16:06:46Z,2019-05-29T21:39:12Z,2018-06-21T04:30:45Z,OWNER,,"https://registry.datasette.io/registry-7d4f81f/tables?_facet=is_hidden#facet-is_hidden ![2018-06-20 at 9 05 am](https://user-images.githubusercontent.com/9599/41670448-2c06e642-7469-11e8-86be-4664269582b1.png) Displays correctly if you select it: https://registry.datasette.io/registry-7d4f81f/tables?_facet=is_hidden&is_hidden=0 ![2018-06-20 at 9 06 am](https://user-images.githubusercontent.com/9599/41670471-3e61e486-7469-11e8-8710-5da90ef65787.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/318/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 974067156,MDU6SXNzdWU5NzQwNjcxNTY=,318,Research: handle gzipped CSV directly,9599,simonw,open,0,,,,,2,2021-08-18T21:23:04Z,2021-08-18T21:25:30Z,,OWNER,,"Would it be worthwhile for the `sqlite-utils` command-line tool to grow features to efficiently directly interact with gzipped CSV data? Maybe add `--gz` options to both `insert` and to the various commands that output query results.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/318/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 334149717,MDU6SXNzdWUzMzQxNDk3MTc=,319,Incorrect display of compound primary keys with foreign key relationships,9599,simonw,closed,0,,,3439337,0.23.1,2,2018-06-20T16:09:36Z,2018-06-21T15:58:15Z,2018-06-21T14:56:41Z,OWNER,,"https://registry.datasette.io/registry-7d4f81f/datasette_tags ![2018-06-20 at 9 07 am](https://user-images.githubusercontent.com/9599/41670542-68cc4dec-7469-11e8-9521-3bbc6465eccb.png) Underlying JSON looks [like this](https://registry.datasette.io/registry-7d4f81f/datasette_tags.json?_labels=on): ``` { ""database"": ""registry"", ""table"": ""datasette_tags"", ""is_view"": false, ""human_description_en"": """", ""rows"": [ { ""datasette_id"": { ""value"": 1, ""label"": ""Global Power Plant Database"" }, ""tag"": { ""value"": ""geospatial"", ""label"": ""geospatial"" } }, ```` Bug is likely somewhere in here: https://github.com/simonw/datasette/blob/e04f5b0d348ef7275a0a5ab9eb53527105132885/datasette/views/table.py#L143-L207",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/319/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 976399638,MDU6SXNzdWU5NzYzOTk2Mzg=,319,[Enhancement] Please allow 'insert-files' to insert content as text.,66709385,pjamargh,closed,0,,,,,10,2021-08-22T15:10:46Z,2021-08-24T23:33:45Z,2021-08-24T23:33:44Z,NONE,,"'insert-files' creates BLOB columns for file contents. Transforming the column to TEXT still keep the content as binary. Even though I'm sure there is a transform that can be applied decoding the text it would be great to have a argument to make 'insert-files' to do it as text (with optional text encoding). The use case is a bunch of htmls (single file) on a directory structure that inserted with this command could be served in Datasette allowing full text search.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/319/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 334169932,MDU6SXNzdWUzMzQxNjk5MzI=,320,Need unit tests covering the different states for the advanced export box,9599,simonw,closed,0,,,,,1,2018-06-20T17:03:40Z,2018-07-24T04:53:20Z,2018-07-24T03:38:40Z,OWNER,,"There are quite a few variants of this box: ![2018-06-20 at 10 02 am](https://user-images.githubusercontent.com/9599/41673229-1d423adc-7471-11e8-99d4-4251f7d03aa5.png) Test coverage should exercise all of them, since the logic is a little unclear. https://github.com/simonw/datasette/blob/fdfbbbb9ee0d02fd4d43dfc42382252fa2287d6d/datasette/templates/table.html#L140-L159",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/320/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 976405225,MDU6SXNzdWU5NzY0MDUyMjU=,320,sqlite-utils memory --analyze option,9599,simonw,closed,0,,,,,2,2021-08-22T15:37:10Z,2021-08-22T15:46:56Z,2021-08-22T15:44:29Z,OWNER,,To provide a way of running [analyze-tables](https://sqlite-utils.datasette.io/en/stable/cli.html#analyzing-tables) directly against JSON or CSV data.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/320/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 334190959,MDU6SXNzdWUzMzQxOTA5NTk=,321,Wildcard support in query parameters,12617395,bsilverm,closed,0,,,3439337,0.23.1,8,2018-06-20T18:03:56Z,2018-06-21T17:00:10Z,2018-06-21T04:55:26Z,NONE,,"I haven't found a way to get the wildcard (%) inserted automatically in to a query parameter. This would be useful for cases the query parameter is followed by a LIKE clause. Wrapping the parameter name using the wildcard character within the metadata file (ie - ...where xyz like %:querystring%) does not seem to work. Can this be made possible? Or if not, can the template be extended to provide a tip to the user that they need to insert the wildcard characters themselves?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/321/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 978537855,MDExOlB1bGxSZXF1ZXN0NzE5MTA5NzA5,321,"Ability to insert file contents as text, in addition to blob",9599,simonw,closed,0,,,,,5,2021-08-24T22:37:18Z,2021-08-24T23:31:17Z,2021-08-24T23:31:13Z,OWNER,simonw/sqlite-utils/pulls/321,Refs #319.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/321/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 334592281,MDExOlB1bGxSZXF1ZXN0MTk2NTI2ODYx,322,Feature/in operator,2691848,4e1e0603,closed,0,,,,,0,2018-06-21T17:41:51Z,2018-06-21T17:45:25Z,2018-06-21T17:45:25Z,NONE,simonw/datasette/pulls/322,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/322/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 979612115,MDExOlB1bGxSZXF1ZXN0NzE5OTk4MjI1,322,Add dict type to be mapped as TEXT in sqllite,2496189,minaeid90,closed,0,,,,,1,2021-08-25T20:54:26Z,2021-11-15T00:27:40Z,2021-11-15T00:27:40Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/322,"the library deal with Postgres type jsonb as dictionary, add dict type as a TEXT for mapping to sqlite ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/322/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 334698969,MDU6SXNzdWUzMzQ2OTg5Njk=,323,Speed up Travis CI builds,9599,simonw,closed,0,,,,,1,2018-06-21T23:55:27Z,2018-07-10T15:03:37Z,2018-07-10T15:03:36Z,OWNER,,"They've got a bit slow. Part of this is the Zeit Now deploy, but the build-and-test cycle is taking at least a couple of minutes. ![2018-06-21 at 4 54 pm](https://user-images.githubusercontent.com/9599/41751010-e48c823e-7573-11e8-88f3-7aa8a7e53917.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/323/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 979627285,MDU6SXNzdWU5Nzk2MjcyODU=,323,`table.convert()` method should clean up after itself,9599,simonw,closed,0,,,,,1,2021-08-25T21:15:39Z,2021-08-25T21:25:26Z,2021-08-25T21:25:18Z,OWNER,,"It currently works like this: https://github.com/simonw/sqlite-utils/blob/77c240df56068341561e95e4a412cbfa24dc5bc7/sqlite_utils/db.py#L2177-L2195 It's registering a function called `convert_value()` and then failing to de-register that function once it has finished. It might even be possible for two queries running against the same connection to clobber each other's `convert_value()` functions, leading to incorrect behaviour. So two fixes: firstly it should register the function with a unique name (maybe add a random suffix). Secondly, it should de-register that function once it has finished.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/323/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 334731076,MDExOlB1bGxSZXF1ZXN0MTk2NjI4MzA0,324,Speed up Travis by reusing pip wheel cache across builds,9599,simonw,closed,0,,,,,0,2018-06-22T03:20:08Z,2018-06-24T01:03:47Z,2018-06-24T01:03:47Z,OWNER,simonw/datasette/pulls/324,From https://atchai.com/blog/faster-ci/ - refs #323 ,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/324/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 988013247,MDExOlB1bGxSZXF1ZXN0NzI3MDEyOTk2,324,Use python-dateutil package instead of dateutils,191622,meatcar,closed,0,,,,,1,2021-09-03T18:31:19Z,2021-11-14T23:25:40Z,2021-11-14T23:25:40Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/324,"While working on updating `sqlite-utils` for NixOS/Nixpkgs, I came a cross the following: In 5ec6686153e29ae10d4921a1ad4c841f192f20e2, a new dependency was added on `dateutils` (https://pypi.org/project/dateutils/). I believe this is unintentional, and instead `python-dateutil` (https://pypi.org/project/python-dateutil/) was intended. My reasoning is: - `python-dateutil` is imported here in [recipes.py](https://github.com/simonw/sqlite-utils/blob/5ec6686153e29ae10d4921a1ad4c841f192f20e2/sqlite_utils/recipes.py#L1) - The `mypy` `type-python-dateutil` dependency in [setup.py](https://github.com/simonw/sqlite-utils/blob/5ec6686153e29ae10d4921a1ad4c841f192f20e2/setup.py#L36) - `python-dateutil` is a dependency of `dateutils` as seen in the output in [docs/tutorial.ipynb](https://github.com/simonw/sqlite-utils/blob/77c240df56068341561e95e4a412cbfa24dc5bc7/docs/tutorial.ipynb#L43) Seems like the trailing ""s"" seems to be the source of confusion 😅 I've swapped the dependencies out, hope this helps.",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/324/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 335064777,MDU6SXNzdWUzMzUwNjQ3Nzc=,325,Error on row page if table has slashes in the name and ends in .csv,9599,simonw,closed,0,,,,,1,2018-06-23T03:43:42Z,2018-07-09T17:28:27Z,2018-07-08T05:21:59Z,OWNER,,"https://v0-23-1.datasette.io/fixtures-e14e080/table%252Fwith%252Fslashes.csv/3 > no such table: table%252Fwith%252Fslashes.csv From clicking the row link on https://v0-23-1.datasette.io/fixtures-e14e080/table%2Fwith%2Fslashes.csv",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/325/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 990844088,MDU6SXNzdWU5OTA4NDQwODg=,325,sqlite-utils memory can't deal with multiple files with the same name,144773,karlb,closed,0,,,,,4,2021-09-08T08:14:42Z,2021-09-22T20:52:56Z,2021-09-22T20:45:45Z,NONE,,"When I use multiple files with the same name, e.g. in `sqlite-utils memory a/bug.csv b/bug.csv`, sqlite-utils creates invalid views. ``` Traceback (most recent call last): File ""/home/karl/.local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 1299, in memory db[csv_table].transform(types=tracker.types) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1287, in transform self.db.execute(sql) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/sqlite_utils/db.py"", line 421, in execute return self.conn.execute(sql) sqlite3.OperationalError: error in view t1: no such table: main.bug ``` This can be reproduced with ```sh #!/bin/bash mkdir foo mkdir bar echo -e 'col1,col2\nval1,val2' > foo/bug.csv echo -e 'col3,col4\nval3,val4' > bar/bug.csv sqlite-utils memory */bug.csv 'SELECT 1' ``` Ideally, the tables would get unique names by including the next path segment until the names are unique. But just making the numbered t* aliases work would be good enough. This problem can of course be worked around by renaming the files, but it would be nice if this case was handled more gracefully. Thanks a lot for this great tool!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/325/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 335141434,MDU6SXNzdWUzMzUxNDE0MzQ=,326,CSV should respect --cors and return cors headers,9599,simonw,closed,0,,,,,1,2018-06-24T00:44:07Z,2021-06-17T18:14:24Z,2018-06-24T00:59:45Z,OWNER,,Otherwise tools like Vega can't load data via CSV.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/326/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 991237645,MDExOlB1bGxSZXF1ZXN0NzI5NzMxNDQx,326,Test against 3.10-dev,9599,simonw,closed,0,,,,,2,2021-09-08T15:01:15Z,2021-10-13T21:49:28Z,2021-10-13T21:49:28Z,OWNER,simonw/sqlite-utils/pulls/326,"This tests against the latest 3.10 RC, https://www.python.org/downloads/release/python-3100rc2/",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/326/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 335200136,MDU6SXNzdWUzMzUyMDAxMzY=,327,Explore if SquashFS can be used to shrink size of packaged Docker containers,9599,simonw,open,0,,,,,4,2018-06-24T18:15:16Z,2022-02-17T23:37:24Z,,OWNER,,"Inspired by this article: https://cldellow.com/2018/06/22/sqlite-parquet-vtable.html#sqlite-database-indexed--squashed https://en.wikipedia.org/wiki/SquashFS is ""a compressed read-only file system for Linux"" - which means it could be a really nice fit for Datasette and its read-only SQLite databases. It would be interesting to explore a Dockerfile recipe that used SquashFS to compress the SQLite database file that was bundled up by `datasette package` and friends.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/327/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1000275035,PR_kwDOCGYnMM4r7n-9,327,Extract expand: Support JSON Arrays,101753,phaer,closed,0,,,,,0,2021-09-19T10:34:30Z,2022-12-29T09:05:36Z,2022-12-29T09:05:36Z,NONE,simonw/sqlite-utils/pulls/327,"Hi, I needed to extract data in JSON Arrays to normalize data imports. I've quickly hacked the following together based on #241 which refers to #239 where you, @simonw, wrote: > Could this handle lists of objects too? That would be pretty amazing - if the column has a [{...}, {...}] list in it could turn that into a many-to-many. They way this works in my work is that many-to-many relationships are created for anything that maps to an dictionary in a list, and many-to-one relations for everything else (assumed to be scalar values). Not sure what the best approach here would be? Are many-to-one relationships are at all useful here? What do you think about this approach? I could try to add it to the cli interface and documentation if wanted. Thanks for this awesome piece of software in any case! :sun_with_face: ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/327/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 336464733,MDU6SXNzdWUzMzY0NjQ3MzM=,328,"Installation instructions, including how to use the docker image",9599,simonw,closed,0,,,,,4,2018-06-28T03:59:33Z,2023-09-05T14:10:39Z,2018-06-28T04:02:10Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/328/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1004613267,I_kwDOCGYnMM474S6T,328,Invalid JSON output when no rows,12752,gravis,closed,0,,,,,3,2021-09-22T18:37:26Z,2021-09-22T20:21:34Z,2021-09-22T20:20:18Z,NONE,,"`sqlite-utils query` generates a JSON output with the result from the query: ```json [{...},{...}] ``` If no rows are returned by the query, I'm expecting an empty JSON array: ```json [] ``` But actually I'm getting an empty string. To be consistent, the output should be `[]` when the request succeeds (return code == `0`).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/328/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 336465018,MDU6SXNzdWUzMzY0NjUwMTg=,329,Travis should push tagged images to Docker Hub for each release,9599,simonw,closed,0,,,,,7,2018-06-28T04:01:31Z,2018-11-05T06:54:10Z,2018-11-05T06:53:28Z,OWNER,,https://sebest.github.io/post/using-travis-ci-to-build-docker-images/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/329/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1005891028,I_kwDOCGYnMM479K3U,329,Rethink approach to [ and ] in column names (currently throws error),9599,simonw,closed,0,,,,,12,2021-09-23T22:14:24Z,2021-11-15T02:57:51Z,2021-11-15T02:57:51Z,OWNER,,"> I think it's best to still keep `[` and `]` out of column names though. Transforming them into `(` and `)` seems reasonable - but should that happen here or in `sqlite-utils`? I think in `sqlite-utils`. _Originally posted by @simonw in https://github.com/simonw/datasette-app/issues/121#issuecomment-926200398_ This is a rethinking of the solution to: - https://github.com/simonw/sqlite-utils/issues/86",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/329/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 336924199,MDU6SXNzdWUzMzY5MjQxOTk=,330,Limit text display in cells containing large amounts of text,82988,psychemedia,closed,0,,,,,4,2018-06-29T09:15:22Z,2018-07-24T04:53:20Z,2018-07-10T16:20:48Z,CONTRIBUTOR,,"The default preview of a database shows all columns (is the row count limited?) which is fine in many cases but can take a long time to load / offer a large overhead if the table is a SpatiaLite table containing geometry columns that include large shapefiles. Would it make sense to have a setting that can limit the amount of text displayed in any given cell in the table preview, or (less useful?) suppress (with notification) the display of overlong columns unless enabled by the user? An issue then arises if a user does want to see all the text in a cell: 1) for a particular cell; 2) for every cell in the table; 3) for all cells in a particular column or columns (I haven't checked but what if a column contains e.g. raw image data? Does this display as raw data? Or can this be rendered in a context aware way as an image preview? I guess a custom template would be one way to do that?)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/330/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed