id,node_id,number,title,user,user_label,state,locked,assignee,assignee_label,milestone,milestone_label,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,repo_label,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 1616347574,I_kwDOJHON9s5gV4G2,1,Initial proof of concept with ChatGPT,9599,simonw,closed,0,,,,,3,2023-03-09T03:44:39Z,2023-03-09T03:51:55Z,2023-03-09T03:51:55Z,MEMBER,,I'm using ChatGPT to figure out enough AppleScript to get at my notes data.,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617962395,I_kwDOJHON9s5gcCWb,10,Include schema in README,9599,simonw,closed,0,,,,,0,2023-03-09T20:38:59Z,2023-03-09T20:48:18Z,2023-03-09T20:48:18Z,MEMBER,,As seen in other tools like https://github.com/simonw/git-history,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1616354999,I_kwDOJHON9s5gV563,2,First working version,9599,simonw,closed,0,,,,,7,2023-03-09T03:53:00Z,2023-03-09T05:10:22Z,2023-03-09T05:10:22Z,MEMBER,,"It's going to shell out to `osascript` as seen in: - #1 I'm going with that option because https://appscript.sourceforge.io/status.html warns against the other potential methods: > Apple eliminated its Mac Automation department in 2016. The future of AppleScript and its related technologies is unclear. Caveat emptor. But `osascript` looks pretty stable to me.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1616422013,I_kwDOJHON9s5gWKR9,3,`apple-notes-to-sqlite --dump` option,9599,simonw,closed,0,,,,,0,2023-03-09T05:05:49Z,2023-03-09T05:06:14Z,2023-03-09T05:06:14Z,MEMBER,,"Option that doesn't write to the database at all, it just outputs all the notes to stdout as newline-delimited JSON.",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617769847,I_kwDOJHON9s5gbTV3,7,Folder support,9599,simonw,closed,0,,,,,6,2023-03-09T18:21:33Z,2023-03-09T20:48:18Z,2023-03-09T20:48:18Z,MEMBER,,Notes can live in folders. These relationships should be exported too.,611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1617823309,I_kwDOJHON9s5gbgZN,8,Increase performance using macnotesapp,41546558,RhetTbull,closed,0,,,,,1,2023-03-09T18:51:05Z,2023-03-14T22:00:22Z,2023-03-14T22:00:21Z,NONE,,"Neat project! You can probably increase performance using my python interface to Notes, [macnotesapp](https://github.com/RhetTbull/macnotesapp), which uses Scripting Bridge and bulk queries for much better performance than AppleScript. Another related project is [PyXA](https://github.com/SKaplanOfficial/PyXA) which uses Scripting Bridge to access Notes (and many other apps) and can return all the notes at once as opposed to calling AppleScript for each note. macnotesapp allows you to access multiple accounts and folders as well. ```python from macnotesapp import NotesApp # NotesApp() provides interface to Notes.app notesapp = NotesApp() # Get list of notes (Note objects for each note) notes = notesapp.notes() note = notes[0] print( note.id, note.account, note.folder, note.name, note.body, note.plaintext, note.password_protected, ) print(note.asdict()) ```",611552758,apple-notes-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 689800307,MDU6SXNzdWU2ODk4MDAzMDc=,1,Add an index on the timestamp column,9599,simonw,closed,0,,,,,0,2020-09-01T04:33:37Z,2020-09-01T04:49:23Z,2020-09-01T04:49:23Z,MEMBER,,Since default view will likely be ordered by timestamp descending.,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 691557547,MDU6SXNzdWU2OTE1NTc1NDc=,10,Category 3: received,9599,simonw,closed,0,,,,,1,2020-09-03T01:40:36Z,2020-09-03T17:38:51Z,2020-09-03T17:38:51Z,MEMBER,,"A category for things that were sent to me: DMs, emails etc. Follows #7.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 692125110,MDU6SXNzdWU2OTIxMjUxMTA=,11,Public / Private mechanism,9599,simonw,closed,0,,,,,1,2020-09-03T16:47:03Z,2020-09-03T17:33:52Z,2020-09-03T17:33:52Z,MEMBER,,"Some of the data in Dogsheep is stuff that was written publicly - tweets, blog posts, GitHub commits to public repos. Some of it is private data - emails, photos, direct messages, Swarm checkins, commits to private repos. Being able to filter for just one or the other (or both) would be useful. Especially when giving demos!",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 692386625,MDU6SXNzdWU2OTIzODY2MjU=,13,Support advanced FTS queries,9599,simonw,closed,0,,,,,1,2020-09-03T21:29:56Z,2020-09-03T21:40:51Z,2020-09-03T21:40:51Z,MEMBER,,`simon willison NOT screenshot` for example.,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 693318095,MDU6SXNzdWU2OTMzMTgwOTU=,14,On FTS exception rerun the query with quoting,9599,simonw,closed,0,,,,,0,2020-09-04T15:44:18Z,2020-09-05T16:23:01Z,2020-09-05T16:23:01Z,MEMBER,,"Searching for eg `#dogfest` currently throws an FTS exception - but I want to support advanced FTS query tricks as seen in #13. https://dogsheep.simonwillison.net/-/beta?q=%23dogfest > fts5: syntax error near ""#"" Idea: catch that error and re-run the query with FTS escaping applied! ",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 694500679,MDU6SXNzdWU2OTQ1MDA2Nzk=,17,"Rename ""table"" to ""type""",9599,simonw,closed,0,,,,,2,2020-09-06T19:34:41Z,2020-09-09T03:03:22Z,2020-09-09T03:03:22Z,MEMBER,,"I think ""table"" is the wrong name for the concept I'm using it for here. Two reasons: firstly, `table` is a reserved word in SQLite. More importantly, it turns out there's not a direct mapping from tables to types of search result. In particular, for GitHub I ended up having two different ""tables"" of repositories - one for repos created by me, another for repos that I have starred.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 689809225,MDU6SXNzdWU2ODk4MDkyMjU=,2,Apply porter stemming,9599,simonw,closed,0,,,,,2,2020-09-01T04:57:55Z,2020-09-01T20:42:00Z,2020-09-01T20:40:24Z,MEMBER,,This can be on by default. You can turn it off for a table in the config file using `stemming: none` - or maybe `tokenize: none` to match the terminology used by SQLite and `sqlite-utils`: https://sqlite-utils.readthedocs.io/en/stable/python-api.html#enabling-full-text-search,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 703951918,MDU6SXNzdWU3MDM5NTE5MTg=,21,Option to sort search results by date,9599,simonw,closed,0,,,,,0,2020-09-17T22:32:39Z,2020-09-17T22:55:35Z,2020-09-17T22:55:35Z,MEMBER,,"Sometimes I want to sort by date, not by relevance.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 703962917,MDU6SXNzdWU3MDM5NjI5MTc=,22,Bug: UI says sorted by relevance in timeline view,9599,simonw,closed,0,,,,,0,2020-09-17T23:02:07Z,2020-09-17T23:13:14Z,2020-09-17T23:13:14Z,MEMBER,,"In regular timeline view sort defaults to newest, not relevance - so this UI is incorrect: ",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 703970713,MDU6SXNzdWU3MDM5NzA3MTM=,23,Sort option should persist between multiple searches,9599,simonw,closed,0,,,,,0,2020-09-17T23:21:26Z,2020-09-18T22:39:12Z,2020-09-18T22:39:12Z,MEMBER,,Following #21 ,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 703970814,MDU6SXNzdWU3MDM5NzA4MTQ=,24,"the JSON object must be str, bytes or bytearray, not 'Undefined'",9599,simonw,closed,0,,,,,8,2020-09-17T23:21:41Z,2020-09-18T22:33:32Z,2020-09-18T22:33:32Z,MEMBER,,Got this on a search results page.,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 704685890,MDU6SXNzdWU3MDQ2ODU4OTA=,25,template_debug mechanism,9599,simonw,closed,0,,,,,2,2020-09-18T22:11:09Z,2020-09-18T22:12:21Z,2020-09-18T22:12:03Z,MEMBER,,"> I'd prefer it if errors in these template fragments were displayed as errors inline where the fragment should have been inserted, rather than 500ing the whole page - especially since the template fragments are user-provided and could have all kinds of odd errors in them which should be as easy to debug as possible. _Originally posted by @simonw in https://github.com/dogsheep/dogsheep-beta/issues/24#issuecomment-694554584_",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 723861683,MDU6SXNzdWU3MjM4NjE2ODM=,28,Switch to using datasette.client,9599,simonw,closed,0,,,,,1,2020-10-17T22:42:26Z,2020-10-17T23:00:47Z,2020-10-17T23:00:47Z,MEMBER,,"`datasette.client` is designed for this kind of thing, to replace this code: https://github.com/dogsheep/dogsheep-beta/blob/bed9df2b3ef68189e2e445427721a28f4e9b4887/dogsheep_beta/__init__.py#L223-L232",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 689810340,MDU6SXNzdWU2ODk4MTAzNDA=,3,"Datasette plugin to provide custom page for running faceted, ranked searches",9599,simonw,closed,0,,,,,3,2020-09-01T05:00:22Z,2020-09-03T21:01:41Z,2020-09-03T21:01:41Z,MEMBER,,"This will be a page at `/-/beta` which renders using a custom template. It will offer a default timeline view plus search and facet by type/date.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 769282206,MDU6SXNzdWU3NjkyODIyMDY=,30,Upgrade to sqlite-utils 3.0 (tests are failing),9599,simonw,closed,0,,,,,0,2020-12-16T21:25:15Z,2020-12-16T21:27:11Z,2020-12-16T21:27:10Z,MEMBER,,"``` results = beta_db[""search_index""].search(""run"") if use_porter: assert results == [ ( ""dogs.db/dogs"", ""1"", ""Cleo"", ""2020-08-22 04:41:33"", 1, 0, ""running"", None, None, ) ] else: > assert results == [] E assert == [] E + E -[] E Full diff: E - [] E + ``` This was caused by a backwards incompatible change in sqlite-utils 3.0: https://sqlite-utils.readthedocs.io/en/stable/changelog.html#v3-0",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 771316301,MDU6SXNzdWU3NzEzMTYzMDE=,31,"Searching for ""github-to-sqlite"" throws an error",9599,simonw,closed,0,,,,,4,2020-12-19T06:07:20Z,2020-12-19T06:18:07Z,2020-12-19T06:18:07Z,MEMBER,,"https://datasette.io/-/beta?q=github-to-sqlite&sort=relevance&type=blog.db%2Fentries - ""no such column: to""",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919733213,MDU6SXNzdWU5MTk3MzMyMTM=,33,Searching for whitespace throws an error,9599,simonw,closed,0,,,,,0,2021-06-13T06:57:57Z,2021-06-13T14:36:39Z,2021-06-13T14:36:39Z,MEMBER,,"https://datasette.io/-/beta?q=+ returns a 500 > fts5: syntax error near """"",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1817281557,I_kwDOC8SPRc5sUYQV,37,cannot use jinja filters in display?,10352819,rprimet,closed,0,,,,,1,2023-07-23T20:09:54Z,2023-07-23T20:18:27Z,2023-07-23T20:18:26Z,NONE,,"Hi, I'm trying to have a display function in Dogsheep's `config.yml` that includes something like this: ```

{{ display.title }} (source)

{{ display.snippet|safe }}

``` Unfortunately, rendering fails with a message 'urls is undefined'. The same happens if I'm trying to build a row URL manually, using filters like `quote_plus` (as my keys are URLs). Any hints? Thanks!",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 689839399,MDU6SXNzdWU2ODk4MzkzOTk=,4,Optimize the FTS table,9599,simonw,closed,0,,,,,1,2020-09-01T05:58:17Z,2020-09-01T06:10:08Z,2020-09-01T06:10:08Z,MEMBER,,,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 689847361,MDU6SXNzdWU2ODk4NDczNjE=,5,Add a context column that's not searchable,9599,simonw,closed,0,,,,,1,2020-09-01T06:13:42Z,2020-09-03T18:43:50Z,2020-09-03T18:43:50Z,MEMBER,,"I sometimes like to configure titles that are things like ""Comment on issue X"" or ""Photo in Golden Gate Park"" - these shouldn't be included in the search index but should be stored so they can be displayed to provide context. Add a column for this - probably called `context` - and make it so it can be populated.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 691265198,MDU6SXNzdWU2OTEyNjUxOTg=,7,"Mechanism for differentiating between ""by me"" and ""liked by me""",9599,simonw,closed,0,,,,,6,2020-09-02T17:44:37Z,2020-09-02T21:06:28Z,2020-09-02T21:06:28Z,MEMBER,,"Some of the content I'm indexing is by me - photos I've taken, tweets I wrote, commits, comments I posted. Some of it is stuff that I've ""liked"" or ""bookmarked"" in some way - favourited tweets, Pocket articles, starred GitHub repos. It woud be useful to be able to differentiate between the two.",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 691369691,MDU6SXNzdWU2OTEzNjk2OTE=,8,Create a view for running faceted searches,9599,simonw,closed,0,,,,,1,2020-09-02T19:44:07Z,2020-09-02T19:50:47Z,2020-09-02T19:50:47Z,MEMBER,,"```sql select search_index_fts.rank, search_index.rowid, search_index.[table], search_index.key, search_index.title, search_index.timestamp, search_index.search_1 from search_index join search_index_fts on search_index.rowid = search_index_fts.rowid order by search_index_fts.rank, search_index.timestamp desc ```",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 691521965,MDU6SXNzdWU2OTE1MjE5NjU=,9,Mechanism for defining custom display of results,9599,simonw,closed,0,,,,,8,2020-09-03T00:14:07Z,2020-09-03T21:12:14Z,2020-09-03T21:09:55Z,MEMBER,,Part of #3 - in particular I want to make sure my photos are displayed with a thumbnail.,197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 606028272,MDU6SXNzdWU2MDYwMjgyNzI=,10,Speed up hashing step using threads,9599,simonw,closed,0,,,,,0,2020-04-24T04:20:08Z,2020-04-24T04:32:35Z,2020-04-24T04:32:35Z,MEMBER,,"This TODO from the code: https://github.com/dogsheep/photos-to-sqlite/blob/2e7f2c67cc18b02c75bb64992a05b0196e507252/photos_to_sqlite/cli.py#L82-L90",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 606032950,MDU6SXNzdWU2MDYwMzI5NTA=,11,Try running S3 uploads in a thread pool,9599,simonw,closed,0,,,,,0,2020-04-24T04:34:31Z,2020-04-24T16:45:41Z,2020-04-24T16:45:41Z,MEMBER,,"Since #10 provided such a speedup, can the same thing be done for the actual uploads? http://ls.pwd.io/2013/06/parallel-s3-uploads-using-boto-and-threads-in-python/ suggests it can really help performance.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612151767,MDU6SXNzdWU2MTIxNTE3Njc=,15,Expose scores from ZCOMPUTEDASSETATTRIBUTES,9599,simonw,closed,0,,,,,7,2020-05-04T20:36:07Z,2020-12-20T04:44:22Z,2020-05-05T00:11:45Z,MEMBER,,"The Apple Photos database has a `ZCOMPUTEDASSETATTRIBUTES` that looks absurdly interesting... it has calculated scores for every photo: ",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612860531,MDU6SXNzdWU2MTI4NjA1MzE=,17,Only install osxphotos if running on macOS,9599,simonw,closed,0,,,,,3,2020-05-05T20:03:26Z,2020-05-05T20:20:05Z,2020-05-05T20:11:23Z,MEMBER,,The build is broken right now because you can't `pip install osxphotos` on Ubuntu.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 613002220,MDU6SXNzdWU2MTMwMDIyMjA=,19,apple-photos command should work even if upload has not run,9599,simonw,closed,0,,,,,1,2020-05-06T02:02:25Z,2020-05-19T20:59:59Z,2020-05-19T20:59:59Z,MEMBER,,"I want people to be able to query their Apple Photos metadata without having to first run `upload` to upload all of their files to their own S3 bucket. To do this I can have `apple-photos` calculate SHA256 hashes of each photo if the `uploads` table does not yet exist (or does not contain that photo).",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602533352,MDU6SXNzdWU2MDI1MzMzNTI=,2,Ability to convert HEIC images to JPEG,9599,simonw,closed,0,,,5324096,Apple Photos online and securely browsable,1,2020-04-18T19:23:43Z,2020-04-28T16:47:21Z,2020-04-28T16:47:21Z,MEMBER,,,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 613006393,MDU6SXNzdWU2MTMwMDYzOTM=,20,Ability to serve thumbnailed Apple Photo from its place on disk,9599,simonw,closed,0,,,,,10,2020-05-06T02:17:50Z,2020-05-25T20:14:22Z,2020-05-25T20:09:41Z,MEMBER,,"A custom Datasette plugin that can be run locally on a Mac laptop which knows how to serve photos such that they can be seen in the browser. _Originally posted by @simonw in https://github.com/dogsheep/photos-to-sqlite/issues/19#issuecomment-624406285_",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 615474990,MDU6SXNzdWU2MTU0NzQ5OTA=,21,bpylist.archiver.CircularReference: archive has a cycle with uid(13),9599,simonw,closed,0,,,,,11,2020-05-10T20:58:06Z,2020-12-19T07:44:49Z,2020-05-10T21:57:13Z,MEMBER,,"``` % python -i $(which photos-to-sqlite) apple-photos photos.db Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/photoinfo.py"", line 611, in place return self._place # pylint: disable=access-member-before-definition AttributeError: 'PhotoInfo' object has no attribute '_place' During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/bin/photos-to-sqlite"", line 11, in load_entry_point('photos-to-sqlite', 'console_scripts', 'photos-to-sqlite')() File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/photos-to-sqlite/photos_to_sqlite/cli.py"", line 249, in apple_photos photo_row = osxphoto_to_row(sha256, photo) File ""/Users/simon/Dropbox/Development/photos-to-sqlite/photos_to_sqlite/utils.py"", line 91, in osxphoto_to_row place = photo.place File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/photoinfo.py"", line 614, in place self._place = PlaceInfo5(self._info[""reverse_geolocation""]) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py"", line 505, in __init__ self._plrevgeoloc = archiver.unarchive(revgeoloc_bplist) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 16, in unarchive return Unarchive(plist).top_object() File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 256, in top_object return self.decode_object(self.top_uid) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 247, in decode_object obj = klass.decode_archive(ArchivedObject(raw_obj, self)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py"", line 126, in decode_archive mapItem = archive.decode(""mapItem"") File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 140, in decode return self._unarchiver.decode_key(self._object, key) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 216, in decode_key return self.decode_object(val) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 247, in decode_object obj = klass.decode_archive(ArchivedObject(raw_obj, self)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py"", line 180, in decode_archive sortedPlaceInfos = archive.decode(""sortedPlaceInfos"") File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 140, in decode return self._unarchiver.decode_key(self._object, key) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 216, in decode_key return self.decode_object(val) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 247, in decode_object obj = klass.decode_archive(ArchivedObject(raw_obj, self)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 112, in decode_archive return [archive._decode_index(index) for index in uids] File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 112, in return [archive._decode_index(index) for index in uids] File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 137, in _decode_index return self._unarchiver.decode_object(index) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 247, in decode_object obj = klass.decode_archive(ArchivedObject(raw_obj, self)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py"", line 217, in decode_archive placeType = archive.decode(""placeType"") File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 140, in decode return self._unarchiver.decode_key(self._object, key) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 216, in decode_key return self.decode_object(val) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 227, in decode_object raise CircularReference(index) bpylist.archiver.CircularReference: archive has a cycle with uid(13) ``` In the debugger I traced this back to: ``` 178 @staticmethod 179 def decode_archive(archive): 180 -> sortedPlaceInfos = archive.decode(""sortedPlaceInfos"") 181 finalPlaceInfos = archive.decode(""finalPlaceInfos"") 182 return PLRevGeoMapItem(sortedPlaceInfos, finalPlaceInfos) ```",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 621280529,MDU6SXNzdWU2MjEyODA1Mjk=,23,create-subset command for creating a publishable subset of a photos database,9599,simonw,closed,0,,,,,1,2020-05-19T20:58:20Z,2020-05-19T22:32:48Z,2020-05-19T22:32:37Z,MEMBER,,"I want to share a subset of my photos, without sharing everything. Idea: $ photos-to-sqlite create-subset photos.db public.db ""select sha256 from ... where ..."" So the command takes a SQL query that returns sha256 hashes, then creates a new file called `public.db` containing just the data corresponding to those photos.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 621332242,MDU6SXNzdWU2MjEzMzIyNDI=,25,Create a public demo,9599,simonw,closed,0,,,,,5,2020-05-19T22:47:20Z,2020-05-21T22:26:16Z,2020-05-20T05:54:18Z,MEMBER,,"So I can show people what this does, using some of my photos.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 621444763,MDU6SXNzdWU2MjE0NDQ3NjM=,26,Rename project to dogsheep-photos,9599,simonw,closed,0,,,,,8,2020-05-20T04:12:34Z,2020-05-20T04:31:02Z,2020-05-20T04:30:40Z,MEMBER,,`photos-to-sqlite` doesn't really capture the full scope of this project anymore.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/26/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 638375985,MDExOlB1bGxSZXF1ZXN0NDM0MTYyMzE2,29,Fixed bug in SQL query for photo scores,41546558,RhetTbull,closed,0,,,,,1,2020-06-14T15:39:22Z,2020-12-04T22:32:36Z,2020-12-04T22:32:27Z,CONTRIBUTOR,dogsheep/dogsheep-photos/pulls/29,"The join on ZCOMPUTEDASSETATTRIBUTES used the wrong columns. In most of the Photos database tables, table.ZASSET joins with ZGENERICASSET.Z_PK",256834907,dogsheep-photos,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1827427757,PR_kwDOD079W85WtUKG,38,photos-to-sql not found?,319473,coldclimate,closed,0,,,,,2,2023-07-29T09:59:42Z,2023-07-29T10:01:27Z,2023-07-29T10:01:23Z,FIRST_TIME_CONTRIBUTOR,dogsheep/dogsheep-photos/pulls/38,"I wonder if `photos-to-sql` is an old name for `dogsheep-photos`, because I can't find it anywhere. I can't actually get this command to work (`sqlite3.OperationalError: no such table: attached.ZGENERICASSET` thrown) but I don't think that's related",256834907,dogsheep-photos,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 602533539,MDU6SXNzdWU2MDI1MzM1Mzk=,4,Upload all my photos to a secure S3 bucket,9599,simonw,closed,0,,,5324096,Apple Photos online and securely browsable,14,2020-04-18T19:24:50Z,2020-04-18T21:58:11Z,2020-04-18T21:57:13Z,MEMBER,,"- [x] Create a bucket with bucket credentials - [x] Programmatically upload some recent photos to it (from a notebook) - [x] Turn this into a script",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1871935751,I_kwDOD079W85vk3kH,40, ImportError: cannot import name 'formatargspec' from 'inspect',36752421,hosslikw,closed,0,,,,,0,2023-08-29T15:36:31Z,2023-08-31T03:18:07Z,2023-08-31T03:18:06Z,NONE,,"I get the following error when running ""pip3 install dogsheep-photos"" "" from inspect import ismethod, isclass, formatargspec ImportError: cannot import name 'formatargspec' from 'inspect' (/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/inspect.py). Did you mean: 'formatargvalues'?"" Python 3.12.0rc1 sqlite 3.43.0 datasette, version 0.64.3",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602551638,MDU6SXNzdWU2MDI1NTE2Mzg=,5,photos-to-sqlite s3-auth command,9599,simonw,closed,0,,,,,1,2020-04-18T21:05:25Z,2020-04-18T21:08:44Z,2020-04-18T21:08:44Z,MEMBER,,Modeled on `github-to-sqlite auth` - prompts the user for their S3 credentials and saves them to `auth.json`.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602575575,MDU6SXNzdWU2MDI1NzU1NzU=,6,Add progress bar to upload command,9599,simonw,closed,0,,,,,2,2020-04-18T23:32:41Z,2020-04-19T00:15:24Z,2020-04-19T00:15:24Z,MEMBER,,Upload was added in #4 ,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 605147638,MDU6SXNzdWU2MDUxNDc2Mzg=,8,Should I have used MD5 instead of SHA256?,9599,simonw,closed,0,,,,,2,2020-04-23T00:02:08Z,2020-04-23T00:03:35Z,2020-04-23T00:03:35Z,MEMBER,,"https://docs.aws.amazon.com/AmazonS3/latest/API/RESTCommonResponseHeaders.html > Objects created by the PUT Object, POST Object, or Copy operation, or through the AWS Management Console, and are encrypted by SSE-S3 or plaintext, have ETags that are an MD5 digest of their object data. ",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 605938063,MDU6SXNzdWU2MDU5MzgwNjM=,9,"upload command should be resumable, should only upload photos not already uploaded",9599,simonw,closed,0,,,,,2,2020-04-23T23:31:08Z,2020-04-23T23:39:14Z,2020-04-23T23:39:14Z,MEMBER,,Follow on from #4. ,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 519979091,MDExOlB1bGxSZXF1ZXN0MzM4NjQ3Mzc4,1,Add parkrun-to-sqlite,1101318,mrw34,closed,0,,,,,0,2019-11-08T12:05:32Z,2020-10-12T00:35:16Z,2020-10-12T00:35:16Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/1,,214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 543717994,MDExOlB1bGxSZXF1ZXN0MzU3OTc0MzI2,3,Add todoist-to-sqlite,706257,bcongdon,closed,0,,,,,0,2019-12-30T04:02:59Z,2020-10-12T00:35:58Z,2020-10-12T00:35:57Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/3,"Really enjoying getting into the dogsheep/datasette ecosystem. I made a downloader for Todoist, and I think/hope others might find this useful",214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 558715564,MDExOlB1bGxSZXF1ZXN0MzcwMDI0Njk3,4,Add beeminder-to-sqlite,706257,bcongdon,closed,0,,,,,0,2020-02-02T15:51:36Z,2020-10-12T00:36:16Z,2020-10-12T00:36:16Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/4,,214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 842765105,MDExOlB1bGxSZXF1ZXN0NjAyMjYxMDky,6,Add testres-db tool,1151557,ligurio,closed,0,,,,,1,2021-03-28T15:43:23Z,2022-02-16T05:12:05Z,2022-02-16T05:12:05Z,NONE,dogsheep/dogsheep.github.io/pulls/6,,214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 770712149,MDExOlB1bGxSZXF1ZXN0NTQyNDA2OTEw,10,BugFix for encoding and not update info.,1277270,riverzhou,closed,0,,,,,1,2020-12-18T08:58:54Z,2021-02-11T22:37:56Z,2021-02-11T22:37:56Z,NONE,dogsheep/evernote-to-sqlite/pulls/10,"Bugfix 1: Traceback (most recent call last): File ""d:\anaconda3\lib\runpy.py"", line 194, in _run_module_as_main return _run_code(code, main_globals, None, File ""d:\anaconda3\lib\runpy.py"", line 87, in _run_code exec(code, run_globals) File ""D:\Anaconda3\Scripts\evernote-to-sqlite.exe\__main__.py"", line 7, in File ""d:\anaconda3\lib\site-packages\click\core.py"", line 829, in __call__ File ""d:\anaconda3\lib\site-packages\click\core.py"", line 782, in main rv = self.invoke(ctx) File ""d:\anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) return ctx.invoke(self.callback, **ctx.params) File ""d:\anaconda3\lib\site-packages\click\core.py"", line 610, in invoke return callback(*args, **kwargs) File ""d:\anaconda3\lib\site-packages\evernote_to_sqlite\cli.py"", line 30, in enex for tag, note in find_all_tags(fp, [""note""], progress_callback=bar.update): File ""d:\anaconda3\lib\site-packages\evernote_to_sqlite\utils.py"", line 11, in find_all_tags chunk = fp.read(1024 * 1024) UnicodeDecodeError: 'gbk' codec can't decode byte 0xa4 in position 383: illegal multibyte sequence Bugfix 2: Traceback (most recent call last): File ""D:\Anaconda3\Scripts\evernote-to-sqlite-script.py"", line 33, in sys.exit(load_entry_point('evernote-to-sqlite==0.3', 'console_scripts', 'evernote-to-sqlite')()) File ""D:\Anaconda3\lib\site-packages\click\core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""D:\Anaconda3\lib\site-packages\click\core.py"", line 782, in main rv = self.invoke(ctx) File ""D:\Anaconda3\lib\site-packages\click\core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""D:\Anaconda3\lib\site-packages\click\core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""D:\Anaconda3\lib\site-packages\click\core.py"", line 610, in invoke return callback(*args, **kwargs) File ""D:\Anaconda3\lib\site-packages\evernote_to_sqlite-0.3-py3.8.egg\evernote_to_sqlite\cli.py"", line 31, in enex File ""D:\Anaconda3\lib\site-packages\evernote_to_sqlite-0.3-py3.8.egg\evernote_to_sqlite\utils.py"", line 28, in save_note AttributeError: 'NoneType' object has no attribute 'text'",303218369,evernote-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 792851444,MDU6SXNzdWU3OTI4NTE0NDQ=,11,XML parse error,3613583,dskrad,closed,0,,,,,2,2021-01-24T17:38:54Z,2021-02-11T21:18:58Z,2021-02-11T21:18:48Z,NONE,,"I am on Windows 10 using Windows Subsystem for Linux, Python 3.8. I installed evernote-to-sqlite via pipx (in a venv). I tried using enex files from the latest version of Evernote for Windows (10.6.9 which only lets you export 50 notes at a time) and from Legacy Evernote (6.25.2.9198 which lets you export all your notes at once). The enex file from latest evernote gives this error: File ""/usr/lib/python3.8/xml/etree/ElementTree.py"", line 1320, in XML parser.feed(text) xml.etree.ElementTree.ParseError: XML or text declaration not at start of entity: line 2, column 6 The enex file from Legacy Evernote gives this error: File ""/home/david/.local/pipx/venvs/evernote-to-sqlite/lib/python3.8/site-packages/evernote_to_sqlite/utils.py"", line 28, in save_note updated = note.find(""updated"").text AttributeError: 'NoneType' object has no attribute 'text'",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 978743426,MDU6SXNzdWU5Nzg3NDM0MjY=,13,xml.etree.ElementTree.ParseError: not well-formed (invalid token),9599,simonw,closed,0,,,,,4,2021-08-25T05:48:21Z,2021-08-26T18:45:13Z,2021-08-26T18:45:13Z,MEMBER,,"Got this error today: ``` (evernote-to-sqlite) /tmp % evernote-to-sqlite enex evernote.db simonwillison\'s\ notebook.enex Importing from ENEX [######------------------------------] 17% Traceback (most recent call last): File ""/Users/simon/.local/bin/evernote-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/evernote_to_sqlite/utils.py"", line 36, in save_note content = ET.tostring(ET.fromstring(content_xml)).decode(""utf-8"") File ""/usr/local/Cellar/python@3.9/3.9.6/Frameworks/Python.framework/Versions/3.9/lib/python3.9/xml/etree/ElementTree.py"", line 1347, in XML parser.feed(text) xml.etree.ElementTree.ParseError: not well-formed (invalid token): line 2, column 132 ```",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718938046,MDU6SXNzdWU3MTg5MzgwNDY=,2,Convert dates to a better format,9599,simonw,closed,0,,,,,0,2020-10-11T22:12:33Z,2020-10-11T23:15:03Z,2020-10-11T23:15:03Z,MEMBER,,"They currently look like this: https://github.com/dogsheep/evernote-to-sqlite/blob/9d8efd17580f6ddf76745c145d1e69dd24e52b64/tests/test_evernote_to_sqlite.py#L35-L36",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718938321,MDU6SXNzdWU3MTg5MzgzMjE=,3,Use a content hash for the note IDs,9599,simonw,closed,0,,,,,0,2020-10-11T22:13:46Z,2020-10-11T23:15:04Z,2020-10-11T23:15:04Z,MEMBER,,"Without a GUID note IDs are pretty ineffective, but using a hash of the contents will at least avoid creating identical duplicates in the future. https://sqlite-utils.readthedocs.io/en/stable/python-api.html#setting-an-id-based-on-the-hash-of-the-row-contents",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718938508,MDU6SXNzdWU3MTg5Mzg1MDg=,4,Configure FTS + add an index on the date columns,9599,simonw,closed,0,,,,,2,2020-10-11T22:14:40Z,2020-10-11T23:41:29Z,2020-10-11T23:41:29Z,MEMBER,,"Sort by date descending is likely the most common way of sorting, so that column should be indexed. Also add FTS configuration for both notes and the OCR column on resources.",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718949182,MDU6SXNzdWU3MTg5NDkxODI=,6,Better handling of OCR data,9599,simonw,closed,0,,,,,2,2020-10-11T23:20:52Z,2020-10-12T00:04:10Z,2020-10-12T00:04:10Z,MEMBER,,"> I haven't done the FTS on OCR yet. I'm going to move that to another ticket because it requires more thought. _Originally posted by @simonw in https://github.com/dogsheep/evernote-to-sqlite/issues/4#issuecomment-706784028_",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 743297582,MDU6SXNzdWU3NDMyOTc1ODI=,7,evernote-to-sqlite on windows 10 give this error: TypeError: insert() got an unexpected keyword argument 'replace',42387931,martinvanwieringen,closed,0,,,,,1,2020-11-15T16:57:28Z,2021-02-11T22:13:17Z,2021-02-11T22:13:17Z,NONE,,"running evernote-to-sqlite 0.2 on windows 10. Command: evernote-to-sqlite enex evernote.db MyNotes.enex I get the followinng error: File ""C:\Users\marti\AppData\Roaming\Python\Python38\site-packages\evernote_to_sqlite\utils.py"", line 46, in save_note note_id = db[""notes""].insert(row, hash_id=""id"", replace=True, alter=True).last_pk TypeError: insert() got an unexpected keyword argument 'replace' Removing replace=True, Leads to below error: note_id = db[""notes""].insert(row, hash_id=""id"", alter=True).last_pk File ""C:\Users\marti\AppData\Roaming\Python\Python38\site-packages\sqlite_utils\db.py"", line 924, in insert return self.insert_all( File ""C:\Users\marti\AppData\Roaming\Python\Python38\site-packages\sqlite_utils\db.py"", line 1046, in insert_all result = self.db.conn.execute(sql, values) sqlite3.IntegrityError: UNIQUE constraint failed: notes.id",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 748370021,MDExOlB1bGxSZXF1ZXN0NTI1MzcxMDI5,8,"fix import error if note has no ""updated"" element",4028322,mkorosec,closed,0,,,,,0,2020-11-22T22:51:05Z,2021-02-11T22:34:06Z,2021-02-11T22:34:06Z,CONTRIBUTOR,dogsheep/evernote-to-sqlite/pulls/8,"I got the following error when executing evernote-to-sqlite enex evernote.db evernote.enex ``` ... File ""evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File ""evernote_to_sqlite/utils.py"", line 28, in save_note updated = note.find(""updated"").text AttributeError: 'NoneType' object has no attribute 'text' ``` Seems that in some cases the updated element is not added to the note, this is a part of the problematic note: ``` 20201019T074518Z web.clip7 webclipper.evernote ```",303218369,evernote-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 748372469,MDU6SXNzdWU3NDgzNzI0Njk=,9,ParseError: undefined entity š,4028322,mkorosec,closed,0,,,,,1,2020-11-22T23:04:35Z,2021-02-11T22:10:55Z,2021-02-11T22:10:55Z,CONTRIBUTOR,,"I encountered a parse error if the enex file contained š or   Run command: evernote-to-sqlite enex evernote.db evernote.enex ``` Traceback (most recent call last): ... File ""evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File ""evernote_to_sqlite/utils.py"", line 35, in save_note content = ET.tostring(ET.fromstring(content_xml)).decode(""utf-8"") File ""/usr/lib/python3.8/xml/etree/ElementTree.py"", line 1320, in XML parser.feed(text) xml.etree.ElementTree.ParseError: undefined entity š: line 3, column 35 ``` Workaround: ``` sed -i 's/š//g' evernote.enex sed -i 's/ //g' evernote.enex ```",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493599818,MDU6SXNzdWU0OTM1OTk4MTg=,1,Command for fetching starred repos,9599,simonw,closed,0,,,,,0,2019-09-14T08:36:29Z,2019-09-14T21:30:48Z,2019-09-14T21:30:48Z,MEMBER,,,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516967682,MDU6SXNzdWU1MTY5Njc2ODI=,10,Add this repos_starred view,9599,simonw,closed,0,,,,,3,2019-11-04T05:44:38Z,2020-05-02T16:37:36Z,2020-05-02T16:37:36Z,MEMBER,,"```sql create view repos_starred as select stars.starred_at, users.login, repos.* from repos join stars on repos.id = stars.repo join users on repos.owner = users.id order by starred_at desc; ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520521843,MDU6SXNzdWU1MjA1MjE4NDM=,11,Command to fetch releases,9599,simonw,closed,0,,,,,0,2019-11-09T22:23:30Z,2019-11-09T22:57:00Z,2019-11-09T22:57:00Z,MEMBER,,"https://developer.github.com/v3/repos/releases/#list-releases-for-a-repository `GET /repos/:owner/:repo/releases`",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520756546,MDU6SXNzdWU1MjA3NTY1NDY=,12,Add this view for seeing new releases,9599,simonw,closed,0,,,,,5,2019-11-11T06:00:12Z,2020-05-02T18:58:18Z,2020-05-02T18:58:17Z,MEMBER,,"```sql CREATE VIEW recent_releases AS select json_object(""label"", repos.full_name, ""href"", repos.html_url) as repo, json_object( ""href"", releases.html_url, ""label"", releases.name ) as release, substr(releases.published_at, 0, 11) as date, releases.body as body_markdown, releases.published_at from releases join repos on repos.id = releases.repo order by releases.published_at desc ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 521275281,MDU6SXNzdWU1MjEyNzUyODE=,13,Set up a live demo Datasette instance,9599,simonw,closed,0,,,5225818,1.0,9,2019-11-12T01:27:02Z,2020-03-24T00:03:26Z,2020-03-24T00:03:25Z,MEMBER,,"I deployed https://github-to-sqlite-releases-j7hipcg4aq-uc.a.run.app/ by running this: ``` #!/bin/bash # Fetch repos for simonw and dogsheep github-to-sqlite repos github.db simonw dogsheep -a auth.json # Fetch releases for the repos tagged 'datasette-io' sqlite-utils github.db "" select full_name from repos where rowid in ( select repos.rowid from repos, json_each(repos.topics) j where j.value = 'datasette-io' )"" --csv --no-headers | while read repo; do github-to-sqlite releases \ github.db $(echo $repo | tr -d '\r') \ -a auth.json; sleep 2; done; ``` And then deploying using this: ``` $ datasette publish cloudrun github.db \ --title ""github-to-sqlite releases demo"" \ --about_url=""https://github.com/simonw/github-to-sqlite"" \ --about='github-to-sqlite' \ --install=datasette-render-markdown \ --install=datasette-json-html \ --service=github-to-sqlite-releases ``` This should happen automatically for every release. I can run it once a day in Circle CI to keep the demo database up-to-date.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 544571092,MDU6SXNzdWU1NDQ1NzEwOTI=,15,Assets table with downloads,2029,garethr,closed,0,,,5225818,1.0,4,2020-01-02T13:05:28Z,2020-03-28T12:17:01Z,2020-03-23T19:17:32Z,NONE,,"The `releases` command extracts the releases table, but data about the individual assets are locked up in the JSON document in the `assets` field. My main interest is in individual and aggregate download counts. I was wondering if creating a new table with a record per asset may be useful? If so I'm happy to send a PR when I get a moment. Do you have opinions about that simply being part of the `releases` command or would you prefer a separate command as well?",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 546051181,MDU6SXNzdWU1NDYwNTExODE=,16,Exception running first command: IndexError: list index out of range,15092,jayvdb,closed,0,,,,,4,2020-01-07T03:01:58Z,2020-04-14T18:37:21Z,2020-04-14T18:37:21Z,NONE,,"Exception running first command without an existing db or auth. ```py > mkdir ~/.github/coala > /usr/bin/github-to-sqlite repos ~/.github/coala coala Traceback (most recent call last): File ""/usr/bin/github-to-sqlite"", line 11, in load_entry_point('github-to-sqlite==0.6', 'console_scripts', 'github-to-sqlite')() File ""/usr/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/usr/lib/python3.7/site-packages/github_to_sqlite/cli.py"", line 163, in repos utils.save_repo(db, repo) File ""/usr/lib/python3.7/site-packages/github_to_sqlite/utils.py"", line 120, in save_repo to_save[""owner""] = save_user(db, to_save[""owner""]) File ""/usr/lib/python3.7/site-packages/github_to_sqlite/utils.py"", line 61, in save_user return db[""users""].upsert(to_save, pk=""id"", alter=True).last_pk File ""/usr/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1135, in upsert extracts=extracts, File ""/usr/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1162, in upsert_all upsert=True, File ""/usr/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1105, in insert_all row = list(self.rows_where(""rowid = ?"", [self.last_rowid]))[0] IndexError: list index out of range ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 578883725,MDU6SXNzdWU1Nzg4ODM3MjU=,17,Command for importing commits,9599,simonw,closed,0,,,,,2,2020-03-10T21:55:12Z,2020-03-11T02:47:37Z,2020-03-11T02:47:37Z,MEMBER,,Using this API: https://api.github.com/repos/dogsheep/github-to-sqlite/commits,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585411547,MDU6SXNzdWU1ODU0MTE1NDc=,18,Commits in GitHub API can have null author,9599,simonw,closed,0,,,5225818,1.0,8,2020-03-21T02:20:56Z,2020-03-23T20:44:49Z,2020-03-23T20:44:26Z,MEMBER,,"``` Traceback (most recent call last): File ""/home/ubuntu/datasette-venv/bin/github-to-sqlite"", line 8, in sys.exit(cli()) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/github_to_sqlite/cli.py"", line 235, in commits utils.save_commits(db, commits, repo_full[""id""]) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 290, in save_commits commit_to_insert[""author""] = save_user(db, commit[""author""]) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 54, in save_user for key, value in user.items() AttributeError: 'NoneType' object has no attribute 'items' ``` Got this running the `commits` command from cron.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585850715,MDU6SXNzdWU1ODU4NTA3MTU=,19,"Enable full-text search for more stuff (like commits, issues and issue_comments)",9599,simonw,closed,0,,,5225818,1.0,2,2020-03-23T00:19:56Z,2020-03-23T19:06:39Z,2020-03-23T19:06:39Z,MEMBER,,Currently FTS is only enabled for repos and releases.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493668862,MDU6SXNzdWU0OTM2Njg4NjI=,2,Extract licenses from repos into a separate table,9599,simonw,closed,0,,,,,0,2019-09-14T21:33:41Z,2019-09-14T21:46:58Z,2019-09-14T21:46:58Z,MEMBER,," ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586454513,MDU6SXNzdWU1ODY0NTQ1MTM=,20,Upgrade to sqlite-utils 2.x,9599,simonw,closed,0,,,5225818,1.0,0,2020-03-23T19:17:58Z,2020-03-23T19:22:52Z,2020-03-23T19:22:52Z,MEMBER,,,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586561727,MDU6SXNzdWU1ODY1NjE3Mjc=,21,Turn GitHub API errors into exceptions,9599,simonw,closed,0,,,5225818,1.0,2,2020-03-23T22:37:24Z,2020-03-23T23:48:23Z,2020-03-23T23:48:22Z,MEMBER,,"This would have really helped in debugging the mess in #13. Running with this `auth.json` is a useful demo: ```json {""github_personal_token"": """"} ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586567379,MDU6SXNzdWU1ODY1NjczNzk=,22,Handle empty git repositories,9599,simonw,closed,0,,,,,0,2020-03-23T22:49:48Z,2020-03-23T23:13:11Z,2020-03-23T23:13:11Z,MEMBER,,"Got this error: ``` github_to_sqlite.utils.GitHubError: {'message': 'Git Repository is empty.', 'documentation_url': 'https://developer.github.com/v3/repos/commits/#list-commits-on-a-repository'} ``` From https://api.github.com/repos/dogsheep/beta/commits",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586595839,MDU6SXNzdWU1ODY1OTU4Mzk=,23,Release 1.0,9599,simonw,closed,0,,,5225818,1.0,1,2020-03-24T00:03:55Z,2020-03-24T00:15:50Z,2020-03-24T00:15:50Z,MEMBER,,Need to compile release notes.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601265023,MDU6SXNzdWU2MDEyNjUwMjM=,25,Improvements to demo instance,9599,simonw,closed,0,,,,,1,2020-04-16T17:26:55Z,2020-04-16T18:07:12Z,2020-04-16T18:07:12Z,MEMBER,,- [x] Demo should pull issue-comments as well,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601271612,MDU6SXNzdWU2MDEyNzE2MTI=,26,Topics are missing from repositories,9599,simonw,closed,0,,,,,2,2020-04-16T17:36:32Z,2020-04-16T17:41:11Z,2020-04-16T17:41:11Z,MEMBER,,"I'm sure this used to work, but right now repositories are fetched without their topics. https://developer.github.com/v3/repos/ says you need to send a custom `Accept` header of `application/vnd.github.mercy-preview+json` to get topics.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/26/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601330277,MDU6SXNzdWU2MDEzMzAyNzc=,27,Repos have a big blob of JSON in the organization column,9599,simonw,closed,0,,,,,5,2020-04-16T18:43:14Z,2020-04-18T00:19:16Z,2020-04-18T00:18:52Z,MEMBER,,"e.g. https://github-to-sqlite.dogsheep.net/github/repos ![github__repos__11_rows_where_sorted_by_updated_at_descending](https://user-images.githubusercontent.com/9599/79494124-5640b980-7fd7-11ea-99a2-17ffbd82f9ce.png) This appears to be obsolete because the `owner` column already links to that record, albeit in the `users` table with `type` set to `Organization`: https://github-to-sqlite.dogsheep.net/github/users/53015001",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601333634,MDU6SXNzdWU2MDEzMzM2MzQ=,28,Pull repository contributors,9599,simonw,closed,0,,,,,3,2020-04-16T18:46:40Z,2020-04-18T15:05:10Z,2020-04-18T15:05:10Z,MEMBER,,"https://developer.github.com/v3/repos/#list-contributors `GET /repos/:owner/:repo/contributors` Not sure if this should be a separate command or should be part of the existing `repos` command. I'm leaning towards a new `contributors` command.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 603617013,MDU6SXNzdWU2MDM2MTcwMTM=,29,Milestones should have foreign key to creator and repo,9599,simonw,closed,0,,,,,1,2020-04-21T00:20:44Z,2020-04-21T00:43:58Z,2020-04-21T00:43:58Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/milestones Creator is an integer but not a foreign key to users Repo is missing entirely!",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493670426,MDU6SXNzdWU0OTM2NzA0MjY=,3,Command to fetch all repos belonging to a user or organization,9599,simonw,closed,0,,,,,2,2019-09-14T21:54:21Z,2019-09-17T00:17:53Z,2019-09-17T00:17:53Z,MEMBER,,"How about this: $ github-to-sqlite repos simonw",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 603618244,MDU6SXNzdWU2MDM2MTgyNDQ=,30,Issues milestone column is the wrong type,9599,simonw,closed,0,,,,,2,2020-04-21T00:24:34Z,2020-04-21T00:45:23Z,2020-04-21T00:36:22Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/issues?milestone=2857392 ![2A4C1185-2434-4F29-9EA0-3246E2F03F77](https://user-images.githubusercontent.com/9599/79811760-b7e08b00-832b-11ea-9ad7-684a6ae097a6.jpeg) It is TEXT when it should be an INTEGER - which is why the foreign key label is not correctly displayed.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 603624862,MDU6SXNzdWU2MDM2MjQ4NjI=,31,Issue and milestone should have foreign key to repo,9599,simonw,closed,0,,,,,3,2020-04-21T00:46:24Z,2020-04-22T01:20:19Z,2020-04-22T01:20:19Z,MEMBER,,"Currently the `repo` column on those tables is a string `simonw/datasette` rather than an ID referencing a row in `repos`. _Originally posted by @simonw in https://github.com/dogsheep/github-to-sqlite/issues/29#issuecomment-616883275_",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 604222295,MDU6SXNzdWU2MDQyMjIyOTU=,32,Issue comments don't appear to populate issues foreign key,9599,simonw,closed,0,,,,,3,2020-04-21T19:17:32Z,2020-04-22T01:17:44Z,2020-04-22T01:17:44Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github?sql=select+html_url%2C+id%2C+issue+from+issue_comments+order+by+updated_at+desc+limit+101 ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 609950090,MDU6SXNzdWU2MDk5NTAwOTA=,33,Fall back to authentication via ENV,2029,garethr,closed,0,,,,,4,2020-04-30T12:58:14Z,2020-05-02T18:46:10Z,2020-05-02T18:45:37Z,NONE,,"Would you accept a PR that falls back to looking for an environment variable for the GitHub token? Specifically a change here: https://github.com/dogsheep/github-to-sqlite/blob/c34d5a18bfc41fa08755ba3d5cf9fe09ff204238/github_to_sqlite/cli.py#L271 I'd like to use `github-to-sqlite` in a GitHub Action workflow and this would be simpler than trying to fill out the prompt or generate a file with sensitive content. Wanted to check first, I'm happy to submit a PR with tests and updates to the docs. ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610408908,MDU6SXNzdWU2MTA0MDg5MDg=,34,Command for retrieving dependents for a repo,9599,simonw,closed,0,,,,,6,2020-04-30T21:47:51Z,2020-05-03T15:53:01Z,2020-05-03T15:53:01Z,MEMBER,,"I really, really want to start grabbing this data: https://github.com/simonw/datasette/network/dependents",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610511450,MDU6SXNzdWU2MTA1MTE0NTA=,35,Create index on issue_comments(user) and other foreign keys,9599,simonw,closed,0,,,,,3,2020-05-01T02:06:56Z,2020-05-02T18:26:24Z,2020-05-02T18:26:24Z,MEMBER,,"``` create index issue_comments_user on issue_comments(user) ``` I'm sure there are other user columns that could benefit from an index.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/35/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610842926,MDU6SXNzdWU2MTA4NDI5MjY=,36,Add view for better display of dependent repos,9599,simonw,closed,0,,,,,2,2020-05-01T16:33:44Z,2020-05-02T16:50:31Z,2020-05-02T16:30:11Z,MEMBER,,"```sql select repos.full_name as repo, 'https://github.com/' || repos2.full_name as dependent, repos2.created_at as dependent_repo_created, repos2.updated_at as dependent_repo_updated, repos2.stargazers_count as dependent_repo_stars, repos2.watchers_count as dependent_repo_watchers from dependents join repos as repos2 on dependents.dependent = repos2.id join repos on dependents.repo = repos.id order by repos2.created_at desc ``` https://dogsheep.simonwillison.net/github?sql=select%0D%0A++repos.full_name+as+repo%2C%0D%0A++%27https%3A%2F%2Fgithub.com%2F%27+%7C%7C+repos2.full_name+as+dependent%2C%0D%0A++repos2.created_at+as+dependent_repo_created%2C%0D%0A++repos2.updated_at+as+dependent_repo_updated%2C%0D%0A++repos2.stargazers_count+as+dependent_repo_stars%2C%0D%0A++repos2.watchers_count+as+dependent_repo_watchers%0D%0Afrom%0D%0A++dependents%0D%0A++join+repos+as+repos2+on+dependents.dependent+%3D+repos2.id%0D%0A++join+repos+on+dependents.repo+%3D+repos.id%0D%0Aorder+by%0D%0A++repos2.created_at+desc",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610843136,MDU6SXNzdWU2MTA4NDMxMzY=,37,Mechanism for creating views if they don't yet exist,9599,simonw,closed,0,,,,,3,2020-05-01T16:34:10Z,2020-05-02T16:19:47Z,2020-05-02T16:19:31Z,MEMBER,,Needed for #36 #10 #12 ,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611284481,MDU6SXNzdWU2MTEyODQ0ODE=,38,[Feature Request] Support Repo Name in Search 🥺,5779832,zzeleznick,closed,0,,,,,4,2020-05-02T22:08:51Z,2020-05-03T02:34:32Z,2020-05-02T23:15:11Z,NONE,,"## Description Per your [v2.2 release tweet](https://twitter.com/simonw/status/1256700238099693568) I played with the demo, but the output did not match my expectations. ## Expected Behavior Expected a search query for ""twitter"" contained within the `repo` column to return non-zero results. ## Actual Behavior 😭 [0 rows where repo contains ""twitter"" sorted by starred_at descending](https://github-to-sqlite.dogsheep.net/github/stars?repo__contains=twitter&_sort_desc=starred_at) ## Best Explanation Per the table schema (see appendix) `repo` is of type `INTEGER` which built from `repo_id` and does not expose the repo name in search. ## Desired Behavior Given that searching for ""206156866"" is less intuitive than ""twitter"", it would be great to support this via extending the search capabilities or by adding an additional column. ✅ 104 rows where repo contains ""twitter"" ❌ [104 rows where repo contains ""206156866"" sorted by starred_at descending](https://github-to-sqlite.dogsheep.net/github/stars?repo__contains=206156866&_sort_desc=starred_at) ## Appendix ``` CREATE TABLE [stars] ( [user] INTEGER REFERENCES [users]([id]), [repo] INTEGER REFERENCES [repos]([id]), [starred_at] TEXT, PRIMARY KEY ([user], [repo]) ); CREATE INDEX [idx_stars_repo] ON [stars] ([repo]); CREATE INDEX [idx_stars_user] ON [stars] ([user]); ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 613777056,MDU6SXNzdWU2MTM3NzcwNTY=,39,issues foreign key to repo isn't working,9599,simonw,closed,0,,,,,1,2020-05-07T05:11:48Z,2020-08-18T14:24:46Z,2020-08-18T14:23:56Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/issues?_facet=repo If the foreign key was working those would be repository names. From the schema at the bottom of the page: ``` [repo] TEXT, ``` That's the wrong type and not a foreign key.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493670730,MDU6SXNzdWU0OTM2NzA3MzA=,4,Command to fetch stargazers for one or more repos,9599,simonw,closed,0,,,,,8,2019-09-14T21:58:22Z,2020-05-02T21:30:27Z,2020-05-02T21:30:27Z,MEMBER,,"Maybe this: $ github-to-sqlite stargazers github.db simonw/datasette It could accept more than one repos. Maybe have options similar to `--sql` in [twitter-to-sqlite](https://github.com/dogsheep/twitter-to-sqlite) so you can e.g. fetch all stargazers for all of the repos you have fetched into the database already (or all of the repos belonging to owner X)",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 637899539,MDU6SXNzdWU2Mzc4OTk1Mzk=,40,Demo deploy is broken,9599,simonw,closed,0,,,,,2,2020-06-12T17:20:17Z,2020-06-12T18:06:48Z,2020-06-12T18:06:48Z,MEMBER,,"https://github.com/dogsheep/github-to-sqlite/runs/766180404?check_suite_focus=true ``` The following NEW packages will be installed: sqlite3 0 upgraded, 1 newly installed, 0 to remove and 11 not upgraded. Need to get 752 kB of archives. After this operation, 2482 kB of additional disk space will be used. Ign:1 http://azure.archive.ubuntu.com/ubuntu bionic-updates/main amd64 sqlite3 amd64 3.22.0-1ubuntu0.3 Err:1 http://security.ubuntu.com/ubuntu bionic-updates/main amd64 sqlite3 amd64 3.22.0-1ubuntu0.3 404 Not Found [IP: 52.177.174.250 80] E: Failed to fetch http://security.ubuntu.com/ubuntu/pool/main/s/sqlite3/sqlite3_3.22.0-1ubuntu0.3_amd64.deb 404 Not Found [IP: 52.177.174.250 80] E: Unable to fetch some archives, maybe run apt-get update or try with --fix-missing? ##[error]Process completed with exit code 100. ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 651159727,MDU6SXNzdWU2NTExNTk3Mjc=,41,Demo is failing to deploy,9599,simonw,closed,0,,,,,7,2020-07-05T22:40:33Z,2020-07-06T01:07:03Z,2020-07-06T01:07:02Z,MEMBER,,"https://github.com/dogsheep/github-to-sqlite/runs/837714622?check_suite_focus=true ``` Creating Revision.........................................................................................................................................failed Deployment failed ERROR: (gcloud.run.deploy) Cloud Run error: Container failed to start. Failed to start and then listen on the port defined by the PORT environment variable. Logs for this revision might contain more information. Traceback (most recent call last): File ""/opt/hostedtoolcache/Python/3.8.3/x64/bin/datasette"", line 8, in sys.exit(cli()) File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/datasette/publish/cloudrun.py"", line 138, in cloudrun check_call( File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/subprocess.py"", line 364, in check_call raise CalledProcessError(retcode, cmd) subprocess.CalledProcessError: Command 'gcloud run deploy --allow-unauthenticated --platform=managed --image gcr.io/datasette-222320/datasette github-to-sqlite' returned non-zero exit status 1. ##[error]Process completed with exit code 1. ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/41/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 654405302,MDU6SXNzdWU2NTQ0MDUzMDI=,42,Option for importing just specific repos,9599,simonw,closed,0,,,,,0,2020-07-09T23:20:15Z,2020-07-09T23:25:35Z,2020-07-09T23:25:35Z,MEMBER,,"For if you know which specific repos you care about, as opposed to loading everything owned by the authenticated user. github-to-sqlite repos specific.db -r simonw/datasette -r simonw/github-contents ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/42/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 660355904,MDU6SXNzdWU2NjAzNTU5MDQ=,43,github-to-sqlite tags command for fetching tags,9599,simonw,closed,0,,,,,4,2020-07-18T20:14:12Z,2020-07-18T23:05:56Z,2020-07-18T21:52:15Z,MEMBER,,Fetches paginated data from https://api.github.com/repos/simonw/datasette/tags,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/43/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 660413281,MDU6SXNzdWU2NjA0MTMyODE=,44,Rename tags.repo_id column to tags.repo,9599,simonw,closed,0,,,,,0,2020-07-18T22:13:46Z,2020-07-18T22:15:12Z,2020-07-18T22:15:12Z,MEMBER,,"For improved consistency with other tables. https://observablehq.com/@simonw/datasette-table-diagram ![datasette-table-diagram(1)](https://user-images.githubusercontent.com/9599/87862843-3cca4900-c909-11ea-9c76-58b3f4aca43f.png) ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/44/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 660429601,MDU6SXNzdWU2NjA0Mjk2MDE=,45,Fix the demo - it breaks because of the tags table change,9599,simonw,closed,0,,,,,5,2020-07-18T22:49:32Z,2020-07-18T23:03:14Z,2020-07-18T23:03:13Z,MEMBER,,"https://github.com/dogsheep/github-to-sqlite/runs/885773677 ``` File ""/home/runner/work/github-to-sqlite/github-to-sqlite/github_to_sqlite/utils.py"", line 476, in save_tags db[""tags""].insert_all( File ""/opt/hostedtoolcache/Python/3.8.3/x64/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1145, in insert_all result = self.db.conn.execute(query, params) sqlite3.OperationalError: table tags has no column named repo ``` That's because I changed the name in #44. I thought this would be safe since no-one else could possibly be using this yet (it hadn't shipped in a release) but turns out I broke my demo!",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/45/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 681086659,MDU6SXNzdWU2ODEwODY2NTk=,47,emojis command,9599,simonw,closed,0,,,,,1,2020-08-18T14:26:26Z,2020-08-18T14:52:13Z,2020-08-18T14:52:13Z,MEMBER,,For fun - it can import https://api.github.com/emojis - maybe with an option to fetch the binary representations in addition to the URLs.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/47/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 681228542,MDExOlB1bGxSZXF1ZXN0NDY5NjUxNzMy,48,Add pull requests,755825,adamjonas,closed,0,,,,,2,2020-08-18T17:58:44Z,2020-11-29T23:51:09Z,2020-11-29T23:51:09Z,CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/48,"ref #46 Issues don't have merge information on them, which means that PRs need to be pulled separately. Did my best to mimic the API of issues.",207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/48/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 493671014,MDU6SXNzdWU0OTM2NzEwMTQ=,5,"Add ""incomplete"" boolean to users table for incomplete profiles",9599,simonw,closed,0,,,,,2,2019-09-14T22:01:50Z,2020-03-23T19:23:31Z,2020-03-23T19:23:30Z,MEMBER,,"User profiles that are fetched from e.g. stargazers (#4) are incomplete - they have a login but they don't have name, company etc. Add a `incomplete` boolean flag to the `users` table to record this. Then later I can add a `backfill-users` command which loops through and fetches missing data for those incomplete profiles.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 724264574,MDU6SXNzdWU3MjQyNjQ1NzQ=,52,Option to fetch README and/or HTML-rendered README for repos,9599,simonw,closed,0,,,,,0,2020-10-19T05:10:24Z,2020-10-19T05:33:42Z,2020-10-19T05:33:42Z,MEMBER,,"I'm thinking: github-to-sqlite repos ... --readme # Populates readme column with raw text github-to-sqlite repos ... --readme-html # Populates readme_html column with raw HTML https://developer.github.com/v3/repos/contents/#get-a-repository-readme",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/52/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 753026003,MDU6SXNzdWU3NTMwMjYwMDM=,54,github-to-sqlite workflows command,9599,simonw,closed,0,,,,,3,2020-11-29T21:56:42Z,2020-11-29T22:08:46Z,2020-11-29T21:57:17Z,MEMBER,,"A command that fetches the YAML workflows for different repos, parses them and stores them in relational tables would be really useful for maintaining larger numbers of workflows.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/54/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 753026388,MDU6SXNzdWU3NTMwMjYzODg=,55,github-to-sqlite workflows does not correctly replace existing records,9599,simonw,closed,0,,,,,0,2020-11-29T21:58:43Z,2020-11-29T23:48:50Z,2020-11-29T23:48:50Z,MEMBER,,Following #54 - see this TODO: https://github.com/dogsheep/github-to-sqlite/blob/1b23ce11953f9f59c0161ea1f99188b55b5ea11c/github_to_sqlite/utils.py#L700,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/55/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 753122082,MDU6SXNzdWU3NTMxMjIwODI=,56,Link to example tables from the README,9599,simonw,closed,0,,,,,0,2020-11-30T04:01:51Z,2020-11-30T04:10:27Z,2020-11-30T04:10:27Z,MEMBER,,Would help demonstrate how the tool works.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/56/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 758944006,MDU6SXNzdWU3NTg5NDQwMDY=,57,--readme throws 404 error if README does not exist in repo,9599,simonw,closed,0,,,,,0,2020-12-07T23:58:49Z,2020-12-16T18:17:54Z,2020-12-16T18:17:54Z,MEMBER,,It should fail silently (populate the column with a null) instead.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/57/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 769150394,MDU6SXNzdWU3NjkxNTAzOTQ=,58,Readme HTML has broken internal links,9599,simonw,closed,0,,,,,2,2020-12-16T17:58:11Z,2020-12-16T19:20:14Z,2020-12-16T19:20:14Z,MEMBER,,"From https://github.com/simonw/datasette.io/issues/46 ```html
  • Filtering tables
  • ...

    Filtering tables

    ``` So this is a bug in GitHub's API, but we need to work around it.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/58/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 771872303,MDExOlB1bGxSZXF1ZXN0NTQzMjQ2NTM1,59,Remove unneeded exists=True for -a/--auth flag.,631242,frosencrantz,closed,0,,,,,3,2020-12-21T06:03:55Z,2021-05-22T14:06:19Z,2021-05-19T16:08:12Z,CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/59,The file does not need to exist when using an environment variable.,207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/59/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 504238461,MDU6SXNzdWU1MDQyMzg0NjE=,6,sqlite3.OperationalError: table users has no column named bio,1055831,dazzag24,closed,0,,,,,2,2019-10-08T19:39:52Z,2019-10-13T05:31:28Z,2019-10-13T05:30:19Z,NONE,,"``` $ github-to-sqlite repos github.db $ github-to-sqlite starred github.db dazzag24 Traceback (most recent call last): File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/bin/github-to-sqlite"", line 10, in sys.exit(cli()) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/cli.py"", line 106, in starred utils.save_stars(db, user, stars) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 177, in save_stars user_id = save_user(db, user) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 61, in save_user return db[""users""].upsert(to_save, pk=""id"").last_pk File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 1067, in upsert extracts=extracts, File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 916, in insert extracts=extracts, File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 1024, in insert_all result = self.db.conn.execute(sql, values) sqlite3.OperationalError: table users has no column named bio ``` ``` $ pipenv graph github-to-sqlite==0.4 - requests [required: Any, installed: 2.22.0] - certifi [required: >=2017.4.17, installed: 2019.9.11] - chardet [required: >=3.0.2,<3.1.0, installed: 3.0.4] - idna [required: >=2.5,<2.9, installed: 2.8] - urllib3 [required: >=1.21.1,<1.26,!=1.25.1,!=1.25.0, installed: 1.25.6] - sqlite-utils [required: ~=1.11, installed: 1.11] - click [required: Any, installed: 7.0] - click-default-group [required: Any, installed: 1.2.2] - click [required: Any, installed: 7.0] - tabulate [required: Any, installed: 0.8.5] Python 3.6.8 ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 797108702,MDExOlB1bGxSZXF1ZXN0NTY0MTcyMTQw,61,fixing typo in get cli help text,22578954,daniel-butler,closed,0,,,,,1,2021-01-29T18:57:04Z,2021-05-19T16:07:09Z,2021-05-19T16:07:09Z,CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/61,,207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/61/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 506276893,MDU6SXNzdWU1MDYyNzY4OTM=,7,issue-comments command for importing issue comments,9599,simonw,closed,0,,,,,1,2019-10-13T05:23:58Z,2019-10-14T14:44:12Z,2019-10-13T05:24:30Z,MEMBER,,Using this API: https://developer.github.com/v3/issues/comments/,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1177059481,I_kwDODFdgUs5GKICZ,71,Store commit parents,64686,carltongibson,closed,0,,,,,0,2022-03-22T17:06:48Z,2022-04-22T12:44:04Z,2022-04-22T12:44:04Z,NONE,,"Hi @simonw 👋 Currently, stored commit data doesn't quite give me the information I'm needing... Committer date and author date are not 100% reliable for dividing a commit history up by release or branch. A PR created before a release but merged after can have earlier dates… — this can be quite frustrating if you're trying to pin down commits for a release: _It should be there!_, but then isn't. (This gets worse using release branches.) Would you be open to adding the `sha` of a `parent` of a commit to the commit table? (As an FK? 🤔 — likely not feasible.) It's part of the [response body](https://docs.github.com/en/rest/reference/commits#get-a-commit): ``` ""parents"": [ { ""url"": ""https://api.github.com/repos/octocat/Hello-World/commits/6dcb09b5b57875f334f61aebed695e2e4193db5e"", ""sha"": ""6dcb09b5b57875f334f61aebed695e2e4193db5e"" } ], ``` I think this list should only have a single entry. (🤔 — not sure why it's a list then...) With this it would be possible to build/reconstruct a chain of commits from the history, that I don't **think** is available as yet (unless you know a better way). It is certainly possible to get sequential lists of commits out of git directly, so the same would be possible combining tools, but wondering if a single tool could do it. What do you think? Thanks! 🏅 ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/71/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1261884917,PR_kwDODFdgUs45K1L3,73,Fixing 'NoneType' object has no attribute 'items',1224205,empjustine,closed,0,,,,,1,2022-06-06T13:58:11Z,2022-07-18T19:40:12Z,2022-07-18T19:40:12Z,CONTRIBUTOR,dogsheep/github-to-sqlite/pulls/73,"Under some conditions, GitHub caches removed starred repositories and ends up leaving dangling `None` user references. Traceback (most recent call last): File ""/home/dogsheep/dogsheep/github-to-sqlite/bin/github-to-sqlite"", line 8, in sys.exit(cli()) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py"", line 1130, in __call__ return self.main(*args, **kwargs) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py"", line 1055, in main rv = self.invoke(ctx) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py"", line 1657, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/click/core.py"", line 760, in invoke return __callback(*args, **kwargs) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/github_to_sqlite/cli.py"", line 181, in starred utils.save_stars(db, user, stars) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/github_to_sqlite/utils.py"", line 494, in save_stars repo_id = save_repo(db, repo) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/github_to_sqlite/utils.py"", line 308, in save_repo to_save[""owner""] = save_user(db, to_save[""owner""]) File ""/home/dogsheep/dogsheep/github-to-sqlite/lib64/python3.10/site-packages/github_to_sqlite/utils.py"", line 229, in save_user for key, value in user.items() AttributeError: 'NoneType' object has no attribute 'items'",207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/73/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1308461063,I_kwDODFdgUs5N_YgH,74,500 error in github-to-sqlite demo,9599,simonw,closed,0,,,,,5,2022-07-18T19:39:32Z,2022-07-18T21:16:18Z,2022-07-18T21:14:22Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/issue_comments throws a 500: > `cannot import name 'etree' from 'markdown.util' (/usr/local/lib/python3.8/site-packages/markdown/util.py)` https://console.cloud.google.com/run/detail/us-central1/github-to-sqlite/metrics?project=datasette-222320 suggests this started happening 3 days ago.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/74/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516763727,MDExOlB1bGxSZXF1ZXN0MzM1OTgwMjQ2,8,"stargazers command, refs #4",9599,simonw,closed,0,,,,,5,2019-11-03T00:37:36Z,2020-05-02T20:00:27Z,2020-05-02T20:00:26Z,MEMBER,dogsheep/github-to-sqlite/pulls/8,Needs tests. Refs #4.,207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 516769276,MDU6SXNzdWU1MTY3NjkyNzY=,9,Commands do not work without an auth.json file,9599,simonw,closed,0,,,,,0,2019-11-03T01:54:28Z,2019-11-11T05:30:48Z,2019-11-11T05:30:48Z,MEMBER,,"`auth.json` is meant to be optional. If it's not provided, the tool should make heavily rate-limited unauthenticated requests. ``` $ github-to-sqlite repos .data/repos.db simonw Usage: github-to-sqlite repos [OPTIONS] DB_PATH [USERNAME] Try ""github-to-sqlite repos --help"" for help. Error: Invalid value for ""-a"" / ""--auth"": File ""auth.json"" does not exist. ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585526292,MDU6SXNzdWU1ODU1MjYyOTI=,1,Set up full text search,9599,simonw,closed,0,,,,,1,2020-03-21T15:57:35Z,2020-03-21T19:47:46Z,2020-03-21T19:45:52Z,MEMBER,,"Should run against `title` and `text` in `items`, and `about` and `id` in `users`.",248903544,hacker-news-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470637068,MDU6SXNzdWU0NzA2MzcwNjg=,1,Use XML Analyser to figure out the structure of the export XML,9599,simonw,closed,0,,,,,1,2019-07-20T05:19:02Z,2019-07-20T05:20:09Z,2019-07-20T05:20:09Z,MEMBER,,https://github.com/simonw/xml_analyser,197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 519038979,MDU6SXNzdWU1MTkwMzg5Nzk=,10,Failed to import workout points,9599,simonw,closed,0,,,,,4,2019-11-07T04:50:22Z,2019-11-08T01:18:37Z,2019-11-08T01:18:37Z,MEMBER,,"I just ran the script and it failed to import any `workout_points`, though it did import `workouts`.",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 723838331,MDU6SXNzdWU3MjM4MzgzMzE=,11,export.xml file name varies with different language settings,572,jarib,closed,0,,,,,7,2020-10-17T20:07:18Z,2020-10-17T21:39:15Z,2020-10-17T21:14:10Z,NONE,,"The XML file exported from my phone has a Norwegian file name – `eksport.xml` 🙄 I can work around this by unpacking the zip and using `--xml`, but then I lose the workout points. Perhaps this could be solved by `--localized-xml eksport.xml`? Alternatively just fall back to the first XML file in the root folder of the zip. ",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 975158266,MDU6SXNzdWU5NzUxNTgyNjY=,19,table activity_summary has no column named appleMoveTime,9599,simonw,closed,0,,,,,0,2021-08-20T00:46:44Z,2021-08-20T00:54:34Z,2021-08-20T00:54:34Z,MEMBER,,"Got this error today against a fresh export: table activity_summary has no column named appleMoveTime ",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470637152,MDU6SXNzdWU0NzA2MzcxNTI=,2,Import workouts,9599,simonw,closed,0,,,,,1,2019-07-20T05:20:21Z,2019-07-20T06:21:41Z,2019-07-20T06:21:41Z,MEMBER,,From #1,197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470637206,MDU6SXNzdWU0NzA2MzcyMDY=,3,Import ActivitySummary,9599,simonw,closed,0,,,,,0,2019-07-20T05:21:00Z,2019-07-20T05:58:07Z,2019-07-20T05:58:07Z,MEMBER,,"From #1 ```python 'ActivitySummary': {'attr_counts': {'activeEnergyBurned': 980, 'activeEnergyBurnedGoal': 980, 'activeEnergyBurnedUnit': 980, 'appleExerciseTime': 980, 'appleExerciseTimeGoal': 980, 'appleStandHours': 980, 'appleStandHoursGoal': 980, 'dateComponents': 980}, 'child_counts': {}, 'count': 980, 'parent_counts': {'HealthData': 980}}, ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470640505,MDU6SXNzdWU0NzA2NDA1MDU=,4,Import Records,9599,simonw,closed,0,,,,,1,2019-07-20T06:11:20Z,2019-07-20T06:21:41Z,2019-07-20T06:21:41Z,MEMBER,,"From #1: ```python 'Record': {'attr_counts': {'creationDate': 2672233, 'device': 2665111, 'endDate': 2672233, 'sourceName': 2672233, 'sourceVersion': 2671779, 'startDate': 2672233, 'type': 2672233, 'unit': 2650012, 'value': 2672232}, 'child_counts': {'HeartRateVariabilityMetadataList': 2318, 'MetadataEntry': 287974}, 'count': 2672233, 'parent_counts': {'Correlation': 2, 'HealthData': 2672231}}, ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470691622,MDU6SXNzdWU0NzA2OTE2MjI=,5,Add progress bar,9599,simonw,closed,0,,,,,2,2019-07-20T16:29:07Z,2019-07-22T03:30:13Z,2019-07-22T02:49:22Z,MEMBER,,"Showing a progress bar would be nice, using Click. The easiest way to do this would probably be be to hook it up to the length of the compressed content, and update it as this code pushes more XML bytes through the parser: https://github.com/dogsheep/healthkit-to-sqlite/blob/d64299765064501f4efdd9a0b21dbdba9ec4287f/healthkit_to_sqlite/utils.py#L6-L10",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470856782,MDU6SXNzdWU0NzA4NTY3ODI=,6,Break up records into different tables for each type,9599,simonw,closed,0,,,,,1,2019-07-22T01:54:59Z,2019-07-22T03:28:55Z,2019-07-22T03:28:50Z,MEMBER,,"I don't think there's much benefit to having all of the different record types stored in the same enormous table. Here's what I get when I use `_facet=type`: I'm going to try splitting these up into separate tables - so `HKQuantityTypeIdentifierBodyMassIndex` becomes a table called `rBodyMassIndex` - and see if that's nicer to work with.",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 472097220,MDU6SXNzdWU0NzIwOTcyMjA=,7,Script uses a lot of RAM,9599,simonw,closed,0,,,,,3,2019-07-24T06:11:11Z,2019-07-24T06:35:52Z,2019-07-24T06:35:52Z,MEMBER,,"I'm using an XML pull parser which should avoid the need to slurp the whole XML file into memory, but it's not working - the script still uses over 1GB of RAM when it runs according to Activity Monitor. I think this is because I'm still causing the full root element to be incrementally loaded into memory just in case I try and access it later. http://effbot.org/elementtree/iterparse.htm says I should use `elem.clear()` as I go. It also says: > The above pattern has one drawback; it does not clear the root element, so you will end up with a single element with lots of empty child elements. If your files are huge, rather than just large, this might be a problem. To work around this, you need to get your hands on the root element. So I will try that recipe and see if it helps.",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 472104705,MDExOlB1bGxSZXF1ZXN0MzAwNTgwMjIx,8,Use less RAM,9599,simonw,closed,0,,,,,0,2019-07-24T06:35:01Z,2019-07-24T06:35:52Z,2019-07-24T06:35:52Z,MEMBER,dogsheep/healthkit-to-sqlite/pulls/8,Closes #7,197882382,healthkit-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 472429048,MDU6SXNzdWU0NzI0MjkwNDg=,9,Too many SQL variables,166463,tholo,closed,0,,,,,4,2019-07-24T18:24:17Z,2019-07-26T10:01:05Z,2019-07-26T10:01:05Z,NONE,,"Decided to try importing my data, and ran into this: ``` Traceback (most recent call last): File ""/Users/tholo/Source/health/bin/healthkit-to-sqlite"", line 10, in sys.exit(cli()) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/cli.py"", line 50, in cli convert_xml_to_sqlite(fp, db, progress_callback=bar.update) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py"", line 41, in convert_xml_to_sqlite write_records(records, db) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py"", line 80, in write_records column_order=[""startDate"", ""endDate"", ""value"", ""unit""], File ""/Users/tholo/Source/health/lib/python3.7/site-packages/sqlite_utils/db.py"", line 911, in insert_all result = self.db.conn.execute(sql, values) sqlite3.OperationalError: too many SQL variables ``` Added some debug output in sqlite_utils/db.py, which resulted in: ``` INSERT INTO [rBodyMassIndex] ([creationDate], [endDate], [metadata_HKWasUserEntered], [metadata_Health Mate App Version], [metadata_Modified Date], [metadata_Withings Link], [metadata_Withings User Identifier], [sourceName], [sourceVersion], [startDate], [unit], [value]) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) ; ``` with the attached data: ``` ['2019-06-27 22:55:10 -0700', '2011-06-22 21:05:53 -0700', '0', '4.4.2', '2011-06-23 04:05:53 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308801953&type=1', '301293', 'Health Mate', '4040200', '2011-06-22 21:05:53 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-06-23 09:36:27 -0700', '0', '4.4.2', '2011-06-23 16:36:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308846987&type=1', '301293', 'Health Mate', '4040200', '2011-06-23 09:36:27 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-06-23 23:54:07 -0700', '0', '4.4.2', '2011-06-24 06:55:19 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308898447&type=1', '301293', 'Health Mate', '4040200', '2011-06-23 23:54:07 -0700', 'count', '30.679', '2019-06-27 22:55:10 -0700', '2011-06-24 09:13:40 -0700', '0', '4.4.2', '2011-06-24 16:14:35 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308932020&type=1', '301293', 'Health Mate', '4040200', '2011-06-24 09:13:40 -0700', 'count', '30.3549', '2019-06-27 22:55:10 -0700', '2011-06-25 08:30:08 -0700', '0', '4.4.2', '2011-06-25 15:30:49 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309015808&type=1', '301293', 'Health Mate', '4040200', '2011-06-25 08:30:08 -0700', 'count', '30.3395', '2019-06-27 22:55:10 -0700', '2011-06-26 07:47:51 -0700', '0', '4.4.2', '2011-06-26 14:48:27 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309099671&type=1', '301293', 'Health Mate', '4040200', '2011-06-26 07:47:51 -0700', 'count', '30.2315', '2019-06-27 22:55:10 -0700', '2011-06-28 08:48:26 -0700', '0', '4.4.2', '2011-06-28 15:49:13 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309276106&type=1', '301293', 'Health Mate', '4040200', '2011-06-28 08:48:26 -0700', 'count', '30.0617', '2019-06-27 22:55:10 -0700', '2011-06-29 09:21:16 -0700', '0', '4.4.2', '2011-06-29 16:21:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309364476&type=1', '301293', 'Health Mate', '4040200', '2011-06-29 09:21:16 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-06-30 08:41:46 -0700', '0', '4.4.2', '2011-06-30 15:42:30 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309448506&type=1', '301293', 'Health Mate', '4040200', '2011-06-30 08:41:46 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2011-07-01 09:05:28 -0700', '0', '4.4.2', '2011-07-01 16:06:24 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309536328&type=1', '301293', 'Health Mate', '4040200', '2011-07-01 09:05:28 -0700', 'count', '29.8611', '2019-06-27 22:55:10 -0700', '2011-07-02 08:58:50 -0700', '0', '4.4.2', '2011-07-02 15:59:40 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309622330&type=1', '301293', 'Health Mate', '4040200', '2011-07-02 08:58:50 -0700', 'count', '29.8765', '2019-06-27 22:55:10 -0700', '2011-07-04 09:33:43 -0700', '0', '4.4.2', '2011-07-04 16:34:19 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309797223&type=1', '301293', 'Health Mate', '4040200', '2011-07-04 09:33:43 -0700', 'count', '30.0309', '2019-06-27 22:55:10 -0700', '2011-07-06 09:40:23 -0700', '0', '4.4.2', '2011-07-06 16:41:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309970423&type=1', '301293', 'Health Mate', '4040200', '2011-07-06 09:40:23 -0700', 'count', '30.1852', '2019-06-27 22:55:10 -0700', '2011-07-08 08:08:48 -0700', '0', '4.4.2', '2011-07-08 15:09:51 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310137728&type=1', '301293', 'Health Mate', '4040200', '2011-07-08 08:08:48 -0700', 'count', '30.0309', '2019-06-27 22:55:10 -0700', '2011-07-09 08:31:05 -0700', '0', '4.4.2', '2011-07-09 15:31:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310225465&type=1', '301293', 'Health Mate', '4040200', '2011-07-09 08:31:05 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-07-10 08:14:36 -0700', '0', '4.4.2', '2011-07-10 15:15:12 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310310876&type=1', '301293', 'Health Mate', '4040200', '2011-07-10 08:14:36 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-07-12 07:55:21 -0700', '0', '4.4.2', '2011-07-12 14:55:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310482521&type=1', '301293', 'Health Mate', '4040200', '2011-07-12 07:55:21 -0700', 'count', '30.108', '2019-06-27 22:55:10 -0700', '2011-07-13 08:48:05 -0700', '0', '4.4.2', '2011-07-13 15:48:42 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310572085&type=1', '301293', 'Health Mate', '4040200', '2011-07-13 08:48:05 -0700', 'count', '30', '2019-06-27 22:55:10 -0700', '2011-07-14 09:05:16 -0700', '0', '4.4.2', '2011-07-14 16:05:57 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310659516&type=1', '301293', 'Health Mate', '4040200', '2011-07-14 09:05:16 -0700', 'count', '29.9074', '2019-06-27 22:55:10 -0700', '2011-07-15 07:09:56 -0700', '0', '4.4.2', '2011-07-15 14:10:35 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310738996&type=1', '301293', 'Health Mate', '4040200', '2011-07-15 07:09:56 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-07-16 09:26:04 -0700', '0', '4.4.2', '2011-07-16 16:26:44 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310833564&type=1', '301293', 'Health Mate', '4040200', '2011-07-16 09:26:04 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-07-17 09:52:59 -0700', '0', '4.4.2', '2011-07-17 16:53:38 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310921579&type=1', '301293', 'Health Mate', '4040200', '2011-07-17 09:52:59 -0700', 'count', '29.8765', '2019-06-27 22:55:10 -0700', '2011-07-19 08:56:16 -0700', '0', '4.4.2', '2011-07-19 15:57:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311090976&type=1', '301293', 'Health Mate', '4040200', '2011-07-19 08:56:16 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-07-21 08:21:20 -0700', '0', '4.4.2', '2011-07-21 15:22:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311261680&type=1', '301293', 'Health Mate', '4040200', '2011-07-21 08:21:20 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-07-23 08:49:56 -0700', '0', '4.4.2', '2011-07-23 15:50:40 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311436196&type=1', '301293', 'Health Mate', '4040200', '2011-07-23 08:49:56 -0700', 'count', '29.7222', '2019-06-27 22:55:10 -0700', '2011-07-24 09:17:35 -0700', '0', '4.4.2', '2011-07-24 16:18:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311524255&type=1', '301293', 'Health Mate', '4040200', '2011-07-24 09:17:35 -0700', 'count', '29.5833', '2019-06-27 22:55:10 -0700', '2011-07-25 07:51:55 -0700', '0', '4.4.2', '2011-07-25 14:52:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311605515&type=1', '301293', 'Health Mate', '4040200', '2011-07-25 07:51:55 -0700', 'count', '29.5525', '2019-06-27 22:55:10 -0700', '2011-08-06 10:04:05 -0700', '0', '4.4.2', '2011-08-06 17:04:47 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312650245&type=1', '301293', 'Health Mate', '4040200', '2011-08-06 10:04:05 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-08-08 07:52:22 -0700', '0', '4.4.2', '2011-08-08 14:53:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312815142&type=1', '301293', 'Health Mate', '4040200', '2011-08-08 07:52:22 -0700', 'count', '29.6605', '2019-06-27 22:55:10 -0700', '2011-08-10 07:57:30 -0700', '0', '4.4.2', '2011-08-10 14:58:12 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312988250&type=1', '301293', 'Health Mate', '4040200', '2011-08-10 07:57:30 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-08-12 07:51:14 -0700', '0', '4.4.2', '2011-08-12 14:51:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1313160674&type=1', '301293', 'Health Mate', '4040200', '2011-08-12 07:51:14 -0700', 'count', '29.6914', '2019-06-27 22:55:10 -0700', '2011-08-13 07:45:28 -0700', '0', '4.4.2', '2011-08-13 14:46:08 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1313246728&type=1', '301293', 'Health Mate', '4040200', '2011-08-13 07:45:28 -0700', 'count', '29.5833', '2019-06-27 22:55:10 -0700', '2011-08-17 09:06:20 -0700', '0', '4.4.2', '2011-08-17 16:07:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1313597180&type=1', '301293', 'Health Mate', '4040200', '2011-08-17 09:06:20 -0700', 'count', '29.5679', '2019-06-27 22:55:10 -0700', '2011-08-22 08:28:08 -0700', '0', '4.4.2', '2011-08-22 15:28:57 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314026888&type=1', '301293', 'Health Mate', '4040200', '2011-08-22 08:28:08 -0700', 'count', '29.9846', '2019-06-27 22:55:10 -0700', '2011-08-25 08:59:30 -0700', '0', '4.4.2', '2011-08-25 16:00:15 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314287970&type=1', '301293', 'Health Mate', '4040200', '2011-08-25 08:59:30 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-08-30 08:13:59 -0700', '0', '4.4.2', '2011-08-30 15:46:08 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314717239&type=1', '301293', 'Health Mate', '4040200', '2011-08-30 08:13:59 -0700', 'count', '29.784', '2019-06-27 22:55:10 -0700', '2011-09-12 08:47:51 -0700', '0', '4.4.2', '2011-09-12 15:48:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1315842471&type=1', '301293', 'Health Mate', '4040200', '2011-09-12 08:47:51 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-09-13 09:17:27 -0700', '0', '4.4.2', '2011-09-13 16:48:30 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1315930647&type=1', '301293', 'Health Mate', '4040200', '2011-09-13 09:17:27 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-10-01 09:12:20 -0700', '0', '4.4.2', '2011-10-01 16:13:00 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1317485540&type=1', '301293', 'Health Mate', '4040200', '2011-10-01 09:12:20 -0700', 'count', '29.8148', '2019-06-27 22:55:10 -0700', '2011-10-11 11:14:11 -0700', '0', '4.4.2', '2011-10-11 18:15:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1318356851&type=1', '301293', 'Health Mate', '4040200', '2011-10-11 11:14:11 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-10-16 09:29:47 -0700', '0', '4.4.2', '2011-10-16 16:30:39 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1318782587&type=1', '301293', 'Health Mate', '4040200', '2011-10-16 09:29:47 -0700', 'count', '29.6914', '2019-06-27 22:55:10 -0700', '2011-10-19 09:21:44 -0700', '0', '4.4.2', '2011-10-19 16:22:25 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1319041304&type=1', '301293', 'Health Mate', '4040200', '2011-10-19 09:21:44 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-10-24 07:04:22 -0700', '0', '4.4.2', '2011-10-24 14:05:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1319465062&type=1', '301293', 'Health Mate', '4040200', '2011-10-24 07:04:22 -0700', 'count', '29.5988', '2019-06-27 22:55:10 -0700', '2011-11-07 09:33:17 -0700', '0', '4.4.2', '2011-11-07 16:33:58 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1320683597&type=1', '301293', 'Health Mate', '4040200', '2011-11-07 09:33:17 -0700', 'count', '29.8611', '2019-06-27 22:55:10 -0700', '2011-11-10 07:59:03 -0700', '0', '4.4.2', '2011-11-10 14:59:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1320937143&type=1', '301293', 'Health Mate', '4040200', '2011-11-10 07:59:03 -0700', 'count', '29.9383', '2019-06-27 22:55:10 -0700', '2011-11-13 09:28:31 -0700', '0', '4.4.2', '2011-11-13 16:29:20 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1321201711&type=1', '301293', 'Health Mate', '4040200', '2011-11-13 09:28:31 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-11-21 08:45:06 -0700', '0', '4.4.2', '2011-11-21 15:46:04 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1321890306&type=1', '301293', 'Health Mate', '4040200', '2011-11-21 08:45:06 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-11-23 09:55:44 -0700', '0', '4.4.2', '2011-11-23 16:56:18 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322067344&type=1', '301293', 'Health Mate', '4040200', '2011-11-23 09:55:44 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2011-11-29 09:50:44 -0700', '0', '4.4.2', '2011-11-29 16:51:31 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322585444&type=1', '301293', 'Health Mate', '4040200', '2011-11-29 09:50:44 -0700', 'count', '30.1698', '2019-06-27 22:55:10 -0700', '2011-11-30 11:13:21 -0700', '0', '4.4.2', '2011-11-30 18:14:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322676801&type=1', '301293', 'Health Mate', '4040200', '2011-11-30 11:13:21 -0700', 'count', '30.0617', '2019-06-27 22:55:10 -0700', '2011-12-04 10:24:36 -0700', '0', '4.4.2', '2011-12-04 17:25:24 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1323019476&type=1', '301293', 'Health Mate', '4040200', '2011-12-04 10:24:36 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-12-10 09:22:18 -0700', '0', '4.4.2', '2011-12-10 16:23:07 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1323534138&type=1', '301293', 'Health Mate', '4040200', '2011-12-10 09:22:18 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-12-26 10:36:42 -0700', '0', '4.4.2', '2011-12-26 17:37:31 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1324921002&type=1', '301293', 'Health Mate', '4040200', '2011-12-26 10:36:42 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2012-01-11 11:24:13 -0700', '0', '4.4.2', '2012-01-11 18:25:04 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326306253&type=1', '301293', 'Health Mate', '4040200', '2012-01-11 11:24:13 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2012-01-15 10:17:09 -0700', '0', '4.4.2', '2012-01-15 17:17:51 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326647829&type=1', '301293', 'Health Mate', '4040200', '2012-01-15 10:17:09 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2012-01-19 09:24:32 -0700', '0', '4.4.2', '2012-01-19 16:25:21 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326990272&type=1', '301293', 'Health Mate', '4040200', '2012-01-19 09:24:32 -0700', 'count', '29.7994', '2019-06-27 22:55:10 -0700', '2012-01-29 10:26:13 -0700', '0', '4.4.2', '2012-01-29 17:26:52 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1327857973&type=1', '301293', 'Health Mate', '4040200', '2012-01-29 10:26:13 -0700', 'count', '30.0154', '2019-06-27 22:55:10 -0700', '2012-02-03 10:13:28 -0700', '0', '4.4.2', '2012-02-03 17:15:01 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1328289208&type=1', '301293', 'Health Mate', '4040200', '2012-02-03 10:13:28 -0700', 'count', '29.8457', '2019-06-27 22:55:10 -0700', '2012-02-12 09:23:01 -0700', '0', '4.4.2', '2012-02-12 16:23:53 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1329063781&type=1', '301293', 'Health Mate', '4040200', '2012-02-12 09:23:01 -0700', 'count', '30.1235', '2019-06-27 22:55:10 -0700', '2012-03-03 09:26:06 -0700', '0', '4.4.2', '2012-03-03 16:26:54 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1330791966&type=1', '301293', 'Health Mate', '4040200', '2012-03-03 09:26:06 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2012-03-11 11:23:15 -0700', '0', '4.4.2', '2012-03-11 18:24:16 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1331490195&type=1', '301293', 'Health Mate', '4040200', '2012-03-11 11:23:15 -0700', 'count', '30.2161', '2019-06-27 22:55:10 -0700', '2012-03-16 09:39:36 -0700', '0', '4.4.2', '2012-03-16 16:40:20 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1331915976&type=1', '301293', 'Health Mate', '4040200', '2012-03-16 09:39:36 -0700', 'count', '30.2778', '2019-06-27 22:55:10 -0700', '2012-03-21 08:33:07 -0700', '0', '4.4.2', '2012-03-21 15:34:00 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1332343987&type=1', '301293', 'Health Mate', '4040200', '2012-03-21 08:33:07 -0700', 'count', '30.1389', '2019-06-27 22:55:10 -0700', '2012-04-11 08:49:34 -0700', '0', '4.4.2', '2012-04-11 15:50:18 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1334159374&type=1', '301293', 'Health Mate', '4040200', '2012-04-11 08:49:34 -0700', 'count', '30.0154', '2019-06-27 22:55:10 -0700', '2012-04-13 08:32:06 -0700', '0', '4.4.2', '2012-04-13 15:32:49 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1334331126&type=1', '301293', 'Health Mate', '4040200', '2012-04-13 08:32:06 -0700', 'count', '29.9383', '2019-06-27 22:55:10 -0700', '2012-04-20 08:21:38 -0700', '0', '4.4.2', '2012-04-20 15:52:45 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1334935298&type=1', '301293', 'Health Mate', '4040200', '2012-04-20 08:21:38 -0700', 'count', '30.2006', '2019-06-27 22:55:10 -0700', '2012-04-25 09:00:01 -0700', '0', '4.4.2', '2012-04-25 16:00:42 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1335369601&type=1', '301293', 'Health Mate', '4040200', '2012-04-25 09:00:01 -0700', 'count', '30.2006', '2019-06-27 22:55:10 -0700', '2012-05-04 11:10:18 -0700', '0', '4.4.2', '2012-05-04 18:10:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1336155018&type=1', '301293', 'Health Mate', '4040200', '2012-05-04 11:10:18 -0700', 'count', '30.4321', '2019-06-27 22:55:10 -0700', '2012-05-12 09:35:00 -0700', '0', '4.4.2', '2012-05-12 16:35:43 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1336840500&type=1', '301293', 'Health Mate', '4040200', '2012-05-12 09:35:00 -0700', 'count', '30.1235', '2019-06-27 22:55:10 -0700', '2012-05-22 09:27:53 -0700', '0', '4.4.2', '2012-05-22 16:28:37 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1337704073&type=1', '301293', 'Health Mate', '4040200', '2012-05-22 09:27:53 -0700', 'count', '30.4167', '2019-06-27 22:55:10 -0700', '2012-05-31 09:23:16 -0700', '0', '4.4.2', '2012-05-31 16:24:04 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1338481396&type=1', '301293', 'Health Mate', '4040200', '2012-05-31 09:23:16 -0700', 'count', '30.2006', '2019-06-27 22:55:10 -0700', '2012-06-08 09:29:07 -0700', '0', '4.4.2', '2012-06-08 16:29:52 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1339172947&type=1', '301293', 'Health Mate', '4040200', '2012-06-08 09:29:07 -0700', 'count', '30.5247', '2019-06-27 22:55:10 -0700', '2012-06-21 08:07:33 -0700', '0', '4.4.2', '2012-06-21 15:08:20 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1340291253&type=1', '301293', 'Health Mate', '4040200', '2012-06-21 08:07:33 -0700', 'count', '30.5864', '2019-06-27 22:55:10 -0700', '2012-08-08 10:02:22 -0700', '0', '4.4.2', '2012-08-08 17:03:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1344445342&type=1', '301293', 'Health Mate', '4040200', '2012-08-08 10:02:22 -0700', 'count', '30.6636', '2019-06-27 22:55:10 -0700', '2012-08-17 09:11:32 -0700', '0', '4.4.2', '2012-08-17 16:42:05 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1345219892&type=1', '301293', 'Health Mate', '4040200', '2012-08-17 09:11:32 -0700', 'count', '30.8796', '2019-06-27 22:55:10 -0700', '2012-09-10 08:27:21 -0700', '0', '4.4.2', '2012-09-10 15:28:07 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1347290841&type=1', '301293', 'Health Mate', '4040200', '2012-09-10 08:27:21 -0700', 'count', '31.034', '2019-06-27 22:55:10 -0700', '2012-09-17 08:35:33 -0700', '0', '4.4.2', '2012-09-17 15:35:33 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1347896133&type=1', '301293', 'Health Mate', '4040200', '2012-09-17 08:35:33 -0700', 'count', '30.7099', '2019-06-27 22:55:10 -0700', '2012-09-26 08:59:46 -0700', '0', '4.4.2', '2012-09-26 16:13:18 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1348675186&type=1', '301293', 'Health Mate', '4040200', '2012-09-26 08:59:46 -0700', 'count', '30.679', '2019-06-27 22:55:10 -0700', '2012-10-18 08:51:16 -0700', '0', '4.4.2', '2012-10-18 15:51:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1350575476&type=1', '301293', 'Health Mate', '4040200', '2012-10-18 08:51:16 -0700', 'count', '30.7716', '2019-06-27 22:55:10 -0700', '2012-11-15 08:54:57 -0700', '0', '4.4.2', '2012-11-15 15:55:58 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1352994897&type=1', '301293', 'Health Mate', '4040200', '2012-11-15 08:54:57 -0700', 'count', '31.0802', '2019-06-27 22:55:10 -0700', '2012-12-17 09:13:40 -0700', '0', '4.4.2', '2012-12-17 16:20:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1355760820&type=1', '301293', 'Health Mate', '4040200', '2012-12-17 09:13:40 -0700', 'count', '29.784', '2019-06-27 22:55:10 -0700', '2012-12-19 11:09:55 -0700', '0', '4.4.2', '2012-12-19 18:10:37 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1355940595&type=1', '301293', 'Health Mate', '4040200', '2012-12-19 11:09:55 -0700', 'count', '29.6914', '2019-06-27 22:55:10 -0700', '2012-12-25 10:37:41 -0700', '0', '4.4.2', '2012-12-25 17:38:25 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1356457061&type=1', '301293', 'Health Mate', '4040200', '2012-12-25 10:37:41 -0700', 'count', '29.8765', '2019-06-27 22:55:10 -0700', '2013-01-01 10:44:02 -0700', '0', '4.4.2', '2013-01-01 17:44:46 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1357062242&type=1', '301293', 'Health Mate', '4040200', '2013-01-01 10:44:02 -0700', 'count', '30.0772', '2019-06-27 22:55:10 -0700', '2013-01-15 09:10:46 -0700', '0', '4.4.2', '2013-01-15 16:11:28 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1358266246&type=1', '301293', 'Health Mate', '4040200', '2013-01-15 09:10:46 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2013-01-20 11:03:39 -0700', '0', '4.4.2', '2013-01-20 18:04:22 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1358705019&type=1', '301293', 'Health Mate', '4040200', '2013-01-20 11:03:39 -0700', 'count', '30.108', '2019-06-27 22:55:10 -0700', '2013-01-30 08:56:30 -0700', '0', '4.4.2', '2013-01-30 15:57:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1359561390&type=1', '301293', 'Health Mate', '4040200', '2013-01-30 08:56:30 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2013-02-04 11:02:35 -0700', '0', '4.4.2', '2013-02-04 18:03:25 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1360000955&type=1', '301293', 'Health Mate', '4040200', '2013-02-04 11:02:35 -0700', 'count', '29.8148', '2019-06-27 22:55:10 -0700', '2013-02-07 09:07:06 -0700', '0', '4.4.2', '2013-02-07 16:07:49 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1360253226&type=1', '301293', 'Health Mate', '4040200', '2013-02-07 09:07:06 -0700', 'count', '30.1389', '2019-06-27 22:55:10 -0700', '2013-02-19 08:49:57 -0700', '0', '4.4.2', '2013-02-19 15:50:39 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1361288997&type=1', '301293', 'Health Mate', '4040200', '2013-02-19 08:49:57 -0700', 'count', '30.1235', '2019-06-27 22:55:10 -0700', '2013-03-02 11:20:54 -0700', '0', '4.4.2', '2013-03-02 18:21:38 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1362248454&type=1', '301293', 'Health Mate', '4040200', '2013-03-02 11:20:54 -0700', 'count', '30', '2019-06-27 22:55:10 -0700', '2013-04-23 08:05:30 -0700', '0', '4.4.2', '2013-04-23 15:06:59 +0000', 'withings-bd2://timeline/measure?user """""" id=301293&date=1366729530&type=1', '301293', 'Health Mate', '4040200', '2013-04-23 08:05:30 -0700', 'count', '30.5247', '2019-06-27 22:55:10 -0700', '2013-05-09 09:49:18 -0700', '0', '4.4.2', '2013-05-09 16:50:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1368118158&type=1', '301293', 'Health Mate', '4040200', '2013-05-09 09:49:18 -0700', 'count', '30.4167', '2019-06-27 22:55:10 -0700', '2013-06-09 09:28:47 -0700', '0', '4.4.2', '2013-06-09 16:29:30 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1370795327&type=1', '301293', 'Health Mate', '4040200', '2013-06-09 09:28:47 -0700', 'count', '30.8333', '2019-06-27 22:55:10 -0700', '2013-07-09 08:00:17 -0700', '0', '4.4.2', '2013-07-09 15:01:00 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1373382017&type=1', '301293', 'Health Mate', '4040200', '2013-07-09 08:00:17 -0700', 'count', '30.8179', '2019-06-27 22:55:10 -0700', '2013-07-28 09:16:55 -0700', '0', '4.4.2', '2013-07-28 16:17:39 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1375028215&type=1', '301293', 'Health Mate', '4040200', '2013-07-28 09:16:55 -0700', 'count', '30.5556', '2019-06-27 22:55:10 -0700', '2013-09-13 09:22:19 -0700', '0', '4.4.2', '2013-09-13 16:23:08 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1379089339&type=1', '301293', 'Health Mate', '4040200', '2013-09-13 09:22:19 -0700', 'count', '30.9568', '2019-06-27 22:55:10 -0700', '2013-09-24 08:08:23 -0700', '0', '4.4.2', '2013-09-24 15:09:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1380035303&type=1', '301293', 'Health Mate', '4040200', '2013-09-24 08:08:23 -0700', 'count', '31.4352', '2019-06-27 22:55:10 -0700', '2013-10-01 08:15:13 -0700', '0', '4.4.2', '2013-10-01 15:15:57 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1380640513&type=1', '301293', 'Health Mate', '4040200', '2013-10-01 08:15:13 -0700', 'count', '31.2037', '2019-06-27 22:55:10 -0700', '2013-10-23 09:31:25 -0700', '0', '4.4.2', '2013-10-23 16:32:13 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1382545885&type=1', '301293', 'Health Mate', '4040200', '2013-10-23 09:31:25 -0700', 'count', '31.8056'] ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503233021,MDU6SXNzdWU1MDMyMzMwMjE=,1,Use better pagination (and implement progress bar),9599,simonw,closed,0,,,,,4,2019-10-07T04:58:11Z,2020-03-27T22:13:57Z,2020-03-27T22:13:57Z,MEMBER,,"Right now we attempt to load everything at once - which caps out at 5,000 items and is really slow. We can do better by implementing pagination using count and offset.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1246826792,I_kwDODLZ_YM5KUREo,10,"When running `auth` command, don't overwrite an existing auth.json file",11887,ashanan,closed,0,,,,,3,2022-05-24T16:42:20Z,2022-09-07T15:07:38Z,2022-08-22T16:17:19Z,NONE,,"Ran the `auth` command in the same directory I'd previously set up an auth.json file for `twitter-to-sqlite` and it was completely overwritten. Not the biggest issue, but still unexpected. Ideally, for me, the keys would just be added to the existing file, but getting a warning and a chance to back out would be a good solution as well.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1345452427,I_kwDODLZ_YM5QMfmL,11,"-a option is used for ""--auth"" and for ""--all""",2467,fernand0,closed,0,,,,,3,2022-08-21T10:50:48Z,2022-08-21T21:11:57Z,2022-08-21T21:11:57Z,NONE,,"I'm not sure which option is best, instead of -a -all.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1795187493,I_kwDODLZ_YM5rAGMl,12,Switch to pyproject.toml,9599,simonw,closed,0,,,,,2,2023-07-09T01:06:56Z,2023-07-09T01:19:43Z,2023-07-09T01:19:42Z,MEMBER,,First of my CLI tools to use https://til.simonwillison.net/python/pyproject,213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503234169,MDU6SXNzdWU1MDMyMzQxNjk=,2,Track and use the 'since' value,9599,simonw,closed,0,,,,,3,2019-10-07T05:02:59Z,2020-03-27T22:22:30Z,2020-03-27T22:22:30Z,MEMBER,,"Pocket says: > Whenever possible, you should use the since parameter, or count and and offset parameters when retrieving a user's list. After retrieving the list, you should store the current time (which is provided along with the list response) and pass that in the next request for the list. This way the server only needs to return a small set (changes since that time) instead of the user's entire list every time. At the bottom of https://getpocket.com/developer/docs/v3/retrieve",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 589402939,MDU6SXNzdWU1ODk0MDI5Mzk=,4,"Store authentication information as ""pocket_access_token"" etc",9599,simonw,closed,0,,,,,0,2020-03-27T20:43:22Z,2020-03-27T20:43:59Z,2020-03-27T20:43:59Z,MEMBER,,The `pocket_` prefix will mean that the same `auth.json` file can be used for other Dogsheep tools without Pocket over-riding a value set by some other tool.,213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 750141615,MDExOlB1bGxSZXF1ZXN0NTI2ODQ3ODIz,7,Fixed conflicting CLI flags,8944,tlockney,closed,0,,,,,1,2020-11-24T23:25:12Z,2022-08-21T21:11:56Z,2022-08-21T21:11:56Z,CONTRIBUTOR,dogsheep/pocket-to-sqlite/pulls/7,"The `-a` used for the auth credentials and the shortened form of the `--all` flags were in conflict on the `fetch` command. To be consistent with other `-to-sqlite` libraries in the Dogsheep ecosystem, I removed the shortened form of the `--all` flag.",213286752,pocket-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 487598042,MDU6SXNzdWU0ODc1OTgwNDI=,1,Implement code to pull checkins from the Foursquare API,9599,simonw,closed,0,,,,,0,2019-08-30T17:40:02Z,2019-08-30T18:23:24Z,2019-08-30T18:23:24Z,MEMBER,,"The tool currently only works with a pre-prepared JSON file of checkins. When called without options, it should prompt the user to paste in a Foursquare OAuth token. The `--token=` option should work too, and should be backed up by an optional environment variable.",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 719637258,MDExOlB1bGxSZXF1ZXN0NTAxNzkxNjYz,10,Update utils.py to fix sqlite3.OperationalError,29426418,mattiaborsoi,closed,0,,,,,1,2020-10-12T20:17:53Z,2020-10-12T20:25:10Z,2020-10-12T20:25:09Z,CONTRIBUTOR,dogsheep/swarm-to-sqlite/pulls/10,"Fixes the errors: - sqlite3.OperationalError: table posts has no column named text - sqlite3.OperationalError: table photos has no column named hasSticker That will cause sqlite-utils to notice if there's a missing column and add it. As recommended by @simonw",205429375,swarm-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 743400216,MDU6SXNzdWU3NDM0MDAyMTY=,11,Error thrown: sqlite3.OperationalError: table users has no column named lastName,61791,beaugunderson,closed,0,,,,,2,2020-11-16T01:21:18Z,2021-01-18T04:35:22Z,2021-01-18T04:35:22Z,NONE,,"Just installed `swarm-to-sqlite-0.3.2` and tried according to the docs: ``` Traceback (most recent call last): File ""/usr/local/bin/swarm-to-sqlite"", line 8, in sys.exit(cli()) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/swarm_to_sqlite/cli.py"", line 73, in cli save_checkin(checkin, db) File ""/usr/local/lib/python3.9/site-packages/swarm_to_sqlite/utils.py"", line 82, in save_checkin checkins_table.m2m(""users"", user, m2m_table=""likes"", pk=""id"") File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1914, in m2m id = other_table.insert(record, pk=pk, replace=True).last_pk File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1647, in insert return self.insert_all( File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1765, in insert_all self.insert_chunk( File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1575, in insert_chunk result = self.db.execute(query, params) File ""/usr/local/lib/python3.9/site-packages/sqlite_utils/db.py"", line 200, in execute return self.conn.execute(sql, parameters) sqlite3.OperationalError: table users has no column named lastName ```",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1661617056,I_kwDODD6af85jCkOg,15,ambiguous column name: createdAt - on checkin_details view,9599,simonw,closed,0,,,,,0,2023-04-11T01:07:47Z,2023-04-11T03:16:37Z,2023-04-11T03:16:37Z,MEMBER,,"It looks like Swarm changed their schema and now both `venues` and `checkins` have `createdAt` fields. Which breaks this view: https://github.com/dogsheep/swarm-to-sqlite/blob/719b6e96a016d0ca8b316d3bed9c2a7a0cb499ee/swarm_to_sqlite/utils.py#L171-L188",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487598468,MDU6SXNzdWU0ODc1OTg0Njg=,2,--save option to dump checkins to a JSON file on disk,9599,simonw,closed,0,,,,,1,2019-08-30T17:41:06Z,2019-08-31T02:40:21Z,2019-08-31T02:40:21Z,MEMBER,,"This is a complement to the `--load` option - mainly useful for development purposes. (I'll rename `--file` to `--load` as part of this issue).",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487600595,MDU6SXNzdWU0ODc2MDA1OTU=,3,Option to fetch only checkins more recent than the current max checkin,9599,simonw,closed,0,,,,,4,2019-08-30T17:46:45Z,2019-10-16T20:41:23Z,2019-10-16T20:39:59Z,MEMBER,,"The Foursquare checkins API supports ""return every checkin occurring after this point"" - I can pass it the maximum createdAt date currently stored in the database. This will allow for quick incremental fetches via a cron.",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487601121,MDU6SXNzdWU0ODc2MDExMjE=,4,Online tool for getting a Foursquare OAuth token,9599,simonw,closed,0,,,,,1,2019-08-30T17:48:14Z,2019-08-31T18:07:26Z,2019-08-31T18:07:26Z,MEMBER,,"I will link to this from the documentation. See also this conversation on Twitter: https://twitter.com/simonw/status/1166822603023011840 I've decided to go with ""copy and paste in a token"" rather than hooking up a local web server that can have tokens passed to it.",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487721884,MDU6SXNzdWU0ODc3MjE4ODQ=,5,Treat Foursquare timestamps as UTC,9599,simonw,closed,0,,,,,0,2019-08-31T02:44:47Z,2019-08-31T02:50:41Z,2019-08-31T02:50:41Z,MEMBER,,"Current test failure is due to timezone differences between my laptop and Circle CI: https://circleci.com/gh/dogsheep/swarm-to-sqlite/3 ``` E Full diff: E - [{'created': '2018-07-01T04:48:19', E ? ^ E + [{'created': '2018-07-01T02:48:19', E ? ^ E 'createdAt': 1530413299, ``` The timestamps I store in `created` should always be UTC.",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 543355051,MDExOlB1bGxSZXF1ZXN0MzU3NjQwMTg2,6,don't break if source is missing,78035,mfa,closed,0,,,,,1,2019-12-29T10:46:47Z,2020-03-28T02:28:11Z,2020-03-28T02:28:11Z,CONTRIBUTOR,dogsheep/swarm-to-sqlite/pulls/6,broke for me. very old checkins in 2010 had no source set.,205429375,swarm-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 589491711,MDU6SXNzdWU1ODk0OTE3MTE=,7,Upgrade to sqlite-utils 2.x,9599,simonw,closed,0,,,,,0,2020-03-28T02:24:51Z,2020-03-28T02:25:03Z,2020-03-28T02:25:03Z,MEMBER,,,205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 648245071,MDU6SXNzdWU2NDgyNDUwNzE=,8,Error thrown: table photos has no column named hasSticker,18504,harperreed,closed,0,,,,,2,2020-06-30T14:54:37Z,2020-10-12T20:35:06Z,2020-10-12T20:25:24Z,NONE,,"While running `swarm-to-sqlite` it throws an error: harper@:~/dogsheep/swarm$ swarm-to-sqlite checkins.db --save=checkins.json Please provide your Foursquare OAuth token: Importing 8127 checkins [#################-------------------] 49% 00:01:52 Traceback (most recent call last): File ""/home/harper/.local/bin/swarm-to-sqlite"", line 11, in sys.exit(cli()) File ""/home/harper/.local/lib/python3.6/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/home/harper/.local/lib/python3.6/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/home/harper/.local/lib/python3.6/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/harper/.local/lib/python3.6/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/home/harper/.local/lib/python3.6/site-packages/swarm_to_sqlite/cli.py"", line 73, in cli save_checkin(checkin, db) File ""/home/harper/.local/lib/python3.6/site-packages/swarm_to_sqlite/utils.py"", line 94, in save_checkin photos_table.insert(photo, replace=True) File ""/home/harper/.local/lib/python3.6/site-packages/sqlite_utils/db.py"", line 963, in insert alter = self.value_or_default(""alter"", alter) File ""/home/harper/.local/lib/python3.6/site-packages/sqlite_utils/db.py"", line 1142, in insert_all def upsert_all( sqlite3.OperationalError: table photos has no column named hasSticker Where should i dig in?",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488833136,MDU6SXNzdWU0ODg4MzMxMzY=,1,"Imported followers should go in ""users"", relationships in ""following""",9599,simonw,closed,0,,,,,0,2019-09-03T21:27:37Z,2019-09-04T20:23:04Z,2019-09-04T20:23:04Z,MEMBER,,"Right now `twitter-to-sqlite followers` dumps everything in a `followers` table, and doesn't actually record which account they are following! It should instead save them all in a global `users` table and then set up m2m relationships in a `following` table. This also means it should create a record for the specified user in order to record both sides of each relationship.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 492297930,MDU6SXNzdWU0OTIyOTc5MzA=,10,Rethink progress bars for various commands,9599,simonw,closed,0,,,,,5,2019-09-11T15:06:47Z,2020-04-01T03:45:48Z,2020-04-01T03:45:48Z,MEMBER,,"Progress bars and the `--silent` option are implemented inconsistently across commands at the moment. This is made more challenging by the fact that for many operations the total length is not known. https://click.palletsprojects.com/en/7.x/api/#click.progressbar",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503045221,MDU6SXNzdWU1MDMwNDUyMjE=,11,Commands for recording real-time tweets from the streaming API,9599,simonw,closed,0,,,,,1,2019-10-06T03:09:30Z,2019-10-06T04:54:17Z,2019-10-06T04:48:31Z,MEMBER,,"https://developer.twitter.com/en/docs/tweets/filter-realtime/api-reference/post-statuses-filter We can support tracking keywords and following specific users.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503053800,MDU6SXNzdWU1MDMwNTM4MDA=,12,"Extract ""source"" into a separate lookup table",9599,simonw,closed,0,,,,,3,2019-10-06T05:17:23Z,2019-10-17T15:49:24Z,2019-10-17T15:49:24Z,MEMBER,,"It's pretty bulky and ugly at the moment: ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503085013,MDU6SXNzdWU1MDMwODUwMTM=,13,statuses-lookup command,9599,simonw,closed,0,,,,,1,2019-10-06T11:00:20Z,2019-10-07T00:33:49Z,2019-10-07T00:31:44Z,MEMBER,,"For bulk retrieving tweets by their ID. https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-lookup Rate limit is 900/15 minutes (1 call per second) but each call can pull up to 100 IDs, so we can pull 6,000 per minute. Should support `--SQL` and `--attach` #8 ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503244410,MDU6SXNzdWU1MDMyNDQ0MTA=,14,"When importing favorites, record which user favorited them",9599,simonw,closed,0,,,,,0,2019-10-07T05:45:11Z,2019-10-14T03:30:25Z,2019-10-14T03:30:25Z,MEMBER,,"This code currently just dumps them into the `tweets` table without recording who it was who had favorited them. https://github.com/dogsheep/twitter-to-sqlite/blob/436a170d74ec70903d1b4ca430c2c6b6435cdfcc/twitter_to_sqlite/cli.py#L152-L157",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 505666744,MDExOlB1bGxSZXF1ZXN0MzI3MDUxNjcz,15,"twitter-to-sqlite import command, refs #4",9599,simonw,closed,0,,,,,0,2019-10-11T06:37:14Z,2019-10-11T06:45:01Z,2019-10-11T06:45:01Z,MEMBER,dogsheep/twitter-to-sqlite/pulls/15,,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 505674949,MDU6SXNzdWU1MDU2NzQ5NDk=,17,import command should empty all archive-* tables first,9599,simonw,closed,0,,,,,2,2019-10-11T06:58:43Z,2019-10-11T15:40:08Z,2019-10-11T15:40:08Z,MEMBER,,Can have a CLI option for NOT doing that.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 505928530,MDU6SXNzdWU1MDU5Mjg1MzA=,18,Command to import home-timeline,9599,simonw,closed,0,,,,,4,2019-10-11T15:47:54Z,2019-10-11T16:51:33Z,2019-10-11T16:51:12Z,MEMBER,,"Feature request: https://twitter.com/johankj/status/1182563563136868352 > Would it be possible to save all tweets in my timeline from the last X days? I would love to see how big a percentage some users are of my daily timeline as a metric on whether I should unfollow them/move them to a list.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506087267,MDU6SXNzdWU1MDYwODcyNjc=,19,since_id support for home-timeline,9599,simonw,closed,0,,,,,3,2019-10-11T22:48:24Z,2019-10-16T19:13:06Z,2019-10-16T19:12:46Z,MEMBER,,Currently every time you run `home-timeline` we pull all 800 available tweets. We should offer to support `since_id` (which can be provided or can be pulled directly from the database) in order to work more efficiently if this command is executed e.g. on a cron.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488833698,MDU6SXNzdWU0ODg4MzM2OTg=,2,"""twitter-to-sqlite user-timeline"" command for pulling tweets by a specific user",9599,simonw,closed,0,,,,,3,2019-09-03T21:29:12Z,2019-09-04T20:02:11Z,2019-09-04T20:02:11Z,MEMBER,,"Twitter only allows up to 3,200 tweets to be retrieved from https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-user_timeline.html I'm going to do: $ twitter-to-sqlite tweets simonw ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506268945,MDU6SXNzdWU1MDYyNjg5NDU=,20,--since support for various commands for refresh-by-cron,9599,simonw,closed,0,,,,,3,2019-10-13T03:40:46Z,2019-10-21T03:32:04Z,2019-10-16T19:26:11Z,MEMBER,,"I want to run a cron that updates my Twitter database every X minutes. It should be able to retrieve the following without needing to paginate through everything: - [x] Tweets I have tweeted - [x] My home timeline (see #19) - [x] Tweets I have favourited It would be nice if this could be standardized across all commands as a `--since` option.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506432572,MDU6SXNzdWU1MDY0MzI1NzI=,21,Fix & escapes in tweet text,9599,simonw,closed,0,,,,,1,2019-10-14T03:37:28Z,2019-10-15T18:48:16Z,2019-10-15T18:48:16Z,MEMBER,," Shouldn't be storing `&` here.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508024032,MDU6SXNzdWU1MDgwMjQwMzI=,22,Ability to import from uncompressed archive or from specific files,9599,simonw,closed,0,,,,,0,2019-10-16T18:31:57Z,2019-10-16T18:53:36Z,2019-10-16T18:53:36Z,MEMBER,,"Currently you can only import like this: $ twitter-to-sqlite import path-to-twitter.zip It would be useful if you could import from a folder that was decompressed from that zip: $ twitter-to-sqlite import path-to-twitter/ AND from individual files within that folder - since that would allow you to e.g. selectively import certain files: $ twitter-to-sqlite import path-to-twitter/favorites.js path-to-twitter/tweets.js",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508190730,MDU6SXNzdWU1MDgxOTA3MzA=,23,Extremely simple migration system,9599,simonw,closed,0,,,,,2,2019-10-17T02:13:57Z,2019-10-17T16:57:17Z,2019-10-17T16:57:17Z,MEMBER,,"Needed for #12. This is going to be an incredibly simple version of the Django migration system. * A `migrations` table, keeping track of which migrations were applied (and when) * A `migrate()` function which applies any pending migrations * A `MIGRATIONS` constant which is a list of functions to be applied The function names will be detected and used as the names of the migrations. Every time you run the CLI tool it will call the `migrate()` function before doing anything else. Needs to take into account that there might be no tables at all. As such, migration functions should sanity check that the tables they are going to work on actually exist.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508553387,MDExOlB1bGxSZXF1ZXN0MzI5MzI0MzY4,24,Tweet source extraction and new migration system,9599,simonw,closed,0,,,,,0,2019-10-17T15:24:56Z,2019-10-17T15:49:29Z,2019-10-17T15:49:24Z,MEMBER,dogsheep/twitter-to-sqlite/pulls/24,Closes #12 and #23,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 508578780,MDU6SXNzdWU1MDg1Nzg3ODA=,25,Ensure migrations don't accidentally create foreign key twice,9599,simonw,closed,0,,,,,2,2019-10-17T16:08:50Z,2019-10-17T16:56:47Z,2019-10-17T16:56:47Z,MEMBER,,"Is it possible for these lines to run against a database table that already has these foreign keys? https://github.com/dogsheep/twitter-to-sqlite/blob/c9295233f219c446fa2085cace987067488a31b9/twitter_to_sqlite/migrations.py#L21-L22",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 513074501,MDU6SXNzdWU1MTMwNzQ1MDE=,26,Command for importing mentions timeline,9599,simonw,closed,0,,,,,1,2019-10-28T03:14:27Z,2019-10-30T02:36:13Z,2019-10-30T02:20:47Z,MEMBER,,"https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-mentions_timeline Almost identical to home-timeline #18 but it uses `https://api.twitter.com/1.1/statuses/mentions_timeline.json` instead.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/26/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 514459062,MDU6SXNzdWU1MTQ0NTkwNjI=,27,retweets-of-me command,9599,simonw,closed,0,,,,,4,2019-10-30T07:43:01Z,2019-11-03T01:12:58Z,2019-11-03T01:12:58Z,MEMBER,,https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-retweets_of_me,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 515658861,MDU6SXNzdWU1MTU2NTg4NjE=,28,Add indexes to followers table,9599,simonw,closed,0,,,,,1,2019-10-31T18:40:22Z,2019-11-09T20:15:42Z,2019-11-09T20:11:48Z,MEMBER,,`select follower_id from following where followed_id = 12497` takes over a second for me at the moment.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 518725064,MDU6SXNzdWU1MTg3MjUwNjQ=,29,`import` command fails on empty files,21148,jacobian,closed,0,,,,,4,2019-11-06T20:34:26Z,2019-11-09T20:33:38Z,2019-11-09T19:36:36Z,CONTRIBUTOR,,"If a file in the export is empty (in my case it was `account-suspensions.js`), `twitter-to-sqlite import` fails: ``` $ twitter-to-sqlite import twitter.db ~/Downloads/twitter-2019-11-06-926f4f3be4b3b1fcb1aa387c40cd14f7c8aaf9bbcdb2d78ac14d9989add501bb.zip Traceback (most recent call last): File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/bin/twitter-to-sqlite"", line 10, in sys.exit(cli()) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py"", line 627, in import_ archive.import_from_file(db, filename, content) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/archive.py"", line 224, in import_from_file db[table_name].upsert_all(rows, hash_id=""pk"") File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1113, in upsert_all extracts=extracts, File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/sqlite_utils/db.py"", line 980, in insert_all first_record = next(records) StopIteration ``` This appears to be because `db.upsert_all` is called with no rows -- I think? I hacked around this by modifying `import_from_file` to have an `if rows:` clause: ``` for table, rows in to_insert.items(): if rows: table_name = ""archive_{}"".format(table.replace(""-"", ""_"")) ... ``` I'm happy to work up a real PR if that's the right approach, but I'm not sure it is.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488833975,MDU6SXNzdWU0ODg4MzM5NzU=,3,Command for running a search and saving tweets for that search,9599,simonw,closed,0,,,,,6,2019-09-03T21:29:56Z,2019-11-04T05:31:56Z,2019-11-04T05:31:16Z,MEMBER,, $ twitter-to-sqlite search dogsheep,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 518739697,MDU6SXNzdWU1MTg3Mzk2OTc=,30,`followers` fails because `transform_user` is called twice,21148,jacobian,closed,0,,,,,2,2019-11-06T20:44:52Z,2019-11-09T20:15:28Z,2019-11-09T19:55:52Z,CONTRIBUTOR,,"Trying to run `twitter-to-sqlite followers` errors out: ``` Traceback (most recent call last): File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/bin/twitter-to-sqlite"", line 10, in sys.exit(cli()) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py"", line 130, in followers go(bar.update) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py"", line 116, in go utils.save_users(db, [profile]) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/utils.py"", line 302, in save_users transform_user(user) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/utils.py"", line 181, in transform_user user[""created_at""] = parser.parse(user[""created_at""]) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 1374, in parse return DEFAULTPARSER.parse(timestr, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 646, in parse res, skipped_tokens = self._parse(timestr, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 725, in _parse l = _timelex.split(timestr) # Splits the timestr into tokens File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 207, in split return list(cls(s)) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 76, in __init__ '{itype}'.format(itype=instream.__class__.__name__)) TypeError: Parser must be a string or character stream, not datetime ``` This appears to be because https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L111 calls `transform_user`, and then https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L116 calls `transform_user` again, which fails because the user is already transformed. I was able to work around this by commenting out https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L116. Shall I work up a patch for that, or is there a better approach?",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520508502,MDU6SXNzdWU1MjA1MDg1MDI=,31,"""friends"" command (similar to ""followers"")",9599,simonw,closed,0,,,,,2,2019-11-09T20:20:20Z,2022-09-20T05:05:03Z,2020-02-07T07:03:28Z,MEMBER,,"Current list of commands: ``` followers Save followers for specified user (defaults to... followers-ids Populate followers table with IDs of account followers friends-ids Populate followers table with IDs of account friends ``` Obvious omission here is `friends`, which would be powered by `https://api.twitter.com/1.1/friends/list.json`: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-list",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 561454071,MDU6SXNzdWU1NjE0NTQwNzE=,32,"Documentation for "" favorites"" command",9599,simonw,closed,0,,,,,0,2020-02-07T06:50:11Z,2020-02-07T06:59:10Z,2020-02-07T06:59:10Z,MEMBER,,"It looks like I forgot to document this one in the README. https://github.com/dogsheep/twitter-to-sqlite/blob/6ebd482619bd94180e54bb7b56549c413077d329/twitter_to_sqlite/cli.py#L183-L194",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 561469252,MDExOlB1bGxSZXF1ZXN0MzcyMjczNjA4,33,Upgrade to sqlite-utils 2.2.1,9599,simonw,closed,0,,,,,1,2020-02-07T07:32:12Z,2020-03-20T19:21:42Z,2020-03-20T19:21:41Z,MEMBER,dogsheep/twitter-to-sqlite/pulls/33,,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 585266763,MDU6SXNzdWU1ODUyNjY3NjM=,34,IndexError running user-timeline command,9599,simonw,closed,0,,,,,2,2020-03-20T18:54:08Z,2020-03-20T19:20:52Z,2020-03-20T19:20:37Z,MEMBER,,"``` $ twitter-to-sqlite user-timeline data.db --screen_name Allen_Joines Traceback (most recent call last): File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/bin/twitter-to-sqlite"", line 11, in load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')() File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 256, in user_timeline utils.save_tweets(db, chunk) File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 289, in save_tweets db[""users""].upsert(user, pk=""id"", alter=True) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1128, in upsert conversions=conversions, File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1157, in upsert_all upsert=True, File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1096, in insert_all row = list(self.rows_where(""rowid = ?"", [self.last_rowid]))[0] IndexError: list index out of range ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585282212,MDU6SXNzdWU1ODUyODIyMTI=,35,twitter-to-sqlite user-timeline [screen_names] --sql / --attach,9599,simonw,closed,0,,,,,5,2020-03-20T19:26:07Z,2020-03-20T20:17:00Z,2020-03-20T20:16:35Z,MEMBER,,Split from #8.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/35/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585306847,MDU6SXNzdWU1ODUzMDY4NDc=,36,twitter-to-sqlite followers/friends --sql / --attach,9599,simonw,closed,0,,,,,0,2020-03-20T20:20:33Z,2020-03-20T23:12:38Z,2020-03-20T23:12:38Z,MEMBER,,"Split from #8. The `friends` and `followers` commands don't yet support `--sql` and `--attach`. (`friends-ids` and `followers-ids` do though).",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585353598,MDU6SXNzdWU1ODUzNTM1OTg=,37,"Handle ""User not found"" error",9599,simonw,closed,0,,,,,3,2020-03-20T22:14:32Z,2020-04-17T23:43:46Z,2020-04-17T23:43:46Z,MEMBER,,"While running `user-timeline` I got this bug (because a screen name I asked for didn't exist): ``` File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 185, in transform_user user[""created_at""] = parser.parse(user[""created_at""]) KeyError: 'created_at' >>> import pdb >>> pdb.pm() > /Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py(185)transform_user() -> user[""created_at""] = parser.parse(user[""created_at""]) (Pdb) user {'errors': [{'code': 50, 'message': 'User not found.'}]} ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585359363,MDU6SXNzdWU1ODUzNTkzNjM=,38,Screen name display for user-timeline is uneven,9599,simonw,closed,0,,,,,1,2020-03-20T22:30:23Z,2020-03-20T22:37:17Z,2020-03-20T22:37:17Z,MEMBER,,"``` CDPHE [####################################] 67 CHFSKy [####################################] 3216 DHSWI [####################################] 41 DPHHSMT [####################################] 742 Delaware_DHSS [####################################] 3231 DhhsNevada [####################################] 639 ``` I could format them to match the length of the longest screen name instead.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 590666760,MDU6SXNzdWU1OTA2NjY3NjA=,39,--since feature can be confused by retweets,9599,simonw,closed,0,,,,,11,2020-03-30T23:25:33Z,2020-04-01T03:45:16Z,2020-04-01T03:45:16Z,MEMBER,,"If you run `twitter-to-sqlite user-timeline ... --since` it's supposed to fetch Tweets those specific users tweeted since last time the command was run. It does this by seeking out the max ID of their previous tweets: https://github.com/dogsheep/twitter-to-sqlite/blob/810cb2af5a175837204389fd7f4b5721f8b325ab/twitter_to_sqlite/cli.py#L305-L311 BUT... this has a nasty flaw: if another account had retweeted one of their recent tweets the retweeted-tweet will have been loaded into the database - so we may treat that as the most recent since ID and miss a bunch of their tweets!",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488835586,MDU6SXNzdWU0ODg4MzU1ODY=,4,Command for importing data from a Twitter Export file,9599,simonw,closed,0,,,,,2,2019-09-03T21:34:13Z,2019-10-11T06:45:02Z,2019-10-11T06:45:02Z,MEMBER,,"Twitter lets you export all of your data as an archive file: https://twitter.com/settings/your_twitter_data A command for importing this data into SQLite would be extremely useful. $ twitter-to-sqlite import twitter.db path-to-archive.zip ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 590669793,MDU6SXNzdWU1OTA2Njk3OTM=,40,Feature: record history of follower counts,9599,simonw,closed,0,,,,,5,2020-03-30T23:32:28Z,2020-04-01T04:13:05Z,2020-04-01T04:13:05Z,MEMBER,,"We currently over-write the follower count every time we import a tweet (when we import that user profile again): https://github.com/dogsheep/twitter-to-sqlite/blob/810cb2af5a175837204389fd7f4b5721f8b325ab/twitter_to_sqlite/utils.py#L293-L294 It would be neat if we noticed if that user's follower count (and maybe other counts?) had changed since we last saved them and recorded that change in a separate history table. This would be an inexpensive way of building up rough charts of follower count over time.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 591613579,MDU6SXNzdWU1OTE2MTM1Nzk=,41,"Bug: recorded a since_id for None, None",9599,simonw,closed,0,,,,,0,2020-04-01T04:29:43Z,2020-04-01T04:31:11Z,2020-04-01T04:31:11Z,MEMBER,,"This shouldn't happen in the `since_ids` table (relates to #39): ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/41/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602173589,MDU6SXNzdWU2MDIxNzM1ODk=,42,Error running user-timeline with --sql and --ids together,9599,simonw,closed,0,,,,,0,2020-04-17T19:02:06Z,2020-04-17T23:34:40Z,2020-04-17T23:34:40Z,MEMBER,,"``` $ twitter-to-sqlite user-timeline tweets.db --sql='select id from users' --ids Traceback (most recent call last): File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/bin/twitter-to-sqlite"", line 11, in load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')() File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 284, in user_timeline ""@{:"" + str(max(len(identifier) for identifier in identifiers)) + ""}"" File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 284, in ""@{:"" + str(max(len(identifier) for identifier in identifiers)) + ""}"" TypeError: object of type 'int' has no len() ``` But this DID work - casting to strings: ``` $ twitter-to-sqlite user-timeline tweets.db --sql='select """" || id from users' --ids ... this worked ... ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/42/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602176870,MDU6SXNzdWU2MDIxNzY4NzA=,43,"""twitter-to-sqlite lists"" command for retrieving a user's owned lists",9599,simonw,closed,0,,,,,1,2020-04-17T19:08:59Z,2020-04-17T23:48:28Z,2020-04-17T23:30:39Z,MEMBER,,"https://developer.twitter.com/en/docs/accounts-and-users/create-manage-lists/api-reference/get-lists-ownerships `https://api.twitter.com/1.1/lists/ownerships.json `",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/43/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602181581,MDU6SXNzdWU2MDIxODE1ODE=,44,"tweet[""source""] can be an empty string",9599,simonw,closed,0,,,,,0,2020-04-17T19:18:26Z,2020-04-17T22:01:44Z,2020-04-17T22:01:44Z,MEMBER,,"Got this excepion: ``` File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 641, in extract_and_save_source details = m.groupdict() AttributeError: 'NoneType' object has no attribute 'groupdict' ``` I traced it back to this tweet: https://twitter.com/osder/status/578712651393576960 ``` (Pdb) source_re re.compile('.*?)"".*?>(?P.*?)') (Pdb) locals()['source'] '' (Pdb) u > /Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py(393)save_tweets() -> tweet[""source""] = extract_and_save_source(db, tweet[""source""]) (Pdb) tweet {'created_at': '2015-03-20T00:20:22+00:00', 'id': 578712651393576960, 'full_text': '@osder', 'truncated': False, 'display_text_range': [0, 6], 'source': '', 'in_reply_to_status_id': 578712521382715392, 'in_reply_to_user_id': 1545741, 'in_reply_to_screen_name': 'osder', 'geo': None, 'coordinates': None, 'place': None, 'contributors': None, 'is_quote_status': False, 'retweet_count': 0, 'favorite_count': 0, 'favorited': False, 'retweeted': False, 'lang': 'und', 'user': 1545741} ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/44/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610284471,MDU6SXNzdWU2MTAyODQ0NzE=,46,Error running 'search' for the first time,9599,simonw,closed,0,,,,,0,2020-04-30T18:11:20Z,2020-04-30T18:11:58Z,2020-04-30T18:11:58Z,MEMBER,,"``` % twitter-to-sqlite search infodemic.db '#infodemic' Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/bin/twitter-to-sqlite"", line 11, in load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')() File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 867, in search for tweet in tweets: File ""/Users/simon/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 165, in fetch_timeline [since_type_id, since_key], sqlite3.OperationalError: no such table: since_ids ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/46/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 663976976,MDU6SXNzdWU2NjM5NzY5NzY=,48,Add a table of contents to the README,9599,simonw,closed,0,,,,,3,2020-07-22T18:54:33Z,2020-07-23T17:46:07Z,2020-07-22T19:03:02Z,MEMBER,,Using https://github.com/jonschlinkert/markdown-toc,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/48/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 681575714,MDExOlB1bGxSZXF1ZXN0NDY5OTQ0OTk5,49,"Document the use of --stop_after with favorites, refs #20",370930,mikepqr,closed,0,,,,,1,2020-08-19T06:10:52Z,2021-08-20T00:02:11Z,2021-08-20T00:02:11Z,CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/49,(I discovered this trawling the issues for how to use --since with favorites),206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/49/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 745393298,MDU6SXNzdWU3NDUzOTMyOTg=,52,Discussion: Adding support for fetching only fresh tweets,4169772,fatihky,closed,0,,,,,1,2020-11-18T07:01:48Z,2020-11-18T07:12:45Z,2020-11-18T07:12:45Z,NONE,,I think it'd be very useful if this tool has an option like `--incremental` to fetch only newer tweets. This way operations could complete very fast in sequential runs. I'd want to try to implement this feature if it seems OK for this tool's purpose. ,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/52/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 771324837,MDU6SXNzdWU3NzEzMjQ4Mzc=,53,--since support for favorites,27,anotherjesse,closed,0,,,,,1,2020-12-19T07:08:23Z,2020-12-19T07:47:11Z,2020-12-19T07:47:11Z,NONE,,"Having support for `--since` for updating your favorites would be ideal as the api is both slow and it only returns ~3k most recent favorites. https://twittercommunity.com/t/cant-get-all-favorite-tweets-by-rest-api/22007/3 The api seems to take an optional `since_id` parameter - https://developer.twitter.com/en/docs/twitter-api/v1/tweets/post-and-engage/api-reference/get-favorites-list",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/53/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 779211940,MDExOlB1bGxSZXF1ZXN0NTQ5MjA0MDYz,55,Fix archive imports,21148,jacobian,closed,0,,,,,2,2021-01-05T15:54:48Z,2021-08-20T00:02:49Z,2021-08-20T00:02:49Z,CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/55,This fixes the issues discussed in #54,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/55/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 796736607,MDU6SXNzdWU3OTY3MzY2MDc=,56,Not all quoted statuses get fetched?,42315895,gsajko,closed,0,,,,,3,2021-01-29T09:48:44Z,2021-02-03T10:36:36Z,2021-02-03T10:36:36Z,NONE,," ![image](https://user-images.githubusercontent.com/42315895/106259325-5f75dc80-621f-11eb-8311-db8f2fe2a257.png) In my database I have 13300 quote tweets, but eta 3600 have `quoted_status` empty. I fetched some of them using `https://api.twitter.com/1.1/statuses/show.json?id=xx` and they did have ids of quoted tweets.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 907645813,MDU6SXNzdWU5MDc2NDU4MTM=,57,"Error: Use either --since or --since_id, not both",42904,rubenv,closed,0,,,,,6,2021-05-31T18:11:04Z,2021-08-20T00:01:31Z,2021-08-20T00:01:31Z,CONTRIBUTOR,,"I'm using the following command: ``` twitter-to-sqlite user-timeline -a twitter-auth.json twitter/tweets.db --since ``` Which gives the following error: ``` Error: Use either --since or --since_id, not both ``` Running without `--since`. ``` Traceback (most recent call last): File ""/usr/local/bin/twitter-to-sqlite"", line 8, in sys.exit(cli()) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 317, in user_timeline for tweet in bar: File ""/usr/local/lib/python3.9/site-packages/click/_termui_impl.py"", line 328, in generator for rv in self.iter: File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 234, in fetch_user_timeline yield from fetch_timeline( File ""/usr/local/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 202, in fetch_timeline raise Exception(str(tweets[""errors""])) Exception: [{'code': 44, 'message': 'since_id parameter is invalid.'}] ``` ``` Python 3.9.5 twitter-to-sqlite, version 0.21.3 ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/57/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 984939366,MDU6SXNzdWU5ODQ5MzkzNjY=,58,"Error: Use either --since or --since_id, not both - still broken",42904,rubenv,closed,0,,,,,1,2021-09-01T09:45:28Z,2021-09-21T17:37:41Z,2021-09-21T17:37:41Z,CONTRIBUTOR,,"Hi Simon, It appears the fix for #57 doesn't fix things for me: ``` $ twitter-to-sqlite --version twitter-to-sqlite, version 0.21.4 $ python --version Python 3.9.6 ``` ``` $ twitter-to-sqlite home-timeline -a twitter-auth.json twitter/timeline.db --since Importing tweets Error: Use either --since or --since_id, not both ``` Is there any way I can help debug this?",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/58/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 984942782,MDExOlB1bGxSZXF1ZXN0NzI0MzE3NjUw,59,"Fix for since_id bug, closes #58",42904,rubenv,closed,0,,,,,1,2021-09-01T09:49:09Z,2021-09-21T17:37:40Z,2021-09-21T17:37:40Z,CONTRIBUTOR,dogsheep/twitter-to-sqlite/pulls/59,Fixes remaining instances of this bug,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/59/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 489419782,MDU6SXNzdWU0ODk0MTk3ODI=,6,Extract extended_entities into a media table,9599,simonw,closed,0,,,,,0,2019-09-04T21:59:10Z,2019-09-04T22:08:01Z,2019-09-04T22:08:01Z,MEMBER,," ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1088816961,I_kwDODEm0Qs5A5gdB,62,KeyError: 'created_at' for private accounts?,6764957,swyxio,closed,0,,,,,2,2021-12-26T17:51:51Z,2022-03-12T02:36:32Z,2022-02-24T18:10:18Z,NONE,,"hey Simon! i was running `twitter-to-sqlite user-timeline twitter.db` for [my private alt](https://twitter.com/swyxio) and ran into this error:
    ![image](https://user-images.githubusercontent.com/6764957/147416165-46b69c30-100a-406f-8534-8612b75547ae.png) ```bash Traceback (most recent call last): File ""/Users/swyx/Work/datasette/env/bin/twitter-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1128, in __call__ return self.main(*args, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1053, in main rv = self.invoke(ctx) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1659, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 1395, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py"", line 754, in invoke return __callback(*args, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/cli.py"", line 291, in user_timeline profile = utils.get_profile(db, session, **kwargs) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 133, in get_profile save_users(db, [profile]) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 453, in save_users transform_user(user) File ""/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py"", line 285, in transform_user user[""created_at""] = parser.parse(user[""created_at""]) KeyError: 'created_at' ```
    this looks awfully like #37 but it can't be, because i'm authed into my account and obviously i have perms to read my own account. wonder if there's any diagnostic methods i should apply here? just filing an issue for others to find while i investigate.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 490798130,MDU6SXNzdWU0OTA3OTgxMzA=,7,users-lookup command for fetching users,9599,simonw,closed,0,,,,,0,2019-09-08T19:47:59Z,2019-09-08T20:32:13Z,2019-09-08T20:32:13Z,MEMBER,,"https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-users-lookup ``` https://api.twitter.com/1.1/users/lookup.json?user_id=783214,6253282 https://api.twitter.com/1.1/users/lookup.json?screen_name=simonw,cleopaws ``` CLI design: ``` $ twitter-to-sqlite users-lookup simonw cleopaws $ twitter-to-sqlite users-lookup 783214 6253282 --ids ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 490803176,MDU6SXNzdWU0OTA4MDMxNzY=,8,--sql and --attach options for feeding commands from SQL queries,9599,simonw,closed,0,,,,,4,2019-09-08T20:35:49Z,2020-03-20T23:13:01Z,2020-03-20T23:13:01Z,MEMBER,,"Say you want to fetch Twitter profiles for a list of accounts that are stored in another database: $ twitter-to-sqlite users-lookup users.db --attach attending.db \ --sql ""select Twitter from attending.attendes where Twitter is not null"" The SQL query you feed in is expected to return a list of screen names suitable for processing further by the command. Should be supported by all three of: - [x] `twitter-to-sqlite users-lookup` - [x] `twitter-to-sqlite user-timeline` - [x] `twitter-to-sqlite followers` and `friends` The `--attach` option allows other SQLite databases to be attached to the connection. Without it the SQL query will have to read from the single attached database.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 491791152,MDU6SXNzdWU0OTE3OTExNTI=,9,followers-ids and friends-ids subcommands,9599,simonw,closed,0,,,,,1,2019-09-10T16:58:15Z,2019-09-10T17:36:55Z,2019-09-10T17:36:55Z,MEMBER,,"These will import follower and friendship IDs into the following tables, using these APIs: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-followers-ids https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-ids",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267513424,MDU6SXNzdWUyNjc1MTM0MjQ=,1,Addressable pages for every row in a table,9599,simonw,closed,0,,,2857392,Ship first public release,6,2017-10-23T00:44:16Z,2017-10-24T14:11:04Z,2017-10-24T14:11:03Z,OWNER,," /database-name-7sha256/table-name/compound-pk /database-name-7sha256/table-name/compound-pk.json Tricky part will be figuring out what the private key is - especially since it could be a compound primary key and it might involve different data types.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267517381,MDU6SXNzdWUyNjc1MTczODE=,10,Set up Travis,9599,simonw,closed,0,,,2859414,v1 stretch goals,1,2017-10-23T01:29:07Z,2017-11-04T23:48:57Z,2017-11-04T23:48:57Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274160723,MDU6SXNzdWUyNzQxNjA3MjM=,100,TemplateAssertionError: no filter named 'tojson',13304454,coisnepe,closed,0,,,,,2,2017-11-15T13:43:41Z,2017-11-16T09:25:10Z,2017-11-16T00:14:13Z,NONE,,"A 500 error is raised upon clicking on the name of a table on the homepage, say _http://0.0.0.0:8001/_ to _http://0.0.0.0:8001/test_check-c1f4771/users_ The API part seems to function as intended, though... ``` 2017-11-15 14:33:57 - (sanic)[ERROR]: Traceback (most recent call last): File ""/usr/local/lib/python3.5/dist-packages/sanic/app.py"", line 503, in handle_request response = await response File ""/usr/local/lib/python3.5/dist-packages/datasette/app.py"", line 155, in get return await self.view_get(request, name, hash, **kwargs) File ""/usr/local/lib/python3.5/dist-packages/datasette/app.py"", line 219, in view_get **context, File ""/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py"", line 84, in render return html(self.render_string(template, request, **context)) File ""/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py"", line 81, in render_string return self.env.get_template(template).render(**context) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 812, in get_template return self._load_template(name, self.make_globals(globals)) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 786, in _load_template template = self.loader.load(self, name, globals) File ""/usr/lib/python3/dist-packages/jinja2/loaders.py"", line 125, in load code = environment.compile(source, name, filename) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 565, in compile self.handle_exception(exc_info, source_hint=source_hint) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 754, in handle_exception reraise(exc_type, exc_value, tb) File ""/usr/lib/python3/dist-packages/jinja2/_compat.py"", line 37, in reraise raise value.with_traceback(tb) File ""/usr/local/lib/python3.5/dist-packages/datasette/templates/table.html"", line 29, in template
    params = {{ query.params|tojson(4) }}
    File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 515, in _generate return generate(source, self, name, filename, defer_init=defer_init) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 62, in generate generator.visit(node) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 849, in visit_Template self.blockvisit(block.body, block_frame) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1172, in visit_If self.blockvisit(node.body, if_frame) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1353, in visit_Output self.visit(argument, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1565, in visit_Filter self.fail('no filter named %r' % node.name, node.lineno) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 427, in fail raise TemplateAssertionError(msg, lineno, self.name, self.filename) jinja2.exceptions.TemplateAssertionError: no filter named 'tojson' 2017-11-15 14:33:57 - (network)[INFO][127.0.0.1:41316]: GET http://0.0.0.0:8001/test_check-c1f4771/users 500 144 2017-11-15 14:33:57 - (network)[INFO][127.0.0.1:41316]: GET http://0.0.0.0:8001/favicon.ico 200 0 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/100/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 717746043,MDExOlB1bGxSZXF1ZXN0NTAwMjU2NDg1,1000,datasette.client internal requests mechanism,9599,simonw,closed,0,9599,simonw,5971510,Datasette 0.50,18,2020-10-08T23:58:25Z,2020-10-09T16:11:26Z,2020-10-09T16:11:25Z,OWNER,simonw/datasette/pulls/1000,Refs #943,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1000/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 717768441,MDU6SXNzdWU3MTc3Njg0NDE=,1001,OPTIONS requests return a 500 error,9599,simonw,closed,0,,,5971510,Datasette 0.50,8,2020-10-09T00:57:13Z,2020-10-09T01:44:41Z,2020-10-09T01:43:58Z,OWNER,,"``` % curl -vv -XOPTIONS https://latest.datasette.io/ * Trying 216.58.195.83:443... > OPTIONS / HTTP/1.1 > Host: latest.datasette.io > User-Agent: curl/7.70.0 > Accept: */* > * Mark bundle as not supporting multiuse < HTTP/1.1 500 Internal Server Error ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1001/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 717783692,MDU6SXNzdWU3MTc3ODM2OTI=,1002,Release notes for Datasette 0.50,9599,simonw,closed,0,,,5971510,Datasette 0.50,1,2020-10-09T01:45:00Z,2020-10-09T17:52:54Z,2020-10-09T17:52:53Z,OWNER,,https://github.com/simonw/datasette/compare/0.49.1...c12b7a5def7028845a54a9fdac4052a87a0a8bb8,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1002/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718255803,MDU6SXNzdWU3MTgyNTU4MDM=,1004,Replace MockRequest with Request.fake(),9599,simonw,closed,0,,,5971510,Datasette 0.50,0,2020-10-09T15:55:28Z,2020-10-09T16:26:24Z,2020-10-09T16:26:24Z,OWNER,,"This code: https://github.com/simonw/datasette/blob/7249ac5ca04b5ddc6517750326ee7e522cc49145/tests/utils.py#L1-L8 Predates the introduction of this class method: https://github.com/simonw/datasette/blob/7249ac5ca04b5ddc6517750326ee7e522cc49145/datasette/utils/asgi.py#L108-L121",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1004/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718259202,MDU6SXNzdWU3MTgyNTkyMDI=,1005,Remove xfail tests when new httpx is released,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2020-10-09T16:00:19Z,2021-02-28T22:41:08Z,2021-02-28T22:41:08Z,OWNER,,"> My `httpx` pull request adding `raw_path` support was just merged: https://github.com/encode/httpx/pull/1357 - but it's not in a release yet. > > I'm going to mark these tests as `xfail` so I can land this change - I'll remove that once an `httpx` release comes out that I can use to get the tests passing. > _Originally posted by @simonw in https://github.com/simonw/datasette/pull/1000#issuecomment-706263157_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1005/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718264811,MDU6SXNzdWU3MTgyNjQ4MTE=,1006,Documentation for datasette.client,9599,simonw,closed,0,,,5971510,Datasette 0.50,2,2020-10-09T16:09:02Z,2020-10-09T17:22:31Z,2020-10-09T17:20:37Z,OWNER,,"> I'm going to document this in a separate issue. _Originally posted by @simonw in https://github.com/simonw/datasette/pull/1000#issuecomment-706269271_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1006/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274161964,MDU6SXNzdWUyNzQxNjE5NjQ=,101,TemplateAssertionError: no filter named 'tojson',450244,eaubin,closed,0,,,,,1,2017-11-15T13:47:32Z,2017-11-15T13:48:55Z,2017-11-15T13:48:55Z,NONE,,"I get an exception clicking on the table link: ``` 2017-11-15 08:40:10 - (sanic)[ERROR]: Traceback (most recent call last): File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic/app.py"", line 503, in handle_request response = await response File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/app.py"", line 155, in get return await self.view_get(request, name, hash, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/app.py"", line 219, in view_get **context, File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic_jinja2/__init__.py"", line 84, in render return html(self.render_string(template, request, **context)) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic_jinja2/__init__.py"", line 81, in render_string return self.env.get_template(template).render(**context) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 812, in get_template return self._load_template(name, self.make_globals(globals)) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 786, in _load_template template = self.loader.load(self, name, globals) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/loaders.py"", line 125, in load code = environment.compile(source, name, filename) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 565, in compile self.handle_exception(exc_info, source_hint=source_hint) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 754, in handle_exception reraise(exc_type, exc_value, tb) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/_compat.py"", line 37, in reraise raise value.with_traceback(tb) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/templates/table.html"", line 29, in template
    params = {{ query.params|tojson(4) }}
    File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 515, in _generate return generate(source, self, name, filename, defer_init=defer_init) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 62, in generate generator.visit(node) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 849, in visit_Template self.blockvisit(block.body, block_frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 1172, in visit_If self.blockvisit(node.body, if_frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 1353, in visit_Output self.visit(argument, frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 1565, in visit_Filter self.fail('no filter named %r' % node.name, node.lineno) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 427, in fail raise TemplateAssertionError(msg, lineno, self.name, self.filename) jinja2.exceptions.TemplateAssertionError: no filter named 'tojson' ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/101/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718484082,MDU6SXNzdWU3MTg0ODQwODI=,1010,json / CSV links are broken in Datasette 0.50,9599,simonw,closed,0,,,,,3,2020-10-10T00:07:42Z,2020-10-10T02:32:03Z,2020-10-10T02:32:03Z,OWNER,,"e.g. on https://latest.datasette.io/fixtures/sortable That export link block is broken. The HTML is: ```html

    This data as json, CSV (advanced)

    ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1010/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718521469,MDU6SXNzdWU3MTg1MjE0Njk=,1011,column name links broken in 0.50.1,649467,mhalle,closed,0,,,,,4,2020-10-10T03:37:51Z,2020-10-10T04:09:32Z,2020-10-10T03:52:07Z,NONE,,"I just upgraded from 0.49 to 0.50.1 and found that the links on column headers are broken. If I inspect the source, they have a leading ""//"" (without host or port) rather than including base_url like other links on the page do. The links in the ""gears"" menu for each column do work. I don't have custom templates for my project. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1011/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718723543,MDU6SXNzdWU3MTg3MjM1NDM=,1014,Add Link: pagination HTTP headers,9599,simonw,closed,0,,,6026070,0.51,6,2020-10-10T23:42:40Z,2020-10-23T19:44:05Z,2020-10-11T00:18:51Z,OWNER,,Spun off from #782. These can go on all of the JSON endpoints that support pagination.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1014/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718953669,MDU6SXNzdWU3MTg5NTM2Njk=,1016,"Add a ""delete"" icon next to filters (in addition to ""remove filter"")",9599,simonw,closed,0,,,6026070,0.51,3,2020-10-11T23:49:53Z,2020-10-23T19:44:06Z,2020-10-12T03:01:58Z,OWNER,,"The ""remove filter"" option in the select box is not very discoverable. It would be good to have an additional remove icon, pointed to by the pink arrow, which removes a specific selected filter.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1016/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 719381863,MDExOlB1bGxSZXF1ZXN0NTAxNTc5MDg4,1017,"Update janus requirement from <0.6,>=0.4 to >=0.4,<0.7",27856297,dependabot-preview[bot],closed,0,,,,,1,2020-10-12T13:29:46Z,2020-10-14T21:52:08Z,2020-10-14T21:52:07Z,CONTRIBUTOR,simonw/datasette/pulls/1017,"Updates the requirements on [janus](https://github.com/aio-libs/janus) to permit the latest version.
    Changelog

    Sourced from janus's changelog.

    Changes

    0.5.0 (2020-04-23)

    • Remove explicit loop arguments and forbid creating queues outside event loops #246

    0.4.0 (2018-07-28)

    • Add py.typed macro #89
    • Drop python 3.4 support and fix minimal version python3.5.3 #88
    • Add property with that indicates if queue is closed #86

    0.3.2 (2018-07-06)

    • Fixed python 3.7 support #97

    0.3.1 (2018-01-30)

    • Fixed bug with join() in case tasks are added by sync_q.put() #75

    0.3.0 (2017-02-21)

    • Expose unfinished_tasks property #34

    0.2.4 (2016-12-05)

    • Restore tarball deploying

    0.2.3 (2016-07-12)

    • Fix exception type

    0.2.2 (2016-07-11)

    • Update asyncio.async() to use asyncio.ensure_future() #6

    0.2.1 (2016-03-24)

    • Fix python setup.py test command #4
    Commits
    • d186724 Fix yaml
    • dbb2d7b Fix deploy script
    • 18df625 Bump to 0.6.0
    • a50b7ec Test on ubuntu only, the library has no platform specific dependencies
    • b599d94 Fix workflow
    • 9897fca Setup github workflows
    • cde6918 Drop Python 3.5, test on Python 3.9, format with black/isort
    • 5f04d79 Support Python 3.9 officially
    • ac23eb7 janus: remove unused type ignores (#287)
    • 0da8f95 Make all tests non-skipped again
    • Additional commits viewable in compare view

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
    Dependabot commands and options
    You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) - `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language - `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language - `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language - `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language - `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com): - Update frequency (including time of day and day of week) - Pull request limits (per update run and/or open at any time) - Out-of-range updates (receive only lockfile updates, if desired) - Security updates (receive only security updates, if desired)
    ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1017/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 719382156,MDExOlB1bGxSZXF1ZXN0NTAxNTc5MzE1,1018,"Update asgiref requirement from ~=3.2.10 to >=3.2.10,<3.4.0",27856297,dependabot-preview[bot],closed,0,,,,,1,2020-10-12T13:30:09Z,2020-10-14T21:51:36Z,2020-10-14T21:51:35Z,CONTRIBUTOR,simonw/datasette/pulls/1018,"Updates the requirements on [asgiref](https://github.com/django/asgiref) to permit the latest version.
    Changelog

    Sourced from asgiref's changelog.

    3.3.0 (2020-10-09)

    • sync_to_async now defaults to thread-sensitive mode being on
    • async_to_sync now works inside of forked processes
    • WsgiToAsgi now correctly clamps its response body when Content-Length is set

    3.2.10 (2020-08-18)

    • Fixed bugs due to bad WeakRef handling introduced in 3.2.8

    3.2.9 (2020-06-16)

    • Fixed regression with exception handling in 3.2.8 related to the contextvars fix.

    3.2.8 (2020-06-15)

    • Fixed small memory leak in local.Local
    • contextvars are now persisted through AsyncToSync

    3.2.7 (2020-03-24)

    • Bug fixed in local.Local where deleted Locals would occasionally inherit their storage into new Locals due to memory reuse.

    3.2.6 (2020-03-23)

    • local.Local now works in all threading situations, no longer requires periodic garbage collection, and works with libraries that monkeypatch threading (like gevent)

    3.2.5 (2020-03-11)

    • self is now preserved on methods by async_to_sync

    3.2.4 (2020-03-10)

    Commits
    • 7dba5ff Releasing 3.3.0
    • e1e0dd9 Added ZeroCopy extension
    • 3834d13 Added rpc.py to Implementations (#198)
    • 03b0dbb Clamped WsgiToAsgi response body using Content-Length value
    • cfd82e4 Fix linting with unused import removal
    • cc1877e Fix import sorting in previous commit.
    • 7becc9d Making thread_sensitive=True the default
    • 66a6e68 Fixed #194: Made async_to_sync work inside a fork
    • 4ab9d8e Fixed #193: Bumped docs version to 3.0
    • 1c9d063 Clarified "Optional" meaning (#190)
    • Additional commits viewable in compare view

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
    Dependabot commands and options
    You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) - `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language - `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language - `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language - `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language - `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com): - Update frequency (including time of day and day of week) - Pull request limits (per update run and/or open at any time) - Out-of-range updates (receive only lockfile updates, if desired) - Security updates (receive only security updates, if desired)
    ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1018/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 721050815,MDU6SXNzdWU3MjEwNTA4MTU=,1019,"""Edit SQL"" button on canned queries",639012,jsfenfen,closed,0,,,6026070,0.51,7,2020-10-14T00:51:39Z,2020-10-23T19:44:06Z,2020-10-14T03:44:23Z,CONTRIBUTOR,,"Feature request: Would it be possible to add an ""edit this query"" button on canned queries? Clicking it would open the canned query as an editable sql query. I think the intent is to have named parameters to allow this, but sometimes you just gotta rewrite it? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1019/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274264175,MDU6SXNzdWUyNzQyNjQxNzU=,102,datasette publish elasticbeanstalk,9599,simonw,closed,0,,,,,1,2017-11-15T18:48:31Z,2021-01-04T20:13:20Z,2021-01-04T20:13:19Z,OWNER,,"It looks like Elastic Beanstalk is the most convenient way to deploy a docker container to AWS without first deploying a cluster. https://aws.amazon.com/blogs/devops/dockerizing-a-python-web-app/ looks helpful. We would need to automate the deployment with Boto: http://boto3.readthedocs.io/en/latest/reference/services/elasticbeanstalk.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/102/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 721830990,MDExOlB1bGxSZXF1ZXN0NTAzNjg1MDc3,1022,Fix table name in spatialite example command,639012,jsfenfen,closed,0,,,,,2,2020-10-14T22:19:34Z,2020-10-14T23:46:46Z,2020-10-14T23:46:46Z,CONTRIBUTOR,simonw/datasette/pulls/1022,The example query for creating a new point geometry seems to be using a table called 'museums' but at one point it instead uses 'events'. I *believe* it is intended to be museums (the example makes more sense if so). ,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1022/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 722673818,MDU6SXNzdWU3MjI2NzM4MTg=,1023,Fix issues relating to base_url,9599,simonw,closed,0,,,6026070,0.51,3,2020-10-15T21:02:06Z,2020-11-24T19:51:44Z,2020-10-31T20:51:01Z,OWNER,,Lots of `base_url` bugs that I'd like to solve at once.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1023/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 722674708,MDU6SXNzdWU3MjI2NzQ3MDg=,1024,Figure out how to run an environment that exercises the base_url proxy setting,9599,simonw,closed,0,,,6026070,0.51,9,2020-10-15T21:03:39Z,2020-10-23T19:44:06Z,2020-10-15T22:34:04Z,OWNER,,Refs #1023.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1024/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 722724086,MDU6SXNzdWU3MjI3MjQwODY=,1025,"Fix last remaining links to ""/"" that do not respect base_url",9599,simonw,closed,0,,,6026070,0.51,7,2020-10-15T22:46:38Z,2020-10-23T19:44:06Z,2020-10-20T05:21:29Z,OWNER,,"Refs #1023 ``` datasette % git grep '""/""' -- '*.html' datasette/templates/error.html: home datasette/templates/patterns.html: home / datasette/templates/query.html: home / ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1025/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 722738988,MDU6SXNzdWU3MjI3Mzg5ODg=,1026,How should datasette.client interact with base_url,9599,simonw,closed,0,,,6026070,0.51,5,2020-10-15T23:07:11Z,2020-10-31T19:29:52Z,2020-10-31T19:29:51Z,OWNER,,"Refs #1023. If Datasette is running with a `base_url` setting and a plugin calls e.g. `datasette.client.get(""/-/plugins.json"")` what should happen?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1026/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 722758132,MDU6SXNzdWU3MjI3NTgxMzI=,1027,Add documentation on serving Datasette behind a proxy using base_url,9599,simonw,closed,0,,,6026070,0.51,5,2020-10-15T23:46:29Z,2020-10-31T21:14:05Z,2020-10-31T21:14:05Z,OWNER,,"This can go on this page: https://docs.datasette.io/en/stable/deploying.html Refs #1023, #865",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1027/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 723803777,MDU6SXNzdWU3MjM4MDM3Nzc=,1028,--load-extension=spatialite shortcut,9599,simonw,closed,0,,,6026070,0.51,1,2020-10-17T17:02:08Z,2022-01-20T21:29:41Z,2020-10-19T22:37:55Z,OWNER,,I added this to `sqlite-utils` in https://github.com/simonw/sqlite-utils/issues/136 and I really like it: pass a special value of `spatialite` and Datasette should attempt to load it from known likely installation locations.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1028/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 723837704,MDExOlB1bGxSZXF1ZXN0NTA1MzM5NTE1,1029,fix(docs): broken link,17075617,jthodge,closed,0,,,,,1,2020-10-17T20:03:20Z,2020-10-17T20:05:04Z,2020-10-17T20:05:04Z,CONTRIBUTOR,simonw/datasette/pulls/1029,This PR fixes a broken markdown link in the `Publish` docs page.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1029/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274265878,MDU6SXNzdWUyNzQyNjU4Nzg=,103,datasette publish appengine,9599,simonw,closed,0,,,,,1,2017-11-15T18:54:18Z,2021-01-04T20:05:14Z,2021-01-04T20:05:14Z,OWNER,,"Similar approach to Heroku, discussed in #90 Looks like this could be pretty easy: https://cloud.google.com/appengine/docs/flexible/python/quickstart",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/103/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 724369025,MDExOlB1bGxSZXF1ZXN0NTA1NzY5NDYy,1031,Fallback to databases in inspect-data.json when no -i options are passed,299380,frankier,closed,0,,,,,6,2020-10-19T07:51:06Z,2021-03-29T01:46:45Z,2021-03-29T00:23:41Z,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/1031,Currenlty `Datasette.__init__` checks immutables against None to decide whether to fallback to inspect-data.json. This patch modifies the serve command to pass None when no -i options are passed so this fallback works correctly.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1031/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 725099777,MDU6SXNzdWU3MjUwOTk3Nzc=,1033,datasette.urls.static_plugins(...) method,9599,simonw,closed,0,,,6026070,0.51,5,2020-10-20T01:16:32Z,2020-10-24T22:58:33Z,2020-10-24T20:03:52Z,OWNER,,"Follow-on from #904. For constructing URLs like this: /-/static-plugins/NAME_OF_PLUGIN_PACKAGE/yourfile.js Should be documented on https://docs.datasette.io/en/latest/writing_plugins.html#static-assets and https://docs.datasette.io/en/latest/internals.html#datasette-urls",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1033/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 725184645,MDU6SXNzdWU3MjUxODQ2NDU=,1034,Better way of representing binary data in .csv output,9599,simonw,closed,0,,,6026070,0.51,19,2020-10-20T04:28:58Z,2021-06-17T18:13:21Z,2020-10-29T22:47:46Z,OWNER,,"I just noticed this: https://latest.datasette.io/fixtures/binary_data.csv ```csv rowid,data 1,b'\x15\x1c\x02\xc7\xad\x05\xfe' 2,b'\x15\x1c\x03\xc7\xad\x05\xfe' ``` There's no good way to represent binary data in a CSV file, but this seems like one of the more-bad options.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1034/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 725743755,MDU6SXNzdWU3MjU3NDM3NTU=,1035,"datasette.urls.table(..., format=""json"") argument",9599,simonw,closed,0,,,6026070,0.51,3,2020-10-20T16:09:34Z,2020-10-31T18:16:43Z,2020-10-31T18:16:43Z,OWNER,,"> That `datasette.urls.table(""db"", ""table"") + "".json""` example is bad because if the table name contains a `.` it should be `?_format=json` instead. > > Maybe `.table()` should have a `format=""json""` option that knows how to do this. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1026#issuecomment-712962517_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1035/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 725996507,MDU6SXNzdWU3MjU5OTY1MDc=,1036,Make it possible to download BLOB data from the Datasette UI,9599,simonw,closed,0,,,6026070,0.51,16,2020-10-20T22:47:56Z,2021-01-18T17:45:00Z,2020-10-25T00:14:52Z,OWNER,,"Currently you can only extract binary BLOB data as base64-encoded JSON, which is not user friendly at all. It should always be possible for end-users to get the binary data out. I'm worried about XSS vulnerabilities here, but hopefully sending `Content-Type: application/octet-stream` helps there? Need to research that.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1036/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 726094754,MDU6SXNzdWU3MjYwOTQ3NTQ=,1037,Add horizontal scrollbar to tables,9599,simonw,closed,0,,,6026070,0.51,3,2020-10-21T03:13:34Z,2020-10-27T20:52:04Z,2020-10-21T03:16:36Z,OWNER,,Currently you have to scroll the entire page sideways if a table is wide. Make the table `overflow-x: auto` instead.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1037/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 726154220,MDExOlB1bGxSZXF1ZXN0NTA3MjY3MDg3,1038,DOC: Fix syntax error,194147,gerrymanoim,closed,0,,,,,2,2020-10-21T05:45:38Z,2020-10-21T22:57:21Z,2020-10-21T22:44:17Z,CONTRIBUTOR,simonw/datasette/pulls/1038,"If I understand https://docs.datasette.io/en/stable/plugin_hooks.html#register-routes correctly, `register_routes` should return a `List[Tuple[str, Callable]]`. I believe the current code in documentation has a syntax error (extra `)`). ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1038/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 726687572,MDU6SXNzdWU3MjY2ODc1NzI=,1039,Add an animation to the column actions menu,9599,simonw,closed,0,,,6026070,0.51,1,2020-10-21T16:56:28Z,2020-10-23T19:44:07Z,2020-10-21T17:02:32Z,OWNER,,"Inspired by the animation on some of GitHub's dropdown menus: https://github.com/primer/css/blob/da8ee54248e6d76c15c18e53684a15a6516b5b7f/src/utilities/animations.scss#L114-L131 ```css /* Fade in an element and scale it fast */ .anim-scale-in { animation-name: scale-in; animation-duration: 0.15s; animation-timing-function: cubic-bezier(0.2, 0, 0.13, 1.5); } @keyframes scale-in { 0% { opacity: 0; transform: scale(0.5); } 100% { opacity: 1; transform: scale(1); } } ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1039/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274284246,MDExOlB1bGxSZXF1ZXN0MTUyODcwMDMw,104,[WIP] Add publish to heroku support,21148,jacobian,closed,0,,,,,6,2017-11-15T19:56:22Z,2017-11-21T20:55:05Z,2017-11-21T20:55:05Z,CONTRIBUTOR,simonw/datasette/pulls/104," Refs #90 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 726910999,MDExOlB1bGxSZXF1ZXN0NTA3OTAzMzky,1040,/db/table/-/blob/pk/column.blob download URL,9599,simonw,closed,0,,,6026070,0.51,3,2020-10-21T22:39:15Z,2020-10-24T23:09:20Z,2020-10-24T23:09:19Z,OWNER,simonw/datasette/pulls/1040,"Refs #1036. Still needs: - [x] Comprehensive tests across all of the code branches, plus permissions - [x] A bit more refactoring to share logic cleanly with `RowView` - ~~A configuration option to disable this feature (probably)~~",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1040/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 727627923,MDU6SXNzdWU3Mjc2Mjc5MjM=,1041,extra_js_urls and extra_css_urls should respect base_url setting,9599,simonw,closed,0,,,6026070,0.51,4,2020-10-22T18:34:33Z,2020-10-31T20:49:28Z,2020-10-31T20:48:58Z,OWNER,,"_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1033#issuecomment-714681365_ Refs #1023",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1041/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 727802081,MDU6SXNzdWU3Mjc4MDIwODE=,1042,Plugin hook for loading templates,9599,simonw,closed,0,,,6026070,0.51,14,2020-10-23T00:18:39Z,2020-10-30T17:47:21Z,2020-10-30T17:47:20Z,OWNER,,This can work with the Jinja template loaders. It would unlock things like storing templates in SQLite.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1042/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 727915394,MDExOlB1bGxSZXF1ZXN0NTA4NzE5NTY3,1043,Include LICENSE in sdist,45380,bollwyvl,closed,0,,,,,4,2020-10-23T05:04:12Z,2020-10-26T00:14:57Z,2020-10-23T20:54:35Z,CONTRIBUTOR,simonw/datasette/pulls/1043,"Hi, thanks for `datasette`! This PR adds the `LICENSE` to source distributions, which seems the norm for Apache-2.0 stuff. I noticed the [0.50.2 sdist](https://files.pythonhosted.org/packages/f2/ba/1b5f182c3f1769c0863bcaa77406bdcb81c92e31bb579959c01b1d8951c0/datasette-0.50.2.tar.gz) doesn't ship `LICENSE`, but the 0.5.2 `whl` does, so I'm assuming the intent _is_ to ship... and it's a one-liner! Motivation: It might be a bit of a slog, but I'm looking to see about getting `datasette` (and friends!) available on conda-forge. There are a few missing upstreams (`asgi-csrf`, `python-basecov`, `mergedeep`) and some of the plugins don't even appear to _have_ tarballs (just `whl`!), but the little stuff like licenses are nice to get out handled upstream vs separately grabbing them.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1043/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 727916744,MDExOlB1bGxSZXF1ZXN0NTA4NzIwNjYw,1044,Add minimum supported python,45380,bollwyvl,closed,0,,,,,2,2020-10-23T05:08:03Z,2020-10-23T20:53:08Z,2020-10-23T20:53:08Z,CONTRIBUTOR,simonw/datasette/pulls/1044,"Thanks for `datasette`! This PR adds `python_requires` to formally signal the [minimum supported python version](https://packaging.python.org/guides/dropping-older-python-versions/#specify-the-version-ranges-for-supported-python-distributions) (which is pointed out with classifiers, so seems pretty straightforward).",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1044/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 728600048,MDU6SXNzdWU3Mjg2MDAwNDg=,1045,"Document that datasette.render_template(template, ...) also accepts a list of templates",9599,simonw,closed,0,,,6026070,0.51,1,2020-10-23T23:37:12Z,2020-10-24T00:22:10Z,2020-10-24T00:22:09Z,OWNER,,"https://docs.datasette.io/en/stable/internals.html#await-render-template-template-context-none-request-none `await .render_template(template, context=None, request=None)` This currently only accepts a single template. If it accepted a list of templates (where the first available template gets rendered) it could be more widely used by Datasette internally. Spotted this while researching #1042.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1045/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 728895193,MDU6SXNzdWU3Mjg4OTUxOTM=,1046,Link to blob downloads in the right places,9599,simonw,closed,0,,,6026070,0.51,2,2020-10-24T23:00:41Z,2020-10-25T00:13:21Z,2020-10-25T00:13:21Z,OWNER,,"Split from #1040, refs #1036.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1046/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 728895233,MDU6SXNzdWU3Mjg4OTUyMzM=,1047,A new section in the docs about how Datasette handles BLOB columns,9599,simonw,closed,0,,,6026070,0.51,1,2020-10-24T23:01:02Z,2020-10-31T22:11:25Z,2020-10-31T21:38:05Z,OWNER,,"Split from #1040, refs #1036.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1047/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 729017519,MDExOlB1bGxSZXF1ZXN0NTA5NTkwMjA1,1049,Add template block prior to extra URL loaders,82988,psychemedia,closed,0,,,,,4,2020-10-25T13:08:55Z,2020-10-29T09:20:52Z,2020-10-29T09:20:34Z,CONTRIBUTOR,simonw/datasette/pulls/1049,"To handle packages that require Javascript state setting prior to loading a package (eg [`thebelab`](https://thebelab.readthedocs.io/en/latest/examples/minimal_example.html), provide a template block before the URLs are loaded.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1049/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274314940,MDU6SXNzdWUyNzQzMTQ5NDA=,105,Consider data-package as a format for metadata,9599,simonw,closed,0,,,,,4,2017-11-15T21:43:34Z,2017-11-20T19:50:53Z,2017-11-20T19:50:53Z,OWNER,,http://frictionlessdata.io/specs/data-package/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 729057388,MDU6SXNzdWU3MjkwNTczODg=,1050,Switch to .blob render extension for BLOB downloads,9599,simonw,closed,0,,,6026070,0.51,10,2020-10-25T16:26:21Z,2020-10-29T22:01:39Z,2020-10-29T22:01:39Z,OWNER,,This may require a complete rethink of the `/db/table/-/blob/row/column.blob` mechanism I just built for #1036.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1050/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 729096595,MDU6SXNzdWU3MjkwOTY1OTU=,1051,Better display of binary data on arbitrary query results page,9599,simonw,closed,0,,,,,6,2020-10-25T19:38:06Z,2020-10-29T22:12:16Z,2020-10-29T22:01:39Z,OWNER,,"https://latest.datasette.io/fixtures?sql=select+rowid%2C+data+from+binary_data+order+by+rowid+limit+101 Problem: if these were larger fields that HTML page could have multiple megabytes of Python binary string representations on it. It should behave more like the regular table view does: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1051/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 729183332,MDU6SXNzdWU3MjkxODMzMzI=,1052,Column action menu overlapped by Leaflet maps,9599,simonw,closed,0,,,6026070,0.51,1,2020-10-26T02:17:29Z,2020-10-27T20:52:04Z,2020-10-26T02:19:36Z,OWNER,,"Using `datasette-leaflet-geojson`: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1052/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 729604838,MDU6SXNzdWU3Mjk2MDQ4Mzg=,1053,Document recommendations for plugin authors to design URLs,9599,simonw,closed,0,,,6026070,0.51,1,2020-10-26T14:19:21Z,2020-10-29T19:37:58Z,2020-10-29T19:35:40Z,OWNER,,"See thread: https://twitter.com/kanedr/status/1320653434895347713 > The process and API for making a plugin is great btw. One question I had was the best format for URLs. I've created an url like ///reconcile as a json endpoint, but that could conflict with the row-level URLs. Is there a recommended pattern to use?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1053/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 730199464,MDU6SXNzdWU3MzAxOTk0NjQ=,1054,Switch from versioneer to concrete version in setup.py,9599,simonw,closed,0,,,6026070,0.51,2,2020-10-27T07:38:08Z,2020-10-29T03:38:18Z,2020-10-29T03:38:17Z,OWNER,,"The new PyPI resolver keeps on showing me warnings like this one when I install Datasette directly from GitHub using `pip install https://github.com/simonw/datasette/archive/main.zip`: ``` Successfully built datasette Installing collected packages: datasette Attempting uninstall: datasette Found existing installation: datasette 0.50.2 Uninstalling datasette-0.50.2: Successfully uninstalled datasette-0.50.2 ERROR: After October 2020 you may experience errors when installing or updating packages. This is because pip will change the way that it resolves dependency conflicts. We recommend you use --use-feature=2020-resolver to test your packages with the new resolver before it becomes the default. datasette-upload-csvs 0.5 requires datasette>=0.47, but you'll have datasette 0+unknown which is incompatible. datasette-publish-vercel 0.8 requires datasette>=0.44, but you'll have datasette 0+unknown which is incompatible. datasette-psutil 0.2 requires datasette>=0.44, but you'll have datasette 0+unknown which is incompatible. datasette-leaflet-geojson 0.6 requires datasette>=0.48, but you'll have datasette 0+unknown which is incompatible. datasette-edit-schema 0.3 requires datasette>=0.44, but you'll have datasette 0+unknown which is incompatible. datasette-cluster-map 0.13 requires datasette>=0.48, but you'll have datasette 0+unknown which is incompatible. Successfully installed datasette-0+unknown ``` This is because we use versioneer. I'm going to drop that in favour of embedding the version directly in `setup.py`, like I do in other projects such as `sqlite-utils`. I'll use a `.dev` suffix in the development version, as suggested by https://www.python.org/dev/peps/pep-0440/#developmental-releases",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1054/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 730752399,MDExOlB1bGxSZXF1ZXN0NTExMDA1NTQy,1056,"Radical new colour scheme and base styles, courtesy of @natbat",9599,simonw,closed,0,,,6026070,0.51,1,2020-10-27T19:31:49Z,2020-10-27T19:39:57Z,2020-10-27T19:39:56Z,OWNER,simonw/datasette/pulls/1056,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1056/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 730797787,MDU6SXNzdWU3MzA3OTc3ODc=,1057,--cors should enable /fixtures.db CORS access,9599,simonw,closed,0,,,6026070,0.51,1,2020-10-27T20:38:34Z,2020-10-27T20:52:05Z,2020-10-27T20:51:09Z,OWNER,,So Datasette can work with `SQL.js` as seen in https://observablehq.com/@mbostock/sqlite,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1057/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 730802994,MDU6SXNzdWU3MzA4MDI5OTQ=,1058,Database download should implement cascading permissions,9599,simonw,closed,0,,,6026070,0.51,1,2020-10-27T20:43:27Z,2020-10-28T03:15:47Z,2020-10-28T03:15:47Z,OWNER,,"https://github.com/simonw/datasette/blob/5a1519796037105bc20bcf2f91a76e022926c204/datasette/views/database.py#L130-L136 Should be updated for #832 cascading permissions. Example commit: https://github.com/simonw/datasette/commit/d6e03b04302a0852e7133dc030eab50177c37be7",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1058/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 731445447,MDExOlB1bGxSZXF1ZXN0NTExNTQ5Mzc0,1059,"Update aiofiles requirement from <0.6,>=0.4 to >=0.4,<0.7",27856297,dependabot-preview[bot],closed,0,,,,,2,2020-10-28T13:32:40Z,2020-10-28T17:08:29Z,2020-10-28T17:08:28Z,CONTRIBUTOR,simonw/datasette/pulls/1059,"Updates the requirements on [aiofiles](https://github.com/Tinche/aiofiles) to permit the latest version.
    Commits

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
    Dependabot commands and options
    You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) - `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language - `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language - `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language - `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language - `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com): - Update frequency (including time of day and day of week) - Pull request limits (per update run and/or open at any time) - Out-of-range updates (receive only lockfile updates, if desired) - Security updates (receive only security updates, if desired)
    ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1059/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274315193,MDU6SXNzdWUyNzQzMTUxOTM=,106,Document how pagination works,9599,simonw,closed,0,,,,,1,2017-11-15T21:44:32Z,2019-06-24T06:42:33Z,2019-06-24T06:42:33Z,OWNER,,I made a start at that in this comment: https://news.ycombinator.com/item?id=15691926,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 731827081,MDExOlB1bGxSZXF1ZXN0NTExODY4MTUz,1060,New explicit versioning mechanism,9599,simonw,closed,0,,,6026070,0.51,1,2020-10-28T22:14:55Z,2020-10-29T03:38:17Z,2020-10-29T03:38:16Z,OWNER,simonw/datasette/pulls/1060,"- Remove all references to versioneer - Re-implement versioning to use a static string baked into the repo - Ensure that string is output by `datasette --version` and `/-/versions` Refs #1054",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1060/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 732634375,MDExOlB1bGxSZXF1ZXN0NTEyNTQ1MzY0,1061,.blob output renderer,9599,simonw,closed,0,,,6026070,0.51,4,2020-10-29T20:25:08Z,2020-10-29T22:01:40Z,2020-10-29T22:01:39Z,OWNER,simonw/datasette/pulls/1061,"- [x] Remove the `/-/...blob/...` route I added in #1040 in place of the new `.blob` renderer URLs - [x] Link to new `.blob` download links on the arbitrary query page (using `_blob_hash=...`) - plus tests for this Closes #1050, Closes #1051",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1061/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 732685643,MDU6SXNzdWU3MzI2ODU2NDM=,1063,.csv should link to .blob downloads,9599,simonw,closed,0,,,6026070,0.51,3,2020-10-29T21:45:58Z,2021-06-17T18:12:30Z,2020-10-29T22:47:45Z,OWNER,,"- [x] Update `.csv` output to link to these things (and get that `xfail` test to pass) - ~~Add a `.csv?_blob_base64=1` argument that causes them to be output in base64 in the CSV~~ > Moving the CSV work to a separate ticket. _Originally posted by @simonw in https://github.com/simonw/datasette/pull/1061#issuecomment-719042601_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1063/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 732798913,MDU6SXNzdWU3MzI3OTg5MTM=,1064,Navigation menu plus plugin hook,9599,simonw,closed,0,,,6026070,0.51,10,2020-10-30T00:49:36Z,2020-10-30T03:45:16Z,2020-10-30T03:45:16Z,OWNER,,Needed for #690. Prototype in https://github.com/simonw/datasette/commit/0d7ac764861d84be24d661cf4104ce61ea11a82a,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1064/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 732856937,MDExOlB1bGxSZXF1ZXN0NTEyNzM2NzA1,1065,Nav menu plus menu_links() hook,9599,simonw,closed,0,,,6026070,0.51,1,2020-10-30T03:40:18Z,2020-10-30T03:45:17Z,2020-10-30T03:45:16Z,OWNER,simonw/datasette/pulls/1065,"Closes #1064, refs #690.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1065/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 732859030,MDU6SXNzdWU3MzI4NTkwMzA=,1066,Table actions menu plus plugin hook,9599,simonw,closed,0,,,6026070,0.51,3,2020-10-30T03:46:54Z,2020-10-30T05:18:36Z,2020-10-30T05:16:50Z,OWNER,,"> For the table actions: attaching it to a cog icon next to the table name could make sense. > > > > This is the column action icon at twice the size, color `#666`. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/690#issuecomment-709497595_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1066/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 732905360,MDU6SXNzdWU3MzI5MDUzNjA=,1067,"Table actions menu on view pages, not on query pages",9599,simonw,closed,0,,,6026070,0.51,6,2020-10-30T05:56:39Z,2020-10-31T17:51:31Z,2020-10-31T17:40:14Z,OWNER,,Follow-on from #1066.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1067/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 732939921,MDU6SXNzdWU3MzI5Mzk5MjE=,1068,Default menu links should check a real permission ,9599,simonw,closed,0,,,6026070,0.51,5,2020-10-30T07:08:34Z,2020-10-30T15:44:13Z,2020-10-30T15:42:11Z,OWNER,,"https://github.com/simonw/datasette/blob/18a64fbb29271ce607937110bbdb55488c43f4e0/datasette/default_menu_links.py#L4-L6 This should check a named permission so that it can be customized by permission plugins.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1068/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 733303548,MDExOlB1bGxSZXF1ZXN0NTEzMTA2MDI2,1069,load_template() plugin hook,9599,simonw,closed,0,,,6026070,0.51,6,2020-10-30T15:59:45Z,2020-10-30T17:47:20Z,2020-10-30T17:47:19Z,OWNER,simonw/datasette/pulls/1069,Refs #1042,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1069/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274343647,MDExOlB1bGxSZXF1ZXN0MTUyOTE0NDgw,107,add support for ?field__isnull=1,3433657,raynae,closed,0,,,,,4,2017-11-15T23:36:36Z,2017-11-17T15:12:29Z,2017-11-17T13:29:22Z,CONTRIBUTOR,simonw/datasette/pulls/107,Is this what you had in mind for [this issue](https://github.com/simonw/datasette/issues/64)?,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/107/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 733390884,MDU6SXNzdWU3MzMzOTA4ODQ=,1070,load_template() example in documentation showing loading from a database,9599,simonw,closed,0,,,6026070,0.51,1,2020-10-30T17:45:03Z,2020-10-31T16:22:51Z,2020-10-31T16:22:45Z,OWNER,,"> I should include an example in the documentation that shows loading templates from a database table. _Originally posted by @simonw in https://github.com/simonw/datasette/pull/1069#issuecomment-719664530_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1070/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 733485423,MDU6SXNzdWU3MzM0ODU0MjM=,1071,Messages should be displayed full width,9599,simonw,closed,0,,,6026070,0.51,1,2020-10-30T20:11:35Z,2020-10-30T20:20:02Z,2020-10-30T20:13:05Z,OWNER,,"In the pattern portfolio: But they're currently showing like this: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1071/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 733499930,MDU6SXNzdWU3MzM0OTk5MzA=,1072,load_template hook doesn't work for include/extends,9599,simonw,closed,0,,,6026070,0.51,20,2020-10-30T20:33:44Z,2020-10-31T20:48:18Z,2020-10-30T22:50:57Z,OWNER,,"Includes like this one always go to disk, without hitting the `load_template` plugin hook: ```html+jinja
    {% block footer %}{% include ""_footer.html"" %}{% endblock %}
    ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1072/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 733560417,MDU6SXNzdWU3MzM1NjA0MTc=,1073,Remove load_template plugin hook,9599,simonw,closed,0,,,6026070,0.51,1,2020-10-30T22:51:52Z,2020-10-31T16:22:00Z,2020-10-31T16:22:00Z,OWNER,,"I couldn't get it working correctly with async (necessary for include/extend to function), and on deeper investigation it appears that I can build something equivalent to what I wanted using the existing `prepare_jinja2_environment` hook. > I'm going to remove the `load_template` plugin hook and see if it's possible to build the edit templates extension against `prepare_jinja2_environment` instead. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1072#issuecomment-719833744_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1073/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 733768037,MDU6SXNzdWU3MzM3NjgwMzc=,1074,latest.datasette.io should include plugins from fixtures,9599,simonw,closed,0,,,6026070,0.51,3,2020-10-31T17:23:23Z,2020-10-31T19:47:47Z,2020-10-31T19:47:47Z,OWNER,,"> It bothers me that these aren't visible in any public demos. Maybe `latest.datasette.io` should include the `my_plugins.py` and `my_plugins2.py` plugins? _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1067#issuecomment-719961701_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1074/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 733796942,MDU6SXNzdWU3MzM3OTY5NDI=,1075,PrefixedUrlString mechanism broke everything,9599,simonw,closed,0,,,6026070,0.51,6,2020-10-31T19:58:05Z,2020-10-31T20:48:51Z,2020-10-31T20:48:51Z,OWNER,,Added in 7a67bc7a569509d65b3a8661e0ad2c65f0b09166 refs #1026. Lots of tests are failing now.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1075/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 733805089,MDU6SXNzdWU3MzM4MDUwODk=,1076,Release notes for 0.51,9599,simonw,closed,0,,,6026070,0.51,0,2020-10-31T20:51:21Z,2020-10-31T22:27:00Z,2020-10-31T22:27:00Z,OWNER,,Start by combining release notes from https://github.com/simonw/datasette/releases/tag/0.51a0 and https://github.com/simonw/datasette/releases/tag/0.51a1 and https://github.com/simonw/datasette/releases/tag/0.51a2,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1076/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 733829385,MDU6SXNzdWU3MzM4MjkzODU=,1077,database_actions plugin hook,9599,simonw,closed,0,,,6055094,Datasette 0.52,3,2020-10-31T23:48:12Z,2020-11-02T18:43:25Z,2020-11-02T18:29:50Z,OWNER,,Like `column_actions` but adds a cog menu to the database page.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1077/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274374317,MDU6SXNzdWUyNzQzNzQzMTc=,108,"Include version in python code, output in template",9599,simonw,closed,0,,,,,0,2017-11-16T02:32:40Z,2017-11-16T15:30:04Z,2017-11-16T15:30:04Z,OWNER,,It would be useful if I could tell which version of datasette was running on a site. Embed version number and include it in maybe a tooltip on the “powered by datasette” link,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/108/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 735644513,MDU6SXNzdWU3MzU2NDQ1MTM=,1081,"Fixtures should use FTS4 or FTS5, not FTS3",9599,simonw,closed,0,,,6055094,Datasette 0.52,0,2020-11-03T21:24:13Z,2020-11-12T00:03:00Z,2020-11-12T00:02:59Z,OWNER,,"Just spotted that `fixtures.db` uses FTS3, which is pretty much obsolete these days. https://github.com/simonw/datasette/blob/13d1228d80c91d382a05b1a9549ed02c300ef851/tests/fixtures.py#L488-L489",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1081/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 737394470,MDU6SXNzdWU3MzczOTQ0NzA=,1084,Table/database action menu cut off if too short ,9599,simonw,closed,0,,,6055094,Datasette 0.52,4,2020-11-06T01:55:23Z,2020-11-21T23:45:59Z,2020-11-21T23:45:59Z,OWNER,,"![3CC0C181-959E-4B20-BE39-806ED93E833E](https://user-images.githubusercontent.com/9599/98316836-03891800-1f90-11eb-9e52-5266baf33296.jpeg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1084/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 740512882,MDExOlB1bGxSZXF1ZXN0NTE4OTg4ODc5,1085,Use FTS4 in fixtures,9599,simonw,closed,0,,,,,1,2020-11-11T06:44:30Z,2020-11-12T00:02:59Z,2020-11-12T00:02:58Z,OWNER,simonw/datasette/pulls/1085,Refs #1081,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1085/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 741021342,MDU6SXNzdWU3NDEwMjEzNDI=,1086,Foreign keys with blank titles result in non-clickable links,9599,simonw,closed,0,,,6055094,Datasette 0.52,3,2020-11-11T19:41:09Z,2020-11-28T23:28:29Z,2020-11-11T23:46:20Z,OWNER,," The HTML looks like this: ```html ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1086/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 741268956,MDU6SXNzdWU3NDEyNjg5NTY=,1088,OperationalError('interrupted') can 500 on row page,9599,simonw,closed,0,,,6055094,Datasette 0.52,3,2020-11-12T04:29:55Z,2020-11-28T23:28:35Z,2020-11-12T04:36:52Z,OWNER,,"I got this on my (private) https://dogsheep.simonwillison.net/twitter/tweets/1188612004572880896 page: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1088/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 741665726,MDU6SXNzdWU3NDE2NjU3MjY=,1089,Sweep documentation for words that minimize involved difficulty,9599,simonw,closed,0,,,6055094,Datasette 0.52,1,2020-11-12T14:53:05Z,2020-11-28T23:28:43Z,2020-11-12T20:07:26Z,OWNER,,Inspired by https://github.com/django/django/pull/11482,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1089/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274378301,MDU6SXNzdWUyNzQzNzgzMDE=,109,Set up readthedocs,9599,simonw,closed,0,,,,,1,2017-11-16T02:58:01Z,2017-11-16T16:53:26Z,2017-11-16T16:13:56Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/109/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 742011049,MDU6SXNzdWU3NDIwMTEwNDk=,1091,.json and .csv exports fail to apply base_url,9599,simonw,closed,0,,,6346396,Datasette 0.54,22,2020-11-12T23:45:16Z,2021-01-24T21:20:24Z,2021-01-09T22:19:29Z,OWNER,,"> Just tested with the latest Docker image, and it works pretty much everywhere! THANK YOU! > > I did notice that if I try to export json or csv, the base is not applied. Not sure if I should reopen this issue or open a new one. > > To see this, go here: https://corpora.tika.apache.org/datasette/corpora-metadata/REF_PARSE_EXCEPTION_TYPES > > Click/hover over json or CSV and you'll see that the 'datasette' base is not included. _Originally posted by @tballison in https://github.com/simonw/datasette/issues/865#issuecomment-726385422_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1091/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 742041667,MDU6SXNzdWU3NDIwNDE2Njc=,1092,Make cascading permission checks available to plugins,9599,simonw,closed,0,,,,,1,2020-11-13T01:02:55Z,2023-08-30T22:17:42Z,2023-08-30T22:17:41Z,OWNER,,"The `BaseView` class has a method for cascading permission checks, but it's not easily accessible to plugins. https://github.com/simonw/datasette/blob/5eb8e9bf250b26e30b017d39a392c33973997656/datasette/views/base.py#L75-L99 This leaves plugins like `datasette-graphql` having to implement their own versions of this logic, which is bad: https://github.com/simonw/datasette-graphql/issues/65 > First check `view-database` - if that says `False` then disallow access, if it says `True` then allow access. If it says `None` check `view-instance`. This should become a supported API that plugins are encouraged to use.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1092/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 743011397,MDU6SXNzdWU3NDMwMTEzOTc=,1094,import EX_CANTCREAT means datasette fails to work on Windows,1049910,drkane,closed,0,,,,,1,2020-11-14T14:17:11Z,2020-12-05T19:35:04Z,2020-12-05T19:35:04Z,NONE,,"Trying to use datasette 0.51.1 gives the following error: ``` ImportError: cannot import name 'EX_CANTCREAT' from 'os' (C:\Users\drkan\AppData\Local\Programs\Python\Python39\lib\os.py) ``` Looks like that code is only available on unix: https://docs.python.org/3/library/os.html#os.EX_CANTCREAT Removing the line makes it work fine (`EX_CANTCREAT` doesn't seem to be used anywhere?)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1094/reactions"", ""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 743369188,MDExOlB1bGxSZXF1ZXN0NTIxMjc2Mjk2,1097,Use f-strings,9599,simonw,closed,0,,,,,1,2020-11-15T23:12:36Z,2020-11-15T23:24:24Z,2020-11-15T23:24:23Z,OWNER,simonw/datasette/pulls/1097,"Since Datasette now requires Python 3.6, how about some f-strings? I ran this in the `datasette` root checkout: ``` pip install flynt flynt . black . ```",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1097/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 743370900,MDU6SXNzdWU3NDMzNzA5MDA=,1098,Foreign key links break for compound foreign keys,9599,simonw,closed,0,,,,,5,2020-11-15T23:22:14Z,2020-11-29T19:50:31Z,2020-11-29T19:30:23Z,OWNER,,"Reported on Twitter here: https://twitter.com/ZaneSelvans/status/1328093641395548161 > Maybe I'm doing something wrong here but the automatically generated links based foreign key relationships seem to be working here for utility_id_eia, but not for plant_id_eia & generator_id which seems odd: https://pudl-datasette-xl7xwcpe2a-uc.a.run.app/pudl/generators_eia860 > > Right now it seems like they're trying to, but with only one of the two keys, so it gives ""Error 500. You did not supply a value for binding 2."" Maybe only create the links when it's a simple foreign key?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1098/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267522549,MDU6SXNzdWUyNjc1MjI1NDk=,11,Code that generates compile-time properties about the database ,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-23T02:18:24Z,2017-10-23T16:04:23Z,2017-10-23T16:04:23Z,OWNER,,"At a minimum this will include: * sha hash of each database file * list of tables with row counts for each database file",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274578142,MDU6SXNzdWUyNzQ1NzgxNDI=,110,Add --load-extension option to datasette for loading extra SQLite extensions,9599,simonw,closed,0,,,,,2,2017-11-16T16:26:19Z,2017-11-16T18:38:30Z,2017-11-16T16:58:50Z,OWNER,,"This would allow users with extra SQLite extensions installed (like spatialite) to load them at runtime. Inspired by this comment: https://github.com/simonw/datasette/issues/46#issuecomment-344810525",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/110/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 747702144,MDU6SXNzdWU3NDc3MDIxNDQ=,1100,Error on OPTIONS request to database,1319404,akehrer,closed,0,,,,,2,2020-11-20T18:16:43Z,2020-12-03T00:57:35Z,2020-12-03T00:50:17Z,NONE,,"When I perform an OPTIONS request against a database or table datasette fails with an internal error. All these tests result in the traceback below. ``` curl -XOPTIONS http://127.0.0.1:8001/test-db curl -XOPTIONS http://127.0.0.1:8001/test-db/table1 curl -XOPTIONS http://127.0.0.1:8001/test-db/table1\?_search\=test ``` ``` Traceback (most recent call last): File ""[path-to-python]/site-packages/datasette/app.py"", line 1033, in route_path response = await view(request, send) File ""[path-to-python]/site-packages/datasette/views/base.py"", line 146, in view request, **request.scope[""url_route""][""kwargs""] File ""[path-to-python]/site-packages/datasette/views/base.py"", line 118, in dispatch_request return await handler(request, *args, **kwargs) TypeError: object Response can't be used in 'await' expression ``` Making the `options` function in the `DataView` class async fixed it for me. ```python async def options(self, request, *args, **kwargs): r = Response.text(""ok"") if self.ds.cors: r.headers[""Access-Control-Allow-Origin""] = ""*"" return r ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1100/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 749289611,MDU6SXNzdWU3NDkyODk2MTE=,1102,Plugin testing docs should show datasette.client,9599,simonw,closed,0,,,,,1,2020-11-24T02:34:46Z,2020-11-29T07:47:22Z,2020-11-29T07:44:58Z,OWNER,,https://docs.datasette.io/en/stable/testing_plugins.html currently shows how to use HTTPX directly.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1102/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 749979454,MDU6SXNzdWU3NDk5Nzk0NTQ=,1103,Rename /-/config to /-/settings,9599,simonw,closed,0,,,6055094,Datasette 0.52,2,2020-11-24T19:31:00Z,2020-11-24T20:19:20Z,2020-11-24T20:19:19Z,OWNER,,"As part of rebranding config to settings, see also #992.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1103/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 749981663,MDU6SXNzdWU3NDk5ODE2NjM=,1104,config.json in directory config mode should be settings.json,9599,simonw,closed,0,,,6055094,Datasette 0.52,2,2020-11-24T19:34:38Z,2020-11-24T20:37:42Z,2020-11-24T20:37:41Z,OWNER,,"Another knock-on effect of #992. https://github.com/simonw/datasette/blob/4bac9f18f9d04e5ed10f072502bcc508e365438e/docs/config.rst#L51-L55",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 749982022,MDU6SXNzdWU3NDk5ODIwMjI=,1105,Rebrand config as settings,9599,simonw,closed,0,,,6055094,Datasette 0.52,2,2020-11-24T19:35:12Z,2020-11-24T21:40:28Z,2020-11-24T21:40:28Z,OWNER,,"I realized I need a tracking ticket for this. I want to start splitting things like plugin configuration and default facets / sort order out of `metadata.json` - so I want to start calling those things configuration. But the term configuration is already used for the `--config` family of global settings. So I'm rebranding that type of configuration as settings to free up the name ""configuration"" for more run-time concerns (default sort order) and plugin configuration.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 749983857,MDU6SXNzdWU3NDk5ODM4NTc=,1106,Rebrand and redirect config.rst as settings.rst,9599,simonw,closed,0,,,6055094,Datasette 0.52,4,2020-11-24T19:38:17Z,2020-11-24T21:39:58Z,2020-11-24T21:39:58Z,OWNER,,"> I'd like to redirect https://docs.datasette.io/en/stable/config.html to a new https://docs.datasette.io/en/stable/settings.html page too. I can use https://docs.readthedocs.io/en/stable/user-defined-redirects.html for that. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1105#issuecomment-733190827_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 750079085,MDU6SXNzdWU3NTAwNzkwODU=,1107,Rename datasette.config() method to datasette.setting(),9599,simonw,closed,0,,,6055094,Datasette 0.52,5,2020-11-24T21:24:11Z,2020-11-24T22:09:11Z,2020-11-24T22:06:38Z,OWNER,,Part of #1105. Thankfully this isn't yet part of the documented public API on https://docs.datasette.io/en/stable/internals.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1107/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 750087350,MDU6SXNzdWU3NTAwODczNTA=,1108,Configure /en/stable/config.html redirect when I ship 0.52,9599,simonw,closed,0,,,6055094,Datasette 0.52,1,2020-11-24T21:39:19Z,2020-11-29T02:42:42Z,2020-11-29T02:42:42Z,OWNER,,"Like this: _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1106#issuecomment-733248437_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1108/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 750330029,MDU6SXNzdWU3NTAzMzAwMjk=,1110,datasette publish option for installing extra apt-get packages,9599,simonw,closed,0,,,6055094,Datasette 0.52,2,2020-11-25T03:03:43Z,2020-11-28T23:28:56Z,2020-11-25T03:05:41Z,OWNER,,I ran into a need for this while playing with https://github.com/simonw/datasette-ripgrep - I need to install the `ripgrep` Ubuntu package when I deploy the plugin using Cloud Run.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1110/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 752749485,MDExOlB1bGxSZXF1ZXN0NTI4OTk3NjE0,1112,Fix --metadata doc usage,50527,jefftriplett,closed,0,,,6055094,Datasette 0.52,3,2020-11-28T19:19:51Z,2020-11-28T23:28:21Z,2020-11-28T19:53:48Z,CONTRIBUTOR,simonw/datasette/pulls/1112,"I stumbled on this while trying to figure out how to configure datasette-ripgrep via https://github.com/simonw/datasette-ripgrep/issues/15 You may not want to update the changelog (those are annoying) so I added two commits in case that's easier. ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1112/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 752789159,MDU6SXNzdWU3NTI3ODkxNTk=,1113,500 error on row page if query against foreign keys hits time limit,9599,simonw,closed,0,,,,,1,2020-11-28T23:20:08Z,2020-11-29T02:40:01Z,2020-11-28T23:23:31Z,OWNER,,"This page exhibited the following error: https://data.catalyst.coop/ferc1/f1_respondent_id/145 `(OperationalError('interrupted'), 'select (select count(*) from f1_acb_epda where respondent_id=:id), (select count(*) from f1_accumdepr_prvsn where respondent_id=:id), (select count(*) from f1_accumdfrrdtaxcr where respondent_id=:id), (select count(*) from f1_adit_190_detail where respondent_id=:id), (select count(*) from f1_adit_190_notes where respondent_id=:id), (select count(*) from f1_adit_amrt_prop where respondent_id=:id), (select count(*) from f1_adit_other where respondent_id=:id), (select count(*) from f1_adit_other_prop where respondent_id=:id), (select count(*) from f1_allowances where respondent_id=:id), (select count(*) from f1_bal_sheet_cr where respondent_id=:id), (select count(*) from f1_capital_stock where respondent_id=:id), (select count(*) from f1_cash_flow where respondent_id=:id), (select count(*) from f1_cmmn_utlty_p_e where respondent_id=:id), (select count(*) from f1_comp_balance_db where respondent_id=:id), (select count(*) from f1_construction where respondent_id=:id), (select count(*) from f1_control_respdnt where respondent_id=:id), (select count(*) from f1_co_directors where respondent_id=:id), (select count(*) from f1_cptl_stk_expns where respondent_id=:id), (select count(*) from f1_csscslc_pcsircs where respondent_id=:id), (select count(*) from f1_dacs_epda where respondent_id=:id), (select count(*) from f1_dscnt_cptl_stk where respondent_id=:id), (select count(*) from f1_edcfu_epda where respondent_id=:id), (select count(*) from f1_elctrc_erg_acct where respondent_id=:id), (select count(*) from f1_elctrc_oper_rev where respondent_id=:id), (select count(*) from f1_elc_oper_rev_nb where respondent_id=:id), (select count(*) from f1_elc_op_mnt_expn where respondent_id=:id), (select count(*) from f1_electric where respondent_id=:id), (select count(*) from f1_envrnmntl_expns where respondent_id=:id), (select count(*) from f1_envrnmntl_fclty where respondent_id=:id), (select count(*) from f1_fuel where respondent_id=:id), (select count(*) from f1_general_info where respondent_id=:id), (select count(*) from f1_gnrt_plant where respondent_id=:id), (select count(*) from f1_important_chg where respondent_id=:id), (select count(*) from f1_incm_stmnt_2 where respondent_id=:id), (select count(*) from f1_income_stmnt where respondent_id=:id), (select count(*) from f1_miscgen_expnelc where respondent_id=:id), (select count(*) from f1_misc_dfrrd_dr where respondent_id=:id), (select count(*) from f1_mthly_peak_otpt where respondent_id=:id), (select count(*) from f1_mtrl_spply where respondent_id=:id), (select count(*) from f1_nbr_elc_deptemp where respondent_id=:id), (select count(*) from f1_nonutility_prop where respondent_id=:id), (select count(*) from f1_nuclear_fuel where respondent_id=:id), (select count(*) from f1_officers_co where respondent_id=:id), (select count(*) from f1_othr_dfrrd_cr where respondent_id=:id), (select count(*) from f1_othr_pd_in_cptl where respondent_id=:id), (select count(*) from f1_othr_reg_assets where respondent_id=:id), (select count(*) from f1_othr_reg_liab where respondent_id=:id), (select count(*) from f1_overhead where respondent_id=:id), (select count(*) from f1_pccidica where respondent_id=:id), (select count(*) from f1_plant_in_srvce where respondent_id=:id), (select count(*) from f1_pumped_storage where respondent_id=:id), (select count(*) from f1_purchased_pwr where respondent_id=:id), (select count(*) from f1_reconrpt_netinc where respondent_id=:id), (select count(*) from f1_reg_comm_expn where respondent_id=:id), (select count(*) from f1_respdnt_control where respondent_id=:id), (select count(*) from f1_retained_erng where respondent_id=:id), (select count(*) from f1_r_d_demo_actvty where respondent_id=:id), (select count(*) from f1_sales_by_sched where respondent_id=:id), (select count(*) from f1_sale_for_resale where respondent_id=:id), (select count(*) from f1_sbsdry_totals where respondent_id=:id), (select count(*) from f1_schedules_list where respondent_id=:id), (select count(*) from f1_security_holder where respondent_id=:id), (select count(*) from f1_slry_wg_dstrbtn where respondent_id=:id), (select count(*) from f1_substations where respondent_id=:id), (select count(*) from f1_taxacc_ppchrgyr where respondent_id=:id), (select count(*) from f1_unrcvrd_cost where respondent_id=:id), (select count(*) from f1_utltyplnt_smmry where respondent_id=:id), (select count(*) from f1_work where respondent_id=:id), (select count(*) from f1_xmssn_adds where respondent_id=:id), (select count(*) from f1_xmssn_elc_bothr where respondent_id=:id), (select count(*) from f1_xmssn_elc_fothr where respondent_id=:id), (select count(*) from f1_xmssn_line where respondent_id=:id), (select count(*) from f1_xtraordnry_loss where respondent_id=:id), (select count(*) from f1_audit_log where respondent_id=:id), (select count(*) from f1_load_file_names where respondent_id=:id), (select count(*) from f1_privilege where respondent_id=:id), (select count(*) from f1_sys_error_log where respondent_id=:id), (select count(*) from f1_hydro where respondent_id=:id), (select count(*) from f1_ident_attsttn where respondent_id=:id), (select count(*) from f1_steam where respondent_id=:id), (select count(*) from f1_leased where respondent_id=:id), (select count(*) from f1_sbsdry_detail where respondent_id=:id), (select count(*) from f1_plant where respondent_id=:id), (select count(*) from f1_long_term_debt where respondent_id=:id), (select count(*) from f1_106_2009 where respondent_id=:id), (select count(*) from f1_106a_2009 where respondent_id=:id), (select count(*) from f1_106b_2009 where respondent_id=:id), (select count(*) from f1_208_elc_dep where respondent_id=:id), (select count(*) from f1_231_trn_stdycst where respondent_id=:id), (select count(*) from f1_324_elc_expns where respondent_id=:id), (select count(*) from f1_325_elc_cust where respondent_id=:id), (select count(*) from f1_331_transiso where respondent_id=:id), (select count(*) from f1_338_dep_depl where respondent_id=:id), (select count(*) from f1_397_isorto_stl where respondent_id=:id), (select count(*) from f1_398_ancl_ps where respondent_id=:id), (select count(*) from f1_399_mth_peak where respondent_id=:id), (select count(*) from f1_400_sys_peak where respondent_id=:id), (select count(*) from f1_400a_iso_peak where respondent_id=:id), (select count(*) from f1_429_trans_aff where respondent_id=:id), (select count(*) from f1_allowances_nox where respondent_id=:id), (select count(*) from f1_cmpinc_hedge_a where respondent_id=:id), (select count(*) from f1_cmpinc_hedge where respondent_id=:id), (select count(*) from f1_email where respondent_id=:id), (select count(*) from f1_rg_trn_srv_rev where respondent_id=:id), (select count(*) from f1_s0_filing_log where respondent_id=:id), (select count(*) from f1_security where respondent_id=:id)', {'id': '145'})`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1113/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 752966476,MDU6SXNzdWU3NTI5NjY0NzY=,1114,--load-extension=spatialite not working with datasetteproject/datasette docker image,2182,danp,closed,0,,,,,4,2020-11-29T17:35:20Z,2022-01-20T21:29:42Z,2020-11-29T17:37:45Z,CONTRIBUTOR,,"https://github.com/simonw/datasette/commit/6aa5886379dd9017215904fb28567b80018902f9 added the `--load-extension=spatialite` shortcut looking for the extension in these places: https://github.com/simonw/datasette/blob/12877d7a48e2aa28bb5e780f929a218f7265d849/datasette/utils/__init__.py#L56-L60 However, in the datasetteproject/datasette docker image the file is at `/usr/local/lib/mod_spatialite.so`. This results in the example command [here](https://docs.datasette.io/en/stable/installation.html#loading-spatialite) failing: ``` % docker run --rm -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/data.db --load-extension=spatialite Error: Could not find SpatiaLite extension ``` But it does work when given an explicit path: ``` % docker run --rm -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/data.db --load-extension=/usr/local/lib/mod_spatialite.so INFO: Started server process [1] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://0.0.0.0:8001 (Press CTRL+C to quit) ... ``` Perhaps `SPATIALITE_PATHS` should include `/usr/local/lib/mod_spatialite.so`?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 752995227,MDU6SXNzdWU3NTI5OTUyMjc=,1115,SpatiaLite error could suggest --load-extension=spatialite,9599,simonw,closed,0,,,,,1,2020-11-29T20:05:07Z,2022-01-20T21:29:42Z,2020-11-29T20:13:22Z,OWNER,,"https://github.com/simonw/datasette/blob/242bc89fdf2e775e340d69a4e851b3a9accb31c6/datasette/cli.py#L533-L548 This could use the `find_spatialite()` function and, if it finds something, suggest the user use `--load-extension=spatialite` https://github.com/simonw/datasette/blob/242bc89fdf2e775e340d69a4e851b3a9accb31c6/datasette/utils/__init__.py#L1015-L1019",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1115/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 753668177,MDU6SXNzdWU3NTM2NjgxNzc=,1116,GENERATED column support,2789593,nattaylor,closed,0,,,,,9,2020-11-30T17:33:47Z,2020-11-30T21:29:59Z,2020-11-30T21:29:59Z,NONE,,"I think this is a feature request... perhaps I should just try to contribute it myself, but thought I'd check in case support is planned already. For a table with the following schema, datasette 0.51.1 doesn't pick up the GENERATED columns and the column list only contains `(rowid, body)` If I edit the SQL and select the generated columns, it will happily show them. At first glance it appears that [`def table_column_details(conn, table):`](https://github.com/simonw/datasette/blob/4777362bf2692bc72b221ec47c3e6216151d1b89/datasette/utils/__init__.py#L575) would have to be refactored to use a different methodology to get the columns, since `PRAGMA table_info(deeds);` returns just `0|body|TEXT|0||0` so maybe it wouldn't be worth it. ``` CREATE TABLE deeds ( body TEXT, id INT GENERATED ALWAYS AS (json_extract(body, '$.id')) STORED, consideration INT GENERATED ALWAYS AS (json_extract(body, '$.consideration')) STORED ); ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1116/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 753767911,MDExOlB1bGxSZXF1ZXN0NTI5NzgzMjc1,1117,Support for generated columns,9599,simonw,closed,0,,,,,6,2020-11-30T20:10:46Z,2020-11-30T22:23:19Z,2020-11-30T21:29:58Z,OWNER,simonw/datasette/pulls/1117,"Refs #1116. My first attempt at this worked on my laptop but broke in CI, so I'm going to iterate on it in a pull request instead.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1117/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 753788261,MDU6SXNzdWU3NTM3ODgyNjE=,1118,messagge_is_html typo,9599,simonw,closed,0,,,,,0,2020-11-30T20:43:22Z,2020-11-30T21:24:28Z,2020-11-30T21:24:28Z,OWNER,,https://ripgrep.datasette.io/-/ripgrep?pattern=messagge_is_html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1118/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 753876808,MDU6SXNzdWU3NTM4NzY4MDg=,1119,"Include generated columns in fixtures.db, if SQLite version supports it",9599,simonw,closed,0,,,,,1,2020-11-30T23:25:31Z,2020-12-01T00:41:14Z,2020-12-01T00:28:04Z,OWNER,,"In #1117 I added a one-off test that creates a DB with generated columns in it: https://github.com/simonw/datasette/blob/461670a0b87efa953141b449a9a261919864ceb3/tests/test_api.py#L1933-L1964 If this table was conditionally added to `fixtures.db` (if SQLite supports the feature) it would be available in the demo at https://latest.datasette.io/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1119/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274617240,MDU6SXNzdWUyNzQ2MTcyNDA=,112,Allow --load-extension to be set via environment variables,9599,simonw,closed,0,,,,,1,2017-11-16T18:28:31Z,2017-11-17T14:19:23Z,2017-11-17T14:17:27Z,OWNER,,"This will make it easier to package up datasette in a Docker container with a bunch of pre-compiled extensions without the user having to remember to include all of the options every time. Click has a mechanism for this: http://click.pocoo.org/5/options/#multiple-values-from-environment-values",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/112/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 753898359,MDExOlB1bGxSZXF1ZXN0NTI5ODg3ODYx,1120,generated_columns table in fixtures.py,9599,simonw,closed,0,,,,,1,2020-12-01T00:17:19Z,2020-12-01T00:28:03Z,2020-12-01T00:28:02Z,OWNER,simonw/datasette/pulls/1120,Refs #1119,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1120/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 754178780,MDU6SXNzdWU3NTQxNzg3ODA=,1121,Table actions cog is misaligned,3243482,abdusco,closed,0,,,,,1,2020-12-01T08:41:25Z,2020-12-03T01:03:19Z,2020-12-03T00:33:36Z,CONTRIBUTOR,,"At the moment it looks like this https://datasette-graphql-demo.datasette.io/github/repos ![image](https://user-images.githubusercontent.com/3243482/100716533-e6e2d300-33c9-11eb-866e-1e83ba228bf5.png) Adding a few flex statements fixes the alignment and centers `h1` text and the cog icon vertically. ![image](https://user-images.githubusercontent.com/3243482/100716605-f8c47600-33c9-11eb-8d69-0e37499cf641.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1121/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 754179035,MDExOlB1bGxSZXF1ZXN0NTMwMTI1Njk1,1122,Fix misaligned table actions cog,3243482,abdusco,closed,0,,,,,2,2020-12-01T08:41:46Z,2020-12-03T10:56:40Z,2020-12-03T00:33:37Z,CONTRIBUTOR,simonw/datasette/pulls/1122,Fixes https://github.com/simonw/datasette/issues/1121,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1122/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 755721275,MDU6SXNzdWU3NTU3MjEyNzU=,1123,"Table actions hook are order dependent, should not be",9599,simonw,closed,0,,,,,1,2020-12-03T00:47:12Z,2020-12-03T00:50:17Z,2020-12-03T00:50:17Z,OWNER,,"Got this error: https://github.com/simonw/datasette/runs/1489770800 ``` > assert get_table_actions_links(response_2.text) == [ {""label"": ""From async"", ""href"": ""/""}, {""label"": ""Database: fixtures"", ""href"": ""/""}, {""label"": f""Table: {table_or_view}"", ""href"": ""/""}, ] E AssertionError: assert [{'href': '/'...'From async'}] == [{'href': '/'...: facetable'}] E At index 0 diff: {'label': 'Database: fixtures', 'href': '/'} != {'label': 'From async', 'href': '/'} E Use -v to get the full diff ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1123/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 756439516,MDU6SXNzdWU3NTY0Mzk1MTY=,1124,Datasette on Amazon Linux on ARM returns 404 for static assets,9599,simonw,closed,0,,,,,9,2020-12-03T18:20:37Z,2020-12-03T21:42:02Z,2020-12-03T21:10:54Z,OWNER,,"Very weird bug this one. Steps to reproduce: ``` # I started a amzn2-ami-hvm-2.0.20201126.0-arm64-gp2 t4g.micro instance ec2 % ssh -i simonw-ec2.pem ec2-user@ec2-18-219-238-192.us-east-2.compute.amazonaws.com [ec2-user@ip-172-31-30-7 ~]$ sudo yum install python3 Loaded plugins: extras_suggestions, langpacks, priorities, update-motd ... Is this ok [y/d/N]: y Downloading packages: (1/4): python3-3.7.9-1.amzn2.0.1.aarch64.rpm | 72 kB 00:00:00 (2/4): python3-pip-9.0.3-1.amzn2.0.2.noarch.rpm | 1.9 MB 00:00:00 (3/4): python3-setuptools-38.4.0-3.amzn2.0.6.noarch.rpm | 617 kB 00:00:00 (4/4): python3-libs-3.7.9-1.amzn2.0.1.aarch64.rpm | 9.1 MB 00:00:00 ----------------------------------------------------------------------------------------------------------------------------------------------------------- Total 68 MB/s | 12 MB 00:00:00 Running transaction check Running transaction test Transaction test succeeded Running transaction Installing : python3-pip-9.0.3-1.amzn2.0.2.noarch 1/4 Installing : python3-3.7.9-1.amzn2.0.1.aarch64 2/4 Installing : python3-setuptools-38.4.0-3.amzn2.0.6.noarch 3/4 Installing : python3-libs-3.7.9-1.amzn2.0.1.aarch64 4/4 Verifying : python3-setuptools-38.4.0-3.amzn2.0.6.noarch 1/4 Verifying : python3-pip-9.0.3-1.amzn2.0.2.noarch 2/4 Verifying : python3-3.7.9-1.amzn2.0.1.aarch64 3/4 Verifying : python3-libs-3.7.9-1.amzn2.0.1.aarch64 4/4 Installed: python3.aarch64 0:3.7.9-1.amzn2.0.1 Dependency Installed: python3-libs.aarch64 0:3.7.9-1.amzn2.0.1 python3-pip.noarch 0:9.0.3-1.amzn2.0.2 python3-setuptools.noarch 0:38.4.0-3.amzn2.0.6 Complete! [ec2-user@ip-172-31-30-7 ~]$ python3 -m pip install --user pipx Collecting pipx Downloading https://files.pythonhosted.org/packages/42/8a/8447ec14562c5d97afbee54ec390864718bccce9dfb0506c8c77852489d3/pipx-0.15.6.0-py3-none-any.whl (43kB) 100% |████████████████████████████████| 51kB 3.8MB/s Collecting packaging>=20.0 (from pipx) Downloading https://files.pythonhosted.org/packages/28/87/8edcf555adaf60d053ead881bc056079e29319b643ca710339ce84413136/packaging-20.7-py2.py3-none-any.whl Collecting argcomplete<2.0,>=1.9.4 (from pipx) Downloading https://files.pythonhosted.org/packages/e3/d0/ee7fc6ceac8957196def9bfa3b955d02163058defd3edd51ef7b1ff2769e/argcomplete-1.12.2-py2.py3-none-any.whl Collecting userpath>=1.4.1 (from pipx) Downloading https://files.pythonhosted.org/packages/fb/f1/faca8a3cc86bd2223aaf72edd222bc31d6ae2eea5feaf17a144634a92e6d/userpath-1.4.1-py2.py3-none-any.whl Collecting pyparsing>=2.0.2 (from packaging>=20.0->pipx) Downloading https://files.pythonhosted.org/packages/8a/bb/488841f56197b13700afd5658fc279a2025a39e22449b7cf29864669b15d/pyparsing-2.4.7-py2.py3-none-any.whl (67kB) 100% |████████████████████████████████| 71kB 8.8MB/s Collecting importlib-metadata<4,>=0.23; python_version == ""3.7"" (from argcomplete<2.0,>=1.9.4->pipx) Downloading https://files.pythonhosted.org/packages/99/c7/4ccf2baa455613aa9e61372365aba8594ff2806c82189c31a6c65e7c679e/importlib_metadata-3.1.1-py3-none-any.whl Collecting click (from userpath>=1.4.1->pipx) Downloading https://files.pythonhosted.org/packages/d2/3d/fa76db83bf75c4f8d338c2fd15c8d33fdd7ad23a9b5e57eb6c5de26b430e/click-7.1.2-py2.py3-none-any.whl (82kB) 100% |████████████████████████████████| 92kB 10.5MB/s Collecting distro; platform_system == ""Linux"" (from userpath>=1.4.1->pipx) Downloading https://files.pythonhosted.org/packages/25/b7/b3c4270a11414cb22c6352ebc7a83aaa3712043be29daa05018fd5a5c956/distro-1.5.0-py2.py3-none-any.whl Collecting zipp>=0.5 (from importlib-metadata<4,>=0.23; python_version == ""3.7""->argcomplete<2.0,>=1.9.4->pipx) Downloading https://files.pythonhosted.org/packages/41/ad/6a4f1a124b325618a7fb758b885b68ff7b058eec47d9220a12ab38d90b1f/zipp-3.4.0-py3-none-any.whl Installing collected packages: pyparsing, packaging, zipp, importlib-metadata, argcomplete, click, distro, userpath, pipx Successfully installed argcomplete-1.12.2 click-7.1.2 distro-1.5.0 importlib-metadata-3.1.1 packaging-20.7 pipx-0.15.6.0 pyparsing-2.4.7 userpath-1.4.1 zipp-3.4.0 [ec2-user@ip-172-31-30-7 ~]$ python3 -m pipx ensurepath /home/ec2-user/.local/bin is already in PATH. /home/ec2-user/.local/bin is already in PATH. All pipx binary directories have been added to PATH. If you are sure you want to proceed, try again with the '--force' flag. Otherwise pipx is ready to go! ✨ 🌟 ✨ [ec2-user@ip-172-31-30-7 ~]$ pipx install datasette installed package datasette 0.52.2, Python 3.7.9 These apps are now globally available - datasette done! ✨ 🌟 ✨ [ec2-user@ip-172-31-30-7 ~]$ datasette --version datasette, version 0.52.2 [ec2-user@ip-172-31-30-7 ~]$ datasette --get /-/static/app.css 404 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1124/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 756622648,MDU6SXNzdWU3NTY2MjI2NDg=,1125,Show pysqlite3 version on /-/versions,9599,simonw,closed,0,,,,,5,2020-12-03T21:57:23Z,2020-12-04T04:16:57Z,2020-12-04T04:16:57Z,OWNER,,"This code can use `pysqlite3` if available. The version should be exposed on `/-/versions`. https://github.com/simonw/datasette/blob/4cce5516661b24afeddaf35bee84b00fbf5c7f89/datasette/utils/sqlite.py#L1-L4",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1125/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 756761963,MDU6SXNzdWU3NTY3NjE5NjM=,1126,Switch to google-github-actions/setup-gcloud for demo deploy,9599,simonw,closed,0,,,,,1,2020-12-04T03:11:24Z,2020-12-04T03:51:38Z,2020-12-04T03:51:38Z,OWNER,,"That workflow is showing warnings: https://github.com/simonw/datasette/actions/runs/399400077 > Thank you for using setup-gcloud Action. GoogleCloudPlatform/github-actions/setup-gcloud has been deprecated, please switch to google-github-actions/setup-gcloud. https://github.com/google-github-actions/setup-gcloud has a Cloud Run example here: https://github.com/google-github-actions/setup-gcloud/blob/master/example-workflows/cloud-run/.github/workflows/cloud-run.yml",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1126/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 756818250,MDU6SXNzdWU3NTY4MTgyNTA=,1127,Make the custom SQL query text box larger or resizable,596279,zaneselvans,closed,0,,,,,1,2020-12-04T05:37:11Z,2021-06-02T04:29:06Z,2021-06-02T04:28:55Z,NONE,,"The text entry field for custom SQL queries is too small to display a moderately complex query, especially when it's been formatted. Would it be easy to make the textbox resizable by the user rather than having a fixed height?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1127/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 756867924,MDExOlB1bGxSZXF1ZXN0NTMyMzQyMDI1,1128,Fix startup error on windows,3243482,abdusco,closed,0,,,,,2,2020-12-04T07:12:26Z,2020-12-06T08:41:45Z,2020-12-05T19:35:04Z,CONTRIBUTOR,simonw/datasette/pulls/1128,"Fixes https://github.com/simonw/datasette/issues/1094 This import isn't used at all, and causes error on startup on Windows.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1128/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274662378,MDU6SXNzdWUyNzQ2NjIzNzg=,113,Fix the   bug on the database custom SQL query view,9599,simonw,closed,0,,,2919870,Foreign key edition,0,2017-11-16T21:01:26Z,2017-11-17T15:40:52Z,2017-11-17T15:40:52Z,OWNER,,"https://sf-film-locations.now.sh/sf-film-locations-57704b7?sql=select+*+from+Film_Locations_in_San_Francisco This is the bug I fixed in 01e0c3fa18cd0dd7970e208790ffd683a420c924 - but I only fixed it in one place.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/113/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 757481949,MDU6SXNzdWU3NTc0ODE5NDk=,1131,"""datasette inspect"" outputs invalid JSON if an error is logged",9599,simonw,closed,0,,,,,3,2020-12-05T00:00:45Z,2020-12-05T20:48:34Z,2020-12-05T05:21:19Z,OWNER,,"See https://github.com/simonw/register-of-members-interests/issues/6: ``` % datasette inspect regmem.db ERROR: conn=, sql = 'select count(*) from [items_fts]', params = None: SQL logic error { ""regmem"": { ""hash"": ""6fde27e3dea80d6b65f2ac7f89cd8448980fee8c91b505ba29c311ba0393317f"", ""size"": 936198144, ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1131/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 758899581,MDU6SXNzdWU3NTg4OTk1ODE=,1132,New filter: array does not contain,9599,simonw,closed,0,,,,,1,2020-12-07T22:28:20Z,2020-12-07T22:50:36Z,2020-12-07T22:41:09Z,OWNER,,"I want to see all of my GitHub repos that are tagged `datasette-plugin` but are NOT tagged `datasette-io`: https://github-to-sqlite.dogsheep.net/github/repos?_facet=owner&owner=9599&_facet_array=topics&topics__arraycontains=datasette-plugin&topics__arraycontains=datasette-io#facet-topics",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1132/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 759695780,MDU6SXNzdWU3NTk2OTU3ODA=,1133,Option to omit header row in CSV export,9599,simonw,closed,0,,,,,2,2020-12-08T18:54:46Z,2021-06-17T18:12:31Z,2020-12-10T23:28:51Z,OWNER,,`?_header=off` - for symmetry with existing option `?_nl=on`.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1133/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 760312579,MDU6SXNzdWU3NjAzMTI1Nzk=,1134,"""_searchmode=raw"" throws an index out of range error when combined with ""_search_COLUMN""",2181410,clausjuhl,closed,0,,,,,4,2020-12-09T13:05:37Z,2020-12-10T05:57:17Z,2020-12-09T19:56:55Z,NONE,,"Hi Simon! Maybe it's just me, but when [using _searchmode=raw (trying to enable wildcard-searching) in combination with the ""_search_COLUMN""-table argument](https://byraadsarkivet.aarhus.dk/db/cases?_searchmode=raw&_search_title=sundhedsfrem*), I get a list index out of range error. [When combining with the simpler ""_search""-argument everything works, including wildcard-seaches.](https://byraadsarkivet.aarhus.dk/db/cases?_search=sundhedsfrem*&_searchmode=raw). Here's the traceback: ``` Traceback (most recent call last): File ""/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 122, in route_path return await view(new_scope, receive, send) File ""/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 196, in view request, **scope[""url_route""][""kwargs""] File ""/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/base.py"", line 204, in get request, database, hash, correct_hash_provided, **kwargs File ""/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/base.py"", line 342, in view_get request, database, hash, **kwargs File ""/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/table.py"", line 393, in data search_col = key.split(""_search_"", 1)[1] IndexError: list index out of range ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1134/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 760605882,MDU6SXNzdWU3NjA2MDU4ODI=,1135,Feature: --create option to create database file if it does not yet exist,9599,simonw,closed,0,,,,,0,2020-12-09T19:23:58Z,2021-01-24T21:19:39Z,2020-12-09T19:45:52Z,OWNER,,"I'd like to be able to tell people to run the following in the Datasette documentation to get started: brew install datasette datasette install datasette-upload-csvs datasette data.db --create --root --open This would give them a local Datasette instance with the ability to drag-and-drop CSV files directly into it. Just one catch: I don't want to have to talk them through creating an empty SQLite database file. So I want to add a new `--create` option which means ""If any of the database files passed on the command-line do not yet exist, create them"".",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1135/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 760621356,MDU6SXNzdWU3NjA2MjEzNTY=,1136,Establish pattern for release branches to support bug fixes,9599,simonw,closed,0,,,,,8,2020-12-09T19:48:18Z,2020-12-09T20:17:02Z,2020-12-09T20:14:41Z,OWNER,,"I want to fix the bug in #1134 and ship it as Datasette 0.52.5 - but the `main` branch now has a feature in it (4c25b035b2370983c8dd5e0c8762e9154e379774 added `arraynotcontains`, #1132). I'm not ready for a feature release, so instead I want to release 0.52.5 with just that bug fix. This is the first time I will have shipped a release from a branch. I need to establish that pattern and add it to the documentation in https://docs.datasette.io/en/stable/contributing.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1136/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 761706858,MDU6SXNzdWU3NjE3MDY4NTg=,1137,Update README to reflect new datasette.io site,9599,simonw,closed,0,,,,,0,2020-12-10T23:22:06Z,2020-12-10T23:28:50Z,2020-12-10T23:28:50Z,OWNER,,Can finally close #659.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1137/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 761713079,MDU6SXNzdWU3NjE3MTMwNzk=,1138,"""Powered by Datasette"" should link to new datasette.io site",9599,simonw,closed,0,,,,,0,2020-12-10T23:33:41Z,2020-12-15T02:28:10Z,2020-12-10T23:37:14Z,OWNER,,https://datasette.io/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1138/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274733145,MDExOlB1bGxSZXF1ZXN0MTUzMjAxOTQ1,114,"Add spatialite, switch to debian and local build",54999,ingenieroariel,closed,0,,,,,1,2017-11-17T02:37:09Z,2017-11-17T03:50:52Z,2017-11-17T03:50:52Z,CONTRIBUTOR,simonw/datasette/pulls/114,"Improves the Dockerfile to support spatial datasets, work with the local datasette code (Friendly with git tags and Dockerhub) and moves to slim debian, a small image easy to extend via apt packages for sqlite.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 763207948,MDU6SXNzdWU3NjMyMDc5NDg=,1141,Default styling for bullet point lists,9599,simonw,closed,0,,,,,0,2020-12-12T02:49:33Z,2021-03-29T00:14:05Z,2021-03-29T00:14:05Z,OWNER,,"I just noticed that https://datasette.io/content/recent_releases (which uses `datasette-render-markdown`) is missing its bullet points: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1141/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 766494367,MDExOlB1bGxSZXF1ZXN0NTM5NDg5NTI1,1145,"Update pytest requirement from <6.2.0,>=5.2.2 to >=5.2.2,<6.3.0",27856297,dependabot-preview[bot],closed,0,,,6346396,Datasette 0.54,1,2020-12-14T14:22:16Z,2021-01-24T21:20:29Z,2020-12-16T21:44:39Z,CONTRIBUTOR,simonw/datasette/pulls/1145,"Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version.
    Release notes

    Sourced from pytest's releases.

    6.2.0

    pytest 6.2.0 (2020-12-12)

    Breaking Changes

    • #7808: pytest now supports python3.6+ only.

    Deprecations

    • #7469: Directly constructing/calling the following classes/functions is now deprecated:

      • _pytest.cacheprovider.Cache
      • _pytest.cacheprovider.Cache.for_config()
      • _pytest.cacheprovider.Cache.clear_cache()
      • _pytest.cacheprovider.Cache.cache_dir_from_config()
      • _pytest.capture.CaptureFixture
      • _pytest.fixtures.FixtureRequest
      • _pytest.fixtures.SubRequest
      • _pytest.logging.LogCaptureFixture
      • _pytest.pytester.Pytester
      • _pytest.pytester.Testdir
      • _pytest.recwarn.WarningsRecorder
      • _pytest.recwarn.WarningsChecker
      • _pytest.tmpdir.TempPathFactory
      • _pytest.tmpdir.TempdirFactory

      These have always been considered private, but now issue a deprecation warning, which may become a hard error in pytest 7.0.0.

    • #7530: The --strict command-line option has been deprecated, use --strict-markers instead.

      We have plans to maybe in the future to reintroduce --strict and make it an encompassing flag for all strictness related options (--strict-markers and --strict-config at the moment, more might be introduced in the future).

    • #7988: The @pytest.yield_fixture decorator/function is now deprecated. Use pytest.fixture instead.

      yield_fixture has been an alias for fixture for a very long time, so can be search/replaced safely.

    Features

    • #5299: pytest now warns about unraisable exceptions and unhandled thread exceptions that occur in tests on Python>=3.8. See unraisable for more information.

    • #7425: New pytester fixture, which is identical to testdir but its methods return pathlib.Path when appropriate instead of py.path.local.

      This is part of the movement to use pathlib.Path objects internally, in order to remove the dependency to py in the future.

      Internally, the old Testdir <_pytest.pytester.Testdir> is now a thin wrapper around Pytester <_pytest.pytester.Pytester>, preserving the old interface.

    Changelog

    Sourced from pytest's changelog.

    Commits
    • e7073af Prepare release version 6.2.0
    • 683f29f Merge pull request #8129 from bluetech/docs-pygments-workaround
    • 0feeddf doc: temporary workaround for pytest-pygments lexing error
    • b478275 Merge pull request #8128 from bluetech/skip-reason-empty
    • 3302ff9 terminal: when the skip/xfail is empty, don't show it as "()"
    • 59bd0f6 Merge pull request #8126 from bluetech/tox-regen-pretend-scm2
    • 6298ff1 tox: use pip legacy resolver for regen job
    • d51ecbd Merge pull request #8125 from bluetech/tox-rm-pip-req
    • f237b07 tox: remove requires: pip>=20.3.1
    • 95e0e19 Merge pull request #8124 from bluetech/s0undt3ch-feature/skip-context-hook
    • Additional commits viewable in compare view

    Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) ---
    Dependabot commands and options
    You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself) - `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language - `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language - `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language - `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language - `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com): - Update frequency (including time of day and day of week) - Pull request limits (per update run and/or open at any time) - Out-of-range updates (receive only lockfile updates, if desired) - Security updates (receive only security updates, if desired)
    ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1145/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 767561886,MDU6SXNzdWU3Njc1NjE4ODY=,1148,Syntax error with + symbol when deployed to Vercel,765871,kaihendry,closed,0,,,,,2,2020-12-15T12:53:12Z,2020-12-16T21:57:42Z,2020-12-16T21:51:54Z,NONE,,"Works locally: ``` (ins)[hendry@t14s aws-partners]$ sqlite3 partners.db < test.sql 5|{""href"":""https://partners.amazonaws.com/partners/001E000000NaBI0IAN"",""label"":""Slalom""} 6|{""href"":""https://partners.amazonaws.com/partners/001E000000VHBQIIA5"",""label"":""Accenture""} 7|{""href"":""https://partners.amazonaws.com/partners/001E000000Rp588IAB"",""label"":""Druva""} 8|{""href"":""https://partners.amazonaws.com/partners/001E0000013FeQXIA0"",""label"":""Palo Alto Networks""} 9|{""href"":""https://partners.amazonaws.com/partners/001E000000Rl12lIAB"",""label"":""New Relic""} 10|{""href"":""https://partners.amazonaws.com/partners/001E000000NaBHWIA3"",""label"":""Deloitte""} 11|{""href"":""https://partners.amazonaws.com/partners/001E000000Rp5GDIAZ"",""label"":""MegazoneCloud""} 12|{""href"":""https://partners.amazonaws.com/partners/001E000000NaBHMIA3"",""label"":""iret, Inc.""} (ins)[hendry@t14s aws-partners]$ cat test.sql select A.launch_rank, A.partner_info from summary A INNER JOIN summary B ON A.launch_rank>=B.launch_rank-3 AND A.launch_rank<=B.launch_rank+4 WHERE B.""partner_info"" LIKE '%Palo Alto%' ``` Copy the SQL into https://aws-partners-singapore.vercel.app/partners and get syntax error: ![image](https://user-images.githubusercontent.com/765871/102217393-78e4f280-3f17-11eb-9e13-ca79ed62ec34.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1148/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274877366,MDExOlB1bGxSZXF1ZXN0MTUzMzA2ODgy,115,Add keyboard shortcut to execute SQL query,198537,rgieseke,closed,0,,,,,1,2017-11-17T14:13:33Z,2017-11-17T15:16:34Z,2017-11-17T14:22:56Z,CONTRIBUTOR,simonw/datasette/pulls/115,"Very cool tool, thanks a lot! This PR adds a `Shift-Enter` short cut to execute the SQL query. I used CodeMirrors keyboard handling.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/115/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 770436876,MDU6SXNzdWU3NzA0MzY4NzY=,1150,Maintain an in-memory SQLite table of connected databases and their tables,9599,simonw,closed,0,,,3268330,Datasette 1.0,32,2020-12-17T23:02:13Z,2020-12-27T14:51:39Z,2020-12-18T22:34:12Z,OWNER,,"I want Datasette to have its own internal metadata about connected tables, to power features like a paginated searchable homepage in #461. I want this to be a SQLite table. This could also be part of the directory scanning mechanism prototyped in #672 - where Datasette can be set to continually scan a directory for new database files that it can serve. Also relevant to the Datasette Library concept in #417.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1150/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 770448622,MDU6SXNzdWU3NzA0NDg2MjI=,1151,Database class mechanism for cross-connection in-memory databases,9599,simonw,closed,0,,,6346396,Datasette 0.54,11,2020-12-17T23:25:43Z,2021-01-26T19:07:44Z,2020-12-18T01:01:26Z,OWNER,,"> Next challenge: figure out how to use the `Database` class from https://github.com/simonw/datasette/blob/0.53/datasette/database.py for an in-memory database which persists data for the duration of the lifetime of the server, and allows access to that in-memory database from multiple threads in a way that lets them see each other's changes. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1150#issuecomment-747768112_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1151/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 771202454,MDU6SXNzdWU3NzEyMDI0NTQ=,1153,"Use YAML examples in documentation by default, not JSON",9599,simonw,closed,0,,,,,22,2020-12-18T22:20:15Z,2023-07-08T20:09:48Z,2023-07-08T20:08:13Z,OWNER,,"YAML configuration is much better for multi-line strings, and I'm increasingly adding configuration options to Datasette that benefit from that - fragments of HTML in `description_html` or SQL queries used to configure things like https://github.com/simonw/datasette-atom for example. Rather than confusing things by showing both in the documentation, I should switch all of the default examples to use YAML instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1153/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 771208009,MDU6SXNzdWU3NzEyMDgwMDk=,1154,Documentation for new _internal database and tables,9599,simonw,closed,0,,,6346396,Datasette 0.54,2,2020-12-18T22:34:52Z,2021-01-25T00:09:22Z,2021-01-25T00:08:41Z,OWNER,,"> Needs documentation, but I can wait to write that until I've tested out the feature a bit more. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1150#issuecomment-748352106_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1154/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 771216293,MDU6SXNzdWU3NzEyMTYyOTM=,1155,Better internal database_name for _internal database,9599,simonw,closed,0,,,,,9,2020-12-18T22:47:27Z,2020-12-22T20:04:35Z,2020-12-22T20:04:35Z,OWNER,,"https://latest.datasette.io/login-as-root then https://latest.datasette.io/_internal That `database_name` is ugly. It should just be `_internal`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1155/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 772408750,MDU6SXNzdWU3NzI0MDg3NTA=,1156,Rename _schemas to _internal,9599,simonw,closed,0,,,6346396,Datasette 0.54,1,2020-12-21T19:27:58Z,2021-01-24T21:20:39Z,2020-12-21T19:51:18Z,OWNER,,"I like `_internal` as the name for the in-memory Datasette schema database better. Refs #1154 #1150 #1155",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1156/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 772438273,MDU6SXNzdWU3NzI0MzgyNzM=,1157,Use time.perf_counter() instead of time.time() to measure performance,9599,simonw,closed,0,,,6346396,Datasette 0.54,1,2020-12-21T20:21:41Z,2021-01-24T21:20:42Z,2020-12-21T21:49:20Z,OWNER,,"I do that in a bunch of places: https://ripgrep.datasette.io/-/ripgrep?pattern=time%28%29&literal=on&glob=datasette%2F%2A%2A https://docs.python.org/3/library/time.html#time.perf_counter > `time.``perf_counter`() → float > > Return the value (in fractional seconds) of a performance counter, i.e. a clock with the highest available resolution to measure a short duration. It does include time elapsed during sleep and is system-wide. The reference point of the returned value is undefined, so that only the difference between the results of consecutive calls is valid. > > _New in version 3.3._",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1157/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 773913793,MDExOlB1bGxSZXF1ZXN0NTQ0OTIzNDM3,1158,Modernize code to Python 3.6+,6774676,eumiro,closed,0,,,6346396,Datasette 0.54,4,2020-12-23T16:21:38Z,2021-01-24T21:20:50Z,2020-12-23T17:04:32Z,CONTRIBUTOR,simonw/datasette/pulls/1158,"- compact dict and set building - remove redundant parentheses - simplify chained conditions - change method name to lowercase - use triple double quotes for docstrings please feel free to accept/reject any of these independent commits",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1158/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274884209,MDU6SXNzdWUyNzQ4ODQyMDk=,116,Add documentation section about SQLite extensions,9599,simonw,closed,0,,,,,1,2017-11-17T14:36:30Z,2018-05-28T17:23:42Z,2018-05-28T17:23:41Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/116/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 777145954,MDU6SXNzdWU3NzcxNDU5NTQ=,1167,Add Prettier to contributing documentation,9599,simonw,closed,0,,,6346396,Datasette 0.54,3,2020-12-31T22:00:55Z,2021-01-25T02:01:19Z,2021-01-25T01:58:28Z,OWNER,,"Following #1166 - the docs at https://docs.datasette.io/en/stable/contributing.html should include a section about JavaScript, and it should document how to run Prettier. I run it in VS Code but it can be run on the command-line too: npx prettier 'datasette/static/*[!.min].js' --write ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1167/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 777677671,MDU6SXNzdWU3Nzc2Nzc2NzE=,1169,Prettier package not actually being cached,3637,benpickles,closed,0,,,,,4,2021-01-03T17:04:41Z,2021-01-04T19:52:34Z,2021-01-04T19:52:33Z,CONTRIBUTOR,,"With the current configuration Prettier seems to be installed on every run - which can been [seen from the output](https://github.com/simonw/datasette/runs/1631686028?check_suite_focus=true#step:4:4): ``` npx: installed 1 in 5.166s ``` Prettier isn't explicitly being installed (it's surprising that actually installing the dependencies isn't included in the [actions/cache docs](https://github.com/actions/cache/blob/main/examples.md#macos-and-ubuntu)) but it turns out that `npx` will automatically install the package for the specified command (it actually _guesses_ the package name from the name of the command). I'm not sure where Prettier ends up being installed but it doesn't appear to be in `~/.npm` according to the [post-cache output](https://github.com/simonw/datasette/runs/1631686028#step:7:2) (or `./node_modules` when I tested locally): ``` Cache hit occurred on the primary key Linux-npm-565329898f77080e58b14d45cf816ab94877e6f2ece9d395c369c533548a7ee7, not saving cache. ``` I think there are a couple of approaches to tackling this, you could manually install/cache Prettier within the action, or add a `package.json` with Prettier. I would go with the latter because it's a more standard and maintainable approach and it will also ensure that, along with CI, anyone working on the project will run the same version of Prettier (you'll also get Dependabot JavaScript updates). I've tested the [`package.json` approach on a branch](https://github.com/simonw/datasette/compare/main...benpickles:cache-prettier) and am happy to turn it into a pull request if you fancy. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1169/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274900388,MDExOlB1bGxSZXF1ZXN0MTUzMzI0MzAx,117,Don't prevent tabbing to `Run SQL` button,198537,rgieseke,closed,0,,,,,1,2017-11-17T15:27:50Z,2017-11-19T20:30:24Z,2017-11-18T00:53:43Z,CONTRIBUTOR,simonw/datasette/pulls/117,"Mentioned in #115 Here you go!",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/117/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 778126516,MDExOlB1bGxSZXF1ZXN0NTQ4MjcxNDcy,1170,Install Prettier via package.json,3637,benpickles,closed,0,,,6346396,Datasette 0.54,3,2021-01-04T14:18:03Z,2021-01-24T21:21:01Z,2021-01-04T19:52:34Z,CONTRIBUTOR,simonw/datasette/pulls/1170,This adds a package.json with Prettier and means that developers/CI will use the same version. It also ensures that NPM packages are cached on GitHub Actions which fixes #1169.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1170/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 779691739,MDU6SXNzdWU3Nzk2OTE3Mzk=,1176,"Policy on documenting ""public"" datasette.utils functions",9599,simonw,closed,0,,,3268330,Datasette 1.0,13,2021-01-05T22:55:25Z,2022-02-07T06:43:32Z,2022-02-07T06:42:58Z,OWNER,,"https://github.com/simonw/datasette-css-properties starts [like this](https://github.com/simonw/datasette-css-properties/blob/0.1/datasette_css_properties/__init__.py#L1-L3): ```python from datasette import hookimpl from datasette.utils.asgi import Response from datasette.utils import escape_css_string, to_css_class ``` `escape_css_string` and `to_css_class` are not documented, which means relying on them is risky since there's no promise that they won't change. Would be good to figure out a policy on this, and maybe promote some of them to ""documented"" status.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1176/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 780267857,MDU6SXNzdWU3ODAyNjc4NTc=,1178,Use force_https_urls on when deploying with Cloud Run,9599,simonw,closed,0,,,6346396,Datasette 0.54,9,2021-01-06T08:20:55Z,2021-01-24T21:21:05Z,2021-01-06T18:24:47Z,OWNER,,"_Original title: datasette.absolute_url() should return https:// not http:// on Cloud Run_ https://latest-with-plugins.datasette.io/github/issue_comments.Notebook?_labels=on currently provides `http://` links, which break in Observable since it won't load `http://` content.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1178/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275048699,MDExOlB1bGxSZXF1ZXN0MTUzNDMyMDQ1,118,Foreign key information on row and table pages,9599,simonw,closed,0,,,,,0,2017-11-18T03:13:27Z,2017-11-18T03:15:57Z,2017-11-18T03:15:50Z,OWNER,simonw/datasette/pulls/118,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/118/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 781262510,MDU6SXNzdWU3ODEyNjI1MTA=,1181,"Certain database names results in 404: ""Database not found: None""",1470389,jieter,closed,0,,,6346396,Datasette 0.54,4,2021-01-07T12:01:16Z,2021-12-21T18:25:15Z,2021-01-25T05:13:19Z,NONE,,"I have a file named `test-database (1).sqlite`. When requesting the home route `/`, I see datasette is able to read it correctly: However, if I click any of the links, datasette replies with: `Error 404 Database not found: None` It seems the hash is crucial, as renaming the file to `database (1).sqlite` makes the error go away. This lines checks for a single dash: https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/views/base.py#L184 ``` $ datasette test-database\ \(1\).sqlite INFO: Started server process [68314] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) INFO: 127.0.0.1:54043 - ""GET /favicon.ico HTTP/1.1"" 200 OK INFO: 127.0.0.1:54043 - ""GET / HTTP/1.1"" 200 OK ... INFO: 127.0.0.1:54044 - ""GET /favicon.ico HTTP/1.1"" 200 OK INFO: 127.0.0.1:54044 - ""GET /test-database (1) HTTP/1.1"" 404 Not Found ``` Version: ``` $ datasette --version datasette, version 0.53 ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1181/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 782692159,MDU6SXNzdWU3ODI2OTIxNTk=,1182,"Retire ""Ecosystem"" page in favour of datasette.io/plugins and /tools",9599,simonw,closed,0,,,6346396,Datasette 0.54,3,2021-01-09T21:54:47Z,2021-01-24T21:21:09Z,2021-01-09T22:17:28Z,OWNER,,https://docs.datasette.io/en/stable/ecosystem.html is no longer needed.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1182/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 783714076,MDU6SXNzdWU3ODM3MTQwNzY=,1184,request.full_path property,9599,simonw,closed,0,,,6346396,Datasette 0.54,0,2021-01-11T21:21:58Z,2021-01-24T21:21:16Z,2021-01-11T21:34:47Z,OWNER,,"> I'll also add `request.full_path` for consistency with these: https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/utils/asgi.py#L77-L90 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1179#issuecomment-755495387_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1184/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 784628163,MDU6SXNzdWU3ODQ2MjgxNjM=,1185,"""Statement may not contain PRAGMA"" error is not strictly true",9599,simonw,closed,0,,,6346396,Datasette 0.54,3,2021-01-12T22:07:10Z,2021-01-24T21:21:37Z,2021-01-12T22:26:26Z,OWNER,,"Consider https://latest.datasette.io/fixtures?sql=select+%27select%0D%0A%27+%7C%7C+group_concat%28%27++++case+when+%5B%27+%7C%7C+name+%7C%7C+%27%5D+is+not+null+then+%27+%7C%7C+quote%28name+%7C%7C+%27%2C+%27%29+%7C%7C+%27+else+%27%27%27%27+end%27%2C+%27+%7C%7C%0D%0A%27%29+%7C%7C+%27%0D%0A++as+columns%2C%0D%0A++count%28*%29+as+num_rows%0D%0Afrom%0D%0A++%5B%27+%7C%7C+%3Atable+%7C%7C+%27%5D%0D%0Agroup+by%0D%0A++columns%0D%0Aorder+by%0D%0A++num_rows+desc%27+as+query+from+pragma_ytable_info%28%3Atable%29&table=facetable It says ""Statement may not contain PRAGMA"" - but that's not actually true. Datasette has an allow-list of PRAGMA that are OK - in this case there was a typo in `pragma_ytable_info` which caused the error, but pragma_table_info` would have been OK. So the error message is misleading.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1185/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 785573793,MDU6SXNzdWU3ODU1NzM3OTM=,1186,"script type=""module"" support",9599,simonw,closed,0,,,6346396,Datasette 0.54,1,2021-01-14T01:17:47Z,2021-01-24T21:21:41Z,2021-01-14T01:50:58Z,OWNER,,"Custom JavaScript can be loaded in `metadata.json` like this: ```json { ""extra_js_urls"": [ { ""url"": ""https://code.jquery.com/jquery-3.2.1.slim.min.js"", ""sri"": ""sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g="" } ] } ``` Add a `""module"": true` option which causes the resulting script element to use `
     1