issues
3,044 rows sorted by number
created_at (date) (more than 30 distinct values, top 30 shown)
- 2017-10-23 24
- 2017-11-15 13
- 2017-11-19 13
- 2023-07-26 13
- 2017-11-13 12
- 2017-11-14 12
- 2017-11-11 11
- 2018-04-16 11
- 2023-03-09 11
- 2017-10-24 10
- 2020-10-30 10
- 2020-11-24 10
- 2022-08-27 10
- 2022-10-27 10
- 2023-05-21 10
- 2023-07-22 10
- 2018-05-16 9
- 2019-10-11 9
- 2020-06-09 9
- 2020-09-22 9
- 2020-10-11 9
- 2022-01-09 9
- 2022-02-02 9
- 2019-05-11 8
- 2019-05-19 8
- 2020-08-28 8
- 2020-09-17 8
- 2020-10-09 8
- 2022-11-30 8
- 2019-02-24 7
- …
closed_at (date) (more than 30 distinct values, top 30 shown)
- 2023-05-08 16
- 2017-11-13 15
- 2019-06-24 15
- 2018-05-28 13
- 2020-05-04 13
- 2021-10-13 13
- 2023-07-22 13
- 2019-05-03 12
- 2020-06-09 12
- 2020-09-15 12
- 2020-10-31 12
- 2017-10-24 11
- 2017-12-07 11
- 2020-05-02 11
- 2021-02-14 11
- 2020-05-28 10
- 2021-01-04 10
- 2021-03-29 10
- 2021-05-19 10
- 2021-06-02 10
- 2022-01-11 10
- 2022-03-21 10
- 2022-08-27 10
- 2023-05-21 10
- 2017-11-11 9
- 2017-11-15 9
- 2018-07-10 9
- 2019-02-24 9
- 2019-05-16 9
- 2019-05-19 9
- …
repo 16
- datasette 2,098
- sqlite-utils 604
- github-to-sqlite 79
- twitter-to-sqlite 73
- dogsheep-photos 40
- dogsheep-beta 38
- healthkit-to-sqlite 24
- evernote-to-sqlite 16
- swarm-to-sqlite 15
- apple-notes-to-sqlite 14
- google-takeout-to-sqlite 13
- pocket-to-sqlite 12
- dogsheep.github.io 8
- hacker-news-to-sqlite 6
- inaturalist-to-sqlite 2
- genome-to-sqlite 2
id | node_id | number ▼ | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | pull_request | body | repo | type | active_lock_reason | performed_via_github_app | reactions | draft | state_reason |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
267513424 | MDU6SXNzdWUyNjc1MTM0MjQ= | 1 | Addressable pages for every row in a table | simonw 9599 | closed | 0 | Ship first public release 2857392 | 6 | 2017-10-23T00:44:16Z | 2017-10-24T14:11:04Z | 2017-10-24T14:11:03Z | OWNER |
Tricky part will be figuring out what the primary key is - especially since it could be a compound primary key and it might involve different data types. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
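For illustration, a minimal sketch of one way compound primary keys could be encoded into row URLs (the comma-joined convention and percent-escaping are assumptions for the example, not the design the issue settled on):

```python
from urllib.parse import quote

def row_path(database, table, pk_values):
    # Assumed scheme: escape each primary key component, then join compound
    # keys with commas, so different data types all flatten to one string.
    encoded = ",".join(quote(str(v), safe="") for v in pk_values)
    return f"/{database}/{table}/{encoded}"

print(row_path("fixtures", "compound_pk", ["a/b", 2]))
# /fixtures/compound_pk/a%2Fb,2
```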
347058326 | MDExOlB1bGxSZXF1ZXN0MjA1NzcwOTk2 | 1 | Make .indexes compatible with older SQLite versions | simonw 9599 | closed | 0 | 0 | 2018-08-02T15:17:05Z | 2018-08-02T15:17:30Z | 2018-08-02T15:17:30Z | OWNER | simonw/sqlite-utils/pulls/1 | Older SQLite versions return a different set of columns from the PRAGMA we are using. |
sqlite-utils 140912432 | pull | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | |||||
470637068 | MDU6SXNzdWU0NzA2MzcwNjg= | 1 | Use XML Analyser to figure out the structure of the export XML | simonw 9599 | closed | 0 | 1 | 2019-07-20T05:19:02Z | 2019-07-20T05:20:09Z | 2019-07-20T05:20:09Z | MEMBER | healthkit-to-sqlite 197882382 | issue | { "url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||||
487598042 | MDU6SXNzdWU0ODc1OTgwNDI= | 1 | Implement code to pull checkins from the Foursquare API | simonw 9599 | closed | 0 | 0 | 2019-08-30T17:40:02Z | 2019-08-30T18:23:24Z | 2019-08-30T18:23:24Z | MEMBER | The tool currently only works with a pre-prepared JSON file of checkins. When called without options, it should prompt the user to paste in a Foursquare OAuth token. The |
swarm-to-sqlite 205429375 | issue | { "url": "https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
488833136 | MDU6SXNzdWU0ODg4MzMxMzY= | 1 | Imported followers should go in "users", relationships in "following" | simonw 9599 | closed | 0 | 0 | 2019-09-03T21:27:37Z | 2019-09-04T20:23:04Z | 2019-09-04T20:23:04Z | MEMBER | Right now It should instead save them all in a global |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
493599818 | MDU6SXNzdWU0OTM1OTk4MTg= | 1 | Command for fetching starred repos | simonw 9599 | closed | 0 | 0 | 2019-09-14T08:36:29Z | 2019-09-14T21:30:48Z | 2019-09-14T21:30:48Z | MEMBER | github-to-sqlite 207052882 | issue | { "url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||||
496415321 | MDU6SXNzdWU0OTY0MTUzMjE= | 1 | Figure out some interesting example SQL queries | simonw 9599 | open | 0 | 9 | 2019-09-20T15:28:07Z | 2021-05-03T03:46:23Z | MEMBER | My knowledge of genetics has left me short here. I'd love to be able to provide some interesting example SELECT queries - maybe one that spots if you are likely to have red hair? |
genome-to-sqlite 209590345 | issue | { "url": "https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
503233021 | MDU6SXNzdWU1MDMyMzMwMjE= | 1 | Use better pagination (and implement progress bar) | simonw 9599 | closed | 0 | 4 | 2019-10-07T04:58:11Z | 2020-03-27T22:13:57Z | 2020-03-27T22:13:57Z | MEMBER | Right now we attempt to load everything at once - which caps out at 5,000 items and is really slow. We can do better by implementing pagination using count and offset. |
pocket-to-sqlite 213286752 | issue | { "url": "https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
504720731 | MDU6SXNzdWU1MDQ3MjA3MzE= | 1 | Add more details on how to request data from google takeout correctly. | dazzag24 1055831 | open | 0 | 0 | 2019-10-09T15:17:34Z | 2019-10-09T15:17:34Z | NONE | The default is to download everything. This can result in an enormous amount of data when you only really need 2 types of data for now:
In addition, unless you specify that "My Activity" should be downloaded in JSON format, the default is HTML. This then causes the
command to fail, as it only contains HTML files, not JSON files. Thanks |
google-takeout-to-sqlite 206649770 | issue | { "url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
519979091 | MDExOlB1bGxSZXF1ZXN0MzM4NjQ3Mzc4 | 1 | Add parkrun-to-sqlite | mrw34 1101318 | closed | 0 | 0 | 2019-11-08T12:05:32Z | 2020-10-12T00:35:16Z | 2020-10-12T00:35:16Z | CONTRIBUTOR | dogsheep/dogsheep.github.io/pulls/1 | dogsheep.github.io 214746582 | pull | { "url": "https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | ||||||
585526292 | MDU6SXNzdWU1ODU1MjYyOTI= | 1 | Set up full text search | simonw 9599 | closed | 0 | 1 | 2020-03-21T15:57:35Z | 2020-03-21T19:47:46Z | 2020-03-21T19:45:52Z | MEMBER | Should run against |
hacker-news-to-sqlite 248903544 | issue | { "url": "https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
602533300 | MDU6SXNzdWU2MDI1MzMzMDA= | 1 | Import photo metadata from Apple Photos into SQLite | simonw 9599 | open | 0 | Apple Photos online and securely browsable 5324096 | 8 | 2020-04-18T19:23:26Z | 2020-05-04T02:41:40Z | MEMBER | Faces, albums, locations, that kind of thing. |
dogsheep-photos 256834907 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
|||||||
689800307 | MDU6SXNzdWU2ODk4MDAzMDc= | 1 | Add an index on the timestamp column | simonw 9599 | closed | 0 | 0 | 2020-09-01T04:33:37Z | 2020-09-01T04:49:23Z | 2020-09-01T04:49:23Z | MEMBER | Since the default view will likely be ordered by timestamp descending. |
dogsheep-beta 197431109 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep-beta/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
718934942 | MDU6SXNzdWU3MTg5MzQ5NDI= | 1 | Documentation on how to use this with Datasette | simonw 9599 | open | 0 | 1 | 2020-10-11T21:56:27Z | 2020-10-11T22:14:00Z | MEMBER | In particular how to use |
evernote-to-sqlite 303218369 | issue | { "url": "https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1353411865 | I_kwDODEpn8M5Qq20Z | 1 | Problem with my user | fernand0 2467 | open | 0 | 0 | 2022-08-28T16:59:37Z | 2022-08-28T16:59:37Z | NONE | If I call the program with:
inaturalist-to-sqlite inaturalist.db ftricas
the program exits with an error:
Additional info, the command:
shows that the correct name can be 'taxons'. There is another small problem with a warning: warnings.warn("urllib3 ({}) or chardet ({})/charset_normalizer ({}) doesn't match a supported " |
inaturalist-to-sqlite 206202864 | issue | { "url": "https://api.github.com/repos/dogsheep/inaturalist-to-sqlite/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1616347574 | I_kwDOJHON9s5gV4G2 | 1 | Initial proof of concept with ChatGPT | simonw 9599 | closed | 0 | 3 | 2023-03-09T03:44:39Z | 2023-03-09T03:51:55Z | 2023-03-09T03:51:55Z | MEMBER | I'm using ChatGPT to figure out enough AppleScript to get at my notes data. |
apple-notes-to-sqlite 611552758 | issue | { "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
267513523 | MDU6SXNzdWUyNjc1MTM1MjM= | 2 | Initial proof-of-concept | simonw 9599 | closed | 0 | Ship first public release 2857392 | 0 | 2017-10-23T00:45:37Z | 2017-10-23T01:26:39Z | 2017-10-23T00:45:53Z | OWNER | datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
349850687 | MDU6SXNzdWUzNDk4NTA2ODc= | 2 | Mechanism for adding foreign keys to an existing table | simonw 9599 | closed | 0 | 1 | 2018-08-12T22:50:56Z | 2019-02-24T21:34:41Z | 2019-02-24T21:34:41Z | OWNER | SQLite does not have ALTER TABLE support for adding new foreign keys... but it turns out it's possible to make these changes without having to duplicate the entire table by carefully running Here's how Django does it: https://github.com/django/django/blob/d3449faaa915a08c275b35de01e66a7ef6bdb2dc/django/db/backends/sqlite3/schema.py#L103-L125 And here's the official documentation about this: https://sqlite.org/lang_altertable.html#otheralter (scroll to the very bottom of the page) |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
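A minimal sketch of the sqlite_master-rewriting trick the issue describes, with illustrative table names; the PRAGMA dance and closing VACUUM follow the linked SQLite documentation, not necessarily the implementation that shipped in sqlite-utils:

```python
import sqlite3

conn = sqlite3.connect("books.db")
conn.executescript("""
CREATE TABLE IF NOT EXISTS authors (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE IF NOT EXISTS books (id INTEGER PRIMARY KEY, title TEXT, author_id INTEGER);
""")
# Rewrite the stored CREATE TABLE statement in place instead of copying
# the whole table - the "other ALTER TABLE operations" recipe.
new_sql = (
    "CREATE TABLE books (id INTEGER PRIMARY KEY, title TEXT, "
    "author_id INTEGER REFERENCES authors(id))"
)
conn.isolation_level = None  # autocommit, so each statement applies immediately
conn.execute("PRAGMA writable_schema = 1")
conn.execute(
    "UPDATE sqlite_master SET sql = ? WHERE type = 'table' AND name = 'books'",
    (new_sql,),
)
conn.execute("PRAGMA writable_schema = 0")
conn.execute("VACUUM")  # rebuilds the file, forcing SQLite to re-parse the schema
```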
470637152 | MDU6SXNzdWU0NzA2MzcxNTI= | 2 | Import workouts | simonw 9599 | closed | 0 | 1 | 2019-07-20T05:20:21Z | 2019-07-20T06:21:41Z | 2019-07-20T06:21:41Z | MEMBER | From #1 |
healthkit-to-sqlite 197882382 | issue | { "url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
487598468 | MDU6SXNzdWU0ODc1OTg0Njg= | 2 | --save option to dump checkins to a JSON file on disk | simonw 9599 | closed | 0 | 1 | 2019-08-30T17:41:06Z | 2019-08-31T02:40:21Z | 2019-08-31T02:40:21Z | MEMBER | This is a complement to the (I'll rename |
swarm-to-sqlite 205429375 | issue | { "url": "https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
488833698 | MDU6SXNzdWU0ODg4MzM2OTg= | 2 | "twitter-to-sqlite user-timeline" command for pulling tweets by a specific user | simonw 9599 | closed | 0 | 3 | 2019-09-03T21:29:12Z | 2019-09-04T20:02:11Z | 2019-09-04T20:02:11Z | MEMBER | Twitter only allows up to 3,200 tweets to be retrieved from https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-user_timeline.html I'm going to do:
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
493668862 | MDU6SXNzdWU0OTM2Njg4NjI= | 2 | Extract licenses from repos into a separate table | simonw 9599 | closed | 0 | 0 | 2019-09-14T21:33:41Z | 2019-09-14T21:46:58Z | 2019-09-14T21:46:58Z | MEMBER | github-to-sqlite 207052882 | issue | { "url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||||
503234169 | MDU6SXNzdWU1MDMyMzQxNjk= | 2 | Track and use the 'since' value | simonw 9599 | closed | 0 | 3 | 2019-10-07T05:02:59Z | 2020-03-27T22:22:30Z | 2020-03-27T22:22:30Z | MEMBER | Pocket says:
At the bottom of https://getpocket.com/developer/docs/v3/retrieve |
pocket-to-sqlite 213286752 | issue | { "url": "https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
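The pattern being described might look something like this sketch; the `since` request/response field is the one documented on the linked Pocket page, while the single-row bookkeeping table is an assumption for the example:

```python
import requests
import sqlite_utils

db = sqlite_utils.Database("pocket.db")

def fetch_items(consumer_key, access_token):
    # Reuse the "since" value saved by the previous run, if any.
    since = None
    if "since" in db.table_names():
        row = next(db["since"].rows, None)
        since = row["since"] if row else None
    payload = {"consumer_key": consumer_key, "access_token": access_token}
    if since:
        payload["since"] = since  # only return items changed since last fetch
    data = requests.post("https://getpocket.com/v3/get", data=payload).json()
    # Persist the new "since" for the next incremental run.
    db["since"].insert({"id": 1, "since": data["since"]}, pk="id", replace=True)
    return data["list"]
```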
541274681 | MDU6SXNzdWU1NDEyNzQ2ODE= | 2 | Add linkedin-to-sqlite | mnp 881925 | open | 0 | 0 | 2019-12-21T03:13:40Z | 2019-12-21T03:13:40Z | NONE | There is an API available: https://developer.linkedin.com/docs/rest-api#
At a minimum, I would think the contact list and messages would be of interest. |
dogsheep.github.io 214746582 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
602533352 | MDU6SXNzdWU2MDI1MzMzNTI= | 2 | Ability to convert HEIC images to JPEG | simonw 9599 | closed | 0 | Apple Photos online and securely browsable 5324096 | 1 | 2020-04-18T19:23:43Z | 2020-04-28T16:47:21Z | 2020-04-28T16:47:21Z | MEMBER | dogsheep-photos 256834907 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
664793260 | MDU6SXNzdWU2NjQ3OTMyNjA= | 2 | Yak shave | ekg 145425 | open | 0 | 0 | 2020-07-23T22:04:18Z | 2020-07-23T22:04:18Z | NONE | Just a quick note... The 23andme data is not exactly your genome, but a SNP chip of your genome. It's "some of your genotypes." Or about 0.1% of your genome. Nice work in any case! It deserves to be liberated!!!!! |
genome-to-sqlite 209590345 | issue | { "url": "https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
689809225 | MDU6SXNzdWU2ODk4MDkyMjU= | 2 | Apply porter stemming | simonw 9599 | closed | 0 | 2 | 2020-09-01T04:57:55Z | 2020-09-01T20:42:00Z | 2020-09-01T20:40:24Z | MEMBER | This can be on by default. You can turn it off for a table in the config file using |
dogsheep-beta 197431109 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep-beta/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
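For reference, a minimal sketch of what porter stemming looks like in SQLite's FTS5 (the table name is illustrative, not dogsheep-beta's actual schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# tokenize="porter" stems terms at index and query time,
# so searching "run" matches "running".
conn.execute("CREATE VIRTUAL TABLE search_index USING fts5(title, tokenize='porter')")
conn.execute("INSERT INTO search_index (title) VALUES ('running shoes')")
print(conn.execute(
    "SELECT title FROM search_index WHERE search_index MATCH 'run'"
).fetchall())
# [('running shoes',)]
```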
718938046 | MDU6SXNzdWU3MTg5MzgwNDY= | 2 | Convert dates to a better format | simonw 9599 | closed | 0 | 0 | 2020-10-11T22:12:33Z | 2020-10-11T23:15:03Z | 2020-10-11T23:15:03Z | MEMBER | evernote-to-sqlite 303218369 | issue | { "url": "https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||||
769376447 | MDU6SXNzdWU3NjkzNzY0NDc= | 2 | killed by oomkiller on large location-history | khimaros 231498 | open | 0 | 2 | 2020-12-17T00:32:24Z | 2020-12-17T00:48:32Z | NONE | Memory seems to grow unbounded and the process is OOM-killed after about 20GB of memory usage. This happens while loading a ~1GB uncompressed location history. |
google-takeout-to-sqlite 206649770 | issue | { "url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
952179830 | MDU6SXNzdWU5NTIxNzk4MzA= | 2 | Command for fetching Hacker News threads from the search API | simonw 9599 | open | 0 | 4 | 2021-07-25T02:00:45Z | 2021-07-25T03:12:57Z | MEMBER | I want to be able to fetch every item for a domain, e.g. https://news.ycombinator.com/from?site=simonwillison.net |
hacker-news-to-sqlite 248903544 | issue | { "url": "https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1485017981 | I_kwDODEpn8M5Yg5N9 | 2 | table identifications has no column named previous_observation_taxon | heaversm 520541 | open | 0 | 0 | 2022-12-08T16:47:17Z | 2022-12-08T16:47:17Z | NONE | Installed successfully with pip and ran
|
inaturalist-to-sqlite 206202864 | issue | { "url": "https://api.github.com/repos/dogsheep/inaturalist-to-sqlite/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1616354999 | I_kwDOJHON9s5gV563 | 2 | First working version | simonw 9599 | closed | 0 | 7 | 2023-03-09T03:53:00Z | 2023-03-09T05:10:22Z | 2023-03-09T05:10:22Z | MEMBER | It's going to shell out to I'm going with that option because https://appscript.sourceforge.io/status.html warns against the other potential methods:
But |
apple-notes-to-sqlite 611552758 | issue | { "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
267515678 | MDU6SXNzdWUyNjc1MTU2Nzg= | 3 | Make individual column values addressable, with smart content types | simonw 9599 | open | 0 | 1 | 2017-10-23T01:11:32Z | 2017-12-10T03:11:58Z | OWNER | Some SQLite databases embed images in columns. It would be cool if these had URLs.
The one without an explicit file extension auto-detects the correct extension. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/3/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
351845423 | MDU6SXNzdWUzNTE4NDU0MjM= | 3 | Experiment with contentless FTS tables | simonw 9599 | closed | 0 | 1 | 2018-08-18T19:31:01Z | 2019-07-22T20:58:55Z | 2019-07-22T20:58:55Z | OWNER | Could greatly reduce size of resulting database for large datasets: http://cocoamine.net/blog/2015/09/07/contentless-fts4-for-large-immutable-documents/ |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/3/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
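A small sketch of the contentless FTS4 feature from the linked article, assuming toy table names: only the index is stored, so matching rows come back as docids and the text itself must be fetched from the source table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE docs (id INTEGER PRIMARY KEY, body TEXT);
-- content="" makes the FTS table contentless: no copy of the text is kept
CREATE VIRTUAL TABLE docs_fts USING fts4(content="", body);
""")
conn.execute("INSERT INTO docs (id, body) VALUES (1, 'the quick brown fox')")
# Contentless tables require an explicit docid on insert.
conn.execute("INSERT INTO docs_fts (docid, body) VALUES (1, 'the quick brown fox')")
for (docid,) in conn.execute("SELECT docid FROM docs_fts WHERE body MATCH 'fox'"):
    print(conn.execute("SELECT body FROM docs WHERE id = ?", (docid,)).fetchone())
```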
470637206 | MDU6SXNzdWU0NzA2MzcyMDY= | 3 | Import ActivitySummary | simonw 9599 | closed | 0 | 0 | 2019-07-20T05:21:00Z | 2019-07-20T05:58:07Z | 2019-07-20T05:58:07Z | MEMBER | From #1
|
healthkit-to-sqlite 197882382 | issue | { "url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/3/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
487600595 | MDU6SXNzdWU0ODc2MDA1OTU= | 3 | Option to fetch only checkins more recent than the current max checkin | simonw 9599 | closed | 0 | 4 | 2019-08-30T17:46:45Z | 2019-10-16T20:41:23Z | 2019-10-16T20:39:59Z | MEMBER | The Foursquare checkins API supports "return every checkin occurring after this point" - I can pass it the maximum createdAt date currently stored in the database. This will allow for quick incremental fetches via a cron. |
swarm-to-sqlite 205429375 | issue | { "url": "https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
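A sketch of the incremental fetch described above; the `afterTimestamp` parameter name and the `checkins` table are assumptions for illustration, so check the Foursquare API docs before relying on them:

```python
import requests
import sqlite_utils

def fetch_new_checkins(db_path, token):
    db = sqlite_utils.Database(db_path)
    # Start from the newest checkin we already have stored, if any.
    max_created = db.execute("SELECT MAX(createdAt) FROM checkins").fetchone()[0]
    params = {"oauth_token": token, "v": "20190831", "limit": 250}
    if max_created:
        params["afterTimestamp"] = max_created  # assumed API parameter
    url = "https://api.foursquare.com/v2/users/self/checkins"
    return requests.get(url, params=params).json()["response"]["checkins"]["items"]
```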
488833975 | MDU6SXNzdWU0ODg4MzM5NzU= | 3 | Command for running a search and saving tweets for that search | simonw 9599 | closed | 0 | 6 | 2019-09-03T21:29:56Z | 2019-11-04T05:31:56Z | 2019-11-04T05:31:16Z | MEMBER |
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
493670426 | MDU6SXNzdWU0OTM2NzA0MjY= | 3 | Command to fetch all repos belonging to a user or organization | simonw 9599 | closed | 0 | 2 | 2019-09-14T21:54:21Z | 2019-09-17T00:17:53Z | 2019-09-17T00:17:53Z | MEMBER | How about this:
|
github-to-sqlite 207052882 | issue | { "url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/3/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
503243784 | MDU6SXNzdWU1MDMyNDM3ODQ= | 3 | Extract images into separate tables | simonw 9599 | open | 0 | 1 | 2019-10-07T05:43:01Z | 2020-09-01T06:17:45Z | MEMBER | As already done with authors. Slightly harder because images do not have a universally unique ID. Also need to figure out what to do about there being columns for both |
pocket-to-sqlite 213286752 | issue | { "url": "https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/3/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
543717994 | MDExOlB1bGxSZXF1ZXN0MzU3OTc0MzI2 | 3 | Add todoist-to-sqlite | bcongdon 706257 | closed | 0 | 0 | 2019-12-30T04:02:59Z | 2020-10-12T00:35:58Z | 2020-10-12T00:35:57Z | CONTRIBUTOR | dogsheep/dogsheep.github.io/pulls/3 | Really enjoying getting into the dogsheep/datasette ecosystem. I made a downloader for Todoist, and I think/hope others might find this useful |
dogsheep.github.io 214746582 | pull | { "url": "https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/3/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | |||||
602533481 | MDU6SXNzdWU2MDI1MzM0ODE= | 3 | Import EXIF data into SQLite - lens used, ISO, aperture etc | simonw 9599 | open | 0 | Apple Photos online and securely browsable 5324096 | 2 | 2020-04-18T19:24:31Z | 2021-10-05T12:38:24Z | MEMBER | dogsheep-photos 256834907 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/3/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
689810340 | MDU6SXNzdWU2ODk4MTAzNDA= | 3 | Datasette plugin to provide custom page for running faceted, ranked searches | simonw 9599 | closed | 0 | 3 | 2020-09-01T05:00:22Z | 2020-09-03T21:01:41Z | 2020-09-03T21:01:41Z | MEMBER | This will be a page at It will offer a default timeline view plus search and facet by type/date. |
dogsheep-beta 197431109 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep-beta/issues/3/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
718938321 | MDU6SXNzdWU3MTg5MzgzMjE= | 3 | Use a content hash for the note IDs | simonw 9599 | closed | 0 | 0 | 2020-10-11T22:13:46Z | 2020-10-11T23:15:04Z | 2020-10-11T23:15:04Z | MEMBER | Without a GUID note IDs are pretty ineffective, but using a hash of the contents will at least avoid creating identical duplicates in the future. |
evernote-to-sqlite 303218369 | issue | { "url": "https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/3/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
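A minimal sketch of deriving an ID from note content (the hash function and field list are illustrative assumptions, not necessarily what evernote-to-sqlite shipped):

```python
import hashlib

def note_id(title, body, created):
    # A stable content-derived ID: re-importing the same note yields the
    # same row instead of a new duplicate.
    h = hashlib.md5()
    for part in (title, body, created):
        h.update(part.encode("utf-8"))
        h.update(b"\x00")  # separator so ("ab", "c") != ("a", "bc")
    return h.hexdigest()
```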
769397742 | MDU6SXNzdWU3NjkzOTc3NDI= | 3 | sqlite-utils error on takeout import | khimaros 231498 | open | 0 | 0 | 2020-12-17T01:18:48Z | 2020-12-17T01:19:04Z | NONE |
there is no table create in additionally, this package and hackernews-to-sqlite have conflicting |
google-takeout-to-sqlite 206649770 | issue | { "url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/3/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
952189173 | MDU6SXNzdWU5NTIxODkxNzM= | 3 | Use HN algolia endpoint to retrieve trees | simonw 9599 | open | 0 | 3 | 2021-07-25T03:35:27Z | 2021-07-25T18:41:17Z | MEMBER | The
Here's an example that loads quickly, with about 50 comments: https://hn.algolia.com/api/v1/items/27941108 It doesn't appear to use pagination at all - if a thread is big then the response is big. I ran this search to find some stories with more than 1000 comments: https://hn.algolia.com/api/v1/search?tags=story&numericFilters=num_comments%3E=1000 Here's one: https://news.ycombinator.com/item?id=25015967 with 4759 comments. Hitting the API takes 41s and returns 3.7 MB of JSON!
|
hacker-news-to-sqlite 248903544 | issue | { "url": "https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/3/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
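Since the Algolia endpoint returns the whole tree in one response, a recursive flattener is enough to turn it into rows; a sketch, assuming only the documented `children` nesting:

```python
import requests

def fetch_thread(item_id):
    # One request returns the entire tree, comments nested under "children".
    return requests.get(f"https://hn.algolia.com/api/v1/items/{item_id}").json()

def flatten(item):
    # Yield every item in the tree as its own flat record.
    children = item.pop("children", [])
    yield item
    for child in children:
        yield from flatten(child)

rows = list(flatten(fetch_thread(27941108)))
print(len(rows))
```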
1616422013 | I_kwDOJHON9s5gWKR9 | 3 | `apple-notes-to-sqlite --dump` option | simonw 9599 | closed | 0 | 0 | 2023-03-09T05:05:49Z | 2023-03-09T05:06:14Z | 2023-03-09T05:06:14Z | MEMBER | Option that doesn't write to the database at all, it just outputs all the notes to stdout as newline-delimited JSON. |
apple-notes-to-sqlite 611552758 | issue | { "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/3/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
267515836 | MDU6SXNzdWUyNjc1MTU4MzY= | 4 | Make URLs immutable | simonw 9599 | closed | 0 | Ship first public release 2857392 | 8 | 2017-10-23T01:13:30Z | 2017-10-24T02:38:24Z | 2017-10-24T02:38:24Z | OWNER | Absolutely everything should have a far-future expires header. Part of the URL will be the truncated sha1 hash of the database file itself, calculated at build time. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
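For illustration, computing a truncated sha1 of the database file at build time might look like this (the 7-character truncation is an assumption for the example):

```python
import hashlib

def db_hash(path, length=7):
    # Hash the file in chunks so large databases never load fully into memory.
    h = hashlib.sha1()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()[:length]
```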
403028630 | MDExOlB1bGxSZXF1ZXN0MjQ3NTc2OTQy | 4 | Fts5 | simonw 9599 | closed | 0 | 0 | 2019-01-25T06:54:05Z | 2019-01-25T06:54:33Z | 2019-01-25T06:54:33Z | OWNER | simonw/sqlite-utils/pulls/4 | sqlite-utils 140912432 | pull | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | ||||||
470640505 | MDU6SXNzdWU0NzA2NDA1MDU= | 4 | Import Records | simonw 9599 | closed | 0 | 1 | 2019-07-20T06:11:20Z | 2019-07-20T06:21:41Z | 2019-07-20T06:21:41Z | MEMBER | From #1:
|
healthkit-to-sqlite 197882382 | issue | { "url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
487601121 | MDU6SXNzdWU0ODc2MDExMjE= | 4 | Online tool for getting a Foursquare OAuth token | simonw 9599 | closed | 0 | 1 | 2019-08-30T17:48:14Z | 2019-08-31T18:07:26Z | 2019-08-31T18:07:26Z | MEMBER | I will link to this from the documentation. See also this conversation on Twitter: https://twitter.com/simonw/status/1166822603023011840 I've decided to go with "copy and paste in a token" rather than hooking up a local web server that can have tokens passed to it. |
swarm-to-sqlite 205429375 | issue | { "url": "https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
488835586 | MDU6SXNzdWU0ODg4MzU1ODY= | 4 | Command for importing data from a Twitter Export file | simonw 9599 | closed | 0 | 2 | 2019-09-03T21:34:13Z | 2019-10-11T06:45:02Z | 2019-10-11T06:45:02Z | MEMBER | Twitter lets you export all of your data as an archive file: https://twitter.com/settings/your_twitter_data A command for importing this data into SQLite would be extremely useful.
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
493670730 | MDU6SXNzdWU0OTM2NzA3MzA= | 4 | Command to fetch stargazers for one or more repos | simonw 9599 | closed | 0 | 8 | 2019-09-14T21:58:22Z | 2020-05-02T21:30:27Z | 2020-05-02T21:30:27Z | MEMBER | Maybe this:
It could accept more than one repo. Maybe have options similar to |
github-to-sqlite 207052882 | issue | { "url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
558715564 | MDExOlB1bGxSZXF1ZXN0MzcwMDI0Njk3 | 4 | Add beeminder-to-sqlite | bcongdon 706257 | closed | 0 | 0 | 2020-02-02T15:51:36Z | 2020-10-12T00:36:16Z | 2020-10-12T00:36:16Z | CONTRIBUTOR | dogsheep/dogsheep.github.io/pulls/4 | dogsheep.github.io 214746582 | pull | { "url": "https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | ||||||
589402939 | MDU6SXNzdWU1ODk0MDI5Mzk= | 4 | Store authentication information as "pocket_access_token" etc | simonw 9599 | closed | 0 | 0 | 2020-03-27T20:43:22Z | 2020-03-27T20:43:59Z | 2020-03-27T20:43:59Z | MEMBER | The |
pocket-to-sqlite 213286752 | issue | { "url": "https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
602533539 | MDU6SXNzdWU2MDI1MzM1Mzk= | 4 | Upload all my photos to a secure S3 bucket | simonw 9599 | closed | 0 | Apple Photos online and securely browsable 5324096 | 14 | 2020-04-18T19:24:50Z | 2020-04-18T21:58:11Z | 2020-04-18T21:57:13Z | MEMBER |
|
dogsheep-photos 256834907 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
689839399 | MDU6SXNzdWU2ODk4MzkzOTk= | 4 | Optimize the FTS table | simonw 9599 | closed | 0 | 1 | 2020-09-01T05:58:17Z | 2020-09-01T06:10:08Z | 2020-09-01T06:10:08Z | MEMBER | dogsheep-beta 197431109 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep-beta/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||||
718938508 | MDU6SXNzdWU3MTg5Mzg1MDg= | 4 | Configure FTS + add an index on the date columns | simonw 9599 | closed | 0 | 2 | 2020-10-11T22:14:40Z | 2020-10-11T23:41:29Z | 2020-10-11T23:41:29Z | MEMBER | Sort by date descending is likely the most common way of sorting, so that column should be indexed. Also add FTS configuration for both notes and the OCR column on resources. |
evernote-to-sqlite 303218369 | issue | { "url": "https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
778380836 | MDU6SXNzdWU3NzgzODA4MzY= | 4 | Feature Request: Gmail | Btibert3 203343 | open | 0 | 5 | 2021-01-04T21:31:09Z | 2021-03-04T20:54:44Z | NONE | From takeout, I only exported my Gmail account. Ideally I could parse this into sqlite via this tool. |
google-takeout-to-sqlite 206649770 | issue | { "url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1205867842 | I_kwDODtX3eM5H4BVC | 4 | Retrieve the top-level story for a comment | telotortium 1755789 | open | 0 | 0 | 2022-04-15T20:25:39Z | 2022-04-15T20:25:39Z | NONE | I think that each comment inserted into the database should include a column |
hacker-news-to-sqlite 248903544 | issue | { "url": "https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
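A sketch of how the requested column could be computed with the official Hacker News Firebase API, walking `parent` pointers up to the root story (the endpoint and fields are the documented ones; error handling is omitted):

```python
import requests

ITEM_URL = "https://hacker-news.firebaseio.com/v0/item/{}.json"

def top_level_story(item_id):
    # Follow "parent" pointers until we reach the story at the root.
    item = requests.get(ITEM_URL.format(item_id)).json()
    while item.get("type") != "story" and "parent" in item:
        item = requests.get(ITEM_URL.format(item["parent"])).json()
    return item["id"]
```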
1616429236 | I_kwDOJHON9s5gWMC0 | 4 | Support incremental updates | simonw 9599 | open | 0 | 2 | 2023-03-09T05:14:00Z | 2023-03-09T18:20:56Z | MEMBER | Running this script can take several hours against a large notes database. Would be neat if it could run against just the notes that have been modified since it last ran. Could pull the max Problem is I don't actually know what order it iterates over the notes in. |
apple-notes-to-sqlite 611552758 | issue | { "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
267516066 | MDU6SXNzdWUyNjc1MTYwNjY= | 5 | Implement sensible query pagination | simonw 9599 | closed | 0 | Ship first public release 2857392 | 3 | 2017-10-23T01:16:00Z | 2017-11-10T20:41:39Z | 2017-11-10T20:41:39Z | OWNER | datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
403396009 | MDExOlB1bGxSZXF1ZXN0MjQ3ODYxNDE5 | 5 | Run Travis tests against Python 3.8-dev | simonw 9599 | closed | 0 | 0 | 2019-01-26T02:30:55Z | 2019-01-26T02:37:54Z | 2019-01-26T02:37:54Z | OWNER | simonw/sqlite-utils/pulls/5 | sqlite-utils 140912432 | pull | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | ||||||
470691622 | MDU6SXNzdWU0NzA2OTE2MjI= | 5 | Add progress bar | simonw 9599 | closed | 0 | 2 | 2019-07-20T16:29:07Z | 2019-07-22T03:30:13Z | 2019-07-22T02:49:22Z | MEMBER | Showing a progress bar would be nice, using Click. The easiest way to do this would probably be to hook it up to the length of the compressed content, and update it as this code pushes more XML bytes through the parser: |
healthkit-to-sqlite 197882382 | issue | { "url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
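A sketch of the idea with click.progressbar, assuming the standard `apple_health_export/export.xml` path inside the zip; the parser hookup is left as a comment:

```python
import zipfile
import click

def import_with_progress(zip_path):
    zf = zipfile.ZipFile(zip_path)
    info = zf.getinfo("apple_health_export/export.xml")
    with zf.open(info) as xml_file, click.progressbar(
        length=info.file_size, label="Importing records"
    ) as bar:
        while True:
            chunk = xml_file.read(1 << 20)
            if not chunk:
                break
            # feed `chunk` into the XML pull parser here...
            bar.update(len(chunk))
```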
487721884 | MDU6SXNzdWU0ODc3MjE4ODQ= | 5 | Treat Foursquare timestamps as UTC | simonw 9599 | closed | 0 | 0 | 2019-08-31T02:44:47Z | 2019-08-31T02:50:41Z | 2019-08-31T02:50:41Z | MEMBER | Current test failure is due to timezone differences between my laptop and Circle CI: https://circleci.com/gh/dogsheep/swarm-to-sqlite/3
The timestamps I store in |
swarm-to-sqlite 205429375 | issue | { "url": "https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
488874815 | MDU6SXNzdWU0ODg4NzQ4MTU= | 5 | Write tests that simulate the Twitter API | simonw 9599 | open | 0 | 1 | 2019-09-03T23:55:35Z | 2019-09-03T23:56:28Z | MEMBER | I can use betamax for this: https://pypi.org/project/betamax/ |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
493671014 | MDU6SXNzdWU0OTM2NzEwMTQ= | 5 | Add "incomplete" boolean to users table for incomplete profiles | simonw 9599 | closed | 0 | 2 | 2019-09-14T22:01:50Z | 2020-03-23T19:23:31Z | 2020-03-23T19:23:30Z | MEMBER | User profiles that are fetched from e.g. stargazers (#4) are incomplete - they have a login but they don't have name, company etc. Add a |
github-to-sqlite 207052882 | issue | { "url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
602551638 | MDU6SXNzdWU2MDI1NTE2Mzg= | 5 | photos-to-sqlite s3-auth command | simonw 9599 | closed | 0 | 1 | 2020-04-18T21:05:25Z | 2020-04-18T21:08:44Z | 2020-04-18T21:08:44Z | MEMBER | Modeled on |
dogsheep-photos 256834907 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
629473827 | MDU6SXNzdWU2Mjk0NzM4Mjc= | 5 | Set up a demo | harryvederci 26745575 | open | 0 | 1 | 2020-06-02T19:56:49Z | 2020-09-01T06:18:43Z | NONE | First off, thanks for open sourcing this application! This is a suggestion to increase the number of people who would make use of it: an example in the readme file would help. Currently, users have to clone the app, install it, authorize through Pocket, run a command, and then find out if this application does what they hope it does. Another possibility is to add a file Keep up the good work! |
pocket-to-sqlite 213286752 | issue | { "url": "https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
689847361 | MDU6SXNzdWU2ODk4NDczNjE= | 5 | Add a context column that's not searchable | simonw 9599 | closed | 0 | 1 | 2020-09-01T06:13:42Z | 2020-09-03T18:43:50Z | 2020-09-03T18:43:50Z | MEMBER | I sometimes like to configure titles that are things like "Comment on issue X" or "Photo in Golden Gate Park" - these shouldn't be included in the search index but should be stored so they can be displayed to provide context. Add a column for this - probably called |
dogsheep-beta 197431109 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep-beta/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
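FTS5's UNINDEXED column option does exactly this: the value is stored and returned with the row but never matched by queries. A minimal sketch with illustrative names:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE search USING fts5(title, context UNINDEXED)")
conn.execute(
    "INSERT INTO search (title, context) VALUES "
    "('A photo', 'Photo in Golden Gate Park')"
)
# The context column is stored for display but excluded from the index:
print(conn.execute("SELECT * FROM search WHERE search MATCH 'golden'").fetchall())
# [] - no match, because only `title` is searchable
```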
718938889 | MDU6SXNzdWU3MTg5Mzg4ODk= | 5 | Figure out how to display images from <en-media> tags inline in Datasette | simonw 9599 | open | 0 | 6 | 2020-10-11T22:17:03Z | 2020-10-16T20:16:28Z | MEMBER | Relates to #1. Evernote XML looks like this:
```xml
<en-note>
This note includes two images.
The Python logo
<en-media hash="61098c2c541de7f0a907c301dd6542da" type="image/svg+xml" width="125"/>
The Evernote logo
<en-media hash="91bd26175acac0b2ffdb6efac199f8ca" type="image/svg+xml" width="125"/>
</en-note>
```
That hash is the md5 we use to store resources. It should be possible to turn these into embedded image tags, especially if done in conjunction with the https://github.com/simonw/datasette-media plugin. |
evernote-to-sqlite 303218369 | issue | { "url": "https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
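A hedged sketch of the tag rewriting this would need; the `/-/media/resources/<hash>` URL layout is purely an assumption about how a datasette-media configuration might be set up:

```python
import re

EN_MEDIA = re.compile(r'<en-media hash="([a-f0-9]{32})"[^>]*/>')

def rewrite_media(note_html):
    # Swap each <en-media> tag for an <img> pointing at a media-serving URL
    # keyed by the same md5 hash used to store the resource.
    return EN_MEDIA.sub(r'<img src="/-/media/resources/\1">', note_html)
```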
723499985 | MDExOlB1bGxSZXF1ZXN0NTA1MDc2NDE4 | 5 | Add fitbit-to-sqlite | mrphil007 4632208 | open | 0 | 0 | 2020-10-16T20:04:05Z | 2020-10-16T20:04:05Z | FIRST_TIME_CONTRIBUTOR | dogsheep/dogsheep.github.io/pulls/5 | dogsheep.github.io 214746582 | pull | { "url": "https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | |||||||
813880401 | MDExOlB1bGxSZXF1ZXN0NTc3OTUzNzI3 | 5 | WIP: Add Gmail takeout mbox import | UtahDave 306240 | open | 0 | 25 | 2021-02-22T21:30:40Z | 2021-07-28T07:18:56Z | FIRST_TIME_CONTRIBUTOR | dogsheep/google-takeout-to-sqlite/pulls/5 | WIP This PR adds the ability to import emails from a Gmail mbox export from Google Takeout. This is my first PR to a datasette/dogsheep repo. I've tested this on my personal Google Takeout mbox with ~520,000 emails going back to 2004. This took around ~20 minutes to process. To provide some feedback on the progress of the import I added the "rich" python module. I'm happy to remove that if adding a dependency is discouraged. However, I think it makes a nice addition to give feedback on the progress of a long import. Do we want to log emails that have errors when trying to import them? Dealing with encodings with emails is a bit tricky. I'm very open to feedback on how to deal with those better. As well as any other feedback for improvements. |
google-takeout-to-sqlite 206649770 | pull | { "url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | ||||||
1353418822 | PR_kwDODtX3eM497MOV | 5 | The program fails when the user has no submissions | fernand0 2467 | open | 0 | 0 | 2022-08-28T17:25:45Z | 2022-08-28T17:25:45Z | FIRST_TIME_CONTRIBUTOR | dogsheep/hacker-news-to-sqlite/pulls/5 | Tested with:
Result:
There is a problem of style with the patch (but I'm not sure what to do about it), because with the new initialization (submitted = []) the part
is not needed. Maybe there is a more adequate way of doing this. |
hacker-news-to-sqlite 248903544 | pull | { "url": "https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | ||||||
1616440856 | I_kwDOJHON9s5gWO4Y | 5 | Configure full text search | simonw 9599 | open | 0 | 0 | 2023-03-09T05:20:46Z | 2023-03-09T05:20:46Z | MEMBER | FTS would be useful. Maybe even extract the plain text from the notes to make that index easier to create, rather than creating it against the HTML. Can use the |
apple-notes-to-sqlite 611552758 | issue | { "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
267516329 | MDU6SXNzdWUyNjc1MTYzMjk= | 6 | Better JSON response options | simonw 9599 | closed | 0 | Ship first public release 2857392 | 0 | 2017-10-23T01:18:47Z | 2017-10-24T15:07:58Z | 2017-10-24T15:07:58Z | OWNER | Default returns this:
.jsono instead returns a list of objects each duplicating the headers in its keys. They both probably share the same pagination mechanism so it might not be a jsono flat list. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
403624090 | MDU6SXNzdWU0MDM2MjQwOTA= | 6 | "sqlite-utils insert" should support newline-delimited JSON | simonw 9599 | closed | 0 | 1 | 2019-01-28T02:00:02Z | 2019-01-28T02:17:45Z | 2019-01-28T02:17:45Z | OWNER | We can already export newline delimited JSON. We should learn to import it as well. The neat thing about importing it is that you can import GBs of data without having to read the whole lot into memory in order to decode the wrapping JSON array. Datasette can export it now: https://github.com/simonw/datasette/issues/405 Demo: https://latest.datasette.io/fixtures/facetable.json?_shape=array&_nl=on It should be possible to do this:
|
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
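The memory win comes from parsing one line at a time; a sketch of the approach using the sqlite_utils Python library (streaming the generator efficiently is the subject of a later sqlite-utils issue below):

```python
import json
import sqlite_utils

def insert_nl_json(db_path, table, path):
    db = sqlite_utils.Database(db_path)
    with open(path) as f:
        # One JSON object per line: decode lazily, so a multi-GB file never
        # has to be wrapped in (or parsed as) one giant JSON array.
        records = (json.loads(line) for line in f if line.strip())
        db[table].insert_all(records)
```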
470856782 | MDU6SXNzdWU0NzA4NTY3ODI= | 6 | Break up records into different tables for each type | simonw 9599 | closed | 0 | 1 | 2019-07-22T01:54:59Z | 2019-07-22T03:28:55Z | 2019-07-22T03:28:50Z | MEMBER | I don't think there's much benefit to having all of the different record types stored in the same enormous table. Here's what I get when I use I'm going to try splitting these up into separate tables - so |
healthkit-to-sqlite 197882382 | issue | { "url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
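A sketch of the per-type split (the table naming here is illustrative, not the exact scheme the importer ended up using):

```python
import sqlite_utils

def split_records_by_type(db_path, records):
    # Group rows by their "type" attribute, then write each group to its
    # own table instead of one enormous shared "records" table.
    db = sqlite_utils.Database(db_path)
    by_type = {}
    for record in records:
        by_type.setdefault(record.pop("type"), []).append(record)
    for record_type, rows in by_type.items():
        db["r" + record_type].insert_all(rows, alter=True)
```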
489419782 | MDU6SXNzdWU0ODk0MTk3ODI= | 6 | Extract extended_entities into a media table | simonw 9599 | closed | 0 | 0 | 2019-09-04T21:59:10Z | 2019-09-04T22:08:01Z | 2019-09-04T22:08:01Z | MEMBER | twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||||
504238461 | MDU6SXNzdWU1MDQyMzg0NjE= | 6 | sqlite3.OperationalError: table users has no column named bio | dazzag24 1055831 | closed | 0 | 2 | 2019-10-08T19:39:52Z | 2019-10-13T05:31:28Z | 2019-10-13T05:30:19Z | NONE |
```
$ github-to-sqlite repos github.db
$ github-to-sqlite starred github.db dazzag24
Traceback (most recent call last):
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/bin/github-to-sqlite", line 10, in <module>
    sys.exit(cli())
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py", line 764, in __call__
    return self.main(*args, **kwargs)
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py", line 717, in main
    rv = self.invoke(ctx)
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py", line 555, in invoke
    return callback(*args, **kwargs)
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/cli.py", line 106, in starred
    utils.save_stars(db, user, stars)
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py", line 177, in save_stars
    user_id = save_user(db, user)
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py", line 61, in save_user
    return db["users"].upsert(to_save, pk="id").last_pk
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py", line 1067, in upsert
    extracts=extracts,
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py", line 916, in insert
    extracts=extracts,
  File "/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py", line 1024, in insert_all
    result = self.db.conn.execute(sql, values)
sqlite3.OperationalError: table users has no column named bio
```
```
$ pipenv graph
github-to-sqlite==0.4
  - requests [required: Any, installed: 2.22.0]
    - certifi [required: >=2017.4.17, installed: 2019.9.11]
    - chardet [required: >=3.0.2,<3.1.0, installed: 3.0.4]
    - idna [required: >=2.5,<2.9, installed: 2.8]
    - urllib3 [required: >=1.21.1,<1.26,!=1.25.1,!=1.25.0, installed: 1.25.6]
  - sqlite-utils [required: ~=1.11, installed: 1.11]
    - click [required: Any, installed: 7.0]
    - click-default-group [required: Any, installed: 1.2.2]
      - click [required: Any, installed: 7.0]
    - tabulate [required: Any, installed: 0.8.5]

Python 3.6.8
```
|
github-to-sqlite 207052882 | issue | { "url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
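The traceback above is the classic symptom of upserting a record that has a column the existing table lacks. sqlite-utils can add missing columns automatically via `alter=True`; a sketch of that fix (not necessarily the exact change that shipped):

```python
import sqlite_utils

db = sqlite_utils.Database("github.db")
user = {"id": 9599, "login": "simonw", "bio": "Example bio text"}
# alter=True tells sqlite-utils to ALTER TABLE, adding any columns present
# in the incoming record but missing from the existing table.
db["users"].upsert(user, pk="id", alter=True)
```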
543355051 | MDExOlB1bGxSZXF1ZXN0MzU3NjQwMTg2 | 6 | don't break if source is missing | mfa 78035 | closed | 0 | 1 | 2019-12-29T10:46:47Z | 2020-03-28T02:28:11Z | 2020-03-28T02:28:11Z | CONTRIBUTOR | dogsheep/swarm-to-sqlite/pulls/6 | broke for me. very old checkins in 2010 had no source set. |
swarm-to-sqlite 205429375 | pull | { "url": "https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | |||||
602575575 | MDU6SXNzdWU2MDI1NzU1NzU= | 6 | Add progress bar to upload command | simonw 9599 | closed | 0 | 2 | 2020-04-18T23:32:41Z | 2020-04-19T00:15:24Z | 2020-04-19T00:15:24Z | MEMBER | Upload was added in #4 |
dogsheep-photos 256834907 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
689848827 | MDU6SXNzdWU2ODk4NDg4Mjc= | 6 | ISO timestamps | simonw 9599 | open | 0 | 0 | 2020-09-01T06:16:42Z | 2020-09-01T06:16:42Z | MEMBER | The
Should use ISO instead, e.g. |
pocket-to-sqlite 213286752 | issue | { "url": "https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
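For illustration, converting the epoch-seconds strings the Pocket API returns into sortable ISO 8601 UTC strings:

```python
from datetime import datetime, timezone

def to_iso(epoch_seconds):
    return datetime.fromtimestamp(int(epoch_seconds), tz=timezone.utc).isoformat()

print(to_iso("1420741618"))
# 2015-01-08T18:26:58+00:00
```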
689850810 | MDU6SXNzdWU2ODk4NTA4MTA= | 6 | Set up a demo instance | simonw 9599 | open | 0 | 0 | 2020-09-01T06:20:24Z | 2020-09-01T06:20:24Z | MEMBER | Once I've got the Datasette plugin to a state where it's worth building a demo: #3 I can use data from my public https://github-to-sqlite.dogsheep.net/ demo plus the Pocket data subset I use for the demo in https://github.com/dogsheep/pocket-to-sqlite/issues/5 - I could pull in the https://dogsheep-photos.dogsheep.net/ photos data too. |
dogsheep-beta 197431109 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep-beta/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
718949182 | MDU6SXNzdWU3MTg5NDkxODI= | 6 | Better handling of OCR data | simonw 9599 | closed | 0 | 2 | 2020-10-11T23:20:52Z | 2020-10-12T00:04:10Z | 2020-10-12T00:04:10Z | MEMBER |
Originally posted by @simonw in https://github.com/dogsheep/evernote-to-sqlite/issues/4#issuecomment-706784028 |
evernote-to-sqlite 303218369 | issue | { "url": "https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
821841046 | MDU6SXNzdWU4MjE4NDEwNDY= | 6 | Upgrade to latest sqlite-utils | simonw 9599 | open | 0 | 1 | 2021-03-04T07:21:54Z | 2021-03-04T07:22:51Z | MEMBER | This is pinned to v1 at the moment. |
google-takeout-to-sqlite 206649770 | issue | { "url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
842765105 | MDExOlB1bGxSZXF1ZXN0NjAyMjYxMDky | 6 | Add testres-db tool | ligurio 1151557 | closed | 0 | 1 | 2021-03-28T15:43:23Z | 2022-02-16T05:12:05Z | 2022-02-16T05:12:05Z | NONE | dogsheep/dogsheep.github.io/pulls/6 | dogsheep.github.io 214746582 | pull | { "url": "https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | ||||||
1617602868 | I_kwDOJHON9s5gaqk0 | 6 | Character encoding problem | simonw 9599 | open | 0 | 2 | 2023-03-09T16:44:34Z | 2023-04-14T15:22:09Z | MEMBER | I ran against a recent note with this in it:
And got back:
Pasting that into https://ftfy.vercel.app/?s=Actions+%E2%80%9A%C3%B6%C3%B4%C3%94%E2%88%8F%C3%A8+ gives this:
|
apple-notes-to-sqlite 611552758 | issue | { "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1641117021 | PR_kwDODtX3eM5M66op | 6 | Add permalink virtual field to items table | xavdid 1231935 | open | 0 | 1 | 2023-03-26T22:22:38Z | 2023-03-29T18:38:52Z | FIRST_TIME_CONTRIBUTOR | dogsheep/hacker-news-to-sqlite/pulls/6 | I added a virtual column (no storage overhead) to the output that easily links back to the source. It works nicely out of the box with datasette: I got bit a bit by https://github.com/simonw/sqlite-utils/issues/411, so I went with a manual I also added my best-guess instructions for local development on this package. I'm shooting in the dark, so feel free to replace with how you work on it locally. |
hacker-news-to-sqlite 248903544 | pull | { "url": "https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | ||||||
267516650 | MDU6SXNzdWUyNjc1MTY2NTA= | 7 | Framework where by every page is JSON plus a template | simonw 9599 | closed | 0 | Ship first public release 2857392 | 1 | 2017-10-23T01:22:03Z | 2017-10-24T02:27:25Z | 2017-10-24T02:27:25Z | OWNER | Every single page of my interface should be implemented as a function that returns JSON. I can then build my jinja templates on top of the exact data that would be returned by the API version. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/7/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
403625674 | MDU6SXNzdWU0MDM2MjU2NzQ= | 7 | .insert_all() should accept a generator and process it efficiently | simonw 9599 | closed | 0 | 3 | 2019-01-28T02:11:58Z | 2019-01-28T06:26:53Z | 2019-01-28T06:26:53Z | OWNER | Right now you have to load every record into memory before passing the list to If you want to process millions of rows, this is inefficient. Python has generators - we should use them! The only catch here is that part of the magic of If a record outside of those first 1,000 has a rogue column, we can crash with an error. This will free us up to make the |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/7/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
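A sketch of the peek-then-stream idea described above (the batch size and table names are illustrative): the first 1,000 records establish the schema, after which the rest of the generator is consumed in chunks without ever building a full list:

    import itertools
    import sqlite_utils

    def records():
        for i in range(1_000_000):
            yield {"id": i, "square": i * i}

    db = sqlite_utils.Database("big.db")
    gen = records()
    first_batch = list(itertools.islice(gen, 1000))  # enough rows to infer column types
    db["numbers"].insert_all(first_batch)
    db["numbers"].insert_all(gen, batch_size=1000)   # stream the remainder in chunks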
472097220 | MDU6SXNzdWU0NzIwOTcyMjA= | 7 | Script uses a lot of RAM | simonw 9599 | closed | 0 | 3 | 2019-07-24T06:11:11Z | 2019-07-24T06:35:52Z | 2019-07-24T06:35:52Z | MEMBER | I'm using an XML pull parser which should avoid the need to slurp the whole XML file into memory, but it's not working - the script still uses over 1GB of RAM when it runs, according to Activity Monitor. I think this is because I'm still causing the full root element to be incrementally loaded into memory just in case I try to access it later. http://effbot.org/elementtree/iterparse.htm says I should use elem.clear() to discard each element once I've finished with it, keeping a reference to the root so it can be cleared too.
So I will try that recipe and see if it helps. |
healthkit-to-sqlite 197882382 | issue | { "url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/7/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
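The effbot recipe, roughly as it would apply to a HealthKit export (assuming the elements of interest are Record elements):

    import xml.etree.ElementTree as ET

    context = ET.iterparse("export.xml", events=("start", "end"))
    _, root = next(context)  # grab the root element from the first start event

    for event, elem in context:
        if event == "end" and elem.tag == "Record":
            # ... process elem.attrib here ...
            root.clear()     # drop processed children so memory use stays flat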
490798130 | MDU6SXNzdWU0OTA3OTgxMzA= | 7 | users-lookup command for fetching users | simonw 9599 | closed | 0 | 0 | 2019-09-08T19:47:59Z | 2019-09-08T20:32:13Z | 2019-09-08T20:32:13Z | MEMBER | https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-users-lookup
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/7/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
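The linked endpoint accepts up to 100 user IDs or screen names per request, so a fetch loop would batch its input. A sketch with illustrative names (not the tool's actual code):

    def chunks(items, size=100):
        for i in range(0, len(items), size):
            yield items[i : i + size]

    user_ids = [12, 9599, 30313]  # example IDs
    for batch in chunks(user_ids):
        params = {"user_id": ",".join(str(u) for u in batch)}
        # authenticated_session.get(
        #     "https://api.twitter.com/1.1/users/lookup.json", params=params)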
506276893 | MDU6SXNzdWU1MDYyNzY4OTM= | 7 | issue-comments command for importing issue comments | simonw 9599 | closed | 0 | 1 | 2019-10-13T05:23:58Z | 2019-10-14T14:44:12Z | 2019-10-13T05:24:30Z | MEMBER | Using this API: https://developer.github.com/v3/issues/comments/ |
github-to-sqlite 207052882 | issue | { "url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/7/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
589491711 | MDU6SXNzdWU1ODk0OTE3MTE= | 7 | Upgrade to sqlite-utils 2.x | simonw 9599 | closed | 0 | 0 | 2020-03-28T02:24:51Z | 2020-03-28T02:25:03Z | 2020-03-28T02:25:03Z | MEMBER | swarm-to-sqlite 205429375 | issue | { "url": "https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/7/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||||
602585497 | MDU6SXNzdWU2MDI1ODU0OTc= | 7 | Integrate image content hashing | simonw 9599 | open | 0 | 2 | 2020-04-19T00:36:58Z | 2021-08-26T02:01:01Z | MEMBER | To spot duplicate images (where the file content differs such that the sha256 is no longer a match), it would be useful to calculate and store perceptual hashes of some sort. |
dogsheep-photos 256834907 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/7/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 } |
||||||||
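One possible direction (the issue doesn't name a library; the third-party imagehash package is an assumption here): perceptual hashes stay close for visually similar images even when the bytes - and hence the sha256 - differ:

    from PIL import Image
    import imagehash

    h1 = imagehash.phash(Image.open("photo_a.jpg"))
    h2 = imagehash.phash(Image.open("photo_b.jpg"))
    print(h1 - h2)  # small Hamming distance suggests near-duplicate images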
691265198 | MDU6SXNzdWU2OTEyNjUxOTg= | 7 | Mechanism for differentiating between "by me" and "liked by me" | simonw 9599 | closed | 0 | 6 | 2020-09-02T17:44:37Z | 2020-09-02T21:06:28Z | 2020-09-02T21:06:28Z | MEMBER | Some of the content I'm indexing is by me - photos I've taken, tweets I wrote, commits, comments I posted. Some of it is stuff that I've "liked" or "bookmarked" in some way - favourited tweets, Pocket articles, starred GitHub repos. It would be useful to be able to differentiate between the two. |
dogsheep-beta 197431109 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep-beta/issues/7/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
743297582 | MDU6SXNzdWU3NDMyOTc1ODI= | 7 | evernote-to-sqlite on windows 10 gives this error: TypeError: insert() got an unexpected keyword argument 'replace' | martinvanwieringen 42387931 | closed | 0 | 1 | 2020-11-15T16:57:28Z | 2021-02-11T22:13:17Z | 2021-02-11T22:13:17Z | NONE | Running evernote-to-sqlite 0.2 on Windows 10 with the command evernote-to-sqlite enex evernote.db MyNotes.enex, I get the following error:

    File "C:\Users\marti\AppData\Roaming\Python\Python38\site-packages\evernote_to_sqlite\utils.py", line 46, in save_note
      note_id = db["notes"].insert(row, hash_id="id", replace=True, alter=True).last_pk
    TypeError: insert() got an unexpected keyword argument 'replace'

Removing replace=True leads to this error instead:

      note_id = db["notes"].insert(row, hash_id="id", alter=True).last_pk
    File "C:\Users\marti\AppData\Roaming\Python\Python38\site-packages\sqlite_utils\db.py", line 924, in insert
      return self.insert_all(
    File "C:\Users\marti\AppData\Roaming\Python\Python38\site-packages\sqlite_utils\db.py", line 1046, in insert_all
      result = self.db.conn.execute(sql, values)
    sqlite3.IntegrityError: UNIQUE constraint failed: notes.id |
evernote-to-sqlite 303218369 | issue | { "url": "https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/7/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
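The first TypeError above is what an older sqlite-utils raises when insert() does not yet know the replace= keyword (as far as I can tell, support arrived with the 2.x line, so upgrading sqlite-utils is the likely fix). A quick way to check what is installed:

    from importlib.metadata import version  # Python 3.8+

    print(version("sqlite-utils"))  # if this is a 1.x release, upgrade it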
750141615 | MDExOlB1bGxSZXF1ZXN0NTI2ODQ3ODIz | 7 | Fixed conflicting CLI flags | tlockney 8944 | closed | 0 | 1 | 2020-11-24T23:25:12Z | 2022-08-21T21:11:56Z | 2022-08-21T21:11:56Z | CONTRIBUTOR | dogsheep/pocket-to-sqlite/pulls/7 | The |
pocket-to-sqlite 213286752 | pull | { "url": "https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/7/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | |||||
925384329 | MDExOlB1bGxSZXF1ZXN0NjczODcyOTc0 | 7 | Add instagram-to-sqlite | gavindsouza 36654812 | open | 0 | 0 | 2021-06-19T12:26:16Z | 2021-07-28T07:58:59Z | FIRST_TIME_CONTRIBUTOR | dogsheep/dogsheep.github.io/pulls/7 | The tool covers only chat imports at the time of opening this PR but I'm planning to import everything else that I feel inquisitive about |
dogsheep.github.io 214746582 | pull | { "url": "https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/7/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | ||||||
930946817 | MDU6SXNzdWU5MzA5NDY4MTc= | 7 | KeyError: 'accuracy' when processing Location History | davidwilemski 403152 | open | 0 | 0 | 2021-06-27T14:39:43Z | 2021-06-27T14:39:43Z | NONE | I'm new to both the dogsheep tools and datasette but have been experimenting a bit the last few days and these are really cool tools! I encountered a problem running my Google location history through this tool (running the latest release in a docker container):
It looks like the tool assumes the accuracy key will be present on every location record, but it was missing from some of mine. My first attempt at a local patch to get myself going was to convert accessing the key directly into a .get() call that tolerates its absence. Now that I've done a successful import run using my initial fix of calling .get(), I'm happy to provide a PR for a fix but figured I'd ask about which direction is preferred first. |
google-takeout-to-sqlite 206649770 | issue | { "url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/7/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
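A hypothetical illustration of the .get() direction discussed above (key names follow the Takeout Location History JSON; the function is not the tool's actual code):

    def parse_location(row):
        return {
            "latitude": row["latitudeE7"] / 1e7,
            "longitude": row["longitudeE7"] / 1e7,
            "accuracy": row.get("accuracy"),  # may be absent; stored as None/NULL
        }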
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT,
   [active_lock_reason] TEXT,
   [performed_via_github_app] TEXT,
   [reactions] TEXT,
   [draft] INTEGER,
   [state_reason] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);