issues
17 rows where comments = 0, repo = 206156866 and type = "issue" sorted by updated_at descending
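That filter corresponds roughly to the following SQL against the issues table defined at the end of this page (a sketch; the query Datasette actually runs may quote and order things differently):

```sql
select *
from issues
where comments = 0
  and repo = 206156866
  and [type] = 'issue'
order by updated_at desc;
```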
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | pull_request | body | repo | type | active_lock_reason | performed_via_github_app | reactions | draft | state_reason |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1816830546 | I_kwDODEm0Qs5sSqJS | 73 | Twitter v1 API shutdown | david-perez 6341745 | open | 0 | | | 0 | 2023-07-22T16:57:41Z | 2023-07-22T16:57:41Z | | NONE | | I've been using this project reliably over the past two years to periodically download my liked tweets, but unfortunately since 19th July I get: It appears like Twitter has now shut down their v1 endpoints, which is rather gracious of them, considering they announced they'd be deprecated on 29th April. Unfortunately retrieving likes using the v2 API is not part of their free plan. In fact, with the free plan one can only post and delete tweets and retrieve information about oneself. So I'm afraid this is the end of this very nice project. It was very useful, thank you! | twitter-to-sqlite 206156866 | issue | | | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/73/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 } | | |
1524431805 | I_kwDODEm0Qs5a3Pu9 | 72 | Import thread, including self- and others' replies | mcint 601708 | open | 0 | | | 0 | 2023-01-08T09:51:06Z | 2023-01-08T09:51:06Z | | NONE | | statuses-lookup, home-timeline, mentions (only for auth'ed user) don't cover this. twitter-to-sqlite focuses on archiving users, but does not easily support archiving conversations or community activity. For reference, this is implemented in twarc, using a search, optionally recursively. Other research suggests that this formerly, or currently, requires a search query, use of undocumented | twitter-to-sqlite 206156866 | issue | | | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/72/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | |
1097332098 | I_kwDODEm0Qs5BZ_WC | 64 | Include all entities for tweets | max 111631 | open | 0 | | | 0 | 2022-01-09T23:35:28Z | 2022-01-09T23:35:28Z | | NONE | | Per our conversation on Twitter: It would be neat if all entities (including URLs) were captured. This way you can ensure that URLs are parsed out exactly the same way Twitter parses URLs – we all know parsing URLs with a regex ain't fun. Right now, I believe the tool filters out all entities that are not of type | twitter-to-sqlite 206156866 | issue | | | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/64/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | |
1091850530 | I_kwDODEm0Qs5BFFEi | 63 | Import archive error 'withheld_in_countries' | pauloxnet 521097 | open | 0 | | | 0 | 2022-01-01T16:58:59Z | 2022-01-01T16:58:59Z | | NONE | | Importing the twitter archive I received this error: I found only a single tweet with the key I solved the error removing the key from the | twitter-to-sqlite 206156866 | issue | | | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/63/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | |
703218448 | MDU6SXNzdWU3MDMyMTg0NDg= | 51 | Documentation for twitter-to-sqlite fetch | simonw 9599 | open | 0 | | | 0 | 2020-09-17T02:38:10Z | 2020-09-17T02:38:10Z | | MEMBER | | It's mentioned in passing in the README but it deserves its own section: | twitter-to-sqlite 206156866 | issue | | | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/51/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | |
610284471 | MDU6SXNzdWU2MTAyODQ0NzE= | 46 | Error running 'search' for the first time | simonw 9599 | closed | 0 | | | 0 | 2020-04-30T18:11:20Z | 2020-04-30T18:11:58Z | 2020-04-30T18:11:58Z | MEMBER | | | twitter-to-sqlite 206156866 | issue | | | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/46/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
602173589 | MDU6SXNzdWU2MDIxNzM1ODk= | 42 | Error running user-timeline with --sql and --ids together | simonw 9599 | closed | 0 | | | 0 | 2020-04-17T19:02:06Z | 2020-04-17T23:34:40Z | 2020-04-17T23:34:40Z | MEMBER | | | twitter-to-sqlite 206156866 | issue | | | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/42/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
602181581 | MDU6SXNzdWU2MDIxODE1ODE= | 44 | tweet["source"] can be an empty string | simonw 9599 | closed | 0 | | | 0 | 2020-04-17T19:18:26Z | 2020-04-17T22:01:44Z | 2020-04-17T22:01:44Z | MEMBER | | Got this exception: | twitter-to-sqlite 206156866 | issue | | | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/44/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
591613579 | MDU6SXNzdWU1OTE2MTM1Nzk= | 41 | Bug: recorded a since_id for None, None | simonw 9599 | closed | 0 | | | 0 | 2020-04-01T04:29:43Z | 2020-04-01T04:31:11Z | 2020-04-01T04:31:11Z | MEMBER | | This shouldn't happen in the | twitter-to-sqlite 206156866 | issue | | | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/41/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
585306847 | MDU6SXNzdWU1ODUzMDY4NDc= | 36 | twitter-to-sqlite followers/friends --sql / --attach | simonw 9599 | closed | 0 | | | 0 | 2020-03-20T20:20:33Z | 2020-03-20T23:12:38Z | 2020-03-20T23:12:38Z | MEMBER | | Split from #8. The ( | twitter-to-sqlite 206156866 | issue | | | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/36/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
561454071 | MDU6SXNzdWU1NjE0NTQwNzE= | 32 | Documentation for "favorites" command | simonw 9599 | closed | 0 | | | 0 | 2020-02-07T06:50:11Z | 2020-02-07T06:59:10Z | 2020-02-07T06:59:10Z | MEMBER | | It looks like I forgot to document this one in the README. | twitter-to-sqlite 206156866 | issue | | | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/32/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
508024032 | MDU6SXNzdWU1MDgwMjQwMzI= | 22 | Ability to import from uncompressed archive or from specific files | simonw 9599 | closed | 0 | | | 0 | 2019-10-16T18:31:57Z | 2019-10-16T18:53:36Z | 2019-10-16T18:53:36Z | MEMBER | | Currently you can only import like this: It would be useful if you could import from a folder that was decompressed from that zip: AND from individual files within that folder - since that would allow you to e.g. selectively import certain files: | twitter-to-sqlite 206156866 | issue | | | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/22/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
503244410 | MDU6SXNzdWU1MDMyNDQ0MTA= | 14 | When importing favorites, record which user favorited them | simonw 9599 | closed | 0 | | | 0 | 2019-10-07T05:45:11Z | 2019-10-14T03:30:25Z | 2019-10-14T03:30:25Z | MEMBER | | This code currently just dumps them into the | twitter-to-sqlite 206156866 | issue | | | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/14/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
505673645 | MDU6SXNzdWU1MDU2NzM2NDU= | 16 | Do a better job with archived direct message threads | simonw 9599 | open | 0 | | | 0 | 2019-10-11T06:55:21Z | 2019-10-11T06:55:27Z | | MEMBER | | | twitter-to-sqlite 206156866 | issue | | | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/16/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | |
490798130 | MDU6SXNzdWU0OTA3OTgxMzA= | 7 | users-lookup command for fetching users | simonw 9599 | closed | 0 | | | 0 | 2019-09-08T19:47:59Z | 2019-09-08T20:32:13Z | 2019-09-08T20:32:13Z | MEMBER | | https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-users-lookup | twitter-to-sqlite 206156866 | issue | | | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/7/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
489419782 | MDU6SXNzdWU0ODk0MTk3ODI= | 6 | Extract extended_entities into a media table | simonw 9599 | closed | 0 | | | 0 | 2019-09-04T21:59:10Z | 2019-09-04T22:08:01Z | 2019-09-04T22:08:01Z | MEMBER | | | twitter-to-sqlite 206156866 | issue | | | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
488833136 | MDU6SXNzdWU0ODg4MzMxMzY= | 1 | Imported followers should go in "users", relationships in "following" | simonw 9599 | closed | 0 | | | 0 | 2019-09-03T21:27:37Z | 2019-09-04T20:23:04Z | 2019-09-04T20:23:04Z | MEMBER | | Right now It should instead save them all in a global | twitter-to-sqlite 206156866 | issue | | | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
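Issues #1, #6 and #14 above all concern how imported relationships should be stored: followers kept in one global users table with a separate following table, favorites linked back to the user who favorited them, and media extracted from extended_entities into its own table. Below is a minimal sketch of that kind of layout, with table and column names chosen for illustration only, not taken from twitter-to-sqlite's actual schema:

```sql
-- Illustrative sketch only; assumes users, tweets and media tables
-- are created elsewhere by the import. Column names are hypothetical.
create table if not exists following (
    followed_id integer references users(id),  -- account being followed
    follower_id integer references users(id),  -- account doing the following
    primary key (followed_id, follower_id)
);

create table if not exists favorited_by (
    tweet integer references tweets(id),       -- the favorited tweet
    user integer references users(id),         -- who favorited it
    primary key (tweet, user)
);

create table if not exists media_tweets (
    media_id integer references media(id),     -- media row extracted from extended_entities
    tweets_id integer references tweets(id),
    primary key (media_id, tweets_id)
);
```

Junction tables like these keep profile records de-duplicated in a single users table while still recording who follows, favorited or attached what.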
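Each row's reactions cell above is a JSON blob stored as text, so SQLite's JSON1 functions can unpack it. For example, to list which of these issues drew any reactions at all:

```sql
select number, title,
       json_extract(reactions, '$.total_count') as reaction_count
from issues
where repo = 206156866
  and json_extract(reactions, '$.total_count') > 0;
```

Of the 17 rows above, only issue #73 (total_count of 1, a single "eyes" reaction) would match.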
CREATE TABLE [issues] (
    [id] INTEGER PRIMARY KEY,
    [node_id] TEXT,
    [number] INTEGER,
    [title] TEXT,
    [user] INTEGER REFERENCES [users]([id]),
    [state] TEXT,
    [locked] INTEGER,
    [assignee] INTEGER REFERENCES [users]([id]),
    [milestone] INTEGER REFERENCES [milestones]([id]),
    [comments] INTEGER,
    [created_at] TEXT,
    [updated_at] TEXT,
    [closed_at] TEXT,
    [author_association] TEXT,
    [pull_request] TEXT,
    [body] TEXT,
    [repo] INTEGER REFERENCES [repos]([id]),
    [type] TEXT,
    [active_lock_reason] TEXT,
    [performed_via_github_app] TEXT,
    [reactions] TEXT,
    [draft] INTEGER,
    [state_reason] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);
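The user, assignee, milestone and repo columns are foreign keys, so a typical query joins back to the referenced tables. For example (assuming the referenced users table, which is not part of this export, has a login column):

```sql
select issues.number, issues.title, users.login, issues.updated_at
from issues
join users on users.id = issues.[user]   -- [user] holds the reporter's user id
where issues.repo = 206156866            -- twitter-to-sqlite
order by issues.updated_at desc;
```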