issues
73 rows where repo = 206156866 sorted by updated_at descending
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | pull_request | body | repo | type | active_lock_reason | performed_via_github_app | reactions | draft | state_reason |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1816830546 | I_kwDODEm0Qs5sSqJS | 73 | Twitter v1 API shutdown | david-perez 6341745 | open | 0 | 0 | 2023-07-22T16:57:41Z | 2023-07-22T16:57:41Z | NONE | I've been using this project reliably over the past two years to periodically download my liked tweets, but unfortunately since 19th July I get:
It appears like Twitter has now shut down their v1 endpoints, which is rather gracious of them, considering they announced they'd be deprecated on 29th April. Unfortunately retrieving likes using the v2 API is not part of their free plan. In fact, with the free plan one can only post and delete tweets and retrieve information about oneself. So I'm afraid this is the end of this very nice project. It was very useful, thank you! |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/73/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 } |
||||||||
1524431805 | I_kwDODEm0Qs5a3Pu9 | 72 | Import thread, including self- and others' replies | mcint 601708 | open | 0 | 0 | 2023-01-08T09:51:06Z | 2023-01-08T09:51:06Z | NONE | statuses-lookup, home-timeline, mentions (only for auth'ed user) don't cover this.
twitter-to-sqlite focuses on archiving users, but does not easily support archiving conversations or community activity. For reference, this is implemented in twarc, using a search, optionally recursively. Other research suggests that this formerly, or currently, requires a search query, use of undocumented |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/72/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
779088071 | MDU6SXNzdWU3NzkwODgwNzE= | 54 | Archive import appears to be broken on recent exports | jacobian 21148 | open | 0 | 5 | 2021-01-05T14:18:01Z | 2023-01-04T11:06:55Z | CONTRIBUTOR | I requested a Twitter export yesterday, and unfortunately they seem to have changed it in a way that breaks the archive import. So far I've run into two issues. The first was easy to work around, but the second will take more investigation. If I can find the time I'll keep working on it and update this issue accordingly. The issues (so far): 1. Data seems to have moved to a
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/54/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1513238455 | PR_kwDODEm0Qs5GUoPm | 71 | Archive: Fix "ni devices" typo in importer | sometimes-i-send-pull-requests 26161409 | open | 0 | 0 | 2022-12-28T23:33:31Z | 2022-12-28T23:33:31Z | FIRST_TIME_CONTRIBUTOR | dogsheep/twitter-to-sqlite/pulls/71 | twitter-to-sqlite 206156866 | pull | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/71/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | |||||||
1513238314 | PR_kwDODEm0Qs5GUoN6 | 70 | Archive: Import Twitter Circle data | sometimes-i-send-pull-requests 26161409 | open | 0 | 0 | 2022-12-28T23:33:09Z | 2022-12-28T23:33:09Z | FIRST_TIME_CONTRIBUTOR | dogsheep/twitter-to-sqlite/pulls/70 | twitter-to-sqlite 206156866 | pull | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/70/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | |||||||
1513238152 | PR_kwDODEm0Qs5GUoMM | 69 | Archive: Import new tweets table name | sometimes-i-send-pull-requests 26161409 | open | 0 | 0 | 2022-12-28T23:32:44Z | 2022-12-28T23:32:44Z | FIRST_TIME_CONTRIBUTOR | dogsheep/twitter-to-sqlite/pulls/69 | Given the code here, it seems like in the past this file was named "tweet.js". In recent exports, it's named "tweets.js". The archive importer needs to be modified to take this into account. Existing logic is reused for importing this table. (However, the resulting table name will be different, matching the different file name -- archive_tweets, rather than archive_tweet). |
twitter-to-sqlite 206156866 | pull | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/69/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | ||||||
1513237982 | PR_kwDODEm0Qs5GUoKL | 68 | Archive: Import mute table | sometimes-i-send-pull-requests 26161409 | open | 0 | 0 | 2022-12-28T23:32:06Z | 2022-12-28T23:32:06Z | FIRST_TIME_CONTRIBUTOR | dogsheep/twitter-to-sqlite/pulls/68 | twitter-to-sqlite 206156866 | pull | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/68/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | |||||||
1513237712 | PR_kwDODEm0Qs5GUoG_ | 67 | Add support for app-only bearer tokens | sometimes-i-send-pull-requests 26161409 | open | 0 | 0 | 2022-12-28T23:31:20Z | 2022-12-28T23:31:20Z | FIRST_TIME_CONTRIBUTOR | dogsheep/twitter-to-sqlite/pulls/67 | Previously, twitter-to-sqlite only supported OAuth1 authentication, and the token had to be on behalf of a user. However, Twitter also supports application-only bearer tokens, documented here: https://developer.twitter.com/en/docs/authentication/oauth-2-0/bearer-tokens This PR adds support to twitter-to-sqlite for using application-only bearer tokens. To use, the auth.json file just needs to contain a "bearer_token" key instead of "api_key", "api_secret_key", etc. |
twitter-to-sqlite 206156866 | pull | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/67/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | ||||||
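As background for #67: the description above says an app-only setup needs nothing more than a "bearer_token" key in auth.json. Below is a minimal sketch of what that amounts to, using the requests library; the file contents, token value, and example endpoint are illustrative and not the project's actual code.

```python
import json
from pathlib import Path

import requests

# Hypothetical auth.json holding an application-only bearer token instead of
# the four OAuth1 values ("api_key", "api_secret_key", ...), per the PR description.
Path("auth.json").write_text(json.dumps({"bearer_token": "AAAA...replace-me"}))

auth = json.loads(Path("auth.json").read_text())

# App-only auth is just a static Authorization header on every API request.
session = requests.Session()
session.headers["Authorization"] = "Bearer {}".format(auth["bearer_token"])

# Illustrative v1.1 call (subject to the access-level limits discussed in #61 and #73).
response = session.get(
    "https://api.twitter.com/1.1/users/show.json",
    params={"screen_name": "dogsheep"},
)
print(response.status_code)
```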
1077560091 | I_kwDODEm0Qs5AOkMb | 61 | Data Pull fails for "Essential" level access to the Twitter API (for Documentation) | jmnickerson05 57161638 | open | 0 | 1 | 2021-12-11T14:59:41Z | 2022-10-31T14:47:58Z | NONE | Per Twitter documentation: https://developer.twitter.com/en/docs/twitter-api/getting-started/about-twitter-api#v2-access-leve This isn't any fault of twitter-to-sqlite of course, but it should probably be documented as a side-note. And this is how I'm surfacing the message from utils.py: |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/61/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1063982712 | I_kwDODEm0Qs4_axZ4 | 60 | Execution on Windows | bernard01 1733616 | open | 0 | 1 | 2021-11-26T00:24:34Z | 2022-10-14T16:58:27Z | NONE | My installation on Windows using pip has been successful. I have Python 3.6. How do I run twitter-to-sqlite? I cannot even figure out how "auth" is a command. I have Python on my path: C:\prog\python\Python36;C:\prog\python\Python36\Scripts. Where should the commands be executed, and where are the files created? Could some basics please be added to the documentation to get beginners started? |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/60/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
520508502 | MDU6SXNzdWU1MjA1MDg1MDI= | 31 | "friends" command (similar to "followers") | simonw 9599 | closed | 0 | 2 | 2019-11-09T20:20:20Z | 2022-09-20T05:05:03Z | 2020-02-07T07:03:28Z | MEMBER | Current list of commands:
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/31/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1244082183 | PR_kwDODEm0Qs44PPLy | 66 | Ageinfo workaround | ashanan 11887 | open | 0 | 0 | 2022-05-21T21:08:29Z | 2022-05-21T21:09:16Z | FIRST_TIME_CONTRIBUTOR | dogsheep/twitter-to-sqlite/pulls/66 | I'm not sure if this is due to a new format or just because my ageinfo file is blank, but trying to import an archive would crash when it got to that file. This PR adds a guard clause to handle that case. Let me know if you want any changes! |
twitter-to-sqlite 206156866 | pull | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/66/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | ||||||
1088816961 | I_kwDODEm0Qs5A5gdB | 62 | KeyError: 'created_at' for private accounts? | swyxio 6764957 | closed | 0 | 2 | 2021-12-26T17:51:51Z | 2022-03-12T02:36:32Z | 2022-02-24T18:10:18Z | NONE | hey Simon! i was running ![image](https://user-images.githubusercontent.com/6764957/147416165-46b69c30-100a-406f-8534-8612b75547ae.png)

```bash
Traceback (most recent call last):
  File "/Users/swyx/Work/datasette/env/bin/twitter-to-sqlite", line 8, in <module>
    sys.exit(cli())
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py", line 1053, in main
    rv = self.invoke(ctx)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/cli.py", line 291, in user_timeline
    profile = utils.get_profile(db, session, **kwargs)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py", line 133, in get_profile
    save_users(db, [profile])
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py", line 453, in save_users
    transform_user(user)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py", line 285, in transform_user
    user["created_at"] = parser.parse(user["created_at"])
KeyError: 'created_at'
```

this looks awfully like #37 but it can't be, because i'm authed into my account and obviously i have perms to read my own account. wonder if there's any diagnostic methods i should apply here? just filing an issue for others to find while i investigate. |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1160327106 | PR_kwDODEm0Qs4z_V3w | 65 | Update Twitter dev link, clarify apps vs projects | rixx 2657547 | open | 0 | 0 | 2022-03-05T11:56:08Z | 2022-03-05T11:56:08Z | FIRST_TIME_CONTRIBUTOR | dogsheep/twitter-to-sqlite/pulls/65 | Twitter pushes you heavily towards v2 projects instead of v1 apps – I know the README mentions v1 API compatibility at the top, but I still nearly got turned around here. |
twitter-to-sqlite 206156866 | pull | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/65/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | ||||||
1097332098 | I_kwDODEm0Qs5BZ_WC | 64 | Include all entities for tweets | max 111631 | open | 0 | 0 | 2022-01-09T23:35:28Z | 2022-01-09T23:35:28Z | NONE | Per our conversation on Twitter: It would be neat if all entities (including URLs) were captured. This way you can ensure, that URLs are parsed out exactly the same way Twitter parses URLs – we all know parsing URLs with a regex ain't fun. Right now, I believe the tool filters out all entities that are not of type |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/64/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1091850530 | I_kwDODEm0Qs5BFFEi | 63 | Import archive error 'withheld_in_countries' | pauloxnet 521097 | open | 0 | 0 | 2022-01-01T16:58:59Z | 2022-01-01T16:58:59Z | NONE | Importing the twitter archive I received this error:
I found only a single tweet with the 'withheld_in_countries' key. I solved the error by removing the key from the |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/63/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
984939366 | MDU6SXNzdWU5ODQ5MzkzNjY= | 58 | Error: Use either --since or --since_id, not both - still broken | rubenv 42904 | closed | 0 | 1 | 2021-09-01T09:45:28Z | 2021-09-21T17:37:41Z | 2021-09-21T17:37:41Z | CONTRIBUTOR | Hi Simon, It appears the fix for #57 doesn't fix things for me:
Is there any way I can help debug this? |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/58/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
984942782 | MDExOlB1bGxSZXF1ZXN0NzI0MzE3NjUw | 59 | Fix for since_id bug, closes #58 | rubenv 42904 | closed | 0 | 1 | 2021-09-01T09:49:09Z | 2021-09-21T17:37:40Z | 2021-09-21T17:37:40Z | CONTRIBUTOR | dogsheep/twitter-to-sqlite/pulls/59 | Fixes remaining instances of this bug |
twitter-to-sqlite 206156866 | pull | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/59/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | |||||
779211940 | MDExOlB1bGxSZXF1ZXN0NTQ5MjA0MDYz | 55 | Fix archive imports | jacobian 21148 | closed | 0 | 2 | 2021-01-05T15:54:48Z | 2021-08-20T00:02:49Z | 2021-08-20T00:02:49Z | CONTRIBUTOR | dogsheep/twitter-to-sqlite/pulls/55 | This fixes the issues discussed in #54 |
twitter-to-sqlite 206156866 | pull | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/55/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | |||||
681575714 | MDExOlB1bGxSZXF1ZXN0NDY5OTQ0OTk5 | 49 | Document the use of --stop_after with favorites, refs #20 | mikepqr 370930 | closed | 0 | 1 | 2020-08-19T06:10:52Z | 2021-08-20T00:02:11Z | 2021-08-20T00:02:11Z | CONTRIBUTOR | dogsheep/twitter-to-sqlite/pulls/49 | (I discovered this trawling the issues for how to use --since with favorites) |
twitter-to-sqlite 206156866 | pull | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/49/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | |||||
907645813 | MDU6SXNzdWU5MDc2NDU4MTM= | 57 | Error: Use either --since or --since_id, not both | rubenv 42904 | closed | 0 | 6 | 2021-05-31T18:11:04Z | 2021-08-20T00:01:31Z | 2021-08-20T00:01:31Z | CONTRIBUTOR | I'm using the following command:
Which gives the following error:
Running without
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/57/reactions", "total_count": 4, "+1": 4, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
796736607 | MDU6SXNzdWU3OTY3MzY2MDc= | 56 | Not all quoted statuses get fetched? | gsajko 42315895 | closed | 0 | 3 | 2021-01-29T09:48:44Z | 2021-02-03T10:36:36Z | 2021-02-03T10:36:36Z | NONE | In my database I have 13300 quote tweets, but only about 3600 have I fetched some of them using |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
771324837 | MDU6SXNzdWU3NzEzMjQ4Mzc= | 53 | --since support for favorites | anotherjesse 27 | closed | 0 | 1 | 2020-12-19T07:08:23Z | 2020-12-19T07:47:11Z | 2020-12-19T07:47:11Z | NONE | Having support for https://twittercommunity.com/t/cant-get-all-favorite-tweets-by-rest-api/22007/3 The api seems to take an optional |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/53/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
745393298 | MDU6SXNzdWU3NDUzOTMyOTg= | 52 | Discussion: Adding support for fetching only fresh tweets | fatihky 4169772 | closed | 0 | 1 | 2020-11-18T07:01:48Z | 2020-11-18T07:12:45Z | 2020-11-18T07:12:45Z | NONE | I think it'd be very useful if this tool had an option like |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/52/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
703218448 | MDU6SXNzdWU3MDMyMTg0NDg= | 51 | Documentation for twitter-to-sqlite fetch | simonw 9599 | open | 0 | 0 | 2020-09-17T02:38:10Z | 2020-09-17T02:38:10Z | MEMBER | It's mentioned in passing in the README but it deserves its own section:
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/51/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
698791218 | MDU6SXNzdWU2OTg3OTEyMTg= | 50 | favorites --stop_after=N stops after min(N, 200) | mikepqr 370930 | open | 0 | 2 | 2020-09-11T03:38:14Z | 2020-09-13T05:11:14Z | CONTRIBUTOR | For any number greater than 200, |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/50/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
663976976 | MDU6SXNzdWU2NjM5NzY5NzY= | 48 | Add a table of contents to the README | simonw 9599 | closed | 0 | 3 | 2020-07-22T18:54:33Z | 2020-07-23T17:46:07Z | 2020-07-22T19:03:02Z | MEMBER | twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/48/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||||
639542974 | MDU6SXNzdWU2Mzk1NDI5NzQ= | 47 | Fall back to FTS4 if FTS5 is not available | hpk42 73579 | open | 0 | 3 | 2020-06-16T10:11:23Z | 2020-06-17T20:13:48Z | NONE | got this with version 0.21.1 from pypi. twitter-to-sqlite auth worked but then "twitter-to-sqlite user-timeline USER.db" produced a traceback ending in "no such module: FTS5". |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/47/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
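A fix for #47 would start from a capability check. One way to probe whether the local SQLite build has FTS5 and fall back to FTS4 otherwise, sketched with the standard library only (this is not the project's implementation):

```python
import sqlite3


def best_fts_version():
    """Return "FTS5" if the SQLite build supports it, else "FTS4", else None."""
    conn = sqlite3.connect(":memory:")
    for version in ("fts5", "fts4"):
        try:
            conn.execute("CREATE VIRTUAL TABLE probe USING {}(content)".format(version))
            conn.execute("DROP TABLE probe")
            return version.upper()
        except sqlite3.OperationalError:
            # "no such module: fts5" means this build lacks that FTS version.
            continue
    return None


print(best_fts_version())
```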
610284471 | MDU6SXNzdWU2MTAyODQ0NzE= | 46 | Error running 'search' for the first time | simonw 9599 | closed | 0 | 0 | 2020-04-30T18:11:20Z | 2020-04-30T18:11:58Z | 2020-04-30T18:11:58Z | MEMBER |
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/46/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
602619330 | MDU6SXNzdWU2MDI2MTkzMzA= | 45 | Use raise_for_status() everywhere | simonw 9599 | open | 0 | 1 | 2020-04-19T04:38:28Z | 2020-04-19T04:39:22Z | MEMBER | I keep seeing errors which I think are caused by authentication or rate limit problems but which appear to be unexpected JSON responses - presumably because they are actually an error message. Recent example: https://github.com/simonw/jsk-fellows-on-twitter/runs/598892575 Using |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/45/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
602176870 | MDU6SXNzdWU2MDIxNzY4NzA= | 43 | "twitter-to-sqlite lists" command for retrieving a user's owned lists | simonw 9599 | closed | 0 | 1 | 2020-04-17T19:08:59Z | 2020-04-17T23:48:28Z | 2020-04-17T23:30:39Z | MEMBER | twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/43/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||||
585353598 | MDU6SXNzdWU1ODUzNTM1OTg= | 37 | Handle "User not found" error | simonw 9599 | closed | 0 | 3 | 2020-03-20T22:14:32Z | 2020-04-17T23:43:46Z | 2020-04-17T23:43:46Z | MEMBER | While running
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/37/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
602173589 | MDU6SXNzdWU2MDIxNzM1ODk= | 42 | Error running user-timeline with --sql and --ids together | simonw 9599 | closed | 0 | 0 | 2020-04-17T19:02:06Z | 2020-04-17T23:34:40Z | 2020-04-17T23:34:40Z | MEMBER |
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/42/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
602181581 | MDU6SXNzdWU2MDIxODE1ODE= | 44 | tweet["source"] can be an empty string | simonw 9599 | closed | 0 | 0 | 2020-04-17T19:18:26Z | 2020-04-17T22:01:44Z | 2020-04-17T22:01:44Z | MEMBER | Got this excepion:
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/44/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
591613579 | MDU6SXNzdWU1OTE2MTM1Nzk= | 41 | Bug: recorded a since_id for None, None | simonw 9599 | closed | 0 | 0 | 2020-04-01T04:29:43Z | 2020-04-01T04:31:11Z | 2020-04-01T04:31:11Z | MEMBER | This shouldn't happen in the |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/41/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
590669793 | MDU6SXNzdWU1OTA2Njk3OTM= | 40 | Feature: record history of follower counts | simonw 9599 | closed | 0 | 5 | 2020-03-30T23:32:28Z | 2020-04-01T04:13:05Z | 2020-04-01T04:13:05Z | MEMBER | We currently over-write the follower count every time we import a tweet (when we import that user profile again): It would be neat if we noticed if that user's follower count (and maybe other counts?) had changed since we last saved them and recorded that change in a separate history table. This would be an inexpensive way of building up rough charts of follower count over time. |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/40/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
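The feature sketched in #40 reduces to: compare the incoming followers_count with the most recently recorded value and append a row only when it changed. A rough illustration with sqlite-utils; the history table name and columns are invented for this sketch.

```python
import datetime

import sqlite_utils

db = sqlite_utils.Database("twitter.db")


def record_follower_count(user):
    """Append a history row whenever a user's followers_count changes."""
    history = db["followers_count_history"]  # hypothetical table
    previous = None
    if history.exists():
        rows = list(history.rows_where("user = ?", [user["id"]], order_by="recorded_at desc"))
        if rows:
            previous = rows[0]["followers_count"]
    if previous != user["followers_count"]:
        history.insert({
            "user": user["id"],
            "followers_count": user["followers_count"],
            "recorded_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        })


record_follower_count({"id": 12497, "followers_count": 2048})
```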
492297930 | MDU6SXNzdWU0OTIyOTc5MzA= | 10 | Rethink progress bars for various commands | simonw 9599 | closed | 0 | 5 | 2019-09-11T15:06:47Z | 2020-04-01T03:45:48Z | 2020-04-01T03:45:48Z | MEMBER | Progress bars and the This is made more challenging by the fact that for many operations the total length is not known. https://click.palletsprojects.com/en/7.x/api/#click.progressbar |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
590666760 | MDU6SXNzdWU1OTA2NjY3NjA= | 39 | --since feature can be confused by retweets | simonw 9599 | closed | 0 | 11 | 2020-03-30T23:25:33Z | 2020-04-01T03:45:16Z | 2020-04-01T03:45:16Z | MEMBER | If you run It does this by seeking out the max ID of their previous tweets: BUT... this has a nasty flaw: if another account had retweeted one of their recent tweets the retweeted-tweet will have been loaded into the database - so we may treat that as the most recent since ID and miss a bunch of their tweets! |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/39/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
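The flaw in #39 comes from deriving since_id from whatever tweets happen to be in the database, which retweets by other accounts can pollute with a newer tweet by the same author. One alternative, consistent with the recorded since_id values mentioned in #41, is to store a high-water mark per command and account as each fetch completes. A sketch of that idea with an invented table name, not the project's actual schema:

```python
import sqlite_utils

db = sqlite_utils.Database("twitter.db")


def save_since_id(command, key, since_id):
    """Remember the newest tweet ID actually returned for this (command, account) pair."""
    db["since_id_watermarks"].upsert(  # hypothetical table, keyed on (type, key)
        {"type": command, "key": key, "since_id": since_id},
        pk=("type", "key"),
    )


def get_since_id(command, key):
    rows = list(db["since_id_watermarks"].rows_where("type = ? and key = ?", [command, key]))
    return rows[0]["since_id"] if rows else None


# Record the max ID the API returned, rather than computing max(id) from the
# tweets table later (which a retweeted copy of a newer tweet would distort).
save_since_id("user_timeline", "12497", 1244082183)
print(get_since_id("user_timeline", "12497"))
```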
490803176 | MDU6SXNzdWU0OTA4MDMxNzY= | 8 | --sql and --attach options for feeding commands from SQL queries | simonw 9599 | closed | 0 | 4 | 2019-09-08T20:35:49Z | 2020-03-20T23:13:01Z | 2020-03-20T23:13:01Z | MEMBER | Say you want to fetch Twitter profiles for a list of accounts that are stored in another database:
The SQL query you feed in is expected to return a list of screen names suitable for processing further by the command. Should be supported by all three of:
The |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
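What the --attach/--sql combination described in #8 amounts to, roughly: attach the second database under an alias and run the query to produce the list of screen names the command will then process. A plain-sqlite3 sketch with made-up database and table names:

```python
import sqlite3

conn = sqlite3.connect("twitter.db")

# --attach attendees.db  ->  ATTACH DATABASE under an alias
conn.execute("ATTACH DATABASE 'attendees.db' AS attendees")

# --sql "select screen_name from ..."  ->  expected to return one screen name per row
screen_names = [
    row[0] for row in conn.execute("select screen_name from attendees.attendee_list")
]

# The command would then fetch each of these profiles from the Twitter API.
print(screen_names)
```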
585306847 | MDU6SXNzdWU1ODUzMDY4NDc= | 36 | twitter-to-sqlite followers/friends --sql / --attach | simonw 9599 | closed | 0 | 0 | 2020-03-20T20:20:33Z | 2020-03-20T23:12:38Z | 2020-03-20T23:12:38Z | MEMBER | Split from #8. The ( |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/36/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
585359363 | MDU6SXNzdWU1ODUzNTkzNjM= | 38 | Screen name display for user-timeline is uneven | simonw 9599 | closed | 0 | 1 | 2020-03-20T22:30:23Z | 2020-03-20T22:37:17Z | 2020-03-20T22:37:17Z | MEMBER |
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/38/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
585282212 | MDU6SXNzdWU1ODUyODIyMTI= | 35 | twitter-to-sqlite user-timeline [screen_names] --sql / --attach | simonw 9599 | closed | 0 | 5 | 2020-03-20T19:26:07Z | 2020-03-20T20:17:00Z | 2020-03-20T20:16:35Z | MEMBER | Split from #8. |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/35/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
561469252 | MDExOlB1bGxSZXF1ZXN0MzcyMjczNjA4 | 33 | Upgrade to sqlite-utils 2.2.1 | simonw 9599 | closed | 0 | 1 | 2020-02-07T07:32:12Z | 2020-03-20T19:21:42Z | 2020-03-20T19:21:41Z | MEMBER | dogsheep/twitter-to-sqlite/pulls/33 | twitter-to-sqlite 206156866 | pull | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/33/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | ||||||
585266763 | MDU6SXNzdWU1ODUyNjY3NjM= | 34 | IndexError running user-timeline command | simonw 9599 | closed | 0 | 2 | 2020-03-20T18:54:08Z | 2020-03-20T19:20:52Z | 2020-03-20T19:20:37Z | MEMBER |
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/34/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
561454071 | MDU6SXNzdWU1NjE0NTQwNzE= | 32 | Documentation for " favorites" command | simonw 9599 | closed | 0 | 0 | 2020-02-07T06:50:11Z | 2020-02-07T06:59:10Z | 2020-02-07T06:59:10Z | MEMBER | It looks like I forgot to document this one in the README. |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/32/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
518725064 | MDU6SXNzdWU1MTg3MjUwNjQ= | 29 | `import` command fails on empty files | jacobian 21148 | closed | 0 | 4 | 2019-11-06T20:34:26Z | 2019-11-09T20:33:38Z | 2019-11-09T19:36:36Z | CONTRIBUTOR | If a file in the export is empty (in my case it was
This appears to be because I hacked around this by modifying
I'm happy to work up a real PR if that's the right approach, but I'm not sure it is. |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
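The workaround in #29 is essentially a guard before parsing: treat an empty archive file as "no data" instead of handing it to the JSON parser. A sketch of that guard; the prefix-stripping reflects the general window.YTD... format of Twitter archive files, not the project's exact code.

```python
import json


def parse_archive_file(contents):
    """Parse one archive .js file, returning None for empty files (the #29 workaround)."""
    if not contents.strip():
        return None
    # Archive files are JavaScript assignments like `window.YTD.like.part0 = [...]`;
    # drop everything before the first "[" or "{" and parse the rest as JSON.
    start = min(i for i in (contents.find("["), contents.find("{")) if i != -1)
    return json.loads(contents[start:])


print(parse_archive_file(""))                                       # None instead of a crash
print(parse_archive_file('window.YTD.like.part0 = [{"like": 1}]'))  # [{'like': 1}]
```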
515658861 | MDU6SXNzdWU1MTU2NTg4NjE= | 28 | Add indexes to followers table | simonw 9599 | closed | 0 | 1 | 2019-10-31T18:40:22Z | 2019-11-09T20:15:42Z | 2019-11-09T20:11:48Z | MEMBER |
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/28/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
518739697 | MDU6SXNzdWU1MTg3Mzk2OTc= | 30 | `followers` fails because `transform_user` is called twice | jacobian 21148 | closed | 0 | 2 | 2019-11-06T20:44:52Z | 2019-11-09T20:15:28Z | 2019-11-09T19:55:52Z | CONTRIBUTOR | Trying to run
This appears to be because https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L111 calls I was able to work around this by commenting out https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L116. Shall I work up a patch for that, or is there a better approach? |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/30/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
488833975 | MDU6SXNzdWU0ODg4MzM5NzU= | 3 | Command for running a search and saving tweets for that search | simonw 9599 | closed | 0 | 6 | 2019-09-03T21:29:56Z | 2019-11-04T05:31:56Z | 2019-11-04T05:31:16Z | MEMBER |
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
514459062 | MDU6SXNzdWU1MTQ0NTkwNjI= | 27 | retweets-of-me command | simonw 9599 | closed | 0 | 4 | 2019-10-30T07:43:01Z | 2019-11-03T01:12:58Z | 2019-11-03T01:12:58Z | MEMBER | twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||||
513074501 | MDU6SXNzdWU1MTMwNzQ1MDE= | 26 | Command for importing mentions timeline | simonw 9599 | closed | 0 | 1 | 2019-10-28T03:14:27Z | 2019-10-30T02:36:13Z | 2019-10-30T02:20:47Z | MEMBER | https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-mentions_timeline Almost identical to home-timeline #18 but it uses the statuses/mentions_timeline endpoint. |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/26/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
506268945 | MDU6SXNzdWU1MDYyNjg5NDU= | 20 | --since support for various commands for refresh-by-cron | simonw 9599 | closed | 0 | 3 | 2019-10-13T03:40:46Z | 2019-10-21T03:32:04Z | 2019-10-16T19:26:11Z | MEMBER | I want to run a cron that updates my Twitter database every X minutes. It should be able to retrieve the following without needing to paginate through everything:
It would be nice if this could be standardized across all commands as a --since option. |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
508190730 | MDU6SXNzdWU1MDgxOTA3MzA= | 23 | Extremely simple migration system | simonw 9599 | closed | 0 | 2 | 2019-10-17T02:13:57Z | 2019-10-17T16:57:17Z | 2019-10-17T16:57:17Z | MEMBER | Needed for #12. This is going to be an incredibly simple version of the Django migration system.
The function names will be detected and used as the names of the migrations. Every time you run the CLI tool it will call the Needs to take into account that there might be no tables at all. As such, migration functions should sanity check that the tables they are going to work on actually exist. |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/23/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
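Issue #23 spells out the shape of the system: migration functions identified by name, a table recording which have been applied, a check on every CLI invocation, and sanity checks because the database may have no tables yet. A minimal sketch of that pattern (nothing here is the project's actual module layout):

```python
import datetime
import sqlite3

MIGRATIONS = []


def migration(fn):
    """Register a migration; the function name doubles as the migration name."""
    MIGRATIONS.append(fn)
    return fn


@migration
def convert_source_column(db):
    # Sanity-check the table exists first -- a brand new database may have no tables at all.
    tables = {r[0] for r in db.execute("select name from sqlite_master where type = 'table'")}
    if "tweets" not in tables:
        return
    # ... the real column rewrite would go here ...


def migrate(db):
    db.execute("create table if not exists migrations (name text primary key, applied_at text)")
    applied = {r[0] for r in db.execute("select name from migrations")}
    for fn in MIGRATIONS:
        if fn.__name__ not in applied:
            fn(db)
            db.execute(
                "insert into migrations (name, applied_at) values (?, ?)",
                (fn.__name__, datetime.datetime.now(datetime.timezone.utc).isoformat()),
            )
    db.commit()


migrate(sqlite3.connect("twitter.db"))
```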
508578780 | MDU6SXNzdWU1MDg1Nzg3ODA= | 25 | Ensure migrations don't accidentally create foreign key twice | simonw 9599 | closed | 0 | 2 | 2019-10-17T16:08:50Z | 2019-10-17T16:56:47Z | 2019-10-17T16:56:47Z | MEMBER | Is it possible for these lines to run against a database table that already has these foreign keys? |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/25/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
508553387 | MDExOlB1bGxSZXF1ZXN0MzI5MzI0MzY4 | 24 | Tweet source extraction and new migration system | simonw 9599 | closed | 0 | 0 | 2019-10-17T15:24:56Z | 2019-10-17T15:49:29Z | 2019-10-17T15:49:24Z | MEMBER | dogsheep/twitter-to-sqlite/pulls/24 | Closes #12 and #23 |
twitter-to-sqlite 206156866 | pull | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/24/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | |||||
503053800 | MDU6SXNzdWU1MDMwNTM4MDA= | 12 | Extract "source" into a separate lookup table | simonw 9599 | closed | 0 | 3 | 2019-10-06T05:17:23Z | 2019-10-17T15:49:24Z | 2019-10-17T15:49:24Z | MEMBER | It's pretty bulky and ugly at the moment: |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
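For #12 the change is a standard lookup-table extraction: store each distinct source HTML string once, keyed somehow (a hash is used here purely for illustration), and keep only the key on the tweet row. A sketch using sqlite-utils, also handling the empty-string case from #44:

```python
import hashlib

import sqlite_utils

db = sqlite_utils.Database("twitter.db")


def extract_source(tweet):
    """Replace the bulky tweet["source"] HTML with a key into a sources lookup table."""
    source = tweet.get("source") or ""  # issue #44: "source" can be an empty string
    key = hashlib.md5(source.encode("utf-8")).hexdigest()
    db["sources"].upsert({"id": key, "html": source}, pk="id")
    tweet["source"] = key
    return tweet


print(extract_source({"id": 1, "source": '<a href="http://twitter.com">Twitter Web Client</a>'}))
```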
506087267 | MDU6SXNzdWU1MDYwODcyNjc= | 19 | since_id support for home-timeline | simonw 9599 | closed | 0 | 3 | 2019-10-11T22:48:24Z | 2019-10-16T19:13:06Z | 2019-10-16T19:12:46Z | MEMBER | Currently every time you run |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
508024032 | MDU6SXNzdWU1MDgwMjQwMzI= | 22 | Ability to import from uncompressed archive or from specific files | simonw 9599 | closed | 0 | 0 | 2019-10-16T18:31:57Z | 2019-10-16T18:53:36Z | 2019-10-16T18:53:36Z | MEMBER | Currently you can only import like this:
It would be useful if you could import from a folder that was decompressed from that zip:
AND from individual files within that folder - since that would allow you to e.g. selectively import certain files:
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/22/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
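The flexibility asked for in #22 comes down to accepting a zip, a decompressed folder, or individual files and normalizing them to the same (name, contents) stream. A sketch of that normalization; the real import command's interface may differ.

```python
import zipfile
from pathlib import Path


def iter_archive_files(path):
    """Yield (name, bytes) pairs from a .zip, a decompressed folder, or a single file."""
    path = Path(path)
    if path.is_file() and path.suffix == ".zip":
        with zipfile.ZipFile(path) as zf:
            for name in zf.namelist():
                yield name, zf.read(name)
    elif path.is_dir():
        for child in sorted(path.rglob("*.js")):
            yield str(child.relative_to(path)), child.read_bytes()
    else:
        yield path.name, path.read_bytes()


# e.g. iter_archive_files("twitter-archive.zip"), iter_archive_files("archive/"),
#      or iter_archive_files("archive/data/follower.js")
```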
506432572 | MDU6SXNzdWU1MDY0MzI1NzI= | 21 | Fix & escapes in tweet text | simonw 9599 | closed | 0 | 1 | 2019-10-14T03:37:28Z | 2019-10-15T18:48:16Z | 2019-10-15T18:48:16Z | MEMBER | Shouldn't be storing |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/21/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
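The fix for #21 is plain HTML-unescaping of the tweet text before it is stored; Python's standard library covers the technique (where exactly the project applies it is not shown here):

```python
import html

raw = "Datasette &amp; Dogsheep &gt; everything else"
print(html.unescape(raw))  # Datasette & Dogsheep > everything else
```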
503244410 | MDU6SXNzdWU1MDMyNDQ0MTA= | 14 | When importing favorites, record which user favorited them | simonw 9599 | closed | 0 | 0 | 2019-10-07T05:45:11Z | 2019-10-14T03:30:25Z | 2019-10-14T03:30:25Z | MEMBER | This code currently just dumps them into the |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/14/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
505928530 | MDU6SXNzdWU1MDU5Mjg1MzA= | 18 | Command to import home-timeline | simonw 9599 | closed | 0 | 4 | 2019-10-11T15:47:54Z | 2019-10-11T16:51:33Z | 2019-10-11T16:51:12Z | MEMBER | Feature request: https://twitter.com/johankj/status/1182563563136868352
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
505674949 | MDU6SXNzdWU1MDU2NzQ5NDk= | 17 | import command should empty all archive-* tables first | simonw 9599 | closed | 0 | 2 | 2019-10-11T06:58:43Z | 2019-10-11T15:40:08Z | 2019-10-11T15:40:08Z | MEMBER | Can have a CLI option for NOT doing that. |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/17/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
505673645 | MDU6SXNzdWU1MDU2NzM2NDU= | 16 | Do a better job with archived direct message threads | simonw 9599 | open | 0 | 0 | 2019-10-11T06:55:21Z | 2019-10-11T06:55:27Z | MEMBER | twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/16/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
|||||||||
488835586 | MDU6SXNzdWU0ODg4MzU1ODY= | 4 | Command for importing data from a Twitter Export file | simonw 9599 | closed | 0 | 2 | 2019-09-03T21:34:13Z | 2019-10-11T06:45:02Z | 2019-10-11T06:45:02Z | MEMBER | Twitter lets you export all of your data as an archive file: https://twitter.com/settings/your_twitter_data A command for importing this data into SQLite would be extremely useful.
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
505666744 | MDExOlB1bGxSZXF1ZXN0MzI3MDUxNjcz | 15 | twitter-to-sqlite import command, refs #4 | simonw 9599 | closed | 0 | 0 | 2019-10-11T06:37:14Z | 2019-10-11T06:45:01Z | 2019-10-11T06:45:01Z | MEMBER | dogsheep/twitter-to-sqlite/pulls/15 | twitter-to-sqlite 206156866 | pull | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/15/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
0 | ||||||
503085013 | MDU6SXNzdWU1MDMwODUwMTM= | 13 | statuses-lookup command | simonw 9599 | closed | 0 | 1 | 2019-10-06T11:00:20Z | 2019-10-07T00:33:49Z | 2019-10-07T00:31:44Z | MEMBER | For bulk retrieving tweets by their ID. https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-lookup Rate limit is 900/15 minutes (1 call per second) but each call can pull up to 100 IDs, so we can pull 6,000 per minute. Should support |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/13/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
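The arithmetic in #13 (900 calls per 15-minute window at up to 100 IDs per call, i.e. one call per second and roughly 6,000 tweets per minute) implies batching and pacing. A sketch with a stub in place of the real statuses/lookup API call:

```python
import time


def chunks(items, size=100):
    for i in range(0, len(items), size):
        yield items[i : i + size]


def lookup_statuses(tweet_ids, fetch_batch, pause=1.0):
    """Fetch tweets 100 IDs at a time, pacing at ~1 call/second to stay under 900 per 15 minutes."""
    results = []
    for batch in chunks(list(tweet_ids), 100):
        results.extend(fetch_batch(batch))  # fetch_batch stands in for the real API call
        time.sleep(pause)
    return results


# Stubbed example: 250 IDs -> 3 calls, 250 results.
print(len(lookup_statuses(range(250), fetch_batch=list, pause=0)))
```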
503045221 | MDU6SXNzdWU1MDMwNDUyMjE= | 11 | Commands for recording real-time tweets from the streaming API | simonw 9599 | closed | 0 | 1 | 2019-10-06T03:09:30Z | 2019-10-06T04:54:17Z | 2019-10-06T04:48:31Z | MEMBER | https://developer.twitter.com/en/docs/tweets/filter-realtime/api-reference/post-statuses-filter We can support tracking keywords and following specific users. |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/11/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
491791152 | MDU6SXNzdWU0OTE3OTExNTI= | 9 | followers-ids and friends-ids subcommands | simonw 9599 | closed | 0 | 1 | 2019-09-10T16:58:15Z | 2019-09-10T17:36:55Z | 2019-09-10T17:36:55Z | MEMBER | These will import follower and friendship IDs into the following tables, using these APIs: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-followers-ids https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-ids |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/9/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
490798130 | MDU6SXNzdWU0OTA3OTgxMzA= | 7 | users-lookup command for fetching users | simonw 9599 | closed | 0 | 0 | 2019-09-08T19:47:59Z | 2019-09-08T20:32:13Z | 2019-09-08T20:32:13Z | MEMBER | https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-users-lookup
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/7/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
489419782 | MDU6SXNzdWU0ODk0MTk3ODI= | 6 | Extract extended_entities into a media table | simonw 9599 | closed | 0 | 0 | 2019-09-04T21:59:10Z | 2019-09-04T22:08:01Z | 2019-09-04T22:08:01Z | MEMBER | twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||||
488833136 | MDU6SXNzdWU0ODg4MzMxMzY= | 1 | Imported followers should go in "users", relationships in "following" | simonw 9599 | closed | 0 | 0 | 2019-09-03T21:27:37Z | 2019-09-04T20:23:04Z | 2019-09-04T20:23:04Z | MEMBER | Right now It should instead save them all in a global |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
488833698 | MDU6SXNzdWU0ODg4MzM2OTg= | 2 | "twitter-to-sqlite user-timeline" command for pulling tweets by a specific user | simonw 9599 | closed | 0 | 3 | 2019-09-03T21:29:12Z | 2019-09-04T20:02:11Z | 2019-09-04T20:02:11Z | MEMBER | Twitter only allows up to 3,200 tweets to be retrieved from https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-user_timeline.html I'm going to do:
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
488874815 | MDU6SXNzdWU0ODg4NzQ4MTU= | 5 | Write tests that simulate the Twitter API | simonw 9599 | open | 0 | 1 | 2019-09-03T23:55:35Z | 2019-09-03T23:56:28Z | MEMBER | I can use betamax for this: https://pypi.org/project/betamax/ |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT,
   [active_lock_reason] TEXT,
   [performed_via_github_app] TEXT,
   [reactions] TEXT,
   [draft] INTEGER,
   [state_reason] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);