issue_comments


8 rows where author_association = "MEMBER" and "created_at" is on date 2019-10-11 sorted by updated_at descending


issue 4

  • Command to import home-timeline 4
  • import command should empty all archive-* tables first 2
  • Command for importing data from a Twitter Export file 1
  • since_id support for home-timeline 1

user 1

  • simonw 8

author_association 1

  • MEMBER 8
id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions issue performed_via_github_app
541248629 https://github.com/dogsheep/twitter-to-sqlite/issues/19#issuecomment-541248629 https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19 MDEyOklzc3VlQ29tbWVudDU0MTI0ODYyOQ== simonw 9599 2019-10-11T22:48:56Z 2019-10-11T22:48:56Z MEMBER

since_id documented here: https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-home_timeline

Returns results with an ID greater than (that is, more recent than) the specified ID. There are limits to the number of Tweets which can be accessed through the API. If the limit of Tweets has occurred since the since_id, the since_id will be forced to the oldest ID available.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
since_id support for home-timeline 506087267  
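The since_id behaviour quoted above can be sketched as an incremental import: use the highest tweet id already stored as since_id so only newer tweets are fetched. This is a minimal sketch with a stand-in for the API call; the function and table names are assumptions, not the actual twitter-to-sqlite implementation.

```python
import sqlite3

def fetch_home_timeline(since_id=None):
    # Stand-in for GET statuses/home_timeline; the real API returns only
    # tweets with an ID greater than since_id (canned data here).
    all_tweets = [
        {"id": 1, "full_text": "a"},
        {"id": 2, "full_text": "b"},
        {"id": 3, "full_text": "c"},
    ]
    if since_id is None:
        return all_tweets
    return [t for t in all_tweets if t["id"] > since_id]

def incremental_import(db):
    # Use the highest stored tweet id as since_id so each run only
    # fetches tweets newer than what the database already holds.
    since_id = db.execute("select max(id) from tweets").fetchone()[0]
    new_tweets = fetch_home_timeline(since_id=since_id)
    db.executemany(
        "insert or replace into tweets (id, full_text) values (:id, :full_text)",
        new_tweets,
    )
    db.commit()
    return len(new_tweets)

db = sqlite3.connect(":memory:")
db.execute("create table tweets (id integer primary key, full_text text)")
db.execute("insert into tweets (id, full_text) values (1, 'a')")
imported = incremental_import(db)  # only ids 2 and 3 are newer than 1
```

Because tweet IDs increase over time, `max(id)` doubles as a high-water mark and the import stays idempotent.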
541119834 https://github.com/dogsheep/twitter-to-sqlite/issues/18#issuecomment-541119834 https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18 MDEyOklzc3VlQ29tbWVudDU0MTExOTgzNA== simonw 9599 2019-10-11T15:51:22Z 2019-10-11T16:51:33Z MEMBER

In order to support multiple user timelines being saved in the same database, I'm going to import the tweets into the tweets table AND add a new timeline_tweets table recording that a specific tweet showed up in a specific user's timeline.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Command to import home-timeline 505928530  
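The shared-tweets-plus-join-table design described in the comment above can be sketched like this. The schema is an assumption for illustration, not the actual twitter-to-sqlite schema: a single tweets table, plus a timeline_tweets table recording that a specific tweet showed up in a specific user's timeline.

```python
import sqlite3

db = sqlite3.connect(":memory:")
# One shared tweets table; timeline_tweets is a many-to-many join table
# keyed on (user, tweet) so the same tweet can appear in several timelines.
db.executescript("""
create table tweets (id integer primary key, full_text text);
create table timeline_tweets (
    user integer,
    tweet integer references tweets(id),
    primary key (user, tweet)
);
""")

def save_timeline(db, user_id, tweets):
    for tweet in tweets:
        # Upsert the tweet itself into the shared table...
        db.execute(
            "insert or replace into tweets (id, full_text) values (?, ?)",
            (tweet["id"], tweet["full_text"]),
        )
        # ...and record that it appeared in this user's timeline.
        db.execute(
            "insert or ignore into timeline_tweets (user, tweet) values (?, ?)",
            (user_id, tweet["id"]),
        )
    db.commit()

# The same tweet seen in two different users' timelines is stored once,
# with two rows in the join table.
save_timeline(db, 9599, [{"id": 1, "full_text": "hello"}])
save_timeline(db, 12345, [{"id": 1, "full_text": "hello"}])
timeline_rows = db.execute("select count(*) from timeline_tweets").fetchone()[0]
tweet_rows = db.execute("select count(*) from tweets").fetchone()[0]
```

The compound primary key on (user, tweet) makes re-running the import safe: a tweet already recorded for a timeline is simply ignored.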
541141169 https://github.com/dogsheep/twitter-to-sqlite/issues/18#issuecomment-541141169 https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18 MDEyOklzc3VlQ29tbWVudDU0MTE0MTE2OQ== simonw 9599 2019-10-11T16:51:29Z 2019-10-11T16:51:29Z MEMBER

Documented here: https://github.com/dogsheep/twitter-to-sqlite/blob/master/README.md#retrieving-tweets-from-your-home-timeline

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Command to import home-timeline 505928530  
541118934 https://github.com/dogsheep/twitter-to-sqlite/issues/18#issuecomment-541118934 https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18 MDEyOklzc3VlQ29tbWVudDU0MTExODkzNA== simonw 9599 2019-10-11T15:48:54Z 2019-10-11T15:48:54Z MEMBER

Rate limit is tight: 15 requests every 15 mins!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Command to import home-timeline 505928530  
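A 15-requests-per-15-minutes limit works out to one call per minute. A paginated fetch can pace itself by sleeping between pages; this is a sketch (the real tool could also read the API's rate-limit response headers), with the sleep function injectable so the logic is testable without waiting.

```python
import time

RATE_LIMIT = 15            # requests allowed per window
WINDOW_SECONDS = 15 * 60   # the 15-minute window

def paced_fetch(pages, sleep=time.sleep):
    # Space requests evenly: 900s / 15 requests = one call every 60 seconds,
    # so a long paginated fetch never exceeds the rate limit.
    interval = WINDOW_SECONDS / RATE_LIMIT
    delays = []
    for i, page in enumerate(pages):
        if i > 0:
            sleep(interval)
            delays.append(interval)
        # ... fetch `page` from the API here ...
    return delays

# Inject a no-op sleep to exercise the pacing logic instantly.
delays = paced_fetch(["page1", "page2", "page3"], sleep=lambda s: None)
```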
541118773 https://github.com/dogsheep/twitter-to-sqlite/issues/18#issuecomment-541118773 https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18 MDEyOklzc3VlQ29tbWVudDU0MTExODc3Mw== simonw 9599 2019-10-11T15:48:31Z 2019-10-11T15:48:31Z MEMBER

https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-home_timeline

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Command to import home-timeline 505928530  
541112588 https://github.com/dogsheep/twitter-to-sqlite/issues/17#issuecomment-541112588 https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/17 MDEyOklzc3VlQ29tbWVudDU0MTExMjU4OA== simonw 9599 2019-10-11T15:31:30Z 2019-10-11T15:31:30Z MEMBER

No need for an option:

This command will delete and recreate all of your archive-* tables every time you run it. If this is not what you want, run the command against a fresh SQLite database rather than running it against one that already exists.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
import command should empty all archive-* tables first 505674949  
541112108 https://github.com/dogsheep/twitter-to-sqlite/issues/17#issuecomment-541112108 https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/17 MDEyOklzc3VlQ29tbWVudDU0MTExMjEwOA== simonw 9599 2019-10-11T15:30:15Z 2019-10-11T15:30:15Z MEMBER

It should delete the tables entirely. That way it will work even if the table schema has changed.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
import command should empty all archive-* tables first 505674949  
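Dropping the archive-* tables entirely, as the comments above describe, can be sketched by enumerating matching names in sqlite_master and dropping each one, so the import recreates them even if the schema has changed. This is a sketch of the behaviour, not the actual twitter-to-sqlite code.

```python
import sqlite3

def drop_archive_tables(db):
    # Find every table whose name starts with "archive-" and drop it,
    # so the subsequent import recreates them with a fresh schema.
    tables = [
        row[0]
        for row in db.execute(
            "select name from sqlite_master "
            "where type = 'table' and name like 'archive-%'"
        )
    ]
    for name in tables:
        # Table names cannot be bound parameters; quote them instead.
        db.execute('drop table "{}"'.format(name.replace('"', '""')))
    db.commit()
    return tables

db = sqlite3.connect(":memory:")
db.execute('create table "archive-tweet" (id integer primary key)')
db.execute('create table "archive-follower" (id integer primary key)')
db.execute("create table tweets (id integer primary key)")
dropped = drop_archive_tables(db)  # non-archive tables are left alone
```

Dropping rather than emptying the tables is what makes schema changes safe: `drop table` discards the old column definitions along with the rows.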
540879620 https://github.com/dogsheep/twitter-to-sqlite/issues/4#issuecomment-540879620 https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/4 MDEyOklzc3VlQ29tbWVudDU0MDg3OTYyMA== simonw 9599 2019-10-11T02:59:16Z 2019-10-11T02:59:16Z MEMBER

Also import ad preferences and all that other junk.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Command for importing data from a Twitter Export file 488835586  

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);
Powered by Datasette · Queries took 387.809ms · About: github-to-sqlite