
issue_comments

12 rows where "created_at" is on date 2019-10-16 and user = 9599, sorted by updated_at descending

Suggested facets: issue_url, created_at (date), updated_at (date)

issue (6)
  • Option to fetch only checkins more recent than the current max checkin · 3
  • Extract "source" into a separate lookup table · 3
  • since_id support for home-timeline · 2
  • If you have databases called foo.db and foo-bar.db you cannot visit /foo-bar · 2
  • --since support for various commands for refresh-by-cron · 1
  • Handle really wide tables better · 1

author_association (2)
  • MEMBER · 9
  • OWNER · 3

user (1)
  • simonw · 12
Columns: id · html_url · issue_url · node_id · user · created_at · updated_at (sort: descending) · author_association · body · reactions · issue · performed_via_github_app (empty in all 12 rows)
id: 542882604
html_url: https://github.com/dogsheep/swarm-to-sqlite/issues/3#issuecomment-542882604
issue_url: https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3
node_id: MDEyOklzc3VlQ29tbWVudDU0Mjg4MjYwNA==
user: simonw (9599)
created_at: 2019-10-16T20:41:23Z
updated_at: 2019-10-16T20:41:23Z
author_association: MEMBER
body:

Documented here: https://github.com/dogsheep/swarm-to-sqlite/blob/0.2/README.md#usage

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Option to fetch only checkins more recent than the current max checkin (487600595)

id: 542876047
html_url: https://github.com/dogsheep/swarm-to-sqlite/issues/3#issuecomment-542876047
issue_url: https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3
node_id: MDEyOklzc3VlQ29tbWVudDU0Mjg3NjA0Nw==
user: simonw (9599)
created_at: 2019-10-16T20:23:36Z
updated_at: 2019-10-16T20:23:36Z
author_association: MEMBER
body:

I'm going to go with --since=1d/2w/3h for this.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Option to fetch only checkins more recent than the current max checkin (487600595)

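A note on the shorthand in that comment: values like 1d, 2w and 3h imply a small duration parser. A minimal sketch of how they could be interpreted, assuming hours/days/weeks are the only units (the parse_since name is hypothetical, not part of swarm-to-sqlite):

import re
from datetime import timedelta

# Hypothetical helper: turn "3h", "1d" or "2w" into a timedelta.
UNITS = {"h": "hours", "d": "days", "w": "weeks"}

def parse_since(value):
    match = re.fullmatch(r"(\d+)([hdw])", value)
    if match is None:
        raise ValueError("expected a value like 1d, 2w or 3h")
    count, unit = match.groups()
    return timedelta(**{UNITS[unit]: int(count)})

assert parse_since("2w") == timedelta(weeks=2)
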
id: 542875885
html_url: https://github.com/dogsheep/swarm-to-sqlite/issues/3#issuecomment-542875885
issue_url: https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3
node_id: MDEyOklzc3VlQ29tbWVudDU0Mjg3NTg4NQ==
user: simonw (9599)
created_at: 2019-10-16T20:23:08Z
updated_at: 2019-10-16T20:23:08Z
author_association: MEMBER
body:

https://developer.foursquare.com/docs/api/users/checkins documents afterTimestamp:

Retrieve the first results to follow these seconds since epoch. This should be useful for paging forward in time, or when polling for changes. To avoid missing results when polling, we recommend subtracting several seconds from the last poll time and then de-duplicating.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Option to fetch only checkins more recent than the current max checkin (487600595)

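Putting the Foursquare advice into practice would mean passing the stored maximum checkin timestamp, minus a safety buffer, as afterTimestamp, and relying on checkin IDs for de-duplication. A rough sketch, with the table and column names assumed rather than taken from swarm-to-sqlite:

import sqlite3

BUFFER_SECONDS = 30  # subtract a little, per the docs, to avoid missed results

def after_timestamp(db_path):
    # Assumed schema: a checkins table with a createdAt seconds-since-epoch column.
    conn = sqlite3.connect(db_path)
    (latest,) = conn.execute("select max(createdAt) from checkins").fetchone()
    conn.close()
    if latest is None:
        return None  # empty table: fetch everything
    return latest - BUFFER_SECONDS

Inserting the fetched checkins keyed on their checkin ID then makes the deliberate overlap harmless.
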
id: 542872388
html_url: https://github.com/simonw/datasette/issues/597#issuecomment-542872388
issue_url: https://api.github.com/repos/simonw/datasette/issues/597
node_id: MDEyOklzc3VlQ29tbWVudDU0Mjg3MjM4OA==
user: simonw (9599)
created_at: 2019-10-16T20:13:38Z
updated_at: 2019-10-16T20:13:38Z
author_association: OWNER
body:

I encountered this bug in my own private instance of Datasette running behind an nginx proxy.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: If you have databases called foo.db and foo-bar.db you cannot visit /foo-bar (508070977)

id: 542872267
html_url: https://github.com/simonw/datasette/issues/597#issuecomment-542872267
issue_url: https://api.github.com/repos/simonw/datasette/issues/597
node_id: MDEyOklzc3VlQ29tbWVudDU0Mjg3MjI2Nw==
user: simonw (9599)
created_at: 2019-10-16T20:13:21Z
updated_at: 2019-10-16T20:13:21Z
author_association: OWNER
body:

$ echo '{"hello": "world"}' | sqlite-utils insert foo.db hello -
$ echo '{"hello": "world"}' | sqlite-utils insert foo-bar.db hello -
$ datasette publish now \
    --about_url=https://github.com/simonw/datasette/issues/597 \
    --title="Issue #597" \
    --alias=datasette-issue-597 \
    foo.db foo-bar.db

This failed to replicate the issue. https://datasette-issue-597.now.sh/ is working as it should.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: If you have databases called foo.db and foo-bar.db you cannot visit /foo-bar (508070977)

id: 542858025
html_url: https://github.com/dogsheep/twitter-to-sqlite/issues/12#issuecomment-542858025
issue_url: https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12
node_id: MDEyOklzc3VlQ29tbWVudDU0Mjg1ODAyNQ==
user: simonw (9599)
created_at: 2019-10-16T19:35:31Z
updated_at: 2019-10-16T19:36:09Z
author_association: MEMBER
body:

Maybe this means I need an upgrade command to apply these kinds of migrations? Total feature creep!

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Extract "source" into a separate lookup table 503053800  
id: 542855427
html_url: https://github.com/dogsheep/twitter-to-sqlite/issues/12#issuecomment-542855427
issue_url: https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12
node_id: MDEyOklzc3VlQ29tbWVudDU0Mjg1NTQyNw==
user: simonw (9599)
created_at: 2019-10-16T19:27:55Z
updated_at: 2019-10-16T19:27:55Z
author_association: MEMBER
body:

I can do that by keeping source as a TEXT column but turning it into a non-enforced foreign key against a new sources table. Then I can run code that scans that column for any values beginning with a < and converts them.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Extract "source" into a separate lookup table (503053800)

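That scan-and-convert pass could look something like the following sketch; the regex, the tweets table name, and the choice of URL as the lookup key are assumptions for illustration, not the actual implementation:

import re
import sqlite3

# Source values beginning with "<" are HTML anchors like
# <a href="...">Twitter Web Client</a>
ANCHOR = re.compile(r'<a href="(?P<url>[^"]+)"[^>]*>(?P<name>.*)</a>')

def extract_sources(db_path):
    conn = sqlite3.connect(db_path)
    conn.execute("create table if not exists sources (url text primary key, name text)")
    rows = conn.execute(
        "select distinct source from tweets where source like '<%'"
    ).fetchall()
    for (source,) in rows:
        match = ANCHOR.match(source)
        if match is None:
            continue
        conn.execute(
            "insert or ignore into sources (url, name) values (?, ?)",
            (match.group("url"), match.group("name")),
        )
        # source stays TEXT, now acting as a non-enforced foreign key
        conn.execute(
            "update tweets set source = ? where source = ?",
            (match.group("url"), source),
        )
    conn.commit()
    conn.close()
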
id: 542855081
html_url: https://github.com/dogsheep/twitter-to-sqlite/issues/12#issuecomment-542855081
issue_url: https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12
node_id: MDEyOklzc3VlQ29tbWVudDU0Mjg1NTA4MQ==
user: simonw (9599)
created_at: 2019-10-16T19:26:56Z
updated_at: 2019-10-16T19:26:56Z
author_association: MEMBER
body:

This may be the first case where I want to be able to repair existing databases rather than discarding their contents.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Extract "source" into a separate lookup table (503053800)

id: 542854749
html_url: https://github.com/dogsheep/twitter-to-sqlite/issues/20#issuecomment-542854749
issue_url: https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20
node_id: MDEyOklzc3VlQ29tbWVudDU0Mjg1NDc0OQ==
user: simonw (9599)
created_at: 2019-10-16T19:26:01Z
updated_at: 2019-10-16T19:26:01Z
author_association: MEMBER
body:

I'm not going to do this for "accounts that have followed me" or "new accounts that I have followed". Instead I will recommend running the friend_ids and followers_ids commands on a daily basis, since that data doesn't really change much by the hour.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: --since support for various commands for refresh-by-cron (506268945)

id: 542849963
html_url: https://github.com/dogsheep/twitter-to-sqlite/issues/19#issuecomment-542849963
issue_url: https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19
node_id: MDEyOklzc3VlQ29tbWVudDU0Mjg0OTk2Mw==
user: simonw (9599)
created_at: 2019-10-16T19:13:06Z
updated_at: 2019-10-16T19:13:06Z
author_association: MEMBER
body:

Updated documentation: https://github.com/dogsheep/twitter-to-sqlite/blob/fced2a9b67d2cbdf9817f1eb75f7c28e413c963b/README.md#retrieving-tweets-from-your-home-timeline

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: since_id support for home-timeline (506087267)

id: 542832952
html_url: https://github.com/dogsheep/twitter-to-sqlite/issues/19#issuecomment-542832952
issue_url: https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19
node_id: MDEyOklzc3VlQ29tbWVudDU0MjgzMjk1Mg==
user: simonw (9599)
created_at: 2019-10-16T18:30:11Z
updated_at: 2019-10-16T18:30:11Z
author_association: MEMBER
body:

The --since option will derive the since_id from the max ID in the timeline_tweets table:

$ twitter-to-sqlite home-timeline --since

The --since_id=xxx option lets you specify that ID directly.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: since_id support for home-timeline (506087267)

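Deriving since_id from the existing data is a single aggregate query; a sketch of the idea (not the actual command code):

import sqlite3

def max_since_id(db_path):
    # --since uses the highest tweet ID already present in timeline_tweets;
    # --since_id=xxx would bypass this lookup entirely.
    conn = sqlite3.connect(db_path)
    (since_id,) = conn.execute("select max(id) from timeline_tweets").fetchone()
    conn.close()
    return since_id  # None for an empty table, i.e. fetch everything
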
id: 542462126
html_url: https://github.com/simonw/datasette/issues/596#issuecomment-542462126
issue_url: https://api.github.com/repos/simonw/datasette/issues/596
node_id: MDEyOklzc3VlQ29tbWVudDU0MjQ2MjEyNg==
user: simonw (9599)
created_at: 2019-10-16T00:45:45Z
updated_at: 2019-10-16T00:45:45Z
author_association: OWNER
body:

This means moving away from select *. I've been thinking this would be worthwhile anyway, since that way when you click "Edit SQL" you'll get a more useful SQL statement to start hacking away at.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Handle really wide tables better (507454958)

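Expanding select * into an explicit column list only needs PRAGMA table_info; a minimal sketch of the idea:

import sqlite3

def explicit_select(db_path, table):
    # Builds the named-column SELECT that "Edit SQL" would then display
    # instead of "select *".
    conn = sqlite3.connect(db_path)
    columns = [row[1] for row in conn.execute(f"PRAGMA table_info([{table}])")]
    conn.close()
    return "select {} from [{}]".format(
        ", ".join(f"[{col}]" for col in columns), table
    )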

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);
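
Since reactions is stored as a JSON string in a TEXT column (the zero-count blobs above), reading a count back out is a decode away; a small sketch:

import json
import sqlite3

def total_reactions(db_path, comment_id):
    conn = sqlite3.connect(db_path)
    (blob,) = conn.execute(
        "select reactions from issue_comments where id = ?", (comment_id,)
    ).fetchone()
    conn.close()
    return json.loads(blob)["total_count"]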