
issue_comments


6 rows where issue = 1095570074 and "updated_at" is on date 2022-01-08 sorted by updated_at descending

id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions issue performed_via_github_app
1008155916 https://github.com/simonw/sqlite-utils/issues/364#issuecomment-1008155916 https://api.github.com/repos/simonw/sqlite-utils/issues/364 IC_kwDOCGYnMM48Fz0M simonw 9599 2022-01-08T21:16:46Z 2022-01-08T21:16:46Z OWNER

No, `chunks()` seems to work OK in the test I just added.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`--batch-size 1` doesn't seem to commit for every item 1095570074  
1008154873 https://github.com/simonw/sqlite-utils/issues/364#issuecomment-1008154873 https://api.github.com/repos/simonw/sqlite-utils/issues/364 IC_kwDOCGYnMM48Fzj5 simonw 9599 2022-01-08T21:11:55Z 2022-01-08T21:11:55Z OWNER

I'm suspicious that the `chunks()` utility function may not be working correctly:

```pycon
In [10]: [list(d) for d in list(chunks('abc', 5))]
Out[10]: [['a'], ['b'], ['c']]

In [11]: [list(d) for d in list(chunks('abcdefghi', 5))]
Out[11]: [['a'], ['b'], ['c'], ['d'], ['e'], ['f'], ['g'], ['h'], ['i']]

In [12]: [list(d) for d in list(chunks('abcdefghi', 3))]
Out[12]: [['a'], ['b'], ['c'], ['d'], ['e'], ['f'], ['g'], ['h'], ['i']]
```
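The behaviour above is consistent with a lazy chunker being wrapped in `list()` before each chunk is consumed. The following is a sketch (an assumed `islice`-based implementation, not necessarily the one in sqlite-utils) that reproduces both the working and the broken call patterns:

```python
import itertools


def chunks(sequence, size):
    # Lazy chunker: every chunk shares the same underlying iterator.
    iterator = iter(sequence)
    for item in iterator:
        yield itertools.chain([item], itertools.islice(iterator, size - 1))


# Consuming each chunk as it is produced gives the expected groups:
print([list(c) for c in chunks('abcdefghi', 3)])
# → [['a', 'b', 'c'], ['d', 'e', 'f'], ['g', 'h', 'i']]

# Wrapping in list() first advances the outer loop without consuming
# the islice objects, so every chunk collapses to a single item:
print([list(c) for c in list(chunks('abcdefghi', 3))])
# → [['a'], ['b'], ['c'], ['d'], ['e'], ['f'], ['g'], ['h'], ['i']]
```

Under this assumption the REPL output above is exactly what you get when `list()` materializes the generator before any chunk is read.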

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`--batch-size 1` doesn't seem to commit for every item 1095570074  
1008153586 https://github.com/simonw/sqlite-utils/issues/364#issuecomment-1008153586 https://api.github.com/repos/simonw/sqlite-utils/issues/364 IC_kwDOCGYnMM48FzPy simonw 9599 2022-01-08T21:06:15Z 2022-01-08T21:06:15Z OWNER

I added a print statement after `for query, params in queries_and_params` and confirmed that something in the code waits until 16 records are available and then executes the inserts, even with `--batch-size 1`.
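The intended behaviour, where a commit happens after every batch so that `--batch-size 1` commits per record, can be sketched like this (a minimal illustration using a hypothetical `items` table, not sqlite-utils' actual insert code):

```python
import sqlite3
from itertools import islice


def insert_in_batches(conn, rows, batch_size):
    # Insert rows batch by batch, committing after each batch so that
    # with batch_size=1 every single record is committed immediately.
    it = iter(rows)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            break
        conn.executemany(
            "INSERT INTO items (value) VALUES (?)", [(v,) for v in batch]
        )
        conn.commit()


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (value TEXT)")
insert_in_batches(conn, ["a", "b", "c"], batch_size=1)
print(conn.execute("SELECT count(*) FROM items").fetchone()[0])
# → 3
```

The bug described above would correspond to the loop accumulating 16 rows before the `executemany()`/`commit()` pair runs, regardless of the requested batch size.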

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`--batch-size 1` doesn't seem to commit for every item 1095570074  
1008151884 https://github.com/simonw/sqlite-utils/issues/364#issuecomment-1008151884 https://api.github.com/repos/simonw/sqlite-utils/issues/364 IC_kwDOCGYnMM48Fy1M simonw 9599 2022-01-08T20:59:21Z 2022-01-08T20:59:21Z OWNER

(That Heroku example doesn't record the timestamp, which limits its usefulness)

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`--batch-size 1` doesn't seem to commit for every item 1095570074  
1008143248 https://github.com/simonw/sqlite-utils/issues/364#issuecomment-1008143248 https://api.github.com/repos/simonw/sqlite-utils/issues/364 IC_kwDOCGYnMM48FwuQ simonw 9599 2022-01-08T20:34:12Z 2022-01-08T20:34:12Z OWNER

Built that tool: https://github.com/simonw/stream-delay and https://pypi.org/project/stream-delay/

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`--batch-size 1` doesn't seem to commit for every item 1095570074  
1008129841 https://github.com/simonw/sqlite-utils/issues/364#issuecomment-1008129841 https://api.github.com/repos/simonw/sqlite-utils/issues/364 IC_kwDOCGYnMM48Ftcx simonw 9599 2022-01-08T20:04:42Z 2022-01-08T20:04:42Z OWNER

It would be easier to test this if I had a utility for streaming out a file one line at a time.

There are a few recipes for this in https://superuser.com/questions/526242/cat-file-to-terminal-at-particular-speed-of-lines-per-second but I'm going to build a quick `stream-delay` tool instead.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`--batch-size 1` doesn't seem to commit for every item 1095570074  


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);
Powered by Datasette · About: github-to-sqlite