issue_comments
4 rows where author_association = "OWNER" and issue = 1578790070 sorted by updated_at descending
| id | html_url | issue_url | node_id | user | created_at | updated_at ▲ | author_association | body | reactions | issue | performed_via_github_app |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| 1539051724 | https://github.com/simonw/sqlite-utils/issues/527#issuecomment-1539051724 | https://api.github.com/repos/simonw/sqlite-utils/issues/527 | IC_kwDOCGYnMM5bvBDM | simonw 9599 | 2023-05-08T21:07:04Z | 2023-05-08T21:07:04Z | OWNER |  | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | `Table.convert()` skips falsey values 1578790070 |  |
| 1539035838 | https://github.com/simonw/sqlite-utils/issues/527#issuecomment-1539035838 | https://api.github.com/repos/simonw/sqlite-utils/issues/527 | IC_kwDOCGYnMM5bu9K- | simonw 9599 | 2023-05-08T20:55:00Z | 2023-05-08T20:55:00Z | OWNER | I'm going to go with […] | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | `Table.convert()` skips falsey values 1578790070 |  |
| 1539033736 | https://github.com/simonw/sqlite-utils/issues/527#issuecomment-1539033736 | https://api.github.com/repos/simonw/sqlite-utils/issues/527 | IC_kwDOCGYnMM5bu8qI | simonw 9599 | 2023-05-08T20:52:51Z | 2023-05-08T20:52:51Z | OWNER | OK, I implemented that at the Python API level. I need to decide how it should work for the […] | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | `Table.convert()` skips falsey values 1578790070 |  |
| 1506223848 | https://github.com/simonw/sqlite-utils/issues/527#issuecomment-1506223848 | https://api.github.com/repos/simonw/sqlite-utils/issues/527 | IC_kwDOCGYnMM5Zxybo | simonw 9599 | 2023-04-13T02:08:16Z | 2023-04-13T02:08:16Z | OWNER | I agree, this is a design flaw. It's technically a breaking change. As such, I would need to bump to v4 to responsibly release this. I'd rather bundle in a few more breaking changes before shipping that version. One thing we could do here is add an extra argument to […] This would trigger the new, improved behaviour without breaking existing code that depends on how it works at the moment. Then in […] What do you think? (I'm open to suggestions for better names for this parameter too) | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | `Table.convert()` skips falsey values 1578790070 |  |
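The behaviour these comments discuss is easy to reproduce with sqlite-utils' documented Python API. The sketch below is illustrative only: the table name and rows are invented, and it deliberately omits the extra opt-in argument proposed in the 2023-04-13 comment, since that argument's name is truncated above.

```python
# Minimal sketch of the falsey-skipping behaviour discussed in the comments
# above, using sqlite-utils' documented Python API (Database, insert_all,
# Table.convert). The table name and rows are invented for illustration.
import sqlite_utils

db = sqlite_utils.Database(memory=True)
db["example"].insert_all(
    [
        {"id": 1, "value": "hello"},
        {"id": 2, "value": ""},    # falsey
        {"id": 3, "value": None},  # falsey
    ],
    pk="id",
)

# convert() applies the function to each value in the column, but falsey
# values ("" and None here) are skipped before the function is called,
# which is the design flaw the issue describes.
db["example"].convert("value", lambda v: v.upper())

print(list(db["example"].rows))
# Expected: only row 1 becomes "HELLO"; rows 2 and 3 are left unchanged.
```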
Table schema:

    CREATE TABLE [issue_comments] (
        [html_url] TEXT,
        [issue_url] TEXT,
        [id] INTEGER PRIMARY KEY,
        [node_id] TEXT,
        [user] INTEGER REFERENCES [users]([id]),
        [created_at] TEXT,
        [updated_at] TEXT,
        [author_association] TEXT,
        [body] TEXT,
        [reactions] TEXT,
        [issue] INTEGER REFERENCES [issues]([id]),
        [performed_via_github_app] TEXT
    );
    CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
    CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
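The row selection described at the top of this page (4 rows where author_association = "OWNER" and issue = 1578790070, sorted by updated_at descending) corresponds to a straightforward query against this schema. The sketch below assumes a local copy of the database; the `github.db` filename is hypothetical.

```python
# Sketch of the query behind this page, assuming the issue_comments schema
# above and a hypothetical local copy of the database named github.db.
import sqlite_utils

db = sqlite_utils.Database("github.db")  # hypothetical filename
rows = db.execute(
    """
    select id, updated_at, body
    from issue_comments
    where author_association = 'OWNER' and issue = 1578790070
    order by updated_at desc
    """
).fetchall()
for row in rows:
    print(row)
```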