issue_comments: 481310295
html_url | issue_url | id | node_id | user | created_at | updated_at | author_association | body | reactions | issue | performed_via_github_app
---|---|---|---|---|---|---|---|---|---|---|---
https://github.com/simonw/datasette/issues/420#issuecomment-481310295 | https://api.github.com/repos/simonw/datasette/issues/420 | 481310295 | MDEyOklzc3VlQ29tbWVudDQ4MTMxMDI5NQ== | 9599 | 2019-04-09T15:50:52Z | 2019-04-09T15:50:52Z | OWNER | Efficient row counts are even more important for the … The row counts on those pages don't have to be precise, so one option is for me to calculate them and cache them occasionally. I could even have a dedicated thread which just does the counting? In #422 I've figured out a mechanism for getting accurate or lower-bound counts within a time limit (accurate if possible, lower-bound otherwise). | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | 421971339 |
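The "accurate if possible, lower-bound otherwise" mechanism described in the comment can be sketched with SQLite's progress handler, which lets a long-running `count(*)` be aborted after a time budget and replaced with a capped lower bound. This is a hypothetical helper under assumed parameters (`time_limit`, `max_rows`), not Datasette's actual implementation from #422:

```python
import sqlite3
import time


def count_with_limit(conn, table, time_limit=0.1, max_rows=10000):
    """Return (count, is_exact) for a table.

    Tries an exact count(*) first; if it exceeds time_limit seconds,
    SQLite aborts the query and we fall back to a lower bound capped
    at max_rows rows.
    """
    deadline = time.monotonic() + time_limit
    # A progress handler returning a truthy value interrupts the
    # current query, raising sqlite3.OperationalError.
    conn.set_progress_handler(
        lambda: 1 if time.monotonic() > deadline else 0, 1000
    )
    try:
        exact = conn.execute(f"select count(*) from [{table}]").fetchone()[0]
        return exact, True
    except sqlite3.OperationalError:
        # Interrupted: scan at most max_rows rows for a lower bound.
        lower = conn.execute(
            f"select count(*) from (select 1 from [{table}] limit {max_rows})"
        ).fetchone()[0]
        return lower, False
    finally:
        # Remove the handler so later queries run unrestricted.
        conn.set_progress_handler(None, 1000)
```

A cached-count scheme like the one proposed above could call this periodically (or from a dedicated counting thread) and store the result, serving the possibly stale number on the index pages.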