issue_comments: 1258864140
html_url | issue_url | id | node_id | user | created_at | updated_at | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
https://github.com/simonw/datasette/issues/526#issuecomment-1258864140 | https://api.github.com/repos/simonw/datasette/issues/526 | 1258864140 | IC_kwDOBm6k_c5LCL4M | 9599 | 2022-09-27T01:55:32Z | 2022-09-27T01:55:32Z | OWNER | That recursive query is a great example of the kind of thing having a maximum row limit protects against. Imagine if Datasette CSVs did allow unlimited retrievals. Someone could hit the CSV endpoint for that recursive query and tie up Datasette's SQL connection effectively forever. Even if this feature becomes a permission-guarded thing, we still need to take that case into account. At the very least it would be good if the query could be cancelled if the client disconnects - so if someone accidentally starts an infinite query they can cancel the request and free up server resources. It might be a good idea to implement a page that shows "currently running" queries and allows users with the right permission to terminate them from that page. Another option: a "limit of last resort" - either a very high row limit (10,000,000 perhaps) or even a time limit, saying that all queries will be cancelled if they take longer than thirty minutes or similar. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | 459882902 |  |
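
The mitigation ideas in the comment above - cancelling a query when the client disconnects, plus a time-based "limit of last resort" - map naturally onto `sqlite3.Connection.interrupt()` and the ASGI `http.disconnect` message. The following is a minimal sketch, not Datasette's actual implementation; the `execute_with_cancellation` helper, the bare `receive` callable, the database path argument and the 30-minute cap are assumptions made for illustration.

```python
# Sketch only: interrupt a long-running SQLite query when the ASGI client
# disconnects, with a hard time limit as a "limit of last resort".
import asyncio
import sqlite3

QUERY_TIME_LIMIT_SECONDS = 30 * 60  # assumed cap: cancel anything running longer


async def execute_with_cancellation(receive, database_path: str, sql: str):
    """Run `sql`, aborting it if the client goes away or the time limit is hit.

    `receive` is the ASGI receive callable for the current request.
    """
    # check_same_thread=False lets a worker thread execute the query while this
    # thread calls interrupt(); interrupt() itself is safe to call cross-thread.
    conn = sqlite3.connect(database_path, check_same_thread=False)
    loop = asyncio.get_running_loop()
    query_future = loop.run_in_executor(None, lambda: conn.execute(sql).fetchall())

    async def wait_for_disconnect():
        # ASGI servers deliver {"type": "http.disconnect"} once the client is gone
        while True:
            message = await receive()
            if message["type"] == "http.disconnect":
                return

    disconnect_task = asyncio.ensure_future(wait_for_disconnect())
    done, _pending = await asyncio.wait(
        {query_future, disconnect_task},
        timeout=QUERY_TIME_LIMIT_SECONDS,
        return_when=asyncio.FIRST_COMPLETED,
    )
    try:
        if query_future in done:
            return query_future.result()
        # Client disconnected or the time limit expired: abort the statement.
        # interrupt() makes the running query raise sqlite3.OperationalError
        # ("interrupted") inside the worker thread.
        conn.interrupt()
        try:
            await query_future
        except sqlite3.OperationalError:
            pass
        raise asyncio.CancelledError("query cancelled: disconnect or time limit")
    finally:
        disconnect_task.cancel()
        conn.close()
```

The same `interrupt()` primitive could back the suggested "currently running queries" page: keep a registry of live connections keyed by request, and let a suitably permissioned admin view call `interrupt()` on a selected entry.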