issue_comments: 1258860845


html_url: https://github.com/simonw/datasette/issues/526#issuecomment-1258860845
issue_url: https://api.github.com/repos/simonw/datasette/issues/526
id: 1258860845
node_id: IC_kwDOBm6k_c5LCLEt
user: 9599
created_at: 2022-09-27T01:48:31Z
updated_at: 2022-09-27T01:50:01Z
author_association: OWNER

The protection is supposed to be from this line:

```python
rows = cursor.fetchmany(max_returned_rows + 1)
```

By capping the call to `.fetchmany()` at `max_returned_rows + 1` (the `+ 1` is to allow detection of whether or not there is a next page) I'm ensuring that Datasette never attempts to iterate over a huge result set.
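A minimal sketch of that pattern, assuming a hypothetical table `t` and an arbitrary `max_returned_rows` value (not Datasette's actual code):

```python
import sqlite3

max_returned_rows = 100

conn = sqlite3.connect(":memory:")
conn.execute("create table t (x integer)")
conn.executemany("insert into t values (?)", [(i,) for i in range(250)])

cursor = conn.execute("select x from t")
# Ask for one more row than we are willing to return.
rows = cursor.fetchmany(max_returned_rows + 1)

# Getting max_returned_rows + 1 rows back means there is at least one
# more page; trim the extra row before returning results.
truncated = len(rows) > max_returned_rows
rows = rows[:max_returned_rows]
print(truncated, len(rows))  # True 100
```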

SQLite and the sqlite3 library seem to handle this correctly. Here's an example:

```pycon
>>> import sqlite3
>>> conn = sqlite3.connect(":memory:")
>>> cursor = conn.execute("""
... with recursive counter(x) as (
...   select 0
...   union
...   select x + 1 from counter
... )
... select * from counter""")
>>> cursor.fetchmany(10)
[(0,), (1,), (2,), (3,), (4,), (5,), (6,), (7,), (8,), (9,)]
```

`counter` there is an infinitely long table (see TIL) - but we can retrieve the first 10 results without going into an infinite loop.
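The same `+ 1` cap makes even that infinite query safe to paginate - a sketch, with `max_returned_rows` set to an arbitrarily small value for illustration:

```python
import sqlite3

max_returned_rows = 5

conn = sqlite3.connect(":memory:")
cursor = conn.execute("""
with recursive counter(x) as (
  select 0
  union
  select x + 1 from counter
)
select * from counter""")

# fetchmany() pulls at most max_returned_rows + 1 rows on demand, so the
# infinite result set is never fully iterated.
rows = cursor.fetchmany(max_returned_rows + 1)
truncated = len(rows) > max_returned_rows
print(rows[:max_returned_rows], truncated)
# [(0,), (1,), (2,), (3,), (4,)] True
```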
