
issue_comments


2 rows where author_association = "OWNER" and issue = 735852274 sorted by updated_at descending
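The filter above corresponds to a simple SQL query over the issue_comments table. A minimal sketch of running it with Python's sqlite3 module, assuming the github-to-sqlite export file is named github.db (the filename is an assumption):

import sqlite3

# Sketch: the query behind this page, per the filter description above.
# "github.db" is an assumed filename for the github-to-sqlite export.
conn = sqlite3.connect("github.db")
rows = conn.execute(
    """
    select * from issue_comments
    where author_association = 'OWNER' and issue = 735852274
    order by updated_at desc
    """
).fetchall()
print(len(rows))  # 2, per the row count above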


id: 721931504
html_url: https://github.com/simonw/datasette/issues/1082#issuecomment-721931504
issue_url: https://api.github.com/repos/simonw/datasette/issues/1082
node_id: MDEyOklzc3VlQ29tbWVudDcyMTkzMTUwNA==
user: simonw (9599)
created_at: 2020-11-04T19:32:47Z
updated_at: 2020-11-04T19:35:44Z
author_association: OWNER
body:

I wonder if setting a soft memory limit within Datasette would help here: https://www.sqlite.org/malloc.html#_setting_memory_usage_limits

If attempts are made to allocate more memory than specified by the soft heap limit, then SQLite will first attempt to free cache memory before continuing with the allocation request.

https://www.sqlite.org/pragma.html#pragma_soft_heap_limit

PRAGMA soft_heap_limit
PRAGMA soft_heap_limit = N

This pragma invokes the sqlite3_soft_heap_limit64() interface with the argument N, if N is specified and is a non-negative integer. The soft_heap_limit pragma always returns the same integer that would be returned by the sqlite3_soft_heap_limit64(-1) C-language function.
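
For illustration, a minimal sketch of setting this pragma from Python's sqlite3 module; the 256 MB cap and the github.db filename are assumptions, not values taken from the comment:

import sqlite3

conn = sqlite3.connect("github.db")  # assumed filename

# Cap SQLite's heap usage for this connection. SQLite will try to
# free cache memory before failing an allocation over this limit.
conn.execute("PRAGMA soft_heap_limit = {}".format(256 * 1024 * 1024))

# With no argument the pragma reports the current limit, matching
# sqlite3_soft_heap_limit64(-1) in the C API.
print(conn.execute("PRAGMA soft_heap_limit").fetchone()[0])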

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: DigitalOcean buildpack memory errors for large sqlite db? (735852274)
performed_via_github_app:

id: 721545090
html_url: https://github.com/simonw/datasette/issues/1082#issuecomment-721545090
issue_url: https://api.github.com/repos/simonw/datasette/issues/1082
node_id: MDEyOklzc3VlQ29tbWVudDcyMTU0NTA5MA==
user: simonw (9599)
created_at: 2020-11-04T06:47:15Z
updated_at: 2020-11-04T06:47:15Z
author_association: OWNER
body:

I've run into a similar problem with Google Cloud Run: beyond a certain database file size I find myself needing to run instances there with more RAM assigned to them.

I haven't yet figured out a method for estimating how much RAM is needed to successfully serve a database file of a given size; so far I've been using trial and error.

5GB is quite a big database file, so it doesn't surprise me that it may need a bigger instance. I recommend trying it on a DigitalOcean instance with 1GB or 2GB of RAM (their default is 512MB) to see if that works.

Let me know what you find out!

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: DigitalOcean buildpack memory errors for large sqlite db? (735852274)
performed_via_github_app:


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
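
The [user] and [issue] columns above are foreign keys. A minimal sketch of resolving [user] with a join, assuming the companion [users] table created by github-to-sqlite has a [login] column (that column name, and the github.db filename, are assumptions):

import sqlite3

conn = sqlite3.connect("github.db")  # assumed filename

# Join through the [user] foreign key declared in the schema above.
# users.login is an assumed column name on the companion table.
query = """
    select users.login, issue_comments.created_at, issue_comments.body
    from issue_comments
    join users on users.id = issue_comments.[user]
    where issue_comments.issue = 735852274
    order by issue_comments.updated_at desc
"""
for login, created_at, body in conn.execute(query):
    print(login, created_at, body[:60])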