{"html_url": "https://github.com/simonw/datasette/issues/1082#issuecomment-721931504", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1082", "id": 721931504, "node_id": "MDEyOklzc3VlQ29tbWVudDcyMTkzMTUwNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-11-04T19:32:47Z", "updated_at": "2020-11-04T19:35:44Z", "author_association": "OWNER", "body": "I wonder if setting a soft memory limit within Datasette would help here: https://www.sqlite.org/malloc.html#_setting_memory_usage_limits \r\n\r\n> If attempts are made to allocate more memory than specified by the soft heap limit, then SQLite will first attempt to free cache memory before continuing with the allocation request.\r\n\r\nhttps://www.sqlite.org/pragma.html#pragma_soft_heap_limit\r\n\r\n> **PRAGMA soft_heap_limit**\r\n> **PRAGMA soft_heap_limit=N**\r\n> \r\n> This pragma invokes the [sqlite3_soft_heap_limit64()](https://www.sqlite.org/c3ref/hard_heap_limit64.html) interface with the argument N, if N is specified and is a non-negative integer. The soft_heap_limit pragma always returns the same integer that would be returned by the [sqlite3_soft_heap_limit64](https://www.sqlite.org/c3ref/hard_heap_limit64.html)(-1) C-language function.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 735852274, "label": "DigitalOcean buildpack memory errors for large sqlite db?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1082#issuecomment-721547177", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1082", "id": 721547177, "node_id": "MDEyOklzc3VlQ29tbWVudDcyMTU0NzE3Nw==", "user": {"value": 39538958, "label": "justmars"}, "created_at": "2020-11-04T06:52:30Z", "updated_at": "2020-11-04T06:53:16Z", "author_association": "NONE", "body": "I think I tried the same db size on the following scenarios in Digital Ocean:\r\n1. Basic ($5/month) with 512MB RAM\r\n2. Basic ($10/month) with 1GB RAM\r\n3. Pro ($12/month) with 1GB RAM\r\n\r\nAll such attempts conked out with \"out of memory\" errors", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 735852274, "label": "DigitalOcean buildpack memory errors for large sqlite db?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1082#issuecomment-721545090", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1082", "id": 721545090, "node_id": "MDEyOklzc3VlQ29tbWVudDcyMTU0NTA5MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-11-04T06:47:15Z", "updated_at": "2020-11-04T06:47:15Z", "author_association": "OWNER", "body": "I've run into a similar problem with Google Cloud Run: beyond a certain size of database file I find myself needing to run instances there with more RAM assigned to them.\r\n\r\nI haven't yet figured out a method to estimate the amount of RAM that will be needed to successfully serve a database file of a specific size- I've been using trial and error.\r\n\r\n5GB is quite a big database file, so it doesn't surprise me that it may need a bigger instance. 
---

**simonw** (owner) commented on 2020-11-04T19:32:47Z ([permalink](https://github.com/simonw/datasette/issues/1082#issuecomment-721931504)):

I wonder if setting a soft memory limit within Datasette would help here: https://www.sqlite.org/malloc.html#_setting_memory_usage_limits

> If attempts are made to allocate more memory than specified by the soft heap limit, then SQLite will first attempt to free cache memory before continuing with the allocation request.

https://www.sqlite.org/pragma.html#pragma_soft_heap_limit

> **PRAGMA soft_heap_limit**
> **PRAGMA soft_heap_limit=N**
>
> This pragma invokes the [sqlite3_soft_heap_limit64()](https://www.sqlite.org/c3ref/hard_heap_limit64.html) interface with the argument N, if N is specified and is a non-negative integer. The soft_heap_limit pragma always returns the same integer that would be returned by the [sqlite3_soft_heap_limit64](https://www.sqlite.org/c3ref/hard_heap_limit64.html)(-1) C-language function.
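For anyone who wants to experiment with that idea: Datasette's documented `prepare_connection()` plugin hook runs against each new SQLite connection, so a small plugin is a convenient place to issue the pragma. A minimal sketch (the 64MB figure is an arbitrary example, not a recommendation, and whether it prevents the OOM depends on the memory pressure actually coming from SQLite's cache):

```python
from datasette import hookimpl


@hookimpl
def prepare_connection(conn):
    # Ask SQLite to start freeing cache memory before heap usage passes
    # ~64MB. The limit set by sqlite3_soft_heap_limit64() is process-wide
    # and advisory, so re-issuing it per connection is harmless.
    conn.execute("PRAGMA soft_heap_limit = {}".format(64 * 1024 * 1024))
```

Dropped into a file such as `plugins/soft_heap_limit.py` (a hypothetical name) and loaded with `datasette mydatabase.db --plugins-dir=plugins/`, this would apply the limit to every connection Datasette opens.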