issues: 1015646369
id: 1015646369
node_id: I_kwDOBm6k_c48iYih
number: 1480
title: Exceeding Cloud Run memory limits when deploying a 4.8G database
user: 110420
state: open
locked: 0
comments: 9
created_at: 2021-10-04T21:20:24Z
updated_at: 2022-10-07T04:39:10Z
author_association: CONTRIBUTOR
repo: 107914493
type: issue
reactions: { "url": "https://api.github.com/repos/simonw/datasette/issues/1480/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 }

body:

When I try to deploy a 4.8G SQLite database to Google Cloud Run, I get this error message:
Unfortunately, the maximum amount of memory that can be allocated to an instance is 8192M.

Naively profiling the memory usage of Datasette running with this database locally on my MacBook (using Activity Monitor) shows the following usage just after startup:
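That same startup measurement can be scripted instead of read off Activity Monitor. A minimal sketch, assuming psutil is installed, that datasette is on PATH, and with data.db as a stand-in name for the real 4.8G file:

```python
import subprocess
import time

import psutil

# Start Datasette against the database, the same way `datasette data.db` would.
proc = subprocess.Popen(["datasette", "serve", "data.db", "--port", "8001"])
try:
    time.sleep(5)  # give Datasette time to start up and open the database
    rss = psutil.Process(proc.pid).memory_info().rss
    print(f"resident memory after startup: {rss / 1024 ** 2:.1f} MiB")
finally:
    proc.terminate()
    proc.wait()
```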
I'm trying to understand whether a query or other operation runs during container deployment that causes memory use to be this large, and whether it can be avoided somehow. This is somewhat related to #1082, but on a different platform, so I decided to open a new issue.
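Two hedged notes on that question. On the deploy side, recent Datasette releases accept a --memory option on datasette publish cloudrun (for example --memory 8Gi) to request Cloud Run's ceiling explicitly, though with a 4.8G file that only buys headroom up to the 8192M cap. On the build side, datasette publish appears to bake a datasette inspect step into the image build, and inspect counts the rows in every table. The sketch below reproduces those counts locally (again using data.db as a stand-in path) so you can watch memory while they run and see whether that step alone accounts for the usage:

```python
import sqlite3

conn = sqlite3.connect("data.db")  # stand-in for the real 4.8G database
tables = [
    row[0]
    for row in conn.execute("select name from sqlite_master where type = 'table'")
]
for table in tables:
    # count(*) walks each table's entire b-tree, much as `datasette inspect` does;
    # watch the process's resident memory while this loop runs
    (count,) = conn.execute(f'select count(*) from "{table}"').fetchone()
    print(f"{table}: {count} rows")
conn.close()
```

If this loop alone reproduces the memory growth, precomputing the inspect data before deploying, rather than counting inside the container build, would be the thing to try.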