{"html_url": "https://github.com/simonw/datasette/issues/1480#issuecomment-1271101072", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1480", "id": 1271101072, "node_id": "IC_kwDOBm6k_c5Lw3aQ", "user": {"value": 536941, "label": "fgregg"}, "created_at": "2022-10-07T04:39:10Z", "updated_at": "2022-10-07T04:39:10Z", "author_association": "CONTRIBUTOR", "body": "switching from `immutable=1` to `mode=ro` completely addressed this. see https://github.com/simonw/datasette/issues/1836#issuecomment-1271100651 for details.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1015646369, "label": "Exceeding Cloud Run memory limits when deploying a 4.8G database"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1480#issuecomment-1269847461", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1480", "id": 1269847461, "node_id": "IC_kwDOBm6k_c5LsFWl", "user": {"value": 536941, "label": "fgregg"}, "created_at": "2022-10-06T11:21:49Z", "updated_at": "2022-10-06T11:21:49Z", "author_association": "CONTRIBUTOR", "body": "thanks @simonw, i'll spend a little more time trying to figure out why this isn't working on cloudrun, and then will flip over to fly if i can't.\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1015646369, "label": "Exceeding Cloud Run memory limits when deploying a 4.8G database"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1480#issuecomment-1269275153", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1480", "id": 1269275153, "node_id": "IC_kwDOBm6k_c5Lp5oR", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-10-06T03:54:33Z", "updated_at": "2022-10-06T03:54:33Z", "author_association": "OWNER", "body": "I've been having success using Fly recently for a project which I thought would be too large for Cloud Run. 
{"html_url": "https://github.com/simonw/datasette/issues/1480#issuecomment-1268629159", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1480", "id": 1268629159, "node_id": "IC_kwDOBm6k_c5Lnb6n", "user": {"value": 536941, "label": "fgregg"}, "created_at": "2022-10-05T16:00:55Z", "updated_at": "2022-10-05T16:00:55Z", "author_association": "CONTRIBUTOR", "body": "as a next step, i'll fetch the docker image from the google registry, and see what memory and disk usage looks like when i run it locally.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1015646369, "label": "Exceeding Cloud Run memory limits when deploying a 4.8G database"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1480#issuecomment-1268613335", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1480", "id": 1268613335, "node_id": "IC_kwDOBm6k_c5LnYDX", "user": {"value": 536941, "label": "fgregg"}, "created_at": "2022-10-05T15:45:49Z", "updated_at": "2022-10-05T15:45:49Z", "author_association": "CONTRIBUTOR", "body": "running into this as i continue to grow my labor data warehouse.\r\n\r\nHere, a Cloud Run PM says the container size should **not** count against memory: https://stackoverflow.com/a/56570717", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1015646369, "label": "Exceeding Cloud Run memory limits when deploying a 4.8G database"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1480#issuecomment-947203725", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1480", "id": 947203725, "node_id": "IC_kwDOBm6k_c44dS6N", "user": {"value": 110420, "label": "ghing"}, "created_at": "2021-10-20T00:21:54Z", "updated_at": "2021-10-20T00:21:54Z", "author_association": "CONTRIBUTOR", "body": "This StackOverflow post, [sqlite - Cloud Run: Why does my instance need so much RAM?](https://stackoverflow.com/questions/59812405/cloud-run-why-does-my-instance-need-so-much-ram), points to [this section of the Cloud Run docs](https://cloud.google.com/run/docs/troubleshooting) that says:\r\n\r\n> Note that the Cloud Run container instances run in an environment where the files written to the local filesystem count towards the available memory. This also includes any log files that are not written to /var/log/* or /dev/log.\r\n\r\nDoes datasette write any large files when starting?\r\n\r\nOr does the [`COPY` command in the Dockerfile](https://github.com/simonw/datasette/blob/main/datasette/utils/__init__.py#L349) count as writing to the local filesystem?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1015646369, "label": "Exceeding Cloud Run memory limits when deploying a 4.8G database"}, "performed_via_github_app": null}
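One way to carry out the local experiment described above, sketched with a hypothetical image path; `--memory` reproduces the Cloud Run ceiling so an out-of-memory failure surfaces locally the same way it would in production:

```bash
# Hypothetical project/image path -- substitute the image that
# `datasette publish cloudrun` pushed for you.
IMAGE=gcr.io/my-project/datasette

docker pull "$IMAGE"

# Cap memory at the Cloud Run limit (e.g. 4 GiB) and expose the port
# the image expects (Cloud Run images read $PORT).
docker run --rm -e PORT=8080 -p 8080:8080 --memory=4g "$IMAGE"

# In a second terminal: live memory usage per container.
docker stats
```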
{"html_url": "https://github.com/simonw/datasette/issues/1480#issuecomment-947196177", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1480", "id": 947196177, "node_id": "IC_kwDOBm6k_c44dRER", "user": {"value": 110420, "label": "ghing"}, "created_at": "2021-10-20T00:05:10Z", "updated_at": "2021-10-20T00:05:10Z", "author_association": "CONTRIBUTOR", "body": "I was looking through the Dockerfile-generation code to see if there was anything that would cause high memory usage during deployment.\r\n\r\nI noticed that the Dockerfile [runs `datasette --inspect`](https://github.com/simonw/datasette/blob/main/datasette/utils/__init__.py#L354). Is it possible that this is using a lot of memory?\r\n\r\nOr would that come into play when running `gcloud builds submit`, not when it's actually deployed?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1015646369, "label": "Exceeding Cloud Run memory limits when deploying a 4.8G database"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1480#issuecomment-938171377", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1480", "id": 938171377, "node_id": "IC_kwDOBm6k_c4361vx", "user": {"value": 110420, "label": "ghing"}, "created_at": "2021-10-07T21:33:12Z", "updated_at": "2021-10-07T21:33:12Z", "author_association": "CONTRIBUTOR", "body": "Thanks for the reply @simonw. What services have you had better success with than Cloud Run for larger databases?\r\n\r\nAlso, what about my issue description makes you think there may be a workaround?\r\n\r\nIs there any instrumentation I could add to see at which point in the deploy the memory usage spikes? Should I be able to see this when it's running under Docker locally, or do you suspect this is Cloud Run-specific?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1015646369, "label": "Exceeding Cloud Run memory limits when deploying a 4.8G database"}, "performed_via_github_app": null}
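On the inspect question above: a `RUN` line in a Dockerfile executes while the image is being built, i.e. during `gcloud builds submit`, not in the deployed container, so that step's memory cost lands on the build machine rather than on Cloud Run. One way to gauge the inspect step's footprint directly is sketched below; it assumes GNU time on Linux (`/usr/bin/time`, not the shell builtin) and a placeholder `data.db`:

```bash
# -v makes GNU time print a detailed report, including peak memory
# as "Maximum resident set size (kbytes)".
/usr/bin/time -v datasette inspect data.db > inspect-data.json
```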
{"html_url": "https://github.com/simonw/datasette/issues/1480#issuecomment-938134038", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1480", "id": 938134038, "node_id": "IC_kwDOBm6k_c436soW", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-10-07T20:31:46Z", "updated_at": "2021-10-07T20:31:46Z", "author_association": "OWNER", "body": "I've had this problem too - my solution was to not use Cloud Run for databases larger than about 2GB, but the way you describe it here makes me think there may be a workaround that could get it to work.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1015646369, "label": "Exceeding Cloud Run memory limits when deploying a 4.8G database"}, "performed_via_github_app": null}