html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,issue,performed_via_github_app
https://github.com/simonw/datasette/issues/694#issuecomment-648296323,https://api.github.com/repos/simonw/datasette/issues/694,648296323,MDEyOklzc3VlQ29tbWVudDY0ODI5NjMyMw==,3903726,2020-06-23T17:10:51Z,2020-06-23T17:10:51Z,NONE,"@simonw Did you find the reason? I had a similar situation and checked it in countless ways. I am sure the app doesn't consume that much memory. I was trying the app with `docker run --rm -it -p 80:80 -m 128M foo` and watching it with `docker stats`. I even limited the memory with `CMD [""java"", ""-Xms60M"", ""-Xmx60M"", ""-jar"", ""api.jar""]`, and checked the memory usage in the app's code and via bash commands. The app definitely doesn't use this much memory, and it doesn't write files either. The only solution is to raise the memory to 512M. There is definitely something wrong with `cloud run`. I even built a dedicated app to test this. It looks like once I cross some very small threshold of code / memory / app size, seemingly at random, the memory needed grows by hundreds. Nothing makes sense here, especially since it works everywhere except Cloud Run. Please let me know if you discover anything more.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",576582604,
https://github.com/simonw/datasette/issues/694#issuecomment-596266190,https://api.github.com/repos/simonw/datasette/issues/694,596266190,MDEyOklzc3VlQ29tbWVudDU5NjI2NjE5MA==,9599,2020-03-08T23:32:58Z,2020-03-08T23:32:58Z,OWNER,Shipped in [Datasette 0.38](https://datasette.readthedocs.io/en/latest/changelog.html#v0-38).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",576582604,
https://github.com/simonw/datasette/issues/694#issuecomment-595491182,https://api.github.com/repos/simonw/datasette/issues/694,595491182,MDEyOklzc3VlQ29tbWVudDU5NTQ5MTE4Mg==,9599,2020-03-05T23:07:33Z,2020-03-05T23:45:38Z,OWNER,"So two things I need to do for this:
* Add a `--memory` option to `datasette publish cloudrun`
* Maybe capture this error and output a helpful suggestion that you increase the memory with that option? Not sure how feasible that is.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",576582604,
https://github.com/simonw/datasette/issues/694#issuecomment-595498926,https://api.github.com/repos/simonw/datasette/issues/694,595498926,MDEyOklzc3VlQ29tbWVudDU5NTQ5ODkyNg==,9599,2020-03-05T23:35:32Z,2020-03-05T23:35:32Z,OWNER,"Tested that with:
```
datasette publish cloudrun fixtures.db --memory 512Mi --service fixtures-memory-512mi
```
Here's the result in the Google Cloud web console:

","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",576582604,
https://github.com/simonw/datasette/issues/694#issuecomment-595492478,https://api.github.com/repos/simonw/datasette/issues/694,595492478,MDEyOklzc3VlQ29tbWVudDU5NTQ5MjQ3OA==,9599,2020-03-05T23:12:25Z,2020-03-05T23:12:25Z,OWNER,"I wonder if there's some weird reason that we churn through too much RAM on initial datasette startup here? I wouldn't expect startup to use a huge spike of RAM.
Maybe need to profile a bit.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",576582604,
https://github.com/simonw/datasette/issues/694#issuecomment-595490889,https://api.github.com/repos/simonw/datasette/issues/694,595490889,MDEyOklzc3VlQ29tbWVudDU5NTQ5MDg4OQ==,9599,2020-03-05T23:06:30Z,2020-03-05T23:06:30Z,OWNER,"This fixed it (I tried 1Gi first but that gave the same error):
```
gcloud run deploy --allow-unauthenticated --platform=managed \
  --image gcr.io/datasette-222320/datasette non-profit-ethics --memory=2Gi
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",576582604,
https://github.com/simonw/datasette/issues/694#issuecomment-595489514,https://api.github.com/repos/simonw/datasette/issues/694,595489514,MDEyOklzc3VlQ29tbWVudDU5NTQ4OTUxNA==,9599,2020-03-05T23:01:35Z,2020-03-05T23:01:35Z,OWNER,"Aha! The logs said ""Memory limit of 244M exceeded with 247M used. Consider increasing the memory limit, see https://cloud.google.com/run/docs/configuring/memory-limits""","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",576582604,
https://github.com/simonw/datasette/issues/694#issuecomment-595489222,https://api.github.com/repos/simonw/datasette/issues/694,595489222,MDEyOklzc3VlQ29tbWVudDU5NTQ4OTIyMg==,9599,2020-03-05T23:00:33Z,2020-03-05T23:00:33Z,OWNER,"The initial `datasette publish cloudrun` failed; now I can replicate that error by running:
```
gcloud run deploy --allow-unauthenticated --platform=managed \
  --image gcr.io/datasette-222320/datasette non-profit-ethics
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",576582604,