issue_comments: 1255603780


html_url: https://github.com/simonw/datasette/issues/1415#issuecomment-1255603780
issue_url: https://api.github.com/repos/simonw/datasette/issues/1415
id: 1255603780
node_id: IC_kwDOBm6k_c5K1v5E
user: 17532695
created_at: 2022-09-22T22:06:10Z
updated_at: 2022-09-22T22:06:10Z
author_association: NONE

This would be great! I just went through the process of figuring out the minimum permissions for a service account to run `datasette publish cloudrun` for PUDL's Datasette deployment. These are the roles I gave the service account (disclaimer: I'm not sure these are the true minimum permissions); a sketch of granting them with gcloud follows the list:

  • Cloud Build Service Account, so the SA can submit the build to Cloud Build.
  • Cloud Run Admin on the Cloud Run datasette service, so the SA can deploy the built image.
  • Storage Admin on the bucket Cloud Build creates to store the build tarballs.
  • Viewer, so gcloud can stream the build logs that get written to the default logs bucket. More on this below!
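
For reference, here is a sketch of how those bindings might be granted with gcloud. The project ID, staging bucket, and service account address are placeholders, and as noted above this may not be the true minimum set of permissions:

```
# Placeholders: substitute your own project, staging bucket, and service account.
PROJECT=my-project
SA=datasette-publisher@my-project.iam.gserviceaccount.com
BUCKET=gs://my-project_cloudbuild

# Cloud Build Service Account role, so the SA can submit builds.
gcloud projects add-iam-policy-binding "$PROJECT" \
  --member="serviceAccount:$SA" --role="roles/cloudbuild.builds.builder"

# Cloud Run Admin, so the SA can deploy the built image
# (this could also be scoped to just the datasette service).
gcloud projects add-iam-policy-binding "$PROJECT" \
  --member="serviceAccount:$SA" --role="roles/run.admin"

# Storage Admin on the staging bucket Cloud Build uses for the build tarballs.
gcloud storage buckets add-iam-policy-binding "$BUCKET" \
  --member="serviceAccount:$SA" --role="roles/storage.admin"

# Project-level Viewer, so gcloud can stream logs from the default logs bucket.
gcloud projects add-iam-policy-binding "$PROJECT" \
  --member="serviceAccount:$SA" --role="roles/viewer"
```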

The Viewer Role is a Basic IAM role that Google does not recommend using:

> Caution: Basic roles include thousands of permissions across all Google Cloud services. In production environments, do not grant basic roles unless there is no alternative. Instead, grant the most limited predefined roles or custom roles that meet your needs.

If you don't grant the Viewer role, the `gcloud builds submit` command will successfully create a build but return exit code 1, preventing the script from reaching the Cloud Run deployment step:

```
ERROR: (gcloud.builds.submit) The build is running, and logs are being written to the default logs bucket.
This tool can only stream logs if you are Viewer/Owner of the project and, if applicable, allowed by your VPC-SC security policy.

The default logs bucket is always outside any VPC-SC security perimeter.
If you want your logs saved inside your VPC-SC perimeter, use your own bucket.
See https://cloud.google.com/build/docs/securing-builds/store-manage-build-logs.

... long stack trace ...

CalledProcessError: Command 'gcloud builds submit --tag gcr.io/catalyst-cooperative-pudl/datasette' returned non-zero exit status 1.
```

You can store Cloud Build logs in a user-created bucket, which only requires the Storage Admin role on that bucket. However, doing so means passing a build config file to `gcloud builds submit`, which isn't possible with the current options for `datasette publish cloudrun`.
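
As a sketch of that workaround (bucket and image names are placeholders), a minimal config that mirrors what `--tag` does today but adds a `logsBucket` entry might look like this:

```
# Write a minimal build config equivalent to --tag, pointing logs at a
# user-created bucket (gs://my-project-build-logs is a placeholder).
cat > cloudbuild.yml <<'EOF'
steps:
  - name: gcr.io/cloud-builders/docker
    args: ["build", "-t", "gcr.io/my-project/datasette", "."]
images:
  - gcr.io/my-project/datasette
logsBucket: gs://my-project-build-logs
EOF

# Submit with --config instead of --tag; logs now land in the user-created
# bucket, so the project-level Viewer role isn't needed to stream them.
gcloud builds submit --config cloudbuild.yml .
```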

I propose we add a CLI option to `datasette publish cloudrun` called `--build-config` that allows users to pass a config file specifying a user-created Cloud Build logs bucket.
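
Hypothetically, the publish command could then be invoked something like this (the `--build-config` flag doesn't exist yet; this is just the proposed usage, with a placeholder database name):

```
# Proposed, not yet implemented: point datasette publish cloudrun at a Cloud
# Build config so logs go to a user-created bucket rather than the default one.
datasette publish cloudrun pudl.db \
    --service=datasette \
    --build-config=cloudbuild.yml
```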

reactions: total_count 1 (rocket: 1)
issue: 959137143