{"html_url": "https://github.com/simonw/sqlite-utils/issues/500#issuecomment-1282800547", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/500", "id": 1282800547, "node_id": "IC_kwDOCGYnMM5Mdfuj", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-10-18T18:02:09Z", "updated_at": "2022-10-18T18:02:09Z", "author_association": "OWNER", "body": "Documentation: https://sqlite-utils.datasette.io/en/latest/reference.html#sqlite-utils-utils-flatten", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 1, \"eyes\": 0}", "issue": {"value": 1413610718, "label": "Turn --flatten into a documented utility function"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1415#issuecomment-1255603780", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1415", "id": 1255603780, "node_id": "IC_kwDOBm6k_c5K1v5E", "user": {"value": 17532695, "label": "bendnorman"}, "created_at": "2022-09-22T22:06:10Z", "updated_at": "2022-09-22T22:06:10Z", "author_association": "NONE", "body": "This would be great! I just went through the process of figuring out the minimum permissions for a service account to run `datasette publish cloudrun` for [PUDL](https://github.com/catalyst-cooperative/pudl)'s [datasette deployment](https://data.catalyst.coop/). These are the roles I gave the service account (disclaimer: I'm not sure these are the minimum permissions):\r\n\r\n- Cloud Build Service Account: The SA needs this role to publish the build on Cloud Build. \r\n- Cloud Run Admin for the Cloud Run datasette service so the SA can deploy the build.\r\n- I gave the SA the Storage Admin role on the bucket Cloud Build creates to store the build tar files. \r\n- The Viewer Role is [required for storing build logs in the default bucket](https://cloud.google.com/build/docs/running-builds/submit-build-via-cli-api#permissions). 
More on this below!\r\n\r\nThe Viewer Role is a Basic IAM role that [Google does not recommend using](https://cloud.google.com/build/docs/running-builds/submit-build-via-cli-api#permissions):\r\n\r\n> Caution: Basic roles include thousands of permissions across all Google Cloud services. In production environments, do not grant basic roles unless there is no alternative. Instead, grant the most limited [predefined roles](https://cloud.google.com/iam/docs/understanding-roles#predefined_roles) or [custom roles](https://cloud.google.com/iam/docs/understanding-custom-roles) that meet your needs.\r\n\r\nIf you don't grant the Viewer role, the `gcloud builds submit` command will successfully create a build but return exit code 1, preventing the script from getting to the Cloud Run step:\r\n\r\n```\r\nERROR: (gcloud.builds.submit)\r\nThe build is running, and logs are being written to the default logs bucket.\r\nThis tool can only stream logs if you are Viewer/Owner of the project and, if applicable, allowed by your VPC-SC security policy.\r\n\r\nThe default logs bucket is always outside any VPC-SC security perimeter.\r\nIf you want your logs saved inside your VPC-SC perimeter, use your own bucket.\r\nSee https://cloud.google.com/build/docs/securing-builds/store-manage-build-logs.\r\n```\r\nlong stack trace...\r\n```\r\nCalledProcessError: Command 'gcloud builds submit --tag gcr.io/catalyst-cooperative-pudl/datasette' returned non-zero exit status 1.\r\n```\r\n\r\nYou can store Cloud Build logs in a [user-created bucket](https://cloud.google.com/build/docs/securing-builds/store-manage-build-logs#store-custom-bucket), which only requires the Storage Admin role. However, you have to pass a config file to `gcloud builds submit`, which isn't possible with the current options for `datasette publish cloudrun`. 
\r\n\r\nI propose we add an additional CLI option to `datasette publish cloudrun` called `--build-config` that allows users to pass a [config file](https://cloud.google.com/build/docs/running-builds/submit-build-via-cli-api#running_builds) specifying a user-created Cloud Build log bucket. ", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 1, \"eyes\": 0}", "issue": {"value": 959137143, "label": "feature request: document minimum permissions for service account for cloudrun"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1814#issuecomment-1251677220", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1814", "id": 1251677220, "node_id": "IC_kwDOBm6k_c5KmxQk", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-09-19T23:34:30Z", "updated_at": "2022-09-19T23:34:30Z", "author_association": "OWNER", "body": "The `settings.json` file can only be used with settings that are set using `--setting name value` - the full list of those is here: https://docs.datasette.io/en/stable/settings.html\r\n\r\nThe `--static` option works differently. 
In configuration directory mode you can skip it entirely and instead have a `/static/` folder - so your directory structure would look like this:\r\n\r\n```\r\nbibliography/static/styles.css\r\n```\r\nAnd then when you run `datasette bibliography/` the following URL will work:\r\n\r\n http://127.0.0.1:8001/static/styles.css\r\n", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 1, \"eyes\": 0}", "issue": {"value": 1378495690, "label": "Static files not served"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/440#issuecomment-1155672675", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/440", "id": 1155672675, "node_id": "IC_kwDOCGYnMM5E4ipj", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-06-14T20:19:07Z", "updated_at": "2022-06-14T20:19:07Z", "author_association": "OWNER", "body": "Documentation: https://sqlite-utils.datasette.io/en/latest/python-api.html#reading-rows-from-a-file", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 1, \"eyes\": 0}", "issue": {"value": 1250629388, "label": "CSV files with too many values in a row cause errors"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/343#issuecomment-1055855845", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/343", "id": 1055855845, "node_id": "IC_kwDOCGYnMM4-7xTl", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-01T21:04:45Z", "updated_at": "2022-03-01T22:43:38Z", "author_association": "OWNER", "body": "I'm going to make that `_hash()` utility function a documented, non-underscore-prefixed function too - called `hash_record()`.", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 1, \"eyes\": 0}", 
"issue": {"value": 1063388037, "label": "Provide function to generate hash_id from specified columns"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1423#issuecomment-995023410", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1423", "id": 995023410, "node_id": "IC_kwDOBm6k_c47Ttoy", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-15T17:48:40Z", "updated_at": "2021-12-15T17:48:40Z", "author_association": "OWNER", "body": "You've caused me to rethink this feature - I no longer think there's value in only showing these numbers if `?_facet_size=max` as opposed to all of the time. New issue coming up.", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 1, \"eyes\": 0}", "issue": {"value": 962391325, "label": "Show count of facet values if ?_facet_size=max"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1252#issuecomment-808986495", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1252", "id": 808986495, "node_id": "MDEyOklzc3VlQ29tbWVudDgwODk4NjQ5NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-03-29T00:13:59Z", "updated_at": "2021-03-29T00:13:59Z", "author_association": "OWNER", "body": "Neat fix, thank you!", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 1, \"eyes\": 0}", "issue": {"value": 825217564, "label": "Add back styling to lists within table cells (fixes #1141)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1265#issuecomment-803130332", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1265", "id": 803130332, "node_id": "MDEyOklzc3VlQ29tbWVudDgwMzEzMDMzMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-03-19T21:03:09Z", "updated_at": 
"2021-03-19T21:03:09Z", "author_association": "OWNER", "body": "This is now available in `datasette-auth-passwords` 0.4! https://github.com/simonw/datasette-auth-passwords/releases/tag/0.4", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 1, \"eyes\": 0}", "issue": {"value": 836123030, "label": "Support for HTTP Basic Authentication"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1236#issuecomment-783674038", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1236", "id": 783674038, "node_id": "MDEyOklzc3VlQ29tbWVudDc4MzY3NDAzOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-22T21:05:21Z", "updated_at": "2021-02-22T21:05:21Z", "author_association": "OWNER", "body": "It's good on mobile - iOS at least. Going to close this; open new issues if anyone reports bugs.", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 1, \"eyes\": 0}", "issue": {"value": 812228314, "label": "Ability to increase size of the SQL editor window"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/283#issuecomment-781764561", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/283", "id": 781764561, "node_id": "MDEyOklzc3VlQ29tbWVudDc4MTc2NDU2MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-19T02:10:21Z", "updated_at": "2021-02-19T02:10:21Z", "author_association": "OWNER", "body": "This feature is now released! 
https://docs.datasette.io/en/stable/changelog.html#v0-55", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 1, \"eyes\": 0}", "issue": {"value": 325958506, "label": "Support cross-database joins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/283#issuecomment-781077127", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/283", "id": 781077127, "node_id": "MDEyOklzc3VlQ29tbWVudDc4MTA3NzEyNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-18T05:56:30Z", "updated_at": "2021-02-18T05:57:34Z", "author_association": "OWNER", "body": "I'm going to try prototyping the `--crossdb` option that causes `/_memory` to connect to all databases as a starting point and see how well that works.", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 1, \"eyes\": 0}", "issue": {"value": 325958506, "label": "Support cross-database joins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1091#issuecomment-758280611", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1091", "id": 758280611, "node_id": "MDEyOklzc3VlQ29tbWVudDc1ODI4MDYxMQ==", "user": {"value": 6739646, "label": "tballison"}, "created_at": "2021-01-11T23:06:10Z", "updated_at": "2021-01-11T23:06:10Z", "author_association": "NONE", "body": "+1\r\n\r\nYep! Fixes it. 
If I navigate to https://corpora.tika.apache.org/datasette, I get a 404 (database not found: datasette), but if I navigate to https://corpora.tika.apache.org/datasette/file_profiles/, everything WORKS!\r\n\r\nThank you!", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 1, \"eyes\": 0}", "issue": {"value": 742011049, "label": ".json and .csv exports fail to apply base_url"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/187#issuecomment-489353316", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/187", "id": 489353316, "node_id": "MDEyOklzc3VlQ29tbWVudDQ4OTM1MzMxNg==", "user": {"value": 46059, "label": "carsonyl"}, "created_at": "2019-05-04T18:36:36Z", "updated_at": "2019-05-04T18:36:36Z", "author_association": "NONE", "body": "Hi @simonw - I just hit this issue when trying out Datasette after your PyCon talk today. Datasette is pinned to Sanic 0.7.0, but it looks like 0.8.0 added the option to remove the uvloop dependency for Windows by having an environment variable `SANIC_NO_UVLOOP` at install time. Maybe that'll be sufficient before a port to Starlette?", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 1, \"eyes\": 0}", "issue": {"value": 309033998, "label": "Windows installation error"}, "performed_via_github_app": null}