github
html_url | issue_url | id | node_id | user | created_at | updated_at | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
https://github.com/simonw/sqlite-utils/issues/235#issuecomment-1304539296 | https://api.github.com/repos/simonw/sqlite-utils/issues/235 | 1304539296 | IC_kwDOCGYnMM5NwbCg | 559711 | 2022-11-05T12:40:12Z | 2022-11-05T12:40:12Z | NONE | I had the problem this morning when running: `Python==3.9.6 sqlite3.sqlite_version==3.37.0 sqlite-utils==3.30`. I upgraded to `Python==3.10.8 sqlite3.sqlite_version==3.37.2 sqlite-utils==3.30` and the error did not appear anymore. Hope this helps. Ryan | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
810618495 | |
https://github.com/simonw/sqlite-utils/issues/511#issuecomment-1304320521 | https://api.github.com/repos/simonw/sqlite-utils/issues/511 | 1304320521 | IC_kwDOCGYnMM5NvloJ | 7908073 | 2022-11-04T22:54:09Z | 2022-11-04T22:59:54Z | CONTRIBUTOR | I ran `PRAGMA integrity_check` and it returned `ok`, but then I tried restoring from a backup and I didn't get this `IntegrityError: constraint failed` error. So I think it was just something wrong with my database. If it happens again I will first try to reindex and see if that fixes the issue. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1436539554 | |
https://github.com/simonw/sqlite-utils/issues/511#issuecomment-1304078945 | https://api.github.com/repos/simonw/sqlite-utils/issues/511 | 1304078945 | IC_kwDOCGYnMM5Nuqph | 7908073 | 2022-11-04T19:38:36Z | 2022-11-04T20:13:17Z | CONTRIBUTOR | Even more bizarre, the source db only has one record and the target table has no conflicting record: ``` 875 0.3s lb:/ (main|✚2) [0|0]🌺 sqlite-utils tube_71.db 'select * from media where path = "https://archive.org/details/088ghostofachanceroygetssackedrevengeofthelivinglunchdvdripxvidphz"' | jq [ { "size": null, "time_created": null, "play_count": 1, "language": null, "view_count": null, "width": null, "height": null, "fps": null, "average_rating": null, "live_status": null, "age_limit": null, "uploader": null, "time_played": 0, "path": "https://archive.org/details/088ghostofachanceroygetssackedrevengeofthelivinglunchdvdripxvidphz", "id": "088ghostofachanceroygetssackedrevengeofthelivinglunchdvdripxvidphz/074 - Home Away from Home, Rainy Day Robot, Odie the Amazing DVDRip XviD [PhZ].mkv", "ie_key": "ArchiveOrg", "playlist_path": "https://archive.org/details/088ghostofachanceroygetssackedrevengeofthelivinglunchdvdripxvidphz", "duration": 1424.05, "tags": null, "title": "074 - Home Away from Home, Rainy Day Robot, Odie the Amazing DVDRip XviD [PhZ].mkv" } ] 875 0.3s lb:/ (main|✚2) [0|0]🥧 sqlite-utils video.db 'select * from media where path = "https://archive.org/details/088ghostofachanceroygetssackedrevengeofthelivinglunchdvdripxvidphz"' | jq [] ``` I've been able to use this code successfully several times before so not sure what's causing the issue. I guess the way that I'm handling multiple databases is an issue, though it hasn't ever inserted into the source db, not sure what's different. The only reasonable explanation is that it is trying to insert into the source db from the source db for some reason? 
Or maybe sqlite3 is checking the source db for primary key violations because the table name is the same. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1436539554 | |
https://github.com/simonw/datasette/issues/1882#issuecomment-1302818784 | https://api.github.com/repos/simonw/datasette/issues/1882 | 1302818784 | IC_kwDOBm6k_c5Np2_g | 9599 | 2022-11-04T00:25:18Z | 2022-11-04T16:12:39Z | OWNER | On that basis I think the core API design should change to this: ``` POST /db/-/create Authorization: Bearer xxx Content-Type: application/json { "name": "my new table", "columns": [ { "name": "id", "type": "integer" }, { "name": "title", "type": "text" } ], "pk": "id" } ``` This leaves room for a `"rows": []` key at the root too. Having that as a child of `"table"` felt unintuitive to me, and I didn't like the way this looked either: ```json { "table": { "name": "my_new_table" }, "rows": [ {"id": 1, "title": "Title"} ] } ``` Weird to have the table `name` nested inside `table` when `rows` wasn't. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1435294468 | |
https://github.com/simonw/sqlite-utils/issues/50#issuecomment-1303660293 | https://api.github.com/repos/simonw/sqlite-utils/issues/50 | 1303660293 | IC_kwDOCGYnMM5NtEcF | 7908073 | 2022-11-04T14:38:36Z | 2022-11-04T14:38:36Z | CONTRIBUTOR | Where did you see the limit as 999? I believe the limit has been 32766 for quite some time. If you could detect which one applies, this could speed up batch inserts of some types of data significantly. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
473083260 | |
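The variable-count limit discussed in the comment above can be checked empirically rather than assumed. Here is a minimal stdlib sketch (the function name is made up here; this is not part of sqlite-utils) that bisects for the largest number of `?` placeholders the local SQLite build accepts, i.e. its compiled `SQLITE_MAX_VARIABLE_NUMBER` (up to the probe's own ceiling):

```python
import sqlite3


def max_sqlite_variables(high: int = 100_000) -> int:
    """Return the largest number of ? placeholders (up to `high`) that a
    single statement accepts on this SQLite build."""
    conn = sqlite3.connect(":memory:")
    low = 1  # one placeholder always works
    while low < high:
        mid = (low + high + 1) // 2
        try:
            conn.execute(
                "select 1 where 1 in (%s)" % ",".join("?" * mid),
                [0] * mid,
            )
            low = mid  # mid placeholders worked - try more
        except sqlite3.OperationalError:
            high = mid - 1  # "too many SQL variables" - try fewer
    conn.close()
    return low


# Commonly 999 on older builds and 32766 on newer ones,
# depending on how SQLite was compiled
print(max_sqlite_variables())
```

A batch-insert routine could call this once at startup and size its chunks accordingly, instead of hard-coding either limit.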
https://github.com/simonw/datasette/issues/1217#issuecomment-1303301786 | https://api.github.com/repos/simonw/datasette/issues/1217 | 1303301786 | IC_kwDOBm6k_c5Nrs6a | 31312775 | 2022-11-04T11:37:52Z | 2022-11-04T11:37:52Z | NONE | All seems to work well, but there are some glitches to do with proxies, see #1883 . Excited to use this :) | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
802513359 | |
https://github.com/simonw/datasette/issues/1217#issuecomment-1303299509 | https://api.github.com/repos/simonw/datasette/issues/1217 | 1303299509 | IC_kwDOBm6k_c5NrsW1 | 31312775 | 2022-11-04T11:35:13Z | 2022-11-04T11:35:13Z | NONE | The following worked for deployment to RStudio / Posit Connect. An `app.py` along the lines of: ```python from pathlib import Path from datasette.app import Datasette example_db = Path(__file__).parent / "data" / "example.db" # use connect 'Content URL' setting here to set app to /datasette/ ds = Datasette(files=[example_db], settings={"base_url": "/datasette/"}) ds._startup_invoked = True ds_app = ds.app() ``` Then to deploy, from within a virtualenv with `rsconnect-python`: ```sh rsconnect write-manifest fastapi -p $VIRTUAL_ENV/bin/python -e app:ds_app -o . rsconnect deploy manifest manifest.json -n <name of connect server> -t "Example Datasette" ``` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
802513359 | |
https://github.com/simonw/datasette/issues/1882#issuecomment-1302818153 | https://api.github.com/repos/simonw/datasette/issues/1882 | 1302818153 | IC_kwDOBm6k_c5Np21p | 9599 | 2022-11-04T00:23:58Z | 2022-11-04T00:23:58Z | OWNER | I made a decision here that this endpoint should also accept an optional `"rows": [...]` list which is used to automatically create the table using a schema derived from those example rows (which then get inserted): - https://github.com/simonw/datasette/issues/1862#issuecomment-1302817807 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1435294468 | |
https://github.com/simonw/datasette/issues/1862#issuecomment-1302817807 | https://api.github.com/repos/simonw/datasette/issues/1862 | 1302817807 | IC_kwDOBm6k_c5Np2wP | 9599 | 2022-11-04T00:23:13Z | 2022-11-04T00:23:13Z | OWNER | I don't like this on `/db/table/-/insert` - I think it makes more sense to optionally pass a `"rows"` key to the `/db/-/create` endpoint instead. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1425011030 | |
https://github.com/simonw/datasette/issues/1862#issuecomment-1302817500 | https://api.github.com/repos/simonw/datasette/issues/1862 | 1302817500 | IC_kwDOBm6k_c5Np2rc | 9599 | 2022-11-04T00:22:31Z | 2022-11-04T00:22:31Z | OWNER | Maybe this is a feature added to the existing `/db/table/-/insert` endpoint? Bit weird that you can call that endpoint for a table that doesn't exist yet, but it fits the `sqlite-utils` way of creating tables, which I've found very pleasant over the past few years. So perhaps the API looks like this: ``` POST /<database>/<table>/-/insert Content-Type: application/json Authorization: Bearer dstok_<rest-of-token> { "create_table": true, "rows": [ { "column1": "value1", "column2": "value2" }, { "column1": "value3", "column2": "value4" } ] } ``` The `create_table` option will cause the table to be created if it doesn't already exist. That means I probably also need a `"pk": "..."` key for setting a primary key if the table is being created ... and maybe other options that I invent for this other feature too? - #1882 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1425011030 | |
https://github.com/simonw/datasette/issues/1871#issuecomment-1302815105 | https://api.github.com/repos/simonw/datasette/issues/1871 | 1302815105 | IC_kwDOBm6k_c5Np2GB | 9599 | 2022-11-04T00:17:23Z | 2022-11-04T00:17:23Z | OWNER | I'll probably enhance it a bit more though: I want to provide a UI that lists all the tables you can explore and lets you click to pre-fill the forms with them. Though at that point what should I do about the other endpoints? Probably list those too. Gets a bit complex, especially with the row-level update and delete endpoints. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1427293909 | |
https://github.com/simonw/datasette/issues/1871#issuecomment-1302814693 | https://api.github.com/repos/simonw/datasette/issues/1871 | 1302814693 | IC_kwDOBm6k_c5Np1_l | 9599 | 2022-11-04T00:16:36Z | 2022-11-04T00:16:36Z | OWNER | I can close this issue once I fix it so it no longer hard-codes a potentially invalid example endpoint: https://github.com/simonw/datasette/blob/bcc781f4c50a8870e3389c4e60acb625c34b0317/datasette/templates/api_explorer.html#L24-L26 https://github.com/simonw/datasette/blob/bcc781f4c50a8870e3389c4e60acb625c34b0317/datasette/templates/api_explorer.html#L34-L35 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1427293909 | |
https://github.com/simonw/datasette/issues/1881#issuecomment-1302813449 | https://api.github.com/repos/simonw/datasette/issues/1881 | 1302813449 | IC_kwDOBm6k_c5Np1sJ | 9599 | 2022-11-04T00:14:07Z | 2022-11-04T00:14:07Z | OWNER | Tool is now live here: https://latest-1-0-dev.datasette.io/-/permissions Needs root perms, so access this first: https://latest-1-0-dev.datasette.io/login-as-root | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1434094365 | |
https://github.com/simonw/datasette/issues/1881#issuecomment-1302812918 | https://api.github.com/repos/simonw/datasette/issues/1881 | 1302812918 | IC_kwDOBm6k_c5Np1j2 | 9599 | 2022-11-04T00:13:05Z | 2022-11-04T00:13:05Z | OWNER | Has tests now. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1434094365 | |
https://github.com/simonw/datasette/issues/1863#issuecomment-1302790013 | https://api.github.com/repos/simonw/datasette/issues/1863 | 1302790013 | IC_kwDOBm6k_c5Npv99 | 9599 | 2022-11-03T23:32:30Z | 2022-11-03T23:32:30Z | OWNER | I'm not going to allow updates to primary keys. If you need to do that, you can instead delete the record and then insert a new one with the new primary keys you wanted - or maybe use a custom SQL query. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1425029242 | |
https://github.com/simonw/datasette/issues/1851#issuecomment-1294224185 | https://api.github.com/repos/simonw/datasette/issues/1851 | 1294224185 | IC_kwDOBm6k_c5NJEs5 | 9599 | 2022-10-27T23:18:24Z | 2022-11-03T23:26:05Z | OWNER | So new API design is: ``` POST /db/table/-/insert Authorization: Bearer xxx Content-Type: application/json { "row": { "id": 1, "name": "New record" } } ``` Returns: ``` 201 Created { "row": [{ "id": 1, "name": "New record" }] } ``` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1421544654 | |
https://github.com/simonw/datasette/issues/1863#issuecomment-1302785086 | https://api.github.com/repos/simonw/datasette/issues/1863 | 1302785086 | IC_kwDOBm6k_c5Npuw- | 9599 | 2022-11-03T23:24:33Z | 2022-11-03T23:24:56Z | OWNER | Thinking more about validation: I'm considering if this should validate that columns which are defined as SQLite foreign keys are being updated to values that exist in those other tables. I like the sound of this. It seems like a sensible default behaviour for Datasette. And it fits with the fact that Datasette treats foreign keys specially elsewhere in the interface. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1425029242 | |
https://github.com/simonw/datasette/issues/1863#issuecomment-1302760549 | https://api.github.com/repos/simonw/datasette/issues/1863 | 1302760549 | IC_kwDOBm6k_c5Npoxl | 9599 | 2022-11-03T22:43:04Z | 2022-11-03T23:21:31Z | OWNER | The `id=(int, ...)` thing is weird, but is apparently Pydantic syntax for a required field? https://cs.github.com/starlite-api/starlite/blob/28ddc847c4cb072f0d5d21a9ecd5259711f12ec9/docs/usage/11-data-transfer-objects.md#L161 confirms: > 1. For required fields use a tuple of type + ellipsis, for example `(str, ...)`. > 2. For optional fields use a tuple of type + `None`, for example `(str, None)` > 3. To set a default value use a tuple of type + default value, for example `(str, "Hello World")` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1425029242 | |
https://github.com/simonw/datasette/issues/1863#issuecomment-1302760382 | https://api.github.com/repos/simonw/datasette/issues/1863 | 1302760382 | IC_kwDOBm6k_c5Npou- | 9599 | 2022-11-03T22:42:47Z | 2022-11-03T22:42:47Z | OWNER | ```python print(create_model('document', id=(int, ...), title=(str, None)).schema_json(indent=2)) ``` ```json { "title": "document", "type": "object", "properties": { "id": { "title": "Id", "type": "integer" }, "title": { "title": "Title", "type": "string" } }, "required": [ "id" ] } ``` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1425029242 | |
https://github.com/simonw/datasette/issues/1863#issuecomment-1302759174 | https://api.github.com/repos/simonw/datasette/issues/1863 | 1302759174 | IC_kwDOBm6k_c5NpocG | 9599 | 2022-11-03T22:40:47Z | 2022-11-03T22:40:47Z | OWNER | I'm considering Pydantic for this, see: - https://github.com/simonw/datasette/issues/1882#issuecomment-1302716350 In particular the `create_model()` method: https://pydantic-docs.helpmanual.io/usage/models/#dynamic-model-creation This would give me good validation. It would also, weirdly, give me the ability to output JSON schema. Maybe I could have this as the JSON schema for a row? `/db/table/-/json-schema` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1425029242 | |
https://github.com/simonw/datasette/issues/1882#issuecomment-1302716350 | https://api.github.com/repos/simonw/datasette/issues/1882 | 1302716350 | IC_kwDOBm6k_c5Npd-- | 9599 | 2022-11-03T21:51:14Z | 2022-11-03T22:35:54Z | OWNER | Validating this JSON object is getting a tiny bit complex. I'm tempted to adopt https://pydantic-docs.helpmanual.io/ at this point. The `create_model` example on https://stackoverflow.com/questions/66168517/generate-dynamic-model-using-pydantic/66168682#66168682 is particularly relevant, especially when I work on this issue: - #1863 ```python from pydantic import create_model d = {"strategy": {"name": "test_strat2", "periods": 10}} Strategy = create_model("Strategy", **d["strategy"]) print(Strategy.schema_json(indent=2)) ``` `create_model()`: https://pydantic-docs.helpmanual.io/usage/models/#dynamic-model-creation | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1435294468 | |
https://github.com/simonw/datasette/issues/1882#issuecomment-1302721916 | https://api.github.com/repos/simonw/datasette/issues/1882 | 1302721916 | IC_kwDOBm6k_c5NpfV8 | 9599 | 2022-11-03T21:58:50Z | 2022-11-03T21:59:17Z | OWNER | Mocked up a quick HTML+JavaScript form for creating that JSON structure using some iteration against Copilot prompts: ```html <pre> /* JSON format: { "table": { "name": "my new table", "columns": [ { "name": "id", "type": "integer" }, { "name": "title", "type": "text" } ] "pk": "id" } } HTML form with Javascript for creating this JSON: */</pre> <form id="create-table-form"> <label for="table-name">Table name</label> <input type="text" id="table-name" name="table-name" required><br> <label for="table-pk">Primary key</label> <input type="text" id="table-pk" name="table-pk" required><br> <label for="column-name">Column name</label> <input type="text" id="column-name" name="column-name" required> <label for="column-type">Column type</label> <input type="text" id="column-type" name="column-type" required> <button type="button" id="add-column">Add column</button> <p>Current columns:</p> <ul id="columns"></ul> <button type="button" id="create-table">Create table</button> </form> <script> var form = document.getElementById('create-table-form'); var tableName = document.getElementById('table-name'); var tablePk = document.getElementById('table-pk'); var columnName = document.getElementById('column-name'); var columnType = document.getElementById('column-type'); var addColumn = document.getElementById('add-column'); var createTable = document.getElementById('create-table'); var columns = []; addColumn.addEventListener('click', () => { columns.push({ name: columnName.value, type: columnType.value }); var li = document.createElement('li'); li.textContent = columnName.value + ' (' + columnType.value + ')'; // Add a delete button to each column var deleteButton = document.createElement('button'); deleteButton.textContent = 'Delete'; 
deleteButton.addEventListener('click', () => { columns.splice(colu… | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1435294468 | |
https://github.com/simonw/datasette/issues/1882#issuecomment-1302715662 | https://api.github.com/repos/simonw/datasette/issues/1882 | 1302715662 | IC_kwDOBm6k_c5Npd0O | 9599 | 2022-11-03T21:50:27Z | 2022-11-03T21:50:27Z | OWNER | API design for this: ``` POST /db/-/create Authorization: Bearer xxx Content-Type: application/json { "table": { "name": "my new table", "columns": [ { "name": "id", "type": "integer" }, { "name": "title", "type": "text" } ], "pk": "id" } } ``` Supported column types are: - `integer` - `text` - `float` (even though SQLite calls it a "real") - `blob` This matches my design for `sqlite-utils`: https://sqlite-utils.datasette.io/en/stable/cli.html#cli-create-table | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1435294468 | |
https://github.com/simonw/datasette/issues/1843#issuecomment-1302679026 | https://api.github.com/repos/simonw/datasette/issues/1843 | 1302679026 | IC_kwDOBm6k_c5NpU3y | 9599 | 2022-11-03T21:22:42Z | 2022-11-03T21:22:42Z | OWNER | Docs for the new `db.close()` method: https://docs.datasette.io/en/latest/internals.html#db-close | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1408757705 | |
https://github.com/simonw/datasette/issues/1843#issuecomment-1302678384 | https://api.github.com/repos/simonw/datasette/issues/1843 | 1302678384 | IC_kwDOBm6k_c5NpUtw | 9599 | 2022-11-03T21:21:59Z | 2022-11-03T21:21:59Z | OWNER | I added extra debug info to `/-/threads` to see this for myself: ```diff diff --git a/datasette/app.py b/datasette/app.py index 02bd38f1..16579e28 100644 --- a/datasette/app.py +++ b/datasette/app.py @@ -969,6 +969,13 @@ class Datasette: "threads": [ {"name": t.name, "ident": t.ident, "daemon": t.daemon} for t in threads ], + "file_connections": { + db.name: [ + [dict(r) for r in conn.execute("pragma database_list").fetchall()] + for conn in db._all_file_connections + ] + for db in self.databases.values() + }, } # Only available in Python 3.7+ if hasattr(asyncio, "all_tasks"): ``` Output after hitting refresh on a few `/fixtures` tables to ensure more threads started: ``` "file_connections": { "_internal": [], "fixtures": [ [ { "seq": 0, "name": "main", "file": "/Users/simon/Dropbox/Development/datasette/fixtures.db" } ], [ { "seq": 0, "name": "main", "file": "/Users/simon/Dropbox/Development/datasette/fixtures.db" } ], [ { "seq": 0, "name": "main", "file": "/Users/simon/Dropbox/Development/datasette/fixtures.db" } ] ] }, ``` I decided not to ship this feature though as it leaks the names of internal database files. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1408757705 | |
https://github.com/simonw/datasette/issues/1843#issuecomment-1302634332 | https://api.github.com/repos/simonw/datasette/issues/1843 | 1302634332 | IC_kwDOBm6k_c5NpJ9c | 9599 | 2022-11-03T20:34:56Z | 2022-11-03T20:34:56Z | OWNER | Confirmed that calling `conn.close()` on each SQLite file-based connection is the way to fix this problem. I'm adding a `db.close()` method (sync, not async - I tried async first but it was really hard to cause every thread in the pool to close its threadlocal database connection) which loops through all known open file-based connections and closes them. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1408757705 | |
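The close-all-connections approach described in the comments above can be sketched in isolation. This is a simplified stand-in, not Datasette's actual `Database` class - it only shows the pattern of tracking every file-based connection handed out so that a synchronous `close()` can release all of their file descriptors:

```python
import sqlite3


class Database:
    """Toy sketch: hand out connections and remember them for later cleanup."""

    def __init__(self, path):
        self.path = path
        self._all_file_connections = []

    def connect(self):
        # Each caller (e.g. each thread in a pool) gets its own connection,
        # but we keep a reference so close() can reach it later
        conn = sqlite3.connect(self.path)
        self._all_file_connections.append(conn)
        return conn

    def close(self):
        # Synchronous on purpose: loop over every known file-based
        # connection and close it, releasing the underlying file handles
        for conn in self._all_file_connections:
            conn.close()
        self._all_file_connections.clear()
```

After `db.close()`, any further use of a previously handed-out connection raises `sqlite3.ProgrammingError`, which is the desired behavior for a database being shut down.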
https://github.com/simonw/datasette/issues/1843#issuecomment-1302574330 | https://api.github.com/repos/simonw/datasette/issues/1843 | 1302574330 | IC_kwDOBm6k_c5No7T6 | 9599 | 2022-11-03T19:30:22Z | 2022-11-03T19:30:22Z | OWNER | This is affecting me a lot at the moment, on my laptop (runs fine in CI). Here's a change to `conftest.py` which highlights the problem - it causes a failure the moment there are more than 5 open files according to `psutil`: ```diff diff --git a/tests/conftest.py b/tests/conftest.py index f4638a14..21d433c1 100644 --- a/tests/conftest.py +++ b/tests/conftest.py @@ -1,6 +1,7 @@ import httpx import os import pathlib +import psutil import pytest import re import subprocess @@ -192,3 +193,8 @@ def ds_unix_domain_socket_server(tmp_path_factory): yield ds_proc, uds # Shut it down at the end of the pytest session ds_proc.terminate() + + +def pytest_runtest_teardown(item: pytest.Item) -> None: + open_files = psutil.Process().open_files() + assert len(open_files) < 5 ``` The first error I get from this with `pytest --pdb -x` is here: ``` tests/test_api.py ............E >>>>> traceback >>>>> item = <Function test_sql_time_limit> def pytest_runtest_teardown(item: pytest.Item) -> None: open_files = psutil.Process().open_files() > assert len(open_files) < 5 E AssertionError: assert 5 < 5 E + where 5 = len([popenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmpfglrt4p2/fixtures.db', fd=14), popenfile(... 
fd=19), popenfile(path='/private/var/folders/wr/hn3206rs1yzgq3r49bz8nvnh0000gn/T/tmphdi5b250/fixtures.dot.db', fd=20)]) /Users/simon/Dropbox/Development/datasette/tests/conftest.py:200: AssertionError >>>>> entering PDB >>>>> >>>>> PDB post_mortem (IO-capturing turned off) >>>>> > /Users/simon/Dropbox/Development/datasette/tests/conftest.py(200)pytest_runtest_teardown() -> assert len(open_files) < 5 ``` That's this test: https://github.com/simonw/datasette/blob/2ec5583629005b32cb0877786f9681c5d43ca33f/tests/test_api.py#L656-L673 Which uses this fixture: https://github.com/simonw/datasette/blob/2ec5583629005b32cb0877786f9681c5d43ca33f/tests/fixtures.py#L228-L231 Which calls this func… | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1408757705 | |
https://github.com/simonw/datasette/issues/1855#issuecomment-1301646670 | https://api.github.com/repos/simonw/datasette/issues/1855 | 1301646670 | IC_kwDOBm6k_c5NlY1O | 9599 | 2022-11-03T05:11:26Z | 2022-11-03T05:11:26Z | OWNER | That still needs comprehensive tests before I land it. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1423336089 | |
https://github.com/simonw/datasette/issues/1855#issuecomment-1301646493 | https://api.github.com/repos/simonw/datasette/issues/1855 | 1301646493 | IC_kwDOBm6k_c5NlYyd | 9599 | 2022-11-03T05:11:06Z | 2022-11-03T05:11:06Z | OWNER | Built a prototype of the above: ```diff diff --git a/datasette/default_permissions.py b/datasette/default_permissions.py index 32b0c758..f68aa38f 100644 --- a/datasette/default_permissions.py +++ b/datasette/default_permissions.py @@ -6,8 +6,8 @@ import json import time -@hookimpl(tryfirst=True) -def permission_allowed(datasette, actor, action, resource): +@hookimpl(tryfirst=True, specname="permission_allowed") +def permission_allowed_default(datasette, actor, action, resource): async def inner(): if action in ( "permissions-debug", @@ -57,6 +57,44 @@ def permission_allowed(datasette, actor, action, resource): return inner +@hookimpl(specname="permission_allowed") +def permission_allowed_actor_restrictions(actor, action, resource): + if actor is None: + return None + _r = actor.get("_r") + if not _r: + # No restrictions, so we have no opinion + return None + action_initials = "".join([word[0] for word in action.split("-")]) + # If _r is defined then we use those to further restrict the actor + # Crucially, we only use this to say NO (return False) - we never + # use it to return YES (True) because that might over-ride other + # restrictions placed on this actor + all_allowed = _r.get("a") + if all_allowed is not None: + assert isinstance(all_allowed, list) + if action_initials in all_allowed: + return None + # How about for the current database? + if action in ("view-database", "view-database-download", "execute-sql"): + database_allowed = _r.get("d", {}).get(resource) + if database_allowed is not None: + assert isinstance(database_allowed, list) + if action_initials in database_allowed: + return None + # Or the current table? 
That's any time the resource is (database, table) + if not isinstance(resource, str) and len(resource) == 2: + database, table = resource + table_al… | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1423336089 | |
https://github.com/simonw/datasette/issues/1881#issuecomment-1301639741 | https://api.github.com/repos/simonw/datasette/issues/1881 | 1301639741 | IC_kwDOBm6k_c5NlXI9 | 9599 | 2022-11-03T04:58:21Z | 2022-11-03T04:58:21Z | OWNER | The whole `database_name` or `(database_name, table_name)` tuple for resource is a bit of a code smell. Maybe this is a chance to tidy that up too? | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1434094365 | |
https://github.com/simonw/datasette/issues/1881#issuecomment-1301639370 | https://api.github.com/repos/simonw/datasette/issues/1881 | 1301639370 | IC_kwDOBm6k_c5NlXDK | 9599 | 2022-11-03T04:57:21Z | 2022-11-03T04:57:21Z | OWNER | The plugin hook would be called `register_permissions()`, for consistency with `register_routes()` and `register_commands()`. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1434094365 | |
https://github.com/simonw/datasette/issues/1881#issuecomment-1301638918 | https://api.github.com/repos/simonw/datasette/issues/1881 | 1301638918 | IC_kwDOBm6k_c5NlW8G | 9599 | 2022-11-03T04:56:06Z | 2022-11-03T04:56:06Z | OWNER | I've also introduced a new concept of a permission abbreviation, which, like the permission name, needs to be globally unique. That's a problem for plugins - they might just be able to guarantee that their permission long-form name is unique among other plugins (through sensible naming conventions), but the thing where they declare an initial-letters-only abbreviation is far more risky. I think abbreviations are optional - they are provided for core permissions but plugins are advised not to use them. Also Datasette could check that the installed plugins do not provide conflicting permissions on startup and refuse to start if they do. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1434094365 | |
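The startup conflict check suggested in the comment above could look roughly like this. It is a hypothetical helper, not Datasette's real registry - the `(name, abbreviation)` pair shape is an assumption, with `None` standing in for a plugin that declares no abbreviation:

```python
from collections import Counter


def check_permission_conflicts(permissions):
    """Refuse startup if two registered permissions share a long-form name
    or an abbreviation.

    permissions: list of (name, abbreviation) pairs gathered from core
    plus every plugin (hypothetical shape).
    """
    names = [name for name, _abbr in permissions]
    # Abbreviations are optional, so skip entries that declare none
    abbrs = [abbr for _name, abbr in permissions if abbr is not None]
    for label, values in (("name", names), ("abbreviation", abbrs)):
        dupes = sorted(v for v, n in Counter(values).items() if n > 1)
        if dupes:
            raise RuntimeError(
                "Conflicting permission %ss: %s" % (label, ", ".join(dupes))
            )
```

Running this once at startup, over core permissions plus everything the plugins register, turns a silent abbreviation collision into a hard refusal to start.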
https://github.com/simonw/datasette/issues/1881#issuecomment-1301638156 | https://api.github.com/repos/simonw/datasette/issues/1881 | 1301638156 | IC_kwDOBm6k_c5NlWwM | 9599 | 2022-11-03T04:54:00Z | 2022-11-03T04:54:00Z | OWNER | If I have the permissions defined like this: ```python PERMISSIONS = ( Permission("view-instance", "vi", False, False, True), Permission("view-database", "vd", True, False, True), Permission("view-database-download", "vdd", True, False, True), Permission("view-table", "vt", True, True, True), Permission("view-query", "vq", True, True, True), Permission("insert-row", "ir", True, True, False), Permission("delete-row", "dr", True, True, False), Permission("drop-table", "dt", True, True, False), Permission("execute-sql", "es", True, False, True), Permission("permissions-debug", "pd", False, False, False), Permission("debug-menu", "dm", False, False, False), ) ``` Instead of just calling them by their undeclared names in places like this: ```python await self.ds.permission_allowed( request.actor, "execute-sql", database, default=True ) ``` On the one hand I can ditch that confusing `default=True` option - whether a permission is on by default becomes a characteristic of that `Permission()` itself, which feels much neater. On the other hand though, plugins that introduce their own permissions - like https://datasette.io/plugins/datasette-edit-schema - will need a way to register those permissions with Datasette core. Probably another plugin hook. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1434094365 | |
https://github.com/simonw/datasette/issues/1881#issuecomment-1301635906 | https://api.github.com/repos/simonw/datasette/issues/1881 | 1301635906 | IC_kwDOBm6k_c5NlWNC | 9599 | 2022-11-03T04:48:09Z | 2022-11-03T04:48:09Z | OWNER | I built this prototype on the http://127.0.0.1:8001/-/allow-debug page, which is open to anyone to visit. But... I just realized that using this tool can leak information - you can use it to guess the names of invisible databases and tables and run theoretical permission checks against them. Using the tool also pollutes the list of permission checks that show up on the root-only `/-/permissions` page. So... I'm going to restrict the usage of this tool to users with access to `/-/permissions` and put it on that page instead. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1434094365 | |
https://github.com/simonw/datasette/issues/1881#issuecomment-1301635340 | https://api.github.com/repos/simonw/datasette/issues/1881 | 1301635340 | IC_kwDOBm6k_c5NlWEM | 9599 | 2022-11-03T04:46:41Z | 2022-11-03T04:46:41Z | OWNER | Built this prototype: ![prototype](https://user-images.githubusercontent.com/9599/199649219-f146e43b-bfb5-45e6-9777-956f21a79887.gif) In building it I realized I needed to know which permissions took a table, a database, both or neither. So I had to bake that into the code. Here's the prototype so far (which includes a prototype of the logic for the `_r` field on actor, see #1855): ```diff diff --git a/datasette/default_permissions.py b/datasette/default_permissions.py index 32b0c758..f68aa38f 100644 --- a/datasette/default_permissions.py +++ b/datasette/default_permissions.py @@ -6,8 +6,8 @@ import json import time -@hookimpl(tryfirst=True) -def permission_allowed(datasette, actor, action, resource): +@hookimpl(tryfirst=True, specname="permission_allowed") +def permission_allowed_default(datasette, actor, action, resource): async def inner(): if action in ( "permissions-debug", @@ -57,6 +57,44 @@ def permission_allowed(datasette, actor, action, resource): return inner +@hookimpl(specname="permission_allowed") +def permission_allowed_actor_restrictions(actor, action, resource): + if actor is None: + return None + _r = actor.get("_r") + if not _r: + # No restrictions, so we have no opinion + return None + action_initials = "".join([word[0] for word in action.split("-")]) + # If _r is defined then we use those to further restrict the actor + # Crucially, we only use this to say NO (return False) - we never + # use it to return YES (True) because that might over-ride other + # restrictions placed on this actor + all_allowed = _r.get("a") + if all_allowed is not None: + assert isinstance(all_allowed, list) + if action_initials in all_allowed: + return None + # How about for the current database? 
+ if action in ("view-database", "view-database-download", "execute-sql"): + database_allowed = _r.get("d", {}).get(resource) + if databa… | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1434094365 | |
https://github.com/simonw/datasette/issues/1855#issuecomment-1301594495 | https://api.github.com/repos/simonw/datasette/issues/1855 | 1301594495 | IC_kwDOBm6k_c5NlMF_ | 9599 | 2022-11-03T03:11:17Z | 2022-11-03T03:11:17Z | OWNER | Maybe the way to do this is through a new standard mechanism on the actor: a set of additional restrictions, e.g.: ``` { "id": "root", "_r": { "a": ["ir", "ur", "dr"], "d": { "fixtures": ["ir", "ur", "dr"] }, "t": { "fixtures": { "searchable": ["ir"] } } } ``` `"a"` is "all permissions" - these apply to everything. `"d"` permissions only apply to the specified database `"t"` permissions only apply to the specified table The way this works is there's a default [permission_allowed(datasette, actor, action, resource)](https://docs.datasette.io/en/stable/plugin_hooks.html#id25) hook which only consults these, and crucially just says NO if those rules do not match. In this way it would apply as an extra layer of permission rules over the defaults (which for this `root` instance would all return yes). | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1423336089 | |
https://github.com/simonw/datasette/issues/1880#issuecomment-1301043042 | https://api.github.com/repos/simonw/datasette/issues/1880 | 1301043042 | IC_kwDOBm6k_c5NjFdi | 525934 | 2022-11-02T18:20:14Z | 2022-11-02T18:20:14Z | NONE | Follow-on question @simonw - is all memory use, for both Datasette and SQLite, confined to the "query time" itself, i.e. is the memory use relevant only to a particular transaction or query - and then subsequently released? | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1433576351 | |
https://github.com/simonw/datasette/issues/1871#issuecomment-1299607082 | https://api.github.com/repos/simonw/datasette/issues/1871 | 1299607082 | IC_kwDOBm6k_c5Ndm4q | 9599 | 2022-11-02T05:45:31Z | 2022-11-02T05:45:31Z | OWNER | I'm going to add a link to the Datasette API docs for the current running version of Datasette, e.g. to https://docs.datasette.io/en/0.63/json_api.html | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1427293909 | |
https://github.com/simonw/datasette/issues/1871#issuecomment-1299600257 | https://api.github.com/repos/simonw/datasette/issues/1871 | 1299600257 | IC_kwDOBm6k_c5NdlOB | 9599 | 2022-11-02T05:36:40Z | 2022-11-02T05:36:40Z | OWNER | The API Explorer should definitely link to the `/-/create-token` page for users who have permission though. And it should probably go in the Datasette application menu? | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1427293909 | |
https://github.com/simonw/datasette/issues/1871#issuecomment-1299599461 | https://api.github.com/repos/simonw/datasette/issues/1871 | 1299599461 | IC_kwDOBm6k_c5NdlBl | 9599 | 2022-11-02T05:35:36Z | 2022-11-02T05:36:15Z | OWNER | Here's a slightly wild idea: what if there was a button on `/-/api` that you could click to turn on "API explorer mode" for the rest of the Datasette interface - which sets a cookie, and that cookie means you then see "API explorer" links in all sorts of other relevant places in the Datasette UI (maybe tucked away in cog menus). Only reason I don't want to show these to everyone is that I don't think this is a very user-friendly feature: if you don't know what an API is I don't want to expose you to it unnecessarily. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1427293909 | |
https://github.com/simonw/datasette/issues/1871#issuecomment-1299598570 | https://api.github.com/repos/simonw/datasette/issues/1871 | 1299598570 | IC_kwDOBm6k_c5Ndkzq | 9599 | 2022-11-02T05:34:28Z | 2022-11-02T05:34:28Z | OWNER | This is pretty useful now. Two features I still want to add: - The ability to link to the API explorer such that the form is pre-filled with material from the URL. Need to guard against clickjacking first though, so no-one can link to it in an invisible iframe and trick the user into hitting POST. - Some kind of list of endpoints so people can click links to start using the API explorer. A list of every table the user can write to with each of their `/db/table/-/insert` endpoints for example. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1427293909 | |
https://github.com/simonw/datasette/issues/1871#issuecomment-1299597066 | https://api.github.com/repos/simonw/datasette/issues/1871 | 1299597066 | IC_kwDOBm6k_c5NdkcK | 9599 | 2022-11-02T05:32:22Z | 2022-11-02T05:32:22Z | OWNER | Demo of the latest API explorer: ![explorer](https://user-images.githubusercontent.com/9599/199406184-1292df42-25ea-4daf-8b54-ca26170ec1ea.gif) | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1427293909 | |
https://github.com/simonw/datasette/issues/1871#issuecomment-1299388341 | https://api.github.com/repos/simonw/datasette/issues/1871 | 1299388341 | IC_kwDOBm6k_c5Ncxe1 | 9599 | 2022-11-02T00:24:28Z | 2022-11-02T00:25:00Z | OWNER | I want JSON syntax highlighting. https://github.com/luyilin/json-format-highlight is an MIT licensed tiny highlighter that looks decent for this. https://unpkg.com/json-format-highlight@1.0.1/dist/json-format-highlight.js | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1427293909 | |
https://github.com/simonw/datasette/issues/1871#issuecomment-1299349741 | https://api.github.com/repos/simonw/datasette/issues/1871 | 1299349741 | IC_kwDOBm6k_c5NcoDt | 9599 | 2022-11-01T23:22:55Z | 2022-11-01T23:22:55Z | OWNER | It's weird that the API explorer only lets you explore POST APIs. It should probably also let you explore GET APIs, or be renamed. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1427293909 | |
https://github.com/simonw/datasette/issues/1879#issuecomment-1299098458 | https://api.github.com/repos/simonw/datasette/issues/1879 | 1299098458 | IC_kwDOBm6k_c5Nbqta | 9599 | 2022-11-01T20:27:40Z | 2022-11-01T20:33:52Z | OWNER | https://github.com/simonw/datasette-x-forwarded-host/blob/main/datasette_x_forwarded_host/__init__.py could happen in core controlled by: `--setting trust_forwarded_host 1` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1432037325 | |
https://github.com/simonw/datasette/issues/1879#issuecomment-1299102108 | https://api.github.com/repos/simonw/datasette/issues/1879 | 1299102108 | IC_kwDOBm6k_c5Nbrmc | 9599 | 2022-11-01T20:30:54Z | 2022-11-01T20:33:06Z | OWNER | One idea: add a `/-/debug` page (or `/-/tips` or `/-/checks`) which shows the incoming requests headers and could even detect if there's an `x-forwarded-host` header that isn't being repeated and show a tip on how to fix that. | { "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1432037325 | |
https://github.com/simonw/datasette/issues/1879#issuecomment-1299102755 | https://api.github.com/repos/simonw/datasette/issues/1879 | 1299102755 | IC_kwDOBm6k_c5Nbrwj | 9599 | 2022-11-01T20:31:37Z | 2022-11-01T20:31:37Z | OWNER | And some JavaScript that can spot if Datasette thinks it is being served over HTTP when it's actually being served over HTTPS. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1432037325 | |
https://github.com/simonw/datasette/issues/1879#issuecomment-1299096850 | https://api.github.com/repos/simonw/datasette/issues/1879 | 1299096850 | IC_kwDOBm6k_c5NbqUS | 9599 | 2022-11-01T20:26:12Z | 2022-11-01T20:26:12Z | OWNER | The other relevant plugin here is https://datasette.io/plugins/datasette-x-forwarded-host Maybe that should be rolled into core too? | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1432037325 | |
https://github.com/simonw/datasette/issues/1879#issuecomment-1299090678 | https://api.github.com/repos/simonw/datasette/issues/1879 | 1299090678 | IC_kwDOBm6k_c5Nboz2 | 9599 | 2022-11-01T20:20:28Z | 2022-11-01T20:20:28Z | OWNER | My first step in debugging these is to install https://datasette.io/plugins/datasette-debug-asgi - but now I'm thinking maybe something like that should be part of core. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1432037325 | |
https://github.com/simonw/datasette/issues/1862#issuecomment-1299073433 | https://api.github.com/repos/simonw/datasette/issues/1862 | 1299073433 | IC_kwDOBm6k_c5NbkmZ | 9599 | 2022-11-01T20:04:31Z | 2022-11-01T20:04:31Z | OWNER | It really feels like this should be accompanied by a `/db/-/create` API for creating tables. I had to add that to `sqlite-utils` eventually (initially it only supported creating by passing in an example document): https://sqlite-utils.datasette.io/en/stable/cli.html#cli-create-table | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1425011030 | |
https://github.com/simonw/datasette/issues/1878#issuecomment-1299071456 | https://api.github.com/repos/simonw/datasette/issues/1878 | 1299071456 | IC_kwDOBm6k_c5NbkHg | 9599 | 2022-11-01T20:02:43Z | 2022-11-01T20:02:43Z | OWNER | Note that "update" is partially covered by the `replace` option to `/-/insert`, added here: - https://github.com/simonw/datasette/issues/1873#issuecomment-1298885451 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1432013704 | |
https://github.com/simonw/datasette/issues/1873#issuecomment-1298919552 | https://api.github.com/repos/simonw/datasette/issues/1873 | 1298919552 | IC_kwDOBm6k_c5Na_CA | 9599 | 2022-11-01T18:11:27Z | 2022-11-01T18:11:27Z | OWNER | I forgot to document `ignore` and `replace`. Also I need to add tests that cover: - Forgetting to include a primary key on a non-autoincrement table - Compound primary keys - Rowid only tables with and without rowid specified I think my validation logic here will get caught out by the fact that `rowid` does not show up as a valid column name: https://github.com/simonw/datasette/blob/9bec7c38eb93cde5afb16df9bdd96aea2a5b0459/datasette/views/table.py#L1151-L1160 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1428630253 | |
https://github.com/simonw/datasette/issues/1873#issuecomment-1298905135 | https://api.github.com/repos/simonw/datasette/issues/1873 | 1298905135 | IC_kwDOBm6k_c5Na7gv | 9599 | 2022-11-01T17:59:59Z | 2022-11-01T17:59:59Z | OWNER | It's a bit surprising that you can send `"ignore": true, "return_rows": true` and the returned `"inserted"` key will list rows that were NOT inserted (since they were ignored). Three options: 1. Ignore that and document it 2. Fix it so `"inserted"` only returns rows that were actually inserted (bit tricky) 3. Change the name of `"inserted"` to something else I'm picking 3 - I'm going to change it to be called `"rows"` instead. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1428630253 | |
https://github.com/simonw/datasette/issues/1873#issuecomment-1298885451 | https://api.github.com/repos/simonw/datasette/issues/1873 | 1298885451 | IC_kwDOBm6k_c5Na2tL | 9599 | 2022-11-01T17:42:20Z | 2022-11-01T17:42:20Z | OWNER | Design decision: ```json { "rows": [{"id": 1, "title": "The title"}], "ignore": true } ``` Or `"replace": true`. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1428630253 | |
https://github.com/simonw/sqlite-utils/issues/506#issuecomment-1298879701 | https://api.github.com/repos/simonw/sqlite-utils/issues/506 | 1298879701 | IC_kwDOCGYnMM5Na1TV | 9599 | 2022-11-01T17:37:13Z | 2022-11-01T17:37:13Z | OWNER | The question I was originally trying to answer here was this: how many rows were actually inserted by that call to `.insert_all()`? I don't know that `.rowcount` would ever be useful here, since the "correct" answer depends on other factors - had I determined to ignore or replace records with a primary key that matches an existing record for example? So I think if people need `rowcount` they can get it by using a `cursor` directly. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1429029604 | |
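The "use a `cursor` directly" suggestion above can be sketched with Python's stdlib `sqlite3` module - the `dogs` table is a made-up example and `sqlite-utils` itself is not required:

```python
import sqlite3

# Illustrates why a single rowcount is ambiguous: with INSERT OR IGNORE,
# rows that collide with an existing primary key are skipped, and
# cursor.rowcount only counts the rows that were actually written.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dogs (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO dogs (id, name) VALUES (1, 'Cleo')")

cursor = conn.execute(
    "INSERT OR IGNORE INTO dogs (id, name) VALUES (1, 'Cleo'), (2, 'Pancakes')"
)
print(cursor.rowcount)  # only the non-colliding row counts toward rowcount
```

Whether that count is the "correct" answer still depends on the conflict strategy chosen, which is the point made above.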
https://github.com/simonw/sqlite-utils/issues/506#issuecomment-1298877872 | https://api.github.com/repos/simonw/sqlite-utils/issues/506 | 1298877872 | IC_kwDOCGYnMM5Na02w | 9599 | 2022-11-01T17:35:30Z | 2022-11-01T17:35:30Z | OWNER | This may not make sense. First, `.last_rowid` is a property on table - but that doesn't make sense for `rowcount` since it should clearly be a property on the database itself (you can run a query directly using `db.execute()` without going through a `Table` object). So I tried this prototype: ```diff diff --git a/docs/python-api.rst b/docs/python-api.rst index 206e5e6..78d3a8d 100644 --- a/docs/python-api.rst +++ b/docs/python-api.rst @@ -186,6 +186,15 @@ The ``db.query(sql)`` function executes a SQL query and returns an iterator over # {'name': 'Cleo'} # {'name': 'Pancakes'} +After executing a query the ``db.rowcount`` property on that database instance will reflect the number of rows affected by any insert, update or delete operations performed by that query: + +.. code-block:: python + + db = Database(memory=True) + db["dogs"].insert_all([{"name": "Cleo"}, {"name": "Pancakes"}]) + print(db.rowcount) + # Outputs: 2 + .. 
_python_api_execute: db.execute(sql, params) diff --git a/sqlite_utils/db.py b/sqlite_utils/db.py index a06f4b7..c19c2dd 100644 --- a/sqlite_utils/db.py +++ b/sqlite_utils/db.py @@ -294,6 +294,8 @@ class Database: _counts_table_name = "_counts" use_counts_table = False + # Number of rows inserted, updated or deleted + rowcount: Optional[int] = None def __init__( self, @@ -480,9 +482,11 @@ class Database: if self._tracer: self._tracer(sql, parameters) if parameters is not None: - return self.conn.execute(sql, parameters) + cursor = self.conn.execute(sql, parameters) else: - return self.conn.execute(sql) + cursor = self.conn.execute(sql) + self.rowcount = cursor.rowcount + return cursor def executescript(self, sql: str) -> sqlite3.Cursor: """ ``` But this happens: ```pycon >>> from sqlite_utils import Database >>> db = Database(memory=True) >>> db["dogs"].insert_a… | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1429029604 | |
https://github.com/simonw/datasette/issues/1876#issuecomment-1298856054 | https://api.github.com/repos/simonw/datasette/issues/1876 | 1298856054 | IC_kwDOBm6k_c5Navh2 | 9599 | 2022-11-01T17:16:01Z | 2022-11-01T17:16:01Z | OWNER | `ta.style.height = ta.scrollHeight + 'px'` is an easy way to do that. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1431786951 | |
https://github.com/simonw/datasette/issues/1876#issuecomment-1298854321 | https://api.github.com/repos/simonw/datasette/issues/1876 | 1298854321 | IC_kwDOBm6k_c5NavGx | 9599 | 2022-11-01T17:14:33Z | 2022-11-01T17:14:33Z | OWNER | I could use a `textarea` here (would need to figure out a neat pattern to expand it to fit the query): <img width="426" alt="image" src="https://user-images.githubusercontent.com/9599/199295041-25abe0b9-f825-43a2-ae5a-face622e08bc.png"> | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1431786951 | |
https://github.com/simonw/sqlite-utils/issues/507#issuecomment-1297859539 | https://api.github.com/repos/simonw/sqlite-utils/issues/507 | 1297859539 | IC_kwDOCGYnMM5NW8PT | 7908073 | 2022-11-01T00:40:16Z | 2022-11-01T00:40:16Z | CONTRIBUTOR | Ideally people could fix their data if they run into this issue. If you are using filenames try [convmv](https://linux.die.net/man/1/convmv) ``` convmv --preserve-mtimes -f utf8 -t utf8 --notest -i -r . ``` maybe this script will also help: ```py import argparse, shutil from pathlib import Path import ftfy from xklb import utils from xklb.utils import log def parse_args() -> argparse.Namespace: parser = argparse.ArgumentParser() parser.add_argument("paths", nargs='*') parser.add_argument("--verbose", "-v", action="count", default=0) args = parser.parse_args() log.info(utils.dict_filter_bool(args.__dict__)) return args def rename_invalid_paths() -> None: args = parse_args() for path in args.paths: log.info(path) for p in sorted([str(p) for p in Path(path).rglob("*")], key=len): fixed = ftfy.fix_text(p, uncurl_quotes=False).replace("\r\n", "\n").replace("\r", "\n").replace("\n", "") if p != fixed: try: shutil.move(p, fixed) except FileNotFoundError: log.warning("FileNotFound. %s", p) else: log.info(fixed) if __name__ == "__main__": rename_invalid_paths() ``` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1430325103 | |
https://github.com/simonw/sqlite-utils/pull/508#issuecomment-1297754631 | https://api.github.com/repos/simonw/sqlite-utils/issues/508 | 1297754631 | IC_kwDOCGYnMM5NWioH | 22429695 | 2022-10-31T22:14:48Z | 2022-10-31T22:53:59Z | NONE | # [Codecov](https://codecov.io/gh/simonw/sqlite-utils/pull/508?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report Base: **96.25**% // Head: **96.09**% // Decreases project coverage by **`-0.15%`** :warning: > Coverage data is based on head [(`2d6a149`)](https://codecov.io/gh/simonw/sqlite-utils/pull/508?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) compared to base [(`529110e`)](https://codecov.io/gh/simonw/sqlite-utils/commit/529110e7d8c4a6b1bbf5fb61f2e29d72aa95a611?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). > Patch coverage: 63.63% of modified lines in pull request are covered. > :exclamation: Current head 2d6a149 differs from pull request most recent head 43a8c4c. 
Consider uploading reports for the commit 43a8c4c to get more accurate results <details><summary>Additional details and impacted files</summary> ```diff @@ Coverage Diff @@ ## main #508 +/- ## ========================================== - Coverage 96.25% 96.09% -0.16% ========================================== Files 4 4 Lines 2401 2407 +6 ========================================== + Hits 2311 2313 +2 - Misses 90 94 +4 ``` | [Impacted Files](https://codecov.io/gh/simonw/sqlite-utils/pull/508?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) | Coverage Δ | | |---|---|---| | [sqlite\_utils/db.py](https://codecov.io/gh/simonw/sqlite-utils/pull/508/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-c3FsaXRlX3V0aWxzL2RiLnB5) | `96.79% <63.63%> (-0.30%)` | :arrow_down: | Help us with your feedback. Take ten seconds to… | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1430563092 | |
https://github.com/simonw/sqlite-utils/issues/448#issuecomment-1297703307 | https://api.github.com/repos/simonw/sqlite-utils/issues/448 | 1297703307 | IC_kwDOCGYnMM5NWWGL | 167893 | 2022-10-31T21:23:51Z | 2022-10-31T21:27:32Z | CONTRIBUTOR | The Windows aspect is a red herring: OP's sample above produces the same error on Linux. (Though I don't know what's going on with the CI). The same error can also be obtained by passing an `io` from a file opened in non-binary mode (`'r'` as opposed to `'rb'`) to `rows_from_file()`. This is how I got here. The fix for my case is easy: open the file in mode `'rb'`. The analogous fix for OP's problem also works: use `BytesIO` in place of `StringIO`. Minimal test case (derived from [utils.py](https://github.com/simonw/sqlite-utils/blob/main/sqlite_utils/utils.py#L304)): ``` python import io from typing import cast #fp = io.StringIO("id,name\n1,Cleo") # error fp = io.BytesIO(bytes("id,name\n1,Cleo", encoding='utf-8')) # okay reader = io.BufferedReader(cast(io.RawIOBase, fp)) reader.peek(1) # exception thrown here ``` I see the signature of `rows_from_file()` correctly has `fp: BinaryIO` but I guess you'd need either a runtime type check for that (not all `io`s have `mode()`), or to catch the `AttributeError` on `peek()` to produce a better error for users. Neither option is ideal. Some thoughts on testing binary-ness of `io`s in this SO question: https://stackoverflow.com/questions/44584829/how-to-determine-if-file-is-opened-in-binary-or-text-mode | { "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1279144769 | |
https://github.com/dogsheep/twitter-to-sqlite/issues/61#issuecomment-1297201971 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/61 | 1297201971 | IC_kwDODEm0Qs5NUbsz | 3153638 | 2022-10-31T14:47:58Z | 2022-10-31T14:47:58Z | NONE | There’s also a limit of 3200 tweets. I wonder if that can be circumvented somehow. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1077560091 | |
https://github.com/simonw/datasette/issues/1864#issuecomment-1296403316 | https://api.github.com/repos/simonw/datasette/issues/1864 | 1296403316 | IC_kwDOBm6k_c5NRYt0 | 9599 | 2022-10-31T00:39:43Z | 2022-10-31T00:39:43Z | OWNER | It looks like SQLite has features for this already: https://www.sqlite.org/foreignkeys.html#fk_actions > Foreign key ON DELETE and ON UPDATE clauses are used to configure actions that take place when deleting rows from the parent table (ON DELETE), or modifying the parent key values of existing rows (ON UPDATE). A single foreign key constraint may have different actions configured for ON DELETE and ON UPDATE. Foreign key actions are similar to triggers in many ways. On that basis, I'm not going to implement anything additional in the `.../-/delete` endpoint relating to foreign keys. Developers who want special treatment of them can do that with a combination of a plugin (maybe I'll build a `datasette-enable-foreign-keys` plugin) and tables created using those `ON DELETE` clauses. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1425029275 | |
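The `ON DELETE` behaviour described above can be sketched with Python's stdlib `sqlite3` module - the `artist`/`track` schema is an illustrative example, not Datasette code:

```python
import sqlite3

# Sketch of ON DELETE CASCADE: deleting the parent row removes its
# children automatically, with no extra application logic needed.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # must be enabled per connection
conn.execute("CREATE TABLE artist (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""
    CREATE TABLE track (
        id INTEGER PRIMARY KEY,
        title TEXT,
        artist_id INTEGER REFERENCES artist(id) ON DELETE CASCADE
    )
""")
conn.execute("INSERT INTO artist VALUES (1, 'X-Ray Spex')")
conn.execute("INSERT INTO track VALUES (1, 'Oh Bondage! Up Yours!', 1)")
conn.execute("DELETE FROM artist WHERE id = 1")
remaining = conn.execute("SELECT count(*) FROM track").fetchone()[0]
print(remaining)  # 0 - the track row was cascaded away
```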
https://github.com/simonw/datasette/issues/1864#issuecomment-1296402071 | https://api.github.com/repos/simonw/datasette/issues/1864 | 1296402071 | IC_kwDOBm6k_c5NRYaX | 9599 | 2022-10-31T00:37:09Z | 2022-10-31T00:37:09Z | OWNER | I need to think about what happens if you delete a row that is the target of a foreign key from another row. https://www.sqlite.org/foreignkeys.html#fk_enable shows that SQLite will only actively enforce these relationships (e.g. throw an error if you try to delete a row that is referenced by another row) if you first run `PRAGMA foreign_keys = ON;` against the connection. > Foreign key constraints are disabled by default (for backwards compatibility), so must be enabled separately for each [database connection](https://www.sqlite.org/c3ref/sqlite3.html). (Note, however, that future releases of SQLite might change so that foreign key constraints enabled by default. Careful developers will not make any assumptions about whether or not foreign keys are enabled by default but will instead enable or disable them as necessary.) I don't actually believe that the SQLite maintainers will ever make that the default though. Datasette doesn't turn these on at the moment, but it could be turned on by a `prepare_connection()` plugin. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1425029275 | |
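A minimal sketch of the per-connection enforcement described above, using the stdlib `sqlite3` module with hypothetical `parent`/`child` tables:

```python
import sqlite3

# Foreign keys are declared in the schema but not enforced until the
# pragma is enabled on this connection.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE parent (id INTEGER PRIMARY KEY)")
conn.execute(
    "CREATE TABLE child (id INTEGER PRIMARY KEY, parent_id INTEGER REFERENCES parent(id))"
)
conn.execute("INSERT INTO parent VALUES (1)")
conn.execute("INSERT INTO child VALUES (1, 1)")

# Without the pragma this delete silently succeeds, orphaning the child row
conn.execute("DELETE FROM parent WHERE id = 1")
conn.execute("INSERT INTO parent VALUES (1)")  # put the parent back
conn.commit()  # the pragma is a no-op inside an open transaction

conn.execute("PRAGMA foreign_keys = ON")
error = None
try:
    conn.execute("DELETE FROM parent WHERE id = 1")
except sqlite3.IntegrityError as ex:
    error = ex
print(error)  # FOREIGN KEY constraint failed
```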
https://github.com/simonw/datasette/issues/1864#issuecomment-1296375536 | https://api.github.com/repos/simonw/datasette/issues/1864 | 1296375536 | IC_kwDOBm6k_c5NRR7w | 9599 | 2022-10-30T23:17:11Z | 2022-10-30T23:17:11Z | OWNER | I'm a bit nervous about calling `.delete()` with the `pk_values` - can I be sure they are in the correct order? https://github.com/simonw/datasette/blob/00632ded30e7cf9f0cf9478680645d1dabe269ae/datasette/views/row.py#L188-L190 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1425029275 | |
https://github.com/simonw/datasette/issues/1864#issuecomment-1296375310 | https://api.github.com/repos/simonw/datasette/issues/1864 | 1296375310 | IC_kwDOBm6k_c5NRR4O | 9599 | 2022-10-30T23:16:19Z | 2022-10-30T23:16:19Z | OWNER | Still needs tests that cover compound primary keys and rowid tables. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1425029275 | |
https://github.com/simonw/datasette/issues/1874#issuecomment-1296363981 | https://api.github.com/repos/simonw/datasette/issues/1874 | 1296363981 | IC_kwDOBm6k_c5NRPHN | 9599 | 2022-10-30T22:19:47Z | 2022-10-30T22:19:47Z | OWNER | Documentation: https://docs.datasette.io/en/1.0-dev/json_api.html#dropping-tables | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1429030341 | |
https://github.com/simonw/sqlite-utils/issues/506#issuecomment-1296358636 | https://api.github.com/repos/simonw/sqlite-utils/issues/506 | 1296358636 | IC_kwDOCGYnMM5NRNzs | 9599 | 2022-10-30T21:52:11Z | 2022-10-30T21:52:11Z | OWNER | This could work in a similar way to `db.insert(...).last_rowid`. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1429029604 | |
https://github.com/simonw/datasette/issues/1873#issuecomment-1296343716 | https://api.github.com/repos/simonw/datasette/issues/1873 | 1296343716 | IC_kwDOBm6k_c5NRKKk | 9599 | 2022-10-30T20:24:55Z | 2022-10-30T20:24:55Z | OWNER | I think the key feature I need here is going to be the equivalent of `ignore=True` and `replace=True` for dealing with primary key collisions, see https://sqlite-utils.datasette.io/en/stable/reference.html#sqlite_utils.db.Table.insert | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1428630253 | |
https://github.com/simonw/datasette/issues/1873#issuecomment-1296343317 | https://api.github.com/repos/simonw/datasette/issues/1873 | 1296343317 | IC_kwDOBm6k_c5NRKEV | 9599 | 2022-10-30T20:22:40Z | 2022-10-30T20:22:40Z | OWNER | So maybe they're not actually worth worrying about separately, because they are guaranteed to have a primary key set. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1428630253 | |
https://github.com/simonw/datasette/issues/1873#issuecomment-1296343173 | https://api.github.com/repos/simonw/datasette/issues/1873 | 1296343173 | IC_kwDOBm6k_c5NRKCF | 9599 | 2022-10-30T20:21:54Z | 2022-10-30T20:22:20Z | OWNER | One last case to consider: `WITHOUT ROWID` tables. https://www.sqlite.org/withoutrowid.html > By default, every row in SQLite has a special column, usually called the "[rowid](https://www.sqlite.org/lang_createtable.html#rowid)", that uniquely identifies that row within the table. However if the phrase "WITHOUT ROWID" is added to the end of a [CREATE TABLE](https://www.sqlite.org/lang_createtable.html) statement, then the special "rowid" column is omitted. There are sometimes space and performance advantages to omitting the rowid. > > ... > > Every WITHOUT ROWID table must have a [PRIMARY KEY](https://www.sqlite.org/lang_createtable.html#primkeyconst). An error is raised if a CREATE TABLE statement with the WITHOUT ROWID clause lacks a PRIMARY KEY. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1428630253 | |
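A quick sketch of the constraint quoted above, again via the stdlib `sqlite3` module:

```python
import sqlite3

# A WITHOUT ROWID table must declare a PRIMARY KEY - so any row in one
# is guaranteed to have an explicit primary key value.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tags (tag TEXT PRIMARY KEY) WITHOUT ROWID")

# Omitting the primary key is a hard error, not a warning
error = None
try:
    conn.execute("CREATE TABLE bad (data TEXT) WITHOUT ROWID")
except sqlite3.OperationalError as ex:
    error = ex
print(error)  # PRIMARY KEY missing on table bad
```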
https://github.com/simonw/datasette/issues/1873#issuecomment-1296343014 | https://api.github.com/repos/simonw/datasette/issues/1873 | 1296343014 | IC_kwDOBm6k_c5NRJ_m | 9599 | 2022-10-30T20:21:01Z | 2022-10-30T20:21:01Z | OWNER | Actually, for simplicity I'm going to say that you can always set the primary key, even for auto-incrementing primary key columns... but you cannot set it on pure `rowid` columns. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1428630253 | |
https://github.com/simonw/datasette/issues/1873#issuecomment-1296342814 | https://api.github.com/repos/simonw/datasette/issues/1873 | 1296342814 | IC_kwDOBm6k_c5NRJ8e | 9599 | 2022-10-30T20:20:05Z | 2022-10-30T20:20:05Z | OWNER | Some notes on what Datasette does already https://latest.datasette.io/fixtures/tags.json?_shape=array returns: ```json [ { "tag": "canine" }, { "tag": "feline" } ] ``` That table is defined [like this](https://latest.datasette.io/fixtures/tags): ```sql CREATE TABLE tags ( tag TEXT PRIMARY KEY ); ``` Here's a `rowid` table with no explicit primary key: https://latest.datasette.io/fixtures/binary_data https://latest.datasette.io/fixtures/binary_data.json?_shape=array ```json [ { "rowid": 1, "data": { "$base64": true, "encoded": "FRwCx60F/g==" } }, { "rowid": 2, "data": { "$base64": true, "encoded": "FRwDx60F/g==" } }, { "rowid": 3, "data": null } ] ``` ```sql CREATE TABLE binary_data ( data BLOB ); ``` https://latest.datasette.io/fixtures/simple_primary_key has a text primary key: https://latest.datasette.io/fixtures/simple_primary_key.json?_shape=array ```json [ { "id": "1", "content": "hello" }, { "id": "2", "content": "world" }, { "id": "3", "content": "" }, { "id": "4", "content": "RENDER_CELL_DEMO" }, { "id": "5", "content": "RENDER_CELL_ASYNC" } ] ``` ```sql CREATE TABLE simple_primary_key ( id varchar(30) primary key, content text ); ``` https://latest.datasette.io/fixtures/compound_primary_key is a compound primary key. https://latest.datasette.io/fixtures/compound_primary_key.json?_shape=array ```json [ { "pk1": "a", "pk2": "b", "content": "c" }, { "pk1": "a/b", "pk2": ".c-d", "content": "c" } ] ``` ```sql CREATE TABLE compound_primary_key ( pk1 varchar(30), pk2 varchar(30), content text, PRIMARY KEY (pk1, pk2) ); ``` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1428630253 | |
https://github.com/simonw/datasette/issues/1873#issuecomment-1296341469 | https://api.github.com/repos/simonw/datasette/issues/1873 | 1296341469 | IC_kwDOBm6k_c5NRJnd | 9599 | 2022-10-30T20:13:50Z | 2022-10-30T20:13:50Z | OWNER | I checked and SQLite itself does allow you to set the `rowid` on that kind of table - it then increments from whatever you inserted: ``` % sqlite3 /tmp/t.db SQLite version 3.39.4 2022-09-07 20:51:41 Enter ".help" for usage hints. sqlite> create table docs (title text); sqlite> insert into docs (title) values ('one'); sqlite> select rowid, title from docs; 1|one sqlite> insert into docs (rowid, title) values (3, 'three'); sqlite> select rowid, title from docs; 1|one 3|three sqlite> insert into docs (title) values ('another'); sqlite> select rowid, title from docs; 1|one 3|three 4|another ``` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1428630253 | |
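The same behaviour reproduces from Python's `sqlite3` module - a standalone sketch of the shell session above:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("create table docs (title text)")
db.execute("insert into docs (title) values ('one')")
# Explicitly supplying a rowid is accepted...
db.execute("insert into docs (rowid, title) values (3, 'three')")
# ...and subsequent auto-assigned rowids continue from the highest one seen
db.execute("insert into docs (title) values ('another')")
print(db.execute("select rowid, title from docs").fetchall())
# [(1, 'one'), (3, 'three'), (4, 'another')]
```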
https://github.com/simonw/datasette/issues/1873#issuecomment-1296341055 | https://api.github.com/repos/simonw/datasette/issues/1873 | 1296341055 | IC_kwDOBm6k_c5NRJg_ | 9599 | 2022-10-30T20:11:47Z | 2022-10-30T20:12:30Z | OWNER | If a table has an auto-incrementing primary key, should you be allowed to insert records with an explicit key into it? I'm torn on this one. It's something you can do with direct database access, but it's something I very rarely want to do. I'm inclined to disallow it and say that if you want that you can get it using a writable canned query instead. Likewise, I'm not going to provide a way to set the `rowid` explicitly on a freshly inserted row. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1428630253 | |
https://github.com/simonw/datasette/issues/1871#issuecomment-1296339386 | https://api.github.com/repos/simonw/datasette/issues/1871 | 1296339386 | IC_kwDOBm6k_c5NRJG6 | 9599 | 2022-10-30T20:03:04Z | 2022-10-30T20:03:04Z | OWNER | I do need to skip CSRF for these API calls. I'm going to start out by doing that using the `skip_csrf()` hook to skip CSRF checks on anything with a `content-type: application/json` request header. ```python from datasette import hookimpl @hookimpl def skip_csrf(scope): if scope["type"] == "http": headers = scope.get("headers") if dict(headers).get(b'content-type') == b'application/json': return True ``` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1427293909 | |
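The header check in that hook can be exercised on its own against a hand-built ASGI scope - a sketch with the logic pulled out into a plain function (`is_json_request` is my name for it; the real hook runs inside Datasette's plugin machinery):

```python
def is_json_request(scope):
    # ASGI scopes carry headers as a list of (name, value) byte-string pairs
    if scope["type"] != "http":
        return False
    headers = dict(scope.get("headers") or [])
    return headers.get(b"content-type") == b"application/json"

scope = {"type": "http", "headers": [(b"content-type", b"application/json")]}
print(is_json_request(scope))  # True
print(is_json_request({"type": "http", "headers": []}))  # False
```

Note that a `content-type: application/json; charset=utf-8` header would not match this exact-equality comparison - something the real hook may need to account for.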
https://github.com/simonw/datasette/issues/1871#issuecomment-1296339205 | https://api.github.com/repos/simonw/datasette/issues/1871 | 1296339205 | IC_kwDOBm6k_c5NRJEF | 9599 | 2022-10-30T20:02:05Z | 2022-10-30T20:02:05Z | OWNER | Realized the API explorer doesn't need the API key piece at all - it can work with standard cookie-based auth. This also reflects how most plugins are likely to use this API, where they'll be adding JavaScript that uses `fetch()` to call the write API directly. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1427293909 | |
https://github.com/simonw/datasette/issues/1871#issuecomment-1296131872 | https://api.github.com/repos/simonw/datasette/issues/1871 | 1296131872 | IC_kwDOBm6k_c5NQWcg | 9599 | 2022-10-30T06:27:56Z | 2022-10-30T06:27:56Z | OWNER | Initial prototype API explorer is now live at https://latest-1-0-dev.datasette.io/-/api | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1427293909 | |
https://github.com/simonw/datasette/issues/1873#issuecomment-1296131681 | https://api.github.com/repos/simonw/datasette/issues/1873 | 1296131681 | IC_kwDOBm6k_c5NQWZh | 9599 | 2022-10-30T06:27:12Z | 2022-10-30T06:27:12Z | OWNER | Relevant TODO: https://github.com/simonw/datasette/blob/c35859ae3df163406f1a1895ccf9803e933b2d8e/datasette/views/table.py#L1131-L1135 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1428630253 | |
https://github.com/simonw/datasette/issues/1872#issuecomment-1296131343 | https://api.github.com/repos/simonw/datasette/issues/1872 | 1296131343 | IC_kwDOBm6k_c5NQWUP | 9599 | 2022-10-30T06:26:01Z | 2022-10-30T06:26:01Z | OWNER | Good spot fixing that! Sorry about this - it was a change in Datasette 0.63 which should have been better called out. My goal for Datasette 1.0 (which I aim to have out by the end of the year) is to introduce a formal process for avoiding problems like this, with very clear documentation when something like this might happen. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1428560020 | |
https://github.com/simonw/datasette/issues/1871#issuecomment-1296130073 | https://api.github.com/repos/simonw/datasette/issues/1871 | 1296130073 | IC_kwDOBm6k_c5NQWAZ | 9599 | 2022-10-30T06:20:56Z | 2022-10-30T06:20:56Z | OWNER | That initial prototype looks like this: <img width="560" alt="image" src="https://user-images.githubusercontent.com/9599/198865445-e5dd8c12-6504-47e0-98e5-2dfde25052f4.png"> It currently shows the returned JSON from the API in an `alert()`. Next I should make that part of the page instead. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1427293909 | |
https://github.com/simonw/datasette/issues/1871#issuecomment-1296126389 | https://api.github.com/repos/simonw/datasette/issues/1871 | 1296126389 | IC_kwDOBm6k_c5NQVG1 | 9599 | 2022-10-30T06:04:48Z | 2022-10-30T06:04:48Z | OWNER | This is even more important now I have pushed: - #1866 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1427293909 | |
https://github.com/simonw/datasette/issues/1871#issuecomment-1296114136 | https://api.github.com/repos/simonw/datasette/issues/1871 | 1296114136 | IC_kwDOBm6k_c5NQSHY | 9599 | 2022-10-30T05:15:40Z | 2022-10-30T05:15:40Z | OWNER | Host it at `/-/api`. It's an input box you can put a path in and a textarea you can put JSON in, plus a submit button to post the request. It lists the API endpoints you can use - click on a link to populate the form fields with an example. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1427293909 | |
https://github.com/simonw/datasette/issues/1872#issuecomment-1296080804 | https://api.github.com/repos/simonw/datasette/issues/1872 | 1296080804 | IC_kwDOBm6k_c5NQJ-k | 192568 | 2022-10-30T03:06:32Z | 2022-10-30T03:06:32Z | CONTRIBUTOR | I updated datasette-publish-vercel to 0.14.2 in requirements.txt, and the site is back up! Is there a way that we can get some sort of notice when something like this will have critical impact on website function? | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1428560020 | |
https://github.com/simonw/datasette/issues/1872#issuecomment-1296076803 | https://api.github.com/repos/simonw/datasette/issues/1872 | 1296076803 | IC_kwDOBm6k_c5NQJAD | 192568 | 2022-10-30T02:50:34Z | 2022-10-30T02:50:34Z | CONTRIBUTOR | should this issue be under https://github.com/simonw/datasette-publish-vercel/issues ? Perhaps I just need to update: datasette-publish-vercel==0.11 in requirements.txt? I'll try that and see what happens... | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1428560020 | |
https://github.com/simonw/datasette/pull/1870#issuecomment-1295667649 | https://api.github.com/repos/simonw/datasette/issues/1870 | 1295667649 | IC_kwDOBm6k_c5NOlHB | 536941 | 2022-10-29T00:52:43Z | 2022-10-29T00:53:43Z | CONTRIBUTOR | > Are you saying that I can build a container, but then when I run it and it does `datasette serve -i data.db ...` it will somehow modify the image, or create a new modified filesystem layer in the runtime environment, as a result of running that `serve` command? Somehow, `datasette serve -i data.db` will lead to the `data.db` being modified, which will trigger a [copy-on-write](https://docs.docker.com/storage/storagedriver/#the-copy-on-write-cow-strategy) of `data.db` into the read-write layer of the container. I don't understand **how** that happens. It kind of feels like a bug in sqlite, but I can't quite follow the sqlite code. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1426379903 | |
https://github.com/simonw/datasette/pull/1870#issuecomment-1295660092 | https://api.github.com/repos/simonw/datasette/issues/1870 | 1295660092 | IC_kwDOBm6k_c5NOjQ8 | 9599 | 2022-10-29T00:25:26Z | 2022-10-29T00:25:26Z | OWNER | Saw your comment here too: https://github.com/simonw/datasette/issues/1480#issuecomment-1271101072 > switching from `immutable=1` to `mode=ro` completely addressed this. see https://github.com/simonw/datasette/issues/1836#issuecomment-1271100651 for details. So maybe we need a special case for containers that are intended to be run using Docker - the ones produced by `datasette package` and `datasette publish cloudrun`? Those are cases where the `-i` option should actually be opened in read-only mode, not immutable mode. Maybe a `datasette serve --irw data.db` option for opening a file in immutable-but-actually-read-only mode? Bit ugly though. I should run some benchmarks to figure out if `immutable` really does offer significant performance benefits. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1426379903 | |
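For reference, both URI flavours can be tried from Python's `sqlite3` module; `mode=ro` rejects writes at the connection level while still honouring changes made by other processes, which is the property under discussion here (a sketch using a throwaway file):

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "data.db")
db = sqlite3.connect(path)
db.execute("create table t (id integer primary key)")
db.commit()
db.close()

# mode=ro opens read-only; immutable=1 additionally promises SQLite that
# the file will never change, letting it skip locking and change detection.
ro = sqlite3.connect(f"file:{path}?mode=ro", uri=True)
print(ro.execute("select count(*) from t").fetchone())  # (0,)
try:
    ro.execute("insert into t default values")
except sqlite3.OperationalError as e:
    print(e)  # attempt to write a readonly database
```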
https://github.com/simonw/datasette/pull/1870#issuecomment-1295657771 | https://api.github.com/repos/simonw/datasette/issues/1870 | 1295657771 | IC_kwDOBm6k_c5NOisr | 9599 | 2022-10-29T00:19:03Z | 2022-10-29T00:19:03Z | OWNER | Just saw your comment here: https://github.com/simonw/datasette/issues/1836#issuecomment-1272357976 > when you are running from docker, you **always** will want to run as `mode=ro` because the same thing that is causing duplication in the inspect layer will cause duplication in the final container read/write layer when `datasette serve` runs. I don't understand this. My mental model of how Docker works is that the image itself is created using `docker build`... but then when the image runs later on (`docker run`) the image itself isn't touched at all. Are you saying that I can build a container, but then when I run it and it does `datasette serve -i data.db ...` it will somehow modify the image, or create a new modified filesystem layer in the runtime environment, as a result of running that `serve` command? | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1426379903 | |
https://github.com/simonw/datasette/issues/1866#issuecomment-1295200988 | https://api.github.com/repos/simonw/datasette/issues/1866 | 1295200988 | IC_kwDOBm6k_c5NMzLc | 9599 | 2022-10-28T16:29:55Z | 2022-10-28T16:29:55Z | OWNER | I wonder if there's something clever I could do here within a transaction? Start a transaction. Write out a temporary in-memory table with all of the existing primary keys in the table. Run the bulk insert. Then run `select pk from table where pk not in (select pk from old_pks)` to see what has changed. I don't think that's going to work well for large tables. I'm going to go with not returning inserted rows by default, unless you pass a special option requesting that. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1426001541 | |
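A sketch of that transaction idea against a toy table (my own illustration of the approach, not Datasette code):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("create table docs (id integer primary key, title text)")
db.execute("insert into docs (title) values ('existing')")

with db:  # everything below happens in one transaction
    # snapshot the existing primary keys into a temporary table
    db.execute("create temp table old_pks as select id from docs")
    # run the bulk insert
    db.executemany("insert into docs (title) values (?)", [("a",), ("b",)])
    # anything absent from the snapshot was just inserted
    new_ids = [row[0] for row in db.execute(
        "select id from docs where id not in (select id from old_pks)"
    )]

print(new_ids)  # [2, 3]
```

The snapshot step is what makes this unattractive for large tables - it copies every existing primary key before each bulk insert.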
https://github.com/simonw/sqlite-utils/issues/496#issuecomment-1294408928 | https://api.github.com/repos/simonw/sqlite-utils/issues/496 | 1294408928 | IC_kwDOCGYnMM5NJxzg | 39538958 | 2022-10-28T03:36:56Z | 2022-10-28T03:37:50Z | NONE | With respect to the typing of the Table class itself, my interim solution: ```python from sqlite_utils.db import Table def tbl(self, table_name: str) -> Table: tbl = self.db[table_name] if isinstance(tbl, Table): return tbl raise Exception(f"Missing {table_name=}") ``` With respect to @chapmanjacobd's concern about `DEFAULT` being an empty class, I have also been using `# type: ignore`, e.g. ```python @classmethod def insert_list(cls, areas: list[str]): return meta.tbl(meta.Areas).insert_all( ({"area": a} for a in areas), ignore=True # type: ignore ) ``` | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1393202060 | |
https://github.com/simonw/datasette/issues/1866#issuecomment-1294316640 | https://api.github.com/repos/simonw/datasette/issues/1866 | 1294316640 | IC_kwDOBm6k_c5NJbRg | 9599 | 2022-10-28T01:51:40Z | 2022-10-28T01:51:40Z | OWNER | This needs to support the following: - Rows do not include a primary key - one is assigned by the database - Rows provide their own primary key, any clashes are errors - Rows provide their own primary key, clashes are silently ignored - Rows provide their own primary key, replacing any existing records | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1426001541 | |
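Those four cases map directly onto SQLite's conflict clauses - a minimal sketch of each against a toy table with an integer primary key:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("create table t (id integer primary key, name text)")

# 1. No primary key supplied - the database assigns one
cur = db.execute("insert into t (name) values ('assigned')")
print(cur.lastrowid)  # 1

# 2. Explicit primary key - a clash is an error
db.execute("insert into t (id, name) values (5, 'explicit')")
try:
    db.execute("insert into t (id, name) values (5, 'clash')")
except sqlite3.IntegrityError as e:
    print(e)  # UNIQUE constraint failed: t.id

# 3. Explicit primary key, clashes silently ignored
db.execute("insert or ignore into t (id, name) values (5, 'ignored')")

# 4. Explicit primary key, clashes replace the existing record
db.execute("insert or replace into t (id, name) values (5, 'replaced')")
print(db.execute("select name from t where id = 5").fetchone())  # ('replaced',)
```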
https://github.com/simonw/datasette/issues/1866#issuecomment-1294306071 | https://api.github.com/repos/simonw/datasette/issues/1866 | 1294306071 | IC_kwDOBm6k_c5NJYsX | 9599 | 2022-10-28T01:37:14Z | 2022-10-28T01:37:59Z | OWNER | Quick crude benchmark: ```python import sqlite3 db = sqlite3.connect(":memory:") def create_table(db, name): db.execute(f"create table {name} (id integer primary key, title text)") create_table(db, "single") create_table(db, "multi") create_table(db, "bulk") def insert_singles(titles): inserted = [] for title in titles: cursor = db.execute(f"insert into single (title) values (?)", [title]) inserted.append((cursor.lastrowid, title)) return inserted def insert_many(titles): db.executemany(f"insert into multi (title) values (?)", ((t,) for t in titles)) def insert_bulk(titles): db.execute("insert into bulk (title) values {}".format( ", ".join("(?)" for _ in titles) ), titles) titles = ["title {}".format(i) for i in range(1, 10001)] ``` Then in iPython I ran these: ``` In [14]: %timeit insert_singles(titles) 23.8 ms ± 535 µs per loop (mean ± std. dev. of 7 runs, 10 loops each) In [13]: %timeit insert_many(titles) 12 ms ± 520 µs per loop (mean ± std. dev. of 7 runs, 100 loops each) In [12]: %timeit insert_bulk(titles) 2.59 ms ± 25 µs per loop (mean ± std. dev. of 7 runs, 100 loops each) ``` So the bulk insert really is a lot faster - 3ms compared to 24ms for single inserts, so ~8x faster. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1426001541 | |
https://github.com/simonw/datasette/issues/1866#issuecomment-1294296767 | https://api.github.com/repos/simonw/datasette/issues/1866 | 1294296767 | IC_kwDOBm6k_c5NJWa_ | 9599 | 2022-10-28T01:22:25Z | 2022-10-28T01:23:09Z | OWNER | Nasty catch on this one: I wanted to return the IDs of the freshly inserted rows. But... the `insert_all()` method I was planning to use from `sqlite-utils` doesn't appear to have a way of doing that: https://github.com/simonw/sqlite-utils/blob/529110e7d8c4a6b1bbf5fb61f2e29d72aa95a611/sqlite_utils/db.py#L2813-L2835 SQLite itself added a `RETURNING` statement which might help, but that is only available from version 3.35 released in March 2021: https://www.sqlite.org/lang_returning.html - which isn't commonly available yet. https://latest.datasette.io/-/versions right now shows 3.34, and https://lite.datasette.io/#/-/versions shows 3.27.2 (from Feb 2019). Three options then: 1. Even for bulk inserts do one insert at a time so I can use `cursor.lastrowid` to get the ID of the inserted record. This isn't terrible since SQLite is very fast, but it may still be a big performance hit for large inserts. 2. Don't return the list of inserted rows for bulk inserts 3. Default to not returning the list of inserted rows for bulk inserts, but allow the user to request that - in which case we use the slower path That third option might be the way to go here. I should benchmark first to figure out how much of a difference this actually makes. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1426001541 | |
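A sketch of what the third option might look like: use `RETURNING` when the SQLite version supports it, otherwise fall back to the slower one-row-at-a-time path with `cursor.lastrowid` (the helper name and shape are hypothetical, not Datasette or sqlite-utils code):

```python
import sqlite3

def insert_rows_returning_ids(db, titles):
    if sqlite3.sqlite_version_info >= (3, 35, 0):
        # RETURNING (SQLite 3.35+, March 2021): one statement, all ids back.
        # RETURNING output order isn't guaranteed, so sort the ids.
        cur = db.execute(
            "insert into docs (title) values {} returning id".format(
                ", ".join("(?)" for _ in titles)
            ),
            titles,
        )
        return sorted(row[0] for row in cur.fetchall())
    # Fallback for older SQLite: insert one at a time so lastrowid is usable
    ids = []
    for title in titles:
        cur = db.execute("insert into docs (title) values (?)", [title])
        ids.append(cur.lastrowid)
    return ids

db = sqlite3.connect(":memory:")
db.execute("create table docs (id integer primary key, title text)")
print(insert_rows_returning_ids(db, ["one", "two"]))  # [1, 2]
```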
https://github.com/simonw/datasette/pull/1870#issuecomment-1294285471 | https://api.github.com/repos/simonw/datasette/issues/1870 | 1294285471 | IC_kwDOBm6k_c5NJTqf | 536941 | 2022-10-28T01:06:03Z | 2022-10-28T01:06:03Z | CONTRIBUTOR | as far as i can tell, [this is where the "immutable" argument is used](https://github.com/sqlite/sqlite/blob/c97bb14fab566f6fa8d967c8fd1e90f3702d5b73/src/pager.c#L4926-L4931) in sqlite: ```c pPager->noLock = sqlite3_uri_boolean(pPager->zFilename, "nolock", 0); if( (iDc & SQLITE_IOCAP_IMMUTABLE)!=0 || sqlite3_uri_boolean(pPager->zFilename, "immutable", 0) ){ vfsFlags |= SQLITE_OPEN_READONLY; goto act_like_temp_file; } ``` so it does set the read only flag, but then has a goto. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1426379903 | |
https://github.com/simonw/datasette/issues/1866#issuecomment-1294282263 | https://api.github.com/repos/simonw/datasette/issues/1866 | 1294282263 | IC_kwDOBm6k_c5NJS4X | 9599 | 2022-10-28T01:00:42Z | 2022-10-28T01:00:42Z | OWNER | I'm going to set the limit at 1,000 rows inserted at a time. I'll make this configurable using a new `max_insert_rows` setting (for consistency with `max_returned_rows`). | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1426001541 | |
https://github.com/simonw/datasette/issues/1851#issuecomment-1294281451 | https://api.github.com/repos/simonw/datasette/issues/1851 | 1294281451 | IC_kwDOBm6k_c5NJSrr | 9599 | 2022-10-28T00:59:25Z | 2022-10-28T00:59:25Z | OWNER | I'm going to use this endpoint for bulk inserts too, so I'm closing this issue and continuing the work here: - #1866 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1421544654 | |
https://github.com/simonw/datasette/pull/1870#issuecomment-1294238862 | https://api.github.com/repos/simonw/datasette/issues/1870 | 1294238862 | IC_kwDOBm6k_c5NJISO | 22429695 | 2022-10-27T23:44:25Z | 2022-10-27T23:44:25Z | NONE | # [Codecov](https://codecov.io/gh/simonw/datasette/pull/1870?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report Base: **92.55**% // Head: **92.55**% // No change to project coverage :thumbsup: > Coverage data is based on head [(`4faa4fd`)](https://codecov.io/gh/simonw/datasette/pull/1870?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) compared to base [(`bf00b0b`)](https://codecov.io/gh/simonw/datasette/commit/bf00b0b59b6692bdec597ac9db4e0b497c5a47b4?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). > Patch has no changes to coverable lines. <details><summary>Additional details and impacted files</summary> ```diff @@ Coverage Diff @@ ## main #1870 +/- ## ======================================= Coverage 92.55% 92.55% ======================================= Files 35 35 Lines 4432 4432 ======================================= Hits 4102 4102 Misses 330 330 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1870?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) | Coverage Δ | | |---|---|---| | [datasette/app.py](https://codecov.io/gh/simonw/datasette/pull/1870/diff?src=pr&el=tree&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `94.30% <ø> (ø)` | | </details> | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1426379903 | |
https://github.com/simonw/datasette/pull/1870#issuecomment-1294237783 | https://api.github.com/repos/simonw/datasette/issues/1870 | 1294237783 | IC_kwDOBm6k_c5NJIBX | 536941 | 2022-10-27T23:42:18Z | 2022-10-27T23:42:18Z | CONTRIBUTOR | Relevant sqlite forum thread: https://www.sqlite.org/forum/forumpost/02f7bda329f41e30451472421cf9ce7f715b768ce3db02797db1768e47950d48 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1426379903 | |
https://github.com/simonw/datasette/issues/1851#issuecomment-1289712350 | https://api.github.com/repos/simonw/datasette/issues/1851 | 1289712350 | IC_kwDOBm6k_c5M33Le | 9599 | 2022-10-24T22:28:39Z | 2022-10-27T23:18:48Z | OWNER | API design: (**UPDATE**: this was [later changed to POST /db/table/-/insert](https://github.com/simonw/datasette/issues/1851#issuecomment-1294224185)) ``` POST /db/table Authorization: Bearer xxx Content-Type: application/json { "row": { "id": 1, "name": "New record" } } ``` Returns: ``` 201 Created { "row": { "id": 1, "name": "New record" } } ``` You can omit optional fields in the input, including the ID field. The returned object will always include all fields - and will even include `rowid` if your object doesn't have a primary key of its own. I decided to use `"row"` as the key in both request and response, to preserve space for other future keys - one that tells you that the table has been created, for example. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1421544654 | |
https://github.com/simonw/datasette/issues/1869#issuecomment-1294181485 | https://api.github.com/repos/simonw/datasette/issues/1869 | 1294181485 | IC_kwDOBm6k_c5NI6Rt | 9599 | 2022-10-27T22:24:37Z | 2022-10-27T22:24:37Z | OWNER | https://docs.datasette.io/en/stable/changelog.html#v0-63 Annotated release notes: https://simonwillison.net/2022/Oct/27/datasette-0-63/ | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1426253476 | |
https://github.com/simonw/datasette/issues/1786#issuecomment-1294116493 | https://api.github.com/repos/simonw/datasette/issues/1786 | 1294116493 | IC_kwDOBm6k_c5NIqaN | 9599 | 2022-10-27T21:50:12Z | 2022-10-27T21:50:12Z | OWNER | Demo in Datasette Lite: https://lite.datasette.io/#/fixtures?sql=select%0A++pk1%2C%0A++pk2%2C%0A++content%2C%0A++sortable%2C%0A++sortable_with_nulls%2C%0A++sortable_with_nulls_2%2C%0A++text%0Afrom%0A++sortable%0Aorder+by%0A++pk1%2C%0A++pk2%0Alimit%0A++101 | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
1342430983 |