pull_requests
608 rows sorted by title
merged_at (date) facet — top values (more than 30 distinct dates in total):
- 2019-05-03 8
- 2020-05-04 8
- 2021-10-13 8
- 2021-05-19 6
- 2022-03-06 6
- 2021-05-22 5
- 2018-04-14 4
- 2020-10-12 4
- 2021-03-29 4
- 2021-11-30 4
- 2022-07-18 4
- 2023-03-29 4
- 2017-11-17 3
- 2019-07-03 3
- 2019-10-14 3
- 2020-02-25 3
- 2020-10-14 3
- 2020-10-27 3
- 2021-06-02 3
- 2021-08-25 3
- 2021-10-14 3
- 2022-08-17 3
- 2022-08-27 3
- 2022-12-08 3
- 2023-05-08 3
- 2023-05-21 3
- 2023-08-29 3
- 2023-09-06 3
- 2023-11-04 3
- 2017-11-15 2
- …
id | node_id | number | state | locked | title ▼ | user | body | created_at | updated_at | closed_at | merged_at | merge_commit_sha | assignee | milestone | draft | head | base | author_association | repo | url | merged_by | auto_merge |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
181723303 | MDExOlB1bGxSZXF1ZXN0MTgxNzIzMzAz | 209 | closed | 0 | Don't duplicate simple primary keys in the link column | russss 45057 | When there's a simple (single-column) primary key, it looks weird to duplicate it in the link column. This change removes the second PK column and treats the link column as if it were the PK column from a header/sorting perspective. This might make it a bit more difficult to tell what the link for the row is, I'm not sure yet. I feel like the alternative is to change the link column to just have the text "view" or something, instead of repeating the PK. (I doubt it makes much more sense with compound PKs.) Bonus change in this PR: fix urlencoding of links in the displayed HTML. Before: ![image](https://user-images.githubusercontent.com/45057/38783830-e2ababb4-40ff-11e8-97fb-25e286a8c920.png) After: ![image](https://user-images.githubusercontent.com/45057/38783835-ebf6b48e-40ff-11e8-8c47-6a864cf21ccc.png) | 2018-04-15T21:56:15Z | 2018-04-18T08:40:37Z | 2018-04-18T01:13:04Z | 2018-04-18T01:13:04Z | 136a70d88741e2a5892c3de437064a9d14494d66 | 0 | 4acde4e187795214af6fc86f46af48982ec5de46 | bf5ec2d61148f9852441934dd206b3b1c07a512f | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/209 | ||||
208719043 | MDExOlB1bGxSZXF1ZXN0MjA4NzE5MDQz | 361 | closed | 0 | Import pysqlite3 if available, closes #360 | simonw 9599 | 2018-08-16T00:52:21Z | 2018-08-16T00:58:57Z | 2018-08-16T00:58:57Z | 2018-08-16T00:58:57Z | aae49fef3b75848628d824077ec063834e3e5167 | 0 | da41daa168af8f29a1beb5278aed833cf3dc48ce | e1db8194e8c1d7f361fd0c1c3fc1b91d6aa920e5 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/361 | |||||
271338405 | MDExOlB1bGxSZXF1ZXN0MjcxMzM4NDA1 | 434 | closed | 0 | "datasette publish cloudrun" command to publish to Google Cloud Run | rprimet 10352819 | This is a very rough draft to start a discussion on a possible datasette cloud run publish plugin (see issue #400). The main change was to dynamically set the listening port in `make_dockerfile` to satisfy cloud run's [requirements](https://cloud.google.com/run/docs/reference/container-contract). This was done by running `datasette` through `sh` to get environment variable substitution. Not sure if that's the right approach? | 2019-04-17T14:41:18Z | 2019-05-03T21:50:44Z | 2019-05-03T13:59:02Z | 2019-05-03T13:59:02Z | 75a21fc2a136ccfc9da7bbf521cf288e63c9707f | 0 | 74c20d0d2eac13892ac20db0e66fcb3437544aa6 | bf229c9bd88179c8ec16bd65fd4fb28ab4241c2e | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/434 | ||||
1154884166 | PR_kwDOBm6k_c5E1iJG | 1938 | closed | 0 | "permissions" blocks in metadata.json/yaml | simonw 9599 | Refs #1636 - [x] Documentation - [ ] Implementation - [ ] Validate metadata to check there are no nonsensical permissions (like `debug-menu` set at the table level) - [ ] Tests <!-- readthedocs-preview datasette start --> ---- :books: Documentation preview :books:: https://datasette--1938.org.readthedocs.build/en/1938/ <!-- readthedocs-preview datasette end --> | 2022-12-08T22:07:36Z | 2022-12-13T05:23:19Z | 2022-12-13T05:23:18Z | 271ea3ae0c858de2d392b61a1a4a9f5837cbddf8 | Datasette 1.0a2 8711695 | 0 | 6e35a6b4f7ea9ba3fb6f02f45452eeb41de69786 | e539c1c024bc62d88df91d9107cbe37e7f0fe55f | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/1938 | ||||
466410755 | MDExOlB1bGxSZXF1ZXN0NDY2NDEwNzU1 | 927 | closed | 0 | 'datasette --get' option, refs #926 | simonw 9599 | Refs #926, #898 | 2020-08-11T23:31:52Z | 2020-08-12T00:24:42Z | 2020-08-12T00:24:41Z | 2020-08-12T00:24:41Z | e139a7619f63d45ca2ff1ee108b933e17b5675b3 | 0 | 2111da01a03cfc62303b6a4b59ea9f96d22c0f78 | 83eda049af3f38d4289118d3576f96b2535084b1 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/927 | ||||
575940193 | MDExOlB1bGxSZXF1ZXN0NTc1OTQwMTkz | 1232 | closed | 0 | --crossdb option for joining across databases | simonw 9599 | Refs #283. Still needs: - [x] Unit test for --crossdb queries - [x] Show warning on console if it truncates at ten databases (or on web interface) - [x] Show connected databases on the `/_memory` database page - [x] Documentation - [x] https://latest.datasette.io/ demo should demonstrate this feature | 2021-02-18T19:48:50Z | 2021-02-18T22:09:13Z | 2021-02-18T22:09:12Z | 2021-02-18T22:09:12Z | 6f41c8a2bef309a66588b2875c3e24d26adb4850 | 0 | 887649942b02d70a0fe4e205e1e5eff4e745b016 | 4df548e7668b5b21d64a267964951e67894f4712 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/1232 | ||||
375180832 | MDExOlB1bGxSZXF1ZXN0Mzc1MTgwODMy | 672 | open | 0 | --dirs option for scanning directories for SQLite databases | simonw 9599 | Refs #417. | 2020-02-14T02:25:52Z | 2020-03-27T01:03:53Z | 0e0e544f1f23451f04d7ca576ace5b18ce168e6f | 0 | ee718b98b793df2a15b125cbf20816c9864bf7e9 | 6aa516d82dea9885cb4db8d56ec2ccfd4cd9b840 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/672 | ||||||
815164865 | PR_kwDOCGYnMM4wlm3B | 361 | closed | 0 | --lines and --text and --convert and --import | simonw 9599 | Refs #356 Still TODO: - [x] Get `--lines` working, with tests - [x] Get `--text` working, with tests - [x] Get regular JSON import working with `--convert` with tests - [x] Get `--lines` working with `--convert` with tests - [x] Get `--text` working with `--convert` with tests - [x] Get `--csv` and `--tsv` import working with `--convert` with tests - [x] Get `--nl` working with `--convert` with tests - [x] Documentation for all of the above | 2022-01-06T01:49:44Z | 2022-01-06T06:37:03Z | 2022-01-06T06:24:54Z | 2022-01-06T06:24:54Z | 413f8ed754e38d7b190de888c85fe8438336cb11 | 0 | b7f0b88d49032a053f0de2dbba356ee1f3b949c0 | f3fd8613113d21d44238a6ec54b375f5aa72c4e0 | OWNER | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/361 | ||||
397749653 | MDExOlB1bGxSZXF1ZXN0Mzk3NzQ5NjUz | 714 | closed | 0 | --metadata accepts YAML as well as JSON | simonw 9599 | Refs #713. Still needs tests and documentation. | 2020-04-02T18:36:02Z | 2020-04-02T19:30:54Z | 2020-04-02T19:30:54Z | 2020-04-02T19:30:54Z | 6717c719dd36dc2adc0f9da38a8c8e08129e96b4 | 0 | 5170c31adc44f6ef14b21782ba6f8ecb46dd9450 | 2aaad72789c427875426673c1a43e67c86fc970e | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/714 | ||||
295065796 | MDExOlB1bGxSZXF1ZXN0Mjk1MDY1Nzk2 | 544 | closed | 0 | --plugin-secret option | simonw 9599 | Refs #543 - [x] Zeit Now v1 support - [x] Solve escaping of ENV in Dockerfile - [x] Heroku support - [x] Unit tests - [x] Cloud Run support - [x] Documentation | 2019-07-06T22:18:20Z | 2019-07-08T02:06:31Z | 2019-07-08T02:06:31Z | 2019-07-08T02:06:31Z | 973f8f139df6ad425354711052cfc2256de2e522 | Datasette 0.29 4471010 | 0 | ccf80604e931fba1893b5bab11de386fed82009e | fcfcae21e67cc15090942b1d2a47b5f016279337 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/544 | |||
572209243 | MDExOlB1bGxSZXF1ZXN0NTcyMjA5MjQz | 1222 | closed | 0 | --ssl-keyfile and --ssl-certfile, refs #1221 | simonw 9599 | 2021-02-12T00:45:58Z | 2021-02-12T00:52:18Z | 2021-02-12T00:52:17Z | 2021-02-12T00:52:17Z | eda652cf6ee28a0babfb30ce3834512e9e33fb8e | 0 | 8ec72ea3e3e0a9876d5e61e4a2260224f16db2e3 | aa1fe0692c2abb901216738bfb35f9fcc5090e7d | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/1222 | |||||
368734500 | MDExOlB1bGxSZXF1ZXN0MzY4NzM0NTAw | 663 | closed | 0 | -p argument for datasette package, plus tests - refs #661 | simonw 9599 | 2020-01-29T19:47:49Z | 2020-01-29T22:46:43Z | 2020-01-29T22:46:43Z | 2020-01-29T22:46:43Z | 67fc9c5720ed1fcd62b116481f70d4e80b403a22 | 0 | 8adfc9db7f15e36fed677be4a9c833ff2cdec0bc | 34d77d780f68b778fd9d6ebbaf69f250436f055f | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/663 | |||||
1492599485 | PR_kwDOBm6k_c5Y90K9 | 2161 | closed | 0 | -s/--setting x y gets merged into datasette.yml, refs #2143, #2156 | simonw 9599 | This change updates the `-s/--setting` option to `datasette serve` to allow it to be used to set arbitrarily complex nested settings in a way that is compatible with the new `-c datasette.yml` work happening in: - #2143 It will enable things like this: ``` datasette data.db --setting plugins.datasette-ripgrep.path "/home/simon/code" ``` For the moment though it just affects [settings](https://docs.datasette.io/en/1.0a4/settings.html) - so you can do this: ``` datasette data.db --setting settings.sql_time_limit_ms 3500 ``` I've also implemented a backwards compatibility mechanism, so if you use it this way (the old way): ``` datasette data.db --setting sql_time_limit_ms 3500 ``` It will notice that the setting you passed is one of Datasette's core settings, and will treat that as if you said `settings.sql_time_limit_ms` instead. <!-- readthedocs-preview datasette start --> ---- :books: Documentation preview :books:: https://datasette--2161.org.readthedocs.build/en/2161/ <!-- readthedocs-preview datasette end --> | 2023-08-28T19:30:42Z | 2023-08-28T20:06:15Z | 2023-08-28T20:06:14Z | 2023-08-28T20:06:14Z | d9aad1fd042a25d226f2ace1f7827b4602761038 | 0 | a5cbf80d795b599697b2b873566386abb0cd8b32 | 527cec66b0403e689c8fb71fc8b381a1d7a46516 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/2161 | ||||
512545364 | MDExOlB1bGxSZXF1ZXN0NTEyNTQ1MzY0 | 1061 | closed | 0 | .blob output renderer | simonw 9599 | - [x] Remove the `/-/...blob/...` route I added in #1040 in place of the new `.blob` renderer URLs - [x] Link to new `.blob` download links on the arbitrary query page (using `_blob_hash=...`) - plus tests for this Closes #1050, Closes #1051 | 2020-10-29T20:25:08Z | 2020-10-29T22:01:40Z | 2020-10-29T22:01:39Z | 2020-10-29T22:01:39Z | 78b3eeaad9189eb737014f53212082684f4bb0d4 | 0.51 6026070 | 0 | 1196d084de6a7a6f68c7705a6cc096bb8df132e3 | d6f9ff71378c4eab34dad181c23cfc143a4aef2d | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/1061 | |||
379192258 | MDExOlB1bGxSZXF1ZXN0Mzc5MTkyMjU4 | 683 | closed | 0 | .execute_write() and .execute_write_fn() methods on Database | simonw 9599 | See #682 - [x] Come up with design for `.execute_write()` and `.execute_write_fn()` - [x] Build some quick demo plugins to exercise the design - [x] Write some unit tests - [x] Write the documentation | 2020-02-24T19:51:58Z | 2020-05-30T18:40:20Z | 2020-02-25T04:45:08Z | 2020-02-25T04:45:08Z | a093c5f79fa034a97d2ad8b606745dd3b80365af | Datasette 1.0 3268330 | 0 | ec6e2edfe18446c9d77e3f30efbc299d27ea5c1b | 411056c4c43e74f2b3d0e3bc1175e7998516b1b3 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/683 | |||
297459797 | MDExOlB1bGxSZXF1ZXN0Mjk3NDU5Nzk3 | 40 | closed | 0 | .get() method plus support for compound primary keys | simonw 9599 | - [x] Tests for the `NotFoundError` exception - [x] Documentation for `.get()` method - [x] Support `--pk` multiple times to define CLI compound primary keys - [x] Documentation for compound primary keys | 2019-07-15T03:43:13Z | 2019-07-15T04:28:57Z | 2019-07-15T04:28:52Z | 2019-07-15T04:28:52Z | c65b67ca46f70e2da46a5b945f4ed358173262e9 | 0 | b5a5df6d0ed47f33f6e1b4873948ead9a7c71060 | 65b2156d9cc0aa6b5c3dc7a6bd600d98b281a13b | OWNER | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/40 | ||||
1479795255 | PR_kwDOCGYnMM5YM-I3 | 584 | closed | 0 | .transform() instead of modifying sqlite_master for add_foreign_keys | simonw 9599 | Refs: - #577 <!-- readthedocs-preview sqlite-utils start --> ---- :books: Documentation preview :books:: https://sqlite-utils--584.org.readthedocs.build/en/584/ <!-- readthedocs-preview sqlite-utils end --> | 2023-08-17T23:32:45Z | 2023-08-18T00:48:13Z | 2023-08-18T00:48:08Z | 2023-08-18T00:48:08Z | 509857ee8724f73760f3631b69c26f9047381187 | 0 | 291505084e652972ad806383250757d41d596d38 | 1dc6b5aa644a92d3654f7068110ed7930989ce71 | OWNER | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/584 | ||||
1507097949 | PR_kwDOCGYnMM5Z1H1d | 593 | closed | 0 | .transform() now preserves rowid values, refs #592 | simonw 9599 | Refs: - #592 - [x] Tests against weird shaped tables I need to test that this works against: - `rowid` tables - Tables that have a column called `rowid` even though they are not rowid tables <!-- readthedocs-preview sqlite-utils start --> ---- :books: Documentation preview :books:: https://sqlite-utils--593.org.readthedocs.build/en/593/ <!-- readthedocs-preview sqlite-utils end --> | 2023-09-08T01:02:28Z | 2023-09-10T17:44:59Z | 2023-09-09T00:45:30Z | 2023-09-09T00:45:30Z | 1c6ea54338e24fcebcee4e2f9c170ee300a5d946 | 0 | b86374f705d1f4143a51634b30289cb48add0ea2 | 5d123f031fc4fadc98f508e0ef6b7b6671e86155 | OWNER | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/593 | ||||
507903392 | MDExOlB1bGxSZXF1ZXN0NTA3OTAzMzky | 1040 | closed | 0 | /db/table/-/blob/pk/column.blob download URL | simonw 9599 | Refs #1036. Still needs: - [x] Comprehensive tests across all of the code branches, plus permissions - [x] A bit more refactoring to share logic cleanly with `RowView` - ~~A configuration option to disable this feature (probably)~~ | 2020-10-21T22:39:15Z | 2020-10-24T23:09:20Z | 2020-10-24T23:09:19Z | 2020-10-24T23:09:19Z | 5a1519796037105bc20bcf2f91a76e022926c204 | 0.51 6026070 | 0 | 4f3165f25fd9241fcf1291c797f4c77766b954dc | bf82b3d6a605c9ddadd5fb739249dfe6defaf635 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/1040 | |||
1144085408 | PR_kwDOBm6k_c5EMVug | 1931 | closed | 0 | /db/table/-/upsert | simonw 9599 | Refs #1878 Still todo: - [x] Support `"return": true` properly for upserts (with tests) - [x] Require both `insert-row` and `update-row` permissions - [x] Tests are going to need to cover both rowid-only and compound primary key tables, including all of the error states - [x] Documentation <!-- readthedocs-preview datasette start --> ---- :books: Documentation preview :books:: https://datasette--1931.org.readthedocs.build/en/1931/ <!-- readthedocs-preview datasette end --> | 2022-12-03T07:01:44Z | 2022-12-08T01:12:17Z | 2022-12-08T01:12:16Z | 2022-12-08T01:12:16Z | 272982e8a6f45700ff93c3917b4688a86de0e672 | Datasette 1.0a2 8711695 | 0 | 7cd6fd9f76913196d4f99a194a30e406f33aa363 | 93ababe6f7150454d2cf278dae08569e505d2a5b | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/1931 | |||
152360740 | MDExOlB1bGxSZXF1ZXN0MTUyMzYwNzQw | 81 | closed | 0 | :fire: Removes DS_Store | jefftriplett 50527 | 2017-11-13T22:07:52Z | 2017-11-14T02:24:54Z | 2017-11-13T22:16:55Z | 2017-11-13T22:16:55Z | 06a826c3188af82f27bb6b4e09cc89b782d30bd6 | 0 | c66d297eac556a7f4fd4dcdb15cfb9466fddac77 | d75f423b6fcfc074b7c6f8f7679da8876f181edd | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/81 | |||||
249680944 | MDExOlB1bGxSZXF1ZXN0MjQ5NjgwOTQ0 | 9 | closed | 0 | :pencil: Updates my_database.py to my_database.db | jefftriplett 50527 | I noticed that both `.py` and `.db` were used in the docs and assumed you'd prefer `.db`. | 2019-02-01T17:35:43Z | 2019-02-24T03:55:04Z | 2019-02-24T03:55:04Z | 2019-02-24T03:55:04Z | c5068a0972651b3e359ebc2d6c1486b8b7d2c242 | 0 | 1ad604fbbd3311f041357190796a3613c0c729d1 | 441c131db5cc68e197db19f0623ff8a96c90c3ff | CONTRIBUTOR | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/9 | ||||
651492888 | MDExOlB1bGxSZXF1ZXN0NjUxNDkyODg4 | 1339 | closed | 0 | ?_col=/?_nocol= to show/hide columns on the table page | simonw 9599 | See #615. Still to do: - [x] Allow combination of `?_col=` and `?_nocol=` (`_nocol` wins) - [x] Deduplicate same column if passed in `?_col=` multiple times - [x] Validate that user did not try to remove a primary key - [x] Add tests - [x] Ensure this works correctly for SQL views - [x] Add documentation | 2021-05-24T17:15:20Z | 2021-05-27T04:17:44Z | 2021-05-27T04:17:43Z | 2021-05-27T04:17:43Z | f1c29fd6a184254aa68efadf096bcf21e848f921 | 0 | 387c8379b92e559180098f73017a1bf2e6370205 | 2bd9d54b2762c991e11950c22c88c0336158d49b | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/1339 | ||||
1212277427 | PR_kwDOBm6k_c5IQeKz | 1999 | closed | 0 | ?_extra= support (draft) | simonw 9599 | Refs: - #262 <!-- readthedocs-preview datasette start --> ---- :books: Documentation preview :books:: https://datasette--1999.org.readthedocs.build/en/1999/ <!-- readthedocs-preview datasette end --> | 2023-01-21T04:55:18Z | 2023-03-22T22:49:41Z | 2023-03-22T22:49:40Z | 2023-03-22T22:49:40Z | d97e82df3c8a3f2e97038d7080167be9bb74a68d | 0 | 69a31cd5b61f0b62938efdeec5972090f1a1a508 | 56b0758a5fbf85d01ff80a40c9b028469d7bb65f | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/1999 | ||||
195339111 | MDExOlB1bGxSZXF1ZXN0MTk1MzM5MTEx | 311 | closed | 0 | ?_labels=1 to expand foreign keys (in csv and json), refs #233 | simonw 9599 | Output looks something like this: { "rowid": 233, "TreeID": 121240, "qLegalStatus": { "value": 2, "label": "Private" }, "qSpecies": { "value": 16, "label": "Sycamore" }, "qAddress": "91 Commonwealth Ave", ... } | 2018-06-16T16:31:12Z | 2018-06-16T22:20:31Z | 2018-06-16T22:20:31Z | 9fe59e54ad65eb1c8239b1a78edb5219d3ab8ab0 | 0 | 40287b1ba09d6e75f0db1458fe78d8c055f128af | d0a578c0fc07b9d9208cd9de981bdf7385a26c49 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/311 | |||||
379378780 | MDExOlB1bGxSZXF1ZXN0Mzc5Mzc4Nzgw | 686 | closed | 0 | ?_searchmode=raw option | simonw 9599 | Closes #676 | 2020-02-25T05:45:50Z | 2020-02-25T05:56:09Z | 2020-02-25T05:56:04Z | 2020-02-25T05:56:04Z | 6cb65555f46456eb31b62e855e21b1d8c809b1a2 | 0 | abc782cb342c21b565142e44e70502e61ac6756b | a093c5f79fa034a97d2ad8b606745dd3b80365af | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/686 | ||||
185307407 | MDExOlB1bGxSZXF1ZXN0MTg1MzA3NDA3 | 246 | closed | 0 | ?_shape=array and _timelimit= | simonw 9599 | 2018-05-02T00:18:54Z | 2018-05-02T00:20:41Z | 2018-05-02T00:20:40Z | 2018-05-02T00:20:40Z | 690736436bac599ca042d1caa465c6d66d2651f9 | 0 | 3807d93b98573e142858c5871b8b4aadda71d28f | aa954382c3776d596f459897b0d984161293529d | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/246 | |||||
270191084 | MDExOlB1bGxSZXF1ZXN0MjcwMTkxMDg0 | 430 | closed | 0 | ?_where= parameter on table views, closes #429 | simonw 9599 | 2019-04-13T01:15:09Z | 2019-04-13T01:37:23Z | 2019-04-13T01:37:23Z | 2019-04-13T01:37:23Z | bc6a9b45646610f362b4287bc4110440991aa4d6 | 0 | 3ee087c7b60da7ec3e5d2f73611fc6ea99ff82fc | e11cb4c66442abca2a6b6159521a6cf4da8739c1 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/430 | |||||
657373726 | MDExOlB1bGxSZXF1ZXN0NjU3MzczNzI2 | 262 | closed | 0 | Ability to add descending order indexes | simonw 9599 | Refs #260 | 2021-05-29T04:51:04Z | 2021-05-29T05:01:42Z | 2021-05-29T05:01:39Z | 2021-05-29T05:01:39Z | 51d01da30d45c1fbc1e587e6046a933529cf915e | 0 | 50a4fb722d29c3e53f7b148a41aeda901d02a264 | b2302875c97f723e02cc39136d0b20fd706369aa | OWNER | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/262 | ||||
716262829 | MDExOlB1bGxSZXF1ZXN0NzE2MjYyODI5 | 1444 | closed | 0 | Ability to deploy demos of branches | simonw 9599 | See #1442. | 2021-08-19T21:08:04Z | 2021-08-19T21:09:44Z | 2021-08-19T21:09:39Z | 2021-08-19T21:09:39Z | d84e574e59c51ddcd6cf60a6f9b3d45182daf824 | 0 | 75f9fe6d6bc642ce5587dd74eed70064c11868be | adb5b70de5cec3c3dd37184defe606a082c232cf | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/1444 | ||||
719109709 | MDExOlB1bGxSZXF1ZXN0NzE5MTA5NzA5 | 321 | closed | 0 | Ability to insert file contents as text, in addition to blob | simonw 9599 | Refs #319. | 2021-08-24T22:37:18Z | 2021-08-24T23:31:17Z | 2021-08-24T23:31:13Z | 2021-08-24T23:31:13Z | 49a010c93d90bc68ce1c6fff7639927248912b54 | 0 | db2dd6d9f30b347f4ed22b07f59b5a615184fbfd | 9258f4bd8450c951900de998a7bf81ca9b45a014 | OWNER | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/321 | ||||
308292447 | MDExOlB1bGxSZXF1ZXN0MzA4MjkyNDQ3 | 55 | closed | 0 | Ability to introspect and run queries against views | simonw 9599 | See #54 | 2019-08-17T13:40:56Z | 2019-08-23T12:19:42Z | 2019-08-23T12:19:42Z | 2019-08-23T12:19:42Z | 9faa98222669723d31e918bb16a42c13c363817f | 0 | 4441d6d838fd7518ce715184361f549a04ec8b70 | 0e7b461eb3e925aef713206c15794ceae9259c57 | OWNER | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/55 | ||||
434055752 | MDExOlB1bGxSZXF1ZXN0NDM0MDU1NzUy | 844 | closed | 0 | Action to run tests and upload coverage report | simonw 9599 | Refs #843 | 2020-06-13T20:52:47Z | 2020-06-13T21:36:52Z | 2020-06-13T21:36:50Z | 2020-06-13T21:36:50Z | cf7a2bdb404734910ec07abc7571351a2d934828 | 0 | 1210d9f41841bdca450f85a2342cdb0ff339c1b4 | 80c18a18fc444b89cc12b73599d56e091f3a3c87 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/844 | ||||
1319390463 | PR_kwDOBm6k_c5OpEz_ | 2061 | open | 0 | Add "Packaging a plugin using Poetry" section in docs | rclement 1238873 | This PR adds a new section about packaging a plugin using `poetry` within the "Writing plugins" page of the documentation. <!-- readthedocs-preview datasette start --> ---- :books: Documentation preview :books:: https://datasette--2061.org.readthedocs.build/en/2061/ <!-- readthedocs-preview datasette end --> | 2023-04-19T07:23:28Z | 2023-04-19T07:27:18Z | e777f394dc770714055e48c952e04c1620454e3e | 0 | 2650e3ca2c5ae4f21efe216f9959be31d9e58eed | 5890a20c374fb0812d88c9b0ef26a838bfa06c76 | FIRST_TIME_CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/2061 | ||||||
716357982 | MDExOlB1bGxSZXF1ZXN0NzE2MzU3OTgy | 66 | open | 0 | Add --merged-by flag to pull-requests sub command | sarcasticadmin 30531572 | ## Description Proposing a solution to the API limitation for `merged_by` in pull_requests. Specifically the following called out in the readme: ``` Note that the merged_by column on the pull_requests table will only be populated for pull requests that are loaded using the --pull-request option - the GitHub API does not return this field for pull requests that are loaded in bulk. ``` This approach might cause larger repos to hit rate limits called out in https://github.com/dogsheep/github-to-sqlite/issues/51 but seems to work well in the repos I tested and included below. ## Old Behavior - Had to list out the pull-requests individually via multiple `--pull-request` flags ## New Behavior - `--merged-by` flag for getting `merged_by` information out of pull-requests without having to specify individual PR numbers. # Testing Picking some repo that has more than one merger (datasette only has 1 😉 ) ``` $ github-to-sqlite pull-requests ./github.db opnsense/tools --merged-by $ echo "select id, url, merged_by from pull_requests;" | sqlite3 ./github.db 83533612|https://github.com/opnsense/tools/pull/39|1915288 102632885|https://github.com/opnsense/tools/pull/43|1915288 149114810|https://github.com/opnsense/tools/pull/57|1915288 160394495|https://github.com/opnsense/tools/pull/64|1915288 163308408|https://github.com/opnsense/tools/pull/67|1915288 169723264|https://github.com/opnsense/tools/pull/69|1915288 171381422|https://github.com/opnsense/tools/pull/72|1915288 179938195|https://github.com/opnsense/tools/pull/77|1915288 196233824|https://github.com/opnsense/tools/pull/82|1915288 215289964|https://github.com/opnsense/tools/pull/93| 219696100|https://github.com/opnsense/tools/pull/97|1915288 223664843|https://github.com/opnsense/tools/pull/99| 228446172|https://github.com/opnsense/tools/pull/103|1915288 238930434|https://github.com/opnsense/tools/pull/110|1915288 255507110|https://github.com/opnsense/tools/pull/119|1915288 255980675|https://github.com/opnsense/tools/pull/120… | 2021-08-20T00:57:55Z | 2021-09-28T21:50:31Z | 6b4276d9469e4579c81588ac9e3d128026d919a0 | 0 | a92a31d5d446022baeaf7f3c9ea107094637e64d | ed3752022e45b890af63996efec804725e95d0d4 | FIRST_TIME_CONTRIBUTOR | github-to-sqlite 207052882 | https://github.com/dogsheep/github-to-sqlite/pull/66 | |||||
872509423 | PR_kwDOBm6k_c40AW_v | 1649 | closed | 0 | Add /opt/homebrew to where spatialite extension can be found | danp 2182 | Helps homebrew on Apple Silicon setups find spatialite without needing a full path. Similar to #1114 | 2022-03-06T18:09:35Z | 2022-03-06T22:46:00Z | 2022-03-06T19:39:15Z | 2022-03-06T19:39:15Z | de810f49cc57a4f88e4a1553d26c579253ce4531 | 0 | 59b2c16e3db10390b134673bda20045c351bdef8 | 0499f174c063283aa9b589d475a32077aaf7adc5 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/1649 | ||||
730020867 | MDExOlB1bGxSZXF1ZXN0NzMwMDIwODY3 | 1467 | closed | 0 | Add Authorization header when CORS flag is set | jameslittle230 3058200 | This PR adds the [`Access-Control-Allow-Headers`](https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Access-Control-Allow-Headers) flag when CORS mode is enabled. This would fix https://github.com/simonw/datasette-auth-tokens/issues/4. When making cross-origin requests, the server must respond with all allowable HTTP headers. A Datasette instance using auth tokens must accept the `Authorization` HTTP header in order for cross-origin authenticated requests to take place. Please let me know if there's a better way of doing this! I couldn't figure out a way to change the app's response from the plugin itself, so I'm starting here. If you'd rather this logic live in the plugin, I'd love any guidance you're able to give. | 2021-09-08T22:14:41Z | 2021-10-17T02:29:07Z | 2021-10-14T18:54:18Z | 15f258735ddee555028a075c09e1e8f74069be70 | 0 | 05109e8d61dedd477c4cedfb89b1da65610f70d1 | d57ab156b35ec642549fb69d08279850065027d2 | NONE | datasette 107914493 | https://github.com/simonw/datasette/pull/1467 | |||||
634821065 | MDExOlB1bGxSZXF1ZXN0NjM0ODIxMDY1 | 1319 | closed | 0 | Add Docker multi-arch support with Buildx | blairdrummond 10801138 | This adds Docker support to extra CPU architectures (like arm) using [Docker's Buildx action](https://github.com/marketplace/actions/docker-setup-buildx) You can see [what that looks like on Dockerhub](https://hub.docker.com/r/blairdrummond/datasette/tags?page=1&ordering=last_updated) And how it lets Datasette run on a Raspberry Pi (top is my dockerhub, bottom is upstream) ![Screenshot from 2021-05-08 15-32-25](https://user-images.githubusercontent.com/10801138/117551210-a17a9f80-b012-11eb-966b-10e1590dd4a9.png) The workflow log [here](https://github.com/blairdrummond/datasette/runs/2535743398?check_suite_focus=true) (I subbed `blairdrummond` for datasetteproject in my branch) | 2021-05-08T19:35:03Z | 2021-05-27T16:49:24Z | 2021-05-27T16:49:24Z | 2021-05-27T16:49:23Z | 89822d10be0da446471986addea91d9766f12efb | 0 | cfca570f9ca010ff9036c75209dc42e78bbc945f | 1b697539f5b53cec3fe13c0f4ada13ba655c88c7 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/1319 | ||||
698423667 | MDExOlB1bGxSZXF1ZXN0Njk4NDIzNjY3 | 8 | open | 0 | Add Gmail takeout mbox import (v2) | maxhawkins 28565 | WIP This PR builds on #5 to continue implementing gmail import support. Building on @UtahDave's work, these commits add a few performance and bug fixes: * Decreased memory overhead for import by manually parsing mbox headers. * Fixed error where some messages in the mbox would yield a row with NULL in all columns. I will send more commits to fix any errors I encounter as I run the importer on my personal takeout data. | 2021-07-28T07:05:32Z | 2023-09-08T01:22:49Z | d2809fd3fd835358d01ad10401228a562539b29e | 0 | 8e6d487b697ce2e8ad885acf613a157bfba84c59 | e54e544427f1cc3ea8189f0e95f54046301a8645 | FIRST_TIME_CONTRIBUTOR | google-takeout-to-sqlite 206649770 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/8 | ||||||
853484980 | PR_kwDOCGYnMM4y3yW0 | 407 | closed | 0 | Add SpatiaLite helpers to CLI | eyeseast 25778 | Closes #398 This adds SpatiaLite helpers to the CLI. ```sh # init spatialite when creating a database sqlite-utils create database.db --enable-wal --init-spatialite # add geometry columns # needs a database, table, geometry column name, type, with optional SRID and not-null # this will throw an error if the table doesn't already exist sqlite-utils add-geometry-column database.db table-name geometry --srid 4326 --not-null # spatial index an existing table/column # this will throw an error if the table and column don't exist sqlite-utils create-spatial-index database.db table-name geometry ``` Docs and tests are included. | 2022-02-15T16:50:17Z | 2022-02-16T01:49:40Z | 2022-02-16T00:58:08Z | 2022-02-16T00:58:07Z | a692c56659c3563b26dcdc9e3534d63ecc26e180 | 0 | a974da591915e0548182bbbf01da34ecb9e537e6 | e7f040106b5f5a892ebd984f19b21c605e87c142 | CONTRIBUTOR | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/407 | ||||
592364255 | MDExOlB1bGxSZXF1ZXN0NTkyMzY0MjU1 | 16 | open | 0 | Add a fallback ID, print if no ID found | n8henrie 1234956 | Fixes https://github.com/dogsheep/healthkit-to-sqlite/issues/14 | 2021-03-13T13:38:29Z | 2021-03-13T14:44:04Z | 16ab307b2138891f226a66e4954c5470de753a0f | 0 | 27b3d54ccfe7d861770a9d0b173f6503580fea4a | 71e36e1cf034b96de2a8e6652265d782d3fdf63b | FIRST_TIME_CONTRIBUTOR | healthkit-to-sqlite 197882382 | https://github.com/dogsheep/healthkit-to-sqlite/pull/16 | ||||||
275281307 | MDExOlB1bGxSZXF1ZXN0Mjc1MjgxMzA3 | 444 | closed | 0 | Add a max-line-length setting for flake8 | russss 45057 | This stops my automatic editor linting from flagging lines which are too long. It's been lingering in my checkout for ages. 160 is an arbitrary large number - we could alter it if we have any opinions (but I find the line length limit to be my least favourite part of PEP8). | 2019-05-02T08:58:57Z | 2019-05-04T09:44:48Z | 2019-05-03T13:11:28Z | 2019-05-03T13:11:28Z | 470cf0b05d4fda0d2563f81c7e32af13fe346ccc | 0 | 4f0d265951d7e95920298b46eff39bb9cc783984 | efc93b8ab5a21e3802f75f08d5e41409f5684b5d | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/444 | ||||
587332473 | MDExOlB1bGxSZXF1ZXN0NTg3MzMyNDcz | 1252 | closed | 0 | Add back styling to lists within table cells (fixes #1141) | bobwhitelock 7476523 | This overrides the Datasette reset - see https://github.com/simonw/datasette/blob/d0fd833b8cdd97e1b91d0f97a69b494895d82bee/datasette/static/app.css#L35-L38 - to add back the default styling of list items displayed within Datasette table cells. Following this change, the same content as in the original issue looks like this: ![2021-03-09_02:57:32](https://user-images.githubusercontent.com/7476523/110411982-63e5ae80-8083-11eb-9b5c-e5dc825073e2.png) | 2021-03-09T03:00:57Z | 2021-03-29T00:14:04Z | 2021-03-29T00:14:04Z | 2021-03-29T00:14:04Z | e72397d65b06b019521b6411243687464ac8d8ca | 0 | d22aa32cd9c0f798bcab917cc2024a371b4c0069 | d0fd833b8cdd97e1b91d0f97a69b494895d82bee | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/1252 | ||||
370024697 | MDExOlB1bGxSZXF1ZXN0MzcwMDI0Njk3 | 4 | closed | 0 | Add beeminder-to-sqlite | bcongdon 706257 | 2020-02-02T15:51:36Z | 2020-10-12T00:36:16Z | 2020-10-12T00:36:16Z | 2020-10-12T00:36:16Z | 7e4c6ecdabc249c77e8049cd172b1b5af08a3371 | 0 | 6713b5c50178b95a9ec50227d4ef5793e71e8b0a | 2972bb001ab5f675eced62f7ba5adef2d3eba2ad | CONTRIBUTOR | dogsheep.github.io 214746582 | https://github.com/dogsheep/dogsheep.github.io/pull/4 | |||||
572254103 | MDExOlB1bGxSZXF1ZXN0NTcyMjU0MTAz | 1223 | closed | 0 | Add compile option to Dockerfile to fix failing test (fixes #696) | bobwhitelock 7476523 | This test was failing when run inside the Docker container: `test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3]`, with this error: ``` def test_searchable(app_client, path, expected_rows): response = app_client.get(path) > assert expected_rows == response.json["rows"] E AssertionError: assert [[1, 'barry c...sel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E + [] E - [[1, 'barry cat', 'terry dog', 'panther'], E - [2, 'terry dog', 'sara weasel', 'puma']] ``` The issue was that the version of sqlite3 built inside the Docker container was built with FTS3 and FTS4 enabled, but without the `SQLITE_ENABLE_FTS3_PARENTHESIS` compile option passed, which adds support for using `AND` and `NOT` within `match` expressions (see https://sqlite.org/fts3.html#compiling_and_enabling_fts3_and_fts4 and https://www.sqlite.org/compile.html). Without this, the `AND` used in the search in this test was being interpreted as a literal string, and so no matches were found. Adding this compile option fixes this. --- I actually ran into this issue because the same test was failing when I ran the test suite on my own machine, outside of Docker, and so I eventually tracked this down to my system sqlite3 also being compiled without this option. I wonder if this is a sign of a slightly deeper issue, that Datasette can silently behave differently based on the version and compilation of sqlite3 it is being used with. On my own system I fixed the test suite by running `pip install pysqlite3-binary`, so that this would be picked up instead of the `sqlite` package, as this seems to be compiled with this option. 
Maybe using `pysqlite3-binary` could be installed/recommended by default so a more deterministic version of sqlite is used? Or there could be some feature detection done on the available sqlite version, to know what features are … | 2021-02-12T03:38:05Z | 2021-03-07T12:01:12Z | 2021-03-07T07:41:17Z | 2021-03-07T07:41:17Z | d0fd833b8cdd97e1b91d0f97a69b494895d82bee | 0 | d1cd1f259c699fab3af01c4aa90035ed0242471a | 9603d893b9b72653895318c9104d754229fdb146 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/1223 | ||||
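The build-dependent behaviour this PR body describes is easy to check at runtime: SQLite exposes its compile-time options via a pragma. A minimal stdlib sketch — the option name comes from the PR body, and whether it appears depends entirely on your build:

```python
import sqlite3

def compile_options():
    """Return the compile-time option list for the SQLite library in use."""
    conn = sqlite3.connect(":memory:")
    try:
        return [row[0] for row in conn.execute("PRAGMA compile_options")]
    finally:
        conn.close()

# ENABLE_FTS3_PARENTHESIS is the option that turns on AND/NOT support inside
# FTS3/FTS4 match expressions; its presence varies between SQLite builds.
print("ENABLE_FTS3_PARENTHESIS" in compile_options())
```

Running this on the host and inside the container makes the kind of divergence described above visible before any test fails.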
1238017010 | PR_kwDOBm6k_c5JyqPy | 2025 | open | 0 | Add database metadata to index.html template context | palewire 9993 | Fixes #2016 <!-- readthedocs-preview datasette start --> ---- :books: Documentation preview :books:: https://datasette--2025.org.readthedocs.build/en/2025/ <!-- readthedocs-preview datasette end --> | 2023-02-12T11:16:58Z | 2023-02-12T11:17:14Z | a2d3bb02cf2c9b8ed7c788910fdda606108cd584 | 0 | 912ed9de92d1bb9a28f50a2e08c5e7df2b827c15 | 0b4a28691468b5c758df74fa1d72a823813c96bf | FIRST_TIME_CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/2025 | ||||||
596627780 | MDExOlB1bGxSZXF1ZXN0NTk2NjI3Nzgw | 18 | open | 0 | Add datetime parsing | n8henrie 1234956 | Parses the datetime columns so they are subsequently properly recognized as datetime. Fixes https://github.com/dogsheep/healthkit-to-sqlite/issues/17 | 2021-03-19T14:34:22Z | 2021-03-19T14:34:22Z | c87f4e8aa88ec277c6b5a000670c2cb42a10c03d | 0 | e0e7a0f99f844db33964b27c29b0b8d5f160202b | 71e36e1cf034b96de2a8e6652265d782d3fdf63b | FIRST_TIME_CONTRIBUTOR | healthkit-to-sqlite 197882382 | https://github.com/dogsheep/healthkit-to-sqlite/pull/18 | ||||||
165029807 | MDExOlB1bGxSZXF1ZXN0MTY1MDI5ODA3 | 182 | closed | 0 | Add db filesize next to download link | raynae 3433657 | Took a stab at #172, will this do the trick? | 2018-01-25T04:58:56Z | 2019-03-22T13:50:57Z | 2019-02-06T04:59:38Z | a8d9e69872dec9a551b25cd609ffdbf3896045bd | 0 | b62835205a830472abb66c708822c2dcdf4ab027 | 56623e48da5412b25fb39cc26b9c743b684dd968 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/182 | |||||
719998225 | MDExOlB1bGxSZXF1ZXN0NzE5OTk4MjI1 | 322 | closed | 0 | Add dict type to be mapped as TEXT in sqlite | minaeid90 2496189 | The library treats the Postgres jsonb type as a dictionary; this adds the dict type, mapped as TEXT, for SQLite. | 2021-08-25T20:54:26Z | 2021-11-15T00:27:40Z | 2021-11-15T00:27:40Z | 2021-11-15T00:27:40Z | 271b894af52eb6437ae6cd84eba9867ad8dd43f6 | 0 | 69619f68c26478fdee479110e084fd22711013a3 | 77c240df56068341561e95e4a412cbfa24dc5bc7 | CONTRIBUTOR | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/322 | ||||
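The mapping this PR describes — a Python `dict` stored in a TEXT column — amounts to JSON-serializing the value before insert. A stdlib sketch of the idea (table and column names are illustrative, not sqlite-utils internals):

```python
import json
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE docs (id INTEGER PRIMARY KEY, payload TEXT)")

record = {"name": "example", "tags": ["a", "b"]}
# Store the dict as its JSON text representation, mirroring a jsonb -> TEXT mapping
conn.execute("INSERT INTO docs (payload) VALUES (?)", (json.dumps(record),))

stored = conn.execute("SELECT payload FROM docs").fetchone()[0]
print(json.loads(stored) == record)  # → True
```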
1069033742 | PR_kwDOBm6k_c4_uCkO | 1825 | closed | 0 | Add documentation for serving via OpenRC | asimpson 1048831 | I also removed a few lines which felt redundant given the following section dedicated to running behind a nginx proxy. <!-- readthedocs-preview datasette start --> ---- :books: Documentation preview :books:: https://datasette--1825.org.readthedocs.build/en/1825/ <!-- readthedocs-preview datasette end --> | 2022-09-27T19:00:56Z | 2022-09-28T04:21:37Z | 2022-09-28T04:21:37Z | 2022-09-28T04:21:37Z | 984b1df12cf19a6731889fc0665bb5f622e07b7c | 0 | e7e96dc2ef2b76338786f1b911a9753bb8bfc297 | 5f9f567acbc58c9fcd88af440e68034510fb5d2b | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/1825 | ||||
992299943 | PR_kwDOCGYnMM47JUun | 452 | closed | 0 | Add duplicate table feature | davidleejy 1690072 | This PR addresses a feature request raised in issue #449. Specifically this PR adds a functionality that lets users duplicate a table via: ```python table_new = db["my_table"].duplicate("new_table") ``` Test added in file `tests/test_duplicate.py`. Happy to make changes to meet maintainers' feedback, if any. | 2022-07-09T20:24:31Z | 2022-07-15T21:21:37Z | 2022-07-15T21:21:36Z | 2022-07-15T21:21:36Z | b366e68deb0780048a23610c279552f8529d4726 | 0 | eef350fe543c6301c61b257c5f708e0e16ed5a34 | 42440d6345c242ee39778045e29143fb550bd2c2 | CONTRIBUTOR | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/452 | ||||
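For context, the effect of `duplicate()` can be approximated in raw SQL by re-running the table's stored `CREATE TABLE` statement under a new name and copying the rows. A stdlib sketch of that idea — not the PR's actual implementation, and the naive name replacement would misbehave if the table name also appeared inside a column definition:

```python
import sqlite3

def duplicate_table(conn, src, dest):
    """Copy a table's schema and rows under a new name (raw-SQL sketch)."""
    (sql,) = conn.execute(
        "SELECT sql FROM sqlite_master WHERE type = 'table' AND name = ?", (src,)
    ).fetchone()
    # Rewrite the stored CREATE TABLE statement to use the new name
    conn.execute(sql.replace(src, dest, 1))
    conn.execute(f"INSERT INTO {dest} SELECT * FROM {src}")

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE my_table (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO my_table (name) VALUES ('a'), ('b')")
duplicate_table(conn, "my_table", "new_table")
print(conn.execute("SELECT count(*) FROM new_table").fetchone()[0])  # → 2
```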
1170816476 | PR_kwDOBm6k_c5FyT3c | 1967 | closed | 0 | Add favicon to documentation | choldgraf 1839645 | I've been browsing the datasette documentation and found it hard to quickly locate tabs with many of them open, because it does not ship a favicon. So this PR: - Grabs the favicon `.png` from datasette itself[^1] - Adds it to the `_static/` folder - Sets `html_favicon` to load it in the docs [^1]: I also learned that Chrome can fetch favicons as an internal service! See `chrome://favicon/https://datasette.io/tools/github-to-sqlite`. | 2022-12-19T14:01:04Z | 2022-12-31T19:15:51Z | 2022-12-31T19:00:31Z | 2022-12-31T19:00:31Z | 994ce46ed4a5d680bee58242efd95181946c25e9 | 0 | ac64f6fe6aeb1941d01f862999a8b9d4e95f4991 | e03aed00026cc2e59c09ca41f69a247e1a85cc89 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/1967 | ||||
505076418 | MDExOlB1bGxSZXF1ZXN0NTA1MDc2NDE4 | 5 | open | 0 | Add fitbit-to-sqlite | mrphil007 4632208 | 2020-10-16T20:04:05Z | 2020-10-16T20:04:05Z | 9b9a677a4fcb6a31be8c406b3050cfe1c6e7e398 | 0 | db64d60ee92448b1d2a7e190d9da20eb306326b0 | d0686ebed6f08e9b18b4b96c2b8170e043a69adb | FIRST_TIME_CONTRIBUTOR | dogsheep.github.io 214746582 | https://github.com/dogsheep/dogsheep.github.io/pull/5 | |||||||
560204306 | MDExOlB1bGxSZXF1ZXN0NTYwMjA0MzA2 | 224 | closed | 0 | Add fts offset docs. | polyrand 37962604 | The limit can be passed as a string to the query builder to have an offset. I have tested it using the shorthand `limit=f"15, 30"`, the standard syntax should work too. | 2021-01-22T20:50:58Z | 2021-02-14T19:31:06Z | 2021-02-14T19:31:06Z | 4d6ff040770119fb2c1bcbc97678d9deca752f2f | 0 | 341f50d2d95ba1d69ad64ba8c0ec0ffa9a68d063 | 36dc7e3909a44878681c266b90f9be76ac749f2d | NONE | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/224 | |||||
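The shorthand in this PR works because SQLite's comma form of `LIMIT` is defined as `LIMIT <offset>, <count>`, equivalent to `LIMIT <count> OFFSET <offset>`. A quick stdlib check of that equivalence:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE nums (n INTEGER)")
conn.executemany("INSERT INTO nums VALUES (?)", [(i,) for i in range(100)])

# SQLite treats "LIMIT 15, 30" as "LIMIT 30 OFFSET 15"
comma = conn.execute("SELECT n FROM nums ORDER BY n LIMIT 15, 30").fetchall()
explicit = conn.execute("SELECT n FROM nums ORDER BY n LIMIT 30 OFFSET 15").fetchall()
print(comma == explicit, comma[0][0], len(comma))  # → True 15 30
```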
768796296 | PR_kwDOCGYnMM4t0uaI | 333 | closed | 0 | Add functionality to read Parquet files. | Florents-Tselai 2118708 | I needed this for a project of mine, and I thought it'd be useful to have it in sqlite-utils (it's also mentioned in #248 ). The current implementation works (data is read & data types are inferred correctly). I've added a single straightforward test case, but @simonw please let me know if there are any non-obvious flags/combinations I should test too. | 2021-10-28T23:43:19Z | 2021-11-25T19:47:35Z | 2021-11-25T19:47:35Z | eda2b1f8d2670c6ca8512e3e7c0150866bd0bdc6 | 0 | 50ec2e49dee3b09a48a7aef55eceaa3f752a52e7 | fda4dad23a0494890267fbe8baf179e2b56ee914 | NONE | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/333 | |||||
445023326 | MDExOlB1bGxSZXF1ZXN0NDQ1MDIzMzI2 | 118 | closed | 0 | Add insert --truncate option | tsibley 79913 | Deletes all rows in the table (if it exists) before inserting new rows. SQLite doesn't implement a TRUNCATE TABLE statement but does optimize an unqualified DELETE FROM. This can be handy if you want to refresh the entire contents of a table but a) don't have a PK (so can't use --replace), b) don't want the table to disappear (even briefly) for other connections, and c) have to handle records that used to exist being deleted. Ideally the replacement of rows would appear instantaneous to other connections by putting the DELETE + INSERT in a transaction, but this is very difficult without breaking other code as the current transaction handling is inconsistent and non-systematic. There exists the possibility for the DELETE to succeed but the INSERT to fail, leaving an empty table. This is not much worse, however, than the current possibility of one chunked INSERT succeeding and being committed while the next chunked INSERT fails, leaving a partially complete operation. | 2020-07-06T21:58:40Z | 2020-07-08T17:26:21Z | 2020-07-08T17:26:21Z | 2020-07-08T17:26:21Z | ae4593316ccf5e42ad26f27033193834a7e696c8 | 0 | 332f7d770b84734dbed4842ab3ed24ee5b687889 | f8277d0fb9c05a88a9ff01d996e31d55f0f0a645 | CONTRIBUTOR | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/118 | ||||
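The transactional DELETE + INSERT this PR body describes as the ideal can be sketched with stdlib `sqlite3` — the table name `items` is illustrative:

```python
import sqlite3

def refresh_table(conn, rows):
    """Replace all rows atomically: an unqualified DELETE (which SQLite
    optimizes, since there is no TRUNCATE statement) plus the new INSERTs
    inside a single transaction, so other connections never observe an
    empty or partially loaded table."""
    with conn:  # commits on success, rolls back on error
        conn.execute("DELETE FROM items")
        conn.executemany("INSERT INTO items (name) VALUES (?)", rows)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT)")
refresh_table(conn, [("old",)])
refresh_table(conn, [("a",), ("b",)])
print(conn.execute("SELECT count(*) FROM items").fetchone()[0])  # → 2
```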
274174614 | MDExOlB1bGxSZXF1ZXN0Mjc0MTc0NjE0 | 437 | closed | 0 | Add inspect and prepare_sanic hooks | russss 45057 | This adds two new plugin hooks: The `inspect` hook allows plugins to add data to the inspect dictionary. The `prepare_sanic` hook allows plugins to hook into the web router. I've attached a warning to this hook in the docs in light of #272 but I want this hook now... On quick inspection, I don't think it's worthwhile to try and make this hook independent of the web framework (but it looks like Starlette would make the hook implementation a bit nicer). Ref #14 | 2019-04-28T11:53:34Z | 2019-06-24T16:38:57Z | 2019-06-24T16:38:56Z | 7aeaac7c478acf572bda61bdaa6ac3247dc15811 | 0 | f33a0a63a7442f0b665320ac3e2eb55de315f1f7 | 11b352b4d52fd02a422776edebb14f12e4994d3b | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/437 | |||||
673872974 | MDExOlB1bGxSZXF1ZXN0NjczODcyOTc0 | 7 | open | 0 | Add instagram-to-sqlite | gavindsouza 36654812 | The tool covers only chat imports at the time of opening this PR, but I'm planning to import everything else I feel inquisitive about. Ref: https://github.com/gavindsouza/instagram-to-sqlite | 2021-06-19T12:26:16Z | 2021-07-28T07:58:59Z | 66e9828db4a8ddc4049ab9932e1304288e571821 | 0 | 4e4c6baf41778071a960d288b0ef02bd01cb6376 | 92c6bb77629feeed661c7b8d9183a11367de39e0 | FIRST_TIME_CONTRIBUTOR | dogsheep.github.io 214746582 | https://github.com/dogsheep/dogsheep.github.io/pull/7 | ||||||
500798091 | MDExOlB1bGxSZXF1ZXN0NTAwNzk4MDkx | 1008 | open | 0 | Add json_loads and json_dumps jinja2 filters | mhalle 649467 | 2020-10-09T20:11:34Z | 2020-12-15T02:30:28Z | e33e91ca7c9b2fdeab9d8179ce0d603918b066aa | 0 | 40858989d47043743d6b1c9108528bec6a317e43 | 1bdbc8aa7f4fd7a768d456146e44da86cb1b36d1 | FIRST_TIME_CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/1008 | |||||||
153306882 | MDExOlB1bGxSZXF1ZXN0MTUzMzA2ODgy | 115 | closed | 0 | Add keyboard shortcut to execute SQL query | rgieseke 198537 | Very cool tool, thanks a lot! This PR adds a `Shift-Enter` shortcut to execute the SQL query. I used CodeMirror's keyboard handling. | 2017-11-17T14:13:33Z | 2017-11-17T15:16:34Z | 2017-11-17T14:22:56Z | 2017-11-17T14:22:56Z | eda848b37f8452dba7913583ef101f39d9b130ba | 0 | bb514164e69400fc0be4e033c27f45f90b1ef651 | ed2b3f25beac720f14869350baacc5f62b065194 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/115 | ||||
508720660 | MDExOlB1bGxSZXF1ZXN0NTA4NzIwNjYw | 1044 | closed | 0 | Add minimum supported python | bollwyvl 45380 | Thanks for `datasette`! This PR adds `python_requires` to formally signal the [minimum supported python version](https://packaging.python.org/guides/dropping-older-python-versions/#specify-the-version-ranges-for-supported-python-distributions) (which is pointed out with classifiers, so seems pretty straightforward). | 2020-10-23T05:08:03Z | 2020-10-23T20:53:08Z | 2020-10-23T20:53:08Z | 2020-10-23T20:53:08Z | cab8e65261b117b493af6a0b21aa2e1ae4564419 | 0 | 6453ab18e56b36bc912b6f24c4a43002c6084ade | d0cc6f4c32e1f89238ddec782086b3122f445bd4 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/1044 | ||||
1608050242 | PR_kwDOCGYnMM5f2OZC | 604 | closed | 0 | Add more STRICT table support | tkhattra 16437338 | - https://github.com/simonw/sqlite-utils/issues/344#issuecomment-982014776 Make `table.transform()` preserve STRICT mode. <!-- readthedocs-preview sqlite-utils start --> ---- :books: Documentation preview :books:: https://sqlite-utils--604.org.readthedocs.build/en/604/ <!-- readthedocs-preview sqlite-utils end --> | 2023-11-19T19:38:53Z | 2023-12-08T05:17:20Z | 2023-12-08T05:05:27Z | 2023-12-08T05:05:27Z | 1500c19bd0f31b2e7f28a5ec2d7bfa133a2e4d4c | 0 | 61c6e26cf922c70b65b161473723ff9d869a04a5 | 9286c1ba432e890b1bb4b2a1f847b15364c1fa18 | CONTRIBUTOR | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/604 | ||||
1492889894 | PR_kwDOBm6k_c5Y-7Em | 2162 | closed | 0 | Add new `--internal internal.db` option, deprecate legacy `_internal` database | asg017 15178711 | refs #2157 This PR adds a new `--internal` option to datasette serve. If provided, it is the path to a persistent internal database that Datasette core and Datasette plugins can use to store data, as discussed in the proposal issue. This PR also removes and deprecates the previous in-memory `_internal` database. Those tables now appear in the `internal` database, with `core_` prefixes (ex `tables` in `_internal` is now `core_tables` in `internal`). ## A note on the new `core_` tables However, one important note about those new `core_` tables: If a `--internal` DB is passed in, that means those `core_` tables will persist across multiple Datasette instances. This wasn't the case before, since `_internal` was always an in-memory database created from scratch. I tried to put those `core_` tables as `TEMP` tables - after all, there's only ever one `internal` DB connection at a time, so I figured it would work. But, since we use the `Database()` wrapper for the internal DB, it has two separate connections: a default read-only connection and a write connection that is created when a write operation occurs. Which meant the `TEMP` tables would be created by the write connection, but not available in the read-only connection. So I had a brilliant idea: Attach an in-memory named database with `cache=shared`, and create those tables there! ```sql ATTACH DATABASE 'file:datasette_internal_core?mode=memory&cache=shared' AS core; ``` We'd run this on both the read-only connection and the write-only connection. That way, those tables would stay in memory, they'd communicate with the `cache=shared` feature, and we'd be good to go. However, I couldn't find an easy way to run an `ATTACH DATABASE` command on the read-only connection. 
Using `Database()` as a wrapper for the internal DB is pretty limiting - it's meant for Datasette "data" databases, where we want multiple readers and possibly 1 write connection at a time. But the internal database doesn't really require that kind of support - I think we… | 2023-08-29T00:05:07Z | 2023-08-29T03:24:23Z | 2023-08-29T03:24:23Z | 2023-08-29T03:24:23Z | 92b8bf38c02465f624ce3f48dcabb0b100c4645d | 0 | 73489cac8ef8e934e601302fa6594e27b75a382d | 2e2825869fc2655b5fcadc743f6f9dec7a49bc65 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/2162 | ||||
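The `cache=shared` approach sketched in the PR body can be demonstrated with stdlib `sqlite3`: two independent connections to the same named in-memory database see each other's committed tables. The database and table names here are reused from the PR body purely for illustration:

```python
import sqlite3

URI = "file:datasette_internal_core?mode=memory&cache=shared"

# Two separate connections to the same named in-memory database; the database
# lives as long as at least one connection to it stays open.
writer = sqlite3.connect(URI, uri=True)
reader = sqlite3.connect(URI, uri=True)

writer.execute("CREATE TABLE core_tables (name TEXT)")
writer.execute("INSERT INTO core_tables VALUES ('demo')")
writer.commit()

# The committed table is visible through the other connection
print(reader.execute("SELECT name FROM core_tables").fetchone()[0])  # → demo
```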
1031503844 | PR_kwDOBm6k_c49e3_k | 1789 | closed | 0 | Add new entrypoint option to `--load-extension` | asg017 15178711 | Closes #1784 The `--load-extension` flag can now accept an optional "entrypoint" value, to specify which entrypoint SQLite should load from the given extension. ```bash # would load default entrypoint like before datasette data.db --load-extension ext # loads the extension with the "sqlite3_foo_init" entrypoint datasette data.db --load-extension ext:sqlite3_foo_init # loads the extension with the "sqlite3_bar_init" entrypoint datasette data.db --load-extension ext:sqlite3_bar_init ``` For testing, I added a small SQLite extension in C at `tests/ext.c`. If compiled, then pytest will run the unit tests in `test_load_extensions.py` to verify that Datasette loads in extensions correctly (and loads the correct entrypoints). Compiling the extension requires a C compiler; I compiled it on my Mac with: ``` gcc ext.c -I path/to/sqlite -fPIC -shared -o ext.dylib ``` Where `path/to/sqlite` is a directory that contains the SQLite amalgamation header files. Re documentation: I added a bit to the help text for `--load-extension` (which I believe should auto-add to documentation?), and the existing extension documentation is spatialite specific. Let me know if a new extensions documentation page would be helpful! | 2022-08-19T19:27:47Z | 2022-08-23T18:42:52Z | 2022-08-23T18:34:30Z | 2022-08-23T18:34:30Z | 1d64c9a8dac45b9a3452acf8e76dfadea2b0bc49 | 0 | 5a2a05f2cea7b55b1c3bb1df043c0a454eca6563 | 663ac431fe7202c85967568d82b2034f92b9aa43 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/1789 | ||||
187770345 | MDExOlB1bGxSZXF1ZXN0MTg3NzcwMzQ1 | 258 | closed | 0 | Add new metadata key persistent_urls which removes the hash from all database urls | philroche 247131 | Add new metadata key "persistent_urls" which removes the hash from all database urls when set to "true" This PR is just to gauge if this, or something like it, is something you would consider merging? I understand the reason why the substring of the hash is included in the url but there are some use cases where the urls should persist across deployments. For bookmarks for example or for scripts that use the JSON API. This is the initial commit for this feature. Tests and documentation updates to follow. | 2018-05-14T09:39:18Z | 2018-05-21T07:38:15Z | 2018-05-21T07:38:15Z | 457fcdfc82a0260db543d49006d49f8486f233b5 | 0 | 0d77a896ccb16b34c86fdeef7738f2d056e27e02 | 2b79f2bdeb1efa86e0756e741292d625f91cb93d | NONE | datasette 107914493 | https://github.com/simonw/datasette/pull/258 | |||||
821992886 | PR_kwDOCGYnMM4w_p22 | 385 | closed | 0 | Add new spatialite helper methods | eyeseast 25778 | Refs #79 This PR adds three new Spatialite-related methods to Database and Table: - `Database.init_spatialite` loads the Spatialite extension and initializes it - `Table.add_geometry_column` adds a geometry column - `Table.create_spatial_index` creates a spatial index Has tests and documentation. Feedback very welcome. | 2022-01-14T03:57:30Z | 2022-02-05T00:04:26Z | 2022-02-04T05:55:10Z | 2022-02-04T05:55:10Z | ee11274fcb1c00f32c95f2ef2924d5349538eb4d | 0 | af86b17acf2fa50048e38b96497636d49db89766 | 74586d3cb26fa3cc3412721985ecdc1864c2a31d | CONTRIBUTOR | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/385 | ||||
1047561919 | PR_kwDODFdgUs4-cIa_ | 76 | open | 0 | Add organization support to repos command | OverkillGuy 2757699 | New --organization flag to signify all given "usernames" are private orgs. Adapts the API URL to the organization path instead. Not the best implementation, but a first draft to talk around. Fixes #75 (badly: no tests, overly vague, untested). | 2022-09-06T13:21:42Z | 2022-09-06T13:59:08Z | 1514acfa87f57261547bc3d7fc4f161e34285d76 | 0 | bb959b46e8a7647755c14dee180fdd5209451954 | ace13ec3d98090d99bd71871c286a4a612c96a50 | FIRST_TIME_CONTRIBUTOR | github-to-sqlite 207052882 | https://github.com/dogsheep/github-to-sqlite/pull/76 | ||||||
338647378 | MDExOlB1bGxSZXF1ZXN0MzM4NjQ3Mzc4 | 1 | closed | 0 | Add parkrun-to-sqlite | mrw34 1101318 | 2019-11-08T12:05:32Z | 2020-10-12T00:35:16Z | 2020-10-12T00:35:16Z | 2020-10-12T00:35:16Z | 58ca0c785fbf34250042379dd0269bf2d0c5ea7e | 0 | ccb86548e0ae6f02a83f1feb0974476ad0f2f2d8 | 2972bb001ab5f675eced62f7ba5adef2d3eba2ad | CONTRIBUTOR | dogsheep.github.io 214746582 | https://github.com/dogsheep/dogsheep.github.io/pull/1 | |||||
1272169404 | PR_kwDOCGYnMM5L08O8 | 531 | closed | 0 | Add paths for homebrew on Apple silicon | eyeseast 25778 | This also passes in the extension path when specified in GIS methods. Wherever we know an extension path, we use `db.init_spatialite(find_spatialite() or load_extension)`. <!-- readthedocs-preview sqlite-utils start --> ---- :books: Documentation preview :books:: https://sqlite-utils--531.org.readthedocs.build/en/531/ <!-- readthedocs-preview sqlite-utils end --> | 2023-03-11T22:27:52Z | 2023-04-09T01:49:44Z | 2023-04-09T01:49:43Z | 24f3eb082b98b8d676bab2eab4f763cd9b50fe96 | 0 | afdf6187716b19fce8692f6887a1d45c85477fee | c0251cc9271260de73b4227859a51fab9b4cb745 | CONTRIBUTOR | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/531 | |||||
1306498393 | PR_kwDOCGYnMM5N35VZ | 536 | closed | 0 | Add paths for homebrew on Apple silicon | eyeseast 25778 | Does what it says and nothing else. This is the same set of paths as Datasette uses. <!-- readthedocs-preview sqlite-utils start --> ---- :books: Documentation preview :books:: https://sqlite-utils--536.org.readthedocs.build/en/536/ <!-- readthedocs-preview sqlite-utils end --> | 2023-04-08T13:34:21Z | 2023-04-13T01:44:43Z | 2023-04-13T01:44:43Z | 2023-04-13T01:44:43Z | 8f9a729e8aff972cb18de25b40f4113e26bbc758 | 0 | cea05dc5eab8d10fbd8943e615d2ab0dceff863c | c0251cc9271260de73b4227859a51fab9b4cb745 | CONTRIBUTOR | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/536 | ||||
1290512937 | PR_kwDODtX3eM5M66op | 6 | open | 0 | Add permalink virtual field to items table | xavdid 1231935 | I added a virtual column (no storage overhead) to the output that easily links back to the source. It works nicely out of the box with datasette: ![](https://cdn.zappy.app/faf43661d539ee0fee02c0421de22d65.png) I got bit a bit by https://github.com/simonw/sqlite-utils/issues/411, so I went with a manual `table_xinfo` and creating the table via execute. Happy to adjust if that issue moves, but this seems like it works. I also added my best-guess instructions for local development on this package. I'm shooting in the dark, so feel free to replace with how you work on it locally. | 2023-03-26T22:22:38Z | 2023-03-29T18:38:52Z | 99bda9434e0adaa8459bc0abbe6262785cd4086c | 0 | b04d6c76c26820f2e0b04da58dd82789e83cbb42 | c5585c103d124b23ba1e163f8857d4ba49fe452a | FIRST_TIME_CONTRIBUTOR | hacker-news-to-sqlite 248903544 | https://github.com/dogsheep/hacker-news-to-sqlite/pull/6 | ||||||
469651732 | MDExOlB1bGxSZXF1ZXN0NDY5NjUxNzMy | 48 | closed | 0 | Add pull requests | adamjonas 755825 | ref #46 Issues don't have merge information on them, which means that PRs need to be pulled separately. Did my best to mimic the API of issues. | 2020-08-18T17:58:44Z | 2020-11-29T23:51:09Z | 2020-11-29T23:51:09Z | 2020-11-29T23:51:09Z | b37f55549461cfe0731b57623f315860b3db49d0 | 0 | 3a0d5c498f9faae4e40aab204cd01b965a4f61f3 | 16d271253f4ea71b261d2d228b926c7bc1a7e660 | CONTRIBUTOR | github-to-sqlite 207052882 | https://github.com/dogsheep/github-to-sqlite/pull/48 | ||||
707490789 | MDExOlB1bGxSZXF1ZXN0NzA3NDkwNzg5 | 312 | closed | 0 | Add reference page to documentation using Sphinx autodoc | simonw 9599 | Refs #311. | 2021-08-10T16:59:17Z | 2021-08-10T23:09:32Z | 2021-08-10T23:09:28Z | 2021-08-10T23:09:28Z | 6155da72c8939b5d9bdacb7853e5e8d1767ce1d5 | 0 | 43bc06481783c3cfcee70c0cb541a686e8894adb | ee469e3122d6f5973ec2584c1580d930daca2e7c | OWNER | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/312 | ||||
274468836 | MDExOlB1bGxSZXF1ZXN0Mjc0NDY4ODM2 | 441 | closed | 0 | Add register_output_renderer hook | russss 45057 | This changeset refactors out the JSON renderer and then adds a hook and dispatcher system to allow custom output renderers to be registered. The CSV output renderer is untouched because supporting streaming renderers through this system would be significantly more complex, and probably not worthwhile. We can't simply allow hooks to be called at request time because we need a list of supported file extensions when the request is being routed in order to resolve ambiguous database/table names. So, renderers need to be registered at startup. I've tried to make this API independent of Sanic's request/response objects so that this can remain stable during the switch to ASGI. I'm using dictionaries to keep it simple and to make adding additional options in the future easy. Fixes #440 | 2019-04-29T18:03:21Z | 2019-05-01T23:01:57Z | 2019-05-01T23:01:57Z | 2019-05-01T23:01:57Z | cf406c075433882b656e340870adf7757976fa4c | 0 | c9f941f06eb0268841de49407725917c74a8a2dc | 11b352b4d52fd02a422776edebb14f12e4994d3b | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/441 | ||||
727265025 | MDExOlB1bGxSZXF1ZXN0NzI3MjY1MDI1 | 1455 | closed | 0 | Add scientists to target groups | rgieseke 198537 | Not sure if you want them mentioned explicitly (it's already a long list), but following up on https://twitter.com/simonw/status/1434176989565382656 | 2021-09-04T16:28:58Z | 2021-09-04T16:32:21Z | 2021-09-04T16:31:38Z | 2021-09-04T16:31:38Z | 772f9a07ce363869e0aaa7600617454dc00e6966 | 0 | afba5f9201e395bce43dcac13da4a2041872316e | 67cbf0ae7243431bf13702e6e3ba466b619c4d6f | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/1455 | ||||
1586779743 | PR_kwDOCGYnMM5elFZf | 600 | closed | 0 | Add spatialite arm64 linux path | MikeCoats 37802088 | According to both [Debian](https://packages.debian.org/bookworm/arm64/libsqlite3-mod-spatialite/filelist) and [Ubuntu](https://packages.ubuntu.com/mantic/arm64/libsqlite3-mod-spatialite/filelist), the correct “target triple” for arm64 is `aarch64-linux-gnu`, so we should be looking in `/usr/lib/aarch64-linux-gnu` for `mod_spatialite.so`. I can confirm that on both of my Debian arm64 SBCs, `libsqlite3-mod-spatialite` installs to that path. ``` $ ls -l /usr/lib/*/*spatial* lrwxrwxrwx 1 root root 23 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so -> mod_spatialite.so.7.1.0 lrwxrwxrwx 1 root root 23 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so.7 -> mod_spatialite.so.7.1.0 -rw-r--r-- 1 root root 7348584 Dec 1 2022 /usr/lib/aarch64-linux-gnu/mod_spatialite.so.7.1.0 ``` This is a set of before and after snippets of pytest’s output for this PR. ### Before ``` $ pytest tests/test_get.py ...... [ 73%] tests/test_gis.py ssssssssssss [ 75%] tests/test_hypothesis.py .... [ 75%] ``` ### After ``` $ pytest tests/test_get.py ...... [ 73%] tests/test_gis.py ............ [ 75%] tests/test_hypothesis.py .... [ 75%] ``` Issue: #599 <!-- readthedocs-preview sqlite-utils start --> ---- :books: Documentation preview :books:: https://sqlite-utils--600.org.readthedocs.build/en/600/ <!-- readthedocs-preview sqlite-utils end --> | 2023-11-03T22:23:26Z | 2023-11-04T00:34:33Z | 2023-11-04T00:31:49Z | 2023-11-04T00:31:49Z | b92ea4793ce4dcb73cf762aae634ab72f65ec50f | 0 | b1a60766a4150268557c4445297087a3f076be01 | 622c3a5a7dd53a09c029e2af40c2643fe7579340 | CONTRIBUTOR | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/600 | ||||
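The lookup this PR extends boils down to probing a fixed list of candidate paths for the loadable module. A hypothetical stdlib sketch — `SPATIALITE_PATHS` and `find_spatialite` here are illustrative, not sqlite-utils' actual code:

```python
from pathlib import Path

# Candidate locations mirroring common multiarch layouts, including the
# aarch64 path this PR adds; the exact list is an assumption for illustration.
SPATIALITE_PATHS = [
    "/usr/lib/x86_64-linux-gnu/mod_spatialite.so",
    "/usr/lib/aarch64-linux-gnu/mod_spatialite.so",
    "/usr/local/lib/mod_spatialite.dylib",
]

def find_spatialite(paths=SPATIALITE_PATHS):
    """Return the first candidate path that exists on disk, else None."""
    for path in paths:
        if Path(path).exists():
            return path
    return None
```

The returned path would then be handed to SQLite's extension loader; returning `None` is what lets the test suite skip GIS tests when the module is absent.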
153201945 | MDExOlB1bGxSZXF1ZXN0MTUzMjAxOTQ1 | 114 | closed | 0 | Add spatialite, switch to debian and local build | ingenieroariel 54999 | Improves the Dockerfile to support spatial datasets, work with the local datasette code (Friendly with git tags and Dockerhub) and moves to slim debian, a small image easy to extend via apt packages for sqlite. | 2017-11-17T02:37:09Z | 2017-11-17T03:50:52Z | 2017-11-17T03:50:52Z | 2017-11-17T03:50:52Z | 8b4c600d98b85655b3a1454ebf64f858b5fe54c8 | 0 | 6c6b63d890529eeefcefb7ab126ea3bd7b2315c1 | b7c4165346ee8b6a6fbd72d6ba2275a24a8a8ae3 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/114 | ||||
1179812287 | PR_kwDODEm0Qs5GUoG_ | 67 | open | 0 | Add support for app-only bearer tokens | sometimes-i-send-pull-requests 26161409 | Previously, twitter-to-sqlite only supported OAuth1 authentication, and the token must be on behalf of a user. However, Twitter also supports application-only bearer tokens, documented here: https://developer.twitter.com/en/docs/authentication/oauth-2-0/bearer-tokens This PR adds support to twitter-to-sqlite for using application-only bearer tokens. To use, the auth.json file just needs to contain a "bearer_token" key instead of "api_key", "api_secret_key", etc. | 2022-12-28T23:31:20Z | 2022-12-28T23:31:20Z | 7825cd68047088cbdc9666586f1af9b7e1fa88c2 | 0 | 52050d06eeb85f3183b086944b7b75ae758096cd | f09d611782a8372cfb002792dfa727325afb4db6 | FIRST_TIME_CONTRIBUTOR | twitter-to-sqlite 206156866 | https://github.com/dogsheep/twitter-to-sqlite/pull/67 | ||||||
747742034 | PR_kwDODFdgUs4skaNS | 68 | open | 0 | Add support for retrieving teams / members | philwills 68329 | Adds a method for retrieving all the teams within an organisation and all the members in those teams. The latter is stored as a join table `team_members` between `teams` and `users`. | 2021-10-01T15:55:02Z | 2021-10-01T15:59:53Z | f46e276c356c893370d5893296f4b69f08baf02c | 0 | cc838e87b1eb19b299f277a07802923104f35ce2 | ed3752022e45b890af63996efec804725e95d0d4 | FIRST_TIME_CONTRIBUTOR | github-to-sqlite 207052882 | https://github.com/dogsheep/github-to-sqlite/pull/68 | ||||||
295748268 | MDExOlB1bGxSZXF1ZXN0Mjk1NzQ4MjY4 | 556 | closed | 0 | Add support for running datasette as a module | abdusco 3243482 | This PR allows running datasette using `python -m datasette` command in addition to just running the executable. This function is quite useful when debugging a plugin in a project because IDEs like PyCharm can easily start a debug session when datasette is run as a module in contrast to trying to attach a debugger to a running process. ![image](https://user-images.githubusercontent.com/3243482/60890448-fc4ede80-a263-11e9-8b42-d2a3db8d1a59.png) | 2019-07-09T13:13:30Z | 2019-07-11T16:07:45Z | 2019-07-11T16:07:44Z | 2019-07-11T16:07:44Z | 9ca860e54fe480d0a365c0c1d8d085926d12be1e | 0 | 056a7eac9480cb814d9c453b983e6b2b831e0ca1 | 81fa8b6cdc5457b42a224779e5291952314e8d20 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/556 | ||||
509590205 | MDExOlB1bGxSZXF1ZXN0NTA5NTkwMjA1 | 1049 | closed | 0 | Add template block prior to extra URL loaders | psychemedia 82988 | To handle packages that require JavaScript state setting prior to loading (e.g. [`thebelab`](https://thebelab.readthedocs.io/en/latest/examples/minimal_example.html)), provide a template block before the URLs are loaded. | 2020-10-25T13:08:55Z | 2020-10-29T09:20:52Z | 2020-10-29T09:20:34Z | 99f994b14e2dbe22fda18b67dd5c824d359443fb | 0 | 50a743ad35684f09d3c3880f6af2019e59271237 | 42f4851e3e7885f1092f104d6c883cea40b12f02 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/1049 | |||||
602261092 | MDExOlB1bGxSZXF1ZXN0NjAyMjYxMDky | 6 | closed | 0 | Add testres-db tool | ligurio 1151557 | 2021-03-28T15:43:23Z | 2022-02-16T05:12:05Z | 2022-02-16T05:12:05Z | eceb016506b5db29b9c21bc7fcf5e6e77259c7b4 | 0 | 91cfa6f7dcab032e2d21e80657c81e69119e2018 | 92c6bb77629feeed661c7b8d9183a11367de39e0 | NONE | dogsheep.github.io 214746582 | https://github.com/dogsheep/dogsheep.github.io/pull/6 | ||||||
917185800 | PR_kwDOBm6k_c42qyUI | 1717 | closed | 0 | Add timeout option to Cloudrun build | wragge 127565 | I've found that the Cloudrun build phase often hits a timeout limit with large databases. I believe the default timeout is 10 minutes. This pull request just adds a `--timeout` option to the cloudrun `publish` command and passes the value on to the build step. | 2022-04-23T11:51:21Z | 2022-04-24T14:03:08Z | 2022-04-24T14:03:08Z | 2022-04-24T14:03:08Z | 3001e1e394b6cb605c2cd81eed671a7da419c1b3 | 0 | 9b9a314a84453cec5ad6c886351ef3df9d47a5a4 | d57c347f35bcd8cff15f913da851b4b8eb030867 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/1717 | ||||
357974326 | MDExOlB1bGxSZXF1ZXN0MzU3OTc0MzI2 | 3 | closed | 0 | Add todoist-to-sqlite | bcongdon 706257 | Really enjoying getting into the dogsheep/datasette ecosystem. I made a downloader for Todoist, and I think/hope others might find this useful | 2019-12-30T04:02:59Z | 2020-10-12T00:35:58Z | 2020-10-12T00:35:57Z | 2020-10-12T00:35:57Z | 85af27dbff7e08a92656639fbf0cfa15c7d30b5c | 0 | 49bc87a43555d10696044e8e40d700d93611a190 | 58ca0c785fbf34250042379dd0269bf2d0c5ea7e | CONTRIBUTOR | dogsheep.github.io 214746582 | https://github.com/dogsheep/dogsheep.github.io/pull/3 | ||||
313105634 | MDExOlB1bGxSZXF1ZXN0MzEzMTA1NjM0 | 57 | closed | 0 | Add triggers while enabling FTS | amjith 49260 | This adds the option for a user to set up triggers in the database to keep their FTS table in sync with the parent table. Ref: https://sqlite.org/fts5.html#external_content_and_contentless_tables I would prefer to make the creation of triggers the default behavior, but that will break existing usage where people have been calling `populate_fts` after inserting new rows. I am happy to make changes to the PR as you see fit. | 2019-09-02T04:23:40Z | 2019-09-03T01:03:59Z | 2019-09-02T23:42:29Z | 2019-09-02T23:42:29Z | 405e092d5916e70df10f82d15e9c052aa9ee8d80 | 0 | e01943271b17115fbe0e81d523126d2fb1c7c24b | cb70f7d10996b844154bf3da88779dd1f65590bc | CONTRIBUTOR | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/57 | ||||
395258687 | MDExOlB1bGxSZXF1ZXN0Mzk1MjU4Njg3 | 96 | closed | 0 | Add type conversion for pandas Timestamp | b0b5h4rp13 32605365 | Add type conversion for pandas Timestamp, if the pandas library is present in the system (thanks for this project, I was about to do the same thing from scratch) | 2020-03-29T14:13:09Z | 2020-03-31T04:40:49Z | 2020-03-31T04:40:48Z | 2020-03-31T04:40:48Z | 8ea626e5fcdc4c9e52f615c6347e68173805f8b4 | 0 | 16ebbd2d494caabd0eeb502f8a944614b464bb12 | 22250a9c735077d6f365b73bf824e6c67b122c83 | CONTRIBUTOR | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/96 | ||||
189707374 | MDExOlB1bGxSZXF1ZXN0MTg5NzA3Mzc0 | 279 | closed | 0 | Add version number support with Versioneer | rgieseke 198537 | I think that's all for getting Versioneer support, I've been happily using it in a couple of projects ... ``` In [2]: datasette.__version__ Out[2]: '0.22+3.g6e12445' ``` Repo: https://github.com/warner/python-versioneer Versioneer Licence: Public Domain (CC0-1.0) Closes #273 | 2018-05-22T15:39:45Z | 2018-05-22T19:35:23Z | 2018-05-22T19:35:22Z | 2018-05-22T19:35:22Z | 49f317752cfe89c5641165a490eef49e025752a7 | 0 | d0d19453e8623cc98a2baa2cadaaff4cd2fee973 | 58b5a37dbbf13868a46bcbb284509434e66eca25 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/279 | ||||
1037685744 | PR_kwDOBm6k_c492dPw | 1793 | closed | 0 | Added a useful resource | MobiWancode 111973926 | Added a useful resource about the types of SQL databases, i.e. SQLite, PostgreSQL, MySQL, etc., from Scaler Topics. <!-- readthedocs-preview datasette start --> ---- :books: Documentation preview :books:: https://datasette--1793.org.readthedocs.build/en/1793/ <!-- readthedocs-preview datasette end --> | 2022-08-26T08:41:26Z | 2022-09-06T00:41:25Z | 2022-09-06T00:41:24Z | 40c948ac58afa155bbceaff70c43e85e58434188 | 0 | 32a9224b7e107016e5ba0fc90ff86cfafad93b2f | ba35105eee2d3ba620e4f230028a02b2e2571df2 | NONE | datasette 107914493 | https://github.com/simonw/datasette/pull/1793 | |||||
293992382 | MDExOlB1bGxSZXF1ZXN0MjkzOTkyMzgy | 535 | closed | 0 | Added asgi_wrapper plugin hook, closes #520 | simonw 9599 | 2019-07-03T03:58:00Z | 2019-07-03T04:06:26Z | 2019-07-03T04:06:26Z | 2019-07-03T04:06:26Z | 4d2fdafe39159c9a8aa83f7e9bfe768bbbbb56a3 | 0 | 93bfa26bfd25a3cc911d637596e364d3474325bd | b9ede4c1898616512b5d204f9c941deff473cbe4 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/535 | |||||
755729137 | PR_kwDOBm6k_c4tC4Lx | 1487 | closed | 0 | Added instructions for installing plugins via pipx, #1486 | RhetTbull 41546558 | Adds missing instructions for installing plugins via pipx | 2021-10-12T00:48:30Z | 2021-10-13T21:09:11Z | 2021-10-13T21:09:10Z | 2021-10-13T21:09:10Z | 68087440b3448633a62807c1623559619584f2ee | 0 | 4909d5814494dcae77a851905bfc392c70f60d60 | 0d5cc20aeffa3537cfc9296d01ec24b9c6e23dcf | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/1487 | ||||
322529381 | MDExOlB1bGxSZXF1ZXN0MzIyNTI5Mzgx | 578 | closed | 0 | Added support for multi arch builds | heussd 887095 | Minor changes in the Dockerfile and a new Makefile to support Docker multi-architecture builds. `make` will build one image per architecture and push them as one Docker manifest to Docker Hub. Feel free to change `IMAGE_NAME` to `datasetteproject/datasette` to update your official Docker Hub image(s). | 2019-09-29T18:43:03Z | 2019-11-13T19:13:15Z | 2019-11-13T19:13:15Z | ae1aa0929b9e62a413ec9b4a40588e6aafe50573 | 0 | ce6372bc6210ae52ac1951647b8fbaee40d64fc1 | 0fc8afde0eb5ef677f4ac31601540d6168c8208d | NONE | datasette 107914493 | https://github.com/simonw/datasette/pull/578 | |||||
943518450 | PR_kwDODEm0Qs44PPLy | 66 | open | 0 | Ageinfo workaround | ashanan 11887 | I'm not sure if this is due to a new format or just because my ageinfo file is blank, but trying to import an archive would crash when it got to that file. This PR adds a guard clause in the `ageinfo` transformer and sets a default value that doesn't throw an exception. Seems likely to be the same issue mentioned by danp in https://github.com/dogsheep/twitter-to-sqlite/issues/54, my ageinfo file looks the same. Added that same ageinfo file to the test archive as well to help confirm my workaround didn't break anything. Let me know if you want any changes! | 2022-05-21T21:08:29Z | 2022-05-21T21:09:16Z | c22e8eba634b70e914de9f72e452b1ebea55c6ef | 0 | 75ae7c94120d14083217bc76ebd603b396937104 | f09d611782a8372cfb002792dfa727325afb4db6 | FIRST_TIME_CONTRIBUTOR | twitter-to-sqlite 206156866 | https://github.com/dogsheep/twitter-to-sqlite/pull/66 | ||||||
201451332 | MDExOlB1bGxSZXF1ZXN0MjAxNDUxMzMy | 345 | closed | 0 | Allow app names for `datasette publish heroku` | russss 45057 | Lets you supply the `-n` parameter for Heroku deploys, which also lets you update existing Heroku deployments. | 2018-07-14T13:12:34Z | 2018-07-14T14:09:54Z | 2018-07-14T14:04:44Z | 2018-07-14T14:04:44Z | 58fec99ab0a31bcf25968f2aa05d37de8139b83c | 0 | 864cf8c1b01a581d6dc9711efe7cb4f2a6ac87e8 | 31a5d8fa77be68d4f837f0a80a611675dce49f4b | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/345 | ||||
496298180 | MDExOlB1bGxSZXF1ZXN0NDk2Mjk4MTgw | 986 | closed | 0 | Allow facet by primary keys, fixes #985 | MrNaif2018 39452697 | Hello! This PR makes it possible to facet by primary keys. Did I get it right that just removing the condition on UI side is enough? From testing it works fine with primary keys, just as with normal keys. If so, should I also remove unused `data-is-pk`? | 2020-10-01T14:18:55Z | 2020-10-01T16:51:45Z | 2020-10-01T16:51:45Z | 58906c597f1217381f5d746726bcb8bdfa8f52f8 | 0 | 76f7094bd33f037a1c689a173f0dbbb988e6dcdd | 141544613f9e76ddb74eee38d6f8ee1e0e70f833 | NONE | datasette 107914493 | https://github.com/simonw/datasette/pull/986 | |||||
510235909 | MDExOlB1bGxSZXF1ZXN0NTEwMjM1OTA5 | 189 | closed | 0 | Allow iterables other than Lists in m2m records | adamwolf 35681 | I was playing around with sqlite-utils, creating a Roam Research dogsheep-style importer for Datasette, and ran into a slight snag. I wanted to use a generator to add an order column in an importer. It looked something like: ``` def order_generator(iterable, attr=None): if attr is None: attr = "order" order: int = 0 for i in iterable: i[attr] = order order += 1 yield i ``` When I used this with `insert_all` and other things, it worked fine--but it didn't work as the `records` argument to `m2m`. I dug into it, and sqlite-utils is explicitly checking if the records argument is a list or a tuple. I flipped the check upside down, and now it checks if the argument is a mapping. If it's a mapping, it wraps it in a list, otherwise it leaves it alone. (I get that it might not really make sense to put the order column on the second table. I changed my import schema a bit, and no longer have a real example, but maybe this change still makes sense.) The automated tests still pass, but I did not add any new ones. Let me know what you think! I'm really loving Datasette and its ecosystem; thanks for everything! | 2020-10-26T18:47:44Z | 2020-10-27T16:28:37Z | 2020-10-27T16:24:21Z | 2020-10-27T16:24:21Z | f045d8559a6d2cb922a2de30fbcc896a4486b82f | 0 | 93230b2acb61635b6d5070ad9c65e7221c63b75a | e4f1c7b936981de29823730c5dbef4f4ba7a4286 | CONTRIBUTOR | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/189 | ||||
764281468 | PR_kwDOBm6k_c4tjgJ8 | 1495 | open | 0 | Allow routes to have extra options | fgregg 536941 | Right now, datasette routes can only be a 2-tuple of `(regex, view_fn)`. If it was possible for datasette to handle extra options, like [standard Django does](https://docs.djangoproject.com/en/3.2/topics/http/urls/#passing-extra-options-to-view-functions), it would add flexibility for plugin authors. For example, if extra options were enabled, then it would be easy to make a single table the home page (#1284). This plugin would accomplish it. ```python from datasette import hookimpl from datasette.views.table import TableView @hookimpl def register_routes(datasette): return [ (r"^/$", TableView.as_view(datasette), {'db_name': 'DB_NAME', 'table': 'TABLE_NAME'}) ] ``` | 2021-10-22T15:00:45Z | 2021-11-19T15:36:27Z | 44969c5654748fb26ad05ab37245678f245f32e5 | 0 | fe7fa14b39846b919dfed44514a7d18d67e01dfd | ff9ccfb0310501a3b4b4ca24d73246a8eb3e7914 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/1495 | ||||||
1105985162 | PR_kwDOCGYnMM5B6_6K | 508 | closed | 0 | Allow surrogates in parameters | chapmanjacobd 7908073 | closes #507 https://dwheeler.com/essays/fixing-unix-linux-filenames.html <!-- readthedocs-preview sqlite-utils start --> ---- :books: Documentation preview :books:: https://sqlite-utils--508.org.readthedocs.build/en/508/ <!-- readthedocs-preview sqlite-utils end --> | 2022-10-31T22:11:49Z | 2022-11-17T15:11:16Z | 2022-10-31T22:55:36Z | 3b551597240d9a6058b1c3c720073120db213678 | 0 | 43a8c4c91fc22fb6bea07846f144072b0d047f4e | 529110e7d8c4a6b1bbf5fb61f2e29d72aa95a611 | CONTRIBUTOR | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/508 | |||||
1358254580 | PR_kwDOCGYnMM5Q9VH0 | 546 | closed | 0 | Analyze tables options: --common-limit, --no-most, --no-least | simonw 9599 | Refs #544 - [x] Documentation for CLI options - [x] Documentation for new Python API parameters: `most_common: bool` and `least_common: bool` - [x] Tests for CLI - [x] Tests for Python API | 2023-05-21T15:54:39Z | 2023-05-21T16:19:30Z | 2023-05-21T16:19:30Z | 2023-05-21T16:19:30Z | d2a7b15b2b930fe384e1f1715fc4af23386f4935 | 0 | 2eca17d46eca2cff52c78553085ec64d9029c969 | e047cc32e9d5de7025d4d3c16554d4290f4bd3d1 | OWNER | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/546 | ||||
275861559 | MDExOlB1bGxSZXF1ZXN0Mjc1ODYxNTU5 | 449 | closed | 0 | Apply black to everything | simonw 9599 | I've been hesitating on this for literally months, because I'm not at all excited about the giant diff that will result. But I've been using black on many of my other projects (most actively [sqlite-utils](https://github.com/simonw/sqlite-utils)) and the productivity boost is undeniable: I don't have to spend a single second thinking about code formatting any more! So it's worth swallowing the one-off pain and moving on in a new, black-enabled world. | 2019-05-03T21:57:26Z | 2019-05-04T02:17:14Z | 2019-05-04T02:15:15Z | 2019-05-04T02:15:15Z | 35d6ee2790e41e96f243c1ff58be0c9c0519a8ce | 0 | 9683aeb2394a4b7e44499b8a0240af3baafda832 | 66c87cee0c7344c7877373c60b180c766c206101 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/449 | ||||
1179812838 | PR_kwDODEm0Qs5GUoPm | 71 | open | 0 | Archive: Fix "ni devices" typo in importer | sometimes-i-send-pull-requests 26161409 | 2022-12-28T23:33:31Z | 2022-12-28T23:33:31Z | 7905dbd6e36bcabcfd9106c70ebb36ecf9e38260 | 0 | 0d3c62e8ba6e545785069cc0ffc8dc1bad03db80 | f09d611782a8372cfb002792dfa727325afb4db6 | FIRST_TIME_CONTRIBUTOR | twitter-to-sqlite 206156866 | https://github.com/dogsheep/twitter-to-sqlite/pull/71 |
CREATE TABLE [pull_requests] (
    [id] INTEGER PRIMARY KEY,
    [node_id] TEXT,
    [number] INTEGER,
    [state] TEXT,
    [locked] INTEGER,
    [title] TEXT,
    [user] INTEGER REFERENCES [users]([id]),
    [body] TEXT,
    [created_at] TEXT,
    [updated_at] TEXT,
    [closed_at] TEXT,
    [merged_at] TEXT,
    [merge_commit_sha] TEXT,
    [assignee] INTEGER REFERENCES [users]([id]),
    [milestone] INTEGER REFERENCES [milestones]([id]),
    [draft] INTEGER,
    [head] TEXT,
    [base] TEXT,
    [author_association] TEXT,
    [repo] INTEGER REFERENCES [repos]([id]),
    [url] TEXT,
    [merged_by] INTEGER REFERENCES [users]([id]),
    [auto_merge] TEXT
);
CREATE INDEX [idx_pull_requests_merged_by] ON [pull_requests] ([merged_by]);
CREATE INDEX [idx_pull_requests_repo] ON [pull_requests] ([repo]);
CREATE INDEX [idx_pull_requests_milestone] ON [pull_requests] ([milestone]);
CREATE INDEX [idx_pull_requests_assignee] ON [pull_requests] ([assignee]);
CREATE INDEX [idx_pull_requests_user] ON [pull_requests] ([user]);
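A minimal sketch of querying this table with Python's standard-library `sqlite3` module, reproducing the page's "sorted by title" view. The in-memory database and the trimmed-down copy of the schema are illustration-only assumptions; against a real github-to-sqlite export you would connect to your own database file instead.

```python
import sqlite3

# Use an in-memory database for this sketch; a real export would be a file
# such as one produced by github-to-sqlite.
conn = sqlite3.connect(":memory:")

# Trimmed copy of the pull_requests schema shown above, kept to a few columns
# for brevity.
conn.execute(
    """
    CREATE TABLE [pull_requests] (
        [id] INTEGER PRIMARY KEY,
        [number] INTEGER,
        [state] TEXT,
        [title] TEXT,
        [merged_at] TEXT
    )
    """
)

# One sample row taken from the data above (PR #209).
conn.execute(
    "INSERT INTO pull_requests (id, number, state, title, merged_at) "
    "VALUES (?, ?, ?, ?, ?)",
    (
        181723303,
        209,
        "closed",
        "Don't duplicate simple primary keys in the link column",
        "2018-04-18T01:13:04Z",
    ),
)

# The same ordering Datasette applies for the "sorted by title" view.
rows = conn.execute(
    "SELECT number, title FROM pull_requests ORDER BY title"
).fetchall()
print(rows)
```

The bracketed identifiers in the DDL are SQLite's quoting style; unquoted column names work equally well in the SELECT.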