issues
374 rows where comments = 2 and type = "issue" sorted by updated_at descending
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at ▲ | closed_at | author_association | pull_request | body | repo | type | active_lock_reason | performed_via_github_app | reactions | draft | state_reason |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1570375808 | I_kwDODFdgUs5dmgiA | 79 | Deploy demo job is failing due to rate limit | simonw 9599 | open | 0 | 2 | 2023-02-03T20:05:01Z | 2023-12-08T14:50:15Z | MEMBER | github-to-sqlite 207052882 | issue | { "url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/79/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
|||||||||
2029161033 | I_kwDOCGYnMM548opJ | 606 | str and int as aliases for text and integer | simonw 9599 | closed | 0 | 2 | 2023-12-06T18:35:49Z | 2023-12-06T19:44:04Z | 2023-12-06T18:49:32Z | OWNER | I keep making this mistake:
```
Error: Invalid value for '[[integer|float|blob|text|INTEGER|FLOAT|BLOB|TEXT]]': 'int' is not one of 'integer', 'float', 'blob', 'text', 'INTEGER', 'FLOAT', 'BLOB', 'TEXT'.
``` |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/606/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
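For context on the alias request in #606 above: the sqlite-utils Python API already accepts Python types such as `str` and `int` when defining columns, which is why typing them at the CLI is an easy mistake to make. A minimal sketch of that Python-API behaviour (the table and column names are invented for illustration):
```python
import sqlite_utils

# In-memory database purely for illustration
db = sqlite_utils.Database(memory=True)

# The Python API maps Python types to SQLite column types:
# str -> TEXT, int -> INTEGER, float -> FLOAT, bytes -> BLOB
db["people"].create({"id": int, "name": str, "score": float}, pk="id")

# Prints the generated CREATE TABLE statement
print(db["people"].schema)
```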
1910269679 | I_kwDOBm6k_c5x3Gbv | 2196 | Discord invite link returns 401 | Olshansk 1892194 | closed | 0 | 2 | 2023-09-24T15:16:54Z | 2023-10-13T00:07:08Z | 2023-10-12T21:54:54Z | NONE | I found the link to the datasette discord channel via this query. The following video should be self explanatory: https://github.com/simonw/datasette/assets/1892194/8cd33e88-bcaa-41f3-9818-ab4d589c3f02 Link for reference: https://discord.com/invite/ktd74dm5mw |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2196/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1825007061 | I_kwDOBm6k_c5sx2XV | 2123 | datasette serve when invoked with --reload interprets the serve command as a file | cadeef 79087 | open | 0 | 2 | 2023-07-27T19:07:22Z | 2023-09-18T13:02:46Z | NONE | When running
If a 'serve' file is created it launches properly (albeit with an empty database called serve):
Version (running from HEAD on main):
This issue appears to have existed for a while, as https://github.com/simonw/datasette/issues/1380#issuecomment-953366110 mentions the error in a different context. I'm happy to debug and land a patch if it's welcome. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2123/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1900026059 | I_kwDOBm6k_c5xQBjL | 2188 | Plugin Hooks for "compile to SQL" languages | asg017 15178711 | open | 0 | 2 | 2023-09-18T01:37:15Z | 2023-09-18T06:58:53Z | CONTRIBUTOR | There's a ton of tools/languages that compile to SQL, which may be nice in Datasette. Some examples:
It would be cool if plugins could extend Datasette to use these languages, in both the code editor and API usage. A few things I'd imagine a
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2188/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1898927976 | I_kwDOBm6k_c5xL1do | 2186 | Mechanism for register_output_renderer hooks to access full count | simonw 9599 | open | 0 | Datasette 1.0 3268330 | 2 | 2023-09-15T18:57:54Z | 2023-09-15T19:27:59Z | OWNER | The cause of this bug: - https://github.com/simonw/datasette-export-notebook/issues/17 Is that That field is no longer available by default - the It would be useful if plugins like this could access the total count on demand, should they need to. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2186/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
|||||||
1891614971 | I_kwDOCGYnMM5wv8D7 | 594 | Represent compound foreign keys in table.foreign_keys output | simonw 9599 | open | 0 | 2 | 2023-09-12T03:48:24Z | 2023-09-12T03:51:13Z | OWNER | Given this schema:
|
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/594/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
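To illustrate the introspection discussed in #594 above, here is a minimal sketch of how `table.foreign_keys` reports a single-column foreign key today; compound foreign keys declared in raw SQL are the case the issue says is not yet represented. The table names are invented:
```python
import sqlite_utils

db = sqlite_utils.Database(memory=True)
db["authors"].create({"id": int, "name": str}, pk="id")
db["books"].create(
    {"id": int, "title": str, "author_id": int},
    pk="id",
    foreign_keys=[("author_id", "authors", "id")],
)

# Each entry is a ForeignKey tuple: (table, column, other_table, other_column)
for fk in db["books"].foreign_keys:
    print(fk)
```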
1846076261 | I_kwDOBm6k_c5uCONl | 2139 | border-color: ##ff0000 bug - two hashes | simonw 9599 | closed | 0 | Datasette 1.0a-next 8755003 | 2 | 2023-08-11T01:22:58Z | 2023-08-11T05:16:24Z | 2023-08-11T05:16:24Z | OWNER | datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2139/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1844213115 | I_kwDOBm6k_c5t7HV7 | 2138 | on_success_message_sql option for writable canned queries | simonw 9599 | closed | 0 | Datasette 1.0a-next 8755003 | 2 | 2023-08-10T00:20:14Z | 2023-08-10T00:39:40Z | 2023-08-10T00:34:26Z | OWNER |
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2138/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
1840329615 | I_kwDOBm6k_c5tsTOP | 2130 | Render plugin mechanism needs `error` and `truncated` fields | simonw 9599 | closed | 0 | Datasette 1.0a3 9700784 | 2 | 2023-08-07T23:19:19Z | 2023-08-08T01:51:54Z | 2023-08-08T01:47:42Z | OWNER | While working on: - https://github.com/simonw/datasette/pull/2118 It became clear that the Needs to grow the ability to be told if an error occurred (an |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2130/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
1816997390 | I_kwDOCGYnMM5sTS4O | 576 | Backfill the release notes prior to 0.4 | simonw 9599 | closed | 0 | 2 | 2023-07-23T05:41:42Z | 2023-07-23T05:49:51Z | 2023-07-23T05:48:21Z | OWNER | Currently the changelog starts at 0.4: https://sqlite-utils.datasette.io/en/3.34/changelog.html#id115 I want the other releases - according to https://pypi.org/project/sqlite-utils/#history there are three missing: |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/576/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1816919568 | I_kwDOCGYnMM5sS_4Q | 575 | Python API ability to opt-out of connection plugins | simonw 9599 | closed | 0 | 2 | 2023-07-22T23:01:13Z | 2023-07-22T23:17:22Z | 2023-07-22T23:08:22Z | OWNER | Plugins affecting the CLI by default makes sense to me. I'm less confident about them always affecting users of the Python API. I'm going to have them apply by default, but I'm going to add a mechanism to opt-out on an individual database basis. Basically this:
```python
from sqlite_utils import Database

db = Database(memory=True, execute_plugins=False)
# Anything using db from here on will not execute plugins
```
cc @asg017 Refs: - #567 - #574 |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/575/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1816877910 | I_kwDOCGYnMM5sS1tW | 572 | Don't test Python 3.7 against textual | simonw 9599 | closed | 0 | 2 | 2023-07-22T19:57:03Z | 2023-07-22T22:16:50Z | 2023-07-22T22:16:50Z | OWNER | Spotted this in the GitHub Actions logs: |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/572/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1808215339 | I_kwDOBm6k_c5rxy0r | 2104 | Tables starting with an underscore should be treated as hidden | simonw 9599 | open | 0 | 2 | 2023-07-17T17:13:53Z | 2023-07-18T22:41:37Z | OWNER | Plugins can then take advantage of this pattern, for example: - https://github.com/simonw/datasette-auth-tokens/pull/8 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2104/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1765870617 | I_kwDOBm6k_c5pQQwZ | 2087 | `--settings settings.json` option | simonw 9599 | open | 0 | 2 | 2023-06-20T17:48:45Z | 2023-07-14T17:02:03Z | OWNER | https://discord.com/channels/823971286308356157/823971286941302908/1120705940728066080
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2087/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1795219865 | I_kwDOCGYnMM5rAOGZ | 566 | `--no-headers` doesn't work on most formats | zellyn 33625 | open | 0 | 2 | 2023-07-09T03:43:36Z | 2023-07-09T04:13:35Z | NONE | Version 3.33
|
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/566/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1795187493 | I_kwDODLZ_YM5rAGMl | 12 | Switch to pyproject.toml | simonw 9599 | closed | 0 | 2 | 2023-07-09T01:06:56Z | 2023-07-09T01:19:43Z | 2023-07-09T01:19:42Z | MEMBER | First of my CLI tools to use https://til.simonwillison.net/python/pyproject |
pocket-to-sqlite 213286752 | issue | { "url": "https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/12/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1781047747 | I_kwDOBm6k_c5qKKHD | 2092 | test_homepage intermittent failure | simonw 9599 | closed | 0 | 2 | 2023-06-29T15:20:37Z | 2023-06-29T15:26:28Z | 2023-06-29T15:24:13Z | OWNER | e.g. in https://github.com/simonw/datasette/actions/runs/5413590227/jobs/9839373852
```
=================================== FAILURES ===================================
______________________________ test_homepage _______________________________
[gw0] linux -- Python 3.7.17 /opt/hostedtoolcache/Python/3.7.17/x64/bin/python

ds_client = <datasette.app.DatasetteClient object at 0x7f85d271ef50>
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2092/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1781005740 | I_kwDOBm6k_c5qJ_2s | 2090 | Adopt ruff for linting | simonw 9599 | open | 0 | 2 | 2023-06-29T14:56:43Z | 2023-06-29T15:05:04Z | OWNER | datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2090/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
|||||||||
1655860104 | I_kwDOCGYnMM5ismuI | 535 | rows: --transpose or psql extended view-like functionality | chapmanjacobd 7908073 | closed | 0 | 2 | 2023-04-05T15:37:33Z | 2023-06-15T08:39:49Z | 2023-06-14T22:05:28Z | CONTRIBUTOR | It would be nice if the rows subcommand had a flag, perhaps called In other words instead of this:
| track_id | title | song_id | release | artist_id | artist_mbid | artist_name | duration | artist_familiarity | artist_hotttnesss | year | track_7digitalid | shs_perf | shs_work |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| TRMMMYQ128F932D901 | Silent Night | SOQMMHC12AB0180CB8 | Monster Ballads X-Mas | ARYZTJS1187B98C555 | 357ff05d-848a-44cf-b608-cb34b5701ae5 | Faster Pussy cat | 252.055 | 0.649822 | 0.394032 | 2003 | 7032331 | -1 | 0 |
| TRMMMKD128F425225D | Tanssi vaan | SOVFVAK12A8C1350D9 | Karkuteillä | ARMVN3U1187FB3A1EB | 8d7ef530-a6fd-4f8f-b2e2-74aec765e0f9 | Karkkiautomaatti | 156.551 | 0.439604 | 0.356992 | 1995 | 1514808 | -1 | 0 |
| TRMMMRX128F93187D9 | No One Could Ever | SOGTUKN12AB017F4F1 | Butter | ARGEKB01187FB50750 | 3d403d44-36ce-465c-ad43-ae877e65adc4 | Hudson Mohawke | 138.971 | 0.643681 | 0.437504 | 2006 | 6945353 | -1 | 0 |
| TRMMMCH128F425532C | Si Vos Querés | SOBNYVR12A8C13558C | De Culo | ARNWYLR1187B9B2F9C | 12be7648-7094-495f-90e6-df4189d68615 | Yerba Brava | 145.058 | 0.448501 | 0.372349 | 2003 | 2168257 | -1 | 0 |
| TRMMMWA128F426B589 | Tangle Of Aspens | SOHSBXH12A8C13B0DF | Rene Ablaze Presents Winter Sessions | AREQDTE1269FB37231 | | Der Mystic | 514.298 | 0 | 0 | 0 | 2264873 | -1 | 0 |

The output would look something like this:
| track_id |
|---|
| TRYYYVU12903CD01E3 |
| TRYYYDJ128F9310A21 |
| TRYYYMG128F4260ECA |
| TRYYYJO128F426DA37 |
| TRYYYUS12903CD2DF0 |

| title |
|---|
| Fernweh feat. Sektion Kuchikäschtli |
| Faraday |
| Novemba |
| Jago Chhadeo |
| O Samba Da Vida |

| song_id |
|---|
| SOWXJXQ12AB0189F43 |
| SOLXGOR12A81C21EB7 |
| SOHODZI12A8C137BB3 |
| SOXQYIQ12A8C137FBB |
| SOTXAME12AB018F136 |

| release |
|---|
| So Oder So |
| The Trance Collection Vol. 2 |
| Dub_Connected: electronic music |
| Naale Baba Lassi Pee Gya |
| Pacha V.I.P. |

| artist_id |
|---|
| AR7PLM21187B990D08 |
| ARCMCOK1187B9B1073 |
| ARZ3R6M1187B9AF750 |
| ART5FZD1187B9A7FCF |
| AR7Z4J81187FB3FC59 |

| artist_mbid |
|---|
| 3af2b07e-c91c-4160-9bda-f0b9e3144ed3 |
| 4ac5f3de-c5ad-475e-ad50-41f1ef9dba20 |
| 8b97e9c8-61f5-4615-9a96-276f24204e34 |
| 2357c400-9109-42b6-b3fe-9e2d9f8e3872 |
| 9d50cb20-7e42-45cc-b0dd-154c3e92a577 |

| artist_name |
|---|
| Texta |
| Elude |
| Gabriel Le Mar |
| Kuldeep Manak |
| Kiko Navarro |

| duration |
|---|
| 295.079 |
| 484.519 |
| 553.038 |
| 244.166 |
| 217.443 |

| artist_familiarity |
|---|
| 0.552977 |
| 0.403668 |
| 0.556918 |
| 0.4015 |
| 0.528617 |

| artist_hotttnesss |
|---|
| 0.454869 |
| 0.256935 |
| 0.336914 |
| 0.374866 |
| 0.411595 |

| year |
|---|
| 2004 |
| 0 |
| 0 |
| 0 |
| 0 |

| track_7digitalid |
|---|
| 8486723 |
| 5472456 |
| 2219291 |
| 1632096 |
| 7522478 |

| shs_perf |
|---|
| -1 |
| -1 |
| -1 |
| -1 |
| -1 |

| shs_work |
|---|
| 0 |
| 0 |
| 0 |
| 0 |
| 0 |
|
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/535/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1740150327 | I_kwDOCGYnMM5nuJY3 | 557 | Aliased ROWID option for tables created from alter=True commands | chapmanjacobd 7908073 | closed | 0 | 2 | 2023-06-04T05:29:28Z | 2023-06-14T06:09:21Z | 2023-06-05T19:26:26Z | CONTRIBUTOR |
ROWID should never be used with foreign keys but the simple act of aliasing rowid to id (which is what happens when one does It would be convenient if there were more options to use a string column (eg. filepath) as the PK, and be able to use it during upserts, but when creating a foreign key, to create an integer column which aliases rowid I made an attempt to switch to integer primary keys here but it is not going well... In my usecase the path column is a business key. Yes, it should be as simple as including the https://github.com/chapmanjacobd/library/actions/runs/5173602136/jobs/9319024777 |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/557/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1718515590 | I_kwDOCGYnMM5mbneG | 544 | New options for analyze-tables --common-limit --no-most and --no-least | simonw 9599 | closed | 0 | 2 | 2023-05-21T14:03:19Z | 2023-05-21T17:03:06Z | 2023-05-21T16:19:31Z | OWNER | The "least common" section is frequently uninteresting, especially for huge tables with a large number of repeated-once values.
|
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/544/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1708030220 | I_kwDOBm6k_c5lznkM | 2073 | Faceting doesn't work against integer columns in views | simonw 9599 | open | 0 | 2 | 2023-05-12T18:20:10Z | 2023-05-12T18:24:07Z | OWNER | Spotted this issue here: https://til.simonwillison.net/datasette/baseline I had to do this workaround:
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2073/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1702354223 | I_kwDOBm6k_c5ld90v | 2070 | Mechanism for deploying a preview of a branch using Vercel | simonw 9599 | closed | 0 | 2 | 2023-05-09T16:21:45Z | 2023-05-09T16:25:00Z | 2023-05-09T16:24:31Z | OWNER | I prototyped that here: https://github.com/simonw/one-off-actions/blob/main/.github/workflows/deploy-datasette-branch-preview.yml It deployed the |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2070/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1595340692 | I_kwDOCGYnMM5fFveU | 530 | add ability to configure "on delete" and "on update" attributes of foreign keys: | fgregg 536941 | open | 0 | 2 | 2023-02-22T15:44:14Z | 2023-05-08T20:39:01Z | CONTRIBUTOR | sqlite supports these, and it would be quite nice to be able to add them with sqlite-utils. |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/530/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
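The SQLite feature requested in #530 above can already be expressed in raw SQL; a sketch of the kind of schema the issue wants sqlite-utils to be able to generate, with invented table names:
```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE authors (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE books (
        id INTEGER PRIMARY KEY,
        title TEXT,
        author_id INTEGER REFERENCES authors(id)
            ON DELETE CASCADE
            ON UPDATE CASCADE
    );
    """
)
# Foreign key enforcement is off by default in SQLite; turn it on per connection
conn.execute("PRAGMA foreign_keys = ON")
```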
1620254998 | I_kwDOCGYnMM5gkyEW | 532 | Show more information when JSON can't be imported with sqlite-utils insert | voltagex 83080728 | closed | 0 | 2 | 2023-03-12T06:41:44Z | 2023-05-08T20:32:16Z | 2023-05-08T20:32:02Z | NONE | I am currently trying to import the JSON export of my data from Discord, specifically
Please show more information as to why this is invalid, if possible. I am using version 3.30 with Python 3.10 on Windows 11. |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/532/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1617602868 | I_kwDOJHON9s5gaqk0 | 6 | Character encoding problem | simonw 9599 | open | 0 | 2 | 2023-03-09T16:44:34Z | 2023-04-14T15:22:09Z | MEMBER | I ran against a recent note with this in it:
And got back:
Pasting that into https://ftfy.vercel.app/?s=Actions+%E2%80%9A%C3%B6%C3%B4%C3%94%E2%88%8F%C3%A8+ gives this:
|
apple-notes-to-sqlite 611552758 | issue | { "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
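The ftfy service linked from the issue above also exists as a Python library; a hedged sketch of using it to repair mojibake like the text in the report (the sample string here is an invented example, not the actual garbled note):
```python
import ftfy  # pip install ftfy

garbled = "This â€” should be an em dash"  # hypothetical mojibake sample
print(ftfy.fix_text(garbled))
```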
1515883470 | I_kwDOC8tyDs5aWovO | 24 | DOC: xml.etree.ElementTree.ParseError due to healthkit version 12 | mmngreco 6231413 | open | 0 | 2 | 2023-01-01T23:00:38Z | 2023-03-30T10:17:31Z | NONE | Hi @simonw, I hope you find this issue OK; the idea is to provide some documentation for other users like me about how to solve this problem and save some time. Following the instructions from the
So, after debugging and searching on the internet I found this useful link: https://discussions.apple.com/thread/254202523 (etresoft, the real hero), which basically says that the XML given by the health app (healthkit version 12) has some bugs but fortunately they can be solved with a couple of commands:
|
healthkit-to-sqlite 197882382 | issue | { "url": "https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/24/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1633077183 | I_kwDOBm6k_c5hVse_ | 2041 | Remove obsolete table POST code | simonw 9599 | closed | 0 | Datasette 1.0a-next 8755003 | 2 | 2023-03-21T01:01:40Z | 2023-03-21T01:17:44Z | 2023-03-21T01:17:43Z | OWNER | Spotted this in: - #1999
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2041/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
1620516340 | I_kwDOCGYnMM5glx30 | 533 | ReadTheDocs error: not all arguments converted during string formatting | simonw 9599 | closed | 0 | 2 | 2023-03-12T21:21:05Z | 2023-03-12T21:25:33Z | 2023-03-12T21:25:33Z | OWNER | This came up as a failure running tests for: - #531 Traceback on https://readthedocs.org/projects/sqlite-utils/builds/19749348/
```
  File "/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/docutils/parsers/rst/states.py", line 889, in interpreted
    nodes, messages2 = role_fn(role, rawsource, text, lineno, self)
  File "/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/sphinx/ext/extlinks.py", line 103, in role
    title = caption % part
TypeError: not all arguments converted during string formatting

Exception occurred:
  File "/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/531/lib/python3.8/site-packages/sphinx/ext/extlinks.py", line 103, in role
    title = caption % part
TypeError: not all arguments converted during string formatting
``` |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/533/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1618249044 | I_kwDOBm6k_c5gdIVU | 2038 | Consider a `strict_templates` setting | simonw 9599 | open | 0 | 2 | 2023-03-10T02:09:13Z | 2023-03-10T02:11:06Z | OWNER | A setting which turns on Jinja strict mode, so any templates that access undefined variables raise a hard error. Prototype here:
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2038/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
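For the Jinja strict mode mentioned in #2038 above, the underlying mechanism is Jinja2's `StrictUndefined`; a minimal sketch of that behaviour (illustrative only, not Datasette's actual implementation):
```python
from jinja2 import Environment, StrictUndefined

env = Environment(undefined=StrictUndefined)
template = env.from_string("Hello {{ name }}!")

print(template.render(name="world"))  # renders fine
template.render()  # raises jinja2.exceptions.UndefinedError: 'name' is undefined
```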
1616429236 | I_kwDOJHON9s5gWMC0 | 4 | Support incremental updates | simonw 9599 | open | 0 | 2 | 2023-03-09T05:14:00Z | 2023-03-09T18:20:56Z | MEMBER | Running this script can take several hours against a large notes database. Would be neat if it could run against just the notes that have been modified since it last ran. Could pull the max Problem is I don't actually know what order it iterates over the notes in. |
apple-notes-to-sqlite 611552758 | issue | { "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1612296210 | I_kwDOBm6k_c5gGbAS | 2033 | `datasette install -r requirements.txt` | simonw 9599 | closed | 0 | 2 | 2023-03-06T22:17:17Z | 2023-03-06T22:54:52Z | 2023-03-06T22:27:34Z | OWNER | Would be useful for cases where you want to install a whole set of plugins in one go, e.g. when running tutorials in GitHub Codespaces. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2033/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1594383280 | I_kwDOBm6k_c5fCFuw | 2030 | How to use Datasette with apache webserver on GCP? | gk7279 19700859 | closed | 0 | 2 | 2023-02-22T03:08:49Z | 2023-02-22T21:54:39Z | 2023-02-22T21:54:39Z | NONE | Hi Simon and Datasette team- I have installed apache2 webserver inside GCP VM using apt. I can see my "Hello World" index.html if I use the external IP of this GCP in a browser. However, when I try to run datasette with different combinations of -h and -p, I am still unable to access the webpage. I cannot invest Docker on this VM. Any pointers to use datasette with already existing apache2 webserver on GCP is appreciated. Thanks. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2030/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1579695809 | I_kwDOBm6k_c5eKD7B | 2023 | Error: Invalid setting 'hash_urls' in settings.json in 0.64.1 | mlaparie 80409402 | closed | 0 | 2 | 2023-02-10T13:35:01Z | 2023-02-10T15:40:00Z | 2023-02-10T15:39:59Z | NONE | On a Debian machine, using datasette 0.64.1 installed with This is my
This looks ok to me. Would you have any ideas? |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2023/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1186696202 | I_kwDOBm6k_c5Gu4wK | 1696 | Show foreign key label when filtering | simonw 9599 | open | 0 | 2 | 2022-03-30T16:18:54Z | 2023-01-29T20:56:20Z | OWNER | For example here: 3 corresponds to "Human Related: Other" - it would be neat to display this in this area of the page somehow. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1696/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1558644003 | I_kwDOBm6k_c5c5wUj | 2006 | Teach `datasette publish` to pin to `datasette<1.0` in a 0.x release | simonw 9599 | open | 0 | Datasette 1.0 3268330 | 2 | 2023-01-26T19:17:40Z | 2023-01-26T19:20:53Z | OWNER | I just realized that when I ship Datasette 1.0 there may be automated deployments out there which could deploy the 1.0 version by accident, potentially breaking any customizations that aren't compatible with the 1.0 changes. I can hopefully help avoid that by shipping one last entry in the |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2006/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
|||||||
1551113681 | I_kwDOBm6k_c5cdB3R | 1998 | `datasette --version` should also show the SQLite version | simonw 9599 | open | 0 | 2 | 2023-01-20T16:11:30Z | 2023-01-20T18:19:06Z | OWNER | datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1998/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
|||||||||
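For #1998 above, the SQLite version is already available to Python; a sketch of where that information would come from (not the actual `datasette --version` implementation):
```python
import sqlite3

# Version of the SQLite library the sqlite3 module was compiled against
print(sqlite3.sqlite_version)

# It can also be queried from a live connection
conn = sqlite3.connect(":memory:")
print(conn.execute("select sqlite_version()").fetchone()[0])
```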
1522778923 | I_kwDOBm6k_c5aw8Mr | 1978 | Document datasette.urls.row and row_blob | eyeseast 25778 | closed | 0 | 2 | 2023-01-06T15:45:51Z | 2023-01-09T14:30:00Z | 2023-01-09T14:30:00Z | CONTRIBUTOR | These are in the codebase but not in documentation. I think everything else in this class is documented.
```python
class Urls:
    ...

    def row(self, database, table, row_path, format=None):
        path = f"{self.table(database, table)}/{row_path}"
        if format is not None:
            path = path_with_format(path=path, format=format)
        return PrefixedUrlString(path)
``` |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1978/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
not_planned | ||||||
1524867951 | I_kwDOBm6k_c5a46Nv | 1980 | "Cannot sort table by id" when sortable_columns is used | simonw 9599 | open | 0 | 2 | 2023-01-09T03:21:33Z | 2023-01-09T03:23:53Z | OWNER | I had an instance with this in
It sent me to
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1980/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1515182998 | I_kwDOBm6k_c5aT9uW | 1970 | Path "None" in _internal database table | simonw 9599 | closed | 0 | 2 | 2022-12-31T18:51:05Z | 2022-12-31T19:22:58Z | 2022-12-31T18:52:49Z | OWNER | datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1970/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||||
1501713288 | I_kwDOBm6k_c5ZglOI | 1963 | 0.63.3 bugfix release | simonw 9599 | closed | 0 | 2 | 2022-12-18T02:48:15Z | 2022-12-18T03:26:55Z | 2022-12-18T03:26:55Z | OWNER | I'm going to ship a release which back-ports these two fixes: |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1963/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1495821607 | I_kwDOBm6k_c5ZKG0n | 1953 | Release notes for Datasette 1.0a2 | simonw 9599 | closed | 0 | Datasette 1.0a2 8711695 | 2 | 2022-12-14T06:26:40Z | 2022-12-15T02:02:15Z | 2022-12-15T02:01:08Z | OWNER | datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1953/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
855296937 | MDU6SXNzdWU4NTUyOTY5Mzc= | 1295 | Errors should have links to further information | simonw 9599 | open | 0 | 2 | 2021-04-11T12:39:12Z | 2022-12-14T23:28:49Z | OWNER | Inspired by this tweet: https://twitter.com/willmcgugan/status/1381186384510255104
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1295/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1483320357 | I_kwDOBm6k_c5Yaawl | 1937 | /db/-/create API should require insert-rows permission to use row: or rows: option | simonw 9599 | closed | 0 | Datasette 1.0a2 8711695 | 2 | 2022-12-08T01:33:09Z | 2022-12-14T20:21:26Z | 2022-12-14T20:21:26Z | OWNER | Otherwise someone with |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1937/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
1497288666 | I_kwDOBm6k_c5ZPs_a | 1956 | Handle abbreviations properly in permission_allowed_actor_restrictions | simonw 9599 | closed | 0 | Datasette 1.0a2 8711695 | 2 | 2022-12-14T19:54:21Z | 2022-12-14T20:04:29Z | 2022-12-14T20:04:28Z | OWNER | This code currently assumes abbreviations are:
That's no longer correct, they are now registered by the new plugin hook: - #1939 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1956/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
1495241162 | I_kwDOBm6k_c5ZH5HK | 1950 | Bad ?_sort returns a 500 error, should be a 400 | simonw 9599 | closed | 0 | 2 | 2022-12-13T22:08:16Z | 2022-12-13T22:23:22Z | 2022-12-13T22:23:22Z | OWNER | datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1950/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||||
1200649124 | I_kwDOBm6k_c5HkHOk | 1708 | Datasette 1.0 alpha upcoming release notes | simonw 9599 | open | 0 | Datasette 1.0a-next 8755003 | 2 | 2022-04-11T22:57:12Z | 2022-12-13T05:29:06Z | OWNER | I'm going to try writing the release notes first, to see if that helps unblock me. ⚠️ Any release notes in this issue are a draft, and should not be treated as the real thing ⚠️ |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1708/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
|||||||
1493339206 | I_kwDOBm6k_c5ZAoxG | 1946 | `datasette --get` mechanism for sending tokens | simonw 9599 | closed | 0 | Datasette 1.0a2 8711695 | 2 | 2022-12-13T04:25:05Z | 2022-12-13T04:36:57Z | 2022-12-13T04:36:57Z | OWNER |
Originally posted by @simonw in https://github.com/simonw/datasette/issues/1855#issuecomment-1347731288 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1946/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
1483250004 | I_kwDOBm6k_c5YaJlU | 1936 | Fix /db/table/-/upsert in the API explorer | simonw 9599 | open | 0 | Datasette 1.0 3268330 | 2 | 2022-12-08T00:59:34Z | 2022-12-08T01:36:02Z | OWNER | Split from: - #1931 - #1878 This is a bit tricky because the code needs to figure out what the primary keys are for an item, and whether or not |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1936/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
|||||||
1469043836 | I_kwDOBm6k_c5Xj9R8 | 1917 | Don't allow writable API to edit the `_memory` database | simonw 9599 | closed | 0 | Datasette 1.0a1 7867486 | 2 | 2022-11-30T04:51:59Z | 2022-11-30T05:07:56Z | 2022-11-30T05:07:55Z | OWNER | It shows up on https://latest.datasette.io/-/api (once you are signed in as root) - but there's no point in creating tables in it because they likely won't persist from one request to the next, as it's not a shared named database. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1917/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
1468519699 | I_kwDOBm6k_c5Xh9UT | 1911 | `/db/-/create` should support creating tables with compound primary keys | simonw 9599 | closed | 0 | Datasette 1.0a0 8658075 | 2 | 2022-11-29T18:30:47Z | 2022-11-29T18:50:58Z | 2022-11-29T18:48:05Z | OWNER | Found myself needing this to write the tests for: - #1864 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1911/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
1455928469 | I_kwDOBm6k_c5Wx7SV | 1903 | Refactor all error classes into a datasette.exceptions module | simonw 9599 | open | 0 | Datasette 1.0 3268330 | 2 | 2022-11-18T22:44:45Z | 2022-11-20T22:35:01Z | OWNER | While working on this issue: - #1896 I realized that Datasette has error classes scattered around a fair bit, including some in the I should clean these up. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1903/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
|||||||
1447388809 | I_kwDOBm6k_c5WRWaJ | 1887 | Add a confirm step to the drop table API | simonw 9599 | closed | 0 | Datasette 1.0a0 8658075 | 2 | 2022-11-14T04:59:53Z | 2022-11-15T19:59:59Z | 2022-11-14T05:18:51Z | OWNER |
Originally posted by @simonw in https://github.com/simonw/datasette/issues/1871#issuecomment-1313097057 Added drop table API in: - #1874 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1887/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
1436539554 | I_kwDOCGYnMM5Vn9qi | 511 | [insert_all, upsert_all] IntegrityError: constraint failed | chapmanjacobd 7908073 | closed | 0 | 2 | 2022-11-04T19:21:48Z | 2022-11-04T22:59:54Z | 2022-11-04T22:54:09Z | CONTRIBUTOR | My understand is that ``` import argparse from pathlib import Path from xklb import db, utils from xklb.utils import log def parse_args() -> argparse.Namespace: parser = argparse.ArgumentParser() parser.add_argument("database") parser.add_argument("dbs", nargs="*") parser.add_argument("--upsert") parser.add_argument("--db", "-db", help=argparse.SUPPRESS) parser.add_argument("--verbose", "-v", action="count", default=0) args = parser.parse_args()
def merge_db(args, source_db): source_db = str(Path(source_db).resolve())
def merge_dbs(): args = parse_args() for s_db in args.dbs: merge_db(args, s_db) if name == "main": merge_dbs() ``` ``` $ lb-dev merge video.db tube_71.db --upsert path -vv SQL: INSERT OR IGNORE INTO media VALUES(?); - params: ['https://archive.org/details/088ghostofachanceroygetssackedrevengeofthelivinglunchdvdripxvidphz'] ... File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:3122, in Table.insert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, hash_id_columns, alter, ignore, replace, truncate, extracts, conversions, columns, upsert, analyze) 3116 all_columns += [ 3117 column for column in record if column not in all_columns 3118 ] 3120 first = False -> 3122 self.insert_chunk( 3123 alter, 3124 extracts, 3125 chunk, 3126 all_columns, 3127 hash_id, 3128 hash_id_columns, 3129 upsert, 3130 pk, 3131 conversions, 3132 num_records_processed, 3133 replace, 3134 ignore, 3135 ) 3137 if analyze: 3138 self.analyze() File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:2887, in Table.insert_chunk(self, alter, extracts, chunk, all_columns, hash_id, hash_id_columns, upsert, pk, conversions, num_records_processed, replace, ignore) 2885 for query, params in queries_and_params: 2886 try: -> 2887 result = self.db.execute(query, params) 2888 except OperationalError as e: 2889 if alter and (" column" in e.args[0]): 2890 # Attempt to add any missing columns, then try again File ~/.local/lib/python3.10/site-packages/sqlite_utils/db.py:484, in Database.execute(self, sql, parameters) 482 self._tracer(sql, parameters) 483 if parameters is not None: --> 484 return self.conn.execute(sql, parameters) 485 else: 486 return self.conn.execute(sql) IntegrityError: constraint failed
|
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/511/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1431786951 | I_kwDOBm6k_c5VV1XH | 1876 | SQL query should wrap on SQL interrupted screen | simonw 9599 | closed | 0 | 2 | 2022-11-01T17:14:01Z | 2022-11-01T17:22:33Z | 2022-11-01T17:22:33Z | OWNER | Just saw this: |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1876/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1423182778 | I_kwDOCGYnMM5U1Au6 | 505 | Release sqlite-utils 3.30 | simonw 9599 | closed | 0 | 2 | 2022-10-25T22:20:05Z | 2022-10-25T22:41:26Z | 2022-10-25T22:41:16Z | OWNER | sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/505/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||||
1420055377 | I_kwDOBm6k_c5UpFNR | 1847 | Both _local_metadata and _metadata_local? | simonw 9599 | closed | 0 | 2 | 2022-10-24T01:43:08Z | 2022-10-24T01:53:13Z | 2022-10-24T01:53:13Z | OWNER | Spotted this in the debugger against the
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1847/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1397193691 | I_kwDOBm6k_c5TR3vb | 1832 | __bool__ method on Results | simonw 9599 | closed | 0 | 2 | 2022-10-05T04:18:12Z | 2022-10-05T04:32:33Z | 2022-10-05T04:32:33Z | OWNER | Wrote this code today: https://github.com/simonw/datasette-public/blob/1401bfae50e71c1dfd2bfb6954f2e86d5a7ab21b/datasette_public/init.py#L41
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1832/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
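A hedged sketch of the behaviour requested in #1832 above: a `__bool__` method so truthiness reflects whether any rows came back. This is an illustrative stand-in, not the actual Datasette `Results` class:
```python
class Results:
    """Simplified stand-in for Datasette's query results wrapper (illustrative)."""

    def __init__(self, rows):
        self.rows = rows

    def __bool__(self):
        # Truthy only if at least one row was returned
        return bool(self.rows)


results = Results(rows=[])
if not results:
    print("No rows matched")
```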
1393903845 | I_kwDOBm6k_c5TFUjl | 1828 | word-wrap: anywhere resulting in weird display | simonw 9599 | closed | 0 | 2 | 2022-10-02T21:25:03Z | 2022-10-02T23:01:17Z | 2022-10-02T23:01:17Z | OWNER | e.g. on https://github-to-sqlite.dogsheep.net/github/commits This is from a change introduced here: https://github.com/simonw/datasette/commit/bf8d84af5422606597be893cedd375020cb2b369 in #1805 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1828/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1122427321 | I_kwDOBm6k_c5C5uG5 | 1624 | Index page `/` has no CORS headers | simonw 9599 | open | 0 | 2 | 2022-02-02T21:56:10Z | 2022-09-28T16:54:22Z | OWNER | Compare the following: ``` % curl -I 'https://latest.datasette.io/fixtures' HTTP/1.1 200 OK link: https://latest.datasette.io/fixtures.json; rel="alternate"; type="application/json+datasette" cache-control: max-age=5 referrer-policy: no-referrer access-control-allow-origin: * access-control-allow-headers: Authorization access-control-expose-headers: Link content-type: text/html; charset=utf-8 x-databases: _memory, _internal, fixtures, extra_database Date: Wed, 02 Feb 2022 21:55:49 GMT Server: Google Frontend Transfer-Encoding: chunked % curl -I 'https://latest.datasette.io/' |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1624/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1386593843 | I_kwDOCGYnMM5Spb4z | 494 | Document how to use Just | simonw 9599 | closed | 0 | 2 | 2022-09-26T19:25:12Z | 2022-09-26T19:32:36Z | 2022-09-26T19:26:39Z | OWNER | I'm using https://github.com/simonw/sqlite-utils/blob/afbd2b2cba45cccb305c3d4638d18db4dd3d4bbd/Justfile#L1-L24 |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/494/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1363765916 | I_kwDOCGYnMM5RSWqc | 483 | `sqlite-utils install` command | simonw 9599 | closed | 0 | 2 | 2022-09-06T20:13:55Z | 2022-09-26T19:04:43Z | 2022-09-26T18:57:15Z | OWNER | With the addition of In addition to the existing This isn't easy if you installed the tool with Datasette solved this problem with the
|
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/483/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
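The body of #483 above notes that Datasette solved the same problem with `datasette install`, which runs pip inside the tool's own environment. A minimal sketch of that general approach (not the actual implementation; the plugin name is hypothetical):
```python
import subprocess
import sys


def install_packages(packages):
    # Run pip with the same interpreter the tool itself runs under,
    # so packages land in the correct virtual environment.
    subprocess.run(
        [sys.executable, "-m", "pip", "install"] + list(packages),
        check=True,
    )


# install_packages(["sqlite-utils-plugin-example"])  # hypothetical plugin name
```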
520508502 | MDU6SXNzdWU1MjA1MDg1MDI= | 31 | "friends" command (similar to "followers") | simonw 9599 | closed | 0 | 2 | 2019-11-09T20:20:20Z | 2022-09-20T05:05:03Z | 2020-02-07T07:03:28Z | MEMBER | Current list of commands:
|
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/31/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1378640768 | I_kwDOBm6k_c5SLGOA | 1816 | Validate settings.json on startup in configuration directory mode | simonw 9599 | closed | 0 | 2 | 2022-09-19T23:35:18Z | 2022-09-20T01:15:48Z | 2022-09-20T01:15:48Z | OWNER |
Originally posted by @simonw in https://github.com/simonw/datasette/issues/1814#issuecomment-1251677554 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1816/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1378495690 | I_kwDOBm6k_c5SKizK | 1814 | Static files not served | frafra 4068 | closed | 0 | 2 | 2022-09-19T20:38:17Z | 2022-09-19T23:35:06Z | 2022-09-19T23:34:30Z | NONE | Folder structure:
File Using datasette revision d0737e4de51ce178e556fc011ccb8cc46bbb6359. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1814/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1367835380 | I_kwDOCGYnMM5Rh4L0 | 487 | Specify foreign key against compound key in other table | ryanfox 540968 | closed | 0 | 2 | 2022-09-09T13:32:09Z | 2022-09-11T04:00:44Z | 2022-09-11T04:00:44Z | NONE | When inserting rows via the library, is it possible to specify a foreign key to a compound primary key? For example, suppose I create a table:
And I want to add related data in another table:
Is it possible to specify a value for SQLite does support it: https://www.sqlite.org/foreignkeys.html#fk_composite |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/487/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
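SQLite itself supports the composite foreign key asked about in #487 above (see the linked documentation); a sketch in raw SQL, with invented table names:
```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE flights (
        origin TEXT,
        departure_time TEXT,
        flight_number TEXT,
        PRIMARY KEY (origin, departure_time)
    );
    CREATE TABLE bookings (
        id INTEGER PRIMARY KEY,
        origin TEXT,
        departure_time TEXT,
        FOREIGN KEY (origin, departure_time)
            REFERENCES flights (origin, departure_time)
    );
    """
)
```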
1353441389 | I_kwDOCGYnMM5Qq-Bt | 477 | Conda Forge | thewchan 49702524 | closed | 0 | 2 | 2022-08-28T19:03:08Z | 2022-09-07T03:46:55Z | 2022-09-07T03:46:55Z | NONE | Hello! I have successfully put this package onto Conda Forge, and I am extending the invitation for the owner/maintainers of this package to be maintainers on Conda Forge as well. Let me know if you are interested! Thanks. https://github.com/conda-forge/sqlite-utils-feedstock |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/477/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1353196970 | I_kwDOCGYnMM5QqCWq | 476 | Release notes for 3.29 | simonw 9599 | closed | 0 | 3.29 8355157 | 2 | 2022-08-27T23:21:21Z | 2022-08-28T04:07:15Z | 2022-08-28T04:07:03Z | OWNER | sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/476/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1352946135 | I_kwDOCGYnMM5QpFHX | 472 | Reuse the locals/globals fix from --functions for other code accepting options | simonw 9599 | closed | 0 | 3.29 8355157 | 2 | 2022-08-27T05:12:05Z | 2022-08-27T05:20:12Z | 2022-08-27T05:20:12Z | OWNER | I figured out a workaround for the ugly |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/472/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
1339663518 | I_kwDOBm6k_c5P2aSe | 1784 | Include "entrypoint" option on `--load-extension`? | asg017 15178711 | closed | 0 | 2 | 2022-08-16T00:22:57Z | 2022-08-23T18:34:31Z | 2022-08-23T18:34:31Z | CONTRIBUTOR | Problem: SQLite extensions have the option to define multiple "entrypoints" in each loadable extension. For example, the upcoming version of (Similar multiple entrypoints will also be added for sqlite-http). The Proposal: I want there to be a new command line option of the Then, under the hood, this line of code: Would look something like this:
One potential problem: For backward compatibility, I'm not sure if Click allows cli flags to have variable number of options ("arity"). So I guess it could also use a
Or maybe even a new flag name?
Personally I prefer the
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1784/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1339444565 | I_kwDOBm6k_c5P1k1V | 1783 | Better guidance as to what to do after you've installed Datasette | simonw 9599 | open | 0 | 2 | 2022-08-15T20:11:06Z | 2022-08-15T20:14:01Z | OWNER | Feedback from Discord:
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1783/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1338278056 | I_kwDOBm6k_c5PxICo | 1782 | Release notes for Datasette 0.62 | simonw 9599 | closed | 0 | Datasette 0.62 8303187 | 2 | 2022-08-14T15:26:45Z | 2022-08-14T17:40:45Z | 2022-08-14T17:32:54Z | OWNER | I've written a lot of these already for the alphas: |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1782/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
1223527226 | I_kwDOBm6k_c5I7Ys6 | 1738 | "Cannot use _sort and _sort_desc at the same time" | simonw 9599 | closed | 0 | Datasette 0.62 8303187 | 2 | 2022-05-03T01:06:24Z | 2022-08-14T16:13:55Z | 2022-08-14T16:13:55Z | OWNER | Triggered this error while playing with the sort desc checkbox and the apply button that are only visible on this page at mobile screen width: https://latest.datasette.io/fixtures/compound_three_primary_keys?_sort_desc=pk1 Navigate to that page (with the browser narrow enough to show the box), un-check the box and click Apply: Also notable: I managed to get to a page with |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1738/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
1298531653 | I_kwDOCGYnMM5NZgVF | 451 | Make sqlite_utils.utils.chunks a documented function | simonw 9599 | closed | 0 | 2 | 2022-07-08T06:01:04Z | 2022-07-15T22:09:34Z | 2022-07-15T21:59:33Z | OWNER | I want to use it in another project: https://github.com/simonw/sqlite-utils/blob/8a9fe6498faf783a1fdeb1793e661ad194a05267/sqlite_utils/utils.py#L471-L474 |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/451/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
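Once documented, the helper in #451 above can be used roughly as in this sketch (the signature is assumed from the issue's link into utils.py):
```python
from sqlite_utils.utils import chunks

# Split an iterable into chunks of up to three items each
for chunk in chunks(range(8), 3):
    print(list(chunk))
# [0, 1, 2]
# [3, 4, 5]
# [6, 7]
```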
1277328147 | I_kwDOCGYnMM5MInsT | 446 | Use Just to automate running tests and linters locally | simonw 9599 | closed | 0 | 2 | 2022-06-20T19:51:09Z | 2022-06-21T19:28:35Z | 2022-06-20T19:54:50Z | OWNER | I keep committing code that fails additional tests like |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/446/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1269998342 | I_kwDOCGYnMM5LsqMG | 443 | Make `utils.rows_from_file()` a documented API | simonw 9599 | closed | 0 | 2 | 2022-06-13T21:53:24Z | 2022-06-20T19:49:37Z | 2022-06-14T20:12:46Z | OWNER |
Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/440#issuecomment-1154385916 |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/443/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1224112817 | I_kwDOCGYnMM5I9nqx | 430 | Document how to use `PRAGMA temp_store` to avoid errors when running VACUUM against huge databases | rayvoelker 9308268 | open | 0 | 2 | 2022-05-03T13:33:58Z | 2022-06-14T23:26:37Z | NONE | I'm trying to figure out a way to get the Here's the bit that's causing the error, and the resulting error output: ```python combine these columns into 1 table "bib_properties" :best_titlebib_level_codemat_typematerial_codebest_authordb["circ_trans"].extract( ["best_title", "bib_level_code", "mat_type", "material_code", "best_author"], table="bib_properties", fk_column="bib_properties_id" ) db["circ_trans"].extract( ["call_number"], table="call_number", fk_column="call_number_id", rename={"call_number": "value"} ) ``` ```pythonOperationalError Traceback (most recent call last) Input In [17], in <cell line: 7>() 1 # combine these columns into 1 table "bib_properties" : 2 # best_title 3 # bib_level_code 4 # mat_type 5 # material_code 6 # best_author ----> 7 db["circ_trans"].extract( 8 ["best_title", "bib_level_code", "mat_type", "material_code", "best_author"], 9 table="bib_properties", 10 fk_column="bib_properties_id" 11 ) 13 db["circ_trans"].extract( 14 ["call_number"], 15 table="call_number", 16 fk_column="call_number_id", 17 rename={"call_number": "value"} 18 ) File ~/jupyter/venv/lib/python3.10/site-packages/sqlite_utils/db.py:1764, in Table.extract(self, columns, table, fk_column, rename) 1761 column_order.append(c.name) 1763 # Drop the unnecessary columns and rename lookup column -> 1764 self.transform( 1765 drop=set(columns), 1766 rename={magic_lookup_column: fk_column}, 1767 column_order=column_order, 1768 ) 1770 # And add the foreign key constraint 1771 self.add_foreign_key(fk_column, table, "id") File ~/jupyter/venv/lib/python3.10/site-packages/sqlite_utils/db.py:1526, in Table.transform(self, types, rename, drop, pk, not_null, defaults, drop_foreign_keys, column_order) 1524 with self.db.conn: 1525 for sql in sqls: -> 1526 self.db.execute(sql) 1527 # Run the foreign_key_check before we commit 1528 if pragma_foreign_keys_was_on: File ~/jupyter/venv/lib/python3.10/site-packages/sqlite_utils/db.py:465, in Database.execute(self, sql, parameters) 463 return self.conn.execute(sql, parameters) 464 else: --> 465 return self.conn.execute(sql) OperationalError: database or disk is full ``` This database is about 17G in total size, so I'm assuming the error is coming from the vacuum ... where i'm assuming it's maybe trying to do the temp storage in a location that doesn't have sufficient room. The disk space is more than ample on the host in question (1.8T is free in the directory where the sqlite db resides) The I'm trying to think if there's a way to set the ```python SET the temp file store to be a file ...print(db.execute('PRAGMA temp_store').fetchall()) print(db.execute('PRAGMA temp_store=FILE').fetchall()) print(db.execute('PRAGMA temp_store').fetchall()) the users home directory ...print(db.execute("PRAGMA temp_store_directory='/home/plchuser/'").fetchall()) print(db.execute("PRAGMA sqlite3_temp_directory='/home/plchuser/'").fetchall()) print(db.execute("PRAGMA temp_store_directory").fetchall())
print(db.execute("PRAGMA sqlite3_temp_directory").fetchall())
Here's the docs on the Temporary File Storage Locations https://www.sqlite.org/tempfiles.html |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/430/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1269886084 | I_kwDOCGYnMM5LsOyE | 442 | `maximize_csv_field_size_limit()` utility function | simonw 9599 | closed | 0 | 2 | 2022-06-13T19:54:54Z | 2022-06-14T21:55:15Z | 2022-06-14T21:31:49Z | OWNER | This code here runs only if I found myself needing the same fix in another library: It should be a documented utility function. |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/442/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
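The usual idiom behind a helper like the one named in #442 above is to push `csv.field_size_limit` as high as the platform allows; a sketch of that idiom (not necessarily the library's exact code):
```python
import csv
import sys


def maximize_csv_field_size_limit():
    # Try sys.maxsize first; some platforms overflow, so halve until accepted.
    limit = sys.maxsize
    while True:
        try:
            csv.field_size_limit(limit)
            return limit
        except OverflowError:
            limit = int(limit / 2)


maximize_csv_field_size_limit()
```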
1250161887 | I_kwDOCGYnMM5Kg_Tf | 438 | illegal UTF-16 surrogate | frafra 4068 | closed | 0 | 2 | 2022-05-26T22:49:52Z | 2022-05-27T08:21:53Z | 2022-05-27T08:21:53Z | NONE | I am trying to insert 

```
sqlite-utils insert --csv --delimiter ";" --encoding="utf-16-le" --pk "Id" csv fremmedart test.db
[------------------------------------]    0%
Error: 'utf-16-le' codec can't decode bytes in position 98-99: illegal UTF-16 surrogate

The input you provided uses a character encoding other than utf-8.

You can fix this by passing the --encoding= option with the encoding of the file.

If you do not know the encoding, running 'file filename.csv' may tell you.

It's often worth trying: --encoding=latin-1
```

I tried to convert the file using 

```
sqlite-utils insert --csv --delimiter ";" --encoding=utf-8 --pk "Id" csv_utf8 fremmedart test.db
[------------------------------------]    0%
Error: 'utf-8' codec can't decode byte 0xd9 in position 99: invalid continuation byte

The input you provided uses a character encoding other than utf-8.

You can fix this by passing the --encoding= option with the encoding of the file.

If you do not know the encoding, running 'file filename.csv' may tell you.

It's often worth trying: --encoding=latin-1
```

I have no issues reading such file using this Python code:
|
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/438/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
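One way around the unpaired surrogate in issue 438 above is to re-encode the file with a lossy error handler before importing it; a sketch with hypothetical filenames:

```python
# errors="replace" substitutes the invalid surrogate instead of raising,
# so the re-encoded copy can then be imported with --encoding=utf-8.
with open("fremmedart.csv", encoding="utf-16-le", errors="replace") as src:
    text = src.read()

with open("fremmedart-utf8.csv", "w", encoding="utf-8") as dst:
    dst.write(text)
```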
1243704847 | I_kwDOCGYnMM5KIW4P | 435 | Switch to Furo documentation theme | simonw 9599 | closed | 0 | 2 | 2022-05-20T21:46:39Z | 2022-05-20T21:56:10Z | 2022-05-20T21:54:43Z | OWNER | sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/435/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||||
1243517592 | I_kwDOBm6k_c5KHpKY | 1748 | Add copy buttons next to code examples in the documentation | simonw 9599 | closed | 0 | 2 | 2022-05-20T19:09:00Z | 2022-05-20T19:15:00Z | 2022-05-20T19:11:32Z | OWNER | Similar to the ones in |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1748/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
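The reference in issue 1748 above is truncated, but copy buttons in Sphinx documentation usually come from the sphinx-copybutton extension; a minimal sketch of the relevant `docs/conf.py` lines, assuming that extension is the one adopted:

```python
# docs/conf.py (excerpt)
extensions = [
    "sphinx.ext.autodoc",
    "sphinx_copybutton",  # adds a copy button to every code block
]

# Optional: strip a leading "$ " prompt when copying shell examples.
copybutton_prompt_text = "$ "
```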
1221849746 | I_kwDOBm6k_c5I0_KS | 1732 | Custom page variables aren't decoded | tannewt 52649 | open | 0 | 2 | 2022-04-30T14:55:46Z | 2022-05-03T01:50:45Z | NONE | I have a page Datasette should unescape the url component before passing them into the template. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1732/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
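For reference, a minimal sketch of the decoding step issue 1732 above asks for, using only the standard library; the captured value here is hypothetical:

```python
from urllib.parse import unquote

raw_component = "hello%20world%2F42"  # as captured from the URL path
print(unquote(raw_component))  # -> "hello world/42"
```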
1129052172 | I_kwDOBm6k_c5DS_gM | 1633 | base_url or prefix does not work with _exact match | henrikek 6613091 | open | 0 | 2 | 2022-02-09T21:45:07Z | 2022-04-28T09:12:56Z | NONE | When I hit the "Apply" button to search with the "_exact" column syntax, the URL prefix is removed from the url. And the result is:  If I add the marked row to url_builder.py it seems to work: |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1633/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1182227211 | I_kwDOBm6k_c5Gd1sL | 1692 | [plugins][feature request]: Support additional script tag attributes when loading custom JS | hydrosquall 9020979 | open | 0 | 2 | 2022-03-27T01:16:03Z | 2022-03-30T06:14:51Z | CONTRIBUTOR | Motivation
To be able to support legacy browsers without slowing down users with modern browsers, I would like to be able to set additional HTML attributes on the tag fallback script,

```html
<script type="module" src="/index.my-es-module-bundle.js"></script>
<script src="/index.my-legacy-fallback-bundle.js" nomodule="" defer></script>
```

Proposal

To achieve this, I propose additional optional properties to the API accepted by the  Under this API, I'd write something like this to get the above HTML rendered in Datasette.
Resources
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1692/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
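A sketch of what the proposal in issue 1692 above could look like as an `extra_js_urls` plugin hook; `{"url": ..., "module": True}` is an existing Datasette feature, while the `nomodule` and `defer` keys are the proposed (hypothetical) additions:

```python
from datasette import hookimpl

@hookimpl
def extra_js_urls():
    return [
        # Already supported: renders <script type="module" src="...">
        {"url": "/index.my-es-module-bundle.js", "module": True},
        # Proposed in this issue, not an existing API:
        {"url": "/index.my-legacy-fallback-bundle.js", "nomodule": True, "defer": True},
    ]
```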
1185868354 | I_kwDOBm6k_c5GrupC | 1695 | Option to un-filter facet not shown for `?col__exact=value` | simonw 9599 | open | 0 | 2 | 2022-03-30T04:44:02Z | 2022-03-30T04:46:18Z | OWNER | Spotted this on a page with With |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1695/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1182141761 | I_kwDOBm6k_c5Gdg1B | 1690 | Idea: `datasette.set_actor_cookie(response, actor)` | simonw 9599 | open | 0 | 2 | 2022-03-26T22:41:52Z | 2022-03-26T22:43:00Z | OWNER | I just wrote this code in a plugin and it felt like it could benefit from an abstraction: https://github.com/simonw/datasette-auth0/blob/152e6eb21e96e9b73bd9c205f9749a1297d0ef0b/datasette_auth0/__init__.py#L79-L92
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1690/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
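A sketch of the abstraction suggested in issue 1690 above, assuming the documented `ds_actor` cookie format (a signed `{"a": actor}` dictionary using the `"actor"` namespace):

```python
def set_actor_cookie(datasette, response, actor):
    # Sign {"a": actor} with the "actor" namespace and store it in the
    # ds_actor cookie, which is where Datasette reads the actor back from.
    response.set_cookie("ds_actor", datasette.sign({"a": actor}, "actor"))
```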
1177101697 | I_kwDOBm6k_c5GKSWB | 1681 | Potential bug in numeric handling where_clause for filters | simonw 9599 | open | 0 | 2 | 2022-03-22T17:43:50Z | 2022-03-22T17:49:09Z | OWNER |
Originally posted by @simonw in https://github.com/simonw/datasette/issues/1671#issuecomment-1075432283 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1681/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1171599874 | I_kwDOCGYnMM5F1TIC | 415 | Convert with `--multi` and `--dry-run` flag does not work | dotcs 3976183 | closed | 0 | 2 | 2022-03-16T21:59:46Z | 2022-03-21T04:18:24Z | 2022-03-21T04:18:24Z | NONE | It's not possible to combine Let's first create a simple database from JSON string
and then try to convert the "foo" column with a static value "bar" (see docs Converting a column into multiple columns)
But without the
|
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/415/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
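For reference, the Python-API equivalent of the `--multi` conversion from issue 415 above (the library API has no direct `--dry-run` counterpart); a sketch assuming the same single-row `demo` table:

```python
from sqlite_utils import Database

db = Database("demo.db")
db["demo"].convert(
    "foo",
    lambda value: {"foo_upper": value.upper(), "foo_length": len(value)},
    multi=True,  # the returned dict becomes multiple new columns
)
```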
910088936 | MDU6SXNzdWU5MTAwODg5MzY= | 1355 | datasette --get should efficiently handle streaming CSV | simonw 9599 | open | 0 | 2 | 2021-06-03T04:40:40Z | 2022-03-20T22:38:53Z | OWNER | It would be great if you could use Current implementation looks like it loads the entire result into memory first: https://github.com/simonw/datasette/blob/f78ebdc04537a6102316d6dbbf6c887565806078/datasette/cli.py#L546-L552 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1355/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1174404647 | I_kwDOBm6k_c5F__4n | 1669 | Release 0.61 alpha | simonw 9599 | closed | 0 | 2 | 2022-03-20T00:35:35Z | 2022-03-20T01:24:36Z | 2022-03-20T01:24:36Z | OWNER |
Originally posted by @simonw in https://github.com/simonw/datasette/issues/1668#issuecomment-1073136896 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1669/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1170554975 | I_kwDOBm6k_c5FxUBf | 1663 | Document the internals that were used in datasette-hashed-urls | simonw 9599 | closed | 0 | Datasette 1.0 3268330 | 2 | 2022-03-16T05:17:08Z | 2022-03-19T04:04:50Z | 2022-03-17T21:32:38Z | OWNER | The https://github.com/simonw/datasette-hashed-urls used a couple of currently undocumented features:
- |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1663/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
1122416919 | I_kwDOBm6k_c5C5rkX | 1623 | /-/patterns returns link: alternate JSON header to 404 | simonw 9599 | closed | 0 | Datasette 1.0 3268330 | 2 | 2022-02-02T21:42:49Z | 2022-03-19T04:04:49Z | 2022-02-02T21:48:56Z | OWNER | Bug from: - #1620
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1623/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
626593402 | MDU6SXNzdWU2MjY1OTM0MDI= | 780 | Internals documentation for datasette.metadata() method | simonw 9599 | open | 0 | Datasette 1.0 3268330 | 2 | 2020-05-28T15:14:22Z | 2022-03-15T20:50:34Z | OWNER | datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/780/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1088816961 | I_kwDODEm0Qs5A5gdB | 62 | KeyError: 'created_at' for private accounts? | swyxio 6764957 | closed | 0 | 2 | 2021-12-26T17:51:51Z | 2022-03-12T02:36:32Z | 2022-02-24T18:10:18Z | NONE | hey Simon! i was running 

![image](https://user-images.githubusercontent.com/6764957/147416165-46b69c30-100a-406f-8534-8612b75547ae.png)

```bash
Traceback (most recent call last):
  File "/Users/swyx/Work/datasette/env/bin/twitter-to-sqlite", line 8, in <module>
    sys.exit(cli())
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py", line 1128, in __call__
    return self.main(*args, **kwargs)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py", line 1053, in main
    rv = self.invoke(ctx)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py", line 1659, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py", line 1395, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/click/core.py", line 754, in invoke
    return __callback(*args, **kwargs)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/cli.py", line 291, in user_timeline
    profile = utils.get_profile(db, session, **kwargs)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py", line 133, in get_profile
    save_users(db, [profile])
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py", line 453, in save_users
    transform_user(user)
  File "/Users/swyx/Work/datasette/env/lib/python3.9/site-packages/twitter_to_sqlite/utils.py", line 285, in transform_user
    user["created_at"] = parser.parse(user["created_at"])
KeyError: 'created_at'
```

this looks awfully like #37 but it can't be, because i'm authed into my account and obviously i have perms to read my own account. wonder if there's any diagnostic methods i should apply here? just filing an issue for others to find while i investigate. |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1161937073 | I_kwDOBm6k_c5FQcCx | 1653 | Mechanism to default a table to sorting by multiple columns | simonw 9599 | open | 0 | 2 | 2022-03-07T21:20:11Z | 2022-03-07T21:23:39Z | OWNER | Discussed in https://github.com/simonw/datasette/discussions/1652
<sup>Originally posted by **zaneselvans** March 7, 2022</sup>
It's easy to tell datasette to sort tables using a single column, as [described in the docs](https://docs.datasette.io/en/stable/metadata.html#setting-a-default-sort-order):
```yaml
databases:
ferc1:
tables:
f1_edcfu_epda:
sort: created_time
```
But is there some way to tell it to sort using a composite key, like you would in an `ORDER BY` clause instead? For example, the way it's being done **[in this query](https://data.catalyst.coop/ferc1?sql=select%0D%0A++rowid%2C%0D%0A++respondent_id%2C%0D%0A++report_year%2C%0D%0A++spplmnt_num%2C%0D%0A++row_number%2C%0D%0A++row_seq%2C%0D%0A++row_prvlg%2C%0D%0A++acct_num%2C%0D%0A++depr_plnt_base%2C%0D%0A++est_avg_srvce_lf%2C%0D%0A++net_salvage%2C%0D%0A++apply_depr_rate%2C%0D%0A++mrtlty_crv_typ%2C%0D%0A++avg_remaining_lf%2C%0D%0A++report_prd%0D%0Afrom%0D%0A++f1_edcfu_epda%0D%0Awhere%0D%0A++respondent_id+%3D+210%0D%0A++AND+report_year+%3D+2020%0D%0Aorder+by%0D%0A++report_year%2C+report_prd%2C+respondent_id%2C+spplmnt_num%2C+row_number%0D%0Alimit%0D%0A++1000)** on our Datasette?
```sql
SELECT
respondent_id,
report_year,
spplmnt_num,
row_number,
row_seq,
row_prvlg,
acct_num,
depr_plnt_base,
est_avg_srvce_lf,
net_salvage,
apply_depr_rate,
mrtlty_crv_typ,
avg_remaining_lf,
report_prd
FROM
f1_edcfu_epda
WHERE
respondent_id = 210
AND report_year = 2020
ORDER BY
report_year, report_prd, respondent_id, spplmnt_num, row_number
LIMIT
1000
```
The problem here is that by default it's using `rowid` (the SQLite assigned autoincrementing integer key) to order the records, but the table **should** have a natural composite primary key, but the original database that this data is being migrated from doesn't enforce unique primary keys, so there are dupes, and we don't want to drop those rows, and the records are somehow getting jumbled in the database (the `rowid` ordering isn't lined up with the expected ordering based on the composite primary key, though it's close) and this jumbling is confusing to users that expect to see the data ordered based on the natural primary key.
I've tried setting the `sort` metadata parameter to a list of column names, a tuple of column names, a quoted string of comma-separated column names, a quoted string of a tuple of column names...
```yaml
databases:
ferc1:
tables:
f1_edcfu_epda:
sort: "(report_year, report_prd, respondent_id, spplmnt_num, row_number)"
```
and they all give me server errors like:
```
Cannot sort table by (report_year, report_prd, respondent_id, spplmnt_num, row_number)
``` |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1653/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
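Composite `sort:` metadata is not supported (hence the errors in issue 1653 above), but one workaround is a canned query that bakes in the `ORDER BY`; a sketch that writes such a `metadata.json`, with the query name chosen here only as an example:

```python
import json

metadata = {
    "databases": {
        "ferc1": {
            "queries": {
                "f1_edcfu_epda_sorted": {  # hypothetical query name
                    "sql": (
                        "select * from f1_edcfu_epda "
                        "order by report_year, report_prd, respondent_id, "
                        "spplmnt_num, row_number"
                    )
                }
            }
        }
    }
}

with open("metadata.json", "w") as fp:
    json.dump(metadata, fp, indent=2)
```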
1148725876 | I_kwDOBm6k_c5EeCp0 | 1640 | Support static assets where file length may change, e.g. logs | broccolihighkicks 57859326 | open | 0 | 2 | 2022-02-24T00:34:42Z | 2022-03-05T01:19:25Z | NONE | This is a bit of an oxymoron. I am serving a log.txt file for a background process using the Datasette --static CLI. This is useful as I can observe a background process from the web UI to see any errors that occur (instead of spelunking the logs via docker exec/ssh etc). I get this error, which I think is because Datasette assumes that the size of the content does not change (but appending new log lines means the content length changes).
Thanks, I am finding Datasette very useful. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1640/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1128139375 | I_kwDOCGYnMM5DPgpv | 405 | `Database(memory_name="name")` constructor argument | simonw 9599 | closed | 0 | 2 | 2022-02-09T07:15:03Z | 2022-02-16T01:23:16Z | 2022-02-16T01:23:16Z | OWNER | SQLite in-memory databases can be named, in which case multiple connections can be opened to a shared in-memory database running within the same process. Datasette supports this - sqlite-utils could support it too. |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/405/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
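Issue 405 above is closed as completed, so the `memory_name=` argument appears to have shipped; a sketch of the resulting usage, assuming the API matches the issue title, where two `Database` objects in the same process share one named in-memory database:

```python
from sqlite_utils import Database

# Both objects connect to the same shared, named in-memory database.
db1 = Database(memory_name="shared")
db2 = Database(memory_name="shared")

db1["dogs"].insert({"name": "Cleo"})
print(list(db2["dogs"].rows))  # -> [{'name': 'Cleo'}]
```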
1128120451 | I_kwDOCGYnMM5DPcCD | 404 | Add example of `--convert` to the help for `sqlite-utils insert` | simonw 9599 | closed | 0 | 2 | 2022-02-09T06:49:09Z | 2022-02-09T06:56:35Z | 2022-02-09T06:55:16Z | OWNER | https://sqlite-utils.datasette.io/en/3.23/cli-reference.html#insert would be more useful if it included an example of I can maybe use an example from https://simonwillison.net/2022/Jan/11/sqlite-utils/ |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/404/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1109783030 | I_kwDOBm6k_c5CJfH2 | 1607 | More detailed information about installed SpatiaLite version | simonw 9599 | closed | 0 | Datasette 1.0 3268330 | 2 | 2022-01-20T21:28:03Z | 2022-02-09T06:42:02Z | 2022-02-09T06:32:28Z | OWNER | https://www.gaia-gis.it/gaia-sins/spatialite-sql-5.0.0.html#version has a whole bunch of interesting functions for things like These could be shown on the |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1607/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed |
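A sketch of querying those SpatiaLite version functions directly, related to issue 1607 above; it assumes a system `mod_spatialite` module and a Python build that allows loading extensions, and the exact set of `*_version()` functions depends on the SpatiaLite build:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.enable_load_extension(True)
conn.load_extension("mod_spatialite")  # module name/path may differ per platform

for fn in ("spatialite_version()", "geos_version()", "proj4_version()", "freexl_version()"):
    try:
        print(fn, conn.execute("select {}".format(fn)).fetchone()[0])
    except sqlite3.OperationalError:
        print(fn, "not available in this build")
```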
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT,
   [active_lock_reason] TEXT,
   [performed_via_github_app] TEXT,
   [reactions] TEXT,
   [draft] INTEGER,
   [state_reason] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);