issues
418 rows where comments = 0 and type = "issue" sorted by updated_at descending
repo 16
- datasette 258
- sqlite-utils 74
- github-to-sqlite 20
- twitter-to-sqlite 17
- dogsheep-beta 15
- swarm-to-sqlite 5
- dogsheep-photos 5
- apple-notes-to-sqlite 5
- google-takeout-to-sqlite 4
- healthkit-to-sqlite 3
- pocket-to-sqlite 3
- evernote-to-sqlite 3
- inaturalist-to-sqlite 2
- dogsheep.github.io 2
- genome-to-sqlite 1
- hacker-news-to-sqlite 1
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at ▲ | closed_at | author_association | pull_request | body | repo | type | active_lock_reason | performed_via_github_app | reactions | draft | state_reason |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
2023057255 | I_kwDOBm6k_c54lWdn | 2212 | Can't filter with numbers | fzakaria 605070 | open | 0 | 0 | 2023-12-04T05:26:29Z | 2023-12-04T05:26:29Z | NONE | I have a schema that uses numbers for a column (actually it's a boolean 1 or 0, but SQLite doesn't have a boolean type). I can't seem to get the facet to work, or even filter on this column. My guess is that Datasette is "stringifying" the number and it's not matching? Example: https://debian-sqlelf.fly.dev/debian/elf_symbols?_sort_desc=name&_facet=exported&exported=0 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2212/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
2019811176 | I_kwDOBm6k_c54Y99o | 2211 | Unreachable exception handlers for `sqlite3.OperationalError` | mattparmett 1214074 | open | 0 | 0 | 2023-12-01T00:50:22Z | 2023-12-01T00:50:22Z | NONE | There are several places where two consecutive `except` handlers both catch `sqlite3.OperationalError`. Because the exception will be caught by the first handler, the logic in the second handler is unreachable and will never be executed. If this is intended behavior, the second handler can be removed. If this is not intended, and the second handler should be the one that catches this exception, then the handlers should be reordered or the first one narrowed. This issue was found via a CodeQL query on the repository, and I've listed the occurrences found by the query below. There may be other instances of this issue in the code that were not surfaced by the query. I'd be happy to share the query if others would like to view or run it. Instances: https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L266-L270 https://github.com/simonw/datasette/blob/main/datasette/views/base.py#L452-L456 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2211/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
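Issue 2211 describes a classic dead-code pattern. A minimal standalone sketch (illustrative code, not the actual Datasette source): Python dispatches to the first `except` clause that matches, so a second clause naming the same exception can never run.

```python
import sqlite3

def run_query(conn, sql):
    try:
        return conn.execute(sql).fetchall()
    except sqlite3.OperationalError:
        # Every OperationalError is caught here first...
        return None
    except sqlite3.OperationalError as exc:
        # ...so this handler is unreachable dead code.
        raise RuntimeError("query failed") from exc
```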
1994845152 | I_kwDOBm6k_c525uvg | 2207 | ModuleNotFoundError: No module named 'click_default_group' | honzajavorek 283441 | open | 0 | 0 | 2023-11-15T14:04:32Z | 2023-11-15T14:04:32Z | NONE | No matter what I do, I'm getting this error:
I have datasette in my dependencies like this:
I had the latest regular version (not pre-release) there originally, but the result was the same:
Full pyproject.toml is at https://github.com/honzajavorek/junior.guru/ Previously datasette worked for me, but I guess something got upgraded and now I can't even launch it. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2207/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1978603203 | I_kwDOCGYnMM517xbD | 602 | `sqlite-utils transform` removes the `AUTOINCREMENT` keyword | ArsTapatun 4472046 | open | 0 | 0 | 2023-11-06T08:48:43Z | 2023-11-06T08:48:43Z | NONE | Context: We ran into this bug randomly, noticing that the `rowid`s of deleted rows were being reused - exactly what `AUTOINCREMENT` is supposed to prevent. Reproducible example - original database: ```sql $ sqlite3 test.db << EOF CREATE TABLE mytable ( col1 INTEGER PRIMARY KEY AUTOINCREMENT, col2 TEXT NOT NULL ) EOF $ sqlite3 test.db ".schema mytable" CREATE TABLE mytable ( col1 INTEGER PRIMARY KEY AUTOINCREMENT, col2 TEXT NOT NULL ); ``` Modified database after `sqlite-utils transform`: ```sql $ sqlite-utils transform test.db mytable --rename col2 renamedcol2 $ sqlite3 test.db "SELECT sql FROM sqlite_master WHERE name = 'mytable';" CREATE TABLE IF NOT EXISTS "mytable" ( [col1] INTEGER PRIMARY KEY, [renamedcol2] TEXT NOT NULL ); ``` |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/602/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1978022687 | I_kwDOBm6k_c515jsf | 2204 | request.post_body() can only be called once | simonw 9599 | open | 0 | 0 | 2023-11-05T23:22:03Z | 2023-11-05T23:23:23Z | OWNER | This code consumes the ASGI messages, which means if you try to call it a second time you won't be able to get at the body. This is efficient - we don't end up with a second copy of the body in memory. Potential solution: set the body aside on the request object the first time it is read, so later calls return the cached copy. Potential optimization: only do this for bodies that are shorter than a certain threshold - maybe 1MB - and raise an exception if you attempt to call the method a second time on anything larger. I'm a bit nervous about that option though, since it could result in errors that don't show up in testing but do show up in production. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2204/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
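A minimal sketch of the caching idea floated in issue 2204 above (a hypothetical helper, not Datasette's actual `Request` class): drain the ASGI receive channel once, keep the bytes, and serve every later call from the cache.

```python
class CachingRequest:
    def __init__(self, receive):
        self._receive = receive
        self._body = None  # populated on the first read

    async def post_body(self):
        if self._body is None:
            chunks = []
            more_body = True
            while more_body:
                # Each ASGI message carries a body chunk plus a continuation flag
                message = await self._receive()
                chunks.append(message.get("body", b""))
                more_body = message.get("more_body", False)
            self._body = b"".join(chunks)
        return self._body  # second and later calls hit the cache
```

The trade-off the issue worries about is exactly this cache: it keeps the whole body in memory, which is why a size threshold is suggested.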
1977726056 | I_kwDOBm6k_c514bRo | 2203 | custom plugin not seen as sql function | LyzardKing 7113541 | open | 0 | 0 | 2023-11-05T10:30:19Z | 2023-11-05T10:30:19Z | NONE | Hi, I'm not sure if this is the right repo for this issue. I'm using Datasette with the parquet plugin (to read a DuckDB file) and the jellyfish plugin. Both work perfectly. Now I need to create a simple plugin that uses the Python rouge package and returns a similarity score (similar to how the jellyfish plugin works).
If I create a custom plugin, even the example hello_world one, copied directly from the tutorial, I get the following error:
Since the jellyfish plugin doesn't do anything more complex, I'm wondering if there is some other kind of issue with my setup. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2203/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
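For comparison with issue 2203 above: a custom SQL function plugin for Datasette is roughly this one file, using the documented `prepare_connection` hook (`my_score` is a stand-in for the reporter's rouge-based scorer):

```python
from datasette import hookimpl

def my_score(a, b):
    # Placeholder metric; a real plugin would call the rouge package here
    return 1.0 if a == b else 0.0

@hookimpl
def prepare_connection(conn):
    # Registers a 2-argument SQL function usable as my_score(x, y)
    conn.create_function("my_score", 2, my_score)
```

Saved into the directory passed to `--plugins-dir`, this should make `select my_score('a', 'b')` work; if even the tutorial's hello_world version errors, the plugin is probably not being loaded at all.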
1977155641 | I_kwDOCGYnMM512QA5 | 601 | Move plugin directory into documentation | simonw 9599 | open | 0 | 0 | 2023-11-04T04:07:52Z | 2023-11-04T04:07:52Z | OWNER | https://github.com/simonw/sqlite-utils-plugins should be in the official documentation. I can use the same pattern as https://llm.datasette.io/en/stable/plugins/directory.html |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/601/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1553425465 | I_kwDOCGYnMM5cl2Q5 | 522 | Add COLUMN_TYPE_MAPPING for timedelta | maport 81377 | closed | 0 | 0 | 2023-01-23T16:49:54Z | 2023-11-04T00:49:51Z | 2023-11-04T00:49:51Z | NONE | Currently, trying to create a column with the Python type `timedelta` fails, because `timedelta` is missing from `COLUMN_TYPE_MAPPING`.
The reason this would be useful is that MySQL drivers return `TIME` columns as `timedelta` values.
So currently any attempt to convert a MySQL DB with a `TIME` column fails. I was rather surprised that such a common type wasn't already handled. |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/522/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
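Until `timedelta` gains a `COLUMN_TYPE_MAPPING` entry, one workaround sketch is to coerce such values to a SQLite-friendly representation before insertion (database and table names here are hypothetical):

```python
from datetime import timedelta

import sqlite_utils

db = sqlite_utils.Database("demo.db")
rows = [{"id": 1, "elapsed": timedelta(minutes=90)}]
db["timings"].insert_all(
    # Store seconds as a float; str(value) would also work
    ({**row, "elapsed": row["elapsed"].total_seconds()} for row in rows),
    pk="id",
)
```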
1955676270 | I_kwDOBm6k_c50kUBu | 2201 | Discord invite link is invalid | andrewsanchez 11708906 | open | 0 | 0 | 2023-10-21T21:50:05Z | 2023-10-21T21:50:05Z | NONE | https://datasette.io/discord leads to https://discord.com/invite/ktd74dm5mw and returns the following: |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2201/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1943259395 | I_kwDOEhK-wc5z08kD | 16 | time data '2014-11-21T11:44:12.000Z' does not match format '%Y%m%dT%H%M%SZ' | linonetwo 3746270 | open | 0 | 0 | 2023-10-14T13:24:39Z | 2023-10-14T13:24:39Z | NONE |
The `.enex` file was exported by the Evernote Mac client. |
evernote-to-sqlite 303218369 | issue | { "url": "https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/16/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
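The error in issue 16 above shows evernote-to-sqlite expecting `%Y%m%dT%H%M%SZ` while the export contains ISO-style timestamps. A tolerant-parsing sketch (a workaround illustration, not the project's fix):

```python
from datetime import datetime

FORMATS = ("%Y%m%dT%H%M%SZ", "%Y-%m-%dT%H:%M:%S.%fZ")

def parse_enex_timestamp(value):
    # Try the compact format first, then the ISO form from the report
    for fmt in FORMATS:
        try:
            return datetime.strptime(value, fmt)
        except ValueError:
            continue
    raise ValueError(f"unrecognized timestamp: {value!r}")

print(parse_enex_timestamp("2014-11-21T11:44:12.000Z"))
```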
1931794126 | I_kwDOBm6k_c5zJNbO | 2198 | --load-extension=spatialite not working with Windows | hcarter333 363004 | open | 0 | 0 | 2023-10-08T12:50:22Z | 2023-10-08T12:50:22Z | NONE | Using each of several variations of the `--load-extension=spatialite` option,
I got the error: ``` File "C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\database.py", line 209, in in_thread self.ds._prepare_connection(conn, self.name) File "C:\Users\m3n7es\AppData\Local\Packages\PythonSoftwareFoundation.Python.3.11_qbz5n2kfra8p0\LocalCache\local-packages\Python311\site-packages\datasette\app.py", line 596, in _prepare_connection conn.execute("SELECT load_extension(?, ?)", [path, entrypoint]) sqlite3.OperationalError: The specified module could not be found. ``` I finally tried modifying the code in app.py to read: ``` def _prepare_connection(self, conn, database): conn.row_factory = sqlite3.Row conn.text_factory = lambda x: str(x, "utf-8", "replace") if self.sqlite_extensions: conn.enable_load_extension(True) for extension in self.sqlite_extensions: # "extension" is either a string path to the extension # or a 2-item tuple that specifies which entrypoint to load. #if isinstance(extension, tuple): # path, entrypoint = extension # conn.execute("SELECT load_extension(?, ?)", [path, entrypoint]) #else: conn.execute("SELECT load_extension('C:\Windows\System32\mod_spatialite.dll')") ``` At which point the counties example worked. Is there a correct way to install/use the extension on Windows? My method will cause issues if there's a second extension to be used. On an unrelated note, my next step is to figure out how to write a query across the two loaded databases supplied from the command line:
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2198/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
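A standalone way to test the extension outside Datasette, using the DLL path from the report (a sketch; the database filename is hypothetical). A raw string sidesteps the backslash-escape ambiguity in the hard-coded path above, and SQLite's `load_extension()` also accepts an optional second entrypoint argument:

```python
import sqlite3

conn = sqlite3.connect("counties.db")
conn.enable_load_extension(True)
# Parameterized, with a raw string for the Windows path
conn.execute(
    "SELECT load_extension(?)",
    [r"C:\Windows\System32\mod_spatialite.dll"],
)
print(conn.execute("SELECT spatialite_version()").fetchone())
```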
1920416843 | I_kwDOCGYnMM5ydzxL | 597 | sqlite-utils insert-files should be able to convert fields | grimnight 1737541 | open | 0 | 0 | 2023-09-30T22:20:47Z | 2023-09-30T22:20:47Z | NONE | Currently I'm using both `insert-files` and `convert` to get zlib-compressed file contents into a SQLAR archive; it would be nice if `insert-files` could convert fields in one step: ```shell ~ ❯ cat test.py import os class Example: def __init__(self, arg1, arg2): self.arg1 = arg1 ~ ❯ sqlite-utils insert-files test.sqlar sqlar test.py -c name:name -c data:content -c mode:mode -c mtime:mtime -c sz:size --pk=name [####################################] 100% ~ ❯ sqlite-utils convert test.sqlar sqlar data "zlib.compress(value)" --import=zlib --where "name = 'test.py'" [####################################] 100% ~ ❯ cat test.py | sqlite-utils convert test.sqlar sqlar data "zlib.compress(sys.stdin.buffer.read())" --import=zlib --import=sys --where "name = 'test.py'" # Alternative way [####################################] 100% ~ ❯ sqlite3 test.sqlar "SELECT hex(data) FROM sqlar WHERE name = 'test.py';" | python3 -c "import sys, zlib; sys.stdout.buffer.write(zlib.decompress(bytes.fromhex(sys.stdin.read())))" import os class Example: def __init__(self, arg1, arg2): self.arg1 = arg1 ~ ❯ rm test.py ~ ❯ sqlar -l test.sqlar test.py ~ ❯ sqlar -x test.sqlar ~ ❯ cat test.py import os class Example: def __init__(self, arg1, arg2): self.arg1 = arg1 ``` |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/597/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1899310542 | I_kwDOBm6k_c5xNS3O | 2187 | Datasette for serving JSON only | geofinder 19705106 | open | 0 | 0 | 2023-09-16T05:48:29Z | 2023-09-16T05:48:29Z | NONE | Hi, is there any way to use Datasette to serve JSON only, without displaying the web page? I've searched the documentation for this but didn't find anything. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2187/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
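Regarding issue 2187: Datasette already serves JSON without the HTML UI - append `.json` to table, row, and query URLs. A fetch sketch, assuming a local instance on port 8001 with a database `mydb` and table `mytable`:

```python
import json
from urllib.request import urlopen

url = "http://127.0.0.1:8001/mydb/mytable.json"
with urlopen(url) as response:
    data = json.load(response)
print(data["rows"][:3])  # table JSON includes a "rows" key
```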
1895266807 | I_kwDOBm6k_c5w93n3 | 2184 | Design decision - should configuration be exposed at /-/config ? | simonw 9599 | open | 0 | 0 | 2023-09-13T21:07:08Z | 2023-09-13T21:07:38Z | OWNER |
Originally posted by @simonw in https://github.com/simonw/datasette/pull/2183#discussion_r1325076924 Also refs: - #2093 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2184/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1888477283 | I_kwDOC8SPRc5wj-Bj | 38 | Run `rebuild_fts` after building the index | simonw 9599 | open | 0 | 0 | 2023-09-08T23:17:45Z | 2023-09-08T23:17:45Z | MEMBER | In: - https://github.com/simonw/datasette.io/issues/152#issuecomment-1712323347 This turned out to be the fix:
|
dogsheep-beta 197431109 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep-beta/issues/38/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
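For reference on issue 38: sqlite-utils exposes FTS rebuilding on the CLI as `sqlite-utils rebuild-fts data.db`. A Python sketch, assuming the Dogsheep Beta index is a `search_index` table in `dogsheep.db` and that the Python API mirrors the CLI:

```python
import sqlite_utils

db = sqlite_utils.Database("dogsheep.db")  # hypothetical path
# Rebuild the FTS index after bulk changes to the indexed table
db["search_index"].rebuild_fts()
```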
1886649402 | I_kwDOBm6k_c5wc_w6 | 2179 | Flaky test: test_hidden_sqlite_stat1_table | simonw 9599 | closed | 0 | 0 | 2023-09-07T22:48:43Z | 2023-09-07T22:51:19Z | 2023-09-07T22:51:19Z | OWNER | This test here: https://github.com/simonw/datasette/blob/fbcb103c0cb6668018ace539a01a6a1f156e8d6a/tests/test_api.py#L1011-L1020 It failed for me like this:
Looks like some builds of SQLite include a `sqlite_stat1` table and others don't. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2179/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1876407598 | I_kwDOBm6k_c5v17Uu | 2169 | execute-sql on a database should imply view-database/view-permission | simonw 9599 | closed | 0 | 0 | 2023-08-31T22:45:56Z | 2023-08-31T22:46:28Z | 2023-08-31T22:46:28Z | OWNER | I noticed that a token with just the `execute-sql` permission for a database couldn't actually use it, because it lacked the view permissions that `execute-sql` should imply. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2169/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1875739055 | I_kwDOBm6k_c5vzYGv | 2167 | Document return type of await ds.permission_allowed() | simonw 9599 | open | 0 | 0 | 2023-08-31T15:14:23Z | 2023-08-31T15:14:23Z | OWNER | The return type isn't documented here: https://github.com/simonw/datasette/blob/4c3ef033110407f3b3dbce501659d523724985e0/docs/internals.rst#L327-L350 On inspecting the code I'm not 100% sure if it's possible for this method to return `None` rather than `True` or `False`. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2167/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1871935751 | I_kwDOD079W85vk3kH | 40 | ImportError: cannot import name 'formatargspec' from 'inspect' | hosslikw 36752421 | closed | 0 | 0 | 2023-08-29T15:36:31Z | 2023-08-31T03:18:07Z | 2023-08-31T03:18:06Z | NONE | I get the following error when running `pip3 install dogsheep-photos`: `from inspect import ismethod, isclass, formatargspec` - `ImportError: cannot import name 'formatargspec' from 'inspect' (/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/inspect.py). Did you mean: 'formatargvalues'?` Environment: Python 3.12.0rc1, sqlite 3.43.0, datasette version 0.64.3 |
dogsheep-photos 256834907 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/40/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
594237015 | MDU6SXNzdWU1OTQyMzcwMTU= | 718 | Plugin idea: datasette-redirects | simonw 9599 | open | 0 | 0 | 2020-04-05T03:41:38Z | 2023-08-30T22:17:31Z | OWNER | I just had to write a one-off custom plugin to redirect niche-musems.com to www.niche-museums.com (https://github.com/simonw/museums/issues/21) - it would be great if this kind of thing could be handled by a configurable plugin. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/718/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
reopened | |||||||
1795051447 | I_kwDOBm6k_c5q_k-3 | 2097 | Drop Python 3.7 | simonw 9599 | closed | 0 | 0 | 2023-07-08T18:39:44Z | 2023-08-23T18:18:00Z | 2023-08-23T18:18:00Z | OWNER |
Originally posted by @simonw in https://github.com/simonw/datasette/issues/1153#issuecomment-1627455892 It's not supported any more: https://devguide.python.org/versions/ |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2097/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1754174496 | I_kwDOCGYnMM5ojpQg | 558 | Ability to define unique columns when creating a table | aguinane 1910303 | open | 0 | 0 | 2023-06-13T06:56:19Z | 2023-08-18T01:06:03Z | NONE | When creating a new table, it would be good to have an option to set unique columns similar to how not_null is set. ```python from sqlite_utils import Database columns = {"mRID": str, "name": str} db = Database("example.db") db["ExampleTable"].create(columns, pk="mRID", not_null=["mRID"], if_not_exists=True) db["ExampleTable"].create_index(["mRID"], unique=True, if_not_exists=True) ``` So something like this would add the UNIQUE flag to the table definition.
|
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/558/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1822940964 | I_kwDOBm6k_c5sp98k | 2115 | Ensure all tests pass against new query view JSON | simonw 9599 | closed | 0 | Datasette 1.0a3 9700784 | 0 | 2023-07-26T18:25:20Z | 2023-08-08T02:01:39Z | 2023-08-08T02:01:38Z | OWNER |
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2115/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
1839344979 | I_kwDOCGYnMM5toi1T | 582 | Handling CSV/file input that contains NUL bytes | betatim 1448859 | open | 0 | 0 | 2023-08-07T12:24:14Z | 2023-08-07T12:24:14Z | NONE | I was using sqlite-utils to create a DB from a CSV and it turns out the CSV contains a NUL byte. When the processing reaches the line that contains the NUL an exception is raised. I'm wondering if there is something that can be done in Concretely the file is the This is the command and output:
|
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/582/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
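A common workaround sketch for issue 582 while a file contains NUL bytes: strip them from each line before the stream reaches the CSV parser, which otherwise raises "line contains NUL".

```python
import csv

def rows_without_nuls(path):
    with open(path, newline="") as f:
        # Generator view of the file with NUL bytes removed
        cleaned = (line.replace("\0", "") for line in f)
        yield from csv.reader(cleaned)

for row in rows_without_nuls("data.csv"):  # hypothetical file
    print(row)
```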
1824457306 | I_kwDOBm6k_c5svwJa | 2122 | Parameters on canned queries: fixed or query-generated list? | meowcat 1563881 | open | 0 | 0 | 2023-07-27T14:07:07Z | 2023-07-27T14:07:07Z | NONE | Hi, currently parameters in canned queries are just text fields. It would be cool to have one of the options below. Would you accept a PR doing something in this direction? (Possibly this could even work as a plugin.)
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2122/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1823428714 | I_kwDOBm6k_c5sr1Bq | 2120 | Add __all__ to datasette/__init__.py | simonw 9599 | open | 0 | 0 | 2023-07-27T01:07:10Z | 2023-07-27T01:07:10Z | OWNER | Currently looks like this: https://github.com/simonw/datasette/blob/08181823990a71ffa5a1b57b37259198eaa43e06/datasette/__init__.py#L1-L6 Adding `__all__` would make it explicit which names are part of the documented public interface. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2120/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
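A sketch of what issue 2120 proposes (the export list here is illustrative, not a decided API; `hookimpl` and `__version__` are names the package does provide):

```python
# datasette/__init__.py (sketch)
from .hookspecs import hookimpl  # noqa: F401
from .version import __version__  # noqa: F401

__all__ = ["hookimpl", "__version__"]
```

With `__all__` defined, `from datasette import *` and documentation tools both get an explicit statement of the public interface.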
1822918995 | I_kwDOCGYnMM5sp4lT | 580 | Add way to export to a csv file using the Python library | kevinlinxc 44324811 | open | 0 | 0 | 2023-07-26T18:09:26Z | 2023-07-26T18:09:26Z | NONE | According to the documentation, we can make a csv output using the CLI tool, but not the Python library. Could we have the latter? |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/580/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
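Until the library grows a native CSV export, a workaround sketch for issue 580 using the standard library plus the `rows` iterator sqlite-utils already provides (names hypothetical):

```python
import csv

import sqlite_utils

db = sqlite_utils.Database("data.db")
rows = db["mytable"].rows  # lazy iterator of dicts, one per row
with open("mytable.csv", "w", newline="") as f:
    first = next(rows)  # assumes the table is non-empty
    writer = csv.DictWriter(f, fieldnames=list(first))
    writer.writeheader()
    writer.writerow(first)
    writer.writerows(rows)
```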
1822813627 | I_kwDOBm6k_c5spe27 | 2108 | some (many?) SQL syntax errors are not throwing errors with a .csv endpoint | fgregg 536941 | open | 0 | 0 | 2023-07-26T16:57:45Z | 2023-07-26T16:58:07Z | CONTRIBUTOR | here's a CTE query that should always fail with a syntax error: `with foo as (nonsense) select * from foo;`
when we make this query against the default endpoint, we do indeed get a 400 status code and the problem is returned to the user: https://global-power-plants.datasettes.com/global-power-plants?sql=with+foo+as+%28nonsense%29+select+*+from+foo%3B but, if we use the csv endpoint, we get a 200 status code and no indication of a problem: https://global-power-plants.datasettes.com/global-power-plants.csv?sql=with+foo+as+%28nonsense%29+select+*+from+foo%3B the same happens with another malformed query.
but, datasette catches this bad sql, `slect a from foo;`, at both endpoints:
https://global-power-plants.datasettes.com/global-power-plants?sql=slect%0D%0A++a%0D%0Afrom%0D%0A++foo%3B https://global-power-plants.datasettes.com/global-power-plants.csv?sql=slect%0D%0A++a%0D%0Afrom%0D%0A++foo%3B |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2108/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
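A quick way to reproduce the discrepancy in issue 2108 is to compare status codes for the same bad SQL on both endpoints; a sketch using the URLs from the report:

```python
from urllib.error import HTTPError
from urllib.request import urlopen

query = "?sql=with+foo+as+%28nonsense%29+select+*+from+foo%3B"
base = "https://global-power-plants.datasettes.com/global-power-plants"
for url in (base + query, base + ".csv" + query):
    try:
        status = urlopen(url).status
    except HTTPError as exc:
        status = exc.code
    # Report: 400 for the HTML endpoint, 200 for the CSV endpoint
    print(status, url)
```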
1821108702 | I_kwDOCGYnMM5si-ne | 579 | Special handling for SQLite column of type `JSON` | asg017 15178711 | open | 0 | 0 | 2023-07-25T20:37:23Z | 2023-07-25T20:37:23Z | CONTRIBUTOR |
Automatic nesting: according to the "Nested JSON Values" documentation, sqlite-utils will only expand JSON if the `--json-cols` flag is passed. Instead, columns with a declared type of `JSON` could be expanded automatically.
I'm sure there are other ways special handling for `JSON` columns could be useful. |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/579/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
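A sketch of the detection issue 579 asks for: read declared column types from the schema and decode values in columns declared `JSON` (plain `sqlite3` here, not sqlite-utils internals):

```python
import json
import sqlite3

def rows_with_json_expanded(conn, table):
    # PRAGMA table_info rows are (cid, name, type, notnull, dflt_value, pk);
    # table must come from a trusted source since PRAGMA can't be parameterized
    cols = conn.execute(f"PRAGMA table_info({table})").fetchall()
    names = [c[1] for c in cols]
    json_cols = {c[1] for c in cols if (c[2] or "").upper() == "JSON"}
    for row in conn.execute(f"SELECT * FROM {table}"):
        yield {
            name: json.loads(val) if name in json_cols and val is not None else val
            for name, val in zip(names, row)
        }
```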
1786243905 | I_kwDOCGYnMM5qd-tB | 564 | Document that running `db.transform()` tidies up the schema indentation | simonw 9599 | closed | 0 | 0 | 2023-07-03T13:59:28Z | 2023-07-22T22:15:34Z | 2023-07-22T22:15:34Z | OWNER |
```pycon
<Table log (id, name2, age, weight, foo)>
>>> db["log"].add_column("bar", str)
<Table log (id, name2, age, weight, foo, bar)>
>>> db["log"].add_column("baz", str)
<Table log (id, name2, age, weight, foo, bar, baz)>
>>> print(db["log"].schema)
CREATE TABLE "log" (
   [id] INTEGER PRIMARY KEY,
   [name2] TEXT,
   [age] INTEGER,
   [weight] FLOAT
, [foo] TEXT, [bar] TEXT, [baz] TEXT)
>>> db["log"].transform()
<Table log (id, name2, age, weight, foo, bar, baz)>
>>> print(db["log"].schema)
CREATE TABLE "log" (
   [id] INTEGER PRIMARY KEY,
   [name2] TEXT,
   [age] INTEGER,
   [weight] FLOAT,
   [foo] TEXT,
   [bar] TEXT,
   [baz] TEXT
)
```
_Originally posted by @simonw in https://github.com/simonw/llm/issues/65#issuecomment-1618347727_ |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/564/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 } |
completed | ||||||
1816857105 | I_kwDOCGYnMM5sSwoR | 570 | `sqlite-utils install -e` option | simonw 9599 | closed | 0 | 0 | 2023-07-22T18:32:23Z | 2023-07-22T18:55:59Z | 2023-07-22T18:32:56Z | OWNER | As seen in LLM. Needed while working on: - #567 |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/570/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1816830546 | I_kwDODEm0Qs5sSqJS | 73 | Twitter v1 API shutdown | david-perez 6341745 | open | 0 | 0 | 2023-07-22T16:57:41Z | 2023-07-22T16:57:41Z | NONE | I've been using this project reliably over the past two years to periodically download my liked tweets, but unfortunately since 19th July I get:
It appears like Twitter has now shut down their v1 endpoints, which is rather gracious of them, considering they announced they'd be deprecated on 29th April. Unfortunately retrieving likes using the v2 API is not part of their free plan. In fact, with the free plan one can only post and delete tweets and retrieve information about oneself. So I'm afraid this is the end of this very nice project. It was very useful, thank you! |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/73/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 1 } |
||||||||
1808116827 | I_kwDOBm6k_c5rxaxb | 2103 | data attribute on Datasette tables exposing the primary key of the row | simonw 9599 | open | 0 | 0 | 2023-07-17T16:18:25Z | 2023-07-17T16:18:25Z | OWNER | Maybe put it on the `<tr>` element. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2103/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1794097871 | I_kwDOBm6k_c5q78LP | 2095 | Introduce "dark mode" CSS | jamietanna 3315059 | open | 0 | 0 | 2023-07-07T19:15:58Z | 2023-07-07T19:15:58Z | NONE | Using the CSS media query `prefers-color-scheme: dark`, we could provide a dark-mode variant of Datasette's UI. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2095/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1783304750 | I_kwDOBm6k_c5qSxIu | 2094 | JS Plugin Hooks for the Code Editor | asg017 15178711 | open | 0 | 0 | 2023-07-01T00:51:57Z | 2023-07-01T00:51:57Z | CONTRIBUTOR | When #2052 merges, I'd like to add support to add extensions/functions to the Datasette code editor. I'd eventually like to build a JS plugin for
I did some hacking to see what this would look like, see here:
There can be a new hook that allows JS plugins to add new "extensions" to the CodeMirror EditorView. This will need some more planning. For example, the CodeMirror bundle in Datasette has functions that we could re-export for plugins to use (so we don't load two versions of the same package). |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2094/reactions", "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1773450152 | I_kwDOCGYnMM5ptLOo | 559 | sqlean support | simonw 9599 | closed | 0 | 0 | 2023-06-25T19:27:26Z | 2023-06-25T23:25:53Z | 2023-06-25T23:25:53Z | OWNER | If sqlean is available, use that. Refs: - https://github.com/nalgeon/sqlean.py/issues/1#issuecomment-1605707788 This will provide a good workaround for: - #235 |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/559/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1764792125 | I_kwDOBm6k_c5pMJc9 | 2086 | Show information on startup in directory configuration mode | simonw 9599 | open | 0 | 0 | 2023-06-20T07:13:33Z | 2023-06-20T07:13:33Z | OWNER | https://discord.com/channels/823971286308356157/823971286941302908/1120516587036889098
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2086/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1762180409 | I_kwDOBm6k_c5pCL05 | 2085 | Interactive row selection in Datasette | learning4life 24938923 | open | 0 | 0 | 2023-06-18T08:29:45Z | 2023-06-18T08:31:23Z | NONE | Simon did an excellent prototype of interactive row selection in Datasette. I hope this functionality can be turned into a Datasette plugin. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2085/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1761613778 | I_kwDOBm6k_c5pABfS | 2084 | Support facets for columns that contain timestamps | devxpy 19492893 | open | 0 | 0 | 2023-06-17T03:33:54Z | 2023-06-17T03:33:54Z | NONE | Django has a very nice filter for datetime fields. It would be nice to have something similar in Datasette, to facet by a field that contains a timestamp - faceting doesn't seem to do anything special with timestamps right now... |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2084/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1751214236 | I_kwDOC8SPRc5oYWic | 36 | Getting sqlite_master may not be modified when creating dogsheep index | khushmeeet 8711912 | open | 0 | 0 | 2023-06-11T03:21:53Z | 2023-06-11T03:21:53Z | NONE | When creating a dogsheep-beta search index, I am getting a `sqlite_master may not be modified` error.
Command I ran to get this error
Dogsheep version
Python version
|
dogsheep-beta 197431109 | issue | { "url": "https://api.github.com/repos/dogsheep/dogsheep-beta/issues/36/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1727478903 | I_kwDOBm6k_c5m9zx3 | 2081 | Update Endpoints defined in metadata throws 403 Forbidden after a while | cutmasta-kun 15085007 | open | 0 | 0 | 2023-05-26T11:52:30Z | 2023-05-26T11:52:30Z | NONE | Hello. I expose an update endpoint defined in metadata. This works really well! But after a while, the Datasette instance answers with 403 Forbidden. I have to delete the database and recreate it in order for it to work again. Any help here? (´。_。`) |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2081/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1701018909 | I_kwDOCGYnMM5lY30d | 543 | Tests broken on Windows due to new convert() lambda names | simonw 9599 | closed | 0 | 0 | 2023-05-08T22:11:29Z | 2023-05-08T22:19:04Z | 2023-05-08T22:19:04Z | OWNER | https://github.com/simonw/sqlite-utils/actions/runs/4920084038/jobs/8788501314
|
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/543/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1432377191 | I_kwDOCGYnMM5VYFdn | 509 | `sqlite-utils transform` breaks DEFAULT string values and STRFTIME() | kennysong 2199875 | closed | 0 | 0 | 2022-11-02T02:32:23Z | 2023-05-08T21:13:38Z | 2023-05-08T21:13:38Z | NONE | Very nice library! Our team found sqlite-utils through @simonw's comment on the "Simple declarative schema migration for SQLite" article, and we were excited to use it, but unfortunately `sqlite-utils transform` breaks `DEFAULT` values: running it re-quotes string defaults and turns `STRFTIME()` expressions into plain string literals.
Here are steps to reproduce: Original database ``` $ sqlite3 test.db << EOF CREATE TABLE mytable ( col1 TEXT DEFAULT 'foo', col2 TEXT DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')) ) EOF $ sqlite3 test.db "SELECT sql FROM sqlite_master WHERE name = 'mytable';" CREATE TABLE mytable ( col1 TEXT DEFAULT 'foo', col2 TEXT DEFAULT (STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW')) ) ``` Modified database after `sqlite-utils transform` ``` $ sqlite3 test.db "INSERT INTO mytable DEFAULT VALUES; SELECT * FROM mytable;" foo|2022-11-02 02:26:58.038 $ sqlite-utils transform test.db mytable --rename col1 renamedcol1 $ sqlite3 test.db "SELECT sql FROM sqlite_master WHERE name = 'mytable';" CREATE TABLE "mytable" ( [renamedcol1] TEXT DEFAULT '''foo''', [col2] TEXT DEFAULT 'STRFTIME(''%Y-%m-%d %H:%M:%f'', ''NOW'')' ) $ sqlite3 test.db "INSERT INTO mytable DEFAULT VALUES; SELECT * FROM mytable;" foo|2022-11-02 02:26:58.038 'foo'|STRFTIME('%Y-%m-%d %H:%M:%f', 'NOW') ``` (Related: #336) |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/509/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1665510265 | I_kwDOBm6k_c5jRat5 | 2060 | Clean up a bunch of warnings from ruff | simonw 9599 | open | 0 | 0 | 2023-04-13T01:23:02Z | 2023-04-13T01:23:02Z | OWNER | See: - #2056
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2060/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1661617056 | I_kwDODD6af85jCkOg | 15 | ambiguous column name: createdAt - on checkin_details view | simonw 9599 | closed | 0 | 0 | 2023-04-11T01:07:47Z | 2023-04-11T03:16:37Z | 2023-04-11T03:16:37Z | MEMBER | It looks like Swarm changed their schema and now more than one of the joined tables has a `createdAt` column, which breaks this view: https://github.com/dogsheep/swarm-to-sqlite/blob/719b6e96a016d0ca8b316d3bed9c2a7a0cb499ee/swarm_to_sqlite/utils.py#L171-L188 |
swarm-to-sqlite 205429375 | issue | { "url": "https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/15/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1650981564 | I_kwDOJHON9s5iZ_q8 | 12 | Error running pytest | amlestin 14314871 | open | 0 | 0 | 2023-04-02T15:02:36Z | 2023-04-02T15:07:10Z | NONE |
Solution: This is likely a PYTHONPATH issue due to having pytest installed both globally and in the venv. We can guarantee the tests run by adding the current directory to sys.path automatically, by invoking pytest as `python -m pytest`.
The alternative is to activate the venv, install pytest, deactivate, then activate the venv again (https://stackoverflow.com/questions/35045038/how-do-i-use-pytest-with-virtualenv) |
apple-notes-to-sqlite 611552758 | issue | { "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/12/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1649793525 | I_kwDOBm6k_c5iVdn1 | 2051 | `?_extra=row_urls` for table pages | simonw 9599 | open | 0 | 0 | 2023-03-31T17:58:36Z | 2023-03-31T17:58:36Z | OWNER | Provides URLs to the JSON version of those rows. Maybe it persists the |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2051/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1646068413 | I_kwDOBm6k_c5iHQK9 | 2048 | Test failures encountered while packaging for GNU Guix | Apteryks 8332263 | open | 0 | 0 | 2023-03-29T15:36:54Z | 2023-03-29T15:36:54Z | NONE | Hello, While reviewing a package submitted to Guix to add `datasette`, the test suite produced the failures below. app_client = <datasette.utils.testing.TestClient object at 0x7fffef099be0>
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:701: AssertionError ----------------------------- Captured stderr call ----------------------------- ERROR: conn=<sqlite3.Connection object at 0x7fffeedfe5d0>, sql = 'select rowid, * from [table%7E2Fwith%7E2Fslashes%7E2Ecsv] where "rowid"=:p0', params = {'p0': '3'}: no such table: table%7E2Fwith%7E2Fslashes%7E2Ecsv __ test_database_page_for_database_with_dot_in_name __ [gw15] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_with_dot = <datasette.utils.testing.TestClient object at 0x7fffef3416a0>
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:633: AssertionError ___ test_tilde_encoded_database_names[fo%o] ______ [gw6] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python db_name = 'fo%o'
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:983: AssertionError ___ test_tilde_encoded_database_names[f~/c.d] _____ [gw7] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python db_name = 'f~/c.d'
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:983: AssertionError __ test_database_with_space_in_name[/searchable.json] __ [gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffef11d730> path = '/searchable.json'
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in call return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(args, *kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( self = <httpx.AsyncClient object at 0x7fffef11d940> request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database/searchable%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E2Ejson')> follow_redirects = True history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ___ test_database_with_space_in_name[.json] ______ [gw19] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffef085a90> path = '.json'
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in call return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(args, *kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( self = <httpx.AsyncClient object at 0x7fffecd99ca0> request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E2Ejson')> follow_redirects = True history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects __ test_database_with_space_in_name[/searchable_view] __ [gw22] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffeeab4c70> path = '/searchable_view'
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in call return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(args, *kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( self = <httpx.AsyncClient object at 0x7fffec5b3580> request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database/searchable_view')> follow_redirects = True history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/client.py:1672: TooManyRedirects ___ test_database_with_space_in_name[/] ___ [gw18] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffef085be0> path = '/'
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in call return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(args, *kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( self = <httpx.AsyncClient object at 0x7fffec5f1370> request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database/')> follow_redirects = True history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects ____ test_database_with_space_in_name[/searchable] _____ [gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffef099f10> path = '/searchable'
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in call return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(args, *kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( self = <httpx.AsyncClient object at 0x7fffecd8c790> request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database/searchable')> follow_redirects = True history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects __ test_database_with_space_in_name[/searchable_view.json] ____ [gw23] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client_two_attached_databases = <datasette.utils.testing.TestClient object at 0x7fffef341520> path = '/searchable_view.json'
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_api.py:920: /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:223: in call return call_result.result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:438: in result return self.__get_result() /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/concurrent/futures/_base.py:390: in __get_result raise self._exception /gnu/store/mcclmphjgbrgpa0v037a4nlq336482g8-python-asgiref-3.4.1/lib/python3.9/site-packages/asgiref/sync.py:292: in main_wrap result = await self.awaitable(args, *kwargs) /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:66: in get return await self._request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:156: in _request httpx_response = await self.ds.client.request( /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/app.py:1602: in request return await client.request( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1527: in request return await self.send(request, auth=auth, follow_redirects=follow_redirects) /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1614: in send response = await self._send_handling_auth( /gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1642: in _send_handling_auth response = await self._send_handling_redirects( self = <httpx.AsyncClient object at 0x7fffef085460> request = <Request('GET', 'http://localhost/extra%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E20database/searchable_view%7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E7E2Ejson')> follow_redirects = True history = [<Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, <Response [302 Found]>, ...]
/gnu/store/bj5lb299rfb4cbbq5kczq9imdk9a7y64-python-httpx-0.23.0/lib/python3.9/site-packages/httpx/_client.py:1672: TooManyRedirects __ test_weird_database_names[database (1).sqlite] __ [gw7] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python tmpdir = local('/tmp/guix-build-datasette-0.64.2.drv-0/pytest-of-nixbld/pytest-0/popen-gw7/test_weird_database_names_data0') filename = 'database (1).sqlite'
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_cli.py:321: AssertionError ___ test_weird_database_names[test-database (1).sqlite] ______ [gw6] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python tmpdir = local('/tmp/guix-build-datasette-0.64.2.drv-0/pytest-of-nixbld/pytest-0/popen-gw6/test_weird_database_names_test0') filename = 'test-database (1).sqlite'
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_cli.py:321: AssertionError _ test_row_html_compound_primary_key[/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd-expected1] _ [gw11] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = <datasette.utils.testing.TestClient object at 0x7fffec2d37f0> path = '/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd' expected = [['<td class="col-pk1 type-str">a/b</td>', '<td class="col-pk2 type-str">.c-d</td>', '<td class="col-content type-str">c</td>']]
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:370: AssertionError _ test_css_classes_on_body[/fixtures/table~2Fwith~2Fslashes~2Ecsv-expected_classes5] _ [gw3] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = <datasette.utils.testing.TestClient object at 0x7fffd4743fa0> path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected_classes = ['table', 'db-fixtures', 'table-tablewithslashescsv-fa7563']
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:238: AssertionError _ test_templates_considered[/fixtures/table~2Fwith~2Fslashes~2Ecsv-table-fixtures-tablewithslashescsv-fa7563.html, *table.html] _ [gw3] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = <datasette.utils.testing.TestClient object at 0x7fffd4743fa0> path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected_considered = 'table-fixtures-tablewithslashescsv-fa7563.html, *table.html'
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:264: AssertionError _ test_alternate_url_json[/fixtures/table~2Fwith~2Fslashes~2Ecsv-http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json] _ [gw21] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = <datasette.utils.testing.TestClient object at 0x7fffecd9fac0> path = '/fixtures/table~2Fwith~2Fslashes~2Ecsv' expected = 'http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json'
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:948: AssertionError _ test_edit_sql_link_on_canned_queries[/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC-/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B] _ [gw18] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = <datasette.utils.testing.TestClient object at 0x7fffec5952e0> path = '/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC' expected = '/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B'
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_html.py:841: AssertionError _____ test_table_with_slashes_in_name ______ [gw9] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = <datasette.utils.testing.TestClient object at 0x7fffec0860a0>
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:141: AssertionError ___ test_custom_query_with_unicode_characters _____ [gw8] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = <datasette.utils.testing.TestClient object at 0x7fffec1cda90>
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:1042: /tmp/guix-build-datasette-0.64.2.drv-0/source/datasette/utils/testing.py:40: in json return json.loads(self.text) /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/__init__.py:346: in loads return _default_decoder.decode(s) /gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/decoder.py:337: in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end()) self = <json.decoder.JSONDecoder object at 0x7ffff7479760>, s = '', idx = 0
/gnu/store/65i3nhcwmz0p8rqbg48gaavyky4g4hwk-python-3.9.9/lib/python3.9/json/decoder.py:355: JSONDecodeError _ test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3] _ [gw13] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python app_client = <datasette.utils.testing.TestClient object at 0x7fffd470bf10> path = '/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']]
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:402: AssertionError _ test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1] ____ [gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python table_metadata = {'searchmode': 'raw'}, querystring = '_search=te*+AND+do*' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']]
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:442: AssertionError _ test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2] _ [gw20] linux -- Python 3.9.9 /gnu/store/slsh0qjv5j68xda2bb6h8gsxwyi1j25a-python-wrapper-3.9.9/bin/python table_metadata = {}, querystring = '_search=te*+AND+do*&_searchmode=raw' expected_rows = [[1, 'barry cat', 'terry dog', 'panther'], [2, 'terry dog', 'sara weasel', 'puma']]
/tmp/guix-build-datasette-0.64.2.drv-0/source/tests/test_table_api.py:442: AssertionError
=========================== short test summary info ============================
FAILED tests/test_api.py::test_row_strange_table_name - assert 400 == 200
FAILED tests/test_api.py::test_database_page_for_database_with_dot_in_name - ...
FAILED tests/test_api.py::test_tilde_encoded_database_names[fo%o] - assert 30...
FAILED tests/test_api.py::test_tilde_encoded_database_names[f~/c.d] - assert ...
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable.json]
FAILED tests/test_api.py::test_database_with_space_in_name[.json] - httpx.Too...
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable_view]
FAILED tests/test_api.py::test_database_with_space_in_name[/] - httpx.TooMany...
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable] - htt...
FAILED tests/test_api.py::test_database_with_space_in_name[/searchable_view.json]
FAILED tests/test_cli.py::test_weird_database_names[database (1).sqlite] - As...
FAILED tests/test_cli.py::test_weird_database_names[test-database (1).sqlite]
FAILED tests/test_html.py::test_row_html_compound_primary_key[/fixtures/compound_primary_key/a~2Fb,~2Ec~2Dd-expected1]
FAILED tests/test_html.py::test_css_classes_on_body[/fixtures/table~2Fwith~2Fslashes~2Ecsv-expected_classes5]
FAILED tests/test_html.py::test_templates_considered[/fixtures/table~2Fwith~2Fslashes~2Ecsv-table-fixtures-tablewithslashescsv-fa7563.html, *table.html]
FAILED tests/test_html.py::test_alternate_url_json[/fixtures/table~2Fwith~2Fslashes~2Ecsv-http://localhost/fixtures/table~2Fwith~2Fslashes~2Ecsv.json]
FAILED tests/test_html.py::test_edit_sql_link_on_canned_queries[/fixtures/~F0~9D~90~9C~F0~9D~90~A2~F0~9D~90~AD~F0~9D~90~A2~F0~9D~90~9E~F0~9D~90~AC-/fixtures?sql=select+id%2C+name+from+facet_cities+order+by+id+limit+1%3B]
FAILED tests/test_table_api.py::test_table_with_slashes_in_name - assert 302 ...
FAILED tests/test_table_api.py::test_custom_query_with_unicode_characters - j...
FAILED tests/test_table_api.py::test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3]
FAILED tests/test_table_api.py::test_searchmode[table_metadata1-_search=te*+AND+do*-expected_rows1]
FAILED tests/test_table_api.py::test_searchmode[table_metadata2-_search=te*+AND+do*&_searchmode=raw-expected_rows2]
=========== 22 failed, 1049 passed, 3 skipped in 1522.28s (0:25:22) ============
error: in phase 'check': uncaught exception:
%exception #<&invoke-error program: "/gnu/store/ziqwkzz6znb5d3c245xn0cq5ra2ly0w3-python-pytest-7.1.3/bin/pytest" arguments: ("-vv" "-n" "24" "-m" "not serial") exit-status: 1 term-signal: #f stop-signal: #f>
phase `check' failed after 1523.3 seconds
Thank you! |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2048/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1641013220 | I_kwDOBm6k_c5hz9_k | 2045 | First column on a view page has no facet option in cog menu | simonw 9599 | open | 0 | Datasette 1.0 3268330 | 0 | 2023-03-26T18:02:47Z | 2023-03-26T18:02:48Z | OWNER | e.g. first column on this page - cog menu has no option to facet. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2045/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
|||||||
1636616315 | I_kwDOBm6k_c5hjMh7 | 2042 | Gather feedback on new ?_extra= design | simonw 9599 | open | 0 | 0 | 2023-03-22T23:07:43Z | 2023-03-22T23:08:19Z | OWNER | Now that I've landed:
- #1999

See also:
- #262

I want to get some feedback from people on the design of the new `?_extra=` parameter.

The big change is that the default JSON representation is now MUCH slimmer - it only gives you keys for

If you want extra stuff you can ask for it with the new `?_extra=` parameter.

You can use
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2042/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1620515757 | I_kwDOBm6k_c5glxut | 2039 | Subtle bug with `--load-extension` and `--static` flags with absolute Windows paths with `C:\` | asg017 15178711 | open | 0 | 0 | 2023-03-12T21:18:52Z | 2023-03-12T21:18:52Z | CONTRIBUTOR | From the Datasette discord: A user tried running the following command on Windows:
This is hard because most absolute Windows paths have a colon in them, like `C:\`.

The "solution" is to use a relative path instead, but that doesn't feel that great.
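A possible mitigation, sketched under the assumption that the flag value is parsed by splitting on a colon (`split_static_mount` is a hypothetical helper, not Datasette's actual parser): split on the first colon only, and reject bare drive-letter paths that are missing a mount prefix.

```python
import re


def split_static_mount(value):
    # Hypothetical helper: split a mount:directory value on the first
    # colon only, so a Windows drive letter in the directory survives
    if re.match(r"^[A-Za-z]:[\\/]", value):
        raise ValueError("missing mount prefix: " + value)
    mount, sep, directory = value.partition(":")
    if not sep:
        raise ValueError("expected mount:directory, got " + value)
    return mount, directory


print(split_static_mount("static:C:\\mysite\\static"))
# ('static', 'C:\\mysite\\static')
```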
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2039/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1617962395 | I_kwDOJHON9s5gcCWb | 10 | Include schema in README | simonw 9599 | closed | 0 | 0 | 2023-03-09T20:38:59Z | 2023-03-09T20:48:18Z | 2023-03-09T20:48:18Z | MEMBER | As seen in other tools like https://github.com/simonw/git-history |
apple-notes-to-sqlite 611552758 | issue | { "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/10/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1617938730 | I_kwDOJHON9s5gb8kq | 9 | Default to just storing plaintext, store HTML if `--html` is passed | simonw 9599 | open | 0 | 0 | 2023-03-09T20:19:06Z | 2023-03-09T20:19:06Z | MEMBER | The full I'm tempted to say you don't get HTML unless you pass a |
apple-notes-to-sqlite 611552758 | issue | { "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/9/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1616440856 | I_kwDOJHON9s5gWO4Y | 5 | Configure full text search | simonw 9599 | open | 0 | 0 | 2023-03-09T05:20:46Z | 2023-03-09T05:20:46Z | MEMBER | FTS would be useful. Maybe even extract the plain text from the notes to make that index easier to create, rather than creating it against the HTML. Can use the |
apple-notes-to-sqlite 611552758 | issue | { "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1616422013 | I_kwDOJHON9s5gWKR9 | 3 | `apple-notes-to-sqlite --dump` option | simonw 9599 | closed | 0 | 0 | 2023-03-09T05:05:49Z | 2023-03-09T05:06:14Z | 2023-03-09T05:06:14Z | MEMBER | Option that doesn't write to the database at all, it just outputs all the notes to stdout as newline-delimited JSON. |
apple-notes-to-sqlite 611552758 | issue | { "url": "https://api.github.com/repos/dogsheep/apple-notes-to-sqlite/issues/3/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1605959201 | I_kwDOBm6k_c5fuP4h | 2032 | datasette errors when foreign key integrity is enabled | cldellow 193185 | open | 0 | 0 | 2023-03-02T01:27:51Z | 2023-03-02T01:31:58Z | CONTRIBUTOR | By default, SQLite does not enforce foreign key constraints. I typically enable these checks by running:
inside of a

If a plugin causes the schema to change (e.g. datasette-scraper creating a new table, or datasette-edit-schema changing a column), then https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/internal_db.py#L71-L77 will fail with:
This could be resolved by either:
- deleting from the

Let me know if you'd be open to a PR that addresses this -- since foreign key constraints aren't enabled by default, I guess it's questionable whether this is a bug. I think I can work around this by inspecting the database parameter in |
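For context, a minimal sketch of enabling that pragma from a plugin, assuming the standard `prepare_connection` hook:

```python
from datasette import hookimpl


@hookimpl
def prepare_connection(conn):
    # Enable SQLite foreign key enforcement on every connection
    conn.execute("PRAGMA foreign_keys = ON")
```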
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2032/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1592327343 | I_kwDOBm6k_c5e6Pyv | 2029 | Sorry Simon, didn't know how else to contact you | llchristopherson 5804626 | open | 0 | 0 | 2023-02-20T19:02:53Z | 2023-02-20T19:02:53Z | NONE | Hi Simon, Would you be willing to chat with me about Datasette? I have some questions. I am working on a project to evaluate data ingestion tools for a research organization and I ran across Datasette. I have looked through a lot of your documentation, but still have some questions, which are very specific. If you would be willing to write me back about this, my email is laura@renci.org. Thanks, Laura |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2029/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1577548579 | I_kwDOBm6k_c5eB3sj | 2021 | Docker images for 1.0 alphas? | meowcat 1563881 | open | 0 | 0 | 2023-02-09T09:35:52Z | 2023-02-09T09:35:52Z | NONE | Hi, would you consider putting 1.0alpha images on Dockerhub? (Also, how usable are the alphas?) |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2021/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1575880841 | I_kwDOBm6k_c5d7giJ | 2020 | Documentation refers to "off" setting; doesn't seem to work, "false" does | dmick 1350673 | open | 0 | 0 | 2023-02-08T10:38:10Z | 2023-02-08T10:38:10Z | NONE | https://docs.datasette.io/en/stable/settings.html#suggest-facets, among others, suggests using "off" to disable the setting; however, this doesn't appear to work in the JSON config files, where it apparently needs to be a "JSON boolean" and have the values "true" or "false". Perhaps the Python code is more flexible? But either way, the documentation probably should mention it. |
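For example, `datasette data.db --setting suggest_facets off` works on the command line, but in a `settings.json` file (configuration directory mode) the value apparently has to be a JSON boolean - a sketch:

```json
{
    "suggest_facets": false
}
```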
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2020/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1571711808 | I_kwDOBm6k_c5drmtA | 2018 | `check_visibility` gives confusing (wrong?) results if permission is `None` | cldellow 193185 | open | 0 | 0 | 2023-02-06T01:03:08Z | 2023-02-06T01:03:46Z | CONTRIBUTOR | I'm trying to gate access to an edit UI on the user having

I expected datasette.check_visibility to be a good way to do this:

```python
visible, private = await datasette.check_visibility(
    request.actor,
    permissions=[
        ("update-row", (database, table)),
    ],
)
```

But

Based on the update-row permissions docs, I expected this to be default deny, and so no explicit permission would result in false.

I think the root cause is that

But |
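A workaround sketch, pending clarification: ask the permission system directly and treat "no opinion" (`None`) as a deny via the `default=` parameter:

```python
# default=False turns an unopinionated None into an explicit deny
allowed = await datasette.permission_allowed(
    request.actor,
    "update-row",
    resource=(database, table),
    default=False,
)
```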
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2018/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1565179870 | I_kwDOBm6k_c5dSr_e | 2013 | Datasette uses non-standard quoting for identifiers | cldellow 193185 | open | 0 | 0 | 2023-02-01T00:05:39Z | 2023-02-01T00:06:30Z | CONTRIBUTOR | Related to #2001, but where #2001 was about literals, this is about identifiers From https://www.sqlite.org/lang_keywords.html:
Datasette uses this quoting here -- https://github.com/simonw/datasette/blob/0b4a28691468b5c758df74fa1d72a823813c96bf/datasette/utils/__init__.py#L345-L349, in some of the other DB access code, and in some of the test fixtures. Migrating to standard double quote identifiers would make it easier to get Datasette working with alternative backends. |
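For reference, a minimal sketch of the standard double-quote style being proposed (`escape_standard_identifier` is a hypothetical helper, not Datasette's current code):

```python
def escape_standard_identifier(name):
    # Standard SQL: wrap in double quotes, doubling any embedded ones,
    # e.g. foo"bar becomes "foo""bar"
    return '"{}"'.format(name.replace('"', '""'))
```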
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2013/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1564774831 | I_kwDOBm6k_c5dRJGv | 2012 | Missing space in database summary | simonw 9599 | open | 0 | 0 | 2023-01-31T18:01:13Z | 2023-01-31T18:01:13Z | OWNER | Spotted this on an instance index page: |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2012/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1557599877 | I_kwDODFE5qs5c1xaF | 12 | location history changes | gerardrbentley 14809320 | open | 0 | 0 | 2023-01-26T03:57:25Z | 2023-01-26T03:57:25Z | NONE | Not sure if each download is unique, but I had to change some things to work with the takeout zip I made 2023-01-25. The filename changed from "Location History.json" to "Records.json":
```py
import datetime
import hashlib
import json


def get_timestamp_ms(raw_timestamp):
    try:
        return datetime.datetime.strptime(raw_timestamp, "%Y-%m-%dT%H:%M:%SZ").timestamp()
    except ValueError:
        return datetime.datetime.strptime(raw_timestamp, "%Y-%m-%dT%H:%M:%S.%fZ").timestamp()


def save_location_history(db, zf):
    location_history = json.load(
        zf.open("Takeout/Location History/Records.json")
    )
    db["location_history"].upsert_all(
        (
            {
                "id": id_for_location_history(row),
                "latitude": row["latitudeE7"] / 1e7,
                "longitude": row["longitudeE7"] / 1e7,
                "accuracy": row["accuracy"],
                "timestampMs": get_timestamp_ms(row["timestamp"]),
                "when": row["timestamp"],
            }
            for row in location_history["locations"]
        ),
        pk="id",
    )


def id_for_location_history(row):
    # We want an ID that is unique but can be sorted by in
    # date order - so we use the isoformat date + the first
    # 6 characters of a hash of the JSON
    first_six = hashlib.sha1(
        json.dumps(row, separators=(",", ":"), sort_keys=True).encode("utf8")
    ).hexdigest()[:6]
    return "{}-{}".format(
        row['timestamp'],
        first_six,
    )
```
example locations from mine
|
google-takeout-to-sqlite 206649770 | issue | { "url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/12/reactions", "total_count": 2, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 2 } |
||||||||
1554032168 | I_kwDOBm6k_c5coKYo | 2002 | Document how actors are displayed | simonw 9599 | open | 0 | 0 | 2023-01-24T00:08:49Z | 2023-01-24T00:08:49Z | OWNER | https://github.com/simonw/datasette/blob/e4ebef082de90db4e1b8527abc0d582b7ae0bc9d/datasette/utils/__init__.py#L1052-L1056

This logic should be reflected in the documentation on https://docs.datasette.io/en/stable/authentication.html#actors |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/2002/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1550536442 | I_kwDOCGYnMM5ca076 | 521 | Custom JSON encoder | janrito 31504 | open | 0 | 0 | 2023-01-20T09:19:40Z | 2023-01-20T09:19:40Z | NONE | It would be nice if we could specify a custom encoder (and decoder) for types that will need extra deserialisation – e.g., sets, enums or sparse matrices – or even project-specific types |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/521/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1533673397 | I_kwDOBm6k_c5baf-1 | 1991 | fts5 tables are not auto-detected and hidden | keturn 83819 | open | 0 | 0 | 2023-01-15T06:00:42Z | 2023-01-20T04:54:24Z | NONE | I set up a Datasette instance and was following the docs on full-text search. When I used fts4, datasette automatically hid the FTS tables and added the FTS search box where appropriate, but when I changed to fts5 it no longer does either.

If I manually set

My table and view creation code looks like this:
|
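For anyone trying to reproduce this, one common way to create an FTS5 table tied to a content table is via sqlite-utils (a sketch with hypothetical database and column names):

```python
import sqlite_utils

db = sqlite_utils.Database("content.db")
# Creates a documents_fts virtual table using FTS5 that indexes
# the title and body columns of the documents table
db["documents"].enable_fts(["title", "body"], fts_version="FTS5")
```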
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1991/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1538197093 | I_kwDOBm6k_c5brwZl | 1995 | foreign_keys error 500 | jonschoning 137183 | open | 0 | 0 | 2023-01-18T15:27:36Z | 2023-01-18T16:44:01Z | NONE | Error 500: expected string or bytes-like object

run
Schema:
```
CREATE TABLE IF NOT EXISTS "user" (
  "id" INTEGER PRIMARY KEY,
  "name" VARCHAR NOT NULL,
  "password_hash" VARCHAR NOT NULL,
  "api_token" VARCHAR NULL,
  "private_default" BOOLEAN NOT NULL,
  "archive_default" BOOLEAN NOT NULL,
  "privacy_lock" BOOLEAN NOT NULL,
  CONSTRAINT "unique_user_name" UNIQUE ("name")
);
CREATE TABLE IF NOT EXISTS "bookmark" (
  "id" INTEGER PRIMARY KEY,
  "user_id" INTEGER NOT NULL REFERENCES "user" ON DELETE RESTRICT ON UPDATE RESTRICT,
  "slug" VARCHAR NOT NULL DEFAULT (Lower(Hex(Randomblob(6)))),
  "href" VARCHAR NOT NULL,
  "description" VARCHAR NOT NULL,
  "extended" VARCHAR NOT NULL,
  "time" TIMESTAMP NOT NULL,
  "shared" BOOLEAN NOT NULL,
  "to_read" BOOLEAN NOT NULL,
  "selected" BOOLEAN NOT NULL,
  "archive_href" VARCHAR NULL,
  CONSTRAINT "unique_user_href" UNIQUE ("user_id", "href"),
  CONSTRAINT "unique_user_slug" UNIQUE ("user_id", "slug")
);
CREATE TABLE IF NOT EXISTS "bookmark_tag" (
  "id" INTEGER PRIMARY KEY,
  "user_id" INTEGER NOT NULL REFERENCES "user" ON DELETE RESTRICT ON UPDATE RESTRICT,
  "tag" VARCHAR NOT NULL,
  "bookmark_id" INTEGER NOT NULL REFERENCES "bookmark" ON DELETE RESTRICT ON UPDATE RESTRICT,
  "seq" INTEGER NOT NULL,
  CONSTRAINT "unique_user_tag_bookmark_id" UNIQUE ("user_id", "tag", "bookmark_id"),
  CONSTRAINT "unique_user_bookmark_id_tag_seq" UNIQUE ("user_id", "bookmark_id", "tag", "seq")
);
CREATE TABLE IF NOT EXISTS "note" (
  "id" INTEGER PRIMARY KEY,
  "user_id" INTEGER NOT NULL REFERENCES "user" ON DELETE RESTRICT ON UPDATE RESTRICT,
  "slug" VARCHAR NOT NULL DEFAULT (Lower(Hex(Randomblob(10)))),
  "length" INTEGER NOT NULL,
  "title" VARCHAR NOT NULL,
  "text" VARCHAR NOT NULL,
  "is_markdown" BOOLEAN NOT NULL,
  "shared" BOOLEAN NOT NULL DEFAULT false,
  "created" TIMESTAMP NOT NULL,
  "updated" TIMESTAMP NOT NULL
);
CREATE INDEX idx_bookmark_time ON bookmark (user_id, time DESC);
CREATE INDEX idx_bookmark_tag_bookmark_id ON bookmark_tag (bookmark_id, id, tag, seq);
CREATE INDEX idx_note_user_created ON note (user_id, created DESC);
```
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1995/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1532000914 | I_kwDOBm6k_c5bUHqS | 1990 | Suggestion: Highlight error messages ('These facets timed out') | pax 116795 | open | 0 | 0 | 2023-01-13T09:40:58Z | 2023-01-13T09:40:58Z | NONE | I had trouble figuring out why faceting didn't work in some instances; it took a while before I noticed the "These facets timed out" notice. It might help if that were highlighted - or given a fading highlight, if a permanent one would be too visually disturbing. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1990/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1509783085 | I_kwDOBm6k_c5Z_XYt | 1969 | sql-formatter javascript is no longer working with CloudFlare Rocket Loader | fgregg 536941 | open | 0 | 0 | 2022-12-23T21:14:06Z | 2023-01-10T01:56:33Z | CONTRIBUTOR | This is probably not a bug with datasette, but I thought you might want to know, @simonw. I noticed today that my CloudFlare-proxied datasette instance lost the "Format SQL" option. I'm pretty sure it was there last week. In the CloudFlare settings, if I turn off Rocket Loader, I get the "Format SQL" option back. Rocket Loader works by asynchronously loading the javascript, so maybe there was a recent change that doesn't play well with the async loading? I'm up to date with https://github.com/simonw/datasette/commit/e03aed00026cc2e59c09ca41f69a247e1a85cc89 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1969/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1524431805 | I_kwDODEm0Qs5a3Pu9 | 72 | Import thread, including self- and others' replies | mcint 601708 | open | 0 | 0 | 2023-01-08T09:51:06Z | 2023-01-08T09:51:06Z | NONE | statuses-lookup, home-timeline, mentions (only for auth'ed user) don't cover this.
twitter-to-sqlite focuses on archiving users, but does not easily support archiving conversations or community activity. For reference, this is implemented in twarc, using a search, optionally recursively. Other research suggests that this formerly, or currently, requires a search query, use of undocumented |
twitter-to-sqlite 206156866 | issue | { "url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/72/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1516815571 | I_kwDOBm6k_c5aaMTT | 1975 | _col=id can cause id column to export twice in CSV export | simonw 9599 | open | 0 | 0 | 2023-01-03T00:25:15Z | 2023-01-03T00:25:21Z | OWNER |
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1975/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1515186569 | I_kwDOBm6k_c5aT-mJ | 1972 | Fix Sphinx warning about extlink extension | simonw 9599 | closed | 0 | 0 | 2022-12-31T19:12:04Z | 2022-12-31T19:13:26Z | 2022-12-31T19:13:26Z | OWNER |
Originally posted by @simonw in https://github.com/simonw/datasette/issues/1971#issuecomment-1368266904 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1972/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1505411725 | I_kwDODFdgUs5ZusKN | 78 | self-hosted or corp GitHub Enterprise | ebdavison 549431 | open | 0 | 0 | 2022-12-20T22:51:45Z | 2022-12-20T22:51:45Z | NONE | We use GitHub Enterprise at work and I would like to use this tool to pull info from that site rather than the public github.com instance. Is there an option for this? If not, can one be added for a custom repo URL? |
github-to-sqlite 207052882 | issue | { "url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/78/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1504352503 | I_kwDOBm6k_c5Zqpj3 | 1968 | Allow to hide some queries in metadata.yml | CharlesNepote 562352 | open | 0 | 0 | 2022-12-20T10:45:41Z | 2022-12-20T10:45:41Z | NONE | By default all queries are displayed. But there are many cases where it would be interesting to hide the queries by default:
* the website is targeting non-tech people
* the query is very long (eg.)
* reading the query is not important for the users, they only want to see the result

Of course, the user still could have the option to see the query. It could be an option in the metadata file, as sketched after the list below:
The priority could be:
* no option in the metadata and nothing in the URL: query displayed
* hide_sql in the metadata and nothing in the URL: query displayed as asked in the metadata
* hide_sql in the metadata and &_hide_sql= in the URL: query as asked in the URL

See also: #1824 |
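A sketch of how the proposed option might look in metadata.yml - `hide_sql` here is the hypothetical name from the priority list above, not an existing setting:

```yaml
databases:
  mydatabase:
    queries:
      very_long_query:
        sql: select * from some_table
        hide_sql: true
```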
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1968/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1468495358 | I_kwDOBm6k_c5Xh3X- | 1910 | Check incoming column types on various write APIs | simonw 9599 | open | 0 | Datasette 1.0a-next 8755003 | 0 | 2022-11-29T18:09:10Z | 2022-12-13T05:29:09Z | OWNER |
Originally posted by @simonw in https://github.com/simonw/datasette/issues/1863#issuecomment-1331089156 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1910/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
|||||||
1447465004 | I_kwDOBm6k_c5WRpAs | 1889 | Ability to create new tokens via the API | simonw 9599 | open | 0 | Datasette 1.0a-next 8755003 | 0 | 2022-11-14T06:21:36Z | 2022-12-13T05:29:08Z | OWNER | Refs:
- #1850

Initially I decided that the API shouldn't be able to create new tokens at all - I don't like the idea of an API token holder creating themselves additional tokens. Then I realized that two of the API features are specifically more useful if you can generate fresh tokens via the API:
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1889/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
|||||||
1216436131 | I_kwDOBm6k_c5IgVej | 1721 | Implement plugin hooks: `register_table_extras`, `register_row_extras`, `register_query_extras` | simonw 9599 | open | 0 | Datasette 1.0a-next 8755003 | 0 | 2022-04-26T20:21:49Z | 2022-12-13T05:29:07Z | OWNER | Designed in: - #1720 Part of: - #262 - #1709 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1721/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
|||||||
1493306655 | I_kwDOBm6k_c5ZAg0f | 1945 | `view-instance` should not be checked for /-/actor.json | simonw 9599 | closed | 0 | Datasette 1.0a2 8711695 | 0 | 2022-12-13T04:01:46Z | 2022-12-13T04:11:56Z | 2022-12-13T04:11:56Z | OWNER | Spotted this while testing:
|
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1945/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
1487764628 | I_kwDOCGYnMM5YrXyU | 518 | flake8 ValueError: Error code '#' supplied to 'extend-ignore' option... | simonw 9599 | closed | 0 | 0 | 2022-12-10T01:30:24Z | 2022-12-10T01:36:46Z | 2022-12-10T01:36:46Z | OWNER |
https://github.com/simonw/sqlite-utils/actions/runs/3662011265/jobs/6190770361 I think from this: https://github.com/simonw/sqlite-utils/blob/e660635cea6c32f4022818380b1e1ee88e7c93a6/setup.cfg#L1-L3 |
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/518/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1485017981 | I_kwDODEpn8M5Yg5N9 | 2 | table identifications has no column named previous_observation_taxon | heaversm 520541 | open | 0 | 0 | 2022-12-08T16:47:17Z | 2022-12-08T16:47:17Z | NONE | Installed successfully with pip and ran
|
inaturalist-to-sqlite 206202864 | issue | { "url": "https://api.github.com/repos/dogsheep/inaturalist-to-sqlite/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1479920517 | I_kwDOBm6k_c5YNcuF | 1934 | Return number of ignored/replaced items from /-/insert | simonw 9599 | open | 0 | Datasette 1.0 3268330 | 0 | 2022-12-06T19:01:58Z | 2022-12-06T19:02:03Z | OWNER | Idea from here: - https://github.com/simonw/sqlite-utils/issues/516 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1934/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
|||||||
1469821027 | I_kwDOBm6k_c5Xm7Bj | 1921 | Document methods to get canned queries | eyeseast 25778 | open | 0 | 0 | 2022-11-30T15:26:33Z | 2022-11-30T23:34:21Z | CONTRIBUTOR | Two methods will get canned queries for a Datasette instance:
|
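Presumably (an assumption based on reading datasette/app.py, not confirmed documentation) the two methods are `get_canned_queries()` and `get_canned_query()`:

```python
# Hypothetical usage sketch - both are awaitable and take an actor
queries = await datasette.get_canned_queries("fixtures", actor=None)
query = await datasette.get_canned_query("fixtures", "my_query", actor=None)
```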
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1921/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1469796454 | I_kwDOBm6k_c5Xm1Bm | 1920 | Document Datasette.metadata() method | eyeseast 25778 | open | 0 | 0 | 2022-11-30T15:10:36Z | 2022-11-30T15:10:36Z | CONTRIBUTOR | Code is here: https://github.com/simonw/datasette/blob/main/datasette/app.py#L503 This will be the official way to access metadata from plugins. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1920/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1456013930 | I_kwDOBm6k_c5WyQJq | 1906 | Extract publish Heroku support to a plugin | simonw 9599 | open | 0 | Datasette 1.0 3268330 | 0 | 2022-11-19T00:02:51Z | 2022-11-19T00:03:10Z | OWNER |
Originally posted by @simonw in https://github.com/simonw/datasette/issues/1905#issuecomment-1320678715 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1906/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
|||||||
1455932972 | I_kwDOBm6k_c5Wx8Ys | 1904 | Datasette Lite tests failing due to httpx upgrade | simonw 9599 | closed | 0 | Datasette 1.0a0 8658075 | 0 | 2022-11-18T22:49:31Z | 2022-11-18T22:57:48Z | 2022-11-18T22:52:22Z | OWNER | Same problem as this one: - https://github.com/simonw/datasette-lite/issues/56 Caused this failure: https://github.com/simonw/datasette/actions/runs/3500765964 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1904/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | |||||
1450796965 | I_kwDOBm6k_c5WeWel | 1894 | Initialize CodeMirror during DOMContentLoaded instead of onload | bgrins 95570 | closed | 0 | 0 | 2022-11-16T03:52:19Z | 2022-11-18T07:29:02Z | 2022-11-18T07:29:02Z | CONTRIBUTOR | As per https://github.com/simonw/datasette/pull/1893/files#r1023248927 this should prevent a flash of the plain textarea before it is replaced by CodeMirror. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1894/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1454532488 | I_kwDOBm6k_c5WsmeI | 1902 | Document {% block crumbs %} for plugin authors | simonw 9599 | open | 0 | Datasette 1.0 3268330 | 0 | 2022-11-18T06:16:30Z | 2022-11-18T06:16:39Z | OWNER |
Originally posted by @simonw in https://github.com/simonw/datasette/issues/1901#issuecomment-1319588163 I should document this. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1902/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
|||||||
1453134846 | I_kwDOCGYnMM5WnRP- | 513 | Add or document streamlined workflow for importing Datasette csv / json exports | henry501 19328961 | open | 0 | 0 | 2022-11-17T10:54:47Z | 2022-11-17T10:54:47Z | NONE | I'm working on some small front-end enhancements to the laion-aesthetic-datasette project, and I wanted to partially populate a database directly using exports from the existing Datasette instance instead of downloading the parquet files and creating my own multi-GB database. There have been a number of small issues that are certainly related to my relative lack of familiarity with the toolkit, but that are still surprising. For example: a CSV export of the images table (http://laion-aesthetic.datasette.io/laion-aesthetic-6pls.csv?sql=select+rowid%2C+url%2C+text%2C+domain_id%2C+width%2C+height%2C+similarity%2C+punsafe%2C+pwatermark%2C+aesthetic%2C+hash%2C+index_level_0+from+images+order+by+random%28%29+limit+100) has nested single quotes, double quotes, and commas that aren't handled by rows_from_file. Similarly, the json output has to be manually transformed to add the column names and remove extraneous information before sqlite_utils can import it. I was able to work through these issues, but as an enhancement it would be really helpful to create or document a clear workflow that avoids the friction of this data transformation. |
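For the JSON case, the manual transformation described above is roughly this (a sketch assuming Datasette's pre-1.0 default JSON shape with separate "columns" and "rows" keys, and hypothetical file and table names):

```python
import json

import sqlite_utils

data = json.load(open("images_export.json"))
# Re-attach the column names to each row of values
rows = [dict(zip(data["columns"], row)) for row in data["rows"]]
sqlite_utils.Database("local.db")["images"].insert_all(rows)
```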
sqlite-utils 140912432 | issue | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/513/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1452360613 | I_kwDOBm6k_c5WkUOl | 1895 | Avoid using host name when building absolute URLs? | hubgit 14294 | open | 0 | 0 | 2022-11-16T22:21:27Z | 2022-11-16T22:21:27Z | NONE | When deploying Datasette to Cloud Run and rewriting certain routes from a Firebase app to the Cloud Run service, some of the URLs in the page start with

I guess this is because a) the custom domain of the Firebase app isn't being passed through in the

Would it be possible to not use the host name when building the absolute URLs, i.e. only include the path in the URL? |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1895/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1446657889 | I_kwDOBm6k_c5WOj9h | 1885 | Integrate inside GUI app (tkinter) | dmalves 5115787 | open | 0 | 0 | 2022-11-13T00:10:43Z | 2022-11-13T00:11:09Z | NONE | Hi, I'd like to integrate Datasette inside a tkinter app. The app should be able to start and stop the Datasette server. How could I integrate Datasette inside my app so it can do that? |
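One possible approach - a sketch, not an official pattern: serve Datasette's ASGI app with uvicorn on a daemon thread so the tkinter mainloop stays responsive (`data.db` is a hypothetical database file):

```python
import threading

import uvicorn
from datasette.app import Datasette

ds = Datasette(["data.db"])
server = uvicorn.Server(uvicorn.Config(ds.app(), host="127.0.0.1", port=8001))
threading.Thread(target=server.run, daemon=True).start()

# Later, from a tkinter callback, ask the server to shut down:
server.should_exit = True
```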
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1885/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1424980545 | I_kwDOBm6k_c5U73pB | 1861 | request.headers.get("Content-Type") fails | simonw 9599 | open | 0 | 0 | 2022-10-27T03:39:12Z | 2022-10-27T03:39:12Z | OWNER | Turns out this is case-sensitive, needs to be:
That's not great usability. It should be case insensitive. |
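The workaround in the meantime is presumably to use the lowercase form, since ASGI transports header names lowercased:

```python
content_type = request.headers.get("content-type")
```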
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1861/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1410548368 | I_kwDODFdgUs5UE0KQ | 77 | Feature: Support GitHub discussions | frosencrantz 631242 | open | 0 | 0 | 2022-10-16T16:53:38Z | 2022-10-16T16:53:38Z | CONTRIBUTOR | Hi @simonw I've been a happy user of this tool. Thank you for writing it and sharing it. I wanted to suggest a feature request to support Discussions. For example the VisiData project has discussions https://github.com/saulpw/visidata/discussions , and it would be useful if there was a way to pull that data into the database. However, I'm not offering a pull request. |
github-to-sqlite 207052882 | issue | { "url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/77/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1406860394 | I_kwDOBm6k_c5T2vxq | 1841 | Drop format_bytes for Jinja filesizeformat filter | simonw 9599 | open | 0 | 0 | 2022-10-12T22:06:34Z | 2022-10-12T22:06:34Z | OWNER | Turns out this isn't necessary: https://github.com/simonw/datasette/blob/5aa359b86907d11b3ee601510775a85a90224da8/datasette/utils/__init__.py#L849-L858

I can use this instead: https://jinja.palletsprojects.com/en/3.1.x/templates/#jinja-filters.filesizeformat |
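A quick sanity check of the built-in filter (decimal units by default; pass `binary=True` for 1024-based units):

```python
import jinja2

env = jinja2.Environment()
print(env.from_string("{{ n | filesizeformat }}").render(n=110592))
# 110.6 kB
```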
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1841/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1400083043 | I_kwDOBm6k_c5Tc5Jj | 1834 | inspect data is not used for caching database hash | fgregg 536941 | closed | 0 | 0 | 2022-10-06T17:52:01Z | 2022-10-06T20:06:21Z | 2022-10-06T20:06:08Z | CONTRIBUTOR | When databases are loaded, there is nothing preventing the rehashing of the database for immutable databases. What I might expect is that relevant values of

With data that is many gigs large, this is a significant start-up time. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1834/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
completed | ||||||
1399933513 | I_kwDOBm6k_c5TcUpJ | 1833 | Ability to submit long queries by POST | simonw 9599 | open | 0 | 0 | 2022-10-06T16:03:26Z | 2022-10-06T16:18:00Z | OWNER | Datasette doesn't limit URL lengths but some common web proxies do - the one in front of Google Cloud Run for example limits to 8KB total for incoming request headers: https://cloud.google.com/load-balancing/docs/quotas#https-lb-header-limits This means longer SQL queries can break! Need an optional mechanism for submitting queries by POST instead. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1833/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1396977994 | I_kwDOBm6k_c5TRDFK | 1830 | Add documentation for writing tests with signed actor cookies | simonw 9599 | open | 0 | 0 | 2022-10-04T23:51:26Z | 2022-10-04T23:51:26Z | OWNER | I use this pattern in a lot of plugin tests, e.g. https://github.com/simonw/datasette-edit-templates/blob/087f6a6cabc20020f2b0524f11aa3a7836320848/tests/test_edit_templates.py#L55-L58
|
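The pattern looks roughly like this (a sketch assuming the standard `ds_actor` cookie and the `datasette.sign()` API):

```python
# Sign an actor dictionary and send it as the ds_actor cookie
actor_cookie = datasette.sign({"a": {"id": "user"}}, "actor")
response = await datasette.client.get(
    "/-/actor.json", cookies={"ds_actor": actor_cookie}
)
```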
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1830/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1378636455 | I_kwDOBm6k_c5SLFKn | 1815 | `datasette publish provider .` to publish whole directory, similar to configuration directory mode | simonw 9599 | open | 0 | 0 | 2022-09-19T23:28:59Z | 2022-09-19T23:29:11Z | OWNER |
Originally posted by @simonw in https://github.com/simonw/datasette-publish-fly/issues/23#issuecomment-1251673489 |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1815/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1375792876 | I_kwDOBm6k_c5SAO7s | 1811 | Drop-down menu with "REGEXP" choice | CharlesNepote 562352 | open | 0 | 0 | 2022-09-16T11:06:18Z | 2022-09-16T15:30:31Z | NONE | The drop-down menu below could add a "REGEXP" choice when the REGEXP SQLite extension is installed and in use.

Not sure. Close the issue if you don't find it relevant. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1811/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1365741480 | I_kwDOBm6k_c5RZ4-o | 1806 | UX to recover from Error 500: "You can only execute one statement at a time." | jieter 1470389 | open | 0 | 0 | 2022-09-08T08:01:27Z | 2022-09-08T08:01:37Z | NONE | When using the Custom SQL query view, accidentally adding a semicolon in the middle of my query makes datasette error with:
The error view doesn't contain the query textarea anymore, so it provides no easy way to recover from the error. It would be nice if I could change the query and submit it again. |
datasette 107914493 | issue | { "url": "https://api.github.com/repos/simonw/datasette/issues/1806/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
||||||||
1363244199 | I_kwDODFdgUs5RQXSn | 75 | Fetch repos doesn't support organisations | OverkillGuy 2757699 | open | 0 | 0 | 2022-09-06T12:55:06Z | 2022-09-06T12:55:06Z | NONE | Say I want to get all my GitHub org's repos info, for data analysis. Not just the public repos, but also the private/internal repos. The endpoints are different for organisations, and this tool doesn't take that into account: https://github.com/dogsheep/github-to-sqlite/blob/ace13ec3d98090d99bd71871c286a4a612c96a50/github_to_sqlite/utils.py#L453 https://github.com/dogsheep/github-to-sqlite/blob/ace13ec3d98090d99bd71871c286a4a612c96a50/github_to_sqlite/utils.py#L455 The endpoint for organisation repos is instead (source):
Let's add support for scraping organisation repos. |
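For reference, GitHub's documented endpoint for organisation repositories is `/orgs/{org}/repos` rather than `/users/{username}/repos`. A rough sketch of a paginated fetch (standard library only; `fetch_org_repos` is a hypothetical function name):

```python
import json
import urllib.request


def fetch_org_repos(org, token):
    # Organisation repos (including private ones, given a suitably
    # scoped token) live at /orgs/{org}/repos
    url = "https://api.github.com/orgs/{}/repos?per_page=100&type=all".format(org)
    repos = []
    while url:
        request = urllib.request.Request(
            url, headers={"Authorization": "token {}".format(token)}
        )
        with urllib.request.urlopen(request) as response:
            repos.extend(json.load(response))
            # Follow the rel="next" pagination link, if any
            url = None
            for part in response.headers.get("Link", "").split(","):
                if 'rel="next"' in part:
                    url = part.split(";")[0].strip().strip("<>")
    return repos
```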
github-to-sqlite 207052882 | issue | { "url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/75/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT,
   [active_lock_reason] TEXT,
   [performed_via_github_app] TEXT,
   [reactions] TEXT,
   [draft] INTEGER,
   [state_reason] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);