id,node_id,number,title,user,user_label,state,locked,assignee,assignee_label,milestone,milestone_label,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,repo_label,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 2001006157,PR_kwDOCGYnMM5f2OZC,604,Add more STRICT table support,16437338,tkhattra,closed,0,,,,,4,2023-11-19T19:38:53Z,2023-12-08T05:17:20Z,2023-12-08T05:05:27Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/604,"- https://github.com/simonw/sqlite-utils/issues/344#issuecomment-982014776 Make `table.transform()` preserve STRICT mode. ---- :books: Documentation preview :books:: https://sqlite-utils--604.org.readthedocs.build/en/604/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/604/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 959137143,MDU6SXNzdWU5NTkxMzcxNDM=,1415,feature request: document minimum permissions for service account for cloudrun,536941,fgregg,open,0,,,,,4,2021-08-03T13:48:43Z,2023-11-05T16:46:59Z,,CONTRIBUTOR,,"Thanks again for such a powerful project. For deploying to cloudrun from github actions, I'd like to create a service account with minimal permissions. It would be great to document the minimum permissions that need to be set in the IAM. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1415/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1901768721,PR_kwDOBm6k_c5anSg5,2191,"Move `permissions`, `allow` blocks, canned queries and more out of `metadata.yaml` and into `datasette.yaml`",15178711,asg017,closed,0,,,,,4,2023-09-18T21:21:16Z,2023-10-12T16:16:38Z,2023-10-12T16:16:38Z,CONTRIBUTOR,simonw/datasette/pulls/2191,"The PR moves the following fields from `metadata.yaml` to `datasette.yaml`: ``` permissions allow allow_sql queries extra_css_urls extra_js_urls ``` This is a significant breaking change that users will need to upgrade their `metadata.yaml` files for. But the format/locations are similar to the previous version, so it shouldn't be too difficult to upgrade. One note: I'm still working on the Configuration docs, specifically the ""reference"" section. Though it's pretty small, the rest is ready to review.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2191/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1891212159,PR_kwDOBm6k_c5aD33C,2183,`datasette.yaml` plugin support,15178711,asg017,closed,0,,,,,4,2023-09-11T20:26:04Z,2023-09-13T21:06:25Z,2023-09-13T21:06:25Z,CONTRIBUTOR,simonw/datasette/pulls/2183,"Part of #2093 In #2149 , we ported over `""settings.json""` into the new `datasette.yaml` config file, with a top-level `""settings""` key. This PR ports over plugin configuration into top-level `""plugins""` key, as well as nested database/table plugin config. From now on, no plugin-related configuration is allowed in `metadata.yaml`, and must be in `datasette.yaml` in this new format. This is a pretty significant breaking change. Thankfully, you should be able to copy-paste your legacy plugin key/values into the new `datasette.yaml` format. 
An example of what `datasette.yaml` would look like with this new plugin config: ```yaml plugins: datasette-my-plugin: config_key: value databases: fixtures: plugins: datasette-my-plugin: config_key: fixtures-db-value tables: students: plugins: datasette-my-plugin: config_key: fixtures-students-table-value ``` As an additional benefit, this now works with the new `-s` flag: ```bash datasette --memory -s 'plugins.datasette-my-plugin.config_key' new_value ``` Marked as a ""Draft"" right now until I add better documentation. We also should have a plan for the next alpha release to document and publicize this change, especially for plugin authors (since their docs will have to change to say `datasette.yaml` instead of `metadata.yaml`) ---- :books: Documentation preview :books:: https://datasette--2183.org.readthedocs.build/en/2183/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2183/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 336464733,MDU6SXNzdWUzMzY0NjQ3MzM=,328,"Installation instructions, including how to use the docker image",9599,simonw,closed,0,,,,,4,2018-06-28T03:59:33Z,2023-09-05T14:10:39Z,2018-06-28T04:02:10Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/328/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1870672704,PR_kwDOBm6k_c5Y-7Em,2162,"Add new `--internal internal.db` option, deprecate legacy `_internal` database",15178711,asg017,closed,0,,,,,4,2023-08-29T00:05:07Z,2023-08-29T03:24:23Z,2023-08-29T03:24:23Z,CONTRIBUTOR,simonw/datasette/pulls/2162,"refs #2157 This PR adds a new `--internal` option to datasette serve. If provided, it is the path to a persistent internal database that Datasette core and Datasette plugins can use to store data, as discussed in the proposal issue. This PR also removes and deprecates the previous in-memory `_internal` database. Those tables now appear in the `internal` database, with `core_` prefixes (ex `tables` in `_internal` is now `core_tables` in `internal`). ## A note on the new `core_` tables However, one important note about those new `core_` tables: If a `--internal` DB is passed in, that means those `core_` tables will persist across multiple Datasette instances. This wasn't the case before, since `_internal` was always an in-memory database created from scratch. I tried to put those `core_` tables as `TEMP` tables - after all, there's always only 1 `internal` DB connection at a time, so I figured it would work. But, since we use the `Database()` wrapper for the internal DB, it has two separate connections: a default read-only connection and a write connection that is created when a write operation occurs. Which meant the `TEMP` tables would be created by the write connection, but not available in the read-only connection. So I had a brilliant idea: Attach an in-memory named database with `cache=shared`, and create those tables there! ```sql ATTACH DATABASE 'file:datasette_internal_core?mode=memory&cache=shared' AS core; ``` We'd run this on both the read-only connection and the write-only connection. That way, those tables would stay in memory, they'd communicate with the `cache=shared` feature, and we'd be good to go. However, I couldn't find an easy way to run an `ATTACH DATABASE` command on the read-only connection. 
Using `Database()` as a wrapper for the internal DB is pretty limiting - it's meant for Datasette ""data"" databases, where we want multiple readers and possibly 1 write connection at a time. But the internal database doesn't really require that kind of support - I think we could get away with a single read/write connection, but it seemed like too big of a rabbithole to go through now. ---- :books: Documentation preview :books:: https://datasette--2162.org.readthedocs.build/en/2162/ ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2162/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1865649347,I_kwDOBm6k_c5vM4zD,2156,datasette -s/--setting option for setting nested configuration options,9599,simonw,open,0,,,,,4,2023-08-24T18:09:27Z,2023-08-28T19:33:05Z,,OWNER,,"> I've been thinking about what it might look like to allow command-line arguments to be used to define _any_ of the configuration options in `datasette.yml`, as alternative and more convenient syntax. > > Here's what I've come up with: > ``` > datasette \ > -s settings.sql_time_limit_ms 1000 \ > -s plugins.datasette-auth-tokens.manage_tokens true \ > -s plugins.datasette-auth-tokens.manage_tokens_database tokens \ > mydatabase.db tokens.db > ``` > Which would be equivalent to `datasette.yml` containing this: > ```yaml > plugins: > datasette-auth-tokens: > manage_tokens: true > manage_tokens_database: tokens > settings: > sql_time_limit_ms: 1000 > ``` More details in https://github.com/simonw/datasette/issues/2143#issuecomment-1690792514 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2156/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1863810783,I_kwDOBm6k_c5vF37f,2150,form label { width: 15% } is a bad default,9599,simonw,closed,0,,,,,4,2023-08-23T18:22:27Z,2023-08-23T18:37:18Z,2023-08-23T18:35:48Z,OWNER,,"See: - https://github.com/simonw/datasette-configure-fts/issues/14 - https://github.com/simonw/datasette-auth-tokens/issues/12",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2150/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1843710170,I_kwDOBm6k_c5t5Mja,2136,Query view shouldn't return `columns`,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,4,2023-08-09T17:23:57Z,2023-08-09T19:03:04Z,2023-08-09T19:03:04Z,OWNER,,"I just noticed that https://latest.datasette.io/fixtures/roadside_attraction_characteristics.json?_labels=on&_size=1 returns: ```json { ""ok"": true, ""next"": ""1"", ""rows"": [ { ""rowid"": 1, ""attraction_id"": { ""value"": 1, ""label"": ""The Mystery Spot"" }, ""characteristic_id"": { ""value"": 2, ""label"": ""Paranormal"" } } ], ""truncated"": false } ``` But https://latest.datasette.io/fixtures.json?sql=select+rowid%2C+attraction_id%2C+characteristic_id+from+roadside_attraction_characteristics+order+by+rowid+limit+1 returns: ```json { ""rows"": [ { ""rowid"": 1, ""attraction_id"": 1, ""characteristic_id"": 2 } ], ""columns"": [ ""rowid"", ""attraction_id"", ""characteristic_id"" ], ""ok"": true, ""truncated"": false } ``` The `columns` key in the query response is inconsistent with the table response.",107914493,datasette,issue,,,"{""url"": 
""https://api.github.com/repos/simonw/datasette/issues/2136/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1822937426,I_kwDOBm6k_c5sp9FS,2111,Implement new /content.json?sql=...,9599,simonw,closed,0,,,9700784,Datasette 1.0a3,4,2023-07-26T18:22:39Z,2023-08-08T02:00:37Z,2023-08-08T02:00:22Z,OWNER,,"This will be the base that the remaining work builds on top of. Refs: - #2109 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2111/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1816917522,PR_kwDOCGYnMM5WJ6Jm,573,feat: Implement a prepare_connection plugin hook,15178711,asg017,closed,0,,,,,4,2023-07-22T22:48:44Z,2023-07-22T22:59:09Z,2023-07-22T22:59:09Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/573,"Just like the [Datasette prepare_connection hook](https://docs.datasette.io/en/stable/plugin_hooks.html#prepare-connection-conn-database-datasette), this PR adds a similar hook for the `sqlite-utils` plugin system. The sole argument is `conn`, since I don't believe a `database` or `datasette` argument would be relevant here. I want to do this so I can release `sqlite-utils` plugins for my [SQLite extensions](https://github.com/asg017/sqlite-ecosystem), similar to the Datasette plugins I've release for them. An example plugin: https://gist.github.com/asg017/d7cdf0d56e2be87efda28cebee27fa3c ```bash $ sqlite-utils install https://gist.github.com/asg017/d7cdf0d56e2be87efda28cebee27fa3c/archive/5f5ad549a40860787629c69ca120a08c32519e99.zip $ sqlite-utils memory 'select hello(""alex"") as response' [{""response"": ""Hello, alex!""}] ``` Refs: - #574 ---- :books: Documentation preview :books:: https://sqlite-utils--573.org.readthedocs.build/en/573/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/573/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1355148385,I_kwDOBm6k_c5Qxexh,1796,Research an upgrade to CodeMirror 6,9599,simonw,closed,0,,,,,4,2022-08-30T04:27:46Z,2023-07-03T04:58:21Z,2023-07-03T04:58:21Z,OWNER,,"There are still a bunch of bugs in CodeMirror 5 that affect various mobile browsers - see Datasette Discord report here: https://discord.com/channels/823971286308356157/823971286941302908/1013878624992108645 https://user-images.githubusercontent.com/9599/187349269-7b7c0c8c-3894-4810-82f0-de7c1eb940b3.mp4 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1796/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1575131737,I_kwDOCGYnMM5d4ppZ,525,Repeated calls to `Table.convert()` fail,167893,mcarpenter,closed,0,,,,,4,2023-02-07T22:40:47Z,2023-05-08T21:59:41Z,2023-05-08T21:54:02Z,CONTRIBUTOR,,"## Summary When using the API, repeated calls to `Table.convert()` do not work correctly since all conversions quietly use the callable (function, lambda) from the first call to `convert()` only. Subsequent invocations with different callables use the callable from the first invocation only. 
## Example ```python from sqlite_utils import Database db = Database(memory=True) table = db['table'] col = 'x' table.insert_all([{col: 1}]) print(table.get(1)) table.convert(col, lambda x: x*2) print(table.get(1)) def zeroize(x): return 0 #zeroize = lambda x: 0 #zeroize.__name__ = 'zeroize' table.convert(col, zeroize) print(table.get(1)) ``` Output: ``` {'x': 1} {'x': 2} {'x': 4} ``` Expected: ``` {'x': 1} {'x': 2} {'x': 0} ``` ## Explanation This is some relevant [documentation](https://github.com/simonw/sqlite-utils/blob/1491b66dd7439dd87cd5cd4c4684f46eb3c5751b/docs/python-api.rst#registering-custom-sql-functions:~:text=By%20default%20registering%20a%20function%20with%20the%20same%20name%20and%20number%20of%20arguments%20will%20have%20no%20effect). * `Table.convert()` takes a `Callable` to perform data conversion on a column * The `Callable` is passed to `Database.register_function()` * `Database.register_function()` uses the callable's `__name__` attribute for registration * (Aside: all lambdas have a `__name__` of `<lambda>`: I thought this was the problem, and it was close, but not quite) * However `convert()` first wraps the callable by local function [`convert_value()`](https://github.com/simonw/sqlite-utils/blob/fc221f9b62ed8624b1d2098e564f525c84497969/sqlite_utils/db.py#L2661) * Consequently `register_function()` sees name `convert_value` for all invocations from `convert()` * `register_function()` silently ignores registrations using the same name, retaining only the first such registration There's a mismatch between the comments and the code: https://github.com/simonw/sqlite-utils/blob/fc221f9b62ed8624b1d2098e564f525c84497969/sqlite_utils/db.py#L404 but actually the existing function is returned/used instead (as the ""registering custom sql functions"" doc I linked above says too). Seems like this can be rectified to match the comment? ## Suggested fix I think there are four things: 1. The call to `register_function()` from `convert()` should have an explicit `name=` parameter (to continue using `convert_value()` and the progress bar). 2. For functions, this name can be the real function name. (I understand the sqlite api needs a name, and it's nice if those are recognizable names where possible). For lambdas would `'lambda-{uuid}'` or similar be acceptable? 3. `register_function()` really should throw an error on repeated attempts to register a duplicate (function, arity)-pair. 4. A test? I haven't looked at the test framework here but seems this should be testable. 
## See also - #458 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/525/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1699184583,I_kwDOCGYnMM5lR3_H,540,sphinx.builders.linkcheck build error,9599,simonw,closed,0,,,,,4,2023-05-07T18:37:09Z,2023-05-08T04:56:13Z,2023-05-07T18:42:36Z,OWNER,,"https://readthedocs.org/projects/sqlite-utils/builds/20512693/ ``` Running Sphinx v6.2.1 Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/registry.py"", line 442, in load_extension mod = import_module(extname) File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/importlib/__init__.py"", line 127, in import_module return _bootstrap._gcd_import(name[level:], package, level) File ""<frozen importlib._bootstrap>"", line 1014, in _gcd_import File ""<frozen importlib._bootstrap>"", line 991, in _find_and_load File ""<frozen importlib._bootstrap>"", line 975, in _find_and_load_unlocked File ""<frozen importlib._bootstrap>"", line 671, in _load_unlocked File ""<frozen importlib._bootstrap_external>"", line 783, in exec_module File ""<frozen importlib._bootstrap>"", line 219, in _call_with_frames_removed File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/builders/linkcheck.py"", line 20, in <module> from requests import Response File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/requests/__init__.py"", line 43, in <module> import urllib3 File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/urllib3/__init__.py"", line 38, in <module> raise ImportError( ImportError: urllib3 v2.0 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with OpenSSL 1.0.2n 7 Dec 2017. See: https://github.com/urllib3/urllib3/issues/2168 The above exception was the direct cause of the following exception: Traceback (most recent call last): File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/cmd/build.py"", line 280, in build_main app = Sphinx(args.sourcedir, args.confdir, args.outputdir, File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/application.py"", line 225, in __init__ self.setup_extension(extension) File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/application.py"", line 404, in setup_extension self.registry.load_extension(self, extname) File ""/home/docs/checkouts/readthedocs.org/user_builds/sqlite-utils/envs/latest/lib/python3.8/site-packages/sphinx/registry.py"", line 445, in load_extension raise ExtensionError(__('Could not import extension %s') % extname, sphinx.errors.ExtensionError: Could not import extension sphinx.builders.linkcheck (exception: urllib3 v2.0 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with OpenSSL 1.0.2n 7 Dec 2017. See: https://github.com/urllib3/urllib3/issues/2168) Extension error: Could not import extension sphinx.builders.linkcheck (exception: urllib3 v2.0 only supports OpenSSL 1.1.1+, currently the 'ssl' module is compiled with OpenSSL 1.0.2n 7 Dec 2017. 
See: https://github.com/urllib3/urllib3/issues/2168) ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/540/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1699174055,I_kwDOCGYnMM5lR1an,539,"`--raw-lines` option, like `--raw` for multiple lines",9599,simonw,closed,0,,,,,4,2023-05-07T18:07:46Z,2023-05-07T18:43:24Z,2023-05-07T18:26:18Z,OWNER,,I wanted to output newline-separated output of the first column of every row in the results - like `--row` but for more than one line.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/539/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1393202060,I_kwDOCGYnMM5TCpOM,496,devrel/python api: Pylance type hinting,7908073,chapmanjacobd,open,0,,,,,4,2022-10-01T03:03:34Z,2023-05-03T05:53:27Z,,CONTRIBUTOR,,"Pylance is generally pretty good at figuring out stuff but `sqlite-utils` has some quirks which make type hinting kinda useless. Maybe you don't care but I thought I would bring it to your attention. For example: ``` db[""subs""].insert_all(subs, pk=""index"") ``` ``` Cannot access member ""insert_all"" for type ""View"" Member ""insert_all"" is unknown ``` `insert_all` and all the other methods show up as type issues because the program can't know whether something is a View or a Table. Fair enough. But that basically throws all type checking out the window. `pk=""index""` also shows up as a type issue: ``` Argument of type ""Literal['index']"" cannot be assigned to parameter ""pk"" of type ""Default"" in function ""insert_all"" ""Literal['index']"" is incompatible with ""Default"" ``` I think this is because DEFAULT is an empty class? maybe a few small changes could be made to make the library more type-friendly The interim solution is of course to turn off type hints completely for the line ``` db[""subs""].insert_all(subs, pk=""index"") # type: ignore ``` ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/496/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1686033652,I_kwDOBm6k_c5kftT0,2065,Datasette cannot be installed with Rye,9599,simonw,closed,0,,,,,4,2023-04-27T03:35:42Z,2023-04-27T05:09:36Z,2023-04-27T05:09:36Z,OWNER,,"https://github.com/mitsuhiko/rye I tried this: rye install datasette But now: ``` % ~/.rye/shims/datasette Traceback (most recent call last): File ""/Users/simon/.rye/shims/datasette"", line 5, in <module> from datasette.cli import cli File ""/Users/simon/.rye/tools/datasette/lib/python3.11/site-packages/datasette/cli.py"", line 17, in <module> from .app import ( File ""/Users/simon/.rye/tools/datasette/lib/python3.11/site-packages/datasette/app.py"", line 14, in <module> import pkg_resources ModuleNotFoundError: No module named 'pkg_resources' ``` I think that's because `setuptools` is not included in Rye.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2065/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1373210675,I_kwDODD6af85R2Ygz,13,fails before generating views. 
ERR: table sqlite_master may not be modified,116795,pax,open,0,,,,,4,2022-09-14T15:41:50Z,2023-04-11T03:46:17Z,,NONE,,"generates checkins.db but seems to fail before generating views note: it worked on an Ubuntu WSL but fails on macOS 12.5.1 later edit: I suspect this is a problem with my local set-up, `dogsheep-beta index` also throws the same error full error: Importing 2591 checkins [###################################-] 98% 00:00:00 Traceback (most recent call last): File ""/Users/pax/devbox/envAll/bin/swarm-to-sqlite"", line 8, in <module> sys.exit(cli()) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/swarm_to_sqlite/cli.py"", line 77, in cli ensure_foreign_keys(db) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/swarm_to_sqlite/utils.py"", line 145, in ensure_foreign_keys db[fk.table].add_foreign_key(fk.column, fk.other_table, fk.other_column) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/sqlite_utils/db.py"", line 2123, in add_foreign_key self.db.add_foreign_keys([(self.name, column, other_table, other_column)]) File ""/Users/pax/devbox/envAll/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1086, in add_foreign_keys cursor.execute( sqlite3.OperationalError: table sqlite_master may not be modified",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1620164673,PR_kwDOCGYnMM5L08O8,531,Add paths for homebrew on Apple silicon,25778,eyeseast,closed,0,,,,,4,2023-03-11T22:27:52Z,2023-04-09T01:49:44Z,2023-04-09T01:49:43Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/531,"This also passes in the extension path when specified in GIS methods. Wherever we know an extension path, we use `db.init_spatialite(find_spatialite() or load_extension)`. ---- :books: Documentation preview :books:: https://sqlite-utils--531.org.readthedocs.build/en/531/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/531/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1646734246,I_kwDOBm6k_c5iJyum,2049,Custom SQL queries should use new JSON ?_extra= format,9599,simonw,open,0,,,8755003,Datasette 1.0a-next,4,2023-03-30T00:42:53Z,2023-04-05T23:29:27Z,,OWNER,,"Related: - #262 I've made the change to the table view, now I need the new format to work for arbitrary SQL queries too. 
Note that this incorporates both arbitrary SQL queries and canned queries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2049/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1590183272,I_kwDOBm6k_c5eyEVo,2027,"How to redirect from ""/"" to a specific db/table",1350673,dmick,open,0,,,,,4,2023-02-18T03:14:01Z,2023-03-08T04:42:22Z,,NONE,,"Using nginx to redirect public IP to the local uvicorn server as 'normal'. I can't figure out how to redirect such that '/' results in accessing the one db/table I want to serve; redirecting / to /db/table breaks some of the CSS; fooling with base_url doesn't seem to help. Can someone explain this, if it's possible?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2027/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1515815014,I_kwDOBm6k_c5aWYBm,1973,render_cell plugin hook's row object is not a sqlite.Row,193185,cldellow,open,0,,,,,4,2023-01-01T20:27:46Z,2023-01-29T00:40:31Z,,CONTRIBUTOR,,"From https://docs.datasette.io/en/stable/plugin_hooks.html#render-cell-row-value-column-table-database-datasette: > row - sqlite.Row > The SQLite row object that the value being rendered is part of This appears to actually be a [CustomRow](https://github.com/simonw/datasette/blob/f0fadc28ddb9f82e5cc1ecaa51e8a342eb6dc528/datasette/utils/__init__.py#L773-L789), but I think that's unrelated to my issue. I have a table: ```sql CREATE TABLE IF NOT EXISTS ""dss_job_stats""( job_id integer not null references dss_job(id) on delete cascade, host text not null, // other columns elided as irrelevant primary key (job_id, host) ); ``` On datasette 0.63.2, the `render_cell` hook receives a `row` value that looks like: ``` CustomRow([('job_id', {'value': 2, 'label': '2'}), ('host', 'cldellow.com')]) ``` I expected the `job_id` value to be `2`, but it's actually `{'value': 2, 'label': '2'}`. I can work around this, but was wondering if this was intended behaviour?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1973/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1553615704,I_kwDOBm6k_c5cmktY,2001,Datasette is not compatible with SQLite's strict quoting compilation option,406380,gwk,open,0,,,,,4,2023-01-23T19:10:07Z,2023-01-25T04:59:58Z,,NONE,,"I have linked Python3.11 on macOS against recent SQLite that was compiled using `-DSQLITE_DQS=0`. This option disables interpretation of double-quoted identifiers as string literals, described in the SQLite docs as a ""MySQL 3.x misfeature"". See https://www.sqlite.org/quirks.html#dblquote for background. Datasette uses the double-quote syntax in a number of key places, and is thus completely broken in this environment. My experience was to `pip install datasette`, then run `datasette serve -I my-data.db`. When I visit `http://127.0.0.1:8001` I get a 500 response. The error: `sqlite3.OperationalError: no such column: geometry_columns` The responsible SQL: `'select 1 from sqlite_master where tbl_name = ""geometry_columns""'` I then installed datasette from GitHub master in development mode and changed the offending SQL to use correct quotes: `""select 1 from sqlite_master where tbl_name = 'geometry_columns'""`. 
With this change, I get a little further, but have the same problem with the first table name in my database (in my case, ""Meta""): ``` OperationalError: no such column: Meta Traceback (most recent call last): File ""/Users/gwk/external/datasette/datasette/app.py"", line 1522, in route_path response = await view(request, send) ^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/views/base.py"", line 151, in view return await self.dispatch_request(request) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/views/base.py"", line 105, in dispatch_request response = await handler(request) ^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/views/index.py"", line 70, in get ""fts_table"": await db.fts_table(table), ^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 363, in fts_table return await self.execute_fn(lambda conn: detect_fts(conn, table)) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 213, in execute_fn return await asyncio.get_event_loop().run_in_executor( ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/usr/local/py/Python.framework/Versions/3.11/lib/python3.11/concurrent/futures/thread.py"", line 58, in run result = self.fn(*self.args, **self.kwargs) ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 211, in in_thread return fn(conn) ^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/database.py"", line 363, in <lambda> return await self.execute_fn(lambda conn: detect_fts(conn, table)) ^^^^^^^^^^^^^^^^^^^^^^^ File ""/Users/gwk/external/datasette/datasette/utils/__init__.py"", line 588, in detect_fts rows = conn.execute(detect_fts_sql(table)).fetchall() ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ sqlite3.OperationalError: no such column: Meta INFO: 127.0.0.1:50258 - ""GET / HTTP/1.1"" 500 Internal Server Error ``` I will try to continue playing with this, but I also hope that the datasette developers will enable this mode in a test environment as I am unlikely to be able to exercise all of the SQL in the codebase, or make a pull request very soon. Note that the DQS setting compile-time option can be overridden at runtime with calls to the C API: ``` sqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DDL, 0, (void*)0); sqlite3_db_config(db, SQLITE_DBCONFIG_DQS_DML, 0, (void*)0); ``` As far as I can tell, `sqlite3_db_config` is not exposed in Python, but perhaps we could figure out how to invoke it using `ctypes`. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2001/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1529707837,I_kwDOBm6k_c5bLX09,1988,Reconsider pattern where plugins could break existing template context,9599,simonw,open,0,,,3268330,Datasette 1.0,4,2023-01-11T21:13:43Z,2023-01-11T21:25:05Z,,OWNER,,"> I hadn't run into an issue with plugins like `datasette-template-sql` interfering with the existing context for other features before! Definitely not a good thing. 
_Originally posted by @simonw in https://github.com/simonw/datasette-write/issues/6#issuecomment-1379490596_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1988/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1529452371,I_kwDOBm6k_c5bKZdT,1987,installpython3.com is now a spam website,9599,simonw,closed,0,,,,,4,2023-01-11T17:55:12Z,2023-01-11T18:29:26Z,2023-01-11T18:29:25Z,OWNER,,"Need to stop linking to it from the docs. I'll link to https://www.python.org/about/gettingstarted/ instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1987/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 806849424,MDU6SXNzdWU4MDY4NDk0MjQ=,1221,Support SSL/TLS directly,9599,simonw,closed,0,,,,,4,2021-02-12T00:18:29Z,2022-12-18T02:39:04Z,2021-02-12T00:52:18Z,OWNER,,This should be pretty easy because Uvicorn supports them already. Need a good mechanism for testing it - https://pypi.org/project/trustme/ looks ideal.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1221/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1384549993,I_kwDOBm6k_c5Sho5p,1818,Setting to turn off table row counts entirely,9599,simonw,open,0,,,,,4,2022-09-24T06:39:22Z,2022-12-11T02:03:09Z,,OWNER,,"There are situations - such as loading SQLite files remotely using HTTP range headers - where counting all of the rows in a table should be avoided entirely. > > Also, this chunked inefficiency means that I have to hack the URL to not load tables of a database as it seems to try to load the whole database when I click on a database. > > I bet that's because Datasette tries to show a count of all of the rows in each table when it shows the list on that page, which triggers a full table scan. > > Would be great to have a setting that turns that feature off, which could then be exposed as a query string option for Datasette Lite. 
_Originally posted by @simonw in https://github.com/simonw/datasette-lite/issues/49#issuecomment-1256880715_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1818/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1198822563,I_kwDOBm6k_c5HdJSj,1706,"[feature] immutable mode for a directory, not just individual sqlite file",9020979,hydrosquall,open,0,,,,,4,2022-04-10T00:50:57Z,2022-12-09T19:11:40Z,,CONTRIBUTOR,,"## Motivation - I have a directory of sqlite databases - I'd like to use immutable mode when opening them for better performance [docs](https://docs.datasette.io/en/0.54/performance.html#immutable-mode) - Currently using this flag throws the following error IsADirectoryError: [Errno 21] Is a directory: '/name-of-directory' ## Proposal Immutable flag works for both single files and directories datasette -i /folder-of-sqlite-files",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1706/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1473659191,I_kwDOBm6k_c5X1kE3,1929,Incorrect link from the API explorer to the JSON API documentation,3556,davidbgk,closed,0,,,,,4,2022-12-03T02:08:58Z,2022-12-06T19:36:23Z,2022-12-06T19:34:20Z,CONTRIBUTOR,,"I installed `datasette==1.0a1`. When I go to http://127.0.0.1:8001/-/api I have a link: `Use this tool to try out the [Datasette API](https://docs.datasette.io/en/1.0a1/json_api.html).` but that documentation page does not exist. I'm not sure where it has to be fixed, should it link to the stable page https://docs.datasette.io/en/stable/json_api.html , the latest one https://docs.datasette.io/en/latest/json_api.html#the-json-write-api or would it be more appropriate to deploy documentation for the `1.0a1` version?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1929/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1479914599,I_kwDOCGYnMM5YNbRn,516,Feature request: output number of ignored/replaced rows for insert command,9599,simonw,open,0,,,,,4,2022-12-06T18:59:21Z,2022-12-06T19:08:14Z,,OWNER,,"https://hachyderm.io/@briandorsey/109468185742876820 > I'm fiddling with piping json to `insert -ignore` I'd love to see the count of records inserted & ignored, but didn't see a way to do that in the help/docs. 
> > Example: `xh ""https://hachyderm.io/api/v1/timelines/tag/rust?max_id=109443380308326328"" | sqlite-utils insert aoc.db aoc - --pk=id --ignore`",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/516/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1470509936,I_kwDOBm6k_c5XpjNw,1924,Docs for replace:true and ignore:true options for insert API,9599,simonw,closed,0,,,7867486,Datasette 1.0a1,4,2022-12-01T01:33:25Z,2022-12-01T18:15:15Z,2022-12-01T02:08:02Z,OWNER,,Equivalent to https://sqlite-utils.datasette.io/en/stable/cli.html#insert-replacing-data,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1924/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1450303205,I_kwDOBm6k_c5Wcd7l,1891,1.0a0 release notes,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,4,2022-11-15T19:58:20Z,2022-11-29T19:23:41Z,2022-11-29T19:23:41Z,OWNER,,"This release will mainly help preview the new Datasette write API: - #1850",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1891/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1425029275,I_kwDOBm6k_c5U8Dib,1864,Delete a single record from an existing table,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,4,2022-10-27T04:53:22Z,2022-11-29T18:54:04Z,2022-11-29T18:54:04Z,OWNER,,"API design: ``` POST /db/table/row-pks/-/delete Or... DELETE /db/table/row-pks/-/delete ``` I'm just going to do `POST` for the moment, like I did here: - #1874 Permission: `delete-row` Still needed: - [ ] Tests for rowid tables - [ ] Tests for compound primary keys",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1864/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1456012874,I_kwDOBm6k_c5WyP5K,1905,`publish heroku` failing due to old Python version,9599,simonw,closed,0,,,,,4,2022-11-19T00:01:45Z,2022-11-19T01:12:05Z,2022-11-19T00:52:29Z,OWNER,,"Reported on Discord: https://discord.com/channels/823971286308356157/823971286941302908/1042814317118115901 ``` -----> Building on the Heroku-22 stack -----> Determining which buildpack to use for this app -----> Python app detected -----> Using Python version specified in runtime.txt ! Requested runtime 'python-3.8.10' is not available for this stack (heroku-22). ! For supported versions, see: https://devcenter.heroku.com/articles/python-support ! Push rejected, failed to compile Python app. ! Push failed ▸ Build failed ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1905/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1452364777,I_kwDOBm6k_c5WkVPp,1896,Extract logic for resolving a URL to a database / table / row,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,4,2022-11-16T22:25:20Z,2022-11-18T22:57:47Z,2022-11-18T22:56:55Z,OWNER,,"> In trying to write this I realize that there's a lot of duplicated code with delete row, specifically around resolving the incoming URL into a row (or a database or a table). 
> Since this is so common, I think it's worth extracting the logic out first. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1863#issuecomment-1317755263_ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1896/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1452495049,I_kwDOBm6k_c5Wk1DJ,1899,Clicking within the CodeMirror area below the SQL (i.e. when there's only a single line) doesn't cause the editor to get focused ,95570,bgrins,closed,0,,,,,4,2022-11-17T00:29:52Z,2022-11-18T07:28:28Z,2022-11-18T07:20:53Z,CONTRIBUTOR,,"After the upgrade to 6 (#1893) I noticed this. I think it's because we're doing overflow:hidden to accomplish the CSS resizer. When there's a single line of SQL there's a gap below that line where clicking doesn't do anything. It should focus at the end of the line.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1899/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1433576351,I_kwDOBm6k_c5VcqOf,1880,Datasette with many and large databases > Memory use,525934,amitkoth,open,0,,,,,4,2022-11-02T18:10:27Z,2022-11-16T17:50:29Z,,NONE,,"> Datasette maintains an in-memory SQLite database with details of the databases, tables and columns for all of the attached databases. The above is from the docs ^. There are two problems here - the number of datasette ""instances"" in a single server/VM and the size of the database itself. We want the **opposite** of in-memory, including what happens on SQLite - documented in https://www.sqlite.org/inmemorydb.html From the context in https://github.com/simonw/datasette/issues/1150 - does it mean datasette is memory-bound to the size of the dataset - which might be a deal-breaker for many large-scale use cases? In an extreme case - let's say a single server had 100 SQLite databases, which would enable 100 ""instances"" of datasette to run, one per client (e.g. in a SaaS multi-tenant environment). How could we achieve all these goals: 1. Allow any _one_ of these 100 databases to grow to say 2Tb in size 2. Have one datasette instance, which connects to 1 of the 100 instances, based on incoming credentials/tenant ID 3. Minimize memory use entirely - both by datasette and SQLite, such that almost all operations are executed in real-time on-disk with little to no memory consumption per-tenant, or per-database. Any ideas appreciated - we're looking to use this in a SaaS type of setting - many instances, single server. @simonw great work on datasette, in general! 
Possibly related to https://github.com/simonw/datasette/issues/1480 but we don't want to use any kind of serverless infra - this is a long-running VM/server.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1880/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1429030341,I_kwDOBm6k_c5VLUXF,1874,API to drop a table,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,4,2022-10-30T21:55:11Z,2022-11-15T19:59:53Z,2022-11-14T05:45:06Z,OWNER,,"`POST /db/table/-/drop` Require `drop-table` permission.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1874/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1423364990,I_kwDOBm6k_c5U1tN-,1858,`max_signed_tokens_ttl` setting for a maximum duration on API tokens,9599,simonw,closed,0,,,8658075,Datasette 1.0a0,4,2022-10-26T03:05:53Z,2022-11-15T19:58:52Z,2022-10-27T03:15:05Z,OWNER,,"It's currently possible to use `/-/create-token` to create a token that lasts forever. Some administrators may wish to have a maximum expiry instead. I should support that with a setting.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1858/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 473083260,MDU6SXNzdWU0NzMwODMyNjA=,50,"""Too many SQL variables"" on large inserts",9599,simonw,closed,0,,,,,4,2019-07-25T21:43:31Z,2022-11-04T14:38:36Z,2019-07-28T11:59:33Z,OWNER,,"Reported here: https://github.com/dogsheep/healthkit-to-sqlite/issues/9 It looks like there's a default limit of 999 variables - we need to be smart about that, maybe dynamically lower the batch size based on the number of columns.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/50/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 802513359,MDU6SXNzdWU4MDI1MTMzNTk=,1217,Possible to deploy as a python app (for Rstudio connect server)?,6165713,plpxsk,open,0,,,,,4,2021-02-05T22:21:24Z,2022-11-04T11:37:52Z,,NONE,,"Is it possible to deploy a `datasette` application as a python web app? In my enterprise, I have the option to deploy python apps via [Rstudio Connect](https://github.com/rstudio/rsconnect-python), and I would like to publish a `datasette` dashboard for sharing. 
I welcome any pointers to converting `datasette serve` into a python app that can be run as something like `python datasette.py --my_data.db`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1217/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1342430983,I_kwDOBm6k_c5QA98H,1786,Adjust height of textarea for no JS case,9599,simonw,closed,0,,,,,4,2022-08-18T01:15:15Z,2022-10-27T21:50:12Z,2022-08-18T16:06:09Z,OWNER,,"Datasette Lite: https://lite.datasette.io/?sql=https://gist.githubusercontent.com/simonw/1f8a91123ccefd8844187225b1832d7a/raw/5069075b86aa79358fbab3d4482d1d269077d632/recipes.sql#/data?sql=select+id%2C+name%2C+ingredients%2C+%28%0A++select+json_group_array%28value%29+from+json_each%28ingredients%29%0A++where+value+in+%28select+value+from+json_each%28%3Ap0%29%29%0A%29+as+matching_ingredients%0Afrom+recipes%0Awhere+json_array_length%28matching_ingredients%29+%3E+0%0Aorder+by+json_array_length%28matching_ingredients%29+desc&p0=%5B%22sugar%22%2C+%22cheese%22%5D ![46F8101E-8CE3-4F61-B200-F865E6B5DBCC](https://user-images.githubusercontent.com/9599/185270723-f55513b0-b561-434d-9d7c-4fe5be9756e0.jpeg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1786/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1413641049,I_kwDOCGYnMM5UQnNZ,501,Tests failing due to updated tabulate library,9599,simonw,closed,0,,,,,4,2022-10-18T18:07:52Z,2022-10-18T18:23:40Z,2022-10-18T18:23:40Z,OWNER,,"Failure here: https://github.com/simonw/sqlite-utils/actions/runs/3275786702/jobs/5391063221 I figured out the problem: ```diff diff --git a/docs/cli-reference.rst b/docs/cli-reference.rst index b88e38a..82b4b6c 100644 --- a/docs/cli-reference.rst +++ b/docs/cli-reference.rst @@ -112,11 +112,15 @@ See :ref:`cli_query`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, - simple, textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, + latex, latex_booktabs, latex_longtable, latex_raw, + mediawiki, mixed_grid, mixed_outline, moinmoin, + orgtbl, outline, pipe, plain, presto, pretty, + psql, rounded_grid, rounded_outline, rst, simple, + simple_grid, simple_outline, textile, tsv, + unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings -r, --raw Raw output, first column of first row @@ -176,11 +180,15 @@ See :ref:`cli_memory`. 
--tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, - simple, textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, + latex, latex_booktabs, latex_longtable, latex_raw, + mediawiki, mixed_grid, mixed_outline, moinmoin, + orgtbl, outline, pipe, plain, presto, pretty, + psql, rounded_grid, rounded_outline, rst, simple, + simple_grid, simple_outline, textile, tsv, + unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings -r, --raw Raw output, first column of first row @@ -401,11 +409,14 @@ See :ref:`cli_search`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --load-extension TEXT Path to SQLite extension, with optional :entrypoint @@ -651,11 +662,14 @@ See :ref:`cli_tables`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --columns Include list of columns for each table @@ -689,11 +703,14 @@ See :ref:`cli_views`. 
--tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --columns Include list of columns for each view @@ -732,11 +749,15 @@ See :ref:`cli_rows`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, - simple, textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, + latex, latex_booktabs, latex_longtable, latex_raw, + mediawiki, mixed_grid, mixed_outline, moinmoin, + orgtbl, outline, pipe, plain, presto, pretty, + psql, rounded_grid, rounded_outline, rst, simple, + simple_grid, simple_outline, textile, tsv, + unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --load-extension TEXT Path to SQLite extension, with optional @@ -768,11 +789,14 @@ See :ref:`cli_triggers`. --tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --load-extension TEXT Path to SQLite extension, with optional :entrypoint @@ -804,11 +828,14 @@ See :ref:`cli_indexes`. 
--tsv Output TSV --no-headers Omit CSV headers -t, --table Output as a formatted table - --fmt TEXT Table format - one of fancy_grid, fancy_outline, - github, grid, html, jira, latex, latex_booktabs, - latex_longtable, latex_raw, mediawiki, moinmoin, - orgtbl, pipe, plain, presto, pretty, psql, rst, simple, - textile, tsv, unsafehtml, youtrack + --fmt TEXT Table format - one of asciidoc, double_grid, + double_outline, fancy_grid, fancy_outline, github, + grid, heavy_grid, heavy_outline, html, jira, latex, + latex_booktabs, latex_longtable, latex_raw, mediawiki, + mixed_grid, mixed_outline, moinmoin, orgtbl, outline, + pipe, plain, presto, pretty, psql, rounded_grid, + rounded_outline, rst, simple, simple_grid, + simple_outline, textile, tsv, unsafehtml, youtrack --json-cols Detect JSON cols and output them as JSON, not escaped strings --load-extension TEXT Path to SQLite extension, with optional :entrypoint diff --git a/docs/cli.rst b/docs/cli.rst index 8bc4176..1d67e88 100644 --- a/docs/cli.rst +++ b/docs/cli.rst @@ -187,10 +187,15 @@ Available ``--fmt`` options are: cog.out(""\n"" + ""\n"".join('- ``{}``'.format(t) for t in tabulate.tabulate_formats) + ""\n\n"") .. ]]] +- ``asciidoc`` +- ``double_grid`` +- ``double_outline`` - ``fancy_grid`` - ``fancy_outline`` - ``github`` - ``grid`` +- ``heavy_grid`` +- ``heavy_outline`` - ``html`` - ``jira`` - ``latex`` @@ -198,15 +203,22 @@ Available ``--fmt`` options are: - ``latex_longtable`` - ``latex_raw`` - ``mediawiki`` +- ``mixed_grid`` +- ``mixed_outline`` - ``moinmoin`` - ``orgtbl`` +- ``outline`` - ``pipe`` - ``plain`` - ``presto`` - ``pretty`` - ``psql`` +- ``rounded_grid`` +- ``rounded_outline`` - ``rst`` - ``simple`` +- ``simple_grid`` +- ``simple_outline`` - ``textile`` - ``tsv`` - ``unsafehtml`` ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/501/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1413610718,I_kwDOCGYnMM5UQfze,500,Turn --flatten into a documented utility function,9599,simonw,closed,0,,,,,4,2022-10-18T17:43:36Z,2022-10-18T18:02:10Z,2022-10-18T18:00:40Z,OWNER,,The `--flatten` implementation isn't currently available to Python code - people have to roll their own implementation. Feedback from a conversation at DjangoCon.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/500/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 703246031,MDU6SXNzdWU3MDMyNDYwMzE=,51,github-to-sqlite should handle rate limits better,9599,simonw,open,0,,,,,4,2020-09-17T04:01:50Z,2022-10-14T16:34:07Z,,MEMBER,,From #50 - right now it will crash with an error of it hits the rate limit. 
Since the rate limit information (including reset time) is available in the headers it could automatically sleep and try again instead.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/51/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 377155320,MDU6SXNzdWUzNzcxNTUzMjA=,370,Integration with JupyterLab,82988,psychemedia,open,0,,,,,4,2018-11-04T13:57:13Z,2022-09-29T08:17:47Z,,CONTRIBUTOR,,"I just watched a demo video for the [JupyterLab Chart Editor](https://www.crowdcast.io/e/introducing-JupyterLab-Chart-Editor/) which wraps the plotly chart editor app in a JupyterLab panel and lets you open a plotly chart JSON file in that editor. Essentially, it pops an HTML app into a panel in JupyterLab, and I think registers the app as a file viewer for a particular file type. (I'm not completely taken by it, tbh, because it means you can do irreproducible things to the chart definition file, but that's another issue). JupyterLab extensions can also open files from a dialogue as the iframe/html previewer shows: https://github.com/timkpaine/jupyterlab_iframe. This made me wonder about what `datasette` integration with JupyterLab might do. For example, by right-clicking on a CSV file (for which there is already a CSV table view) in the file browser, offer a *View / Run as datasette* file viewer option that will: - run the CSV file through `csvs-to-sqlite`; - launch the `datasette` server and display the `datasette` view in a JupyterLab panel. (? Create a new SQLite db for each CSV file and launch each datasette view on a new port? Or have a JupyterLab (session?) SQLite db that stores all `datasette` viewed CSVs and runs on a single port?) As a freebie, the `datasette` API would allow you to run efficient SQL queries against the file eg using using `pandas.read_sql()` queries in a notebook in the same space. Related: - [JupyterLab extensions docs](https://jupyterlab.readthedocs.io/en/stable/user/extensions.html) - a [cookiecutter for wrting JupyterLab extensions using Javascript](https://github.com/jupyterlab/extension-cookiecutter-js) - a [cookiecutter for writing JupyterLab extensions using Typescript](https://github.com/jupyterlab/extension-cookiecutter-ts) - tutorial: [Let’s Make an xkcd JupyterLab Extension](https://jupyterlab.readthedocs.io/en/stable/developer/xkcd_extension_tutorial.html)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/370/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1382457780,I_kwDOCGYnMM5SZqG0,490,Ability to insert multi-line files,6180701,jeqo,closed,0,,,,,4,2022-09-22T13:29:22Z,2022-09-26T18:24:44Z,2022-09-23T16:37:58Z,NONE,,"I was looking into how to parse application log files that contain multiline text (e.g. Java stack traces) into sqlite. I can see that at the moment `--lines` helps, but falls short when processing multi-line texts. I wonder if this functionality would be useful for sqlite-utils. 
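For illustration, the grouping itself could be as simple as treating every line that matches a start-of-record pattern as a new row and appending everything else to the previous one. This is only a sketch - the function name and the timestamp pattern are made up, nothing like this exists in sqlite-utils today:

```python
import re

def group_multiline(lines, start_pattern=r"^\d{4}-\d{2}-\d{2}"):
    # Hypothetical sketch: a line matching start_pattern begins a new
    # record; any other line (e.g. a Java stack trace frame) is
    # appended to the record that is currently open.
    start = re.compile(start_pattern)
    record = None
    for line in lines:
        line = line.rstrip("\n")
        if record is None or start.match(line):
            if record is not None:
                yield {"line": record}
            record = line
        else:
            record += "\n" + line
    if record is not None:
        yield {"line": record}
```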
A similar approach to Elastic logstash/filebeat can be adopted: https://www.elastic.co/guide/en/beats/filebeat/current/multiline-examples.html Potential changes: - add a `--multiline` option - additional properties for - multiline-pattern (regex expression) - multiline-negate: true/false - multiline-what: previous or next Or if this is achievable in a different way, please share. Thanks!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/490/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1374626873,I_kwDOBm6k_c5R7yQ5,1810,Featured table(s) on the homepage,9599,simonw,open,0,,,,,4,2022-09-15T14:30:49Z,2022-09-15T15:51:25Z,,OWNER,,"Many Datasette instances mainly exist to serve a single table - for example: - https://global-power-plants.datasettes.com/global-power-plants/global-power-plants - https://laion-aesthetic.datasette.io/laion-aesthetic-6pls/images It would be neat if the / homepage of those instances could be configured to highlight that specific table. Or maybe more than one?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1810/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1359557737,I_kwDOBm6k_c5RCTRp,1798,"Parts of YAML file do not work when db name is ""off""",562352,CharlesNepote,closed,0,,,,,4,2022-09-01T22:10:57Z,2022-09-02T00:02:53Z,2022-09-01T23:56:33Z,NONE,,"I guess this issue is not very important and probably rare. To reproduce: * create and populate a db named `off.db` * in the yaml file, add any kind of information below `databases:\n off:` * the data are not taken into account (because ""off"" is interpreted as ""false"") YAML file: ```yaml title: Some title description_html: |-
This is an experiment.
databases: off: tables: products_from_owners: title: products_from_owners* description_html: |-
Description
``` The result for http://xxxx.xxx/-/metadata gives: ```json { ""title"": ""Some title"", ""description_html"": ""This is an experiment."", ""databases"": { ""false"": { ""tables"": { ""products_from_owners"": { ""title"": ""products_from_owners*"", ""description_html"": ""Description
"" } } } } } ``` => see the `""false""` instead of `""off""`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1798/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1355433619,PR_kwDOCGYnMM4-B7Mc,480,search_sql add include_rank option,7908073,chapmanjacobd,closed,0,,,,,4,2022-08-30T09:10:29Z,2022-08-31T03:40:35Z,2022-08-31T03:40:35Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/480,"I haven't tested this yet but wanted to get a heads-up whether this kind of change would be useful or if I should just duplicate the function and tweak it within my code ---- :books: Documentation preview :books:: https://sqlite-utils--480.org.readthedocs.build/en/480/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/480/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 855476501,MDU6SXNzdWU4NTU0NzY1MDE=,1298,improve table horizontal scroll experience,192568,mroswell,open,0,,,,,4,2021-04-12T01:55:16Z,2022-08-30T21:11:49Z,,CONTRIBUTOR,,"Wide tables aren't a huge problem if you know to click and drag right. But it's not at all obvious to do that. (it also tends to blue-select any content as it's dragging.) Depending on column widths, public users might entirely miss all the columns to the right. There is a scrollbar at the bottom of the table, but I'm displaying ALL my records because it's the only way for datasette-vega to make accurate charts. So that bottom scrollbar is likely to be missed. I wonder if some sort of javascript-y mouseover to an arrow might help, similar to those seen in image carousels. Ah: here's a perfect example: 1. Visit http://google.com 2. Search for: animals endangered 3. Note the 'g-right-button' (in the code) that looks like a right-facing caret in a circle. 4. Click on that and the carousel scrolls right (and 'g-left-button' appears on the left). Might be tricky to do that on a table, rather than a one-row carousel, but it's worth experimenting with. Another option is just to put the scrollbars at the top of the table, too. Meantime, I'm trying to build a button like the ""View/hide all columns on https://salaries.news.baltimoresun.com/salaries-be494cf/2019+Maryland+state+salaries Might be nice to have that available by default, with settings in the metadata showing which are on by default. (I saw some other closed issues related to horizontal scrolling, and admit I don't entirely understand them. For instance, the animated gif at https://github.com/simonw/datasette/issues/998#issuecomment-714117534 confuses me. ) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1298/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1326087800,PR_kwDOCGYnMM48hI-_,460,Cross-link CLI to Python docs,9599,simonw,closed,0,,,,,4,2022-08-02T16:18:28Z,2022-08-18T21:58:10Z,2022-08-18T21:58:07Z,OWNER,simonw/sqlite-utils/pulls/460,"Work in progress, partly to test the ReadTheDocs preview link action. 
Refs: - #426 ---- :books: Documentation preview :books:: https://readthedocs-preview--460.org.readthedocs.build/en/460/ ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/460/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1318907685,I_kwDOBm6k_c5OnO8l,1773,500 error if sorted by a column not in the ?_col= list,9599,simonw,closed,0,,,8303187,Datasette 0.62,4,2022-07-27T01:20:27Z,2022-08-14T16:06:25Z,2022-08-14T15:44:05Z,OWNER,,"For example: https://latest.datasette.io/fixtures/sortable?_sort_desc=sortable&_col=sortable_with_nulls That's `?_sort_desc=sortable&_col=sortable_with_nulls` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1773/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 779156520,MDU6SXNzdWU3NzkxNTY1MjA=,1175,Use structlog for logging,9599,simonw,open,0,,,,,4,2021-01-05T15:11:36Z,2022-07-26T12:52:10Z,,OWNER,,To solve #241 JSON logging.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1175/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1279863844,I_kwDOCGYnMM5MSSwk,449,Utilities for duplicating tables and creating a table with the results of a query,1690072,davidleejy,closed,0,,,,,4,2022-06-22T09:41:43Z,2022-07-15T21:46:13Z,2022-07-15T21:21:36Z,CONTRIBUTOR,,"is there a duplicate table functionality? Otherwise, I'd be happy to submit a PR. In sqlite3 it would look like: ```python import sqlite3 as sl con = sl.connect('prompt-tune.db') def db_duplicate_table(table_name, table_name_new, con=con): # Duplicates table `table_name` to a new table `table_name_new`. try: cur = con.cursor() cur.execute(f""""""CREATE TABLE {table_name_new} AS SELECT * FROM {table_name}"""""") except Exception as e: print(e) finally: cur.close() db_duplicate_table('orig_table', 'new_table') ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/449/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 727848625,MDU6SXNzdWU3Mjc4NDg2MjU=,12,"Some workout columns should be float, not text",9599,simonw,open,0,,,,,4,2020-10-23T02:47:02Z,2022-06-23T04:35:02Z,,MEMBER,,"Columns `duration`, `totalDistance` and `totalEnergyBurned` should be converted to float. https://github.com/dogsheep/healthkit-to-sqlite/blob/71e36e1cf034b96de2a8e6652265d782d3fdf63b/healthkit_to_sqlite/utils.py#L50-L57",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1257724585,I_kwDOCGYnMM5K91qp,441,Combining `rows_where()` and `search()` to limit which rows are searched,1448859,betatim,closed,0,,,,,4,2022-06-02T06:01:55Z,2022-06-14T21:57:57Z,2022-06-14T21:54:38Z,NONE,,"What is the right way to limit a full text search query to some rows of a table? For example, I have a table that contains the following columns: `title`, `content`, `owner` (each row represents a document). The `owner` column is a username. 
It feels right to store all documents in one table, instead of having one table per owner. In particular because I'd like to full text search all documents, only documents owned by one user and documents owned by a set of users. I tried to combine `.rows_where(""owner = ?"", ""1234"")` and `.search()` from the `Table` class but I don't think that is meant to work. I discovered `.search_sql()` as a way to generate the FTS SQL statement. By hand I can edit it to add a `AND [original].[owner] = :owner` to the `where` clause. This seems to do what I want. My two questions: 1. is adding a `AND ...` to the `where` clause actually the right thing to do or should I be doing something else (my SQL skills are low)? 2. is there a built-in to sqlite-utils way to achieve this? Right now I am thinking I will make my own version of `search_sql()` that generates a query that contains an additional `owner = :owner` for my particular use-case. Bonus question: is this generally useful/something to add to sqlite-utils or too niche?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/441/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1237586379,I_kwDOBm6k_c5JxBHL,1742,?_trace=1 fails with datasette-geojson for some reason,9599,simonw,open,0,,,,,4,2022-05-16T19:06:05Z,2022-05-16T19:42:13Z,,OWNER,,view-source:https://calands.datasettes.com/calands/CPAD_2020a_SuperUnits.geojson?_sort=id&id__exact=4&_labels=on&_trace=1 is showing me a blank page.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1742/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1223459734,I_kwDOBm6k_c5I7IOW,1737,Automated test for Pyodide compatibility,9599,simonw,closed,0,,,,,4,2022-05-02T23:24:25Z,2022-05-02T23:40:50Z,2022-05-02T23:40:50Z,OWNER,,"Refs: - #1733 Need something in the test suite such that if Datasette breaks against Pyodide in the future we hear about it. I'm thinking this is an opportunity to use [shot-scraper javascript](https://github.com/simonw/shot-scraper#scraping-pages-using-javascript).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1737/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1065432388,I_kwDOBm6k_c4_gTVE,1534,Maybe return JSON from HTML pages if `Accept: application/json` is sent,9599,simonw,closed,0,,,,,4,2021-11-28T20:48:09Z,2022-04-27T21:59:34Z,2022-02-02T23:39:33Z,OWNER,,"Relates to #1533 - and to the work I've been doing on the https://github.com/simonw/datasette-table Web Component. It would be useful to support users pasting in a URL to a Datasette table or query without first having to add the `.json` extension themselves - since then other systems could hit that URL with `Accept: application/json` to get back the JSON representation without first needing to read the `Link: ` header from #1533 to figure out what the URL to that JSON is. (There is weird logic deep in Datasette that says that you add `.json` to the path UNLESS the table name itself ends with `.json`, in which case you add `?_format=json` - this is super-confusing). 
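For what it's worth, the negotiation side of this could be tiny. A sketch only, not Datasette's actual implementation - `wants_json()` is a made-up helper:

```python
def wants_json(headers):
    # Hypothetical helper: True if the client sent an Accept header
    # asking for JSON. A real implementation would parse q-values
    # rather than substring-matching.
    return "application/json" in headers.get("accept", "")
```

A view could then call this once and return the `.json` representation instead of rendering HTML.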
[Update: I removed that confusing feature here: [https://simonwillison.net/2022/Mar/19/weeknotes/](https://simonwillison.net/2022/Mar/19/weeknotes/)]",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1534/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 340396247,MDU6SXNzdWUzNDAzOTYyNDc=,339,Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way,12617395,bsilverm,closed,0,,,,,4,2018-07-11T20:38:06Z,2022-03-21T22:22:40Z,2022-03-21T22:22:34Z,NONE,,"Is it possible to configure the sql_time_limit_ms beyond 60 seconds? It seems queries are still timing out at 60 seconds when sql_time_limit_ms is set to 180000. We have a very large data set and often encounter timeouts when testing new queries from the datasette UI. We are optimizing our database as much as we can, but still may require more than 60 seconds for complex queries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/339/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1065429936,I_kwDOBm6k_c4_gSuw,1532,Use datasette-table Web Component to guide the design of the JSON API for 1.0,9599,simonw,open,0,,,3268330,Datasette 1.0,4,2021-11-28T20:37:18Z,2022-03-16T20:13:34Z,,OWNER,,"I realized that one of the reasons I'm having trouble committing to nailing down the JSON API for 1.0 is that I don't use it much myself - I use the `?_shape=array` one quite often, but I don't have any projects that are using the default, more fully-featured API. As an experiment I built a Web Component for embedding Datasette tables on pages - https://github.com/simonw/datasette-table - and I think it's actually going to be a really useful tool for helping me dog food the v1.0 API design.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1532/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1131295060,I_kwDOBm6k_c5DbjFU,1634,Update Dockerfile generated by `datasette publish`,9599,simonw,open,0,,,3268330,Datasette 1.0,4,2022-02-11T00:07:26Z,2022-03-11T17:38:08Z,,OWNER,,"The generated `Dockerfile` currently looks something like this: ```Dockerfile FROM python:3.8 COPY . 
/app WORKDIR /app ENV DATASETTE_SECRET 'edab49cbc5d5f6f33238f54852037e3fee710821960b73edd2ce743454182ae2' RUN pip install -U datasette datasette-auth-passwords datasette-tiddlywiki datasette-graphql RUN datasette inspect fixtures.db other.db --inspect-file inspect-data.json ENV PORT 8080 EXPOSE 8080 CMD datasette serve --host 0.0.0.0 -i fixtures.db -i other.db --cors --inspect-file inspect-data.json --metadata metadata.json --create --port $PORT /data/*.db ``` This is still on Python 3.8, and it generates a pretty large image compared to the `Dockerfile` used for https://hub.docker.com/datasetteproject/datasette - https://github.com/simonw/datasette/blob/0.60.2/Dockerfile Here's the code that generates it: https://github.com/simonw/datasette/blob/7d24fd405f3c60e4c852c5d746c91aa2ba23cf5b/datasette/utils/__init__.py#L389-L400",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1634/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 2, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1154399841,I_kwDOBm6k_c5Ezr5h,1645,"Sensible `cache-control` headers for static assets, including those served by plugins",697092,curiousleo,open,0,,,3268330,Datasette 1.0,4,2022-02-28T18:12:03Z,2022-03-08T02:59:29Z,,NONE,,"## What I'm seeing With `default_cache_ttl = 86400`, I see the following: A table view returns `Cache-control: max-age=86400`: ![Screenshot_20220228_190000](https://user-images.githubusercontent.com/697092/156034352-4d64683e-39c8-49af-81df-0217a5957bbd.png) A static asset returns no `Cache-control` header: ![Screenshot_20220228_185933](https://user-images.githubusercontent.com/697092/156034363-d0b03cc2-5889-4ed2-b601-8c1846b8469a.png) ## What I expected to see I expected the static asset to return a `Cache-control` header indicating that this response can be cached. ## Why this matters I'm productionising a Datasette deployment right now and was looking into putting it behind a Varnish instance. I was surprised to see requests for static assets being served from Datasette rather than Varnish, this is what led me to look more closely at the response headers. While Datasette serves those static assets pretty quickly, I don't see why Datasette should serve them. By their nature, static assets like images and JS files are very cacheable, so it should be easy to serve them from a cache like Varnish. (Note that Varnish can easily be configured to override this header, enabling caching for static assets. But it would be better if this override was not necessary.) ## Discussion It seems clear to me that serving static assets without a `Cache-control` header is not ideal. I see two options here: A. Static assets use the same logic as table / SQL views to set the `Cache-control` header based on `default_cache_ttl`. B. 
An additional setting for static assets is introduced (`default_static_cache_ttl`, say).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1645/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1063388037,I_kwDOCGYnMM4_YgOF,343,Provide function to generate hash_id from specified columns,82988,psychemedia,closed,0,,,,,4,2021-11-25T10:12:12Z,2022-03-02T04:25:25Z,2022-03-02T04:25:25Z,NONE,,"Hi I note that you define `_hash()` to create a `hash_id` from non-id column values in a table [here](https://github.com/simonw/sqlite-utils/blob/8f386a0d300d1b1c76132bb75972b755049fb742/sqlite_utils/db.py#L2996). It would be useful to be able to call a complementary function to generate a corresponding `_id` from a subset of specified columns when adding items to another table, eg to support the creation of foreign keys. Or is there a better pattern for doing that?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/343/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 677272618,MDU6SXNzdWU2NzcyNzI2MTg=,928,Test failures caused by failed attempts to mock pip,9599,simonw,closed,0,,,,,4,2020-08-11T23:53:18Z,2022-02-23T16:19:47Z,2020-08-12T00:07:49Z,OWNER,,"Errors like this one: https://github.com/simonw/datasette/pull/927/checks?check_run_id=973559696 ``` 2020-08-11T23:36:39.8801334Z =================================== FAILURES =================================== 2020-08-11T23:36:39.8802411Z _________________________________ test_install _________________________________ 2020-08-11T23:36:39.8803242Z 2020-08-11T23:36:39.8804935Z thing = 2020-08-11T23:36:39.8806663Z comp = 'main', import_path = 'pip._internal.cli.main' 2020-08-11T23:36:39.8807696Z 2020-08-11T23:36:39.8808728Z def _dot_lookup(thing, comp, import_path): 2020-08-11T23:36:39.8810573Z try: 2020-08-11T23:36:39.8812262Z > return getattr(thing, comp) 2020-08-11T23:36:39.8817136Z E AttributeError: module 'pip._internal.cli' has no attribute 'main' 2020-08-11T23:36:39.8843043Z 2020-08-11T23:36:39.8855951Z /opt/hostedtoolcache/Python/3.8.5/x64/lib/python3.8/unittest/mock.py:1215: AttributeError 2020-08-11T23:36:39.8873372Z 2020-08-11T23:36:39.8877803Z During handling of the above exception, another exception occurred: 2020-08-11T23:36:39.8906532Z 2020-08-11T23:36:39.8925767Z def get_src_prefix(): 2020-08-11T23:36:39.8928277Z # type: () -> str 2020-08-11T23:36:39.8930068Z if running_under_virtualenv(): 2020-08-11T23:36:39.8949721Z src_prefix = os.path.join(sys.prefix, 'src') 2020-08-11T23:36:39.8951813Z else: 2020-08-11T23:36:39.8969014Z # FIXME: keep src in cwd for now (it is not a temporary folder) 2020-08-11T23:36:39.9012110Z try: 2020-08-11T23:36:39.9013489Z > src_prefix = os.path.join(os.getcwd(), 'src') 2020-08-11T23:36:39.9014538Z E FileNotFoundError: [Errno 2] No such file or directory 2020-08-11T23:36:39.9016122Z 2020-08-11T23:36:39.9017617Z /opt/hostedtoolcache/Python/3.8.5/x64/lib/python3.8/site-packages/pip/_internal/locations.py:50: FileNotFoundError 2020-08-11T23:36:39.9018802Z 2020-08-11T23:36:39.9020070Z During handling of the above exception, another exception occurred: 2020-08-11T23:36:39.9020930Z 2020-08-11T23:36:39.9022275Z args = (), keywargs = {} 2020-08-11T23:36:39.9023183Z 2020-08-11T23:36:39.9024077Z @wraps(func) 
2020-08-11T23:36:39.9024984Z def patched(*args, **keywargs): 2020-08-11T23:36:39.9028770Z > with self.decoration_helper(patched, 2020-08-11T23:36:39.9031861Z args, 2020-08-11T23:36:39.9038358Z keywargs) as (newargs, newkeywargs): 2020-08-11T23:36:39.9039654Z 2020-08-11T23:36:39.9040566Z /opt/hostedtoolcache/Python/3.8.5/x64/lib/python3.8/unittest/mock.py:1322: 2020-08-11T23:36:39.9041492Z _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/928/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 335200136,MDU6SXNzdWUzMzUyMDAxMzY=,327,Explore if SquashFS can be used to shrink size of packaged Docker containers,9599,simonw,open,0,,,,,4,2018-06-24T18:15:16Z,2022-02-17T23:37:24Z,,OWNER,,"Inspired by this article: https://cldellow.com/2018/06/22/sqlite-parquet-vtable.html#sqlite-database-indexed--squashed https://en.wikipedia.org/wiki/SquashFS is ""a compressed read-only file system for Linux"" - which means it could be a really nice fit for Datasette and its read-only SQLite databases. It would be interesting to explore a Dockerfile recipe that used SquashFS to compress the SQLite database file that was bundled up by `datasette package` and friends.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/327/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1126692066,I_kwDOCGYnMM5DJ_Ti,403,Document how to add a primary key to a rowid table using `sqlite-utils transform --pk`,536941,fgregg,closed,0,,,,,4,2022-02-08T01:39:40Z,2022-02-09T04:22:43Z,2022-02-08T19:33:59Z,CONTRIBUTOR,,"*Original title: Add option for adding a new, serial, primary key* sometimes we have tables that don't have primary keys, but ought to have them. we *can* use rowid for that, but it would often be nicer to have an explicit primary key. using the current value of rowid would be fine.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/403/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1072792507,I_kwDOCGYnMM4_8YO7,352,`sqlite-utils insert --extract colname`,9599,simonw,open,0,,,,,4,2021-12-07T00:55:44Z,2022-02-03T22:59:36Z,,OWNER,,"Is there a reason I've not added `--extract` as an option for `sqlite-utils insert` next? 
There's a `extracts=` option for the various `table.insert()` etc methods - last line in this code block: https://github.com/simonw/sqlite-utils/blob/213a0ff177f23a35f3b235386366ff132eb879f1/sqlite_utils/db.py#L2483-L2495",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/352/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1094981339,I_kwDOCGYnMM5BRBbb,363,Better error message if `--convert` code fails to return a dict,9599,simonw,closed,0,,,,,4,2022-01-06T05:26:28Z,2022-02-03T22:52:30Z,2022-02-03T22:51:30Z,OWNER,,"Here's the traceback if your `--convert` function doesn't return a dict right now: ``` % sqlite-utils insert /tmp/all.db blah /tmp/log.log --convert 'all.upper()' --all Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/bin/sqlite-utils"", line 33, in sys.exit(load_entry_point('sqlite-utils', 'console_scripts', 'sqlite-utils')()) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/sqlite-utils-C4Ilevlm/lib/python3.8/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 949, in insert insert_upsert_implementation( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 834, in insert_upsert_implementation db[table].insert_all( File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/db.py"", line 2602, in insert_all first_record = next(records) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/db.py"", line 3044, in fix_square_braces for record in records: File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/cli.py"", line 831, in docs = (decode_base64_values(doc) for doc in docs) File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/utils.py"", line 86, in decode_base64_values to_fix = [ File ""/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/utils.py"", line 89, in if isinstance(doc[k], dict) TypeError: string indices must be integers ``` It would be nicer if that returned a more useful error message. 
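For example, the check could peek at the first record and fail early with a readable message. This is a sketch of the idea only, not the fix that shipped - `ensure_dicts()` is a made-up name:

```python
import itertools
import click

def ensure_dicts(records):
    # Pull the first record off the iterator; if it is not a dict,
    # raise a clear error instead of a TypeError deep in the stack.
    records = iter(records)
    try:
        first = next(records)
    except StopIteration:
        return iter([])
    if not isinstance(first, dict):
        raise click.ClickException(
            "Rows must all be dictionaries - got: {!r}".format(first)
        )
    # Re-attach the consumed record so nothing is lost.
    return itertools.chain([first], records)
```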
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/361#issuecomment-1006295276_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/363/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1065431383,I_kwDOBm6k_c4_gTFX,1533,"Add `Link: rel=""alternate""` header pointing to JSON for a table/query",9599,simonw,closed,0,,,3268330,Datasette 1.0,4,2021-11-28T20:43:25Z,2022-02-02T07:56:51Z,2022-02-02T07:49:33Z,OWNER,,"Originally explored in https://github.com/simonw/datasette-notebook/issues/2#issuecomment-980789406 - I wanted an efficient way to scan a list of URLs and figure out which if any of those corresponded to Datasette tables, canned queries or SQL output that could be represented as a table on a page. It looks like a neat way to do that is with ` Link:` header like this: `Link: http://127.0.0.1:8058/fixtures/compound_three_primary_keys.json; rel=""alternate""; type=""application/datasette+json""` I can put a ` Could add support for `--batch-size` as seen in `insert`/`upsert` too - causing it to break the list up into batches and commit for each one. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/391#issuecomment-1021876055_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/392/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 752966476,MDU6SXNzdWU3NTI5NjY0NzY=,1114,--load-extension=spatialite not working with datasetteproject/datasette docker image,2182,danp,closed,0,,,,,4,2020-11-29T17:35:20Z,2022-01-20T21:29:42Z,2020-11-29T17:37:45Z,CONTRIBUTOR,,"https://github.com/simonw/datasette/commit/6aa5886379dd9017215904fb28567b80018902f9 added the `--load-extension=spatialite` shortcut looking for the extension in these places: https://github.com/simonw/datasette/blob/12877d7a48e2aa28bb5e780f929a218f7265d849/datasette/utils/__init__.py#L56-L60 However, in the datasetteproject/datasette docker image the file is at `/usr/local/lib/mod_spatialite.so`. This results in the example command [here](https://docs.datasette.io/en/stable/installation.html#loading-spatialite) failing: ``` % docker run --rm -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/data.db --load-extension=spatialite Error: Could not find SpatiaLite extension ``` But it does work when given an explicit path: ``` % docker run --rm -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/data.db --load-extension=/usr/local/lib/mod_spatialite.so INFO: Started server process [1] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://0.0.0.0:8001 (Press CTRL+C to quit) ... 
``` Perhaps `SPATIALITE_PATHS` should include `/usr/local/lib/mod_spatialite.so`?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 838382890,MDU6SXNzdWU4MzgzODI4OTA=,1273,Refresh SpatiaLite documentation,9599,simonw,open,0,,,,,4,2021-03-23T06:05:55Z,2022-01-20T21:28:50Z,,OWNER,,https://docs.datasette.io/en/0.55/spatialite.html was written before I had tools like [geojson-to-sqlite](https://datasette.io/tools/geojson-to-sqlite) and [shapefile-to-sqlite](https://datasette.io/tools/shapefile-to-sqlite).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1273/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1102484126,I_kwDOBm6k_c5BtpKe,1595,Release notes for 0.60,9599,simonw,closed,0,,,7571612,Datasette 0.60,4,2022-01-13T22:23:14Z,2022-01-14T01:37:39Z,2022-01-14T01:37:39Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1595/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1099586786,I_kwDOCGYnMM5Bilzi,383,Add documentation page with the output of `--help`,9599,simonw,closed,0,,,,,4,2022-01-11T20:25:58Z,2022-01-11T22:55:05Z,2022-01-11T21:44:05Z,OWNER,,"Can be maintained using `cog` from #373. Similar in purpose to the API reference page, but this is for the CLI.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/383/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1097135860,I_kwDOCGYnMM5BZPb0,374,`--fmt` should imply `-t`,9599,simonw,closed,0,,,7558727,3.21,4,2022-01-09T08:23:07Z,2022-01-10T19:27:26Z,2022-01-09T18:07:59Z,OWNER,,Not sure why I didn't implement this.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/374/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1087919372,I_kwDOBm6k_c5A2FUM,1578,Confirm if documented nginx proxy config works for row pages with escaped characters in their primary key,9599,simonw,open,0,,,,,4,2021-12-23T18:27:59Z,2021-12-24T21:33:19Z,,OWNER,,"Found this while working on https://github.com/simonw/datasette-tiddlywiki Then clicking on `/tiddlywiki/tiddlers/%24%3A%2FDefaultTiddlers` returns a 404.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1578/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 781262510,MDU6SXNzdWU3ODEyNjI1MTA=,1181,"Certain database names results in 404: ""Database not found: None""",1470389,jieter,closed,0,,,6346396,Datasette 0.54,4,2021-01-07T12:01:16Z,2021-12-21T18:25:15Z,2021-01-25T05:13:19Z,NONE,,"I have a file named `test-database (1).sqlite`. 
When requesting the home route `/`, I see datasette is able to read it correctly: However, if I click any of the links, datasette replies with: `Error 404 Database not found: None` It seems the hash is crucial, as renaming the file to `database (1).sqlite` makes the error go away. This lines checks for a single dash: https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/views/base.py#L184 ``` $ datasette test-database\ \(1\).sqlite INFO: Started server process [68314] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) INFO: 127.0.0.1:54043 - ""GET /favicon.ico HTTP/1.1"" 200 OK INFO: 127.0.0.1:54043 - ""GET / HTTP/1.1"" 200 OK ... INFO: 127.0.0.1:54044 - ""GET /favicon.ico HTTP/1.1"" 200 OK INFO: 127.0.0.1:54044 - ""GET /test-database (1) HTTP/1.1"" 404 Not Found ``` Version: ``` $ datasette --version datasette, version 0.53 ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1181/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 706001517,MDU6SXNzdWU3MDYwMDE1MTc=,163,Idea: conversions= could take Python functions,9599,simonw,open,0,,,,,4,2020-09-22T00:37:12Z,2021-12-20T00:56:52Z,,OWNER,,"Right now you use `conversions=` like this: ```python db[""example""].insert({ ""name"": ""The Bigfoot Discovery Museum"" }, conversions={""name"": ""upper(?)""}) ``` How about if you could optionally provide a Python function (or a lambda) like this? ```python db[""example""].insert({ ""name"": ""The Bigfoot Discovery Museum"" }, conversions={""name"": lambda s: s.upper()}) ``` This would work by creating a random name for that function, registering it (similar to #162), executing the SQL and then un-registering the custom function at the end.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/163/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1083246400,PR_kwDOBm6k_c4wAMK8,1562,"Update janus requirement from <0.8,>=0.6.2 to >=0.6.2,<1.1",49699333,dependabot[bot],closed,0,,,,,4,2021-12-17T13:11:10Z,2021-12-17T23:08:29Z,2021-12-17T23:08:28Z,CONTRIBUTOR,simonw/datasette/pulls/1562,"Updates the requirements on [janus](https://github.com/aio-libs/janus) to permit the latest version.
Release notes

Sourced from janus's releases.

janus 1.0.0 release

  • Dropped Python 3.6 support
  • Janus is marked as stable, no API changes were made for years
Changelog

Sourced from janus's changelog.

1.0.0 (2021-12-17)

  • Drop Python 3.6 support

0.7.0 (2021-11-24)

  • Add SyncQueue and AsyncQueue Protocols to provide type hints for sync and async queues #374

0.6.2 (2021-10-24)

  • Fix Python 3.10 compatibility #358

0.6.1 (2020-10-26)

  • Raise RuntimeError on queue.join() after queue closing. #295

  • Replace timeout type from Optional[int] to Optional[float] #267

0.6.0 (2020-10-10)

  • Drop Python 3.5, the minimal supported version is Python 3.6

  • Support Python 3.9

  • Reformat with black

0.5.0 (2020-04-23)

  • Remove explicit loop arguments and forbid creating queues outside event loops #246

0.4.0 (2018-07-28)

  • Add py.typed macro #89

  • Drop python 3.4 support and fix minimal version python3.5.3 #88

  • Add property that indicates if queue is closed #86

0.3.2 (2018-07-06)

  • Fixed python 3.7 support #97

... (truncated)

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. ---
Dependabot commands and options
You can trigger Dependabot actions by commenting on this PR: - `@dependabot rebase` will rebase this PR - `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it - `@dependabot merge` will merge this PR after your CI passes on it - `@dependabot squash and merge` will squash and merge this PR after your CI passes on it - `@dependabot cancel merge` will cancel a previously requested merge and block automerging - `@dependabot reopen` will reopen this PR if it is closed - `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually - `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself) - `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1562/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 1077322009,I_kwDOCGYnMM5ANqEZ,355,Allow users to pass a full convert() function definition,9599,simonw,closed,0,,,,,4,2021-12-10T23:59:58Z,2021-12-11T00:51:15Z,2021-12-11T00:49:31Z,OWNER,,"> I think the fix for this is to change the rules about what code is accepted in both the `-` mode and the literal code string mode: you can pass in a Python expression, OR a fragment that gets turned into a function, OR code that implements its own `def convert(value)` function. So this would work too: > ```sh > sqlite-utils convert my.db mytable col1 ' > def convert(value): > return value.upper() > ' > ``` _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/353#issuecomment-991381679_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/355/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 863884805,MDU6SXNzdWU4NjM4ODQ4MDU=,1304,"Document how to send multiple values for ""Named parameters"" ",9308268,rayvoelker,open,0,,,,,4,2021-04-21T13:19:06Z,2021-12-08T03:23:14Z,,NONE,,"https://docs.datasette.io/en/stable/sql_queries.html#named-parameters I thought that I had seen an example of how to do this example below, but I can't seem to find it ```sql select * from bib where bib.bib_record_num in (1008088,1008092) ``` ```sql select * from bib where bib.bib_record_num in (:bib_record_numbers) ``` ![image](https://user-images.githubusercontent.com/9308268/115558839-2333a480-a281-11eb-85e6-ce3bada79140.png) https://ilsweb.cincinnatilibrary.org/collection-analysis/current_collection-204d100?sql=select%0D%0A++*%0D%0Afrom%0D%0A++bib%0D%0Awhere%0D%0A++bib.bib_record_num+in+%28%3Abib_record_numbers%29&bib_record_numbers=1008088%2C1008092 Or, maybe this isn't a fully supported feature. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1304/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 1059219106,I_kwDOBm6k_c4_Imai,1524,"Improve Apache proxy documentation, link to demo",9599,simonw,closed,0,,,,,4,2021-11-20T20:03:14Z,2021-11-20T23:34:03Z,2021-11-20T23:34:03Z,OWNER,,"> The latest demo is now live at https://datasette-apache-proxy-demo.fly.dev/prefix/fixtures/sortable?_facet=pk2 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1519#issuecomment-974697824_ I'm going to put out 0.59.3 bugfix release with this, but I'd like to first improve the documentation on https://docs.datasette.io/en/stable/deploying.html#apache-proxy-configuration to highlight the new demo.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1524/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459590021,MDU6SXNzdWU0NTk1OTAwMjE=,519,Decide what goes into Datasette 1.0,9599,simonw,closed,0,,,3268330,Datasette 1.0,4,2019-06-23T15:47:41Z,2021-11-15T23:26:11Z,2021-11-15T23:26:11Z,OWNER,,Datasette ASGI #272 is a big part of it... 
but 1.0 will generally be an indicator that Datasette is a stable platform for developers to write plugins and custom templates against. So lots to think about.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/519/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 1053122092,I_kwDOCGYnMM4-xV4s,339,`table.lookup()` option to populate additional columns when creating a record,9599,simonw,closed,0,,,,,4,2021-11-15T01:41:17Z,2021-11-15T02:02:34Z,2021-11-15T02:02:00Z,OWNER,,"> For the commits table I feel like I want a version of `table.lookup()` that can be passed additional columns to populate only if the record does not exist yet. _Originally posted by @simonw in https://github.com/simonw/git-history/issues/12#issuecomment-967455017_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/339/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 845794436,MDU6SXNzdWU4NDU3OTQ0MzY=,1284,Feature or Documentation Request: Individual table as home page template,192568,mroswell,open,0,,,,,4,2021-03-31T03:56:17Z,2021-11-04T03:15:01Z,,CONTRIBUTOR,,"It would be great to have a sample showing how to move a single database that has a single table, to the index page. I'm trying it now, and find there is a real depth of Datasette and Python understanding that's required to be successful. I've got all the basic jinja concepts down... variables, template control structures, template inheritance, template overrides, css, html, the --template-dir and --static arguments, etc. But copying the table.html file to index.html doesn't work. There are undocumented functions and filters... I can figure some of them out (yay, url_builder.py and utils/__init__.py!) but it's a slog better handled by a much stronger Python developer. One sample would make a world of difference. The ideal form of this documentation would be a diff between the default table.html and how that would look if essentially moved to index.html. The use case is for everyone who wants to create a public-facing website to explore a single table at the root directory. (Maybe a second bit of documentation for people who have a single database with multiple tables.) (Hmm... might be cool to have a setting for that, where it happens automagically! If only one table, then home page is at the table level. if only one database, then home page is at the database level.... as an option.) I suppose I could ignore this, and somehow do this in the DNS settings once I hook up Vercel to a domain name, maybe.. and remove the breadcrumbs in table.html... but for now, a documentation request in the form of a diff... for viewing a single table (or a single database) at the root. (Actually, there's probably room for a whole expanded section on templates. Noticed some nice table metadata in one of the datasette examples, for instance... Hmm... maybe a whole library of solutions in one place... maybe a documentation hackathon! If that's of interest, of course it's a separate issue. 
) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1284/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 761915790,MDU6SXNzdWU3NjE5MTU3OTA=,206,sqlite-utils should suggest --csv if JSON parsing fails,9599,simonw,closed,0,,,,,4,2020-12-11T05:17:56Z,2021-10-30T15:52:17Z,2021-01-03T18:42:22Z,OWNER,,"``` ~ % gsutil cat gs://ossf-criticality-score/python_top_200.csv | sqlite-utils insert /tmp/crit.db crit - ... File ""/usr/local/Cellar/python@3.9/3.9.0_3/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/decoder.py"", line 337, in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end()) File ""/usr/local/Cellar/python@3.9/3.9.0_3/Frameworks/Python.framework/Versions/3.9/lib/python3.9/json/decoder.py"", line 355, in raw_decode raise JSONDecodeError(""Expecting value"", s, err.value) from None json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) ``` A nicer error message here would be one that says the JSON is invalid but suggests that maybe you could try `--csv`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/206/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 995098231,MDU6SXNzdWU5OTUwOTgyMzE=,1470,?_sort=rowid with _next= returns error,19851673,eigenfoo,closed,0,,,,,4,2021-09-13T16:36:15Z,2021-10-18T19:30:15Z,2021-10-10T01:15:03Z,NONE,,"For example: - Go to https://cryptics.eigenfoo.xyz/clues/clues?_next=100 (this is the second page of results in a Datasette site) - Search anything using the FTS search bar. For example, searching for `hello` will take you to https://cryptics.eigenfoo.xyz/clues/clues?_search=hello&_sort=rowid&_next=100 - A `500 Error: list index out of range` is raised. This is because the search URL includes the `&_next=100` UTM parameter, carried over from where the FTS search was run. However, there isn't a second page in the search results, so a `list index out of range` error is raised. You can confirm that removing this UTM parameter from the URL returns the appropriate search results. The FTS search request should strip any `_next` UTM parameter. --- ```bash datasette, version 0.58.1 sqlite-utils, version 3.17 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1470/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 990844088,MDU6SXNzdWU5OTA4NDQwODg=,325,sqlite-utils memory can't deal with multiple files with the same name,144773,karlb,closed,0,,,,,4,2021-09-08T08:14:42Z,2021-09-22T20:52:56Z,2021-09-22T20:45:45Z,NONE,,"When I use multiple files with the same name, e.g. in `sqlite-utils memory a/bug.csv b/bug.csv`, sqlite-utils creates invalid views. 
``` Traceback (most recent call last): File ""/home/karl/.local/bin/sqlite-utils"", line 8, in sys.exit(cli()) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/sqlite_utils/cli.py"", line 1299, in memory db[csv_table].transform(types=tracker.types) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/sqlite_utils/db.py"", line 1287, in transform self.db.execute(sql) File ""/home/karl/.local/pipx/venvs/sqlite-utils/lib/python3.9/site-packages/sqlite_utils/db.py"", line 421, in execute return self.conn.execute(sql) sqlite3.OperationalError: error in view t1: no such table: main.bug ``` This can be reproduced with ```sh #!/bin/bash mkdir foo mkdir bar echo -e 'col1,col2\nval1,val2' > foo/bug.csv echo -e 'col3,col4\nval3,val4' > bar/bug.csv sqlite-utils memory */bug.csv 'SELECT 1' ``` Ideally, the tables would get unique names by including the next path segment until the names are unique. But just making the numbered t* aliases work would be good enough. This problem can of course be worked around by renaming the files, but it would be nice if this case was handled more gracefully. 
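Something like this could do the renaming - just a sketch of the "include the next path segment" idea, not current sqlite-utils behaviour:

```python
from pathlib import Path

def unique_table_names(paths):
    # Hypothetical sketch: start from the file stem and keep prepending
    # parent directory names until the table name is unique, so a second
    # bug.csv coming from directory b/ would become b_bug.
    # (A numeric suffix fallback for identical paths is omitted here.)
    assigned = {}
    for path in map(Path, paths):
        parents = [p.name for p in path.parents if p.name]
        candidate = path.stem
        for parent in parents:
            if candidate not in assigned.values():
                break
            candidate = f"{parent}_{candidate}"
        assigned[str(path)] = candidate
    return assigned
```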
Thanks a lot for this great tool!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/325/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 978743426,MDU6SXNzdWU5Nzg3NDM0MjY=,13,xml.etree.ElementTree.ParseError: not well-formed (invalid token),9599,simonw,closed,0,,,,,4,2021-08-25T05:48:21Z,2021-08-26T18:45:13Z,2021-08-26T18:45:13Z,MEMBER,,"Got this error today: ``` (evernote-to-sqlite) /tmp % evernote-to-sqlite enex evernote.db simonwillison\'s\ notebook.enex Importing from ENEX [######------------------------------] 17% Traceback (most recent call last): File ""/Users/simon/.local/bin/evernote-to-sqlite"", line 8, in sys.exit(cli()) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1137, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1062, in main rv = self.invoke(ctx) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1668, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 1404, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/click/core.py"", line 763, in invoke return __callback(*args, **kwargs) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/evernote_to_sqlite/cli.py"", line 31, in enex save_note(db, note) File ""/Users/simon/.local/pipx/venvs/evernote-to-sqlite/lib/python3.9/site-packages/evernote_to_sqlite/utils.py"", line 36, in save_note content = ET.tostring(ET.fromstring(content_xml)).decode(""utf-8"") File ""/usr/local/Cellar/python@3.9/3.9.6/Frameworks/Python.framework/Versions/3.9/lib/python3.9/xml/etree/ElementTree.py"", line 1347, in XML parser.feed(text) xml.etree.ElementTree.ParseError: not well-formed (invalid token): line 2, column 132 ```",303218369,evernote-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 831751367,MDU6SXNzdWU4MzE3NTEzNjc=,246,Escaping FTS search strings,16001974,DeNeutoy,closed,0,,,,,4,2021-03-15T12:15:09Z,2021-08-18T18:57:13Z,2021-08-18T18:43:12Z,CONTRIBUTOR,," Thanks for the excellent library, it's very nice to use! I've been building some in memory search functionality for a data annotation tool i'm making, and I got tripped up a little bit with escaping the full text search queries. First I tried using `db.quote(q)`, which doesn't work, because sqlite FTS has it's own (separate)[ query syntax](https://www2.sqlite.org/fts5.html#full_text_query_syntax). You can see this happening here also: http://search-24ways.herokuapp.com/24ways-f8f455f/articles?_search=acces%2A I got around this by aggressively escaping quotes inside the query string like this: ```python quoted = q.replace('""', '""""') quoted = f'""{quoted}""' print(quoted) results = db[""data""].search(quoted, columns=[""id""]) return [x[""id""] for x in results] ``` This works in the sense it doesn't crash, but it also removes access to the search query syntax. 
Given the well specified definition, it might be possible for sqlite-utils to provide a `db.quote_query(q)` which would intelligently escape a query whilst leaving the syntax intact. This would be very nice! ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/246/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268469569,MDU6SXNzdWUyNjg0Njk1Njk=,39,Protect against malicious SQL that causes damage even though our DB is immutable,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-10-25T16:44:27Z,2021-08-17T23:52:07Z,2017-11-05T02:53:47Z,OWNER,,"I’m currently operating under the assumption that it’s safe to allow arbitrary SQL statements because we are dealing with an immutable database. But this might not be the case - there are some pretty weird SQLite language extensions (ATTACH, PRAGMA etc) and I’m not certain they cannot be used to break things in a way that would affect future requests to the API. Solution: provide a “safe mode” option which disables the ?sql= mechanism. This still leaves the URL filter lookups, so I need to make sure that those are “safe”. In the future I may also implement a whitelist option where datasets can be configured to only allow specific filters against specific columns.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 965102534,MDU6SXNzdWU5NjUxMDI1MzQ=,311,Add reference documentation generated from docstrings,9599,simonw,closed,0,,,,,4,2021-08-10T16:04:00Z,2021-08-11T12:03:50Z,2021-08-11T12:03:50Z,OWNER,,"Using https://www.sphinx-doc.org/en/master/usage/extensions/autodoc.html I'm not a big fan of this kind of documentation because it so often comes in place of narrative documentation - but the library has great narrative documentation now, so the reference documentation can link to it in places. This will also encourage me to add good docstrings everywhere, useful for IDEs and suchlike.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/311/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 963527045,MDU6SXNzdWU5NjM1MjcwNDU=,1424,Document exceptions that can be raised by db.execute() and friends,9599,simonw,open,0,,,,,4,2021-08-08T22:23:25Z,2021-08-08T22:27:31Z,,OWNER,,Not currently covered here: https://docs.datasette.io/en/stable/internals.html#await-db-execute-sql,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1424/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 961367843,MDU6SXNzdWU5NjEzNjc4NDM=,1422,Ability to default to hiding the SQL for a canned query,9599,simonw,closed,0,,,,,4,2021-08-05T02:51:39Z,2021-08-07T05:32:29Z,2021-08-07T05:32:29Z,OWNER,,"I'm working on a project with some HUGE (400+ lines of SQL) canned queries right now. Any time you land on the canned query page you have to scroll down a long distance to get to the results! 
Would be useful to be able to default to https://latest.datasette.io/fixtures/magic_parameters?_hide_sql=1 without needing the parameter.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1422/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 952179830,MDU6SXNzdWU5NTIxNzk4MzA=,2,Command for fetching Hacker News threads from the search API,9599,simonw,open,0,,,,,4,2021-07-25T02:00:45Z,2021-07-25T03:12:57Z,,MEMBER,,"I want to be able to fetch every item for a domain, e.g. https://news.ycombinator.com/from?site=simonwillison.net",248903544,hacker-news-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 855446829,MDExOlB1bGxSZXF1ZXN0NjEzMTc4OTY4,1296,Dockerfile: use Ubuntu 20.10 as base,82332573,tmcl-it,open,0,,,,,4,2021-04-12T00:23:32Z,2021-07-20T08:52:13Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/1296,"This PR changes the main Dockerfile to use ubuntu:20.10 as base image instead of python:3.9.2-slim-buster (itself based on debian:buster-slim). The Dockerfile is essentially the one from https://github.com/simonw/datasette/issues/1249#issuecomment-803698983 with some additional cleanups to slim it down. This fixes a couple of issues: 1. The SQLite version in Debian Buster (2.6.0) doesn't support generated columns 2. Installing SpatiaLite from the Debian sid repositories has the side effect of also installing updates to libc and libstdc++ from sid. As a bonus, the Docker image becomes smaller: ``` $ docker image ls REPOSITORY TAG IMAGE ID CREATED SIZE datasette 0.56-ubuntu f7aca255140a 5 hours ago 212MB datasetteproject/datasette 0.56 efb3b282f390 13 days ago 258MB ``` ### Reproduction of the first issue ``` $ curl -O https://latest.datasette.io/fixtures.db % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 100 260k 0 260k 0 0 489k 0 --:--:-- --:--:-- --:--:-- 489k $ docker run -v `pwd`:/mnt datasetteproject/datasette:0.56 datasette /mnt/fixtures.db Traceback (most recent call last): File ""/usr/local/bin/datasette"", line 8, in sys.exit(cli()) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.9/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/usr/local/lib/python3.9/site-packages/datasette/cli.py"", line 544, in serve asyncio.get_event_loop().run_until_complete(check_databases(ds)) File ""/usr/local/lib/python3.9/asyncio/base_events.py"", line 642, in run_until_complete return future.result() File ""/usr/local/lib/python3.9/site-packages/datasette/cli.py"", line 584, in check_databases await database.execute_fn(check_connection) File ""/usr/local/lib/python3.9/site-packages/datasette/database.py"", line 155, in execute_fn return await asyncio.get_event_loop().run_in_executor( File 
""/usr/local/lib/python3.9/concurrent/futures/thread.py"", line 52, in run result = self.fn(*self.args, **self.kwargs) File ""/usr/local/lib/python3.9/site-packages/datasette/database.py"", line 153, in in_thread return fn(conn) File ""/usr/local/lib/python3.9/site-packages/datasette/utils/__init__.py"", line 892, in check_connection for r in conn.execute( sqlite3.DatabaseError: malformed database schema (generated_columns) - near ""AS"": syntax error ``` Here is the SQLite version: ``` $ docker run -v `pwd`:/mnt -it datasetteproject/datasette:0.56 /bin/bash root@d9220d3b95dd:/# python3 Python 3.9.2 (default, Mar 27 2021, 02:50:26) [GCC 8.3.0] on linux Type ""help"", ""copyright"", ""credits"" or ""license"" for more information. >>> import sqlite3 >>> sqlite3.version '2.6.0' ``` ### Reproduction of the second issue ``` $ docker build . -t datasette --build-arg VERSION=0.55 [...snip...] The following packages will be upgraded: libc-bin libc6 libstdc++6 [...snip...] Unpacking libc6:amd64 (2.31-11) over (2.28-10) ... [...snip...] Unpacking libstdc++6:amd64 (10.2.1-6) over (8.3.0-6) ... [...snip...] ``` Both libc and libstdc++ are backwards compatible, so the image still works, but it will result in a combination of libraries and Python versions that exists only in the Datasette image, so it's likely untested. In addition, since Debian sid is an always-changing rolling-release, the versions of libc, libstdc++, Spatialite, and their dependencies change frequently, so the library versions in the Datasette image will depend on the day when it was built. ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1296/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 944870799,MDU6SXNzdWU5NDQ4NzA3OTk=,1394,Big performance boost on faceting: skip the inner order by,9599,simonw,closed,0,,,,,4,2021-07-14T23:32:29Z,2021-07-16T02:23:32Z,2021-07-15T00:05:50Z,OWNER,,"I just noticed something that could make for a huge performance improvement in faceting. The default query used by Datasette when faceting looks like this: ```sql select country_long, count(*) from ( select * from [global-power-plants] order by rowid ) where country_long is not null group by country_long order by count(*) desc ``` Here it takes 53ms: https://global-power-plants.datasettes.com/global-power-plants?sql=select%0D%0A++country_long%2C%0D%0A++count%28*%29%0D%0Afrom+%28%0D%0A++select+*+from+%5Bglobal-power-plants%5D+order+by+rowid%0D%0A%29%0D%0Awhere%0D%0A++country_long+is+not+null%0D%0Agroup+by%0D%0A++country_long%0D%0Aorder+by%0D%0A++count%28*%29+desc Note that there's a `order by rowid` in there which isn't necessary - the order on that inner query doesn't matter since we're grouping and counting. I had assumed SQLite would optimize this away - but it turns out it doesn't! Consider this version of the query, with that pointless order by removed: ``` select country_long, count(*) from ( select * from [global-power-plants] ) where country_long is not null group by country_long order by count(*) desc ``` https://global-power-plants.datasettes.com/global-power-plants?sql=select%0D%0A++country_long%2C%0D%0A++count%28*%29%0D%0Afrom+%28%0D%0A++select+*+from+%5Bglobal-power-plants%5D%0D%0A%29%0D%0Awhere%0D%0A++country_long+is+not+null%0D%0Agroup+by%0D%0A++country_long%0D%0Aorder+by%0D%0A++count%28*%29+desc runs in 7.2ms! 
I tried this optimization on a table with 2.5m rows in it - without the optimization it took 5 seconds, with the optimization it took 450ms. So this is a very significant improvement!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1394/reactions"", ""total_count"": 2, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 466996584,MDExOlB1bGxSZXF1ZXN0Mjk2NzM1MzIw,557,Get tests running on Windows using Travis CI,9599,simonw,closed,0,,,,,4,2019-07-11T16:36:57Z,2021-07-10T23:39:48Z,2021-07-10T23:39:48Z,OWNER,simonw/datasette/pulls/557,Refs #511,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/557/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 915421499,MDU6SXNzdWU5MTU0MjE0OTk=,267,row.update() or row.pk,12721157,Gravitar64,open,0,,,,,4,2021-06-08T19:56:00Z,2021-06-22T17:27:27Z,,NONE,,"Hi, fantastic framework for working with Sqlite3 databases!!! I tried to update specific rows in a table and used for row in db[tablename]: newValue = row[""counter""] * row[""prize""] row.update({""Fieldname"": newValue}) print(row) This updates the value in the printed row, but not in the database. So I switched to db[tablename].update(id, {""Fieldname"": newValue}) This works fine. But row.update would be nicer, because there is no need for the id (it's that row), no need for the tablename and the db (all defined in the for row ... loop). Thx ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/267/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 925305186,MDU6SXNzdWU5MjUzMDUxODY=,282,Automatic type detection for CSV data,9599,simonw,closed,0,,,,,4,2021-06-19T03:33:21Z,2021-06-19T04:42:03Z,2021-06-19T04:38:00Z,OWNER,,"I've touched on this before in #179 - but now that I've added `sqlite-utils memory` this is much more important - because unlike with `sqlite-utils insert` the in-memory command doesn't give you the opportunity to fix any types you imported from CSV, so queries like `select * from stdin where age > 3` are never going to work correctly against these temporary in-memory tables. Teaching `sqlite-utils insert` to detect types for columns in a CSV file would be a backwards-compatibility breaking change. Teaching `sqlite-utils memory` that trick would not be, since it hasn't been included in a release yet. It's a little inconsistent, but I'm going to have `sqlite-utils memory` default to detecting types while `sqlite-utils insert` does not.
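For illustration, the detection logic could be as simple as this sketch (an assumption about the approach, not the actual implementation):
```python
def detect_type(values):
    # try int, then float, on every sampled value; fall back to text
    for candidate in (int, float):
        try:
            for value in values:
                candidate(value)
            return candidate
        except ValueError:
            pass
    return str
```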
In each case this can be controlled by a new command-line option: cat file.csv | sqlite-utils memory - --no-detect-types To opt in for `sqlite-utils insert`: cat file.csv | sqlite-utils insert blah.db blah - --detect-types I'll have short options for these too: `-n` for `--no-detect-types` and `-d` for `--detect-types`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/282/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",,completed 709577625,MDU6SXNzdWU3MDk1Nzc2MjU=,179,sqlite-utils transform/insert --detect-types,9599,simonw,closed,0,,,,,4,2020-09-26T17:28:55Z,2021-06-19T03:36:16Z,2021-06-19T03:36:05Z,OWNER,,"Idea from https://github.com/simonw/datasette-edit-tables/issues/13 - provide Python utility methods and accompanying CLI options for detecting the likely types of TEXT columns. So if you have a text column that actually contains exclusively integer string values, it can let you know and let you run transform against it.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/179/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 919314806,MDU6SXNzdWU5MTkzMTQ4MDY=,270,Cannot set type JSON,4068,frafra,closed,0,,,,,4,2021-06-11T23:53:22Z,2021-06-16T17:34:49Z,2021-06-16T15:47:06Z,NONE,,"It would be great if the column type could be set to JSON. That would not be different from handling a regular string. It would be something like `repr(value)` and it would work with both JSON and CSV inputs, no matter if `value` is a real list or just a string representing a list.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/270/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 756876238,MDExOlB1bGxSZXF1ZXN0NTMyMzQ4OTE5,1130,Fix footer not sticking to bottom in short pages,3243482,abdusco,open,0,,,,,4,2020-12-04T07:29:01Z,2021-06-15T13:27:48Z,,CONTRIBUTOR,simonw/datasette/pulls/1130,Fixes https://github.com/simonw/datasette/issues/1129,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1130/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 919181559,MDU6SXNzdWU5MTkxODE1NTk=,268,db.schema property and sqlite-utils schema command,9599,simonw,closed,0,,,,,4,2021-06-11T20:25:47Z,2021-06-11T20:51:56Z,2021-06-11T20:51:56Z,OWNER,,"`table.schema` returns the schema for a table. `db.schema` should return the schema for the whole database. Can do this using `select sql from sqlite_master where sql is not null`: https://latest.datasette.io/fixtures?sql=select+sql+from+sqlite_master+where+sql+is+not+null",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/268/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 912485040,MDU6SXNzdWU5MTI0ODUwNDA=,1361,Intermittent CI failure: restore_working_directory FileNotFoundError,9599,simonw,closed,0,,,,,4,2021-06-05T22:48:13Z,2021-06-05T23:16:24Z,2021-06-05T23:16:24Z,OWNER,,"e.g.
in https://github.com/simonw/datasette/runs/2754772233 - this is an intermittent error: ``` __________ ERROR at setup of test_hook_register_routes_render_message __________ [gw0] linux -- Python 3.8.10 /opt/hostedtoolcache/Python/3.8.10/x64/bin/python tmpdir = local('/tmp/pytest-of-runner/pytest-0/popen-gw0/test_hook_register_routes_rend0') request = > @pytest.fixture def restore_working_directory(tmpdir, request): > previous_cwd = os.getcwd() E FileNotFoundError: [Errno 2] No such file or directory ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1361/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 864979486,MDExOlB1bGxSZXF1ZXN0NjIxMTE3OTc4,1306,Avoid error sorting by relationships if related tables are not allowed,416374,gfrmin,closed,0,,,,,4,2021-04-22T13:53:17Z,2021-06-02T04:27:00Z,2021-06-02T04:25:28Z,CONTRIBUTOR,simonw/datasette/pulls/1306,Refs #1305,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1306/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 903978133,MDU6SXNzdWU5MDM5NzgxMzM=,1343,Figure out how to publish alpha/beta releases to Docker Hub,9599,simonw,closed,0,,,,,4,2021-05-27T16:42:17Z,2021-05-27T16:46:37Z,2021-05-27T16:45:41Z,OWNER,,"> It looks like all I need to do to ship an alpha version to Docker Hub is NOT point the `latest` tag at it after it goes live: https://github.com/simonw/datasette/blob/1a8972f9c012cd22b088c6b70661a9c3d3847853/.github/workflows/publish.yml#L75-L77 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1319#issuecomment-849780481_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1343/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 884952179,MDU6SXNzdWU4ODQ5NTIxNzk=,1320,Can't use apt-get in Dockerfile when using datasetteproj/datasette as base,2670795,brandonrobertz,closed,0,,,,,4,2021-05-10T19:37:27Z,2021-05-24T18:15:56Z,2021-05-24T18:07:08Z,CONTRIBUTOR,,"The datasette base Docker image is super convenient, but there's one problem: if any of the plugins you install require additional system dependencies (e.g., xz, git, curl) then any attempt to use apt in said Dockerfile results in an explosion: ``` $ docker-compose build Building server [+] Building 9.9s (7/9) => [internal] load build definition from Dockerfile 0.0s => => transferring dockerfile: 666B 0.0s => [internal] load .dockerignore 0.0s => => transferring context: 34B 0.0s => [internal] load metadata for docker.io/datasetteproject/datasette:latest 0.6s => [base 1/4] FROM docker.io/datasetteproject/datasette@sha256:2250d0fbe57b1d615a8d6df0c9d43deb9533532e00bac68854773d8ff8dcf00a 0.0s => [internal] load build context 1.8s => => transferring context: 2.44MB 1.8s => CACHED [base 2/4] WORKDIR /datasette 0.0s => ERROR [base 3/4] RUN apt-get update && apt-get install --no-install-recommends -y git ssh curl xz-utils 9.2s ------ > [base 3/4] RUN apt-get update && apt-get install --no-install-recommends -y git ssh curl xz-utils: #6 0.446 Get:1 http://security.debian.org/debian-security buster/updates InRelease [65.4 kB] #6 0.449 Get:2 http://deb.debian.org/debian buster InRelease [121 kB] #6 0.459 Get:3 
http://httpredir.debian.org/debian sid InRelease [157 kB] #6 0.784 Get:4 http://deb.debian.org/debian buster-updates InRelease [51.9 kB] #6 0.790 Get:5 http://httpredir.debian.org/debian sid/main amd64 Packages [8626 kB] #6 1.003 Get:6 http://deb.debian.org/debian buster/main amd64 Packages [7907 kB] #6 1.180 Get:7 http://security.debian.org/debian-security buster/updates/main amd64 Packages [286 kB] #6 7.095 Get:8 http://deb.debian.org/debian buster-updates/main amd64 Packages [10.9 kB] #6 8.058 Fetched 17.2 MB in 8s (2243 kB/s) #6 8.058 Reading package lists... #6 9.166 E: flAbsPath on /var/lib/dpkg/status failed - realpath (2: No such file or directory) #6 9.166 E: Could not open file - open (2: No such file or directory) #6 9.166 E: Problem opening #6 9.166 E: The package lists or status file could not be parsed or opened. ``` The problem seems to be from completely wiping out `/var/lib/dpkg` in the upstream Dockerfile: https://github.com/simonw/datasette/blob/1b697539f5b53cec3fe13c0f4ada13ba655c88c7/Dockerfile#L18 I've tested without removing the directory and apt works as expected.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1320/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273775212,MDU6SXNzdWUyNzM3NzUyMTI=,88,Add NHS England Hospitals example to wiki,15543,tomdyson,closed,0,,,,,4,2017-11-14T12:29:10Z,2021-03-22T23:46:36Z,2017-11-14T22:54:06Z,CONTRIBUTOR,,"https://nhs-england-hospitals.now.sh and an associated map visualisation: http://run.plnkr.co/preview/cj9zlf1qc0003414y90ajkwpk/ Datasette is wonderful! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/88/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 797651831,MDU6SXNzdWU3OTc2NTE4MzE=,1212,Tests are very slow. ,4488943,kbaikov,closed,0,,,,,4,2021-01-31T08:06:16Z,2021-02-19T22:54:13Z,2021-02-19T22:54:13Z,CONTRIBUTOR,,"Working on my PR I noticed that the tests are very slow. A plain pytest run took about 37 minutes for me. However I could shave off about 10 minutes from that if I used pytest-xdist to parallelize execution: `pytest -n 8` runs in only 28 minutes on my machine. I can create a PR to mention that in your documentation. This would be a simple change to add pytest-xdist to the requirements and change the command to run pytest in the documentation. Does that make sense to you? After a bit more investigation it looks like pytest-xdist is not the answer. It creates a race condition for tests that try to clean the temp dir before running. Profiling shows that most time is spent on conn.executescript(TABLES) in the make_app_client function. Which makes sense. Perhaps the better approach would be to look at the app_client fixture, which is already session scoped but not used by all test cases. And/or use conn = sqlite3.connect("":memory:"") which is much faster. And/or truncate tables after each test case instead of deleting the file and re-creating them. I can take a look at which approach is best if you give the go-ahead.
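For illustration, a session-scoped in-memory fixture could look like this (a sketch only - `TABLES` is assumed to be the existing schema script from the test suite):
```python
import sqlite3
import pytest

@pytest.fixture(scope='session')
def memory_conn():
    # build the schema once per test session, entirely in memory
    conn = sqlite3.connect(':memory:')
    conn.executescript(TABLES)  # TABLES: assumed existing schema script
    yield conn
    conn.close()
```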
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1212/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 808843401,MDU6SXNzdWU4MDg4NDM0MDE=,1226,--port option should validate port is between 0 and 65535,9599,simonw,closed,0,,,,,4,2021-02-15T22:01:33Z,2021-02-18T18:41:27Z,2021-02-18T18:41:27Z,OWNER,,"Currently throws an ugly error message: ``` (datasette-graphql) datasette-graphql % datasette fivethirtyeight.db -p 80094 INFO: Started server process [45497] INFO: Waiting for application startup. INFO: Application startup complete. Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/datasette-graphql-n1OSJCS8/bin/datasette"", line 8, in sys.exit(cli()) ... server = await loop.create_server( File ""/Users/simon/.pyenv/versions/3.8.2/lib/python3.8/asyncio/base_events.py"", line 1461, in create_server sock.bind(sa) OverflowError: bind(): port must be 0-65535. ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1226/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 807174161,MDU6SXNzdWU4MDcxNzQxNjE=,227,Error reading csv files with large column data,295329,camallen,closed,0,,,,,4,2021-02-12T11:51:47Z,2021-02-16T11:48:03Z,2021-02-14T21:17:19Z,NONE,,"*Feel free to close this issue - I mostly added it for reference for future folks that run into this :)* I have a CSV file with one column that has very long strings. When i try to import this file via the `insert` command I get the following error: ``` sqlite-utils insert database.db table_name file_with_large_column.csv Traceback (most recent call last): File ""/usr/local/bin/sqlite-utils"", line 10, in sys.exit(cli()) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/usr/local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 774, in insert default=default, File ""/usr/local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 705, in insert_upsert_implementation docs, pk=pk, batch_size=batch_size, alter=alter, **extra_kwargs File ""/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1852, in insert_all first_record = next(records) File ""/usr/local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 703, in docs = (decode_base64_values(doc) for doc in docs) File ""/usr/local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 681, in docs = (dict(zip(headers, row)) for row in reader) _csv.Error: field larger than field limit (131072) ``` Built with the docker image `datasetteproject/datasette:0.54` with the following versions: ``` # sqlite-utils --version sqlite-utils, version 3.4.1 # datasette --version datasette, version 0.54 ``` It appears this is a [known issue](https://stackoverflow.com/a/54517228/2761423) reading in csv files in 
python and [doesn't look to be modifiable](https://github.com/python/cpython/blob/ea46579067fd2d4e164d6605719ffec690c4d621/Modules/_csv.c#L1685) through system / env vars (I may be very wrong on this). Noting that using the sqlite3 `.import` command works without error (it doesn't use the Python csv reader) ``` sqlite3 database.db sqlite> .mode csv sqlite> .import file_with_large_column.csv table_name ``` Sadly I couldn't see an easy way around this while using the CLI, as it appears this value needs to be changed in Python code. FWIW I've switched to using https://datasette.io/tools/csvs-to-sqlite for importing csv data and it's working well. Finally, I'm loving https://datasette.io/ thank you very much for an amazing tool and data ecosystem 🙇‍♀️ ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/227/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 806743116,MDU6SXNzdWU4MDY3NDMxMTY=,1220,Installing datasette via docker: Path 'fixtures.db' does not exist,30607,aborruso,closed,0,,,,,4,2021-02-11T21:09:14Z,2021-02-12T21:35:17Z,2021-02-12T21:35:17Z,NONE,,"Hi, If I run ``` docker run -p 8001:8001 -v `pwd`:/mnt \ 1 ↵ datasetteproject/datasette \ datasette -p 8001 -h 0.0.0.0 fixtures.db ``` I have ``` Error: Invalid value for '[FILES]...': Path 'fixtures.db' does not exist. ``` If I run `test -f fixtures.db && echo ""it exists.""` I have `it exists.`. What's my error? Thank you",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1220/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 803338729,MDU6SXNzdWU4MDMzMzg3Mjk=,33,photo-to-sqlite: command not found,11855322,robmarkcole,open,0,,,,,4,2021-02-08T08:42:57Z,2021-02-12T15:00:44Z,,NONE,,"Having installed in a venv I get: ``` (venv) (base) Robins-MacBook:datasette robin$ photo-to-sqlite apple-photos photos.db -bash: photo-to-sqlite: command not found ```",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 789336592,MDU6SXNzdWU3ODkzMzY1OTI=,1195,"view_name = ""query"" for the query page",9599,simonw,open,0,,,,,4,2021-01-19T20:21:36Z,2021-01-25T04:40:08Z,,OWNER,,It uses `view_name` of `database` at the moment which isn't as useful.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1195/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 773913793,MDExOlB1bGxSZXF1ZXN0NTQ0OTIzNDM3,1158,Modernize code to Python 3.6+,6774676,eumiro,closed,0,,,6346396,Datasette 0.54,4,2020-12-23T16:21:38Z,2021-01-24T21:20:50Z,2020-12-23T17:04:32Z,CONTRIBUTOR,simonw/datasette/pulls/1158,"- compact dict and set building - remove redundant parentheses - simplify chained conditions - change method name to lowercase - use triple double quotes for docstrings please feel free to accept/reject any of these independent commits",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1158/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0,
""eyes"": 0}",0, 548591089,MDU6SXNzdWU1NDg1OTEwODk=,657,Allow creation of virtual tables at startup,1055831,dazzag24,open,0,,,,,4,2020-01-12T16:10:55Z,2021-01-15T20:24:35Z,,NONE,,"Hi, I've been experimenting with SQLite reading from huge datasets using this excellent Parquet extension from @cldellow. https://cldellow.com/2018/06/22/sqlite-parquet-vtable.html https://github.com/cldellow/sqlite-parquet-vtable This works really well, but I was keen to see if I could combine datasette with this. Having previously experimented with the spatialite extension I knew that datasette supports loading extensions in the underlying sqlite instance. However I hit a blocker as the current design only allows SELECT statements to be executed and so I am unable to execute the crucial CREATE VIRTUAL TABLE ......... command that is required to load the data from the parquet file into the table. It seems like this would be a simple-ish change, but I don't know enough about the architecture of datasette to start implementing this myself? Could this be done as a datasette plugin? or would this require more fundamental changes at initialisation time? My thoughts are that something at init time could detect that the user was loading a *.parquet file and then switch to a mode were it loads that via the ""CREATE VIRTUAL TABLE..."" rather than loading the *.db file in the default case?? I'm happy to contribute code and testing, I just need some pointers on the best approach. Thanks Darren",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/657/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 777677671,MDU6SXNzdWU3Nzc2Nzc2NzE=,1169,Prettier package not actually being cached,3637,benpickles,closed,0,,,,,4,2021-01-03T17:04:41Z,2021-01-04T19:52:34Z,2021-01-04T19:52:33Z,CONTRIBUTOR,,"With the current configuration Prettier seems to be installed on every run - which can been [seen from the output](https://github.com/simonw/datasette/runs/1631686028?check_suite_focus=true#step:4:4): ``` npx: installed 1 in 5.166s ``` Prettier isn't explicitly being installed (it's surprising that actually installing the dependencies isn't included in the [actions/cache docs](https://github.com/actions/cache/blob/main/examples.md#macos-and-ubuntu)) but it turns out that `npx` will automatically install the package for the specified command (it actually _guesses_ the package name from the name of the command). I'm not sure where Prettier ends up being installed but it doesn't appear to be in `~/.npm` according to the [post-cache output](https://github.com/simonw/datasette/runs/1631686028#step:7:2) (or `./node_modules` when I tested locally): ``` Cache hit occurred on the primary key Linux-npm-565329898f77080e58b14d45cf816ab94877e6f2ece9d395c369c533548a7ee7, not saving cache. ``` I think there are a couple of approaches to tackling this, you could manually install/cache Prettier within the action, or add a `package.json` with Prettier. I would go with the latter because it's a more standard and maintainable approach and it will also ensure that, along with CI, anyone working on the project will run the same version of Prettier (you'll also get Dependabot JavaScript updates). I've tested the [`package.json` approach on a branch](https://github.com/simonw/datasette/compare/main...benpickles:cache-prettier) and am happy to turn it into a pull request if you fancy. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1169/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 777707544,MDU6SXNzdWU3Nzc3MDc1NDQ=,219,reset_counts() method and command,9599,simonw,closed,0,,,,,4,2021-01-03T20:08:28Z,2021-01-03T20:59:37Z,2021-01-03T20:59:37Z,OWNER,,"> Thought: maybe there should be a `.reset_counts()` method too, for if the table gets out of date with the triggers. > > One way that could happen is if a table is dropped and recreated - the counts in the `_counts` table would likely no longer match the number of rows in that table. _Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753545757_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/219/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 456568880,MDU6SXNzdWU0NTY1Njg4ODA=,509,Support opening multiple databases with the same stem,9599,simonw,closed,0,9599,simonw,3268330,Datasette 1.0,4,2019-06-15T19:32:00Z,2020-12-22T20:04:35Z,2020-12-22T20:04:35Z,OWNER,,"e.g. I should be able to do this: datasette App/data.db Other_App/data.db This currently errors because you can't have two databases taking the `/data` URL path. Instead, how about in this particular case assigning the second database `/data-1`?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/509/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 771316301,MDU6SXNzdWU3NzEzMTYzMDE=,31,"Searching for ""github-to-sqlite"" throws an error",9599,simonw,closed,0,,,,,4,2020-12-19T06:07:20Z,2020-12-19T06:18:07Z,2020-12-19T06:18:07Z,MEMBER,,"https://datasette.io/-/beta?q=github-to-sqlite&sort=relevance&type=blog.db%2Fentries - ""no such column: to""",197431109,dogsheep-beta,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-beta/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 443021509,MDU6SXNzdWU0NDMwMjE1MDk=,461,Paginate + search for databases/tables on the homepage,9599,simonw,open,0,,,3268330,Datasette 1.0,4,2019-05-11T18:05:34Z,2020-12-17T22:14:46Z,,OWNER,,Split out from #460 - in order to support large numbers of connected databases the homepage needs to be paginated.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/461/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 763283616,MDU6SXNzdWU3NjMyODM2MTY=,207,sqlite-utils analyze-tables command,9599,simonw,closed,0,,,,,4,2020-12-12T04:33:12Z,2020-12-13T07:25:23Z,2020-12-13T07:20:13Z,OWNER,,"A command which analyzes a table (potentially taking quite a while if the table is large) and outputs information for each column - things like: - How many unique values does this column have? - How many null rows? - How many blank rows? (defined as empty string) - What are the 10 most common values? - What are the 10 least common values? 
The command can output this information to the terminal, but it should also provide an option for writing the information to a database table so it can be explored later.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/207/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 760312579,MDU6SXNzdWU3NjAzMTI1Nzk=,1134,"""_searchmode=raw"" throws an index out of range error when combined with ""_search_COLUMN""",2181410,clausjuhl,closed,0,,,,,4,2020-12-09T13:05:37Z,2020-12-10T05:57:17Z,2020-12-09T19:56:55Z,NONE,,"Hi Simon! Maybe it's just me, but when [using _searchmode=raw (trying to enable wildcard-searching) in combination with the ""_search_COLUMN""-table argument](https://byraadsarkivet.aarhus.dk/db/cases?_searchmode=raw&_search_title=sundhedsfrem*), I get a list index out of range error. [When combining with the simpler ""_search""-argument everything works, including wildcard-searches.](https://byraadsarkivet.aarhus.dk/db/cases?_search=sundhedsfrem*&_searchmode=raw). Here's the traceback: ``` Traceback (most recent call last): File ""/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 122, in route_path return await view(new_scope, receive, send) File ""/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 196, in view request, **scope[""url_route""][""kwargs""] File ""/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/base.py"", line 204, in get request, database, hash, correct_hash_provided, **kwargs File ""/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/base.py"", line 342, in view_get request, database, hash, **kwargs File ""/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/table.py"", line 393, in data search_col = key.split(""_search_"", 1)[1] IndexError: list index out of range ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1134/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 309047460,MDU6SXNzdWUzMDkwNDc0NjA=,188,Ability to bundle metadata and templates inside the SQLite file,9599,simonw,open,0,,,,,4,2018-03-27T16:42:07Z,2020-12-04T17:18:34Z,,OWNER,,"One of the nicest qualities of SQLite as a data format is that you get a single file which you can then back up or share with other people. Datasette breaks this a little once you start including custom metadata.json or template files and CSS. It would be cool if there was an optional mechanism for baking that extra configuration into the SQLite file itself. That way entire datasette mini-applications (including canned queries and custom HTML and CSS) could be constructed as single .db files. Since datasette configuration is all file-based, one way to achieve that would be to support a ""datasette_files"" table which, if present, is used to search for file contents by path.
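A hypothetical sketch of what that table's schema could look like:
```sql
CREATE TABLE datasette_files (
    path TEXT PRIMARY KEY,  -- e.g. 'metadata.json' or 'templates/table.html'
    content BLOB
);
```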
This is in line with the philosophy described by https://www.sqlite.org/appfileformat.html ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/188/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 610829227,MDU6SXNzdWU2MTA4MjkyMjc=,749,Cloud Run fails to serve database files larger than 32MB,9599,simonw,closed,0,,,,,4,2020-05-01T16:06:46Z,2020-12-03T00:31:15Z,2020-12-03T00:31:14Z,OWNER,,"https://cloud.google.com/run/quotas lists the maximum response size as 32MB. I spotted a bug where attempting to download a database file larger than that from a Cloud Run deployment (in this case it was https://github-to-sqlite.dogsheep.net/github.db after I [accidentally increased the size of that database](https://github.com/dogsheep/github-to-sqlite/commit/630bdba68a23c0ac453e015518ef0bf41107a952)) returned a 500 error because of this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/749/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 749983857,MDU6SXNzdWU3NDk5ODM4NTc=,1106,Rebrand and redirect config.rst as settings.rst,9599,simonw,closed,0,,,6055094,Datasette 0.52,4,2020-11-24T19:38:17Z,2020-11-24T21:39:58Z,2020-11-24T21:39:58Z,OWNER,,"> I'd like to redirect https://docs.datasette.io/en/stable/config.html to a new https://docs.datasette.io/en/stable/settings.html page too. I can use https://docs.readthedocs.io/en/stable/user-defined-redirects.html for that. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/1105#issuecomment-733190827_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 737394470,MDU6SXNzdWU3MzczOTQ0NzA=,1084,Table/database action menu cut off if too short ,9599,simonw,closed,0,,,6055094,Datasette 0.52,4,2020-11-06T01:55:23Z,2020-11-21T23:45:59Z,2020-11-21T23:45:59Z,OWNER,,"![3CC0C181-959E-4B20-BE39-806ED93E833E](https://user-images.githubusercontent.com/9599/98316836-03891800-1f90-11eb-9e52-5266baf33296.jpeg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1084/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 727627923,MDU6SXNzdWU3Mjc2Mjc5MjM=,1041,extra_js_urls and extra_css_urls should respect base_url setting,9599,simonw,closed,0,,,6026070,0.51,4,2020-10-22T18:34:33Z,2020-10-31T20:49:28Z,2020-10-31T20:48:58Z,OWNER,,"_Originally posted by @simonw in https://github.com/simonw/datasette/issues/1033#issuecomment-714681365_ Refs #1023",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1041/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 732634375,MDExOlB1bGxSZXF1ZXN0NTEyNTQ1MzY0,1061,.blob output renderer,9599,simonw,closed,0,,,6026070,0.51,4,2020-10-29T20:25:08Z,2020-10-29T22:01:40Z,2020-10-29T22:01:39Z,OWNER,simonw/datasette/pulls/1061,"- [x] Remove the `/-/...blob/...` route I added in #1040 in place of the new `.blob` renderer URLs - [x] Link to new `.blob` download links
on the arbitrary query page (using `_blob_hash=...`) - plus tests for this Closes #1050, Closes #1051",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1061/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 729017519,MDExOlB1bGxSZXF1ZXN0NTA5NTkwMjA1,1049,Add template block prior to extra URL loaders,82988,psychemedia,closed,0,,,,,4,2020-10-25T13:08:55Z,2020-10-29T09:20:52Z,2020-10-29T09:20:34Z,CONTRIBUTOR,simonw/datasette/pulls/1049,"To handle packages that require Javascript state setting prior to loading a package (eg [`thebelab`](https://thebelab.readthedocs.io/en/latest/examples/minimal_example.html)), provide a template block before the URLs are loaded.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1049/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 727915394,MDExOlB1bGxSZXF1ZXN0NTA4NzE5NTY3,1043,Include LICENSE in sdist,45380,bollwyvl,closed,0,,,,,4,2020-10-23T05:04:12Z,2020-10-26T00:14:57Z,2020-10-23T20:54:35Z,CONTRIBUTOR,simonw/datasette/pulls/1043,"Hi, thanks for `datasette`! This PR adds the `LICENSE` to source distributions, which seems the norm for Apache-2.0 stuff. I noticed the [0.50.2 sdist](https://files.pythonhosted.org/packages/f2/ba/1b5f182c3f1769c0863bcaa77406bdcb81c92e31bb579959c01b1d8951c0/datasette-0.50.2.tar.gz) doesn't ship `LICENSE`, but the 0.5.2 `whl` does, so I'm assuming the intent _is_ to ship... and it's a one-liner! Motivation: It might be a bit of a slog, but I'm looking to see about getting `datasette` (and friends!) available on conda-forge. There are a few missing upstreams (`asgi-csrf`, `python-baseconv`, `mergedeep`) and some of the plugins don't even appear to _have_ tarballs (just `whl`!), but the little stuff like licenses are nice to get handled upstream vs separately grabbing them.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1043/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 718521469,MDU6SXNzdWU3MTg1MjE0Njk=,1011,column name links broken in 0.50.1,649467,mhalle,closed,0,,,,,4,2020-10-10T03:37:51Z,2020-10-10T04:09:32Z,2020-10-10T03:52:07Z,NONE,,"I just upgraded from 0.49 to 0.50.1 and found that the links on column headers are broken. If I inspect the source, they have a leading ""//"" (without host or port) rather than including base_url like other links on the page do. The links in the ""gears"" menu for each column do work. I don't have custom templates for my project. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1011/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 718238967,MDU6SXNzdWU3MTgyMzg5Njc=,1003,from_json jinja2 filter,649467,mhalle,open,0,,,,,4,2020-10-09T15:30:58Z,2020-10-09T17:17:07Z,,NONE,,"When JSON fields are rendered in a jinja2 template, it is handy to be able to manipulate them as data (e.g., iterate over an array of values). Ansible has a ""from_json"" function, which just calls json.loads. It's trivial as a datasette plugin, but it seems generally useful.
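For illustration, a sketch of such a plugin using the existing `prepare_jinja2_environment` hook (the `from_json` filter name is an assumption):
```python
import json
from datasette import hookimpl

@hookimpl
def prepare_jinja2_environment(env):
    # expose json.loads to templates as a from_json filter
    env.filters['from_json'] = json.loads
```
A template could then do something like `{% for tag in row['tags'] | from_json %}...{% endfor %}`.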
Does it make sense to add it directly into the app?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1003/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 705108492,MDU6SXNzdWU3MDUxMDg0OTI=,970,"request an ""-o"" option on ""datasette server"" to open the default browser at the running url",2861690,secretGeek,closed,0,,,5971510,Datasette 0.50,4,2020-09-20T13:16:34Z,2020-10-08T23:54:27Z,2020-09-22T14:27:04Z,NONE,,"This is a request for a ""convenience"" feature, and only a nice-to-have. It's based on seeing this feature in several little command line hypertext server apps. If you run, for example: datasette.exe serve --open ""mydb.s3db"" I would like it if the default browser was launched at the URL that is being served. The angular cli does this, for example ng serve --open #see https://angular.io/cli/serve ...as does my usual mini web server of choice when inspecting local static files.... npx http-server -o # see https://www.npmjs.com/package/http-server Just a tiny thing. Love your work!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/970/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 706017416,MDU6SXNzdWU3MDYwMTc0MTY=,164,sqlite-utils transform sub-command,9599,simonw,closed,0,,,5897911,2.20,4,2020-09-22T01:32:20Z,2020-09-24T20:34:50Z,2020-09-22T07:48:05Z,OWNER,,The `.transform()` method in #114 warrants an equivalent CLI tool.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/164/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 616271236,MDU6SXNzdWU2MTYyNzEyMzY=,112,"add_foreign_key(...., ignore=True)",9599,simonw,closed,0,,,5896742,2.19,4,2020-05-12T00:24:00Z,2020-09-20T22:17:34Z,2020-09-20T22:17:34Z,OWNER,,"When using this library I often find myself wanting to ""add this foreign key, but only if it doesn't exist yet"". The `ignore=True` parameter is increasingly being used for this elsewhere in the library (e.g.
in `create_view()`).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/112/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 455852801,MDU6SXNzdWU0NTU4NTI4MDE=,507,Every datasette plugin on the ecosystem page should have a screenshot,9599,simonw,open,0,,,,,4,2019-06-13T17:02:51Z,2020-09-17T02:47:35Z,,OWNER,,https://github.com/simonw/datasette/blob/master/docs/ecosystem.rst,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/507/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 449854604,MDU6SXNzdWU0NDk4NTQ2MDQ=,492,Facets not correctly persisted in hidden form fields,9599,simonw,closed,0,,,3268330,Datasette 1.0,4,2019-05-29T14:49:39Z,2020-09-15T20:12:29Z,2020-09-15T20:12:29Z,OWNER,,"Steps to reproduce: visit https://2a4b892.datasette.io/fixtures/roadside_attractions?_facet_m2m=attraction_characteristic and click ""Apply"" Result is a 500: `no such column: attraction_characteristic` The error occurs because of this hidden HTML input: This should be: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/492/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 683804172,MDU6SXNzdWU2ODM4MDQxNzI=,134,--load-extension option for sqlite-utils query,9599,simonw,closed,0,,,,,4,2020-08-21T20:12:42Z,2020-08-21T21:06:26Z,2020-08-21T20:54:19Z,OWNER,,"I got this error: ``` % sqlite-utils calands.db 'create table superunits_with_maps_view_concrete as select * from superunits_with_maps_view' Traceback (most recent call last): ... cursor = db.conn.execute(sql, dict(param)) sqlite3.OperationalError: no such function: AsGeoJSON ``` A `--load-extension=/usr/local/lib/mod_spatialite.dylib` option (imitating the same option for Datasette) would help.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/134/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 675724951,MDU6SXNzdWU2NzU3MjQ5NTE=,918,Security issue: read-only canned queries leak CSRF token in URL,9599,simonw,closed,0,,,,,4,2020-08-09T16:03:01Z,2020-08-09T16:56:48Z,2020-08-09T16:11:59Z,OWNER,,"The HTML form for a read-only canned query includes the hidden CSRF token field added in #798 for writable canned queries (#698). 
This means that submitting those read-only forms exposes the CSRF token in the URL - for example on https://latest.datasette.io/fixtures/neighborhood_search submitting the form took me to: https://latest.datasette.io/fixtures/neighborhood_search?text=down&csrftoken=IlFubnoxVVpLU1NGT3NMVUoi.HbOPd2YH_epQmp8f_aAt0s-MxtU This token could potentially leak to an attacker if the resulting page has a link to an external site on it and the user clicks the link, since the token would be exposed in the referrer logs.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/918/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 665817570,MDU6SXNzdWU2NjU4MTc1NzA=,125,"Output binary columns in ""sqlite-utils query"" JSON",9599,simonw,closed,0,,,,,4,2020-07-26T16:47:02Z,2020-07-27T00:49:41Z,2020-07-27T00:48:45Z,OWNER,,You get an error if you try to run a query that returns data from a BLOB.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/125/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 660355904,MDU6SXNzdWU2NjAzNTU5MDQ=,43,github-to-sqlite tags command for fetching tags,9599,simonw,closed,0,,,,,4,2020-07-18T20:14:12Z,2020-07-18T23:05:56Z,2020-07-18T21:52:15Z,MEMBER,,Fetches paginated data from https://api.github.com/repos/simonw/datasette/tags,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/43/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 658476055,MDU6SXNzdWU2NTg0NzYwNTU=,896,Use white-space: pre-wrap on ALL table cell contents,9599,simonw,closed,0,,,,,4,2020-07-16T19:05:21Z,2020-07-17T01:26:08Z,2020-07-17T01:26:08Z,OWNER,,"Is there any reason NOT to apply `white-space: pre-wrap` to the contents of all table cells in Datasette? The default display mechanism of HTML (stripping leading/trailing whitespace and collapsing all other whitespace) doesn't really make sense for displaying the kind of data that Datasette works with.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/896/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 648749062,MDExOlB1bGxSZXF1ZXN0NDQyNTA1MDg4,883,Skip counting hidden tables,3243482,abdusco,open,0,,,,,4,2020-07-01T07:38:08Z,2020-07-02T00:25:44Z,,CONTRIBUTOR,simonw/datasette/pulls/883,"Potential fix for https://github.com/simonw/datasette/issues/859. Disabling table counts for hidden tables speeds up the database page quite a bit. In my setup it reduced load time by 2/3 (~300 -> ~90ms)",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/883/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 637966833,MDU6SXNzdWU2Mzc5NjY4MzM=,840,Log out mechanism for clearing ds_actor cookie,9599,simonw,closed,0,,,5533512,Datasette 0.45,4,2020-06-12T19:41:51Z,2020-06-29T04:31:43Z,2020-06-29T04:31:43Z,OWNER,,"Need a cookie clearing mechanism and a way to show that you are logged in.
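For illustration, a minimal logout view might look like this sketch (an assumption, not the implementation that eventually shipped):
```python
from datasette import hookimpl
from datasette.utils.asgi import Response

async def logout(request):
    response = Response.redirect('/')
    # expire the signed ds_actor cookie to log the user out
    response.set_cookie('ds_actor', '', expires=0)
    return response

@hookimpl
def register_routes():
    return [(r'^/-/logout$', logout)]
```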
`datasette-auth-github` had a solution for this that can be pulled into core.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/840/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 638259643,MDU6SXNzdWU2MzgyNTk2NDM=,847,Take advantage of .coverage being a SQLite database,9599,simonw,closed,0,,,,,4,2020-06-14T00:41:25Z,2020-06-28T20:50:21Z,2020-06-28T20:50:21Z,OWNER,,"The `.coverage` file generated by running `pytest-cov` is now a SQLite database! I could do something interesting with this. Maybe after each test run for a new commit I could store that database file somewhere? Lots of interesting challenges here. I got a change into `coveragepy` last year which helps make the custom SQL functions available for doing fun things in Datasette: https://github.com/nedbat/coveragepy/issues/868 Bigger challenge: if I have a DB file for every commit, that's hundreds (potentially thousands) of DB files. Datasette isn't designed to handle thousands of files like that. So, do I figure out how to have Datasette open a file on-command for just a single request? Or, an easier option, do I copy data from those files into a single database with a modified schema to include the commit hash in each table row? (Following on from #841 and #844)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/847/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 635108074,MDU6SXNzdWU2MzUxMDgwNzQ=,824,Example authentication plugin,9599,simonw,closed,0,,,5512395,Datasette 0.44,4,2020-06-09T04:49:53Z,2020-06-12T00:11:51Z,2020-06-12T00:11:50Z,OWNER,,https://github.com/simonw/datasette-auth-github/issues/62 will work for this.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/824/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 637253789,MDU6SXNzdWU2MzcyNTM3ODk=,833,/-/metadata and so on should respect view-instance permission,9599,simonw,closed,0,,,5512395,Datasette 0.44,4,2020-06-11T19:07:21Z,2020-06-11T22:15:32Z,2020-06-11T22:14:59Z,OWNER,,"The only URLs that should be available without authentication at all times are the `/-/static/` prefix, to allow for HTTP caching.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/833/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 632919570,MDExOlB1bGxSZXF1ZXN0NDI5NjEzODkz,809,Publish secrets,9599,simonw,closed,0,,,5512395,Datasette 0.44,4,2020-06-07T02:00:31Z,2020-06-11T16:02:13Z,2020-06-11T16:02:03Z,OWNER,simonw/datasette/pulls/809,Refs #787. Will need quite a bit of manual testing since this involves code which runs against Heroku and Cloud Run.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/809/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 626171242,MDU6SXNzdWU2MjYxNzEyNDI=,777,Error pages not correctly loading CSS,9599,simonw,closed,0,,,5512395,Datasette 0.44,4,2020-05-28T02:47:52Z,2020-06-09T00:35:29Z,2020-06-09T00:35:29Z,OWNER,,"e.g. 
https://latest.datasette.io/fixtures/compound_three_primary_keys.tsv?_size=max The HTML starts like this: ```html Error 404 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/777/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 570301333,MDU6SXNzdWU1NzAzMDEzMzM=,684,Add documentation on Database introspection methods to internals.rst,9599,simonw,closed,0,,,3268330,Datasette 1.0,4,2020-02-25T04:20:24Z,2020-06-04T18:56:15Z,2020-05-30T18:40:39Z,OWNER,,`internals.rst` will be landing as part of #683,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/684/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 625930207,MDU6SXNzdWU2MjU5MzAyMDc=,770,register_output_renderer can_render mechanism,9599,simonw,closed,0,,,5471110,Datasette 0.43,4,2020-05-27T18:29:14Z,2020-05-28T05:57:16Z,2020-05-28T05:57:16Z,OWNER,,"What I would like is the ability for renderers to opt-in / opt-out of being displayed as options on the page. https://www.niche-museums.com/browse/museums for example shows an atom link because the datasette-atom plugin is installed... but clicking it will give you a 400 error because the correct columns are not present. Here's the code that passes a list of renderers to the template: https://github.com/simonw/datasette/blob/2d099ad9c657d2cab59de91cdb8bfed2da236ef6/datasette/views/base.py#L411-L423 A renderer is currently defined as a two-key dictionary: ```python @hookimpl def register_output_renderer(datasette): return { 'extension': 'test', 'callback': render_test } ``` I can add a third key, `""should_suggest""`, which is a function that returns `True` or `False` for a given query. If that key is missing it is assumed to return `True`. One catch: what arguments should be passed to the `should_suggest(...)` function? UPDATE: now calling it `can_render` instead. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/581#issuecomment-634856748_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/770/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611540797,MDU6SXNzdWU2MTE1NDA3OTc=,751,Ability to set custom default _size on a per-table basis,9599,simonw,closed,0,,,5471110,Datasette 0.43,4,2020-05-04T00:13:03Z,2020-05-28T05:00:22Z,2020-05-28T05:00:20Z,OWNER,,"I have some tables where I'd like the default page size to be 10, without affecting the rest of my Datasette instance.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/751/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 530653633,MDU6SXNzdWU1MzA2NTM2MzM=,645,Mechanism for register_output_renderer to suggest extension or not,9599,simonw,closed,0,,,,,4,2019-12-01T01:26:27Z,2020-05-28T02:22:18Z,2020-05-28T02:22:12Z,OWNER,,"[datasette-atom](https://github.com/simonw/datasette-atom) only works if the user constructs a SQL query with specific output columns (`atom_id`, `atom_updated` etc). It would be good if the `.atom` link wasn't shown on the query/table page unless those columns were present.
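A hypothetical `can_render` callback for the atom case might look like this (a sketch - the exact arguments it receives are the open question above):
```python
def can_render_atom(columns):
    # only offer the .atom link when the required columns are present
    return {'atom_id', 'atom_updated', 'atom_title'}.issubset(columns)
```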
Right now you get a link which results in a 400 error. See also #581.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/645/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 615626118,MDU6SXNzdWU2MTU2MjYxMTg=,22,Try out ExifReader,9599,simonw,open,0,,,,,4,2020-05-11T06:32:13Z,2020-05-14T05:59:53Z,,MEMBER,,"https://pypi.org/project/ExifReader/ New fork that should be able to handle EXIF in HEIC files. Forked here: https://github.com/ianare/exif-py/issues/102#issuecomment-626376522 Refs #3 ",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 322283067,MDU6SXNzdWUzMjIyODMwNjc=,254,Escaping named parameters in canned queries,247131,philroche,closed,0,,,,,4,2018-05-11T12:43:30Z,2020-05-10T14:54:14Z,2020-05-10T14:54:13Z,NONE,,"Thank you very much for this project. I have created some canned queries but some of the filters include a colon, e.g. ""com.ubuntu.cloud:server:18.04:amd64"". When saved, these colons are parsed as named parameters. Is there a way to escape colons in a canned query?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/254/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611284481,MDU6SXNzdWU2MTEyODQ0ODE=,38,[Feature Request] Support Repo Name in Search 🥺,5779832,zzeleznick,closed,0,,,,,4,2020-05-02T22:08:51Z,2020-05-03T02:34:32Z,2020-05-02T23:15:11Z,NONE,,"## Description Per your [v2.2 release tweet](https://twitter.com/simonw/status/1256700238099693568) I played with the demo, but the output did not match my expectations. ## Expected Behavior Expected a search query for ""twitter"" contained within the `repo` column to return non-zero results. ## Actual Behavior 😭 [0 rows where repo contains ""twitter"" sorted by starred_at descending](https://github-to-sqlite.dogsheep.net/github/stars?repo__contains=twitter&_sort_desc=starred_at) ## Best Explanation Per the table schema (see appendix) `repo` is of type `INTEGER`, which is built from `repo_id` and does not expose the repo name in search. ## Desired Behavior Given that searching for ""206156866"" is less intuitive than ""twitter"", it would be great to support this via extending the search capabilities or by adding an additional column.
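A minimal sketch of the additional-column approach, using sqlite-utils - `repo_name` is a hypothetical column and `full_name` is assumed to exist on the `repos` table:
```python
import sqlite_utils

db = sqlite_utils.Database('github.db')
# Hypothetical post-processing step: denormalize the repo's
# full_name onto stars so name searches match directly.
db['stars'].add_column('repo_name', str)
db.execute(
    'update stars set repo_name = '
    '(select full_name from repos where repos.id = stars.repo)'
)
db.conn.commit()
```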
✅ 104 rows where repo contains ""twitter"" ❌ [104 rows where repo contains ""206156866"" sorted by starred_at descending](https://github-to-sqlite.dogsheep.net/github/stars?repo__contains=206156866&_sort_desc=starred_at) ## Appendix ``` CREATE TABLE [stars] ( [user] INTEGER REFERENCES [users]([id]), [repo] INTEGER REFERENCES [repos]([id]), [starred_at] TEXT, PRIMARY KEY ([user], [repo]) ); CREATE INDEX [idx_stars_repo] ON [stars] ([repo]); CREATE INDEX [idx_stars_user] ON [stars] ([user]); ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 609950090,MDU6SXNzdWU2MDk5NTAwOTA=,33,Fall back to authentication via ENV,2029,garethr,closed,0,,,,,4,2020-04-30T12:58:14Z,2020-05-02T18:46:10Z,2020-05-02T18:45:37Z,NONE,,"Would you accept a PR that falls back to looking for an environment variable for the GitHub token? Specifically a change here: https://github.com/dogsheep/github-to-sqlite/blob/c34d5a18bfc41fa08755ba3d5cf9fe09ff204238/github_to_sqlite/cli.py#L271 I'd like to use `github-to-sqlite` in a GitHub Action workflow and this would be simpler than trying to fill out the prompt or generate a file with sensitive content. Wanted to check first, I'm happy to submit a PR with tests and updates to the docs. ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610192152,MDU6SXNzdWU2MTAxOTIxNTI=,747,Directory configuration mode should support metadata.yaml,9599,simonw,closed,0,,,,,4,2020-04-30T16:05:30Z,2020-04-30T19:04:19Z,2020-04-30T19:04:19Z,OWNER,,Refs #739 - `metadata.yml` or `metadata.yaml` should be detected in the same way as `metadata.json` is.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/747/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 607770595,MDU6SXNzdWU2MDc3NzA1OTU=,743,escape_fts() does not correctly escape * wildcards,9599,simonw,closed,0,,,,,4,2020-04-27T18:48:53Z,2020-04-27T19:11:30Z,2020-04-27T19:11:01Z,OWNER,,"Spotted in #732. This should not return any results... but it does: https://latest.datasette.io/fixtures/searchable?_search=bar%2A&_trace=1 The query from trace is: ``` ""sql"": ""select count(*) from searchable where rowid in (select rowid from searchable_fts where searchable_fts match escape_fts(:search))"", ""params"": { ""search"": ""bar*"" } ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/743/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 607067303,MDExOlB1bGxSZXF1ZXN0NDA5MTIzODk3,737,"Custom pages mechanism, refs #648",9599,simonw,closed,0,,,,,4,2020-04-26T17:31:41Z,2020-04-26T18:46:43Z,2020-04-26T18:46:43Z,OWNER,simonw/datasette/pulls/737,"Refs #648. 
TODO: - [x] Pass a `view_name` to `render_template()` - [x] Mechanism for custom status code / headers / redirect - [x] Documentation",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/737/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 549287310,MDU6SXNzdWU1NDkyODczMTA=,76,order_by mechanism,10501166,metab0t,closed,0,,,,,4,2020-01-14T02:06:03Z,2020-04-16T06:23:29Z,2020-04-16T03:13:06Z,NONE,,"In some cases, I want to iterate rows in a table with an `ORDER BY` clause. It would be nice to have a `rows_order_by` function similar to `rows_where`. In a more general case, a `rows_filter` function might be added to allow more customized filtering to iterate rows.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/76/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 593751293,MDU6SXNzdWU1OTM3NTEyOTM=,97,"Adding a ""recreate"" flag to the `Database` constructor",1448859,betatim,closed,0,,,,,4,2020-04-04T05:41:10Z,2020-04-15T14:29:31Z,2020-04-13T03:52:29Z,NONE,,"I have a [script](https://github.com/betatim/binder-datasette/blob/master/create-db.ipynb) that imports data into a sqlite DB. When I re-run that script I'd like to remove the existing sqlite DB, instead of adding to it. The pragmatic answer is to add the check and file deletion to my script. However I thought it would be easy and useful for others to add a `recreate=True` flag to `db = sqlite_utils.Database(""binder-launches.db"")`. After taking a look at the code for it I am not so sure any more. This is because the connection string could be a URL (or ""connection string"") like `""file:///tmp/foo.db""`. I don't know what the equivalent of `os.path.exists()` is for a connection string or how to detect that something is a connection string and raise an error ""can't use recreate=True and conn_string at the same time"". Does anyone have an idea/suggestion where to start investigating?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/97/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 546051181,MDU6SXNzdWU1NDYwNTExODE=,16,Exception running first command: IndexError: list index out of range,15092,jayvdb,closed,0,,,,,4,2020-01-07T03:01:58Z,2020-04-14T18:37:21Z,2020-04-14T18:37:21Z,NONE,,"Exception running first command without an existing db or auth.
```py > mkdir ~/.github/coala > /usr/bin/github-to-sqlite repos ~/.github/coala coala Traceback (most recent call last): File ""/usr/bin/github-to-sqlite"", line 11, in load_entry_point('github-to-sqlite==0.6', 'console_scripts', 'github-to-sqlite')() File ""/usr/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/usr/lib/python3.7/site-packages/github_to_sqlite/cli.py"", line 163, in repos utils.save_repo(db, repo) File ""/usr/lib/python3.7/site-packages/github_to_sqlite/utils.py"", line 120, in save_repo to_save[""owner""] = save_user(db, to_save[""owner""]) File ""/usr/lib/python3.7/site-packages/github_to_sqlite/utils.py"", line 61, in save_user return db[""users""].upsert(to_save, pk=""id"", alter=True).last_pk File ""/usr/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1135, in upsert extracts=extracts, File ""/usr/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1162, in upsert_all upsert=True, File ""/usr/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1105, in insert_all row = list(self.rows_where(""rowid = ?"", [self.last_rowid]))[0] IndexError: list index out of range ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 544571092,MDU6SXNzdWU1NDQ1NzEwOTI=,15,Assets table with downloads,2029,garethr,closed,0,,,5225818,1.0,4,2020-01-02T13:05:28Z,2020-03-28T12:17:01Z,2020-03-23T19:17:32Z,NONE,,"The `releases` command extracts the releases table, but data about the individual assets are locked up in the JSON document in the `assets` field. My main interest is in individual and aggregate download counts. I was wondering if creating a new table with a record per asset may be useful? If so I'm happy to send a PR when I get a moment. Do you have opinions about that simply being part of the `releases` command or would you prefer a separate command as well?",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503233021,MDU6SXNzdWU1MDMyMzMwMjE=,1,Use better pagination (and implement progress bar),9599,simonw,closed,0,,,,,4,2019-10-07T04:58:11Z,2020-03-27T22:13:57Z,2020-03-27T22:13:57Z,MEMBER,,"Right now we attempt to load everything at once - which caps out at 5,000 items and is really slow. 
We can do better by implementing pagination using count and offset.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 490803176,MDU6SXNzdWU0OTA4MDMxNzY=,8,--sql and --attach options for feeding commands from SQL queries,9599,simonw,closed,0,,,,,4,2019-09-08T20:35:49Z,2020-03-20T23:13:01Z,2020-03-20T23:13:01Z,MEMBER,,"Say you want to fetch Twitter profiles for a list of accounts that are stored in another database: $ twitter-to-sqlite users-lookup users.db --attach attending.db \ --sql ""select Twitter from attending.attendes where Twitter is not null"" The SQL query you feed in is expected to return a list of screen names suitable for processing further by the command. Should be supported by all three of: - [x] `twitter-to-sqlite users-lookup` - [x] `twitter-to-sqlite user-timeline` - [x] `twitter-to-sqlite followers` and `friends` The `--attach` option allows other SQLite databases to be attached to the connection. Without it the SQL query will have to read from the single attached database.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275087397,MDU6SXNzdWUyNzUwODczOTc=,120,Plugin that adds an authentication layer of some sort,9599,simonw,closed,0,,,,,4,2017-11-18T15:39:13Z,2020-03-16T18:48:06Z,2020-03-16T18:48:06Z,OWNER,,"Would allow people who want to host private data to do so.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/120/reactions"", ""total_count"": 7, ""+1"": 5, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 2, ""rocket"": 0, ""eyes"": 0}",,completed 569317377,MDU6SXNzdWU1NjkzMTczNzc=,681,Cache-header missing in http-response,2181410,clausjuhl,closed,0,,,,,4,2020-02-22T10:50:45Z,2020-02-24T20:53:57Z,2020-02-24T20:53:56Z,NONE,,"Hi Simon. I need some help with both understanding and adding http-headers. If I call datasette on localhost with --config default_cache_ttl:120 and --cors, I only get the following response-headers: access-control-allow-origin: * content-type: text/html; charset=utf-8 date: Sat, 22 Feb 2020 10:32:15 GMT referrer-policy: no-referrer server: uvicorn transfer-encoding: chunked CORS works, but no caching-header is set? Same thing happens if I use the command in a Dockerfile and run datasette with docker. Second, how can one add headers to uvicorn? I've tried to add uvicorn commands to the Dockerfile, before the final datasette command, but it doesn't work. Is there any way to add headers to the uvicorn.run() command in datasette?
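A minimal sketch of one possible workaround - wrapping the ASGI app to inject headers before uvicorn serves it (untested, and every name here is illustrative):
```python
# Untested sketch: wrap an ASGI app so every response gets some
# extra headers added just before it starts.
EXTRA_HEADERS = [
    (b'x-content-type-options', b'nosniff'),
    (b'x-frame-options', b'DENY'),
]

def with_extra_headers(app):
    async def wrapped(scope, receive, send):
        async def send_wrapper(event):
            if event['type'] == 'http.response.start':
                event = dict(event)
                event['headers'] = list(event.get('headers', [])) + EXTRA_HEADERS
            await send(event)
        await app(scope, receive, send_wrapper)
    return wrapped
```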
In particular, I would like to add some of the missing security-headers. Thank you for a great product!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/681/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 345821500,MDU6SXNzdWUzNDU4MjE1MDA=,352,render_cell(value) plugin hook,9599,simonw,closed,0,,,,,4,2018-07-30T15:56:20Z,2020-02-10T16:18:58Z,2018-08-05T00:14:57Z,OWNER,,To allow plugins to customize how values matching a specific pattern are displayed in the HTML table view.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/352/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 558600274,MDU6SXNzdWU1NTg2MDAyNzQ=,81,"Remove .detect_column_types() from table, make it a documented API",9599,simonw,closed,0,,,,,4,2020-02-01T21:25:54Z,2020-02-01T21:55:35Z,2020-02-01T21:55:35Z,OWNER,,"I used it in `geojson-to-sqlite` here: https://github.com/simonw/geojson-to-sqlite/blob/f10e44264712dd59ae7dfa2e6fd5a904b682fb33/geojson_to_sqlite/utils.py#L45-L50 It would make more sense for this method to live on the Database rather than the Table - or even to exist as a separate utility method entirely. Then it should be documented.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/81/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520715188,MDU6SXNzdWU1MjA3MTUxODg=,622,Datasette should work with Python 3.8 (and drop compatibility with Python 3.5),9599,simonw,closed,0,,,,,4,2019-11-11T03:12:36Z,2019-11-12T05:52:49Z,2019-11-12T05:09:13Z,OWNER,,"See #595, #594, #404.
The big thing holding me back from ditching Python 3.5 was glitch.com - but they now offer Python 3.7: https://support.glitch.com/t/can-you-upgrade-python-to-latest-version/7980/25?u=simonw",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/622/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 518725064,MDU6SXNzdWU1MTg3MjUwNjQ=,29,`import` command fails on empty files,21148,jacobian,closed,0,,,,,4,2019-11-06T20:34:26Z,2019-11-09T20:33:38Z,2019-11-09T19:36:36Z,CONTRIBUTOR,,"If a file in the export is empty (in my case it was `account-suspensions.js`), `twitter-to-sqlite import` fails: ``` $ twitter-to-sqlite import twitter.db ~/Downloads/twitter-2019-11-06-926f4f3be4b3b1fcb1aa387c40cd14f7c8aaf9bbcdb2d78ac14d9989add501bb.zip Traceback (most recent call last): File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/bin/twitter-to-sqlite"", line 10, in sys.exit(cli()) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py"", line 627, in import_ archive.import_from_file(db, filename, content) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/archive.py"", line 224, in import_from_file db[table_name].upsert_all(rows, hash_id=""pk"") File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1113, in upsert_all extracts=extracts, File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/sqlite_utils/db.py"", line 980, in insert_all first_record = next(records) StopIteration ``` This appears to be because `db.upsert_all` is called with no rows -- I think? I hacked around this by modifying `import_from_file` to have an `if rows:` clause: ``` for table, rows in to_insert.items(): if rows: table_name = ""archive_{}"".format(table.replace(""-"", ""_"")) ... 
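# Note on the guard above: upsert_all() eventually calls next() on the
# rows generator, so passing an empty iterable is what raised the
# StopIteration in the traceback - skipping empty tables avoids it.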
``` I'm happy to work up a real PR if that's the right approach, but I'm not sure it is.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 519038979,MDU6SXNzdWU1MTkwMzg5Nzk=,10,Failed to import workout points,9599,simonw,closed,0,,,,,4,2019-11-07T04:50:22Z,2019-11-08T01:18:37Z,2019-11-08T01:18:37Z,MEMBER,,"I just ran the script and it failed to import any `workout_points`, though it did import `workouts`.",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 514459062,MDU6SXNzdWU1MTQ0NTkwNjI=,27,retweets-of-me command,9599,simonw,closed,0,,,,,4,2019-10-30T07:43:01Z,2019-11-03T01:12:58Z,2019-11-03T01:12:58Z,MEMBER,,https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-retweets_of_me,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487600595,MDU6SXNzdWU0ODc2MDA1OTU=,3,Option to fetch only checkins more recent than the current max checkin,9599,simonw,closed,0,,,,,4,2019-08-30T17:46:45Z,2019-10-16T20:41:23Z,2019-10-16T20:39:59Z,MEMBER,,"The Foursquare checkins API supports ""return every checkin occurring after this point"" - I can pass it the maximum createdAt date currently stored in the database. This will allow for quick incremental fetches via a cron.",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 505928530,MDU6SXNzdWU1MDU5Mjg1MzA=,18,Command to import home-timeline,9599,simonw,closed,0,,,,,4,2019-10-11T15:47:54Z,2019-10-11T16:51:33Z,2019-10-11T16:51:12Z,MEMBER,,"Feature request: https://twitter.com/johankj/status/1182563563136868352 > Would it be possible to save all tweets in my timeline from the last X days? I would love to see how big a percentage some users are of my daily timeline as a metric on whether I should unfollow them/move them to a list.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 481885279,MDU6SXNzdWU0ODE4ODUyNzk=,569,More advanced connection pooling,9599,simonw,open,0,,,,,4,2019-08-17T13:20:41Z,2019-10-02T22:44:37Z,,OWNER,,"We need a much smarter way of handling database connections. Today, connections are simple: Datasette runs a number of threads (defaults to 3) and each thread gets a threadlocal read-only (or immutable) connection to each attached database - opened on demand. For Datasette Library (#417) I want to support potentially hundreds of attached databases. Datasette Edit (#567) is going to introduce a need for writable connections too. I'd also like to be able to run joins across multiple databases (#283) which further complicates things. 
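A minimal sketch of the kind of pooling this might need (illustrative only - the cap, the eviction policy and the read-only assumption are all up for debate):
```python
import sqlite3
from collections import OrderedDict

class ConnectionPool:
    # Illustrative only: keep at most max_open read-only SQLite
    # connections alive, evicting the least recently used one.
    def __init__(self, max_open=100):
        self.max_open = max_open
        self._open = OrderedDict()  # path -> connection

    def get(self, path):
        conn = self._open.pop(path, None)
        if conn is None:
            conn = sqlite3.connect('file:{}?mode=ro'.format(path), uri=True)
        self._open[path] = conn  # mark as most recently used
        if len(self._open) > self.max_open:
            _, oldest = self._open.popitem(last=False)
            oldest.close()
        return conn
```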
Supporting thousands of open SQLite connections at once feels like it won't provide good enough performance (though I should benchmark that to be sure). Some kind of connection pooling is likely to be necessary.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/569/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 487987958,MDExOlB1bGxSZXF1ZXN0MzEzMTA1NjM0,57,Add triggers while enabling FTS,49260,amjith,closed,0,,,,,4,2019-09-02T04:23:40Z,2019-09-03T01:03:59Z,2019-09-02T23:42:29Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/57,"This adds the option for a user to set up triggers in the database to keep their FTS table in sync with the parent table. Ref: https://sqlite.org/fts5.html#external_content_and_contentless_tables I would prefer to make the creation of triggers the default behavior, but that will break existing usage where people have been calling `populate_fts` after inserting new rows. I am happy to make changes to the PR as you see fit. ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/57/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 472429048,MDU6SXNzdWU0NzI0MjkwNDg=,9,Too many SQL variables,166463,tholo,closed,0,,,,,4,2019-07-24T18:24:17Z,2019-07-26T10:01:05Z,2019-07-26T10:01:05Z,NONE,,"Decided to try importing my data, and ran into this: ``` Traceback (most recent call last): File ""/Users/tholo/Source/health/bin/healthkit-to-sqlite"", line 10, in sys.exit(cli()) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/cli.py"", line 50, in cli convert_xml_to_sqlite(fp, db, progress_callback=bar.update) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py"", line 41, in convert_xml_to_sqlite write_records(records, db) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py"", line 80, in write_records column_order=[""startDate"", ""endDate"", ""value"", ""unit""], File ""/Users/tholo/Source/health/lib/python3.7/site-packages/sqlite_utils/db.py"", line 911, in insert_all result = self.db.conn.execute(sql, values) sqlite3.OperationalError: too many SQL variables ``` Added some debug output in sqlite_utils/db.py, which resulted in: ``` INSERT INTO [rBodyMassIndex] ([creationDate], [endDate], [metadata_HKWasUserEntered], [metadata_Health Mate App Version], [metadata_Modified Date], [metadata_Withings Link], [metadata_Withings User Identifier], [sourceName], [sourceVersion], [startDate], [unit], [value]) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) 
, (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) 
, (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) ; ``` with the attached data: ``` ['2019-06-27 22:55:10 -0700', '2011-06-22 21:05:53 -0700', '0', '4.4.2', '2011-06-23 04:05:53 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308801953&type=1', '301293', 'Health Mate', '4040200', '2011-06-22 21:05:53 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-06-23 09:36:27 -0700', '0', '4.4.2', '2011-06-23 16:36:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308846987&type=1', '301293', 'Health Mate', '4040200', '2011-06-23 09:36:27 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-06-23 23:54:07 -0700', '0', '4.4.2', '2011-06-24 06:55:19 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308898447&type=1', '301293', 'Health Mate', '4040200', '2011-06-23 23:54:07 -0700', 'count', '30.679', '2019-06-27 22:55:10 -0700', '2011-06-24 09:13:40 -0700', '0', '4.4.2', '2011-06-24 16:14:35 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308932020&type=1', '301293', 'Health Mate', '4040200', '2011-06-24 09:13:40 -0700', 'count', '30.3549', '2019-06-27 22:55:10 -0700', '2011-06-25 08:30:08 -0700', '0', '4.4.2', '2011-06-25 15:30:49 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309015808&type=1', '301293', 'Health Mate', '4040200', '2011-06-25 08:30:08 -0700', 'count', '30.3395', '2019-06-27 22:55:10 -0700', '2011-06-26 07:47:51 -0700', '0', '4.4.2', '2011-06-26 14:48:27 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309099671&type=1', '301293', 'Health Mate', '4040200', '2011-06-26 07:47:51 -0700', 'count', '30.2315', '2019-06-27 22:55:10 -0700', '2011-06-28 08:48:26 -0700', '0', '4.4.2', '2011-06-28 15:49:13 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309276106&type=1', '301293', 'Health Mate', '4040200', '2011-06-28 08:48:26 -0700', 'count', '30.0617', '2019-06-27 22:55:10 -0700', '2011-06-29 09:21:16 -0700', '0', '4.4.2', '2011-06-29 16:21:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309364476&type=1', '301293', 'Health Mate', '4040200', '2011-06-29 09:21:16 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-06-30 08:41:46 -0700', '0', '4.4.2', '2011-06-30 15:42:30 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309448506&type=1', '301293', 'Health Mate', '4040200', '2011-06-30 08:41:46 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2011-07-01 09:05:28 -0700', '0', '4.4.2', '2011-07-01 16:06:24 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309536328&type=1', '301293', 'Health Mate', '4040200', '2011-07-01 09:05:28 -0700', 'count', '29.8611', '2019-06-27 22:55:10 -0700', '2011-07-02 08:58:50 -0700', '0', '4.4.2', '2011-07-02 15:59:40 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309622330&type=1', '301293', 'Health Mate', '4040200', '2011-07-02 08:58:50 -0700', 'count', '29.8765', '2019-06-27 22:55:10 -0700', '2011-07-04 09:33:43 -0700', '0', '4.4.2', '2011-07-04 16:34:19 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309797223&type=1', '301293', 'Health Mate', '4040200', '2011-07-04 09:33:43 -0700', 'count', '30.0309', '2019-06-27 22:55:10 -0700', '2011-07-06 09:40:23 -0700', '0', '4.4.2', '2011-07-06 16:41:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309970423&type=1', '301293', 'Health Mate', '4040200', '2011-07-06 09:40:23 -0700', 'count', '30.1852', '2019-06-27 22:55:10 -0700', '2011-07-08 08:08:48 -0700', '0', '4.4.2', '2011-07-08 
15:09:51 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310137728&type=1', '301293', 'Health Mate', '4040200', '2011-07-08 08:08:48 -0700', 'count', '30.0309', '2019-06-27 22:55:10 -0700', '2011-07-09 08:31:05 -0700', '0', '4.4.2', '2011-07-09 15:31:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310225465&type=1', '301293', 'Health Mate', '4040200', '2011-07-09 08:31:05 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-07-10 08:14:36 -0700', '0', '4.4.2', '2011-07-10 15:15:12 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310310876&type=1', '301293', 'Health Mate', '4040200', '2011-07-10 08:14:36 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-07-12 07:55:21 -0700', '0', '4.4.2', '2011-07-12 14:55:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310482521&type=1', '301293', 'Health Mate', '4040200', '2011-07-12 07:55:21 -0700', 'count', '30.108', '2019-06-27 22:55:10 -0700', '2011-07-13 08:48:05 -0700', '0', '4.4.2', '2011-07-13 15:48:42 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310572085&type=1', '301293', 'Health Mate', '4040200', '2011-07-13 08:48:05 -0700', 'count', '30', '2019-06-27 22:55:10 -0700', '2011-07-14 09:05:16 -0700', '0', '4.4.2', '2011-07-14 16:05:57 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310659516&type=1', '301293', 'Health Mate', '4040200', '2011-07-14 09:05:16 -0700', 'count', '29.9074', '2019-06-27 22:55:10 -0700', '2011-07-15 07:09:56 -0700', '0', '4.4.2', '2011-07-15 14:10:35 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310738996&type=1', '301293', 'Health Mate', '4040200', '2011-07-15 07:09:56 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-07-16 09:26:04 -0700', '0', '4.4.2', '2011-07-16 16:26:44 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310833564&type=1', '301293', 'Health Mate', '4040200', '2011-07-16 09:26:04 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-07-17 09:52:59 -0700', '0', '4.4.2', '2011-07-17 16:53:38 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310921579&type=1', '301293', 'Health Mate', '4040200', '2011-07-17 09:52:59 -0700', 'count', '29.8765', '2019-06-27 22:55:10 -0700', '2011-07-19 08:56:16 -0700', '0', '4.4.2', '2011-07-19 15:57:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311090976&type=1', '301293', 'Health Mate', '4040200', '2011-07-19 08:56:16 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-07-21 08:21:20 -0700', '0', '4.4.2', '2011-07-21 15:22:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311261680&type=1', '301293', 'Health Mate', '4040200', '2011-07-21 08:21:20 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-07-23 08:49:56 -0700', '0', '4.4.2', '2011-07-23 15:50:40 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311436196&type=1', '301293', 'Health Mate', '4040200', '2011-07-23 08:49:56 -0700', 'count', '29.7222', '2019-06-27 22:55:10 -0700', '2011-07-24 09:17:35 -0700', '0', '4.4.2', '2011-07-24 16:18:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311524255&type=1', '301293', 'Health Mate', '4040200', '2011-07-24 09:17:35 -0700', 'count', '29.5833', '2019-06-27 22:55:10 -0700', '2011-07-25 07:51:55 -0700', '0', '4.4.2', '2011-07-25 14:52:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311605515&type=1', '301293', 'Health Mate', '4040200', '2011-07-25 07:51:55 -0700', 'count', '29.5525', '2019-06-27 22:55:10 -0700', 
'2011-08-06 10:04:05 -0700', '0', '4.4.2', '2011-08-06 17:04:47 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312650245&type=1', '301293', 'Health Mate', '4040200', '2011-08-06 10:04:05 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-08-08 07:52:22 -0700', '0', '4.4.2', '2011-08-08 14:53:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312815142&type=1', '301293', 'Health Mate', '4040200', '2011-08-08 07:52:22 -0700', 'count', '29.6605', '2019-06-27 22:55:10 -0700', '2011-08-10 07:57:30 -0700', '0', '4.4.2', '2011-08-10 14:58:12 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312988250&type=1', '301293', 'Health Mate', '4040200', '2011-08-10 07:57:30 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-08-12 07:51:14 -0700', '0', '4.4.2', '2011-08-12 14:51:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1313160674&type=1', '301293', 'Health Mate', '4040200', '2011-08-12 07:51:14 -0700', 'count', '29.6914', '2019-06-27 22:55:10 -0700', '2011-08-13 07:45:28 -0700', '0', '4.4.2', '2011-08-13 14:46:08 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1313246728&type=1', '301293', 'Health Mate', '4040200', '2011-08-13 07:45:28 -0700', 'count', '29.5833', '2019-06-27 22:55:10 -0700', '2011-08-17 09:06:20 -0700', '0', '4.4.2', '2011-08-17 16:07:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1313597180&type=1', '301293', 'Health Mate', '4040200', '2011-08-17 09:06:20 -0700', 'count', '29.5679', '2019-06-27 22:55:10 -0700', '2011-08-22 08:28:08 -0700', '0', '4.4.2', '2011-08-22 15:28:57 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314026888&type=1', '301293', 'Health Mate', '4040200', '2011-08-22 08:28:08 -0700', 'count', '29.9846', '2019-06-27 22:55:10 -0700', '2011-08-25 08:59:30 -0700', '0', '4.4.2', '2011-08-25 16:00:15 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314287970&type=1', '301293', 'Health Mate', '4040200', '2011-08-25 08:59:30 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-08-30 08:13:59 -0700', '0', '4.4.2', '2011-08-30 15:46:08 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314717239&type=1', '301293', 'Health Mate', '4040200', '2011-08-30 08:13:59 -0700', 'count', '29.784', '2019-06-27 22:55:10 -0700', '2011-09-12 08:47:51 -0700', '0', '4.4.2', '2011-09-12 15:48:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1315842471&type=1', '301293', 'Health Mate', '4040200', '2011-09-12 08:47:51 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-09-13 09:17:27 -0700', '0', '4.4.2', '2011-09-13 16:48:30 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1315930647&type=1', '301293', 'Health Mate', '4040200', '2011-09-13 09:17:27 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-10-01 09:12:20 -0700', '0', '4.4.2', '2011-10-01 16:13:00 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1317485540&type=1', '301293', 'Health Mate', '4040200', '2011-10-01 09:12:20 -0700', 'count', '29.8148', '2019-06-27 22:55:10 -0700', '2011-10-11 11:14:11 -0700', '0', '4.4.2', '2011-10-11 18:15:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1318356851&type=1', '301293', 'Health Mate', '4040200', '2011-10-11 11:14:11 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-10-16 09:29:47 -0700', '0', '4.4.2', '2011-10-16 16:30:39 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1318782587&type=1', '301293', 'Health Mate', '4040200', '2011-10-16 09:29:47 
-0700', 'count', '29.6914', '2019-06-27 22:55:10 -0700', '2011-10-19 09:21:44 -0700', '0', '4.4.2', '2011-10-19 16:22:25 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1319041304&type=1', '301293', 'Health Mate', '4040200', '2011-10-19 09:21:44 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-10-24 07:04:22 -0700', '0', '4.4.2', '2011-10-24 14:05:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1319465062&type=1', '301293', 'Health Mate', '4040200', '2011-10-24 07:04:22 -0700', 'count', '29.5988', '2019-06-27 22:55:10 -0700', '2011-11-07 09:33:17 -0700', '0', '4.4.2', '2011-11-07 16:33:58 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1320683597&type=1', '301293', 'Health Mate', '4040200', '2011-11-07 09:33:17 -0700', 'count', '29.8611', '2019-06-27 22:55:10 -0700', '2011-11-10 07:59:03 -0700', '0', '4.4.2', '2011-11-10 14:59:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1320937143&type=1', '301293', 'Health Mate', '4040200', '2011-11-10 07:59:03 -0700', 'count', '29.9383', '2019-06-27 22:55:10 -0700', '2011-11-13 09:28:31 -0700', '0', '4.4.2', '2011-11-13 16:29:20 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1321201711&type=1', '301293', 'Health Mate', '4040200', '2011-11-13 09:28:31 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-11-21 08:45:06 -0700', '0', '4.4.2', '2011-11-21 15:46:04 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1321890306&type=1', '301293', 'Health Mate', '4040200', '2011-11-21 08:45:06 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-11-23 09:55:44 -0700', '0', '4.4.2', '2011-11-23 16:56:18 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322067344&type=1', '301293', 'Health Mate', '4040200', '2011-11-23 09:55:44 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2011-11-29 09:50:44 -0700', '0', '4.4.2', '2011-11-29 16:51:31 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322585444&type=1', '301293', 'Health Mate', '4040200', '2011-11-29 09:50:44 -0700', 'count', '30.1698', '2019-06-27 22:55:10 -0700', '2011-11-30 11:13:21 -0700', '0', '4.4.2', '2011-11-30 18:14:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322676801&type=1', '301293', 'Health Mate', '4040200', '2011-11-30 11:13:21 -0700', 'count', '30.0617', '2019-06-27 22:55:10 -0700', '2011-12-04 10:24:36 -0700', '0', '4.4.2', '2011-12-04 17:25:24 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1323019476&type=1', '301293', 'Health Mate', '4040200', '2011-12-04 10:24:36 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-12-10 09:22:18 -0700', '0', '4.4.2', '2011-12-10 16:23:07 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1323534138&type=1', '301293', 'Health Mate', '4040200', '2011-12-10 09:22:18 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-12-26 10:36:42 -0700', '0', '4.4.2', '2011-12-26 17:37:31 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1324921002&type=1', '301293', 'Health Mate', '4040200', '2011-12-26 10:36:42 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2012-01-11 11:24:13 -0700', '0', '4.4.2', '2012-01-11 18:25:04 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326306253&type=1', '301293', 'Health Mate', '4040200', '2012-01-11 11:24:13 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2012-01-15 10:17:09 -0700', '0', '4.4.2', '2012-01-15 17:17:51 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326647829&type=1', 
'301293', 'Health Mate', '4040200', '2012-01-15 10:17:09 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2012-01-19 09:24:32 -0700', '0', '4.4.2', '2012-01-19 16:25:21 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326990272&type=1', '301293', 'Health Mate', '4040200', '2012-01-19 09:24:32 -0700', 'count', '29.7994', '2019-06-27 22:55:10 -0700', '2012-01-29 10:26:13 -0700', '0', '4.4.2', '2012-01-29 17:26:52 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1327857973&type=1', '301293', 'Health Mate', '4040200', '2012-01-29 10:26:13 -0700', 'count', '30.0154', '2019-06-27 22:55:10 -0700', '2012-02-03 10:13:28 -0700', '0', '4.4.2', '2012-02-03 17:15:01 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1328289208&type=1', '301293', 'Health Mate', '4040200', '2012-02-03 10:13:28 -0700', 'count', '29.8457', '2019-06-27 22:55:10 -0700', '2012-02-12 09:23:01 -0700', '0', '4.4.2', '2012-02-12 16:23:53 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1329063781&type=1', '301293', 'Health Mate', '4040200', '2012-02-12 09:23:01 -0700', 'count', '30.1235', '2019-06-27 22:55:10 -0700', '2012-03-03 09:26:06 -0700', '0', '4.4.2', '2012-03-03 16:26:54 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1330791966&type=1', '301293', 'Health Mate', '4040200', '2012-03-03 09:26:06 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2012-03-11 11:23:15 -0700', '0', '4.4.2', '2012-03-11 18:24:16 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1331490195&type=1', '301293', 'Health Mate', '4040200', '2012-03-11 11:23:15 -0700', 'count', '30.2161', '2019-06-27 22:55:10 -0700', '2012-03-16 09:39:36 -0700', '0', '4.4.2', '2012-03-16 16:40:20 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1331915976&type=1', '301293', 'Health Mate', '4040200', '2012-03-16 09:39:36 -0700', 'count', '30.2778', '2019-06-27 22:55:10 -0700', '2012-03-21 08:33:07 -0700', '0', '4.4.2', '2012-03-21 15:34:00 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1332343987&type=1', '301293', 'Health Mate', '4040200', '2012-03-21 08:33:07 -0700', 'count', '30.1389', '2019-06-27 22:55:10 -0700', '2012-04-11 08:49:34 -0700', '0', '4.4.2', '2012-04-11 15:50:18 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1334159374&type=1', '301293', 'Health Mate', '4040200', '2012-04-11 08:49:34 -0700', 'count', '30.0154', '2019-06-27 22:55:10 -0700', '2012-04-13 08:32:06 -0700', '0', '4.4.2', '2012-04-13 15:32:49 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1334331126&type=1', '301293', 'Health Mate', '4040200', '2012-04-13 08:32:06 -0700', 'count', '29.9383', '2019-06-27 22:55:10 -0700', '2012-04-20 08:21:38 -0700', '0', '4.4.2', '2012-04-20 15:52:45 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1334935298&type=1', '301293', 'Health Mate', '4040200', '2012-04-20 08:21:38 -0700', 'count', '30.2006', '2019-06-27 22:55:10 -0700', '2012-04-25 09:00:01 -0700', '0', '4.4.2', '2012-04-25 16:00:42 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1335369601&type=1', '301293', 'Health Mate', '4040200', '2012-04-25 09:00:01 -0700', 'count', '30.2006', '2019-06-27 22:55:10 -0700', '2012-05-04 11:10:18 -0700', '0', '4.4.2', '2012-05-04 18:10:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1336155018&type=1', '301293', 'Health Mate', '4040200', '2012-05-04 11:10:18 -0700', 'count', '30.4321', '2019-06-27 22:55:10 -0700', '2012-05-12 09:35:00 -0700', '0', '4.4.2', '2012-05-12 16:35:43 +0000', 
'withings-bd2://timeline/measure?userid=301293&date=1336840500&type=1', '301293', 'Health Mate', '4040200', '2012-05-12 09:35:00 -0700', 'count', '30.1235', '2019-06-27 22:55:10 -0700', '2012-05-22 09:27:53 -0700', '0', '4.4.2', '2012-05-22 16:28:37 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1337704073&type=1', '301293', 'Health Mate', '4040200', '2012-05-22 09:27:53 -0700', 'count', '30.4167', '2019-06-27 22:55:10 -0700', '2012-05-31 09:23:16 -0700', '0', '4.4.2', '2012-05-31 16:24:04 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1338481396&type=1', '301293', 'Health Mate', '4040200', '2012-05-31 09:23:16 -0700', 'count', '30.2006', '2019-06-27 22:55:10 -0700', '2012-06-08 09:29:07 -0700', '0', '4.4.2', '2012-06-08 16:29:52 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1339172947&type=1', '301293', 'Health Mate', '4040200', '2012-06-08 09:29:07 -0700', 'count', '30.5247', '2019-06-27 22:55:10 -0700', '2012-06-21 08:07:33 -0700', '0', '4.4.2', '2012-06-21 15:08:20 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1340291253&type=1', '301293', 'Health Mate', '4040200', '2012-06-21 08:07:33 -0700', 'count', '30.5864', '2019-06-27 22:55:10 -0700', '2012-08-08 10:02:22 -0700', '0', '4.4.2', '2012-08-08 17:03:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1344445342&type=1', '301293', 'Health Mate', '4040200', '2012-08-08 10:02:22 -0700', 'count', '30.6636', '2019-06-27 22:55:10 -0700', '2012-08-17 09:11:32 -0700', '0', '4.4.2', '2012-08-17 16:42:05 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1345219892&type=1', '301293', 'Health Mate', '4040200', '2012-08-17 09:11:32 -0700', 'count', '30.8796', '2019-06-27 22:55:10 -0700', '2012-09-10 08:27:21 -0700', '0', '4.4.2', '2012-09-10 15:28:07 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1347290841&type=1', '301293', 'Health Mate', '4040200', '2012-09-10 08:27:21 -0700', 'count', '31.034', '2019-06-27 22:55:10 -0700', '2012-09-17 08:35:33 -0700', '0', '4.4.2', '2012-09-17 15:35:33 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1347896133&type=1', '301293', 'Health Mate', '4040200', '2012-09-17 08:35:33 -0700', 'count', '30.7099', '2019-06-27 22:55:10 -0700', '2012-09-26 08:59:46 -0700', '0', '4.4.2', '2012-09-26 16:13:18 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1348675186&type=1', '301293', 'Health Mate', '4040200', '2012-09-26 08:59:46 -0700', 'count', '30.679', '2019-06-27 22:55:10 -0700', '2012-10-18 08:51:16 -0700', '0', '4.4.2', '2012-10-18 15:51:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1350575476&type=1', '301293', 'Health Mate', '4040200', '2012-10-18 08:51:16 -0700', 'count', '30.7716', '2019-06-27 22:55:10 -0700', '2012-11-15 08:54:57 -0700', '0', '4.4.2', '2012-11-15 15:55:58 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1352994897&type=1', '301293', 'Health Mate', '4040200', '2012-11-15 08:54:57 -0700', 'count', '31.0802', '2019-06-27 22:55:10 -0700', '2012-12-17 09:13:40 -0700', '0', '4.4.2', '2012-12-17 16:20:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1355760820&type=1', '301293', 'Health Mate', '4040200', '2012-12-17 09:13:40 -0700', 'count', '29.784', '2019-06-27 22:55:10 -0700', '2012-12-19 11:09:55 -0700', '0', '4.4.2', '2012-12-19 18:10:37 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1355940595&type=1', '301293', 'Health Mate', '4040200', '2012-12-19 11:09:55 -0700', 'count', '29.6914', '2019-06-27 22:55:10 -0700', '2012-12-25 
10:37:41 -0700', '0', '4.4.2', '2012-12-25 17:38:25 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1356457061&type=1', '301293', 'Health Mate', '4040200', '2012-12-25 10:37:41 -0700', 'count', '29.8765', '2019-06-27 22:55:10 -0700', '2013-01-01 10:44:02 -0700', '0', '4.4.2', '2013-01-01 17:44:46 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1357062242&type=1', '301293', 'Health Mate', '4040200', '2013-01-01 10:44:02 -0700', 'count', '30.0772', '2019-06-27 22:55:10 -0700', '2013-01-15 09:10:46 -0700', '0', '4.4.2', '2013-01-15 16:11:28 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1358266246&type=1', '301293', 'Health Mate', '4040200', '2013-01-15 09:10:46 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2013-01-20 11:03:39 -0700', '0', '4.4.2', '2013-01-20 18:04:22 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1358705019&type=1', '301293', 'Health Mate', '4040200', '2013-01-20 11:03:39 -0700', 'count', '30.108', '2019-06-27 22:55:10 -0700', '2013-01-30 08:56:30 -0700', '0', '4.4.2', '2013-01-30 15:57:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1359561390&type=1', '301293', 'Health Mate', '4040200', '2013-01-30 08:56:30 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2013-02-04 11:02:35 -0700', '0', '4.4.2', '2013-02-04 18:03:25 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1360000955&type=1', '301293', 'Health Mate', '4040200', '2013-02-04 11:02:35 -0700', 'count', '29.8148', '2019-06-27 22:55:10 -0700', '2013-02-07 09:07:06 -0700', '0', '4.4.2', '2013-02-07 16:07:49 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1360253226&type=1', '301293', 'Health Mate', '4040200', '2013-02-07 09:07:06 -0700', 'count', '30.1389', '2019-06-27 22:55:10 -0700', '2013-02-19 08:49:57 -0700', '0', '4.4.2', '2013-02-19 15:50:39 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1361288997&type=1', '301293', 'Health Mate', '4040200', '2013-02-19 08:49:57 -0700', 'count', '30.1235', '2019-06-27 22:55:10 -0700', '2013-03-02 11:20:54 -0700', '0', '4.4.2', '2013-03-02 18:21:38 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1362248454&type=1', '301293', 'Health Mate', '4040200', '2013-03-02 11:20:54 -0700', 'count', '30', '2019-06-27 22:55:10 -0700', '2013-04-23 08:05:30 -0700', '0', '4.4.2', '2013-04-23 15:06:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1366729530&type=1', '301293', 'Health Mate', '4040200', '2013-04-23 08:05:30 -0700', 'count', '30.5247', '2019-06-27 22:55:10 -0700', '2013-05-09 09:49:18 -0700', '0', '4.4.2', '2013-05-09 16:50:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1368118158&type=1', '301293', 'Health Mate', '4040200', '2013-05-09 09:49:18 -0700', 'count', '30.4167', '2019-06-27 22:55:10 -0700', '2013-06-09 09:28:47 -0700', '0', '4.4.2', '2013-06-09 16:29:30 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1370795327&type=1', '301293', 'Health Mate', '4040200', '2013-06-09 09:28:47 -0700', 'count', '30.8333', '2019-06-27 22:55:10 -0700', '2013-07-09 08:00:17 -0700', '0', '4.4.2', '2013-07-09 15:01:00 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1373382017&type=1', '301293', 'Health Mate', '4040200', '2013-07-09 08:00:17 -0700', 'count', '30.8179', '2019-06-27 22:55:10 -0700', '2013-07-28 09:16:55 -0700', '0', '4.4.2', '2013-07-28 16:17:39 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1375028215&type=1', '301293', 'Health Mate', '4040200', '2013-07-28 09:16:55 -0700',
'count', '30.5556', '2019-06-27 22:55:10 -0700', '2013-09-13 09:22:19 -0700', '0', '4.4.2', '2013-09-13 16:23:08 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1379089339&type=1', '301293', 'Health Mate', '4040200', '2013-09-13 09:22:19 -0700', 'count', '30.9568', '2019-06-27 22:55:10 -0700', '2013-09-24 08:08:23 -0700', '0', '4.4.2', '2013-09-24 15:09:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1380035303&type=1', '301293', 'Health Mate', '4040200', '2013-09-24 08:08:23 -0700', 'count', '31.4352', '2019-06-27 22:55:10 -0700', '2013-10-01 08:15:13 -0700', '0', '4.4.2', '2013-10-01 15:15:57 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1380640513&type=1', '301293', 'Health Mate', '4040200', '2013-10-01 08:15:13 -0700', 'count', '31.2037', '2019-06-27 22:55:10 -0700', '2013-10-23 09:31:25 -0700', '0', '4.4.2', '2013-10-23 16:32:13 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1382545885&type=1', '301293', 'Health Mate', '4040200', '2013-10-23 09:31:25 -0700', 'count', '31.8056'] ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 467218270,MDU6SXNzdWU0NjcyMTgyNzA=,558,Support unicode in url,380586,0x1997,closed,0,,,,,4,2019-07-12T04:43:24Z,2019-07-15T01:29:30Z,2019-07-14T02:49:33Z,NONE,,"Hi, I defined some custom queries in my `metadata.json`. There are Chinese characters in the names of the queries. So the urls are like `http://127.0.0.1:8001/mydb/测试查询`. When opening such urls, datasette will throw an exception. ``` Traceback (most recent call last): File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 100, in __call__ return await view(new_scope, receive, send) File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 172, in view request, **scope[""url_route""][""kwargs""] File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/views/base.py"", line 267, in get request, database, hash, correct_hash_provided, **kwargs File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/views/base.py"", line 471, in view_get for key in self.ds.renderers.keys() File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/views/base.py"", line 471, in for key in self.ds.renderers.keys() File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/utils/__init__.py"", line 655, in path_with_format path = request.path File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 49, in path self.scope.get(""raw_path"", self.scope[""path""].encode(""latin-1"")) UnicodeEncodeError: 'latin-1' codec can't encode characters in position 9-11: ordinal not in range(256) ``` This used to work when datasette was based on sanic. 
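A minimal sketch of a more tolerant way to rebuild `raw_path`, assuming the fix is to re-percent-encode the decoded path before any latin-1 round trip (untested):
```python
from urllib.parse import quote

def raw_path_from_scope(scope):
    # Prefer the raw_path the ASGI server provides; otherwise
    # re-percent-encode the decoded path, which keeps the result
    # pure ASCII and therefore safe to encode as latin-1.
    if 'raw_path' in scope:
        return scope['raw_path']
    return quote(scope['path'], safe='/').encode('latin-1')
```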
Btw, thanks for the great work!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/558/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 465728430,MDExOlB1bGxSZXF1ZXN0Mjk1NzExNTA0,554,Fix static mounts using relative paths and prevent traversal exploits,3243482,abdusco,closed,0,,,,,4,2019-07-09T11:32:02Z,2019-07-11T16:29:26Z,2019-07-11T16:13:19Z,CONTRIBUTOR,simonw/datasette/pulls/554,"While debugging why my static mounts using a relative path (`--static mystatic:rel/path/to/dir`) were not working, I noticed that the requests fail no matter what, returning 404 errors. The reason is that datasette tries to prevent traversal exploits by checking if the path is relative to its registered directory. This check fails when the mount is a relative directory, because `/abs/dir/file` is obviously not under `dir/file`. https://github.com/simonw/datasette/blob/81fa8b6cdc5457b42a224779e5291952314e8d20/datasette/utils/asgi.py#L303-L306 This also has the consequence of returning any requested file: when `/abs/dir/../../evil.file` is requested, `aiofiles` resolves the path itself and happily returns the file to the client. The solution is to make sure we're checking relativity of paths after they're fully resolved. I've implemented the mentioned changes and also updated the tests.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/554/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 443038584,MDU6SXNzdWU0NDMwMzg1ODQ=,465,Decide what to do about /-/inspect,9599,simonw,closed,0,,,,,4,2019-05-11T21:39:46Z,2019-06-28T16:34:33Z,2019-06-28T16:34:33Z,OWNER,,"It's not clear to me what this endpoint should do now as a result of #419 - it's still useful to be able to introspect databases for tools like datasette-registry, but since we aren't pre-calculating introspection data any more I need to rethink the approach. For one thing, this endpoint may need to be paginated. Or maybe it should be split up into separate endpoints for each connected database? 
Those should probably be paginated too seeing as fivethirtyeight has 400+ tables.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/465/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 272391665,MDU6SXNzdWUyNzIzOTE2NjU=,48,Switch to ujson,9599,simonw,closed,0,,,,,4,2017-11-08T23:50:29Z,2019-06-24T06:57:54Z,2019-06-24T06:57:43Z,OWNER,,"ujson is already a dependency of Sanic, and should be quite a bit faster.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/48/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 449848803,MDU6SXNzdWU0NDk4NDg4MDM=,25,"Allow .insert(..., foreign_keys=()) to auto-detect table and primary key",9599,simonw,closed,0,,,,,4,2019-05-29T14:39:22Z,2019-06-13T05:32:32Z,2019-06-13T05:32:32Z,OWNER,,"The `foreign_keys=` argument currently takes a list of triples:
```python
db[""usages""].insert_all(
    usages_to_insert,
    foreign_keys=(
        (""line_id"", ""lines"", ""id""),
        (""definition_id"", ""definitions"", ""id""),
    ),
)
```
As of #16 we have a mechanism for detecting the primary key column (the third item in this triple) - we should use that here too, so foreign keys can be optionally defined as a list of pairs.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 346028655,MDU6SXNzdWUzNDYwMjg2NTU=,356,Ability to display facet counts for many-to-many relationships,9599,simonw,closed,0,,,,,4,2018-07-31T04:14:26Z,2019-05-29T21:39:12Z,2019-05-25T16:30:09Z,OWNER,,Parent: #354,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/356/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 413867537,MDU6SXNzdWU0MTM4Njc1Mzc=,16,add_column() should support REFERENCES {other_table}({other_column}),9599,simonw,closed,0,,,,,4,2019-02-24T21:00:45Z,2019-05-29T05:17:59Z,2019-05-29T04:56:18Z,OWNER,,Related to #2 ,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 292011379,MDU6SXNzdWUyOTIwMTEzNzk=,184,500 from missing table name,222245,carlmjohnson,closed,0,,,,,4,2018-01-26T19:46:45Z,2019-05-21T16:17:29Z,2018-04-13T18:18:59Z,NONE,,"https://github.com/simonw/datasette/blob/56623e48da5412b25fb39cc26b9c743b684dd968/datasette/app.py#L517-L519 throws an error if it gets an empty list back. 
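For illustration, the failing pattern is roughly this (a paraphrase, not the exact code):
```python
async def table_definition(self, name, sql, params):
    # Sketch of the bug: indexing [0][0] on an empty result set raises
    # IndexError, which surfaces to the user as a 500 error.
    result = await self.execute(name, sql, params)
    return result[0][0]  # IndexError when the query returns no rows
```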
Simplest solution is to write a helper func that just says
```python
result = list(await self.execute(name, sql, params))
if result:
    return result[0][0]
```
and use it anywhere `[0][0]` is now.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/184/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 443023308,MDU6SXNzdWU0NDMwMjMzMDg=,462,Replace most of `.inspect()` (and `datasette inspect`) with table counting,9599,simonw,closed,0,,,4305096,0.28,4,2019-05-11T18:26:06Z,2019-05-16T14:31:05Z,2019-05-16T14:31:05Z,OWNER,,"This is the last part of #419 - with the move to supporting mutable databases by default, the inspect-data mechanism currently in use no longer makes much sense. The one optimization I think is worth keeping for databases opened in immutable mode is the cached table counts. I think `datasette inspect` should cut down to only counting the rows in the tables - the other things done by inspect (figuring out columns, foreign key relationships, FTS etc) should all be fast enough that they can be reliably performed at runtime even against large databases. If performing them at run-time has performance issues, I would rather cache those results internally within Datasette after they are first calculated than continue to support them in the `datasette inspect` command - to keep things simpler.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/462/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273157085,MDU6SXNzdWUyNzMxNTcwODU=,59,datasette publish hyper,9599,simonw,closed,0,,,,,4,2017-11-11T16:27:26Z,2019-05-13T19:01:00Z,2019-05-13T19:00:44Z,OWNER,,"This is a bit tricky, because unlike Now there doesn't seem to be a way to tell Hyper to ""build this Dockerfile and deploy the resulting image"". They expect you to build a container and publish it to a registry instead. https://docs.hyper.sh/Reference/CLI/load.html allows you to publish an image directly from a tarball, but that still leaves the challenge of creating that image. The nice thing about the Now integration is that you don't need to have Docker installed on your local machine.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/59/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316323336,MDU6SXNzdWUzMTYzMjMzMzY=,231,metadata.json support for plugin configuration options,9599,simonw,closed,0,,,,,4,2018-04-20T15:58:47Z,2019-05-13T18:56:21Z,2019-05-13T18:56:21Z,OWNER,,"My [datasette-cluster-map](https://github.com/simonw/datasette-cluster-map) plugin currently works by detecting `latitude` and `longitude` columns. I'd like to be able to configure it to look for different column names. One way to do this could be to support optional plugin configuration as part of `metadata.json`. 
Something like this:

    {
        ""title"": ""Polar Bear Ear Tags, 2009-2011"",
        ""source"": ""USGS Alaska Science Center, Polar Bear Research Program"",
        ""source_url"": ""https://alaska.usgs.gov/products/data.php?dataid=130"",
        ""plugins"": {
            ""datasette_cluster_map"": {
                ""latitude_columns"": [
                    ""latitude"",
                    ""Capture Latitude""
                ],
                ""longitude_columns"": [
                    ""longitude"",
                    ""Capture Longitude""
                ]
            }
        }
    }

These settings should be supported at the root level or at the individual database or table level. They could also be exposed in the https://datasette-cluster-map-demo.now.sh/-/plugins debug tool. Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/231/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 435531034,MDU6SXNzdWU0MzU1MzEwMzQ=,435,Tracing support for seeing what SQL queries were executed,9599,simonw,closed,0,,,4305096,0.28,4,2019-04-21T17:37:37Z,2019-05-11T20:32:21Z,2019-05-11T19:07:42Z,OWNER,,"Features like faceting, foreign key expansions and now the inspect-less index view mean Datasette can end up executing a surprisingly large number of SQL queries to render a single page. Past experience with projects like [tikibar](https://github.com/simonw/tikibar) have shown that being able to see what actually went into rendering a page can be critical for optimizing performance and generally understanding how everything works. Support a tracing mode (probably via a `?_trace=1` querystring) which adds information about what is actually going on to both the HTML and the JSON.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/435/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 432893491,MDExOlB1bGxSZXF1ZXN0MjcwMjUxMDIx,432,"Refactor facets to a class and new plugin, refs #427",9599,simonw,closed,0,,,,,4,2019-04-13T20:04:45Z,2019-05-03T00:04:24Z,2019-05-03T00:04:24Z,OWNER,simonw/datasette/pulls/432,WIP for #427,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/432/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 423316403,MDU6SXNzdWU0MjMzMTY0MDM=,422,Figure out what to do about table counts in a mutable world,9599,simonw,closed,0,,,,,4,2019-03-20T15:27:15Z,2019-05-02T05:43:11Z,2019-05-02T05:43:11Z,OWNER,,"In moving away from the existing static inspect method (see #420 and #419) the biggest thing lost is full table row counts. These can be expensive against large tables, but currently Datasette runs the `count(*) from x` query once at inspection time and then reuses it for every page. We can run those counts with a timelimit, but this means that for larger tables we won't be able to show a count at all, which is disappointing. 
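For reference, a minimal sketch of what a time-limited count could look like (my own illustration using the `sqlite3` interrupt mechanism, not Datasette code):
```python
import sqlite3
import threading

def count_with_time_limit(conn, table, seconds=0.05):
    # Interrupt the query if it runs longer than the allowed time, and
    # return None to mean 'count unavailable' instead of blocking the page.
    timer = threading.Timer(seconds, conn.interrupt)
    timer.start()
    try:
        return conn.execute('select count(*) from [{}]'.format(table)).fetchone()[0]
    except sqlite3.OperationalError:
        return None  # the count was interrupted
    finally:
        timer.cancel()
```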
Is there a way we can find an approximate or lower bound count for a table?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/422/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324162476,MDU6SXNzdWUzMjQxNjI0NzY=,271,Mechanism for automatically picking up changes when on-disk .db file changes,9599,simonw,closed,0,,,,,4,2018-05-17T19:53:15Z,2019-01-10T21:35:18Z,2019-01-10T21:35:18Z,OWNER,,"It would be useful if Datasette could spot when a SQLite database file changes on disk and restart itself (hence re-running .inspect() and picking up the new content hash). Ideally this could happen in an atomic way so no requests get dropped during the switch-over. This may not play well with SQLite opening databases in immutable mode. Research required.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/271/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 341123355,MDU6SXNzdWUzNDExMjMzNTU=,342,Requesting support for query description,12617395,bsilverm,closed,0,,,,,4,2018-07-13T18:50:16Z,2018-07-24T04:53:21Z,2018-07-16T02:33:54Z,NONE,,"It would be great if the metadata file allowed you to enter a description for the query. We have a lot of pre-defined queries that can only be so descriptive by their name. It would be nice if an optional description could be included underneath the name within the UI, or on hover where it currently shows the SQL.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/342/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 336924199,MDU6SXNzdWUzMzY5MjQxOTk=,330,Limit text display in cells containing large amounts of text,82988,psychemedia,closed,0,,,,,4,2018-06-29T09:15:22Z,2018-07-24T04:53:20Z,2018-07-10T16:20:48Z,CONTRIBUTOR,,"The default preview of a database shows all columns (is the row count limited?) which is fine in many cases but can take a long time to load / impose a large overhead if the table is a SpatiaLite table containing geometry columns that include large shapefiles. Would it make sense to have a setting that can limit the amount of text displayed in any given cell in the table preview, or (less useful?) suppress (with notification) the display of overlong columns unless enabled by the user? An issue then arises if a user does want to see all the text in a cell: 1) for a particular cell; 2) for every cell in the table; 3) for all cells in a particular column or columns (I haven't checked but what if a column contains e.g. raw image data? Does this display as raw data? Or can this be rendered in a context-aware way as an image preview? I guess a custom template would be one way to do that?)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/330/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 338768551,MDU6SXNzdWUzMzg3Njg1NTE=,333,Datasette on Zeit Now returns http URLs for facet and next links,9599,simonw,closed,0,,,,,4,2018-07-06T00:40:49Z,2018-07-24T04:53:20Z,2018-07-24T01:51:53Z,OWNER,,"e.g. 
on https://fivethirtyeight.datasettes.com/fivethirtyeight-ac35616/nba-elo%2Fnbaallelo.json?_facet=lg_id&_size=0
```
{
    ""facet_results"": {
        ""lg_id"": {
            ""name"": ""lg_id"",
            ""results"": [
                {
                    ""value"": ""NBA"",
                    ""label"": ""NBA"",
                    ""count"": 118016,
                    ""toggle_url"": ""http://fivethirtyeight.datasettes.com/fivethirtyeight-ac35616/nba-elo%2Fnbaallelo.json?_facet=lg_id&_size=1&lg_id=NBA"",
                    ""selected"": false
                },
                {
                    ""value"": ""ABA"",
                    ""label"": ""ABA"",
                    ""count"": 8298,
                    ""toggle_url"": ""http://fivethirtyeight.datasettes.com/fivethirtyeight-ac35616/nba-elo%2Fnbaallelo.json?_facet=lg_id&_size=1&lg_id=ABA"",
                    ""selected"": false
                }
            ],
            ""truncated"": false
        }
    },
    ""suggested_facets"": [
        {
            ""name"": ""_iscopy"",
            ""toggle_url"": ""/fivethirtyeight-ac35616/nba-elo%2Fnbaallelo.json?_facet=lg_id&_size=1&_facet=_iscopy""
        }
    ],
    ""next_url"": ""http://fivethirtyeight.datasettes.com/fivethirtyeight-ac35616/nba-elo%2Fnbaallelo.json?_facet=lg_id&_size=1&_next=1"",
}
```
`next_url` and `facet_results` both link to `http://` when they should link to `https://`. Note that `suggested_facets` doesn't include the full URL at all, which is a consistency bug.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/333/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 328172521,MDU6SXNzdWUzMjgxNzI1MjE=,303,Support table names ending with .json or .csv,9599,simonw,closed,0,,,,,4,2018-05-31T14:53:23Z,2018-06-15T06:55:50Z,2018-06-15T06:55:50Z,OWNER,,"This is needed for #266 - if a table name ends with `.json` or `.csv` right now our URL pattern matching will do the wrong thing. We should be smarter about this. This does mean we will have some URLs that look like this:
http://localhost:8001/dbname/weird.json - returning HTML, not JSON
http://localhost:8001/dbname/weird.json.json - returning JSON
http://localhost:8001/dbname/weird.json.csv - returning CSV
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/303/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275089535,MDU6SXNzdWUyNzUwODk1MzU=,121,?_json=foo&_json=bar query string argument ,9599,simonw,closed,0,,,,,4,2017-11-18T16:09:55Z,2018-05-31T13:48:12Z,2018-05-28T18:11:51Z,OWNER,,"Causes the specified columns in the output to be treated as JSON, and returned deserialized in the .json or .jsono response. This will be particularly powerful when combined with https://sqlite.org/json1.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/121/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 325352370,MDExOlB1bGxSZXF1ZXN0MTg5NzA3Mzc0,279,Add version number support with Versioneer,198537,rgieseke,closed,0,,,,,4,2018-05-22T15:39:45Z,2018-05-22T19:35:23Z,2018-05-22T19:35:22Z,CONTRIBUTOR,simonw/datasette/pulls/279,"I think that's all for getting Versioneer support, I've been happily using it in a couple of projects ... 
```
In [2]: datasette.__version__
Out[2]: '0.22+3.g6e12445'
```
Repo: https://github.com/warner/python-versioneer
Versioneer Licence: Public Domain (CC0-1.0)
Closes #273 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/279/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 320592643,MDU6SXNzdWUzMjA1OTI2NDM=,251,"Explore ""distinct values for column"" in inspect()",9599,simonw,closed,0,,,,,4,2018-05-06T13:27:24Z,2018-05-14T22:47:55Z,2018-05-14T22:47:55Z,OWNER,,"A lot of datasets have columns which have a small number of possible values in them - this one for example: https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=select+distinct+category+from+%5Binconvenient-sequel%2Fratings%5D%3B Detecting these could be interesting as part of `.inspect()`, since it would allow for various UI enhancements like autocomplete / select box filters for those columns. The problem is detecting them efficiently. `.inspect()` shouldn't spend 5 minutes churning through columns on giant tables trying to determine if they have a small collection of unique values.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/251/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 312620566,MDU6SXNzdWUzMTI2MjA1NjY=,199,Ability to apply sort on mobile in portrait mode,9599,simonw,closed,0,,,,,4,2018-04-09T17:35:04Z,2018-04-10T00:37:53Z,2018-04-10T00:34:38Z,OWNER,,"Missed this in #189... on mobile in portrait mode we hide the column headers, which means you can't click them to sort! You can sort in landscape mode at least. 
Need to come up with an alternative sort UI for portrait on mobile.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/199/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268591332,MDU6SXNzdWUyNjg1OTEzMzI=,42,Homepage UI for editing metadata file,9599,simonw,closed,0,,,,,4,2017-10-26T00:22:03Z,2017-12-10T03:02:14Z,2017-12-10T03:02:14Z,OWNER,,"Since we are going to have a metadata file which sets the title/description/etc for each database, why not allow you to run the app in --dev mode which makes the homepage into a WYSIWYG editor that can save to that file format.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/42/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 278814220,MDU6SXNzdWUyNzg4MTQyMjA=,161,Support WITH query ,388154,wsxiaoys,closed,0,,,,,4,2017-12-03T20:00:40Z,2017-12-08T06:18:12Z,2017-12-04T04:52:41Z,NONE,,"Currently datasette fails with the error message: Statement must begin with SELECT Example query:
```sql
WITH RECURSIVE cnt(x) AS (
  SELECT 1
  UNION ALL
  SELECT x+1 FROM cnt
  LIMIT 1000000
)
SELECT x FROM cnt;
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/161/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 277589569,MDU6SXNzdWUyNzc1ODk1Njk=,155,A primary key column that has a foreign key restriction associated won't render the label column,388154,wsxiaoys,closed,0,,,2949431,Custom templates edition,4,2017-11-29T00:40:02Z,2017-12-07T05:39:53Z,2017-12-07T05:39:53Z,NONE,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/155/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273247186,MDU6SXNzdWUyNzMyNDcxODY=,68,Support for title/source/license metadata,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-11-12T17:04:21Z,2017-12-04T04:55:43Z,2017-11-13T15:26:11Z,OWNER,,"I've decided this is important for launch: I want to set a precedent for people citing, licensing and documenting their datasets. Not sure how best to go about supporting this. I'd like to allow for the following data to be optionally attached to any given database:
- Title
- Description, potentially in markdown?
- Original source URL
- License

I'd also like the ability to attach descriptions to individual tables - and maybe even to table columns? The question then becomes: how should this information be stored. A few options:
- In the SQLite database itself, in a specially named table. Problem here is that this means having to modify SQLite databases before publishing them.
- In a separate SQLite database that can be published alongside the databases we are publishing.
- In a JSON file. This is neat, but JSON files are not a great editing experience once you start including multiple lines (e.g. a markdown description).
- In a YAML file. This is a better format for multi-line descriptions, but still isn't a great editing experience (see the sketch below). 
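A sketch of what the YAML version might look like (field names illustrative only, borrowing the polar bear example from the plugin configuration issue):
```yaml
title: Polar Bear Ear Tags, 2009-2011
source: USGS Alaska Science Center, Polar Bear Research Program
source_url: https://alaska.usgs.gov/products/data.php?dataid=130
license: CC0
description: |
  A multi-line markdown description
  of the dataset can go here.
```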
Whatever the format, it can be made much more usable by offering a web-based editing UI for populating it (a special mode the server can be run in).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/68/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274314940,MDU6SXNzdWUyNzQzMTQ5NDA=,105,Consider data-package as a format for metadata,9599,simonw,closed,0,,,,,4,2017-11-15T21:43:34Z,2017-11-20T19:50:53Z,2017-11-20T19:50:53Z,OWNER,,http://frictionlessdata.io/specs/data-package/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274343647,MDExOlB1bGxSZXF1ZXN0MTUyOTE0NDgw,107,add support for ?field__isnull=1,3433657,raynae,closed,0,,,,,4,2017-11-15T23:36:36Z,2017-11-17T15:12:29Z,2017-11-17T13:29:22Z,CONTRIBUTOR,simonw/datasette/pulls/107,Is this what you had in mind for [this issue](https://github.com/simonw/datasette/issues/64)?,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/107/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273248366,MDU6SXNzdWUyNzMyNDgzNjY=,69,Enforce pagination (or at least limits) for arbitrary custom SQL,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-11-12T17:21:33Z,2017-11-13T20:32:47Z,2017-11-13T19:35:47Z,OWNER,,"It's way too easy to accidentally trigger a page that returns 100,000 rows at the moment. I need to use the LIMIT clause on views and custom SQL - I can support pagination ""next"" links using offset as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 272661336,MDU6SXNzdWUyNzI2NjEzMzY=,49,Pick a name,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-11-09T17:56:17Z,2017-11-10T18:33:22Z,2017-11-10T18:33:22Z,OWNER,,"Options so far:
* immutabase
* datasite
* sqlstatic
* dbserve
* sqlserve

Terms to play with:
* immutable
* sqlite
* dataset
* json
* static
* serve",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/49/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed