```
So if I can get `TabContainer` into that `self.optional` list I'll have fixed this problem.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771202454,"Use YAML examples in documentation by default, not JSON",
https://github.com/simonw/datasette/issues/1153#issuecomment-1627450852,https://api.github.com/repos/simonw/datasette/issues/1153,1627450852,IC_kwDOBm6k_c5hAO3k,9599,simonw,2023-07-08T18:17:35Z,2023-07-08T18:17:35Z,OWNER,"I figured out a workaround:
```python
import os

extensions = [
    ""sphinx.ext.extlinks"",
    ""sphinx.ext.autodoc"",
    ""sphinx_copybutton"",
]
if not os.environ.get(""DISABLE_SPHINX_INLINE_TABS""):
    extensions += [""sphinx_inline_tabs""]
```
That way I can run `sphinx-build -b xml . _build` successfully if I set that environment variable.
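The invocation would then look something like this (assuming the variable just needs to be set to any non-empty value):
```shell
# Skip sphinx_inline_tabs for this run so the XML builder can handle the docs
DISABLE_SPHINX_INLINE_TABS=1 sphinx-build -b xml . _build
```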
I get some noisy warnings, but it runs OK. And the resulting `docs.db` file has rows like this, which I think are fine:
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771202454,"Use YAML examples in documentation by default, not JSON",
https://github.com/simonw/datasette/issues/1153#issuecomment-1627451646,https://api.github.com/repos/simonw/datasette/issues/1153,1627451646,IC_kwDOBm6k_c5hAPD-,9599,simonw,2023-07-08T18:21:24Z,2023-07-08T18:21:24Z,OWNER,"This one was tricky:
I wanted complete control over the YAML example here, so I could ensure it used multi-line strings correctly.
I ended up changing my cog helper function to this:
```python
import json
import textwrap
from yaml import safe_dump
from ruamel.yaml import round_trip_load


def metadata_example(cog, data=None, yaml=None):
    assert data or yaml, ""Must provide data= or yaml=""
    assert not (data and yaml), ""Cannot use data= and yaml=""
    output_yaml = None
    if yaml:
        # dedent it first
        yaml = textwrap.dedent(yaml).strip()
        # round_trip_load to preserve key order:
        data = round_trip_load(yaml)
        output_yaml = yaml
    else:
        output_yaml = safe_dump(data, sort_keys=False)
    cog.out(""\n.. tab:: YAML\n\n"")
    cog.out(""    .. code-block:: yaml\n\n"")
    cog.out(textwrap.indent(output_yaml, ""        ""))
    cog.out(""\n\n.. tab:: JSON\n\n"")
    cog.out(""    .. code-block:: json\n\n"")
    cog.out(textwrap.indent(json.dumps(data, indent=2), ""        ""))
    cog.out(""\n"")
```
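The dedent/indent plumbing is what keeps the nested reStructuredText valid; a stdlib-only sketch of just that part:

```python
import textwrap

yaml_text = textwrap.dedent('''
    databases:
      fixtures:
        queries: {}
''').strip()

# Re-indent every line so it nests under a ``.. code-block::`` directive
block = textwrap.indent(yaml_text, '    ')
assert block.splitlines()[0] == '    databases:'
```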
This allows me to call it with YAML in some places:
```
.. [[[cog
    metadata_example(cog, yaml=""""""
        databases:
          fixtures:
            queries:
              neighborhood_search:
                fragment: fragment-goes-here
                hide_sql: true
                sql: |-
                  select neighborhood, facet_cities.name, state
                  from facetable join facet_cities on facetable.city_id = facet_cities.id
                  where neighborhood like '%' || :text || '%' order by neighborhood;
    """""")
.. ]]]
```
I had to introduce https://pypi.org/project/ruamel.yaml/ as a dependency here in order to load YAML from disk while maintaining key order.
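As an aside on key order: plain `dict`s in Python 3.7+ preserve insertion order, so the JSON half of the output keeps whatever order the loader produced. A quick sanity check:

```python
import json

data = {'title': 'Example', 'license': 'ODbL', 'databases': {'fixtures': {}}}
# Insertion order survives a JSON round trip unmodified
assert list(json.loads(json.dumps(data))) == ['title', 'license', 'databases']
```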
I'm still using `safe_dump(data, sort_keys=False)` from PyYAML as I couldn't get the result I wanted for outputting YAML from an input of JSON using PyYAML.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771202454,"Use YAML examples in documentation by default, not JSON",
https://github.com/simonw/datasette/issues/1153#issuecomment-1627455892,https://api.github.com/repos/simonw/datasette/issues/1153,1627455892,IC_kwDOBm6k_c5hAQGU,9599,simonw,2023-07-08T18:39:19Z,2023-07-08T18:39:19Z,OWNER,"```
ERROR: Could not find a version that satisfies the requirement Sphinx==6.1.3; extra == ""docs"" (from datasette[docs,test]) (from versions: 0.1.61611, 0.1.61798, 0.1.61843, 0.1.61945, 0.1.61950, 0.2, 0.3, 0.4, 0.4.1, 0.4.2, 0.4.3, 0.5, 0.5.1, 0.5.2b1, 0.5.2, 0.6b1, 0.6, 0.6.1, 0.6.2, 0.6.3, 0.6.4, 0.6.5, 0.6.6, 0.6.7, 1.0b1, 1.0b2, 1.0, 1.0.1, 1.0.2, 1.0.3, 1.0.4, 1.0.5, 1.0.6, 1.0.7, 1.0.8, 1.1, 1.1.1, 1.1.2, 1.1.3, 1.2b1, 1.2b2, 1.2b3, 1.2, 1.2.1, 1.2.2, 1.2.3, 1.3b1, 1.3b2, 1.3b3, 1.3, 1.3.1, 1.3.2, 1.3.3, 1.3.4, 1.3.5, 1.3.6, 1.4a1, 1.4b1, 1.4, 1.4.1, 1.4.2, 1.4.3, 1.4.4, 1.4.5, 1.4.6, 1.4.7, 1.4.8, 1.4.9, 1.5a1, 1.5a2, 1.5b1, 1.5, 1.5.1, 1.5.2, 1.5.3, 1.5.4, 1.5.5, 1.5.6, 1.6b1, 1.6b2, 1.6b3, 1.6.1, 1.6.2, 1.6.3, 1.6.4, 1.6.5, 1.6.6, 1.6.7, 1.7.0b1, 1.7.0b2, 1.7.0, 1.7.1, 1.7.2, 1.7.3, 1.7.4, 1.7.5, 1.7.6, 1.7.7, 1.7.8, 1.7.9, 1.8.0b1, 1.8.0, 1.8.1, 1.8.2, 1.8.3, 1.8.4, 1.8.5, 1.8.6, 2.0.0b1, 2.0.0b2, 2.0.0, 2.0.1, 2.1.0, 2.1.1, 2.1.2, 2.2.0, 2.2.1, 2.2.2, 2.3.0, 2.3.1, 2.4.0, 2.4.1, 2.4.2, 2.4.3, 2.4.4, 2.4.5, 3.0.0b1, 3.0.0, 3.0.1, 3.0.2, 3.0.3, 3.0.4, 3.1.0, 3.1.1, 3.1.2, 3.2.0, 3.2.1, 3.3.0, 3.3.1, 3.4.0, 3.4.1, 3.4.2, 3.4.3, 3.5.0, 3.5.1, 3.5.2, 3.5.3, 3.5.4, 4.0.0b1, 4.0.0b2, 4.0.0, 4.0.1, 4.0.2, 4.0.3, 4.1.0, 4.1.1, 4.1.2, 4.2.0, 4.3.0, 4.3.1, 4.3.2, 4.4.0, 4.5.0, 5.0.0b1, 5.0.0, 5.0.1, 5.0.2, 5.1.0, 5.1.1, 5.2.0, 5.2.0.post0, 5.2.1, 5.2.2, 5.2.3, 5.3.0)
ERROR: No matching distribution found for Sphinx==6.1.3; extra == ""docs""
```
I'm going to drop Python 3.7.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771202454,"Use YAML examples in documentation by default, not JSON",
https://github.com/simonw/datasette/issues/1153#issuecomment-1627478910,https://api.github.com/repos/simonw/datasette/issues/1153,1627478910,IC_kwDOBm6k_c5hAVt-,9599,simonw,2023-07-08T20:01:19Z,2023-07-08T20:01:19Z,OWNER,"Some examples:
- https://docs.datasette.io/en/latest/sql_queries.html#canned-queries
- https://docs.datasette.io/en/latest/sql_queries.html#canned-query-parameters
- https://docs.datasette.io/en/latest/authentication.html#access-to-an-instance
- https://docs.datasette.io/en/latest/facets.html#facets-in-metadata
- https://docs.datasette.io/en/latest/full_text_search.html#configuring-full-text-search-for-a-table-or-view
- https://docs.datasette.io/en/latest/metadata.html
- https://docs.datasette.io/en/latest/custom_templates.html#custom-css-and-javascript
- https://docs.datasette.io/en/latest/plugins.html#plugin-configuration
I need to fix this section: https://docs.datasette.io/en/latest/writing_plugins.html#writing-plugins-that-accept-configuration","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771202454,"Use YAML examples in documentation by default, not JSON",
https://github.com/simonw/datasette/issues/1153#issuecomment-1627480353,https://api.github.com/repos/simonw/datasette/issues/1153,1627480353,IC_kwDOBm6k_c5hAWEh,9599,simonw,2023-07-08T20:09:48Z,2023-07-08T20:09:48Z,OWNER,https://docs.datasette.io/en/latest/writing_plugins.html#writing-plugins-that-accept-configuration is fixed now.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771202454,"Use YAML examples in documentation by default, not JSON",
https://github.com/simonw/datasette/issues/1153#issuecomment-805041522,https://api.github.com/repos/simonw/datasette/issues/1153,805041522,MDEyOklzc3VlQ29tbWVudDgwNTA0MTUyMg==,9599,simonw,2021-03-23T16:22:46Z,2021-03-23T16:22:46Z,OWNER,"That's a good idea. I could do that with JavaScript - loading YAML and converting it to JSON in JavaScript shouldn't be hard, and it's better than JSON-to-YAML because there's only one correct JSON representation of a YAML file whereas you can represent a JSON document in YAML in a bunch of different ways.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771202454,"Use YAML examples in documentation by default, not JSON",
https://github.com/simonw/datasette/issues/1153#issuecomment-805042880,https://api.github.com/repos/simonw/datasette/issues/1153,805042880,MDEyOklzc3VlQ29tbWVudDgwNTA0Mjg4MA==,9599,simonw,2021-03-23T16:24:32Z,2021-03-23T16:24:32Z,OWNER,... actually I think I would do that conversion in Python. The client-side YAML parsers all look a little bit heavy to me in terms of additional page weight.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771202454,"Use YAML examples in documentation by default, not JSON",
https://github.com/simonw/datasette/issues/1153#issuecomment-805047117,https://api.github.com/repos/simonw/datasette/issues/1153,805047117,MDEyOklzc3VlQ29tbWVudDgwNTA0NzExNw==,9599,simonw,2021-03-23T16:30:15Z,2021-03-23T16:46:06Z,OWNER,"https://cdnjs.cloudflare.com/ajax/libs/js-yaml/4.0.0/js-yaml.min.js is only 12.5KB zipped, 38KB total - so that's not a bad option.
https://github.com/nodeca/js-yaml","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771202454,"Use YAML examples in documentation by default, not JSON",
https://github.com/simonw/datasette/issues/1153#issuecomment-805050163,https://api.github.com/repos/simonw/datasette/issues/1153,805050163,MDEyOklzc3VlQ29tbWVudDgwNTA1MDE2Mw==,9599,simonw,2021-03-23T16:34:35Z,2021-03-23T16:35:32Z,OWNER,"https://docs.datasette.io/en/stable/metadata.html has this example:
```yaml
title: Demonstrating Metadata from YAML
description_html: |-
  This description includes a long HTML string
  - YAML is better for embedding HTML strings than JSON!
license: ODbL
license_url: https://opendatacommons.org/licenses/odbl/
databases:
  fixtures:
    tables:
      no_primary_key:
        hidden: true
    queries:
      neighborhood_search:
        sql: |-
          select neighborhood, facet_cities.name, state
          from facetable join facet_cities on facetable.city_id = facet_cities.id
          where neighborhood like '%' || :text || '%' order by neighborhood;
        title: Search neighborhoods
        description_html: |-
          This demonstrates basic LIKE search
```
I ran this in the browser dev tools:
```javascript
var s = document.createElement('script')
s.src = 'https://cdnjs.cloudflare.com/ajax/libs/js-yaml/4.0.0/js-yaml.min.js'
document.head.appendChild(s)
var yamlExample = document.querySelector('.highlight-yaml').textContent;
console.log(JSON.stringify(window.jsyaml.load(yamlExample), null, 4))
```
And got:
```json
{
    ""title"": ""Demonstrating Metadata from YAML"",
    ""description_html"": ""This description includes a long HTML string\n\n - YAML is better for embedding HTML strings than JSON!\n"",
    ""license"": ""ODbL"",
    ""license_url"": ""https://opendatacommons.org/licenses/odbl/"",
    ""databases"": {
        ""fixtures"": {
            ""tables"": {
                ""no_primary_key"": {
                    ""hidden"": true
                }
            },
            ""queries"": {
                ""neighborhood_search"": {
                    ""sql"": ""select neighborhood, facet_cities.name, state\nfrom facetable join facet_cities on facetable.city_id = facet_cities.id\nwhere neighborhood like '%' || :text || '%' order by neighborhood;"",
                    ""title"": ""Search neighborhoods"",
                    ""description_html"": ""This demonstrates basic LIKE search""
                }
            }
        }
    }
}
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771202454,"Use YAML examples in documentation by default, not JSON",
https://github.com/simonw/datasette/issues/1153#issuecomment-805055291,https://api.github.com/repos/simonw/datasette/issues/1153,805055291,MDEyOklzc3VlQ29tbWVudDgwNTA1NTI5MQ==,9599,simonw,2021-03-23T16:41:31Z,2021-03-23T16:41:31Z,OWNER,"One downside of doing this conversion in JavaScript: it's much harder to get the same JSON syntax highlighting as that provided by Sphinx:
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771202454,"Use YAML examples in documentation by default, not JSON",
https://github.com/simonw/datasette/issues/1153#issuecomment-805056806,https://api.github.com/repos/simonw/datasette/issues/1153,805056806,MDEyOklzc3VlQ29tbWVudDgwNTA1NjgwNg==,9599,simonw,2021-03-23T16:43:38Z,2021-03-23T16:43:38Z,OWNER,"I used this code to get that:
```javascript
// Here `div` is the element wrapping the YAML example's .highlight block
var jsonVersion = JSON.stringify(window.jsyaml.load(document.querySelector('.highlight-yaml').textContent), null, 4);
div.querySelector('.highlight pre').innerText = jsonVersion;
div.querySelector('.highlight pre').style.whiteSpace = 'pre-wrap';
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771202454,"Use YAML examples in documentation by default, not JSON",
https://github.com/simonw/datasette/issues/1153#issuecomment-805109341,https://api.github.com/repos/simonw/datasette/issues/1153,805109341,MDEyOklzc3VlQ29tbWVudDgwNTEwOTM0MQ==,9599,simonw,2021-03-23T17:55:48Z,2021-03-23T18:41:57Z,OWNER,"Beginnings of a UI element for switching between them:
```html
```
That `` has a padding of 12px, so using 12px padding on the tab links should get them to line up better.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771202454,"Use YAML examples in documentation by default, not JSON",
https://github.com/simonw/datasette/issues/1154#issuecomment-766462197,https://api.github.com/repos/simonw/datasette/issues/1154,766462197,MDEyOklzc3VlQ29tbWVudDc2NjQ2MjE5Nw==,9599,simonw,2021-01-24T23:47:06Z,2021-01-24T23:47:06Z,OWNER,"I'm going to document this but mark it as unstable, using a new documentation convention for marking unstable APIs.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771208009,Documentation for new _internal database and tables,
https://github.com/simonw/datasette/issues/1154#issuecomment-766465719,https://api.github.com/repos/simonw/datasette/issues/1154,766465719,MDEyOklzc3VlQ29tbWVudDc2NjQ2NTcxOQ==,9599,simonw,2021-01-25T00:09:22Z,2021-01-25T00:09:22Z,OWNER,"https://docs.datasette.io/en/latest/internals.html#the-internal-database
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771208009,Documentation for new _internal database and tables,
https://github.com/simonw/datasette/issues/1155#issuecomment-748356492,https://api.github.com/repos/simonw/datasette/issues/1155,748356492,MDEyOklzc3VlQ29tbWVudDc0ODM1NjQ5Mg==,9599,simonw,2020-12-18T22:49:32Z,2020-12-22T01:13:05Z,OWNER,"There's some messy code that needs fixing here. The `datasette.databases` dictionary right now has a key that corresponds to the `/_internal` URL in the path, and a value that's a `Database()` object. BUT... the `Database()` object doesn't know what its key is.
While fixing this I should fix the issue where Datasette gets confused by multiple databases with the same stem: https://github.com/simonw/datasette/issues/509","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database,
https://github.com/simonw/datasette/issues/1155#issuecomment-748367922,https://api.github.com/repos/simonw/datasette/issues/1155,748367922,MDEyOklzc3VlQ29tbWVudDc0ODM2NzkyMg==,9599,simonw,2020-12-18T23:15:24Z,2020-12-18T23:15:24Z,OWNER,"The code for building up that `.databases` dictionary is a bit convoluted. Here's the code that adds a `:memory:` database if the user specified `--memory` OR if there are no files to be attached:
https://github.com/simonw/datasette/blob/ebc7aa287c99fe6114b79aeab8efb8d4489a6182/datasette/app.py#L221-L241
I'm not sure why I wrote it this way, instead of just calling `.add_database("":memory:"", Database(..., is_memory=True))`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database,
https://github.com/simonw/datasette/issues/1155#issuecomment-748368384,https://api.github.com/repos/simonw/datasette/issues/1155,748368384,MDEyOklzc3VlQ29tbWVudDc0ODM2ODM4NA==,9599,simonw,2020-12-18T23:17:00Z,2020-12-18T23:17:00Z,OWNER,Here's the commit where I added it. https://github.com/simonw/datasette/commit/9743e1d91b5f0a2b3c1c0bd6ffce8739341f43c4 - I didn't yet have the `.add_database()` mechanism. Today the `MEMORY` object bit is no longer needed.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database,
https://github.com/simonw/datasette/issues/1155#issuecomment-748368660,https://api.github.com/repos/simonw/datasette/issues/1155,748368660,MDEyOklzc3VlQ29tbWVudDc0ODM2ODY2MA==,9599,simonw,2020-12-18T23:18:04Z,2020-12-19T01:12:00Z,OWNER,"A `Database` should have a `.name` which is unique across the Datasette instance and is used in the URL. The `path` should be optional, only set for file databases. A new `.memory_name` property can be used for shared memory databases.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database,
https://github.com/simonw/datasette/issues/1155#issuecomment-748368938,https://api.github.com/repos/simonw/datasette/issues/1155,748368938,MDEyOklzc3VlQ29tbWVudDc0ODM2ODkzOA==,9599,simonw,2020-12-18T23:19:04Z,2020-12-18T23:19:04Z,OWNER,`Database` internal class is documented here: https://docs.datasette.io/en/latest/internals.html#database-class,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database,
https://github.com/simonw/datasette/issues/1155#issuecomment-748397998,https://api.github.com/repos/simonw/datasette/issues/1155,748397998,MDEyOklzc3VlQ29tbWVudDc0ODM5Nzk5OA==,9599,simonw,2020-12-19T01:28:18Z,2020-12-19T01:28:18Z,OWNER,"`datasette-graphql` returns an error due to this issue:
On the console:
```
INFO: 127.0.0.1:63116 - ""POST /graphql/_internal HTTP/1.1"" 500 Internal Server Error
Traceback (most recent call last):
File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/promise/promise.py"", line 844, in handle_future_result
resolve(future.result())
File ""/Users/simon/.local/share/virtualenvs/datasette-AWNrQs95/lib/python3.8/site-packages/datasette_graphql/utils.py"", line 603, in resolve_table
data, _, _ = await view.data(
File ""/Users/simon/Dropbox/Development/datasette/datasette/views/table.py"", line 304, in data
db = self.ds.databases[database]
graphql.error.located_error.GraphQLLocatedError: ':memory:c6dd5abe1a757a7de00d99b699175bd33d9a575f05b5751bf856b8656fb07edd'
```
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database,
https://github.com/simonw/datasette/issues/1155#issuecomment-749170608,https://api.github.com/repos/simonw/datasette/issues/1155,749170608,MDEyOklzc3VlQ29tbWVudDc0OTE3MDYwOA==,9599,simonw,2020-12-21T20:01:47Z,2020-12-21T20:01:47Z,OWNER,I removed that `MEMORY` object() in dcdfb2c301341d45b66683e3e3be72f9c7585b2f,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database,
https://github.com/simonw/datasette/issues/1155#issuecomment-749176936,https://api.github.com/repos/simonw/datasette/issues/1155,749176936,MDEyOklzc3VlQ29tbWVudDc0OTE3NjkzNg==,9599,simonw,2020-12-21T20:18:15Z,2020-12-21T20:18:15Z,OWNER,"Fun query:
```sql
select table_name, group_concat(name, ', ') from columns group by database_name, table_name
```
https://latest.datasette.io/_internal?sql=select+table_name%2C+group_concat%28name%2C+%27%2C+%27%29+from+columns+group+by+database_name%2C+table_name","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database,
https://github.com/simonw/datasette/issues/1155#issuecomment-749723557,https://api.github.com/repos/simonw/datasette/issues/1155,749723557,MDEyOklzc3VlQ29tbWVudDc0OTcyMzU1Nw==,9599,simonw,2020-12-22T19:08:27Z,2020-12-22T19:08:27Z,OWNER,"I'm going to have the `.add_database()` method select the name used in the path, de-duping against any existing names. It will then set database.name to that so that the database has access to its own name.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771216293,Better internal database_name for _internal database,
https://github.com/simonw/datasette/issues/1156#issuecomment-749158111,https://api.github.com/repos/simonw/datasette/issues/1156,749158111,MDEyOklzc3VlQ29tbWVudDc0OTE1ODExMQ==,9599,simonw,2020-12-21T19:33:45Z,2020-12-21T19:33:45Z,OWNER,"One reason for this change: it means I can use that database for more stuff. I've been thinking about moving metadata storage there for example, which fits a database called `_internal` but not one called `_schemas`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",772408750,Rename _schemas to _internal,
https://github.com/simonw/datasette/issues/1157#issuecomment-749179460,https://api.github.com/repos/simonw/datasette/issues/1157,749179460,MDEyOklzc3VlQ29tbWVudDc0OTE3OTQ2MA==,9599,simonw,2020-12-21T20:24:19Z,2020-12-21T20:24:19Z,OWNER,"Three places to fix:
https://github.com/simonw/datasette/blob/dcdfb2c301341d45b66683e3e3be72f9c7585b2f/datasette/tracer.py#L40-L42
https://github.com/simonw/datasette/blob/dcdfb2c301341d45b66683e3e3be72f9c7585b2f/datasette/utils/__init__.py#L139-L152
https://github.com/simonw/datasette/blob/dcdfb2c301341d45b66683e3e3be72f9c7585b2f/datasette/views/base.py#L460-L461","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",772438273,Use time.perf_counter() instead of time.time() to measure performance,
https://github.com/simonw/datasette/issues/116#issuecomment-392574208,https://api.github.com/repos/simonw/datasette/issues/116,392574208,MDEyOklzc3VlQ29tbWVudDM5MjU3NDIwOA==,9599,simonw,2018-05-28T17:23:41Z,2018-05-28T17:23:41Z,OWNER,"I'm handling this as separate documentation sections instead, e.g. http://datasette.readthedocs.io/en/latest/spatialite.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274884209,Add documentation section about SQLite extensions,
https://github.com/simonw/datasette/issues/1160#issuecomment-751925934,https://api.github.com/repos/simonw/datasette/issues/1160,751925934,MDEyOklzc3VlQ29tbWVudDc1MTkyNTkzNA==,9599,simonw,2020-12-29T02:40:13Z,2020-12-29T20:25:57Z,OWNER,"Basic command design:
    datasette insert data.db blah.csv
The options can include:
- `--format` to specify the exact format - without this it will be guessed based on the filename
- `--table` to specify the table (otherwise the filename is used)
- `--pk` to specify one or more primary key columns
- `--replace` to specify that existing rows with a matching primary key should be replaced
- `--upsert` to specify that existing matching rows should be upserted
- `--ignore` to ignore matching rows
- `--alter` to alter the table to add missing columns
- `--type column type` to specify the type of a column - useful when working with CSV or TSV files","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-751926095,https://api.github.com/repos/simonw/datasette/issues/1160,751926095,MDEyOklzc3VlQ29tbWVudDc1MTkyNjA5NQ==,9599,simonw,2020-12-29T02:41:15Z,2020-12-29T02:41:15Z,OWNER,"The UI can live at `/-/insert` and be available by default to the `root` user only. It can offer the following:
- Upload a file and have the import type detected (equivalent to `datasette insert data.db thatfile.csv`)
- Copy and paste the data to be inserted into a textarea
- API equivalents of these","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-751926218,https://api.github.com/repos/simonw/datasette/issues/1160,751926218,MDEyOklzc3VlQ29tbWVudDc1MTkyNjIxOA==,9599,simonw,2020-12-29T02:41:57Z,2020-12-29T02:41:57Z,OWNER,"Other names I considered:
- `datasette load`
- `datasette import` - I decided to keep this name available for any future work that might involve plugins that help import data from APIs as opposed to inserting it from files","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-751926437,https://api.github.com/repos/simonw/datasette/issues/1160,751926437,MDEyOklzc3VlQ29tbWVudDc1MTkyNjQzNw==,9599,simonw,2020-12-29T02:43:21Z,2020-12-29T02:43:37Z,OWNER,"Default formats to support:
- CSV
- TSV
- JSON and newline-delimited JSON
- YAML
Each of these will be implemented as a default plugin.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-751943837,https://api.github.com/repos/simonw/datasette/issues/1160,751943837,MDEyOklzc3VlQ29tbWVudDc1MTk0MzgzNw==,9599,simonw,2020-12-29T04:40:30Z,2020-12-29T04:40:30Z,OWNER,"The `insert` command should also accept URLs - anything starting with `http://` or `https://`.
It should accept more than one file name at a time for bulk inserts.
if using a URL that URL will be passed to the method that decides if a plugin implementation can handle the import or not. This will allow plugins to register themselves for specific websites.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-751945094,https://api.github.com/repos/simonw/datasette/issues/1160,751945094,MDEyOklzc3VlQ29tbWVudDc1MTk0NTA5NA==,9599,simonw,2020-12-29T04:48:11Z,2020-12-29T04:48:11Z,OWNER,"It would be pretty cool if you could launch Datasette directly against an insert-compatible file or URL without first having to load it into a SQLite database file.
Or imagine being able to tail a log file and pipe that directly into a new Datasette process, which then runs a web server with the UI while simultaneously continuing to load new entries from that log into the in-memory SQLite database that it is serving...
Not quite sure what that CLI interface would look like. Maybe treat that as a future stretch goal for the moment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-751946262,https://api.github.com/repos/simonw/datasette/issues/1160,751946262,MDEyOklzc3VlQ29tbWVudDc1MTk0NjI2Mg==,9599,simonw,2020-12-29T04:56:12Z,2020-12-29T04:56:32Z,OWNER,"Potential design for this: a `datasette memory` command which takes most of the same arguments as `datasette serve` but starts an in-memory database and treats the command arguments as things that should be inserted into that in-memory database.
    tail -f access.log | datasette memory - \
        --format clf -p 8002 -o","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-751947991,https://api.github.com/repos/simonw/datasette/issues/1160,751947991,MDEyOklzc3VlQ29tbWVudDc1MTk0Nzk5MQ==,9599,simonw,2020-12-29T05:06:50Z,2020-12-29T05:07:03Z,OWNER,"Given the URL option could it be possible for plugins to ""subscribe"" to URLs that keep on streaming?
    datasette insert db.db https://example.com/streaming-api \
        --format api-stream","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-752203909,https://api.github.com/repos/simonw/datasette/issues/1160,752203909,MDEyOklzc3VlQ29tbWVudDc1MjIwMzkwOQ==,9599,simonw,2020-12-29T18:54:19Z,2020-12-29T18:54:19Z,OWNER,More thoughts on this: the key mechanism that populates the tables needs to be an `async def` method of some sort so that it can run as part of the async loop in core Datasette - for importing from web uploads.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-752208036,https://api.github.com/repos/simonw/datasette/issues/1160,752208036,MDEyOklzc3VlQ29tbWVudDc1MjIwODAzNg==,9599,simonw,2020-12-29T19:06:35Z,2020-12-29T19:06:35Z,OWNER,"If I'm going to execute 1000s of writes in an `async def` operation it may make sense to break that up into smaller chunks, so as not to block the event loop for too long.
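A minimal sketch of that chunking pattern (hypothetical helper names, not Datasette's actual write path):

```python
import asyncio

async def insert_in_chunks(rows, write_batch, chunk_size=100):
    # Write in batches, yielding to the event loop between each one
    for start in range(0, len(rows), chunk_size):
        write_batch(rows[start:start + chunk_size])
        await asyncio.sleep(0)  # give other pending tasks a turn

written = []
asyncio.run(insert_in_chunks(list(range(250)), written.extend))
assert written == list(range(250))
```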
https://stackoverflow.com/a/36648102 and https://github.com/python/asyncio/issues/284 confirm that `await asyncio.sleep(0)` is the recommended way of doing this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-752236520,https://api.github.com/repos/simonw/datasette/issues/1160,752236520,MDEyOklzc3VlQ29tbWVudDc1MjIzNjUyMA==,9599,simonw,2020-12-29T20:48:51Z,2020-12-29T20:48:51Z,OWNER,It would be neat if `datasette insert` could accept a `--plugins-dir` option which allowed one-off format plugins to be registered. Bit tricky to implement since the `--format` Click option will already be populated by that plugin hook call.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-752257666,https://api.github.com/repos/simonw/datasette/issues/1160,752257666,MDEyOklzc3VlQ29tbWVudDc1MjI1NzY2Ng==,9599,simonw,2020-12-29T22:09:18Z,2020-12-29T22:09:18Z,OWNER,"### Figuring out the API design
I want to be able to support different formats, and be able to parse them into tables either streaming or in one go depending on if the format supports that.
Ideally I want to be able to pull the first 1,024 bytes for the purpose of detecting the format, then replay those bytes again later. I'm considering this a stretch goal though.
CSV is easy to parse as a stream - here’s [how sqlite-utils does it](https://github.com/simonw/sqlite-utils/blob/f1277f638f3a54a821db6e03cb980adad2f2fa35/sqlite_utils/cli.py#L630):
dialect = ""excel-tab"" if tsv else ""excel""
with file_progress(json_file, silent=silent) as json_file:
reader = csv_std.reader(json_file, dialect=dialect)
headers = next(reader)
docs = (dict(zip(headers, row)) for row in reader)
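Grouping a generator of records like that into fixed-size batches is straightforward with the stdlib; a sketch:

```python
from itertools import islice

def batched(rows, size):
    # Yield lists of up to `size` items from any iterator, without
    # materializing the whole input in memory
    rows = iter(rows)
    while True:
        batch = list(islice(rows, size))
        if not batch:
            return
        yield batch

assert list(batched(range(7), 3)) == [[0, 1, 2], [3, 4, 5], [6]]
```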
Problem: using `db.insert_all()` could block for a long time on a big set of rows. Probably easiest to batch the records before calling `insert_all()` and then run a batch at a time using a `db.execute_write_fn()` call.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-752259345,https://api.github.com/repos/simonw/datasette/issues/1160,752259345,MDEyOklzc3VlQ29tbWVudDc1MjI1OTM0NQ==,9599,simonw,2020-12-29T22:11:54Z,2020-12-29T22:11:54Z,OWNER,"Important detail from https://docs.python.org/3/library/csv.html#csv.reader
> If *csvfile* is a file object, it should be opened with `newline=''`. [1]
>
> [...]
>
> If `newline=''` is not specified, newlines embedded inside quoted fields will not be interpreted correctly, and on platforms that use `\r\n` line endings on write an extra `\r` will be added. It should always be safe to specify `newline=''`, since the csv module does its own ([universal](https://docs.python.org/3/glossary.html#term-universal-newlines)) newline handling.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-752265600,https://api.github.com/repos/simonw/datasette/issues/1160,752265600,MDEyOklzc3VlQ29tbWVudDc1MjI2NTYwMA==,9599,simonw,2020-12-29T22:39:56Z,2020-12-29T22:39:56Z,OWNER,"Does it definitely make sense to break this operation up into the code that turns the incoming format into an iterator of dictionaries, then the code that inserts those into the database using `sqlite-utils`?
That seems right for simple imports, where the incoming file represents a sequence of records in a single table. But what about more complex formats? What if a format needs to be represented as multiple tables?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-752266076,https://api.github.com/repos/simonw/datasette/issues/1160,752266076,MDEyOklzc3VlQ29tbWVudDc1MjI2NjA3Ng==,9599,simonw,2020-12-29T22:42:23Z,2020-12-29T22:42:59Z,OWNER,"Aside: maybe `datasette insert` works against simple files, but a later mechanism called `datasette import` allows plugins to register sub-commands, like `datasette import github ...` or `datasette import jira ...` or whatever.
This would be useful for import mechanisms that are likely to need their own custom set of command-line options unique to that source.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-752267905,https://api.github.com/repos/simonw/datasette/issues/1160,752267905,MDEyOklzc3VlQ29tbWVudDc1MjI2NzkwNQ==,9599,simonw,2020-12-29T22:52:09Z,2020-12-29T22:52:09Z,OWNER,"What's the simplest thing that could possibly work? I think it's `datasette insert blah.db data.csv` - no URL handling, no other formats.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-752273306,https://api.github.com/repos/simonw/datasette/issues/1160,752273306,MDEyOklzc3VlQ29tbWVudDc1MjI3MzMwNg==,9599,simonw,2020-12-29T23:19:15Z,2020-12-29T23:19:15Z,OWNER,It would be nice if this abstraction could support progress bars as well. These won't necessarily work for every format - or they might work for things loaded from files but not things loaded over URLs (if the `content-length` HTTP header is missing) - but if they ARE possible it would be good to provide them - both for the CLI interface and the web insert UI.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-752273400,https://api.github.com/repos/simonw/datasette/issues/1160,752273400,MDEyOklzc3VlQ29tbWVudDc1MjI3MzQwMA==,9599,simonw,2020-12-29T23:19:46Z,2020-12-29T23:19:46Z,OWNER,I'm going to break out some separate tickets.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-752273873,https://api.github.com/repos/simonw/datasette/issues/1160,752273873,MDEyOklzc3VlQ29tbWVudDc1MjI3Mzg3Mw==,9599,simonw,2020-12-29T23:22:30Z,2020-12-29T23:22:30Z,OWNER,"How much of this should I get done in a branch before merging into `main`?
The challenge here is the plugin hook design: ideally I don't want an incomplete plugin hook design in `main` since that could be a blocker for a release.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-752274078,https://api.github.com/repos/simonw/datasette/issues/1160,752274078,MDEyOklzc3VlQ29tbWVudDc1MjI3NDA3OA==,9599,simonw,2020-12-29T23:23:39Z,2020-12-29T23:23:39Z,OWNER,"If I design this right I can ship a full version of the command-line `datasette insert` command in a release without doing any work at all on the Web UI version of it - that UI can then come later, without needing any changes to be made to the plugin hook.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-752274509,https://api.github.com/repos/simonw/datasette/issues/1160,752274509,MDEyOklzc3VlQ29tbWVudDc1MjI3NDUwOQ==,9599,simonw,2020-12-29T23:26:02Z,2020-12-29T23:26:02Z,OWNER,"The documentation for this plugin hook is going to be pretty detailed, since it involves writing custom classes.
I'll stick it all on the existing hooks page for the moment, but I should think about breaking up the plugin hook documentation into a page-per-hook in the future.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-752275611,https://api.github.com/repos/simonw/datasette/issues/1160,752275611,MDEyOklzc3VlQ29tbWVudDc1MjI3NTYxMQ==,9599,simonw,2020-12-29T23:32:04Z,2020-12-29T23:32:04Z,OWNER,"If I can get this working for CSV, TSV, JSON and JSON-NL that should be enough to exercise the API design pretty well across both streaming and non-streaming formats.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1160#issuecomment-753568428,https://api.github.com/repos/simonw/datasette/issues/1160,753568428,MDEyOklzc3VlQ29tbWVudDc1MzU2ODQyOA==,9599,simonw,2021-01-03T05:02:32Z,2021-01-03T05:02:32Z,OWNER,"Should this command include a `--fts` option for configuring full-text search on one or more columns?
I thought about doing that for `sqlite-utils insert` in https://github.com/simonw/sqlite-utils/issues/202 and decided not to because of the need to include extra options covering the FTS version, porter stemming options and whether or not to create triggers.
But maybe I can set sensible defaults for that with `datasette insert ... -f title -f body`? Worth thinking about a bit more.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook",
https://github.com/simonw/datasette/issues/1161#issuecomment-752253095,https://api.github.com/repos/simonw/datasette/issues/1161,752253095,MDEyOklzc3VlQ29tbWVudDc1MjI1MzA5NQ==,9599,simonw,2020-12-29T21:49:57Z,2020-12-29T21:49:57Z,OWNER,https://github.com/Homebrew/homebrew-core/pull/67983,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776101101,Update a whole bunch of links to datasette.io instead of datasette.readthedocs.io,
https://github.com/simonw/datasette/issues/1163#issuecomment-752274340,https://api.github.com/repos/simonw/datasette/issues/1163,752274340,MDEyOklzc3VlQ29tbWVudDc1MjI3NDM0MA==,9599,simonw,2020-12-29T23:25:02Z,2020-12-29T23:25:02Z,OWNER,This will be built on top of `httpx` since that's already a dependency.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776128565,"""datasette insert data.db url-to-csv""",
https://github.com/simonw/datasette/issues/1164#issuecomment-752756612,https://api.github.com/repos/simonw/datasette/issues/1164,752756612,MDEyOklzc3VlQ29tbWVudDc1Mjc1NjYxMg==,9599,simonw,2020-12-30T20:59:54Z,2020-12-30T20:59:54Z,OWNER,"I tried a few different pure-Python JavaScript minifying libraries and none of them produced results as good as https://www.npmjs.com/package/uglify-js for the plugin code I'm considering in #983.
So I think I'll need to rely on a Node.js tool for this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette,
https://github.com/simonw/datasette/issues/1164#issuecomment-752757075,https://api.github.com/repos/simonw/datasette/issues/1164,752757075,MDEyOklzc3VlQ29tbWVudDc1Mjc1NzA3NQ==,9599,simonw,2020-12-30T21:01:27Z,2020-12-30T21:01:27Z,OWNER,"I don't want Datasette contributors to need a working Node.js install to run the tests or work on Datasette unless they are explicitly working on the JavaScript.
I think I'm going to do this with a unit test that runs only if `uglify-js` is available on the path and confirms that the `*.min.js` version of each script in the repository correctly matches the results from running `uglify-js` against it.
That way if anyone checks in a change to JavaScript but forgets to run the minifier the tests will fail in CI.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette,
https://github.com/simonw/datasette/issues/1164#issuecomment-752768652,https://api.github.com/repos/simonw/datasette/issues/1164,752768652,MDEyOklzc3VlQ29tbWVudDc1Mjc2ODY1Mg==,9599,simonw,2020-12-30T21:46:29Z,2020-12-30T21:46:29Z,OWNER,Running https://skalman.github.io/UglifyJS-online/ against https://github.com/simonw/datasette/blob/0.53/datasette/static/table.js knocks it down from 7810 characters to 4643.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette,
https://github.com/simonw/datasette/issues/1164#issuecomment-752768785,https://api.github.com/repos/simonw/datasette/issues/1164,752768785,MDEyOklzc3VlQ29tbWVudDc1Mjc2ODc4NQ==,9599,simonw,2020-12-30T21:47:06Z,2020-12-30T21:47:06Z,OWNER,If I'm going to minify `table.js` I'd like to offer a source map for it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette,
https://github.com/simonw/datasette/issues/1164#issuecomment-752769452,https://api.github.com/repos/simonw/datasette/issues/1164,752769452,MDEyOklzc3VlQ29tbWVudDc1Mjc2OTQ1Mg==,9599,simonw,2020-12-30T21:50:16Z,2020-12-30T21:50:16Z,OWNER,If I implement this I can automate the CodeMirror minification and remove the bit about running `uglify-js` against it from the documentation here: https://docs.datasette.io/en/0.53/contributing.html#upgrading-codemirror,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette,
https://github.com/simonw/datasette/issues/1164#issuecomment-753220412,https://api.github.com/repos/simonw/datasette/issues/1164,753220412,MDEyOklzc3VlQ29tbWVudDc1MzIyMDQxMg==,9599,simonw,2020-12-31T22:47:36Z,2020-12-31T22:47:36Z,OWNER,"I'm trying to minify `table.js` and I ran into a problem:
    Uglification failed. Unexpected character '`'
It turns out `uglify-js` doesn't support ES6 syntax!
But `uglify-es` does:
    npm install uglify-es
Annoyingly it looks like `uglify-es` uses the same CLI command, `uglifyjs`. So after installing it this seemed to work:
    npx uglifyjs table.js --source-map -o table.min.js
I really don't like how `npx uglifyjs` could mean different things depending on which package was installed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette,
https://github.com/simonw/datasette/issues/1164#issuecomment-753220665,https://api.github.com/repos/simonw/datasette/issues/1164,753220665,MDEyOklzc3VlQ29tbWVudDc1MzIyMDY2NQ==,9599,simonw,2020-12-31T22:49:36Z,2020-12-31T22:49:36Z,OWNER,"I started with a 7K `table.js` file.
`npx uglifyjs table.js --source-map -o table.min.js` gave me a 5.6K `table.min.js` file.
`npx uglifyjs table.js --source-map -o table.min.js --compress --mangle` gave me 4.5K.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette,
https://github.com/simonw/datasette/issues/1164#issuecomment-753221362,https://api.github.com/repos/simonw/datasette/issues/1164,753221362,MDEyOklzc3VlQ29tbWVudDc1MzIyMTM2Mg==,9599,simonw,2020-12-31T22:55:57Z,2020-12-31T22:55:57Z,OWNER,"I had to add this as the first line in `table.min.js` for the source mapping to work:
```
//# sourceMappingURL=/-/static/table.min.js.map
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette,
https://github.com/simonw/datasette/issues/1164#issuecomment-754182058,https://api.github.com/repos/simonw/datasette/issues/1164,754182058,MDEyOklzc3VlQ29tbWVudDc1NDE4MjA1OA==,9599,simonw,2021-01-04T19:53:31Z,2021-01-04T19:53:31Z,OWNER,This will be helped by the new `package.json` added in #1170.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette,
https://github.com/simonw/datasette/issues/1165#issuecomment-752757910,https://api.github.com/repos/simonw/datasette/issues/1165,752757910,MDEyOklzc3VlQ29tbWVudDc1Mjc1NzkxMA==,9599,simonw,2020-12-30T21:04:18Z,2020-12-30T21:04:18Z,OWNER,https://jestjs.io/ looks worth trying here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests,
https://github.com/simonw/datasette/issues/1165#issuecomment-752777744,https://api.github.com/repos/simonw/datasette/issues/1165,752777744,MDEyOklzc3VlQ29tbWVudDc1Mjc3Nzc0NA==,9599,simonw,2020-12-30T22:30:24Z,2020-12-30T22:30:24Z,OWNER,"https://www.valentinog.com/blog/jest/ was useful.
I created a `static/__tests__` folder and added this file as `plugins.spec.js`:
```javascript
const datasette = require(""../plugins.js"");
describe(""Datasette Plugins"", () => {
test(""it should have datasette.plugins"", () => {
expect(!!datasette.plugins).toEqual(true);
});
test(""registering a plugin should work"", () => {
datasette.plugins.register(""numbers"", (a, b) => a + b, [""a"", ""b""]);
var result = datasette.plugins.call(""numbers"", { a: 1, b: 2 });
expect(result).toEqual([3]);
datasette.plugins.register(""numbers"", (a, b) => a * b, [""a"", ""b""]);
var result2 = datasette.plugins.call(""numbers"", { a: 1, b: 2 });
expect(result2).toEqual([3, 2]);
});
});
```
In `static/plugins.js` I put this:
```javascript
var datasette = datasette || {};
datasette.plugins = (() => {
var registry = {};
return {
register: (hook, fn, parameters) => {
if (!registry[hook]) {
registry[hook] = [];
}
registry[hook].push([fn, parameters]);
},
call: (hook, args) => {
args = args || {};
var results = [];
(registry[hook] || []).forEach(([fn, parameters]) => {
/* Call with the correct arguments */
var result = fn.apply(fn, parameters.map(parameter => args[parameter]));
if (result !== undefined) {
results.push(result);
}
});
return results;
}
};
})();
module.exports = datasette;
```
Note the `module.exports` line at the end.
Then inside `static/` I ran the following command:
```
% npx jest -c '{}'
PASS __tests__/plugins.spec.js
Datasette Plugins
✓ it should have datasette.plugins (3 ms)
✓ registering a plugin should work (1 ms)
Test Suites: 1 passed, 1 total
Tests: 2 passed, 2 total
Snapshots: 0 total
Time: 1.163 s
Ran all test suites.
```
The `-c {}` was necessary because I didn't have a Jest configuration or a `package.json`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests,
https://github.com/simonw/datasette/issues/1165#issuecomment-752779490,https://api.github.com/repos/simonw/datasette/issues/1165,752779490,MDEyOklzc3VlQ29tbWVudDc1Mjc3OTQ5MA==,9599,simonw,2020-12-30T22:38:43Z,2020-12-30T22:38:43Z,OWNER,Turned that into a TIL: https://til.simonwillison.net/javascript/jest-without-package-json,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests,
https://github.com/simonw/datasette/issues/1165#issuecomment-752779820,https://api.github.com/repos/simonw/datasette/issues/1165,752779820,MDEyOklzc3VlQ29tbWVudDc1Mjc3OTgyMA==,9599,simonw,2020-12-30T22:40:28Z,2020-12-30T22:40:28Z,OWNER,"I don't know if Jest on the command-line is the right tool for this. It works for the `plugins.js` script but I'm increasingly going to want to start adding tests for browser JavaScript features - like the https://github.com/simonw/datasette/blob/0.53/datasette/static/table.js script - which will need to run in a browser.
So maybe I should just find a browser testing solution and figure out how to run that under CI in GitHub Actions. Maybe https://www.cypress.io/ ?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests,
https://github.com/simonw/datasette/issues/1165#issuecomment-752780000,https://api.github.com/repos/simonw/datasette/issues/1165,752780000,MDEyOklzc3VlQ29tbWVudDc1Mjc4MDAwMA==,9599,simonw,2020-12-30T22:41:25Z,2020-12-30T22:41:25Z,OWNER,Jest works with Puppeteer: https://jestjs.io/docs/en/puppeteer,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests,
https://github.com/simonw/datasette/issues/1165#issuecomment-752828851,https://api.github.com/repos/simonw/datasette/issues/1165,752828851,MDEyOklzc3VlQ29tbWVudDc1MjgyODg1MQ==,9599,simonw,2020-12-31T03:19:38Z,2020-12-31T03:19:38Z,OWNER,"I got Cypress working! I added the `datasette.plugins` code to the table template and ran a test called `plugins.spec.js` using the following:
```javascript
context('datasette.plugins API', () => {
beforeEach(() => {
cy.visit('/fixtures/compound_three_primary_keys')
});
it('should exist', () => {
let datasette;
cy.window().then(win => {
datasette = win.datasette;
}).then(() => {
expect(datasette).to.exist;
expect(datasette.plugins).to.exist;
});
});
it('should register and execute plugins', () => {
let datasette;
cy.window().then(win => {
datasette = win.datasette;
}).then(() => {
expect(datasette.plugins.call('numbers')).to.deep.equal([]);
// Register a plugin
datasette.plugins.register(""numbers"", (a, b) => a + b, ['a', 'b']);
var result = datasette.plugins.call(""numbers"", {a: 1, b: 2});
expect(result).to.deep.equal([3]);
// Second plugin
datasette.plugins.register(""numbers"", (a, b) => a * b, ['a', 'b']);
var result2 = datasette.plugins.call(""numbers"", {a: 1, b: 2});
expect(result2).to.deep.equal([3, 2]);
});
});
});
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests,
https://github.com/simonw/datasette/issues/1165#issuecomment-752839433,https://api.github.com/repos/simonw/datasette/issues/1165,752839433,MDEyOklzc3VlQ29tbWVudDc1MjgzOTQzMw==,9599,simonw,2020-12-31T04:29:40Z,2020-12-31T04:29:40Z,OWNER,Important to absorb the slightly bizarre assertion syntax from Chai - docs here https://www.chaijs.com/api/bdd/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests,
https://github.com/simonw/datasette/issues/1165#issuecomment-752846267,https://api.github.com/repos/simonw/datasette/issues/1165,752846267,MDEyOklzc3VlQ29tbWVudDc1Mjg0NjI2Nw==,9599,simonw,2020-12-31T05:10:41Z,2020-12-31T05:13:14Z,OWNER,"https://github.com/PostHog/posthog/tree/master/cypress/integration has some useful examples, linked from this article: https://posthog.com/blog/cypress-end-to-end-tests
Also useful: their workflow https://github.com/PostHog/posthog/blob/master/.github/workflows/e2e.yml","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests,
https://github.com/simonw/datasette/issues/1166#issuecomment-753193475,https://api.github.com/repos/simonw/datasette/issues/1166,753193475,MDEyOklzc3VlQ29tbWVudDc1MzE5MzQ3NQ==,9599,simonw,2020-12-31T21:33:00Z,2020-12-31T21:33:00Z,OWNER,"I want a CI check that confirms that files conform to prettier - but only `datasette/static/*.js` files that are not already minified.
This seems to do the job:
    npx prettier --check 'datasette/static/*[!.min].js'
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting,
https://github.com/simonw/datasette/issues/1166#issuecomment-753195905,https://api.github.com/repos/simonw/datasette/issues/1166,753195905,MDEyOklzc3VlQ29tbWVudDc1MzE5NTkwNQ==,9599,simonw,2020-12-31T21:34:46Z,2020-12-31T21:34:46Z,OWNER,This action looks good - tag 3.2 is equivalent to this commit hash: https://github.com/creyD/prettier_action/tree/bb361e2979cff283ca7684908deac8f95400e779,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting,
https://github.com/simonw/datasette/issues/1166#issuecomment-753197957,https://api.github.com/repos/simonw/datasette/issues/1166,753197957,MDEyOklzc3VlQ29tbWVudDc1MzE5Nzk1Nw==,9599,simonw,2020-12-31T21:36:14Z,2020-12-31T21:36:14Z,OWNER,"Maybe not that action actually - I wanted to use a pre-built action to avoid installing Prettier every time, but that's what it seems to do: https://github.com/creyD/prettier_action/blob/bb361e2979cff283ca7684908deac8f95400e779/entrypoint.sh#L28-L37","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting,
https://github.com/simonw/datasette/issues/1166#issuecomment-753200580,https://api.github.com/repos/simonw/datasette/issues/1166,753200580,MDEyOklzc3VlQ29tbWVudDc1MzIwMDU4MA==,9599,simonw,2020-12-31T21:38:06Z,2020-12-31T21:38:06Z,OWNER,"I think this should work:
```
- uses: actions/cache@v2
with:
path: ~/.npm
key: ${{ runner.os }}-node-${{ hashFiles('**/prettier.yml') }}
```
I'll use the `prettier.yml` workflow that I'm about to create as the cache key for the NPM cache.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting,
https://github.com/simonw/datasette/issues/1166#issuecomment-753209192,https://api.github.com/repos/simonw/datasette/issues/1166,753209192,MDEyOklzc3VlQ29tbWVudDc1MzIwOTE5Mg==,9599,simonw,2020-12-31T21:44:22Z,2020-12-31T21:44:22Z,OWNER,"Tests passed in https://github.com/simonw/datasette/runs/1631677726?check_suite_focus=true
I'm going to try submitting a pull request with badly formatted JavaScript to see if it gets caught.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting,
https://github.com/simonw/datasette/issues/1166#issuecomment-753210536,https://api.github.com/repos/simonw/datasette/issues/1166,753210536,MDEyOklzc3VlQ29tbWVudDc1MzIxMDUzNg==,9599,simonw,2020-12-31T21:45:19Z,2020-12-31T21:45:19Z,OWNER,"Oops, committed that bad formatting test to `main` instead of a branch!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting,
https://github.com/simonw/datasette/issues/1166#issuecomment-753211535,https://api.github.com/repos/simonw/datasette/issues/1166,753211535,MDEyOklzc3VlQ29tbWVudDc1MzIxMTUzNQ==,9599,simonw,2020-12-31T21:46:04Z,2020-12-31T21:46:04Z,OWNER,"https://github.com/simonw/datasette/runs/1631682372?check_suite_focus=true failed!
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting,
https://github.com/simonw/datasette/issues/1166#issuecomment-753214664,https://api.github.com/repos/simonw/datasette/issues/1166,753214664,MDEyOklzc3VlQ29tbWVudDc1MzIxNDY2NA==,9599,simonw,2020-12-31T21:58:04Z,2020-12-31T21:58:04Z,OWNER,Wrote a TIL about this: https://til.simonwillison.net/github-actions/prettier-github-actions,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting,
https://github.com/simonw/datasette/issues/1166#issuecomment-753224351,https://api.github.com/repos/simonw/datasette/issues/1166,753224351,MDEyOklzc3VlQ29tbWVudDc1MzIyNDM1MQ==,9599,simonw,2020-12-31T23:23:29Z,2020-12-31T23:23:29Z,OWNER,I should configure the action to only run if changes have been made within the `datasette/static` directory.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting,
https://github.com/simonw/datasette/issues/1167#issuecomment-766487520,https://api.github.com/repos/simonw/datasette/issues/1167,766487520,MDEyOklzc3VlQ29tbWVudDc2NjQ4NzUyMA==,9599,simonw,2021-01-25T01:44:43Z,2021-01-25T01:44:43Z,OWNER,"Thanks @benpickles, I just merged that in. I'll use it in the documentation.
    # To check code is conformant
    npm run prettier -- --check
    # To fix it if it isn't
    npm run fix
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777145954,Add Prettier to contributing documentation,
https://github.com/simonw/datasette/issues/1167#issuecomment-766491566,https://api.github.com/repos/simonw/datasette/issues/1167,766491566,MDEyOklzc3VlQ29tbWVudDc2NjQ5MTU2Ng==,9599,simonw,2021-01-25T02:01:19Z,2021-01-25T02:01:19Z,OWNER,New documentation section here (I documented Black as well): https://docs.datasette.io/en/latest/contributing.html#code-formatting,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777145954,Add Prettier to contributing documentation,
https://github.com/simonw/datasette/issues/1168#issuecomment-1739816358,https://api.github.com/repos/simonw/datasette/issues/1168,1739816358,IC_kwDOBm6k_c5ns32m,9599,simonw,2023-09-28T18:29:05Z,2023-09-28T18:29:05Z,OWNER,Datasette Cloud really wants this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753366024,https://api.github.com/repos/simonw/datasette/issues/1168,753366024,MDEyOklzc3VlQ29tbWVudDc1MzM2NjAyNA==,9599,simonw,2021-01-01T18:48:34Z,2021-01-01T18:48:34Z,OWNER,Also: in #188 I proposed bundling metadata in the SQLite database itself alongside the data. This is a great way of ensuring metadata travels with the data when it is downloaded as a SQLite `.db` file. But how would that play with the idea of an in-memory `_metadata` table? Could that table perhaps offer views that join data across multiple attached physical databases?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753388809,https://api.github.com/repos/simonw/datasette/issues/1168,753388809,MDEyOklzc3VlQ29tbWVudDc1MzM4ODgwOQ==,9599,simonw,2021-01-01T21:47:51Z,2021-01-01T21:47:51Z,OWNER,"A database that exposes metadata will have the same restriction as the new `_internal` database that exposes columns and tables, in that it needs to take permissions into account. A user should not be able to view metadata for tables that they are not able to see.
As such, I'd rather bundle any metadata tables into the existing `_internal` database so I don't have to solve that permissions problem in two places.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753389477,https://api.github.com/repos/simonw/datasette/issues/1168,753389477,MDEyOklzc3VlQ29tbWVudDc1MzM4OTQ3Nw==,9599,simonw,2021-01-01T21:49:57Z,2021-01-01T21:49:57Z,OWNER,"What if metadata was stored in a JSON text column in the existing `_internal` tables? This would allow for users to invent additional metadata fields in the future beyond the current `license`, `license_url` etc fields - without needing a schema change.
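For illustration, a sketch of what such a query could look like using SQLite's JSON1 functions (the schema and values here are invented, and this assumes a SQLite build with JSON1 available):

```python
import json
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('create table tables (name text, metadata text)')
conn.execute(
    'insert into tables values (?, ?)',
    ('repos', json.dumps({'license': 'Apache 2.0'})),
)
# json_extract() pulls a single key out of the JSON text column;
# the path and the pattern are both bound as parameters here
rows = conn.execute(
    'select name from tables where json_extract(metadata, ?) like ?',
    ('$.license', 'Apache%'),
).fetchall()
```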
The downside of JSON columns generally is that they're harder to run indexed queries against. For metadata I don't think that matters - even with 10,000 tables each with their own metadata a SQL query asking for e.g. ""everything that has Apache 2 as the license"" would return in just a few ms.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753389938,https://api.github.com/repos/simonw/datasette/issues/1168,753389938,MDEyOklzc3VlQ29tbWVudDc1MzM4OTkzOA==,9599,simonw,2021-01-01T21:54:15Z,2021-01-01T21:54:15Z,OWNER,"So what if the `databases`, `tables` and `columns` tables in `_internal` each grew a new `metadata` text column?
These columns could be populated by Datasette on startup through reading the `metadata.json` file. But how would plugins interact with them?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753390262,https://api.github.com/repos/simonw/datasette/issues/1168,753390262,MDEyOklzc3VlQ29tbWVudDc1MzM5MDI2Mg==,9599,simonw,2021-01-01T21:58:11Z,2021-01-01T21:58:11Z,OWNER,"One possibility: plugins could write directly to that in-memory database table. But how would they know to write again should the server restart? Maybe they would write to it once when called by the `startup` plugin hook, and then update it (and their own backing store) when metadata changes for some reason. Feels a bit messy though.
Also: if I want to support metadata optionally living in a `_metadata` table colocated with the data in a SQLite database file itself, how would that affect the `metadata` columns in `_internal`? How often would Datasette denormalize and copy data across from the on-disk `_metadata` tables to the `_internal` in-memory columns?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753390791,https://api.github.com/repos/simonw/datasette/issues/1168,753390791,MDEyOklzc3VlQ29tbWVudDc1MzM5MDc5MQ==,9599,simonw,2021-01-01T22:00:42Z,2021-01-01T22:00:42Z,OWNER,"Here are the requirements I'm currently trying to satisfy:
- It should be possible to query the metadata for ALL attached tables in one place, potentially with pagination and filtering
- Metadata should be able to exist in the current `metadata.json` file
- It should also be possible to bundle metadata in a table in the SQLite database files themselves
- Plugins should be able to define their own special mechanisms for metadata. This is particularly interesting for providing a UI that allows users to edit the metadata for their existing tables.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753391869,https://api.github.com/repos/simonw/datasette/issues/1168,753391869,MDEyOklzc3VlQ29tbWVudDc1MzM5MTg2OQ==,9599,simonw,2021-01-01T22:04:30Z,2021-01-01T22:04:30Z,OWNER,"The sticking point here seems to be the plugin hook. Allowing plugins to over-ride the way the question ""give me the metadata for this database/table/column"" is answered makes the database-backed metadata mechanisms much more complicated to think about.
What if plugins didn't get to over-ride metadata in this way, but could instead update the metadata in a persistent Datasette-managed storage mechanism?
Then maybe Datasette could do the following:
- Maintain metadata in `_internal` that has been loaded from `metadata.json`
- Know how to check a database for baked-in metadata (maybe in a `_metadata` table)
- Know how to fall back on the `_internal` metadata if no baked-in metadata is available
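A rough illustration of that fallback order (the function name and dict shapes here are hypothetical, just to show the idea):

```python
# Hypothetical sketch of the fallback described above: prefer baked-in
# _metadata from the database file itself, fall back to the _internal
# copy that was loaded from metadata.json
def resolve_metadata(baked_in, internal):
    if baked_in is not None:
        return baked_in
    return internal or {}

resolved = resolve_metadata(None, {'title': 'From metadata.json'})
```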
If database files were optionally allowed to store metadata about tables that live in another database file this could perhaps solve the plugin needs - since an ""edit metadata"" plugin would be able to edit records in a separate, dedicated `metadata.db` database to store new information about tables in other files.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753392102,https://api.github.com/repos/simonw/datasette/issues/1168,753392102,MDEyOklzc3VlQ29tbWVudDc1MzM5MjEwMg==,9599,simonw,2021-01-01T22:06:33Z,2021-01-01T22:06:33Z,OWNER,"Some SQLite databases include SQL comments in the schema definition which tell you what each column means:
```sql
CREATE TABLE User
-- A table comment
(
    uid INTEGER,  -- A field comment
    flags INTEGER -- Another field comment
);
```
The problem with these is that they're not exposed to SQLite in any mechanism other than parsing the `CREATE TABLE` statement from the `sqlite_master` table to extract those columns.
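Pulling them back out with a naive regex is at least possible - illustrative only, a real implementation would need a proper tokenizer to avoid matching `--` inside string literals:

```python
import re
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('''
    CREATE TABLE User
    -- A table comment
    (
        uid INTEGER, -- A field comment
        flags INTEGER -- Another field comment
    )
''')
# sqlite_master preserves the original CREATE TABLE text, comments included
(schema,) = conn.execute(
    'select sql from sqlite_master where name = ?', ['User']
).fetchone()
# Naive: grabs every trailing -- comment, would break on -- inside strings
comments = re.findall(r'--\s*(.+)', schema)
```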
I had an idea to build a plugin that could return these. That would be easy with a ""get metadata for this column"" plugin hook - in the absence of one a plugin could still run that reads the schemas on startup and uses them to populate a metadata database table somewhere.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753398542,https://api.github.com/repos/simonw/datasette/issues/1168,753398542,MDEyOklzc3VlQ29tbWVudDc1MzM5ODU0Mg==,9599,simonw,2021-01-01T22:37:24Z,2021-01-01T22:37:24Z,OWNER,"The direction I'm leaning in now is the following:
- Metadata always lives in SQLite tables
- These tables can be co-located with the database they describe (same DB file)
- ... or they can be in a different DB file and reference the other database that they are describing
- Metadata provided on startup in a `metadata.json` file is loaded into an in-memory metadata table using that same mechanism
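Loading `metadata.json` into such a table could be as simple as flattening the nested structure into rows - a sketch, with a hypothetical key layout:

```python
import json

# Sketch: flatten a metadata.json-style nested dict into
# (database, table, metadata) rows ready for a _metadata table
metadata = {
    'databases': {
        'fixtures': {
            'tables': {
                'sortable': {'title': 'Sortable table'},
            },
        },
    },
}
rows = []
for db_name, db_meta in metadata.get('databases', {}).items():
    for table_name, table_meta in db_meta.get('tables', {}).items():
        rows.append((db_name, table_name, json.dumps(table_meta)))
```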
Plugins that want to provide metadata can do so by populating a table. They could even maintain their own in-memory database for this, or they could write to the `_internal` in-memory database, or they could write to a table in a database on disk.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753399366,https://api.github.com/repos/simonw/datasette/issues/1168,753399366,MDEyOklzc3VlQ29tbWVudDc1MzM5OTM2Ng==,9599,simonw,2021-01-01T22:42:37Z,2021-01-01T22:42:37Z,OWNER,"So what would the database schema for this look like?
I'm leaning towards a single table called `_metadata`, because that's a neater fit for baking the metadata into the database file along with the data that it is describing. Alternatively I could have multiple tables sharing that prefix - `_metadata_database` and `_metadata_tables` and `_metadata_columns` perhaps.
If it's just a single `_metadata` table, the schema could look like this:
| database | table | column | metadata |
| --- | --- | --- | --- |
| | mytable | | {""title"": ""My Table"" } |
| | mytable | mycolumn | {""description"": ""Column description"" } |
| otherdb | othertable | | {""description"": ""Table in another DB"" } |
If the `database` column is `null` it means ""this is describing a table in the same database file as this `_metadata` table"".
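Those example rows could be sketched like this - purely illustrative, not a final schema:

```python
import json
import sqlite3

conn = sqlite3.connect(':memory:')
# Bracket-quoted because database, table and column are SQL keywords
conn.execute('''
    create table _metadata (
        [database] text,
        [table] text,
        [column] text,
        metadata text
    )
''')
rows = [
    (None, 'mytable', None, json.dumps({'title': 'My Table'})),
    (None, 'mytable', 'mycolumn', json.dumps({'description': 'Column description'})),
    ('otherdb', 'othertable', None, json.dumps({'description': 'Table in another DB'})),
]
conn.executemany('insert into _metadata values (?, ?, ?, ?)', rows)
# [database] is null = describing a table in this same database file
same_db = conn.execute(
    'select [table], metadata from _metadata where [database] is null'
).fetchall()
```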
The alternative to the `metadata` JSON column would be separate columns for each potential metadata value - `license`, `source`, `about`, `about_url` etc. But that makes it harder for people to create custom metadata fields.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753399428,https://api.github.com/repos/simonw/datasette/issues/1168,753399428,MDEyOklzc3VlQ29tbWVudDc1MzM5OTQyOA==,9599,simonw,2021-01-01T22:43:14Z,2021-01-01T22:43:22Z,OWNER,"Could this use a compound primary key on `database, table, column`? Does that work with null values?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753399635,https://api.github.com/repos/simonw/datasette/issues/1168,753399635,MDEyOklzc3VlQ29tbWVudDc1MzM5OTYzNQ==,9599,simonw,2021-01-01T22:45:21Z,2021-01-01T22:50:21Z,OWNER,"Would also need to figure out the precedence rules:
- What happens if the database has a `_metadata` table with data that conflicts with a remote metadata record from another database? I think the other database should win, because that allows plugins to over-ride the default metadata for something.
- Do JSON values get merged together? So if one table provides a description and another provides a title do both values get returned?
- If a database has a `license`, does that ""cascade"" down to the tables? What about `source` and `about`?
- What if there are two databases (or more) that provide conflicting metadata for a table in some other database? Also, `_internal` may have loaded data from `metadata.json` that conflicts with some other remote table metadata definition.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753400265,https://api.github.com/repos/simonw/datasette/issues/1168,753400265,MDEyOklzc3VlQ29tbWVudDc1MzQwMDI2NQ==,9599,simonw,2021-01-01T22:52:09Z,2021-01-01T22:52:09Z,OWNER,"From an implementation perspective, I think the way this works is SQL queries read the relevant metadata from ALL available metadata tables, then Python code solves the precedence rules to produce the final, combined metadata for a database/table/column.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753400306,https://api.github.com/repos/simonw/datasette/issues/1168,753400306,MDEyOklzc3VlQ29tbWVudDc1MzQwMDMwNg==,9599,simonw,2021-01-01T22:52:44Z,2021-01-01T22:52:44Z,OWNER,"Also: probably load column metadata as part of the table metadata rather than loading column metadata individually, since it's going to be rare to want the metadata for a single column rather than for an entire table full of columns.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753400420,https://api.github.com/repos/simonw/datasette/issues/1168,753400420,MDEyOklzc3VlQ29tbWVudDc1MzQwMDQyMA==,9599,simonw,2021-01-01T22:53:58Z,2021-01-01T22:53:58Z,OWNER,"Precedence idea:
- First priority is non-_internal metadata from other databases - if those conflict, the alphabetically-first database name wins
- Next priority: `_internal` metadata, which should have been loaded from `metadata.json`
- Last priority: the `_metadata` table from that database itself, i.e. the default ""baked in"" metadata","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753401001,https://api.github.com/repos/simonw/datasette/issues/1168,753401001,MDEyOklzc3VlQ29tbWVudDc1MzQwMTAwMQ==,9599,simonw,2021-01-01T23:01:45Z,2021-01-01T23:01:45Z,OWNER,I need to prototype this. Could I do that as a plugin? I think so - I could try out the algorithm for loading metadata and display it on pages using some custom templates.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753402423,https://api.github.com/repos/simonw/datasette/issues/1168,753402423,MDEyOklzc3VlQ29tbWVudDc1MzQwMjQyMw==,9599,simonw,2021-01-01T23:16:05Z,2021-01-01T23:16:05Z,OWNER,"One catch: solving the ""show me all metadata for everything in this Datasette instance"" problem.
Ideally there would be a SQLite table that can be queried for this. But the need to resolve the potentially complex set of precedence rules means that table would be difficult if not impossible to provide at run-time.
Ideally a denormalized table would be available that featured the results of running those precedence rule calculations. But how to handle keeping this up-to-date? It would need to be recalculated any time a `_metadata` table in any of the attached databases had an update.
This is a much larger problem - but one potential fix would be to use triggers to maintain a ""version number"" for the `_metadata` table - similar to SQLite's own built-in `schema_version` mechanism. Triggers could increment a counter any time a record in that table was added, deleted or updated.
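The trigger idea could be sketched like this - illustrative only, the table and trigger names are made up:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.executescript('''
    create table _metadata ([database] text, [table] text, [column] text, metadata text);
    create table _metadata_version (version integer);
    insert into _metadata_version values (0);

    create trigger _metadata_insert after insert on _metadata
    begin
        update _metadata_version set version = version + 1;
    end;
    create trigger _metadata_update after update on _metadata
    begin
        update _metadata_version set version = version + 1;
    end;
    create trigger _metadata_delete after delete on _metadata
    begin
        update _metadata_version set version = version + 1;
    end;
''')
# Any write to _metadata bumps the counter, like schema_version does
conn.execute('insert into _metadata values (null, ?, null, ?)', ['t', '{}'])
conn.execute('delete from _metadata')
(version,) = conn.execute('select version from _metadata_version').fetchone()
```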
Such a mechanism would have applications outside of just this `_metadata` system. The ability to attach a version number to any table and have it automatically incremented when that table changes (via triggers) could help with all kinds of other Datasette-at-scale problems, including things like cached table counts.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-753524779,https://api.github.com/repos/simonw/datasette/issues/1168,753524779,MDEyOklzc3VlQ29tbWVudDc1MzUyNDc3OQ==,9599,simonw,2021-01-02T20:19:26Z,2021-01-02T20:19:26Z,OWNER,Idea: version the metadata scheme. If the table is called `_metadata_v1` it gives me a clear path to designing a new scheme in the future.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1168#issuecomment-834636796,https://api.github.com/repos/simonw/datasette/issues/1168,834636796,MDEyOklzc3VlQ29tbWVudDgzNDYzNjc5Ng==,9599,simonw,2021-05-07T17:22:52Z,2021-05-07T17:22:52Z,OWNER,Related: Here's an implementation of a `get_metadata()` plugin hook by @brandonrobertz https://github.com/next-LI/datasette/commit/3fd8ce91f3108c82227bf65ff033929426c60437,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables,
https://github.com/simonw/datasette/issues/1169#issuecomment-753653033,https://api.github.com/repos/simonw/datasette/issues/1169,753653033,MDEyOklzc3VlQ29tbWVudDc1MzY1MzAzMw==,9599,simonw,2021-01-03T17:52:53Z,2021-01-03T17:52:53Z,OWNER,"Oh that's so frustrating! I was worried about that - I spotted a few runs that seemed faster and hoped that it meant that the package was coming out of the `~/.npm` cache, but evidently that's not the case.
You've convinced me that Datasette itself should have a `package.json` - the Dependabot argument is a really good one.
But... I'd really love to figure out a general pattern for using `npx` scripts in GitHub Actions workflows in a cache-friendly way. I have plenty of other projects that I'd love to run Prettier or Uglify or `puppeteer-cli` in without adding a `package.json` to them.
Any ideas? The best I can think of is for the workflow itself to write out a `package.json` file (using `echo '{ ... }' > package.json`) as part of the run - that way the cache should work (I think) but I don't get a misleading `package.json` file sitting in the repo.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777677671,Prettier package not actually being cached,
https://github.com/simonw/datasette/issues/1169#issuecomment-753653260,https://api.github.com/repos/simonw/datasette/issues/1169,753653260,MDEyOklzc3VlQ29tbWVudDc1MzY1MzI2MA==,9599,simonw,2021-01-03T17:54:40Z,2021-01-03T17:54:40Z,OWNER,And @benpickles yes I would land that pull request straight away as-is. Thanks!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777677671,Prettier package not actually being cached,
https://github.com/simonw/datasette/issues/1169#issuecomment-753657180,https://api.github.com/repos/simonw/datasette/issues/1169,753657180,MDEyOklzc3VlQ29tbWVudDc1MzY1NzE4MA==,9599,simonw,2021-01-03T18:23:30Z,2021-01-03T18:23:30Z,OWNER,"Also welcome in that PR would be a bit of documentation for contributors, see #1167 - but no problem if you leave that out, I'm happy to add it later.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777677671,Prettier package not actually being cached,
https://github.com/simonw/datasette/issues/1171#issuecomment-754286618,https://api.github.com/repos/simonw/datasette/issues/1171,754286618,MDEyOklzc3VlQ29tbWVudDc1NDI4NjYxOA==,9599,simonw,2021-01-04T23:37:45Z,2021-01-04T23:37:45Z,OWNER,https://github.com/actions/virtual-environments/issues/1820#issuecomment-719549887 looks useful - not sure if those notes are for iOS or macOS though.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables,
https://github.com/simonw/datasette/issues/1171#issuecomment-754286783,https://api.github.com/repos/simonw/datasette/issues/1171,754286783,MDEyOklzc3VlQ29tbWVudDc1NDI4Njc4Mw==,9599,simonw,2021-01-04T23:38:18Z,2021-01-04T23:38:18Z,OWNER,Oh wow maybe I need to Notarize it too? https://developer.apple.com/documentation/xcode/notarizing_macos_software_before_distribution,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables,
https://github.com/simonw/datasette/issues/1171#issuecomment-754287882,https://api.github.com/repos/simonw/datasette/issues/1171,754287882,MDEyOklzc3VlQ29tbWVudDc1NDI4Nzg4Mg==,9599,simonw,2021-01-04T23:40:10Z,2021-01-04T23:42:32Z,OWNER,"This looks VERY useful: https://github.com/mitchellh/gon - "" Sign, notarize, and package macOS CLI tools and applications written in any language. Available as both a CLI and a Go library.""
And it installs like this:
brew install mitchellh/gon/gon","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables,
https://github.com/simonw/datasette/issues/1171#issuecomment-754295380,https://api.github.com/repos/simonw/datasette/issues/1171,754295380,MDEyOklzc3VlQ29tbWVudDc1NDI5NTM4MA==,9599,simonw,2021-01-04T23:54:32Z,2021-01-04T23:54:32Z,OWNER,"https://github.com/search?l=YAML&q=gon+json&type=Code reveals some examples of people using `gon` in workflows.
These look useful:
* https://github.com/coherence/hub-server/blob/3b7e9c7c5bce9e244b14b854f1f89d66f53a5a39/.github/workflows/release_build.yml
* https://github.com/simoncozens/pilcrow/blob/5abc145e7fb9577086afe47b48fd730cb8195386/.github/workflows/buildapp.yaml","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables,
https://github.com/simonw/datasette/issues/1171#issuecomment-754296761,https://api.github.com/repos/simonw/datasette/issues/1171,754296761,MDEyOklzc3VlQ29tbWVudDc1NDI5Njc2MQ==,9599,simonw,2021-01-04T23:55:44Z,2021-01-04T23:55:44Z,OWNER,Bit uncomfortable that it looks like you need to include your Apple ID username and password in the CI configuration to do this. I'll use GitHub Secrets for this but I don't like it - I'll definitely setup a dedicated code signing account that's not my access-to-everything AppleID for this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables,
https://github.com/simonw/datasette/issues/1171#issuecomment-754958998,https://api.github.com/repos/simonw/datasette/issues/1171,754958998,MDEyOklzc3VlQ29tbWVudDc1NDk1ODk5OA==,9599,simonw,2021-01-05T23:16:33Z,2021-01-05T23:16:33Z,OWNER,"That's really useful, thanks @rcoup ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables,
https://github.com/simonw/datasette/issues/1171#issuecomment-756335394,https://api.github.com/repos/simonw/datasette/issues/1171,756335394,MDEyOklzc3VlQ29tbWVudDc1NjMzNTM5NA==,9599,simonw,2021-01-07T19:35:59Z,2021-01-07T19:35:59Z,OWNER,"I requested a D-U-N-S number as a first step in getting a developer certificate: https://developer.apple.com/support/D-U-N-S/
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables,
https://github.com/simonw/datasette/issues/1173#issuecomment-754463845,https://api.github.com/repos/simonw/datasette/issues/1173,754463845,MDEyOklzc3VlQ29tbWVudDc1NDQ2Mzg0NQ==,9599,simonw,2021-01-05T07:41:43Z,2021-01-05T07:41:43Z,OWNER,"https://github.com/oleksis/pyinstaller-manylinux looks useful, via https://twitter.com/oleksis/status/1346341987876823040","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778682317,GitHub Actions workflow to build manylinux binary,
https://github.com/simonw/datasette/issues/1175#issuecomment-754696725,https://api.github.com/repos/simonw/datasette/issues/1175,754696725,MDEyOklzc3VlQ29tbWVudDc1NDY5NjcyNQ==,9599,simonw,2021-01-05T15:12:30Z,2021-01-05T15:12:30Z,OWNER,Some tips here: https://github.com/tiangolo/fastapi/issues/78,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779156520,Use structlog for logging,
https://github.com/simonw/datasette/issues/1176#issuecomment-1030762279,https://api.github.com/repos/simonw/datasette/issues/1176,1030762279,IC_kwDOBm6k_c49cC8n,9599,simonw,2022-02-06T06:38:08Z,2022-02-06T06:41:37Z,OWNER,"Might do this using Sphinx auto-generated function and class documentation hooks, as seen here in `sqlite-utils`: https://sqlite-utils.datasette.io/en/stable/python-api.html#spatialite-helpers
This would encourage me to add really good docstrings.
```
.. _python_api_gis_find_spatialite:

Finding SpatiaLite
------------------

.. autofunction:: sqlite_utils.utils.find_spatialite
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions",
https://github.com/simonw/datasette/issues/1176#issuecomment-1031108559,https://api.github.com/repos/simonw/datasette/issues/1176,1031108559,IC_kwDOBm6k_c49dXfP,9599,simonw,2022-02-07T06:11:27Z,2022-02-07T06:11:27Z,OWNER,I'm going with `@documented` as the decorator for functions that should be documented.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions",
https://github.com/simonw/datasette/issues/1176#issuecomment-1031122800,https://api.github.com/repos/simonw/datasette/issues/1176,1031122800,IC_kwDOBm6k_c49da9w,9599,simonw,2022-02-07T06:34:21Z,2022-02-07T06:34:21Z,OWNER,"New section is here: https://docs.datasette.io/en/latest/internals.html#the-datasette-utils-module
But it's not correctly displaying the new autodoc stuff:
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions",
https://github.com/simonw/datasette/issues/1176#issuecomment-1031123719,https://api.github.com/repos/simonw/datasette/issues/1176,1031123719,IC_kwDOBm6k_c49dbMH,9599,simonw,2022-02-07T06:36:32Z,2022-02-07T06:36:32Z,OWNER,"https://github.com/simonw/sqlite-utils/blob/main/.readthedocs.yaml looks like this (it works correctly):
```yaml
version: 2

sphinx:
  configuration: docs/conf.py

python:
  version: ""3.8""
  install:
    - method: pip
      path: .
      extra_requirements:
        - docs
```
Compare to the current Datasette one here: https://github.com/simonw/datasette/blob/d9b508ffaa91f9f1840b366f5d282712d445f16b/.readthedocs.yaml#L1-L13
Looks like I need this bit:
```yaml
python:
  version: ""3.8""
  install:
    - method: pip
      path: .
      extra_requirements:
        - docs
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions",
https://github.com/simonw/datasette/issues/1176#issuecomment-1031125347,https://api.github.com/repos/simonw/datasette/issues/1176,1031125347,IC_kwDOBm6k_c49dblj,9599,simonw,2022-02-07T06:40:16Z,2022-02-07T06:40:16Z,OWNER,"Read The Docs error:
> Problem in your project's configuration. Invalid ""python.version"": .readthedocs.yaml: Invalid configuration option: python.version. Make sure the key name is correct.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions",
https://github.com/simonw/datasette/issues/1176#issuecomment-1031126547,https://api.github.com/repos/simonw/datasette/issues/1176,1031126547,IC_kwDOBm6k_c49db4T,9599,simonw,2022-02-07T06:42:58Z,2022-02-07T06:42:58Z,OWNER,"That fixed it: https://docs.datasette.io/en/latest/internals.html#parse-metadata-content
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions",
https://github.com/simonw/datasette/issues/1176#issuecomment-1031126801,https://api.github.com/repos/simonw/datasette/issues/1176,1031126801,IC_kwDOBm6k_c49db8R,9599,simonw,2022-02-07T06:43:31Z,2022-02-07T06:43:31Z,OWNER,Here's the new test: https://github.com/simonw/datasette/blob/03305ea183b1534bc4cef3a721fe5f3700273b84/tests/test_docs.py#L91-L104,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions",
https://github.com/simonw/datasette/issues/1176#issuecomment-754951786,https://api.github.com/repos/simonw/datasette/issues/1176,754951786,MDEyOklzc3VlQ29tbWVudDc1NDk1MTc4Ng==,9599,simonw,2021-01-05T22:56:27Z,2021-01-05T22:56:43Z,OWNER,"Idea: introduce a `@documented` decorator which marks specific functions as part of the public, documented API. The unit tests can then confirm that anything with that decorator is both documented and tested.
```python
@documented
def escape_css_string(s):
    return _css_re.sub(
        lambda m: ""\\"" + (f""{ord(m.group()):X}"".zfill(6)),
        s.replace(""\r\n"", ""\n""),
    )
```
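A minimal version of that decorator only needs to record what it wraps, so a test can later check every registered function is documented - just a sketch:

```python
# Sketch: record every decorated function in a set that a unit test
# can iterate over to confirm documentation and test coverage
documented_functions = set()

def documented(fn):
    documented_functions.add(fn)
    return fn

@documented
def example():
    return 1
```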
Or maybe `@public`?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions",
https://github.com/simonw/datasette/issues/1176#issuecomment-754952040,https://api.github.com/repos/simonw/datasette/issues/1176,754952040,MDEyOklzc3VlQ29tbWVudDc1NDk1MjA0MA==,9599,simonw,2021-01-05T22:57:09Z,2021-01-05T22:57:09Z,OWNER,It might be neater to move all of the non-public functions into a separate module - `datasette.utils.internal` perhaps.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions",
https://github.com/simonw/datasette/issues/1176#issuecomment-754952146,https://api.github.com/repos/simonw/datasette/issues/1176,754952146,MDEyOklzc3VlQ29tbWVudDc1NDk1MjE0Ng==,9599,simonw,2021-01-05T22:57:26Z,2021-01-05T22:57:26Z,OWNER,Known public APIs might be worth adding type annotations to as well.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions",
https://github.com/simonw/datasette/issues/1176#issuecomment-754957378,https://api.github.com/repos/simonw/datasette/issues/1176,754957378,MDEyOklzc3VlQ29tbWVudDc1NDk1NzM3OA==,9599,simonw,2021-01-05T23:12:03Z,2021-01-05T23:12:03Z,OWNER,This needs to be done for Datasette 1.0. At the very least I need to ensure it's clear that `datasette.utils` is not part of the public API unless explicitly marked as such.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions",
https://github.com/simonw/datasette/issues/1176#issuecomment-755159583,https://api.github.com/repos/simonw/datasette/issues/1176,755159583,MDEyOklzc3VlQ29tbWVudDc1NTE1OTU4Mw==,9599,simonw,2021-01-06T08:28:20Z,2021-01-06T08:28:20Z,OWNER,I used `from datasette.utils import path_with_format` in https://github.com/simonw/datasette-export-notebook/blob/0.1/datasette_export_notebook/__init__.py just now.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions",
https://github.com/simonw/datasette/issues/1176#issuecomment-969616626,https://api.github.com/repos/simonw/datasette/issues/1176,969616626,IC_kwDOBm6k_c45yyzy,9599,simonw,2021-11-16T01:29:13Z,2021-11-16T01:29:13Z,OWNER,"I'm inclined to create a Sphinx reference documentation page for this, as I did for `sqlite-utils` here: https://sqlite-utils.datasette.io/en/stable/reference.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions",
https://github.com/simonw/datasette/issues/1177#issuecomment-1074017633,https://api.github.com/repos/simonw/datasette/issues/1177,1074017633,IC_kwDOBm6k_c5ABDVh,9599,simonw,2022-03-21T15:08:51Z,2022-03-21T15:08:51Z,OWNER,"Related:
- #1062 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780153562,Ability to stream all rows as newline-delimited JSON,
https://github.com/simonw/datasette/issues/1178#issuecomment-755156606,https://api.github.com/repos/simonw/datasette/issues/1178,755156606,MDEyOklzc3VlQ29tbWVudDc1NTE1NjYwNg==,9599,simonw,2021-01-06T08:21:49Z,2021-01-06T08:21:49Z,OWNER,"https://github.com/simonw/datasette-export-notebook/blob/aec398eab4f34791d240d7bc47b6eec575b357be/datasette_export_notebook/__init__.py#L18-L23
Maybe this is a bug in `datasette.absolute_url`? Perhaps it doesn't take the scheme into account.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run,
https://github.com/simonw/datasette/issues/1178#issuecomment-755157066,https://api.github.com/repos/simonw/datasette/issues/1178,755157066,MDEyOklzc3VlQ29tbWVudDc1NTE1NzA2Ng==,9599,simonw,2021-01-06T08:22:47Z,2021-01-06T08:22:47Z,OWNER,"Weird... https://github.com/simonw/datasette/blob/a882d679626438ba0d809944f06f239bcba8ee96/datasette/app.py#L609-L613
```python
def absolute_url(self, request, path):
url = urllib.parse.urljoin(request.url, path)
if url.startswith(""http://"") and self.setting(""force_https_urls""):
url = ""https://"" + url[len(""http://"") :]
return url
```
That looks like it should work. Needs more digging.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run,
https://github.com/simonw/datasette/issues/1178#issuecomment-755157281,https://api.github.com/repos/simonw/datasette/issues/1178,755157281,MDEyOklzc3VlQ29tbWVudDc1NTE1NzI4MQ==,9599,simonw,2021-01-06T08:23:14Z,2021-01-06T08:23:14Z,OWNER,"https://latest-with-plugins.datasette.io/-/settings says `""force_https_urls"": false`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run,
https://github.com/simonw/datasette/issues/1178#issuecomment-755157732,https://api.github.com/repos/simonw/datasette/issues/1178,755157732,MDEyOklzc3VlQ29tbWVudDc1NTE1NzczMg==,9599,simonw,2021-01-06T08:24:12Z,2021-01-06T08:24:12Z,OWNER,https://latest-with-plugins.datasette.io/fixtures/sortable.json has the bug too - the `next_url` is `http://` when it should be `https://`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run,
https://github.com/simonw/datasette/issues/1178#issuecomment-755158310,https://api.github.com/repos/simonw/datasette/issues/1178,755158310,MDEyOklzc3VlQ29tbWVudDc1NTE1ODMxMA==,9599,simonw,2021-01-06T08:25:31Z,2021-01-06T08:25:31Z,OWNER,Moving this to the Datasette repo.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run,
https://github.com/simonw/datasette/issues/1178#issuecomment-755160187,https://api.github.com/repos/simonw/datasette/issues/1178,755160187,MDEyOklzc3VlQ29tbWVudDc1NTE2MDE4Nw==,9599,simonw,2021-01-06T08:29:35Z,2021-01-06T08:29:35Z,OWNER,"https://latest-with-plugins.datasette.io/-/asgi-scope
```
{'asgi': {'spec_version': '2.1', 'version': '3.0'},
'client': ('169.254.8.129', 54971),
'headers': [(b'host', b'latest-with-plugins.datasette.io'),
(b'user-agent',
b'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:84.0) Gecko'
b'/20100101 Firefox/84.0'),
(b'accept',
b'text/html,application/xhtml+xml,application/xml;q=0.9,image/'
b'webp,*/*;q=0.8'),
(b'accept-language', b'en-US,en;q=0.5'),
(b'dnt', b'1'),
(b'cookie',
b'_ga_LL6M7BK6D4=GS1.1.1609886546.49.1.1609886923.0; _ga=GA1.1'
b'.894633707.1607575712'),
(b'upgrade-insecure-requests', b'1'),
(b'x-client-data', b'CgSL6ZsV'),
(b'x-cloud-trace-context',
b'e776af843c657d2a3da28a73b726e6fe/14187666787557102189;o=1'),
(b'x-forwarded-for', b'148.64.98.14'),
(b'x-forwarded-proto', b'https'),
(b'forwarded', b'for=""148.64.98.14"";proto=https'),
(b'accept-encoding', b'gzip, deflate, br'),
(b'content-length', b'0')],
'http_version': '1.1',
'method': 'GET',
'path': '/-/asgi-scope',
'query_string': b'',
'raw_path': b'/-/asgi-scope',
'root_path': '',
'scheme': 'http',
'server': ('169.254.8.130', 8080),
'type': 'http'}
```
Note the `'scheme': 'http'` but also the `(b'x-forwarded-proto', b'https')`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run,
https://github.com/simonw/datasette/issues/1178#issuecomment-755163886,https://api.github.com/repos/simonw/datasette/issues/1178,755163886,MDEyOklzc3VlQ29tbWVudDc1NTE2Mzg4Ng==,9599,simonw,2021-01-06T08:37:51Z,2021-01-06T08:37:51Z,OWNER,"Easiest fix would be for `publish cloudrun` to set `force_https_urls`:
`datasette publish now` used to do this: https://github.com/simonw/datasette/blob/07e208cc6d9e901b87552c1be2854c220b3f9b6d/datasette/publish/now.py#L59-L63","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run,
https://github.com/simonw/datasette/issues/1178#issuecomment-755468795,https://api.github.com/repos/simonw/datasette/issues/1178,755468795,MDEyOklzc3VlQ29tbWVudDc1NTQ2ODc5NQ==,9599,simonw,2021-01-06T18:14:35Z,2021-01-06T18:14:35Z,OWNER,Deploying that change now to test it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run,
https://github.com/simonw/datasette/issues/1178#issuecomment-755476820,https://api.github.com/repos/simonw/datasette/issues/1178,755476820,MDEyOklzc3VlQ29tbWVudDc1NTQ3NjgyMA==,9599,simonw,2021-01-06T18:24:47Z,2021-01-06T18:24:47Z,OWNER,"Issue fixed - https://latest-with-plugins.datasette.io/github/issue_comments.Notebook?_labels=on displays the correct schemes now.
I can't think of a reason anyone on Cloud Run would ever NOT want the `force_https_urls` option, but just in case I've made it so if you pass `--extra-options --setting force_https_urls off` to `publish cloudrun` your setting will be respected.
https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/publish/cloudrun.py#L105-L110","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run,
https://github.com/simonw/datasette/issues/1179#issuecomment-755161574,https://api.github.com/repos/simonw/datasette/issues/1179,755161574,MDEyOklzc3VlQ29tbWVudDc1NTE2MTU3NA==,9599,simonw,2021-01-06T08:32:31Z,2021-01-06T08:32:31Z,OWNER,An optional `path` argument to https://docs.datasette.io/en/stable/plugin_hooks.html#register-output-renderer-datasette which shows the path WITHOUT the `.Notebook` extension would be useful here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks,
https://github.com/simonw/datasette/issues/1179#issuecomment-755486103,https://api.github.com/repos/simonw/datasette/issues/1179,755486103,MDEyOklzc3VlQ29tbWVudDc1NTQ4NjEwMw==,9599,simonw,2021-01-06T18:32:41Z,2021-01-06T18:34:11Z,OWNER,"This parameter will return the URL path, with querystring arguments, to the HTML version of the page - e.g. `/github/issue_comments` or `/github/issue_comments?_sort_desc=created_at`
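As a rough standalone sketch (plain stdlib, not the eventual Datasette implementation) of the distinction between the two proposed values:

```python
from urllib.parse import urlsplit

# Hypothetical names: 'path' is just the URL path, 'full_path' adds
# the querystring back on when one is present.
url = 'https://latest-with-plugins.datasette.io/github/issue_comments?_sort_desc=created_at'
parts = urlsplit(url)
path = parts.path
full_path = path + ('?' + parts.query if parts.query else '')
assert path == '/github/issue_comments'
assert full_path == '/github/issue_comments?_sort_desc=created_at'
```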
Open questions:
- What should it be called? `path` could be misleading since it also includes the querystring.
- Should I provide a `url` or `full_url` version which includes `https://blah.com/...`?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks,
https://github.com/simonw/datasette/issues/1179#issuecomment-755489974,https://api.github.com/repos/simonw/datasette/issues/1179,755489974,MDEyOklzc3VlQ29tbWVudDc1NTQ4OTk3NA==,9599,simonw,2021-01-06T18:35:24Z,2021-01-06T18:35:24Z,OWNER,Django calls this `HttpRequest.get_full_path()` - for the path plus the querystring.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks,
https://github.com/simonw/datasette/issues/1179#issuecomment-755492945,https://api.github.com/repos/simonw/datasette/issues/1179,755492945,MDEyOklzc3VlQ29tbWVudDc1NTQ5Mjk0NQ==,9599,simonw,2021-01-06T18:37:39Z,2021-01-06T18:37:39Z,OWNER,I think I'll call this `full_path` for consistency with Django.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks,
https://github.com/simonw/datasette/issues/1179#issuecomment-755495387,https://api.github.com/repos/simonw/datasette/issues/1179,755495387,MDEyOklzc3VlQ29tbWVudDc1NTQ5NTM4Nw==,9599,simonw,2021-01-06T18:39:23Z,2021-01-06T18:39:23Z,OWNER,"In that case maybe there are three new arguments: `path`, `full_path` and `url`.
I'll also add `request.full_path` for consistency with these: https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/utils/asgi.py#L77-L90","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks,
https://github.com/simonw/datasette/issues/1179#issuecomment-766434629,https://api.github.com/repos/simonw/datasette/issues/1179,766434629,MDEyOklzc3VlQ29tbWVudDc2NjQzNDYyOQ==,9599,simonw,2021-01-24T21:23:47Z,2021-01-24T21:23:47Z,OWNER,I'm just going to do `path` and `full_path` (which includes the querystring)`. The `datasette.absolute_url()` method can be used by plugins that need the full URL.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks,
https://github.com/simonw/datasette/issues/1179#issuecomment-766484257,https://api.github.com/repos/simonw/datasette/issues/1179,766484257,MDEyOklzc3VlQ29tbWVudDc2NjQ4NDI1Nw==,9599,simonw,2021-01-25T01:30:57Z,2021-01-25T01:30:57Z,OWNER,"The challenge here is figuring out what the original path, without the `.format`, actually was - while taking into account that Datasette has a special case for tables that themselves end in a `.something`.
The `path_with_format()` function nearly does what we need here:
https://github.com/simonw/datasette/blob/b6a7b58fa01af0cd5a5e94bd17d686d283a46819/datasette/utils/__init__.py#L710-L729
It can be called with `replace_format=""csv""` to REMOVE the `.csv` format and replace it with something else.
Problem is, we want to use it to get rid of the format entirely.
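A minimal sketch of what removing the format entirely might look like - the helper name and format list here are hypothetical, not the actual `path_with_format()` API:

```python
def strip_format(path, known_formats=('csv', 'json', 'notebook')):
    # Hypothetical: drop a trailing .format only when the extension is a
    # registered renderer, so table names that contain a dot survive.
    base, _, ext = path.rpartition('.')
    if base and ext.lower() in known_formats:
        return base
    return path

assert strip_format('/github/issue_comments.csv') == '/github/issue_comments'
assert strip_format('/db/table.v2') == '/db/table.v2'
```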
We could update `path_with_format()` to accept `format=''` to mean ""remove the format entirely"", but it's a bit messy. It may be better to reconsider the design of `path_with_format()` and related utility functions entirely.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks,
https://github.com/simonw/datasette/issues/1179#issuecomment-766484435,https://api.github.com/repos/simonw/datasette/issues/1179,766484435,MDEyOklzc3VlQ29tbWVudDc2NjQ4NDQzNQ==,9599,simonw,2021-01-25T01:31:36Z,2021-01-25T01:31:36Z,OWNER,Relevant existing tests: https://github.com/simonw/datasette/blob/461670a0b87efa953141b449a9a261919864ceb3/tests/test_utils.py#L365-L398,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks,
https://github.com/simonw/datasette/issues/1180#issuecomment-755500475,https://api.github.com/repos/simonw/datasette/issues/1180,755500475,MDEyOklzc3VlQ29tbWVudDc1NTUwMDQ3NQ==,9599,simonw,2021-01-06T18:43:41Z,2021-01-06T18:43:41Z,OWNER,Relevant code: https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/utils/__init__.py#L919-L940,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780767542,Lazily evaluated arguments for call_with_supported_arguments,
https://github.com/simonw/datasette/issues/1180#issuecomment-756312213,https://api.github.com/repos/simonw/datasette/issues/1180,756312213,MDEyOklzc3VlQ29tbWVudDc1NjMxMjIxMw==,9599,simonw,2021-01-07T18:56:24Z,2021-01-07T18:56:24Z,OWNER,The `async_call_with_supported_arguments` version should be able to await any of the lazy arguments that are awaitable - can use `await_me_maybe` for that.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780767542,Lazily evaluated arguments for call_with_supported_arguments,
https://github.com/simonw/datasette/issues/1181#issuecomment-756482163,https://api.github.com/repos/simonw/datasette/issues/1181,756482163,MDEyOklzc3VlQ29tbWVudDc1NjQ4MjE2Mw==,9599,simonw,2021-01-08T01:06:23Z,2021-01-08T01:06:54Z,OWNER,"Yes, that logic is definitely at fault. It looks like it applies `urllib.parse.unquote_plus()` AFTER it's tried to do the `-` hash splitting thing, which is why it's failing here:
https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/views/base.py#L184-L198","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",781262510,"Certain database names results in 404: ""Database not found: None""",
https://github.com/simonw/datasette/issues/1181#issuecomment-756487966,https://api.github.com/repos/simonw/datasette/issues/1181,756487966,MDEyOklzc3VlQ29tbWVudDc1NjQ4Nzk2Ng==,9599,simonw,2021-01-08T01:25:42Z,2021-01-08T01:25:42Z,OWNER,I'm going to add a unit test that tries a variety of weird database names.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",781262510,"Certain database names results in 404: ""Database not found: None""",
https://github.com/simonw/datasette/issues/1181#issuecomment-761703555,https://api.github.com/repos/simonw/datasette/issues/1181,761703555,MDEyOklzc3VlQ29tbWVudDc2MTcwMzU1NQ==,9599,simonw,2021-01-17T00:24:20Z,2021-01-17T00:24:40Z,OWNER,"Here's the incomplete sketch of a test - to go at the bottom of `test_cli.py`.
```python
@pytest.mark.parametrize(
""filename"", [""test-database (1).sqlite"", ""database (1).sqlite""]
)
def test_weird_database_names(ensure_eventloop, tmpdir, filename):
# https://github.com/simonw/datasette/issues/1181
runner = CliRunner()
db_path = str(tmpdir / filename)
sqlite3.connect(db_path).execute(""vacuum"")
result1 = runner.invoke(cli, [db_path, ""--get"", ""/""])
assert result1.exit_code == 0, result1.output
homepage_html = result1.output
    assert False  # TODO: assert the database name is linked from homepage_html
```
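As a standalone illustration (stdlib only, outside Datasette) of the round-trip these filenames require, decoding has to happen before any splitting on the path:

```python
import urllib.parse

# 'database (1).sqlite' produces a database name containing a space
# and parentheses, which arrives URL-encoded in the request path.
name = 'database (1)'
encoded = urllib.parse.quote_plus(name)
assert encoded == 'database+%281%29'
# unquote_plus round-trips the name exactly, so decode first, then
# do any '-' hash-splitting or '.' extension handling.
assert urllib.parse.unquote_plus(encoded) == name
```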
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",781262510,"Certain database names results in 404: ""Database not found: None""",
https://github.com/simonw/datasette/issues/1182#issuecomment-757373082,https://api.github.com/repos/simonw/datasette/issues/1182,757373082,MDEyOklzc3VlQ29tbWVudDc1NzM3MzA4Mg==,9599,simonw,2021-01-09T21:55:33Z,2021-01-09T21:55:33Z,OWNER,I'll leave the page there but change it into more of a blurb about the existence of the plugins and tools directories.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",782692159,"Retire ""Ecosystem"" page in favour of datasette.io/plugins and /tools",
https://github.com/simonw/datasette/issues/1182#issuecomment-757373741,https://api.github.com/repos/simonw/datasette/issues/1182,757373741,MDEyOklzc3VlQ29tbWVudDc1NzM3Mzc0MQ==,9599,simonw,2021-01-09T22:01:41Z,2021-01-09T22:01:41Z,OWNER,It can talk about Dogsheep too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",782692159,"Retire ""Ecosystem"" page in favour of datasette.io/plugins and /tools",
https://github.com/simonw/datasette/issues/1182#issuecomment-757375858,https://api.github.com/repos/simonw/datasette/issues/1182,757375858,MDEyOklzc3VlQ29tbWVudDc1NzM3NTg1OA==,9599,simonw,2021-01-09T22:18:47Z,2021-01-09T22:18:47Z,OWNER,https://docs.datasette.io/en/latest/ecosystem.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",782692159,"Retire ""Ecosystem"" page in favour of datasette.io/plugins and /tools",
https://github.com/simonw/datasette/issues/1183#issuecomment-758356097,https://api.github.com/repos/simonw/datasette/issues/1183,758356097,MDEyOklzc3VlQ29tbWVudDc1ODM1NjA5Nw==,9599,simonw,2021-01-12T02:40:30Z,2021-01-12T02:40:30Z,OWNER,"So how would this work?
I think I'm going to automatically use these values if the `_counts` table exists, unless an `ignore_counts_table` boolean setting has been set. I won't bother looking to see if the triggers have been created.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",782708469,"Take advantage of sqlite-utils cached table counts, if available",
https://github.com/simonw/datasette/issues/1183#issuecomment-758356640,https://api.github.com/repos/simonw/datasette/issues/1183,758356640,MDEyOklzc3VlQ29tbWVudDc1ODM1NjY0MA==,9599,simonw,2021-01-12T02:42:08Z,2021-01-12T02:42:08Z,OWNER,"Should Datasette have subcommands for this? `datasette enable-counts data.db` and `datasette disable-counts data.db` and `datasette reset-counts data.db` commands?
Maybe. The `sqlite-utils` CLI tool could be used here instead, but that won't be easily available if Datasette was installed as a standalone binary or using `brew install datasette` or `pipx install datasette`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",782708469,"Take advantage of sqlite-utils cached table counts, if available",
https://github.com/simonw/datasette/issues/1185#issuecomment-759066777,https://api.github.com/repos/simonw/datasette/issues/1185,759066777,MDEyOklzc3VlQ29tbWVudDc1OTA2Njc3Nw==,9599,simonw,2021-01-12T22:07:58Z,2021-01-12T22:07:58Z,OWNER,"https://docs.datasette.io/en/stable/sql_queries.html?highlight=pragma#named-parameters documentation is out-of-date as well:
> Datasette disallows custom SQL containing the string PRAGMA, as SQLite pragma statements can be used to change database settings at runtime. If you need to include the string ""pragma"" in a query you can do so safely using a named parameter.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",784628163,"""Statement may not contain PRAGMA"" error is not strictly true",
https://github.com/simonw/datasette/issues/1185#issuecomment-759067427,https://api.github.com/repos/simonw/datasette/issues/1185,759067427,MDEyOklzc3VlQ29tbWVudDc1OTA2NzQyNw==,9599,simonw,2021-01-12T22:09:21Z,2021-01-12T22:09:21Z,OWNER,"That allow-list was added in #761 but is not currently documented. It's here in the code:
https://github.com/simonw/datasette/blob/8e8fc5cee5c78da8334495c6d6257d5612c40792/datasette/utils/__init__.py#L173-L186","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",784628163,"""Statement may not contain PRAGMA"" error is not strictly true",
https://github.com/simonw/datasette/issues/1185#issuecomment-759069342,https://api.github.com/repos/simonw/datasette/issues/1185,759069342,MDEyOklzc3VlQ29tbWVudDc1OTA2OTM0Mg==,9599,simonw,2021-01-12T22:13:18Z,2021-01-12T22:13:18Z,OWNER,I'm going to change the error message to list the allowed pragmas.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",784628163,"""Statement may not contain PRAGMA"" error is not strictly true",
https://github.com/simonw/datasette/issues/1186#issuecomment-759874332,https://api.github.com/repos/simonw/datasette/issues/1186,759874332,MDEyOklzc3VlQ29tbWVudDc1OTg3NDMzMg==,9599,simonw,2021-01-14T01:59:35Z,2021-01-14T01:59:35Z,OWNER,Updated documentation: https://docs.datasette.io/en/latest/custom_templates.html#custom-css-and-javascript and https://docs.datasette.io/en/latest/plugin_hooks.html#extra-js-urls-template-database-table-columns-view-name-request-datasette,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",785573793,"script type=""module"" support",
https://github.com/simonw/datasette/issues/1187#issuecomment-759875239,https://api.github.com/repos/simonw/datasette/issues/1187,759875239,MDEyOklzc3VlQ29tbWVudDc1OTg3NTIzOQ==,9599,simonw,2021-01-14T02:02:24Z,2021-01-14T02:02:31Z,OWNER,"This plugin hook currently returns a string of JavaScript. It could optionally return this instead of a string:
```json
{
""script"": ""string of JavaScript goes here"",
""module"": true
}
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",785588942,"extra_body_script() support for script type=""module""",
https://github.com/simonw/datasette/issues/119#issuecomment-639047315,https://api.github.com/repos/simonw/datasette/issues/119,639047315,MDEyOklzc3VlQ29tbWVudDYzOTA0NzMxNQ==,9599,simonw,2020-06-04T18:46:39Z,2020-06-04T18:46:39Z,OWNER,"The OAuth dance needed for this is a pretty nasty barrier to plugin installation and configuration.
I'm going to focus on making it easy to copy and paste data into sheets instead.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275082158,"Build an ""export this data to google sheets"" plugin",
https://github.com/simonw/datasette/issues/1190#issuecomment-1699925224,https://api.github.com/repos/simonw/datasette/issues/1190,1699925224,IC_kwDOBm6k_c5lUszo,9599,simonw,2023-08-30T22:16:38Z,2023-08-30T22:16:38Z,OWNER,"This is going to happen in this tool instead:
- https://github.com/simonw/dclient","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098146,`datasette publish upload` mechanism for uploading databases to an existing Datasette instance,
https://github.com/simonw/datasette/issues/1190#issuecomment-766430644,https://api.github.com/repos/simonw/datasette/issues/1190,766430644,MDEyOklzc3VlQ29tbWVudDc2NjQzMDY0NA==,9599,simonw,2021-01-24T20:57:03Z,2021-01-24T20:57:03Z,OWNER,"I really like this idea. It feels like an opportunity for a plugin that adds two things: an API endpoint to Datasette for accepting uploaded databases, and a `datasette publish upload` subcommand which can upload files to that endpoint (with some kind of authentication mechanism).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098146,`datasette publish upload` mechanism for uploading databases to an existing Datasette instance,
https://github.com/simonw/datasette/issues/1190#issuecomment-766433153,https://api.github.com/repos/simonw/datasette/issues/1190,766433153,MDEyOklzc3VlQ29tbWVudDc2NjQzMzE1Mw==,9599,simonw,2021-01-24T21:13:25Z,2021-01-24T21:13:25Z,OWNER,"This ties in to a bunch of other ideas that are in flight at the moment.
If you're publishing databases by uploading them, how do you attach metadata? Ideally by baking it into the database file itself, using the mechanism from #1169.
How could this interact with the `datasette insert` concept from #1163? Could you pass a CSV file to the `upload` command and have that converted and uploaded for you, or would you create the database file locally using `datasette insert` and then upload it as a separate `datasette upload` step?
Lots to think about here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098146,`datasette publish upload` mechanism for uploading databases to an existing Datasette instance,
https://github.com/simonw/datasette/issues/1191#issuecomment-761103910,https://api.github.com/repos/simonw/datasette/issues/1191,761103910,MDEyOklzc3VlQ29tbWVudDc2MTEwMzkxMA==,9599,simonw,2021-01-15T18:19:29Z,2021-01-15T18:19:29Z,OWNER,This relates to #987 (documented HTML hooks for JavaScript plugins) but is not quite the same thing.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates,
https://github.com/simonw/datasette/issues/1191#issuecomment-761104933,https://api.github.com/repos/simonw/datasette/issues/1191,761104933,MDEyOklzc3VlQ29tbWVudDc2MTEwNDkzMw==,9599,simonw,2021-01-15T18:21:26Z,2021-12-17T07:03:02Z,OWNER,"Also related: #857 (comprehensive documentation of variables available to templates) - since then the plugin hook could be fed the full template context and use that to do its thing.
Or maybe the plugin hook gets to return the name of a template that should be `{% include %}`d into the page at that point? But the plugin may want to add extra context that is available to that template include.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates,
https://github.com/simonw/datasette/issues/1191#issuecomment-761703022,https://api.github.com/repos/simonw/datasette/issues/1191,761703022,MDEyOklzc3VlQ29tbWVudDc2MTcwMzAyMg==,9599,simonw,2021-01-17T00:20:00Z,2021-01-17T00:20:00Z,OWNER,"Plugins that want to provide extra context to the template can already do so using the `extra_template_vars()` plugin hook.
So this hook could work by returning a list of template filenames to be included. Those templates can be bundled with the plugin. Since they are included they will have access to the template context and to any `extra_template_vars()` values.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates,
https://github.com/simonw/datasette/issues/1191#issuecomment-761703232,https://api.github.com/repos/simonw/datasette/issues/1191,761703232,MDEyOklzc3VlQ29tbWVudDc2MTcwMzIzMg==,9599,simonw,2021-01-17T00:21:31Z,2021-01-17T00:21:54Z,OWNER,"I think this ends up being a whole collection of new plugin hooks, something like:
- `include_table_top`
- `include_table_bottom`
- `include_row_top`
- `include_row_bottom`
- `include_database_top`
- `include_database_bottom`
- `include_query_top`
- `include_query_bottom`
- `include_index_top`
- `include_index_bottom`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates,
https://github.com/simonw/datasette/issues/1191#issuecomment-761703368,https://api.github.com/repos/simonw/datasette/issues/1191,761703368,MDEyOklzc3VlQ29tbWVudDc2MTcwMzM2OA==,9599,simonw,2021-01-17T00:22:46Z,2021-01-17T00:22:46Z,OWNER,I'm going to prototype this in a branch.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates,
https://github.com/simonw/datasette/issues/1191#issuecomment-761705076,https://api.github.com/repos/simonw/datasette/issues/1191,761705076,MDEyOklzc3VlQ29tbWVudDc2MTcwNTA3Ng==,9599,simonw,2021-01-17T00:35:13Z,2021-01-17T00:37:51Z,OWNER,"I'm going to try using Jinja macros to implement this: https://jinja.palletsprojects.com/en/2.11.x/templates/#macros
Maybe using one of these tricks to auto-load the macro? http://codyaray.com/2015/05/auto-load-jinja2-macros","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates,
https://github.com/simonw/datasette/issues/1191#issuecomment-765757433,https://api.github.com/repos/simonw/datasette/issues/1191,765757433,MDEyOklzc3VlQ29tbWVudDc2NTc1NzQzMw==,9599,simonw,2021-01-22T23:43:43Z,2021-01-22T23:43:43Z,OWNER,"Another potential use for this: plugins that provide authentication (like `datasette-auth-passwords` and `datasette-auth-github`) could use it to add a chunk of HTML to the ""permission denied"" page that links to their mechanism of authenticating.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates,
https://github.com/simonw/datasette/issues/1191#issuecomment-766466030,https://api.github.com/repos/simonw/datasette/issues/1191,766466030,MDEyOklzc3VlQ29tbWVudDc2NjQ2NjAzMA==,9599,simonw,2021-01-25T00:11:04Z,2021-01-25T00:11:04Z,OWNER,"I can combine this with #987 - each of these areas of the page can be wrapped in a `<div>` with a class that matches the name of the plugin hook, that way JavaScript plugins can append their content in the same place as Python plugins.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates,
https://github.com/simonw/datasette/issues/1191#issuecomment-766523866,https://api.github.com/repos/simonw/datasette/issues/1191,766523866,MDEyOklzc3VlQ29tbWVudDc2NjUyMzg2Ng==,9599,simonw,2021-01-25T03:58:34Z,2021-01-25T03:58:34Z,OWNER,"I've got a good prototype working now, but I'm dropping this from the Datasette 0.54 milestone because it requires a bunch of additional work to make sure it is really well tested and documented.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates,
https://github.com/simonw/datasette/issues/1191#issuecomment-766524016,https://api.github.com/repos/simonw/datasette/issues/1191,766524016,MDEyOklzc3VlQ29tbWVudDc2NjUyNDAxNg==,9599,simonw,2021-01-25T03:59:17Z,2021-01-25T03:59:17Z,OWNER,More work can happen in the PR: #1204,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates,
https://github.com/simonw/datasette/issues/1193#issuecomment-797159434,https://api.github.com/repos/simonw/datasette/issues/1193,797159434,MDEyOklzc3VlQ29tbWVudDc5NzE1OTQzNA==,9599,simonw,2021-03-12T01:01:54Z,2021-03-12T01:01:54Z,OWNER,"DuckDB has a read-only mechanism: https://duckdb.org/docs/api/python
```python
import duckdb
con = duckdb.connect(database=""/tmp/blah.db"", read_only=True)
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787173276,Research plugin hook for alternative database backends,
https://github.com/simonw/datasette/issues/1194#issuecomment-762390401,https://api.github.com/repos/simonw/datasette/issues/1194,762390401,MDEyOklzc3VlQ29tbWVudDc2MjM5MDQwMQ==,9599,simonw,2021-01-18T17:42:38Z,2021-01-18T17:42:38Z,OWNER,"Relevant code: https://github.com/simonw/datasette/blob/a882d679626438ba0d809944f06f239bcba8ee96/datasette/views/table.py#L815-L827
It looks like there are other arguments that may not be persisted too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",788447787,?_size= argument is not persisted by hidden form fields in the table filters,
https://github.com/simonw/datasette/issues/1194#issuecomment-762390568,https://api.github.com/repos/simonw/datasette/issues/1194,762390568,MDEyOklzc3VlQ29tbWVudDc2MjM5MDU2OA==,9599,simonw,2021-01-18T17:43:03Z,2021-01-18T17:43:03Z,OWNER,Should I just blanket copy over any query string argument that starts with an underscore? Any reason _not_ to do that?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",788447787,?_size= argument is not persisted by hidden form fields in the table filters,
https://github.com/simonw/datasette/issues/1194#issuecomment-766491911,https://api.github.com/repos/simonw/datasette/issues/1194,766491911,MDEyOklzc3VlQ29tbWVudDc2NjQ5MTkxMQ==,9599,simonw,2021-01-25T02:02:15Z,2021-01-25T02:02:15Z,OWNER,I'm going to copy across anything starting with an underscore.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",788447787,?_size= argument is not persisted by hidden form fields in the table filters,
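The "copy across anything starting with an underscore" rule can be sketched with a small helper that renders hidden form fields from a query string. This is an illustrative stand-in, not Datasette's actual implementation:

```python
from html import escape
from urllib.parse import parse_qsl


def hidden_fields_for(query_string):
    """Render a hidden <input> for every _-prefixed query parameter,
    so those parameters survive a form re-submission."""
    pairs = parse_qsl(query_string, keep_blank_values=True)
    return "\n".join(
        '<input type="hidden" name="{}" value="{}">'.format(
            escape(key, quote=True), escape(value, quote=True)
        )
        for key, value in pairs
        if key.startswith("_")
    )


print(hidden_fields_for("_size=10&city=SF&_facet=state"))
# <input type="hidden" name="_size" value="10">
# <input type="hidden" name="_facet" value="state">
```

Non-underscore parameters like `city` are deliberately skipped, since those correspond to the visible filter widgets in the form itself.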
https://github.com/simonw/datasette/issues/1195#issuecomment-763108730,https://api.github.com/repos/simonw/datasette/issues/1195,763108730,MDEyOklzc3VlQ29tbWVudDc2MzEwODczMA==,9599,simonw,2021-01-19T20:22:37Z,2021-01-19T20:22:37Z,OWNER,I can use this test: https://github.com/simonw/datasette/blob/c38c42948cbfddd587729413fd6082ba352eaece/tests/test_plugins.py#L238-L294,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",789336592,"view_name = ""query"" for the query page",
https://github.com/simonw/datasette/issues/1195#issuecomment-766534634,https://api.github.com/repos/simonw/datasette/issues/1195,766534634,MDEyOklzc3VlQ29tbWVudDc2NjUzNDYzNA==,9599,simonw,2021-01-25T04:38:30Z,2021-01-25T04:38:30Z,OWNER,"This has proved surprisingly difficult to implement, due to the weird way the QueryView is actually called. The class itself isn't used like other view classes - instead, the `.data()` methods of both `DatabaseView` and `TableView` dispatch out to `QueryView.data()` when they need to:
https://github.com/simonw/datasette/blob/07e163561592c743e4117f72102fcd350a600909/datasette/views/table.py#L259-L270
https://github.com/simonw/datasette/blob/07e163561592c743e4117f72102fcd350a600909/datasette/views/table.py#L290-L294
https://github.com/simonw/datasette/blob/07e163561592c743e4117f72102fcd350a600909/datasette/views/database.py#L39-L44
It turns out this is a bad pattern because it makes changes like this one WAY harder than they should be.
I think I should clean this up as part of #878.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",789336592,"view_name = ""query"" for the query page",
https://github.com/simonw/datasette/issues/1195#issuecomment-766534748,https://api.github.com/repos/simonw/datasette/issues/1195,766534748,MDEyOklzc3VlQ29tbWVudDc2NjUzNDc0OA==,9599,simonw,2021-01-25T04:38:56Z,2021-01-25T04:38:56Z,OWNER,"Here's a diff showing how far I got before I abandoned this particular effort:
```diff
diff --git a/datasette/views/base.py b/datasette/views/base.py
index a21b929..04e4aa9 100644
--- a/datasette/views/base.py
+++ b/datasette/views/base.py
@@ -120,7 +120,7 @@ class BaseView:
handler = getattr(self, request.method.lower(), None)
return await handler(request, *args, **kwargs)
- async def render(self, templates, request, context=None):
+ async def render(self, templates, request, context=None, view_name=None):
context = context or {}
template = self.ds.jinja_env.select_template(templates)
template_context = {
@@ -135,7 +135,7 @@ class BaseView:
}
return Response.html(
await self.ds.render_template(
- template, template_context, request=request, view_name=self.name
+ template, template_context, request=request, view_name=view_name
)
)
@@ -155,7 +155,7 @@ class BaseView:
class DataView(BaseView):
- name = """"
+ view_name = ""no-view-name""
re_named_parameter = re.compile("":([a-zA-Z0-9_]+)"")
async def options(self, request, *args, **kwargs):
@@ -414,6 +414,10 @@ class DataView(BaseView):
args[""table""] = urllib.parse.unquote_plus(args[""table""])
return _format, args
+ async def get_view_name(self, request, database, hash, **kwargs):
+ # Defaults to self.view_name, but can be over-ridden by subclasses
+ return self.view_name
+
async def view_get(self, request, database, hash, correct_hash_provided, **kwargs):
_format, kwargs = await self.get_format(request, database, kwargs)
@@ -424,6 +428,8 @@ class DataView(BaseView):
# HTML views default to expanding all foreign key labels
kwargs[""default_labels""] = True
+ view_name = await self.get_view_name(request, database, hash, **kwargs)
+
extra_template_data = {}
start = time.perf_counter()
status_code = 200
@@ -489,7 +495,7 @@ class DataView(BaseView):
database=database,
table=data.get(""table""),
request=request,
- view_name=self.name,
+ view_name=view_name,
# These will be deprecated in Datasette 1.0:
args=request.args,
data=data,
@@ -533,7 +539,7 @@ class DataView(BaseView):
database=database,
table=data.get(""table""),
request=request,
- view_name=self.name,
+ view_name=view_name,
)
it_can_render = await await_me_maybe(it_can_render)
if it_can_render:
@@ -565,7 +571,7 @@ class DataView(BaseView):
}
if ""metadata"" not in context:
context[""metadata""] = self.ds.metadata
- r = await self.render(templates, request=request, context=context)
+ r = await self.render(templates, request=request, context=context, view_name=view_name)
r.status = status_code
ttl = request.args.get(""_ttl"", None)
diff --git a/datasette/views/database.py b/datasette/views/database.py
index f6fd579..e425213 100644
--- a/datasette/views/database.py
+++ b/datasette/views/database.py
@@ -23,7 +23,11 @@ from .base import DatasetteError, DataView
class DatabaseView(DataView):
- name = ""database""
+ async def get_view_name(self, request, db_name, table_and_format):
+ if request.args.get(""sql""):
+ return ""query""
+ else:
+ return ""database""
async def data(self, request, database, hash, default_labels=False, _size=None):
await self.check_permissions(
@@ -145,7 +149,7 @@ class DatabaseView(DataView):
class DatabaseDownload(DataView):
- name = ""database_download""
+ view_name = ""database_download""
async def view_get(self, request, database, hash, correct_hash_present, **kwargs):
await self.check_permissions(
diff --git a/datasette/views/index.py b/datasette/views/index.py
index b6b8cbe..d750e3d 100644
--- a/datasette/views/index.py
+++ b/datasette/views/index.py
@@ -16,7 +16,7 @@ COUNT_DB_SIZE_LIMIT = 100 * 1024 * 1024
class IndexView(BaseView):
- name = ""index""
+ view_name = ""index""
async def get(self, request, as_format):
await self.check_permission(request, ""view-instance"")
diff --git a/datasette/views/special.py b/datasette/views/special.py
index 9750dd0..dbd1e00 100644
--- a/datasette/views/special.py
+++ b/datasette/views/special.py
@@ -6,7 +6,7 @@ import secrets
class JsonDataView(BaseView):
- name = ""json_data""
+ view_name = ""json_data""
def __init__(self, datasette, filename, data_callback, needs_request=False):
self.ds = datasette
@@ -42,7 +42,7 @@ class JsonDataView(BaseView):
class PatternPortfolioView(BaseView):
- name = ""patterns""
+ view_name = ""patterns""
async def get(self, request):
await self.check_permission(request, ""view-instance"")
@@ -50,7 +50,7 @@ class PatternPortfolioView(BaseView):
class AuthTokenView(BaseView):
- name = ""auth_token""
+ view_name = ""auth_token""
async def get(self, request):
token = request.args.get(""token"") or """"
@@ -68,7 +68,7 @@ class AuthTokenView(BaseView):
class LogoutView(BaseView):
- name = ""logout""
+ view_name = ""logout""
async def get(self, request):
if not request.actor:
@@ -87,7 +87,7 @@ class LogoutView(BaseView):
class PermissionsDebugView(BaseView):
- name = ""permissions_debug""
+ view_name = ""permissions_debug""
async def get(self, request):
await self.check_permission(request, ""view-instance"")
@@ -102,7 +102,7 @@ class PermissionsDebugView(BaseView):
class AllowDebugView(BaseView):
- name = ""allow_debug""
+ view_name = ""allow_debug""
async def get(self, request):
errors = []
@@ -136,7 +136,7 @@ class AllowDebugView(BaseView):
class MessagesDebugView(BaseView):
- name = ""messages_debug""
+ view_name = ""messages_debug""
async def get(self, request):
await self.check_permission(request, ""view-instance"")
diff --git a/datasette/views/table.py b/datasette/views/table.py
index 0a3504b..45d298a 100644
--- a/datasette/views/table.py
+++ b/datasette/views/table.py
@@ -257,7 +257,16 @@ class RowTableShared(DataView):
class TableView(RowTableShared):
- name = ""table""
+ view_name = ""table""
+
+ async def get_view_name(self, request, db_name, table_and_format):
+ canned_query = await self.ds.get_canned_query(
+ db_name, table_and_format, request.actor
+ )
+ if canned_query:
+ return ""query""
+ else:
+ return ""table""
async def post(self, request, db_name, table_and_format):
# Handle POST to a canned query
@@ -923,7 +932,7 @@ async def _sql_params_pks(db, table, pk_values):
class RowView(RowTableShared):
- name = ""row""
+ view_name = ""row""
async def data(self, request, database, hash, table, pk_path, default_labels=False):
await self.check_permissions(
diff --git a/tests/test_plugins.py b/tests/test_plugins.py
index 715c7c1..7ce2b1b 100644
--- a/tests/test_plugins.py
+++ b/tests/test_plugins.py
@@ -252,7 +252,7 @@ def test_plugin_config_file(app_client):
},
),
(
- ""/fixtures/"",
+ ""/fixtures"",
{
""template"": ""database.html"",
""database"": ""fixtures"",
@@ -285,6 +285,38 @@ def test_plugin_config_file(app_client):
],
},
),
+ (
+ ""/fixtures?sql=select+1+as+one"",
+ {
+ ""template"": ""query.html"",
+ ""database"": ""fixtures"",
+ ""table"": None,
+ ""config"": {""depth"": ""database""},
+ ""view_name"": ""query"",
+ ""request_path"": ""/fixtures"",
+ ""added"": 15,
+ ""columns"": [
+ ""one"",
+ ],
+ },
+ ),
+ (
+ ""/fixtures/neighborhood_search"",
+ {
+ ""template"": ""query.html"",
+ ""database"": ""fixtures"",
+ ""table"": None,
+ ""config"": {""depth"": ""database""},
+ ""view_name"": ""query"",
+ ""request_path"": ""/fixtures/neighborhood_search"",
+ ""added"": 15,
+ ""columns"": [
+ ""neighborhood"",
+ ""name"",
+ ""state"",
+ ],
+ },
+ ),
],
)
def test_hook_extra_body_script(app_client, path, expected_extra_body_script):
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",789336592,"view_name = ""query"" for the query page",
https://github.com/simonw/datasette/issues/1195#issuecomment-766535046,https://api.github.com/repos/simonw/datasette/issues/1195,766535046,MDEyOklzc3VlQ29tbWVudDc2NjUzNTA0Ng==,9599,simonw,2021-01-25T04:40:08Z,2021-01-25T04:40:08Z,OWNER,"Also: should the view name really be the same for both of these pages?
- https://latest.datasette.io/fixtures?sql=select+*+from+facetable
- https://latest.datasette.io/fixtures/neighborhood_search
Where one of them is a canned query and the other is an arbitrary query?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",789336592,"view_name = ""query"" for the query page",
https://github.com/simonw/datasette/issues/1197#issuecomment-766430111,https://api.github.com/repos/simonw/datasette/issues/1197,766430111,MDEyOklzc3VlQ29tbWVudDc2NjQzMDExMQ==,9599,simonw,2021-01-24T20:53:40Z,2021-01-24T20:53:40Z,OWNER,"https://devcenter.heroku.com/articles/slug-compiler#slug-size says that the maximum allowed size is 500MB - my hunch is that the Datasette application itself weighs in at only a dozen or so MB but I haven't measured it. So I would imagine anything up to around 450MB should work OK on Heroku.
Cloud Run works for up to about 2GB in my experience.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",791381623,DB size limit for publishing with Heroku,
https://github.com/simonw/datasette/issues/1198#issuecomment-766428183,https://api.github.com/repos/simonw/datasette/issues/1198,766428183,MDEyOklzc3VlQ29tbWVudDc2NjQyODE4Mw==,9599,simonw,2021-01-24T20:40:37Z,2021-01-24T20:40:37Z,OWNER,https://docs.datasette.io/en/latest/testing_plugins.html#testing-outbound-http-calls-with-pytest-httpx,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792625812,Plugin testing documentation on using pytest-httpx,
https://github.com/simonw/datasette/issues/1199#issuecomment-766181628,https://api.github.com/repos/simonw/datasette/issues/1199,766181628,MDEyOklzc3VlQ29tbWVudDc2NjE4MTYyOA==,9599,simonw,2021-01-23T21:25:18Z,2021-01-23T21:25:18Z,OWNER,"Comment thread here: https://news.ycombinator.com/item?id=25881911 - cperciva says:
> There's an even better reason for databases to not write to memory mapped pages: Pages get synched out to disk at the kernel's leisure. This can be ok for a cache but it's definitely not what you want for a database!
But... Datasette is often used in read-only mode, so that disadvantage often doesn't apply.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792652391,Experiment with PRAGMA mmap_size=N,
https://github.com/simonw/datasette/issues/1199#issuecomment-881932880,https://api.github.com/repos/simonw/datasette/issues/1199,881932880,IC_kwDOBm6k_c40kTpQ,9599,simonw,2021-07-17T17:39:17Z,2021-07-17T17:39:17Z,OWNER,"I asked about optimizing performance on the SQLite forum and this came up as a suggestion: https://sqlite.org/forum/forumpost/9a6b9ae8e2048c8b?t=c
I can start by trying this:
PRAGMA mmap_size=268435456;","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792652391,Experiment with PRAGMA mmap_size=N,
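A quick way to experiment with that pragma is to set it on a file-backed connection and read back the value actually in effect — note the effective value can be capped (or forced to zero) by SQLite's `SQLITE_MAX_MMAP_SIZE` compile-time limit:

```python
import os
import sqlite3
import tempfile

path = os.path.join(tempfile.mkdtemp(), "mmap-test.db")
conn = sqlite3.connect(path)

# Request a 256MB upper bound for memory-mapped I/O
conn.execute("PRAGMA mmap_size=268435456")

# Read back the limit SQLite actually applied
effective = conn.execute("PRAGMA mmap_size").fetchone()[0]
print(effective)  # 268435456 on typical builds, 0 if mmap is compiled out
```

Because the setting is per-connection, Datasette would need to apply it each time it opens a database, not once globally.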
https://github.com/simonw/datasette/issues/12#issuecomment-348245757,https://api.github.com/repos/simonw/datasette/issues/12,348245757,MDEyOklzc3VlQ29tbWVudDM0ODI0NTc1Nw==,9599,simonw,2017-11-30T16:39:45Z,2017-11-30T16:39:45Z,OWNER,"It is now possible to over-ride templates on a per-database / per-row or per-table basis.
When you access e.g. `/mydatabase/mytable` Datasette will look for the following:
- table-mydatabase-mytable.html
- table.html
If you provided a `--template-dir` argument to datasette serve it will look in
that directory first.
The lookup rules are as follows:
Index page (/):
index.html
Database page (/mydatabase):
database-mydatabase.html
database.html
Table page (/mydatabase/mytable):
table-mydatabase-mytable.html
table.html
Row page (/mydatabase/mytable/id):
row-mydatabase-mytable.html
row.html
If a table name has spaces or other unexpected characters in it, the template
filename will follow the same rules as our custom `<body>` CSS classes
introduced in 8ab3a16 - for example, a table called ""Food Trucks""
will attempt to load the following templates:
table-mydatabase-Food-Trucks-399138.html
table.html
It is possible to extend the default templates using Jinja template
inheritance. If you want to customize EVERY row template with some additional
content you can do so by creating a `row.html` template like this:
{% extends ""default:row.html"" %}
{% block content %}
EXTRA HTML AT THE TOP OF THE CONTENT BLOCK
This line renders the original block:
{{ super() }}
{% endblock %}
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267523511,Make it so you can override templates,
https://github.com/simonw/datasette/issues/120#issuecomment-599702870,https://api.github.com/repos/simonw/datasette/issues/120,599702870,MDEyOklzc3VlQ29tbWVudDU5OTcwMjg3MA==,9599,simonw,2020-03-16T18:48:05Z,2020-03-16T18:48:05Z,OWNER,"I've built two of these so far: https://github.com/simonw/datasette-auth-github and https://github.com/simonw/datasette-auth-existing-cookies
Closing this ticket in favour of #699","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275087397,Plugin that adds an authentication layer of some sort,
https://github.com/simonw/datasette/issues/1200#issuecomment-777178728,https://api.github.com/repos/simonw/datasette/issues/1200,777178728,MDEyOklzc3VlQ29tbWVudDc3NzE3ODcyOA==,9599,simonw,2021-02-11T03:13:59Z,2021-02-11T03:13:59Z,OWNER,"I came up with the need for this while playing with this tool: https://calands.datasettes.com/calands?sql=select%0D%0A++AsGeoJSON(geometry)%2C+*%0D%0Afrom%0D%0A++CPAD_2020a_SuperUnits%0D%0Awhere%0D%0A++PARK_NAME+like+'%25mini%25'+and%0D%0A++Intersects(GeomFromGeoJSON(%3Afreedraw)%2C+geometry)+%3D+1%0D%0A++and+CPAD_2020a_SuperUnits.rowid+in+(%0D%0A++++select%0D%0A++++++rowid%0D%0A++++from%0D%0A++++++SpatialIndex%0D%0A++++where%0D%0A++++++f_table_name+%3D+'CPAD_2020a_SuperUnits'%0D%0A++++++and+search_frame+%3D+GeomFromGeoJSON(%3Afreedraw)%0D%0A++)&freedraw={""type""%3A""MultiPolygon""%2C""coordinates""%3A[[[[-122.42202758789064%2C37.82280243352759]%2C[-122.39868164062501%2C37.823887203271454]%2C[-122.38220214843751%2C37.81846319511331]%2C[-122.35061645507814%2C37.77071473849611]%2C[-122.34924316406251%2C37.74465712069939]%2C[-122.37258911132814%2C37.703380457832374]%2C[-122.39044189453125%2C37.690340943717715]%2C[-122.41241455078126%2C37.680559803205135]%2C[-122.44262695312501%2C37.67295135774715]%2C[-122.47283935546876%2C37.67295135774715]%2C[-122.52502441406251%2C37.68382032669382]%2C[-122.53463745117189%2C37.6892542140253]%2C[-122.54699707031251%2C37.690340943717715]%2C[-122.55798339843751%2C37.72945260537781]%2C[-122.54287719726564%2C37.77831314799672]%2C[-122.49893188476564%2C37.81303878836991]%2C[-122.46185302734376%2C37.82822612280363]%2C[-122.42889404296876%2C37.82822612280363]%2C[-122.42202758789064%2C37.82280243352759]]]]} - before I fixed https://github.com/simonw/datasette-leaflet-geojson/issues/16 it was loading a LOT of maps, which felt bad. I wanted to be able to link people to that page with a hard limit on the number of rows displayed on that page.
It's mainly to guard against unexpected behaviour from limit-less queries though. It's not a very high priority feature!","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792890765,?_size=10 option for the arbitrary query page would be useful,
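A hard limit like the one requested could be enforced by clamping the requested size before running the query. A minimal sketch with a hypothetical helper — the default and maximum values here are invented for illustration:

```python
def effective_size(requested, default=100, maximum=1000):
    """Resolve a ?_size= value: fall back to the default for missing or
    invalid input, and never exceed the hard maximum."""
    if requested is None:
        return default
    try:
        size = int(requested)
    except ValueError:
        return default
    if size < 0:
        return default
    return min(size, maximum)


print(effective_size(None))    # 100
print(effective_size("10"))    # 10
print(effective_size("9999"))  # 1000
```

Clamping rather than rejecting keeps deep links working: a shared URL with an oversized `?_size=` still renders, just at the cap.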
https://github.com/simonw/datasette/issues/1201#issuecomment-766543387,https://api.github.com/repos/simonw/datasette/issues/1201,766543387,MDEyOklzc3VlQ29tbWVudDc2NjU0MzM4Nw==,9599,simonw,2021-01-25T05:07:40Z,2021-01-25T05:13:29Z,OWNER,Changes: https://github.com/simonw/datasette/compare/0.53...a5ede3cdd455e2bb1a1fb2f4e1b5a9855caf5179,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792904595,Release notes for Datasette 0.54,
https://github.com/simonw/datasette/issues/1201#issuecomment-766545442,https://api.github.com/repos/simonw/datasette/issues/1201,766545442,MDEyOklzc3VlQ29tbWVudDc2NjU0NTQ0Mg==,9599,simonw,2021-01-25T05:13:59Z,2021-01-25T05:13:59Z,OWNER,"The big stuff:
- Database(memory_name=) for shared in-memory databases, closes #1151
- The `_internal` database - #1150
- script type=module support, closes #1186 , #1187
- Improved design for the `.add_database()` method 8919f99c2f7f245aca7f94bd53d5ac9d04aa42b5 - which means databases with the same stem can now be opened, #509
- Adopted Prettier #1166
Smaller:
- force_https_urls on for publish cloudrun, refs #1178
- Fixed bug in example nginx config, refs #1091
- Shrunk ecosystem docs in favour of datasette.io, closes #1182
- request.full_path property, closes #1184
- Better PRAGMA error message, closes #1185
- publish heroku now uses python-3.8.7
- Plugin testing documentation on using pytest-httpx Closes #1198
- Contributing docs for Black and Prettier, closes #1167
- All ?_ parameters now copied to hidden form fields, closes #1194
- Fixed bug loading database called 'test-database (1).sqlite' - Closes #1181.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792904595,Release notes for Datasette 0.54,
https://github.com/simonw/datasette/issues/1201#issuecomment-766545604,https://api.github.com/repos/simonw/datasette/issues/1201,766545604,MDEyOklzc3VlQ29tbWVudDc2NjU0NTYwNA==,9599,simonw,2021-01-25T05:14:31Z,2021-01-25T05:14:31Z,OWNER,"The two big ticket items are `