html_url,issue_url,id,node_id,user,user_label,created_at,updated_at,author_association,body,reactions,issue,issue_label,performed_via_github_app https://github.com/simonw/datasette/issues/1199#issuecomment-766181628,https://api.github.com/repos/simonw/datasette/issues/1199,766181628,MDEyOklzc3VlQ29tbWVudDc2NjE4MTYyOA==,9599,simonw,2021-01-23T21:25:18Z,2021-01-23T21:25:18Z,OWNER,"Comment thread here: https://news.ycombinator.com/item?id=25881911 - cperciva says: > There's an even better reason for databases to not write to memory mapped pages: Pages get synched out to disk at the kernel's leisure. This can be ok for a cache but it's definitely not what you want for a database! But... Datasette is often used in read-only mode, so that disadvantage often doesn't apply.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792652391,Experiment with PRAGMA mmap_size=N, https://github.com/simonw/sqlite-utils/pull/224#issuecomment-765678057,https://api.github.com/repos/simonw/sqlite-utils/issues/224,765678057,MDEyOklzc3VlQ29tbWVudDc2NTY3ODA1Nw==,37962604,polyrand,2021-01-22T20:53:06Z,2021-01-23T20:13:27Z,NONE,"I'm using the FTS methods in sqlite-utils for this website: [drwn.io](https://drwn.io/). I wanted to get pagination to have some kind of infinite scrolling in the landing page, and I ended up using that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792297010,Add fts offset docs., https://github.com/simonw/datasette/issues/1191#issuecomment-765757433,https://api.github.com/repos/simonw/datasette/issues/1191,765757433,MDEyOklzc3VlQ29tbWVudDc2NTc1NzQzMw==,9599,simonw,2021-01-22T23:43:43Z,2021-01-22T23:43:43Z,OWNER,"Another potential use for this: plugins that provide authentication (like `datasette-auth-passwords` and `datasette-auth-github`) could use it to add a chunk of HTML to the ""permission denied"" page that links to their mechanism of authenticating.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates, https://github.com/simonw/datasette/issues/1196#issuecomment-765639968,https://api.github.com/repos/simonw/datasette/issues/1196,765639968,MDEyOklzc3VlQ29tbWVudDc2NTYzOTk2OA==,2826376,QAInsights,2021-01-22T19:37:15Z,2021-01-22T19:37:15Z,NONE,I tried deployment in WSL. It is working fine https://jmeter.vercel.app/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",791237799,Access Denied Error in Windows, https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765525338,https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1,765525338,MDEyOklzc3VlQ29tbWVudDc2NTUyNTMzOA==,25372415,cobiadigital,2021-01-22T16:22:44Z,2021-01-22T16:22:44Z,NONE,"rs1333049 associated with coronary artery disease https://www.snpedia.com/index.php/Rs1333049 ``` select rsid, genotype, case genotype when 'CC' then '1.9x increased risk for coronary artery disease' when 'CG' then '1.5x increased risk for CAD' when 'GG' then 'normal' end as interpretation from genome where rsid = 'rs1333049' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",496415321,Figure out some interesting example SQL queries, https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765523517,https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1,765523517,MDEyOklzc3VlQ29tbWVudDc2NTUyMzUxNw==,25372415,cobiadigital,2021-01-22T16:20:25Z,2021-01-22T16:20:25Z,NONE,"rs53576: the oxytocin receptor (OXTR) gene ``` select rsid, genotype, case genotype when 'AA' then 'Lack of empathy?' when 'AG' then 'Lack of empathy?' when 'GG' then 'Optimistic and empathetic; handle stress well' end as interpretation from genome where rsid = 'rs53576' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",496415321,Figure out some interesting example SQL queries, https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765506901,https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1,765506901,MDEyOklzc3VlQ29tbWVudDc2NTUwNjkwMQ==,25372415,cobiadigital,2021-01-22T15:58:41Z,2021-01-22T15:58:58Z,NONE,"Both rs10757274 and rs2383206 can both indicate higher risks of heart disease https://www.snpedia.com/index.php/Rs2383206 ``` select rsid, genotype, case genotype when 'AA' then 'Normal' when 'AG' then '~1.2x increased risk for heart disease' when 'GG' then '~1.3x increased risk for heart disease' end as interpretation from genome where rsid = 'rs10757274' ``` ``` select rsid, genotype, case genotype when 'AA' then 'Normal' when 'AG' then '1.4x increased risk for heart disease' when 'GG' then '1.7x increased risk for heart disease' end as interpretation from genome where rsid = 'rs2383206' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",496415321,Figure out some interesting example SQL queries, https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765502845,https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1,765502845,MDEyOklzc3VlQ29tbWVudDc2NTUwMjg0NQ==,25372415,cobiadigital,2021-01-22T15:53:19Z,2021-01-22T15:53:19Z,NONE,"rs7903146 Influences risk of Type-2 diabetes https://www.snpedia.com/index.php/Rs7903146 ``` select rsid, genotype, case genotype when 'CC' then 'Normal (lower) risk of Type 2 Diabetes and Gestational Diabetes.' when 'CT' then '1.4x increased risk for diabetes (and perhaps colon cancer).' when 'TT' then '2x increased risk for Type-2 diabetes' end as interpretation from genome where rsid = 'rs7903146' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",496415321,Figure out some interesting example SQL queries, https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765498984,https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1,765498984,MDEyOklzc3VlQ29tbWVudDc2NTQ5ODk4NA==,25372415,cobiadigital,2021-01-22T15:48:25Z,2021-01-22T15:49:33Z,NONE,"The ""Warrior Gene"" https://www.snpedia.com/index.php/Rs4680 ``` select rsid, genotype, case genotype when 'AA' then '(worrier) advantage in memory and attention tasks' when 'AG' then 'Intermediate dopamine levels, other effects' when 'GG' then '(warrior) multiple associations, see details' end as interpretation from genome where rsid = 'rs4680' ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",496415321,Figure out some interesting example SQL queries, https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765495861,https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1,765495861,MDEyOklzc3VlQ29tbWVudDc2NTQ5NTg2MQ==,25372415,cobiadigital,2021-01-22T15:44:00Z,2021-01-22T15:44:00Z,NONE,"Risk of autoimmune disorders: https://www.snpedia.com/index.php/Genotype ``` select rsid, genotype, case genotype when 'AA' then '2x risk of rheumatoid arthritis and other autoimmune diseases' when 'GG' then 'Normal risk for autoimmune disorders' end as interpretation from genome where rsid = 'rs2476601' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",496415321,Figure out some interesting example SQL queries, https://github.com/simonw/datasette/issues/1195#issuecomment-763108730,https://api.github.com/repos/simonw/datasette/issues/1195,763108730,MDEyOklzc3VlQ29tbWVudDc2MzEwODczMA==,9599,simonw,2021-01-19T20:22:37Z,2021-01-19T20:22:37Z,OWNER,I can use this test: https://github.com/simonw/datasette/blob/c38c42948cbfddd587729413fd6082ba352eaece/tests/test_plugins.py#L238-L294,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",789336592,"view_name = ""query"" for the query page", https://github.com/simonw/sqlite-utils/issues/223#issuecomment-762540514,https://api.github.com/repos/simonw/sqlite-utils/issues/223,762540514,MDEyOklzc3VlQ29tbWVudDc2MjU0MDUxNA==,9599,simonw,2021-01-19T01:14:58Z,2021-01-19T01:15:54Z,OWNER,"Example from https://docs.python.org/3/library/csv.html#csv.reader ```pycon >>> import csv >>> with open('eggs.csv', newline='') as csvfile: ... spamreader = csv.reader(csvfile, delimiter=' ', quotechar='|') ... for row in spamreader: ... print(', '.join(row)) Spam, Spam, Spam, Spam, Spam, Baked Beans Spam, Lovely Spam, Wonderful Spam ``` I'm going to add `--quotechar` as well as `--delimiter`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",788527932,--delimiter option for CSV import, https://github.com/simonw/datasette/issues/1175#issuecomment-762488336,https://api.github.com/repos/simonw/datasette/issues/1175,762488336,MDEyOklzc3VlQ29tbWVudDc2MjQ4ODMzNg==,758858,hannseman,2021-01-18T21:59:28Z,2021-01-18T22:00:31Z,NONE,"I encountered your issue when trying to find a solution and came up with the following, maybe it can help. ```python import logging.config from typing import Tuple import structlog import uvicorn from example.config import settings shared_processors: Tuple[structlog.types.Processor, ...] = ( structlog.contextvars.merge_contextvars, structlog.stdlib.add_logger_name, structlog.stdlib.add_log_level, structlog.processors.TimeStamper(fmt=""iso""), ) logging_config = { ""version"": 1, ""disable_existing_loggers"": False, ""formatters"": { ""json"": { ""()"": structlog.stdlib.ProcessorFormatter, ""processor"": structlog.processors.JSONRenderer(), ""foreign_pre_chain"": shared_processors, }, ""console"": { ""()"": structlog.stdlib.ProcessorFormatter, ""processor"": structlog.dev.ConsoleRenderer(), ""foreign_pre_chain"": shared_processors, }, **uvicorn.config.LOGGING_CONFIG[""formatters""], }, ""handlers"": { ""default"": { ""level"": ""DEBUG"", ""class"": ""logging.StreamHandler"", ""formatter"": ""json"" if not settings.debug else ""console"", }, ""uvicorn.access"": { ""level"": ""INFO"", ""class"": ""logging.StreamHandler"", ""formatter"": ""access"", }, ""uvicorn.default"": { ""level"": ""INFO"", ""class"": ""logging.StreamHandler"", ""formatter"": ""default"", }, }, ""loggers"": { """": {""handlers"": [""default""], ""level"": ""INFO""}, ""uvicorn.error"": { ""handlers"": [""default"" if not settings.debug else ""uvicorn.default""], ""level"": ""INFO"", ""propagate"": False, }, ""uvicorn.access"": { ""handlers"": [""default"" if not settings.debug else ""uvicorn.access""], ""level"": ""INFO"", ""propagate"": False, }, }, } def setup_logging() -> None: structlog.configure( processors=[ structlog.stdlib.filter_by_level, *shared_processors, structlog.stdlib.PositionalArgumentsFormatter(), structlog.processors.StackInfoRenderer(), structlog.processors.format_exc_info, structlog.stdlib.ProcessorFormatter.wrap_for_formatter, ], context_class=dict, logger_factory=structlog.stdlib.LoggerFactory(), wrapper_class=structlog.stdlib.AsyncBoundLogger, cache_logger_on_first_use=True, ) logging.config.dictConfig(logging_config) ``` And then it'll be run on the startup event: ```python @app.on_event(""startup"") async def startup_event() -> None: setup_logging() ``` It depends on a setting called `debug` which controls if it should output the regular uvicorn logging or json. ","{""total_count"": 15, ""+1"": 7, ""-1"": 0, ""laugh"": 1, ""hooray"": 1, ""confused"": 0, ""heart"": 5, ""rocket"": 1, ""eyes"": 0}",779156520,Use structlog for logging, https://github.com/simonw/datasette/issues/1036#issuecomment-762391426,https://api.github.com/repos/simonw/datasette/issues/1036,762391426,MDEyOklzc3VlQ29tbWVudDc2MjM5MTQyNg==,4997607,philshem,2021-01-18T17:45:00Z,2021-01-18T17:45:00Z,NONE,"It might be possible with this library: https://docs.python.org/3/library/imghdr.html quick test of the downloaded blob: ``` >>> import imghdr >>> imghdr.what('material_culture-1-image.blob') 'jpeg' ``` The output plugin would be cool. I'll look into making my first datasette plugin. I'm also imagining displaying the image in the browser -- but that would be a step 2. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1194#issuecomment-762390568,https://api.github.com/repos/simonw/datasette/issues/1194,762390568,MDEyOklzc3VlQ29tbWVudDc2MjM5MDU2OA==,9599,simonw,2021-01-18T17:43:03Z,2021-01-18T17:43:03Z,OWNER,Should I just blanket copy over any query string argument that starts with an underscore? Any reason _not_ to do that?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",788447787,?_size= argument is not persisted by hidden form fields in the table filters, https://github.com/simonw/datasette/issues/1194#issuecomment-762390401,https://api.github.com/repos/simonw/datasette/issues/1194,762390401,MDEyOklzc3VlQ29tbWVudDc2MjM5MDQwMQ==,9599,simonw,2021-01-18T17:42:38Z,2021-01-18T17:42:38Z,OWNER,"Relevant code: https://github.com/simonw/datasette/blob/a882d679626438ba0d809944f06f239bcba8ee96/datasette/views/table.py#L815-L827 It looks like there are other arguments that may not be persisted too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",788447787,?_size= argument is not persisted by hidden form fields in the table filters, https://github.com/simonw/datasette/issues/1036#issuecomment-762387875,https://api.github.com/repos/simonw/datasette/issues/1036,762387875,MDEyOklzc3VlQ29tbWVudDc2MjM4Nzg3NQ==,9599,simonw,2021-01-18T17:36:36Z,2021-01-18T17:36:36Z,OWNER,"As you can see, I'm pretty paranoid about serving content with `Content-Type` HTTP headers because I'm so worried about execution vulnerabilities. I'm much more comfortable exploring that kind of thing in plugins, since that way people can opt-in to riskier features. You found `datasette-media` which is my most comprehensive exploration of that idea so far - but there's definitely lots of room for more plugins along those lines. Maybe even an output plugin? `.jpg` as an export format which returns the `BLOB` column for a row as a JPEG image with the correct `content-type` header (but first verifies that the binary content does indeed look like a real JPEG) could be interesting.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-762385981,https://api.github.com/repos/simonw/datasette/issues/1036,762385981,MDEyOklzc3VlQ29tbWVudDc2MjM4NTk4MQ==,4997607,philshem,2021-01-18T17:32:13Z,2021-01-18T17:34:50Z,NONE,"Hi Simon Just finding this old issue regarding downloading blobs. Nice work! As a feature request, maybe it would be possible to assign a blob column as a certain data type (e.g. `.jpg`) and then each blob could be downloaded as that type of file (perhaps if the file types were constrained to normal blobs that people store in sqlite databases, this could avoid the execution stuff mentioned above). I guess the column blob-type definition could fit into this dropdown selection: Let me know if I should open a new issue with a feature request. (This could slowly go in the direction of displaying image blob-types in the browser.) Thanks for the great tool! --- edit: just reading the rest of the twitter thread: https://twitter.com/simonw/status/1318685933256855552 perhaps this is already possible in some form with the plugin datasette-media: https://github.com/simonw/datasette-media","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/dogsheep/swarm-to-sqlite/issues/11#issuecomment-761967094,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/11,761967094,MDEyOklzc3VlQ29tbWVudDc2MTk2NzA5NA==,9599,simonw,2021-01-18T04:11:13Z,2021-01-18T04:11:13Z,MEMBER,"I just got a similar error: ``` File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/swarm_to_sqlite/utils.py"", line 79, in save_checkin checkins_table.m2m(""users"", user, m2m_table=""with"", pk=""id"") File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 2048, in m2m id = other_table.insert(record, pk=pk, replace=True).last_pk File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1781, in insert return self.insert_all( File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1899, in insert_all self.insert_chunk( File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1709, in insert_chunk result = self.db.execute(query, params) File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 226, in execute return self.conn.execute(sql, parameters) pysqlite3.dbapi2.OperationalError: table users has no column named countryCode ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743400216,Error thrown: sqlite3.OperationalError: table users has no column named lastName, https://github.com/simonw/datasette/issues/1191#issuecomment-761705076,https://api.github.com/repos/simonw/datasette/issues/1191,761705076,MDEyOklzc3VlQ29tbWVudDc2MTcwNTA3Ng==,9599,simonw,2021-01-17T00:35:13Z,2021-01-17T00:37:51Z,OWNER,"I'm going to try using Jinja macros to implement this: https://jinja.palletsprojects.com/en/2.11.x/templates/#macros Maybe using one of these tricks to auto-load the macro? http://codyaray.com/2015/05/auto-load-jinja2-macros","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates, https://github.com/simonw/datasette/issues/1181#issuecomment-761703555,https://api.github.com/repos/simonw/datasette/issues/1181,761703555,MDEyOklzc3VlQ29tbWVudDc2MTcwMzU1NQ==,9599,simonw,2021-01-17T00:24:20Z,2021-01-17T00:24:40Z,OWNER,"Here's the incomplete sketch of a test - to go at the bottom of `test_cli.py`. ```python @pytest.mark.parametrize( ""filename"", [""test-database (1).sqlite"", ""database (1).sqlite""] ) def test_weird_database_names(ensure_eventloop, tmpdir, filename): # https://github.com/simonw/datasette/issues/1181 runner = CliRunner() db_path = str(tmpdir / filename) sqlite3.connect(db_path).execute(""vacuum"") result1 = runner.invoke(cli, [db_path, ""--get"", ""/""]) assert result1.exit_code == 0, result1.output homepage_html = result1.output assert False ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",781262510,"Certain database names results in 404: ""Database not found: None""", https://github.com/simonw/datasette/issues/1191#issuecomment-761703368,https://api.github.com/repos/simonw/datasette/issues/1191,761703368,MDEyOklzc3VlQ29tbWVudDc2MTcwMzM2OA==,9599,simonw,2021-01-17T00:22:46Z,2021-01-17T00:22:46Z,OWNER,I'm going to prototype this in a branch.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates, https://github.com/simonw/datasette/issues/1191#issuecomment-761703232,https://api.github.com/repos/simonw/datasette/issues/1191,761703232,MDEyOklzc3VlQ29tbWVudDc2MTcwMzIzMg==,9599,simonw,2021-01-17T00:21:31Z,2021-01-17T00:21:54Z,OWNER,"I think this ends up being a whole collection of new plugin hooks, something like: - `include_table_top` - `include_table_bottom` - `include_row_top` - `include_row_bottom` - `include_database_top` - `include_database_bottom` - `include_query_bottom` - `include_query_bottom` - `include_index_bottom` - `include_index_bottom`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates, https://github.com/simonw/datasette/issues/1191#issuecomment-761703022,https://api.github.com/repos/simonw/datasette/issues/1191,761703022,MDEyOklzc3VlQ29tbWVudDc2MTcwMzAyMg==,9599,simonw,2021-01-17T00:20:00Z,2021-01-17T00:20:00Z,OWNER,"Plugins that want to provide extra context to the template can already do so using the `extra_template_vars()` plugin hook. So this hook could work by returning a list of template filenames to be included. Those templates can be bundled with the plugin. Since they are included they will have access to the template context and to any `extra_template_vars()` values.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates, https://github.com/simonw/datasette/issues/657#issuecomment-761179229,https://api.github.com/repos/simonw/datasette/issues/657,761179229,MDEyOklzc3VlQ29tbWVudDc2MTE3OTIyOQ==,9599,simonw,2021-01-15T20:24:35Z,2021-01-15T20:24:35Z,OWNER,"I'm not sure how I missed this issue but it's almost a year later and I'm finally taking a look at your Parquet work. This is yet more evidence that allowing plugins to provide their own custom `Database` objects would be a good idea. I started exploring what Datasette would like on PostgreSQL in #670 - my concern was that I would need to add a large amount of database abstraction code which would dramatically increase the complexity of the core project, but my thinking now is that it might be tractable - Datasette doesn't actually construct SQL in complex ways anywhere outside of the `TableView` class so abstracting away just that bit should be feasible.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",548591089,Allow creation of virtual tables at startup, https://github.com/simonw/datasette/issues/1191#issuecomment-761103910,https://api.github.com/repos/simonw/datasette/issues/1191,761103910,MDEyOklzc3VlQ29tbWVudDc2MTEwMzkxMA==,9599,simonw,2021-01-15T18:19:29Z,2021-01-15T18:19:29Z,OWNER,This relates to #987 (documented HTML hooks for JavaScript plugins) but is not quite the same thing.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",787098345,Ability for plugins to collaborate when adding extra HTML to blocks in default templates, https://github.com/simonw/datasette/issues/657#issuecomment-761101878,https://api.github.com/repos/simonw/datasette/issues/657,761101878,MDEyOklzc3VlQ29tbWVudDc2MTEwMTg3OA==,9599,simonw,2021-01-15T18:16:01Z,2021-01-15T18:16:01Z,OWNER,"I think the `startup()` plugin hook at https://docs.datasette.io/en/stable/plugin_hooks.html#startup-datasette should be able to fit this. You can write a plugin which uses that hook to execute `CREATE VIRTUAL TABLE` against one or more databases when Datasette first starts running. Would that work here?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",548591089,Allow creation of virtual tables at startup, https://github.com/simonw/sqlite-utils/issues/220#issuecomment-761015218,https://api.github.com/repos/simonw/sqlite-utils/issues/220,761015218,MDEyOklzc3VlQ29tbWVudDc2MTAxNTIxOA==,649467,mhalle,2021-01-15T15:40:08Z,2021-01-15T15:40:08Z,NONE,"Make sense. If you're coming from the sqlite3 side of things, rather than the datasette side, wanting the fts methods to work for views makes more sense. sqlite3 allows fts5 tables on views, so I was looking for CLI functionality to build the fts virtual tables. Ultimately, though, sharing fts virtual tables across tables and derivative views is likely more efficient. Maybe an explicit error message like, ""fts is not supported for views"" rather than just throwing an exception that the method doesn't exist"" might be helpful. Not critical though. Thanks.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",783778672,Better error message for *_fts methods against views, https://github.com/dogsheep/twitter-to-sqlite/pull/55#issuecomment-760950128,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/55,760950128,MDEyOklzc3VlQ29tbWVudDc2MDk1MDEyOA==,21148,jacobian,2021-01-15T13:44:52Z,2021-01-15T13:44:52Z,CONTRIBUTOR,"I found and fixed another bug, this one around importing the tweets table. @simonw let me know if you'd prefer this broken out into multiple PRs, happy to do that if it makes review/merging easier.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779211940,Fix archive imports, https://github.com/simonw/datasette/issues/1187#issuecomment-759875239,https://api.github.com/repos/simonw/datasette/issues/1187,759875239,MDEyOklzc3VlQ29tbWVudDc1OTg3NTIzOQ==,9599,simonw,2021-01-14T02:02:24Z,2021-01-14T02:02:31Z,OWNER,"This plugin hook currently returns a string of JavaScript. It could optionally return this instead of a string: ```json { ""script"": ""string of JavaScript goes here"", ""module"": true } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",785588942,"extra_body_script() support for script type=""module""", https://github.com/simonw/datasette/issues/1186#issuecomment-759874332,https://api.github.com/repos/simonw/datasette/issues/1186,759874332,MDEyOklzc3VlQ29tbWVudDc1OTg3NDMzMg==,9599,simonw,2021-01-14T01:59:35Z,2021-01-14T01:59:35Z,OWNER,Updated documentation: https://docs.datasette.io/en/latest/custom_templates.html#custom-css-and-javascript and https://docs.datasette.io/en/latest/plugin_hooks.html#extra-js-urls-template-database-table-columns-view-name-request-datasette,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",785573793,"script type=""module"" support", https://github.com/simonw/datasette/pull/1159#issuecomment-759306228,https://api.github.com/repos/simonw/datasette/issues/1159,759306228,MDEyOklzc3VlQ29tbWVudDc1OTMwNjIyOA==,552629,lovasoa,2021-01-13T08:59:31Z,2021-01-13T08:59:31Z,NONE,@simonw : Did you have the time to take a look at this ?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",774332247,Improve the display of facets information, https://github.com/simonw/sqlite-utils/issues/220#issuecomment-759098964,https://api.github.com/repos/simonw/sqlite-utils/issues/220,759098964,MDEyOklzc3VlQ29tbWVudDc1OTA5ODk2NA==,9599,simonw,2021-01-12T23:19:55Z,2021-01-12T23:19:55Z,OWNER,"I don't think it makes sense to call `.enable_fts()` on a view does it? When I'm working with views and FTS I tend to write my queries against a FTS table for one of the tables that is used by the view - https://docs.datasette.io/en/stable/full_text_search.html#configuring-full-text-search-for-a-table-or-view describes how I do that in Datasette for example. Can you expand on your use-case for FTS and views? I'm ready to be convinced otherwise, but I don't see how it would work right now.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",783778672,Better error message for *_fts methods against views, https://github.com/simonw/datasette/issues/1185#issuecomment-759069342,https://api.github.com/repos/simonw/datasette/issues/1185,759069342,MDEyOklzc3VlQ29tbWVudDc1OTA2OTM0Mg==,9599,simonw,2021-01-12T22:13:18Z,2021-01-12T22:13:18Z,OWNER,I'm going to change the error message to list the allowed pragmas.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",784628163,"""Statement may not contain PRAGMA"" error is not strictly true", https://github.com/simonw/datasette/issues/1185#issuecomment-759067427,https://api.github.com/repos/simonw/datasette/issues/1185,759067427,MDEyOklzc3VlQ29tbWVudDc1OTA2NzQyNw==,9599,simonw,2021-01-12T22:09:21Z,2021-01-12T22:09:21Z,OWNER,"That allow-list was added in #761 but is not currently documented. It's here in the code: https://github.com/simonw/datasette/blob/8e8fc5cee5c78da8334495c6d6257d5612c40792/datasette/utils/__init__.py#L173-L186","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",784628163,"""Statement may not contain PRAGMA"" error is not strictly true", https://github.com/simonw/datasette/issues/1185#issuecomment-759066777,https://api.github.com/repos/simonw/datasette/issues/1185,759066777,MDEyOklzc3VlQ29tbWVudDc1OTA2Njc3Nw==,9599,simonw,2021-01-12T22:07:58Z,2021-01-12T22:07:58Z,OWNER,"https://docs.datasette.io/en/stable/sql_queries.html?highlight=pragma#named-parameters documentation is out-of-date as well: > Datasette disallows custom SQL containing the string PRAGMA, as SQLite pragma statements can be used to change database settings at runtime. If you need to include the string ""pragma"" in a query you can do so safely using a named parameter.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",784628163,"""Statement may not contain PRAGMA"" error is not strictly true", https://github.com/simonw/datasette/issues/1091#issuecomment-758668359,https://api.github.com/repos/simonw/datasette/issues/1091,758668359,MDEyOklzc3VlQ29tbWVudDc1ODY2ODM1OQ==,6739646,tballison,2021-01-12T13:52:29Z,2021-01-12T13:52:29Z,NONE,"Y, thank you to both of you!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1091#issuecomment-758448525,https://api.github.com/repos/simonw/datasette/issues/1091,758448525,MDEyOklzc3VlQ29tbWVudDc1ODQ0ODUyNQ==,19328961,henry501,2021-01-12T06:55:08Z,2021-01-12T06:55:08Z,NONE,"Great, really happy I could help! Reverse proxies get tricky.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1183#issuecomment-758356640,https://api.github.com/repos/simonw/datasette/issues/1183,758356640,MDEyOklzc3VlQ29tbWVudDc1ODM1NjY0MA==,9599,simonw,2021-01-12T02:42:08Z,2021-01-12T02:42:08Z,OWNER,"Should Datasette have subcommands for this? `datasette enable-counts data.db` and `datasette disable-counts data.db` and `datasette reset-counts data.db` commands? Maybe. The `sqlite-utils` CLI tool could be used here instead, but that won't be easily available if Datasette was installed as a standalone binary or using `brew install datasette` or `pipx install datasette`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",782708469,"Take advantage of sqlite-utils cached table counts, if available", https://github.com/simonw/datasette/issues/1183#issuecomment-758356097,https://api.github.com/repos/simonw/datasette/issues/1183,758356097,MDEyOklzc3VlQ29tbWVudDc1ODM1NjA5Nw==,9599,simonw,2021-01-12T02:40:30Z,2021-01-12T02:40:30Z,OWNER,"So how would this work? I think I'm going to automatically use these values if the `_counts` table exists, unless a `ignore_counts_table` boolean setting has been set. I won't bother looking to see if the triggers have been created.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",782708469,"Take advantage of sqlite-utils cached table counts, if available", https://github.com/simonw/datasette/issues/1091#issuecomment-758283074,https://api.github.com/repos/simonw/datasette/issues/1091,758283074,MDEyOklzc3VlQ29tbWVudDc1ODI4MzA3NA==,9599,simonw,2021-01-11T23:12:46Z,2021-01-11T23:12:46Z,OWNER,Fantastic!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1091#issuecomment-758280611,https://api.github.com/repos/simonw/datasette/issues/1091,758280611,MDEyOklzc3VlQ29tbWVudDc1ODI4MDYxMQ==,6739646,tballison,2021-01-11T23:06:10Z,2021-01-11T23:06:10Z,NONE,"+1 Yep! Fixes it. If I navigate to https://corpora.tika.apache.org/datasette, I get a 404 (database not found: datasette), but if I navigate to https://corpora.tika.apache.org/datasette/file_profiles/, everything WORKS! Thank you!","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 1, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1182#issuecomment-757375858,https://api.github.com/repos/simonw/datasette/issues/1182,757375858,MDEyOklzc3VlQ29tbWVudDc1NzM3NTg1OA==,9599,simonw,2021-01-09T22:18:47Z,2021-01-09T22:18:47Z,OWNER,https://docs.datasette.io/en/latest/ecosystem.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",782692159,"Retire ""Ecosystem"" page in favour of datasette.io/plugins and /tools", https://github.com/simonw/datasette/issues/1182#issuecomment-757373741,https://api.github.com/repos/simonw/datasette/issues/1182,757373741,MDEyOklzc3VlQ29tbWVudDc1NzM3Mzc0MQ==,9599,simonw,2021-01-09T22:01:41Z,2021-01-09T22:01:41Z,OWNER,It can talk about Dogsheep too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",782692159,"Retire ""Ecosystem"" page in favour of datasette.io/plugins and /tools", https://github.com/simonw/datasette/issues/1182#issuecomment-757373082,https://api.github.com/repos/simonw/datasette/issues/1182,757373082,MDEyOklzc3VlQ29tbWVudDc1NzM3MzA4Mg==,9599,simonw,2021-01-09T21:55:33Z,2021-01-09T21:55:33Z,OWNER,I'll leave the page there but change it into more of a blurb about the existence of the plugins and tools directories.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",782692159,"Retire ""Ecosystem"" page in favour of datasette.io/plugins and /tools", https://github.com/simonw/datasette/issues/1181#issuecomment-756487966,https://api.github.com/repos/simonw/datasette/issues/1181,756487966,MDEyOklzc3VlQ29tbWVudDc1NjQ4Nzk2Ng==,9599,simonw,2021-01-08T01:25:42Z,2021-01-08T01:25:42Z,OWNER,I'm going to add a unit test that tries a variety of weird database names.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",781262510,"Certain database names results in 404: ""Database not found: None""", https://github.com/simonw/datasette/issues/1181#issuecomment-756482163,https://api.github.com/repos/simonw/datasette/issues/1181,756482163,MDEyOklzc3VlQ29tbWVudDc1NjQ4MjE2Mw==,9599,simonw,2021-01-08T01:06:23Z,2021-01-08T01:06:54Z,OWNER,"Yes, that logic is definitely at fault. It looks like it applies `urllib.parse.unquote_plus()` AFTER it's tried to do the `-` hash splitting thing, which is why it's failing here: https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/views/base.py#L184-L198","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",781262510,"Certain database names results in 404: ""Database not found: None""", https://github.com/simonw/datasette/issues/1091#issuecomment-756453945,https://api.github.com/repos/simonw/datasette/issues/1091,756453945,MDEyOklzc3VlQ29tbWVudDc1NjQ1Mzk0NQ==,9599,simonw,2021-01-07T23:42:50Z,2021-01-07T23:42:50Z,OWNER,"@henry501 it looks like you spotted a bug in the documentation - I just addressed that, the fix is now live here: https://docs.datasette.io/en/latest/deploying.html#running-datasette-behind-a-proxy","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1091#issuecomment-756453010,https://api.github.com/repos/simonw/datasette/issues/1091,756453010,MDEyOklzc3VlQ29tbWVudDc1NjQ1MzAxMA==,9599,simonw,2021-01-07T23:39:58Z,2021-01-07T23:40:25Z,OWNER,"@tballison I think that's the solution! It looks like you need to use this in your config: `ProxyPass /datasette http://127.0.0.1:8001/datasette` Instead of this: `ProxyPass /datasette http://127.0.0.1:8001/` Give that a go and let me know if it fixes it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1091#issuecomment-756425587,https://api.github.com/repos/simonw/datasette/issues/1091,756425587,MDEyOklzc3VlQ29tbWVudDc1NjQyNTU4Nw==,19328961,henry501,2021-01-07T22:27:19Z,2021-01-07T22:27:19Z,NONE,"I found this issue while troubleshooting the same behavior with an nginx reverse proxy. The solution was to make sure I set: `proxy_pass http://server:8001/baseurl/ ` instead of just: `proxy_pass http://server:8001 ` The custom SQL query and header links are now correct. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url, https://github.com/simonw/datasette/issues/1171#issuecomment-756335394,https://api.github.com/repos/simonw/datasette/issues/1171,756335394,MDEyOklzc3VlQ29tbWVudDc1NjMzNTM5NA==,9599,simonw,2021-01-07T19:35:59Z,2021-01-07T19:35:59Z,OWNER,"I requested a D-U-N-S number as a first step in getting a developer certificate: https://developer.apple.com/support/D-U-N-S/ ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables, https://github.com/simonw/datasette/issues/1180#issuecomment-756312213,https://api.github.com/repos/simonw/datasette/issues/1180,756312213,MDEyOklzc3VlQ29tbWVudDc1NjMxMjIxMw==,9599,simonw,2021-01-07T18:56:24Z,2021-01-07T18:56:24Z,OWNER,The `async_call_with_supported_arguments` version should be able to await any of the lazy arguments that are awaitable - can use `await_me_maybe` for that.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780767542,Lazily evaluated arguments for call_with_supported_arguments, https://github.com/simonw/datasette/issues/1180#issuecomment-755500475,https://api.github.com/repos/simonw/datasette/issues/1180,755500475,MDEyOklzc3VlQ29tbWVudDc1NTUwMDQ3NQ==,9599,simonw,2021-01-06T18:43:41Z,2021-01-06T18:43:41Z,OWNER,Relevant code: https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/utils/__init__.py#L919-L940,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780767542,Lazily evaluated arguments for call_with_supported_arguments, https://github.com/simonw/datasette/issues/1179#issuecomment-755495387,https://api.github.com/repos/simonw/datasette/issues/1179,755495387,MDEyOklzc3VlQ29tbWVudDc1NTQ5NTM4Nw==,9599,simonw,2021-01-06T18:39:23Z,2021-01-06T18:39:23Z,OWNER,"In that case maybe there are three new arguments: `path`, `full_path` and `url`. I'll also add `request.full_path` for consistency with these: https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/utils/asgi.py#L77-L90","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks, https://github.com/simonw/datasette/issues/1179#issuecomment-755492945,https://api.github.com/repos/simonw/datasette/issues/1179,755492945,MDEyOklzc3VlQ29tbWVudDc1NTQ5Mjk0NQ==,9599,simonw,2021-01-06T18:37:39Z,2021-01-06T18:37:39Z,OWNER,I think I'll call this `full_path` for consistency with Django.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks, https://github.com/simonw/datasette/issues/1179#issuecomment-755489974,https://api.github.com/repos/simonw/datasette/issues/1179,755489974,MDEyOklzc3VlQ29tbWVudDc1NTQ4OTk3NA==,9599,simonw,2021-01-06T18:35:24Z,2021-01-06T18:35:24Z,OWNER,Django calls this ` HttpRequest.get_full_path()` - for the path plus the querystring.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks, https://github.com/simonw/datasette/issues/1179#issuecomment-755486103,https://api.github.com/repos/simonw/datasette/issues/1179,755486103,MDEyOklzc3VlQ29tbWVudDc1NTQ4NjEwMw==,9599,simonw,2021-01-06T18:32:41Z,2021-01-06T18:34:11Z,OWNER,"This parameter will return the URL path, with querystring arguments, to the HTML version of the page - e.g. `/github/issue_comments` or `/github/issue_comments?_sort_desc=created_at` Open questions: - What should it be called? `path` could be misleading since it also includes the querystring. - Should I provide a `url` or `full_url` version which includes `https://blah.com/...`?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks, https://github.com/simonw/datasette/issues/782#issuecomment-755484384,https://api.github.com/repos/simonw/datasette/issues/782,755484384,MDEyOklzc3VlQ29tbWVudDc1NTQ4NDM4NA==,9599,simonw,2021-01-06T18:31:14Z,2021-01-06T18:31:57Z,OWNER,"In building https://latest-with-plugins.datasette.io/github/issue_comments.Notebook?_labels=on I discovered the following patterns for importing data into both Pandas and Observable/d3: ```python import pandas df = pandas.read_json( ""https://latest-with-plugins.datasette.io/github/issue_comments.json?_shape=array"" ) ``` And: ```javascript d3 = require(""d3@5"") rows = d3.json( ""https://latest-with-plugins.datasette.io/github/issue_comments.json?_shape=array"" ) ``` Once again I find myself torn on the best possible default. A list of JSON objects is instantly compatible with both `pandas.read_json()` and `d3.json()` - but it leaves nowhere to put the extra information like pagination and suchlike! Even given this I still think the correct default is an object with `""rows""`, `""total""` and `""next_url""` keys. I should commit to that and implement it - this thought exercise has been running for far too long.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879,Redesign default .json format, https://github.com/simonw/datasette/issues/1178#issuecomment-755476820,https://api.github.com/repos/simonw/datasette/issues/1178,755476820,MDEyOklzc3VlQ29tbWVudDc1NTQ3NjgyMA==,9599,simonw,2021-01-06T18:24:47Z,2021-01-06T18:24:47Z,OWNER,"Issue fixed - https://latest-with-plugins.datasette.io/github/issue_comments.Notebook?_labels=on displays the correct schemes now. I can't think of a reason anyone on Cloud Run would ever NOT want the `force_https_urls` option, but just in case I've made it so if you pass `--extra-options --setting force_https_urls off` to `publish cloudrun` your setting will be respected. https://github.com/simonw/datasette/blob/97fb10c17dd007a275ab743742e93e932335ad67/datasette/publish/cloudrun.py#L105-L110","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1178#issuecomment-755468795,https://api.github.com/repos/simonw/datasette/issues/1178,755468795,MDEyOklzc3VlQ29tbWVudDc1NTQ2ODc5NQ==,9599,simonw,2021-01-06T18:14:35Z,2021-01-06T18:14:35Z,OWNER,Deploying that change now to test it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1178#issuecomment-755163886,https://api.github.com/repos/simonw/datasette/issues/1178,755163886,MDEyOklzc3VlQ29tbWVudDc1NTE2Mzg4Ng==,9599,simonw,2021-01-06T08:37:51Z,2021-01-06T08:37:51Z,OWNER,"Easiest fix would be for `publish cloudrun` to set `force_https_urls`: `datasette publish now` used to do this: https://github.com/simonw/datasette/blob/07e208cc6d9e901b87552c1be2854c220b3f9b6d/datasette/publish/now.py#L59-L63","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1179#issuecomment-755161574,https://api.github.com/repos/simonw/datasette/issues/1179,755161574,MDEyOklzc3VlQ29tbWVudDc1NTE2MTU3NA==,9599,simonw,2021-01-06T08:32:31Z,2021-01-06T08:32:31Z,OWNER,An optional `path` argument to https://docs.datasette.io/en/stable/plugin_hooks.html#register-output-renderer-datasette which shows the path WITHOUT the `.Notebook` extension would be useful here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780278550,Make original path available to render hooks, https://github.com/simonw/datasette/issues/1178#issuecomment-755160187,https://api.github.com/repos/simonw/datasette/issues/1178,755160187,MDEyOklzc3VlQ29tbWVudDc1NTE2MDE4Nw==,9599,simonw,2021-01-06T08:29:35Z,2021-01-06T08:29:35Z,OWNER,"https://latest-with-plugins.datasette.io/-/asgi-scope ``` {'asgi': {'spec_version': '2.1', 'version': '3.0'}, 'client': ('169.254.8.129', 54971), 'headers': [(b'host', b'latest-with-plugins.datasette.io'), (b'user-agent', b'Mozilla/5.0 (Macintosh; Intel Mac OS X 10.15; rv:84.0) Gecko' b'/20100101 Firefox/84.0'), (b'accept', b'text/html,application/xhtml+xml,application/xml;q=0.9,image/' b'webp,*/*;q=0.8'), (b'accept-language', b'en-US,en;q=0.5'), (b'dnt', b'1'), (b'cookie', b'_ga_LL6M7BK6D4=GS1.1.1609886546.49.1.1609886923.0; _ga=GA1.1' b'.894633707.1607575712'), (b'upgrade-insecure-requests', b'1'), (b'x-client-data', b'CgSL6ZsV'), (b'x-cloud-trace-context', b'e776af843c657d2a3da28a73b726e6fe/14187666787557102189;o=1'), (b'x-forwarded-for', b'148.64.98.14'), (b'x-forwarded-proto', b'https'), (b'forwarded', b'for=""148.64.98.14"";proto=https'), (b'accept-encoding', b'gzip, deflate, br'), (b'content-length', b'0')], 'http_version': '1.1', 'method': 'GET', 'path': '/-/asgi-scope', 'query_string': b'', 'raw_path': b'/-/asgi-scope', 'root_path': '', 'scheme': 'http', 'server': ('169.254.8.130', 8080), 'type': 'http'} ``` Note the `'scheme': 'http'` but also the `(b'x-forwarded-proto', b'https')`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1176#issuecomment-755159583,https://api.github.com/repos/simonw/datasette/issues/1176,755159583,MDEyOklzc3VlQ29tbWVudDc1NTE1OTU4Mw==,9599,simonw,2021-01-06T08:28:20Z,2021-01-06T08:28:20Z,OWNER,I used `from datasette.utils import path_with_format` in https://github.com/simonw/datasette-export-notebook/blob/0.1/datasette_export_notebook/__init__.py just now.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions", https://github.com/simonw/datasette/issues/1178#issuecomment-755158310,https://api.github.com/repos/simonw/datasette/issues/1178,755158310,MDEyOklzc3VlQ29tbWVudDc1NTE1ODMxMA==,9599,simonw,2021-01-06T08:25:31Z,2021-01-06T08:25:31Z,OWNER,Moving this to the Datasette repo.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1178#issuecomment-755157732,https://api.github.com/repos/simonw/datasette/issues/1178,755157732,MDEyOklzc3VlQ29tbWVudDc1NTE1NzczMg==,9599,simonw,2021-01-06T08:24:12Z,2021-01-06T08:24:12Z,OWNER,https://latest-with-plugins.datasette.io/fixtures/sortable.json has the bug too - the `next_url` is `http://` when it should be `https://`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1178#issuecomment-755157281,https://api.github.com/repos/simonw/datasette/issues/1178,755157281,MDEyOklzc3VlQ29tbWVudDc1NTE1NzI4MQ==,9599,simonw,2021-01-06T08:23:14Z,2021-01-06T08:23:14Z,OWNER,"https://latest-with-plugins.datasette.io/-/settings says `""force_https_urls"": false`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1178#issuecomment-755157066,https://api.github.com/repos/simonw/datasette/issues/1178,755157066,MDEyOklzc3VlQ29tbWVudDc1NTE1NzA2Ng==,9599,simonw,2021-01-06T08:22:47Z,2021-01-06T08:22:47Z,OWNER,"Weird... https://github.com/simonw/datasette/blob/a882d679626438ba0d809944f06f239bcba8ee96/datasette/app.py#L609-L613 ```python def absolute_url(self, request, path): url = urllib.parse.urljoin(request.url, path) if url.startswith(""http://"") and self.setting(""force_https_urls""): url = ""https://"" + url[len(""http://"") :] return url ``` That looks like it should work. Needs more digging.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1178#issuecomment-755156606,https://api.github.com/repos/simonw/datasette/issues/1178,755156606,MDEyOklzc3VlQ29tbWVudDc1NTE1NjYwNg==,9599,simonw,2021-01-06T08:21:49Z,2021-01-06T08:21:49Z,OWNER,"https://github.com/simonw/datasette-export-notebook/blob/aec398eab4f34791d240d7bc47b6eec575b357be/datasette_export_notebook/__init__.py#L18-L23 Maybe this is a bug in `datasette.absolute_url`? Perhaps it doesn't take the scheme into account.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",780267857,Use force_https_urls on when deploying with Cloud Run, https://github.com/simonw/datasette/issues/1101#issuecomment-755134771,https://api.github.com/repos/simonw/datasette/issues/1101,755134771,MDEyOklzc3VlQ29tbWVudDc1NTEzNDc3MQ==,9599,simonw,2021-01-06T07:28:01Z,2021-01-06T07:28:01Z,OWNER,"With this structure it will become possible to stream non-newline-delimited JSON array-of-objects too - the `stream_rows()` method could output `[` first, then each row followed by a comma, then `]` after the very last row.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749283032,register_output_renderer() should support streaming data, https://github.com/simonw/datasette/issues/1101#issuecomment-755133937,https://api.github.com/repos/simonw/datasette/issues/1101,755133937,MDEyOklzc3VlQ29tbWVudDc1NTEzMzkzNw==,9599,simonw,2021-01-06T07:25:48Z,2021-01-06T07:26:43Z,OWNER,"Idea: instead of returning a dictionary, `register_output_renderer` could return an object. The object could have the following properties: - `.extension` - the extension to use - `.can_render(...)` - says if it can render this - `.can_stream(...)` - says if streaming is supported - `async .stream_rows(rows_iterator, send)` - method that loops through all rows and uses `send` to send them to the response in the correct format I can then deprecate the existing `dict` return type for 1.0.","{""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749283032,register_output_renderer() should support streaming data, https://github.com/simonw/datasette/issues/1101#issuecomment-755128038,https://api.github.com/repos/simonw/datasette/issues/1101,755128038,MDEyOklzc3VlQ29tbWVudDc1NTEyODAzOA==,9599,simonw,2021-01-06T07:10:22Z,2021-01-06T07:10:22Z,OWNER,"Yet another use-case for this: I want to be able to stream newline-delimited JSON in order to better import into Pandas: pandas.read_json(""https://latest.datasette.io/fixtures/compound_three_primary_keys.json?_shape=array&_nl=on"", lines=True)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749283032,register_output_renderer() should support streaming data, https://github.com/simonw/datasette/issues/1171#issuecomment-754958998,https://api.github.com/repos/simonw/datasette/issues/1171,754958998,MDEyOklzc3VlQ29tbWVudDc1NDk1ODk5OA==,9599,simonw,2021-01-05T23:16:33Z,2021-01-05T23:16:33Z,OWNER,"That's really useful, thanks @rcoup ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables, https://github.com/simonw/datasette/issues/782#issuecomment-754958610,https://api.github.com/repos/simonw/datasette/issues/782,754958610,MDEyOklzc3VlQ29tbWVudDc1NDk1ODYxMA==,9599,simonw,2021-01-05T23:15:24Z,2021-01-05T23:15:24Z,OWNER,https://latest-with-plugins.datasette.io/fixtures/roadside_attraction_characteristics/1.json-preview returns a 500 error at the moment - a KeyError on 'filtered_table_rows_count'.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879,Redesign default .json format, https://github.com/simonw/datasette/issues/576#issuecomment-754957658,https://api.github.com/repos/simonw/datasette/issues/576,754957658,MDEyOklzc3VlQ29tbWVudDc1NDk1NzY1OA==,9599,simonw,2021-01-05T23:12:50Z,2021-01-05T23:12:50Z,OWNER,See https://docs.datasette.io/en/stable/internals.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",497170355,Documented internals API for use in plugins, https://github.com/simonw/datasette/issues/576#issuecomment-754957563,https://api.github.com/repos/simonw/datasette/issues/576,754957563,MDEyOklzc3VlQ29tbWVudDc1NDk1NzU2Mw==,9599,simonw,2021-01-05T23:12:37Z,2021-01-05T23:12:37Z,OWNER,"I'm happy with how this has evolved, so I'm closing the issue.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",497170355,Documented internals API for use in plugins, https://github.com/simonw/datasette/issues/1176#issuecomment-754957378,https://api.github.com/repos/simonw/datasette/issues/1176,754957378,MDEyOklzc3VlQ29tbWVudDc1NDk1NzM3OA==,9599,simonw,2021-01-05T23:12:03Z,2021-01-05T23:12:03Z,OWNER,This needs to be done for Datasette 1.0. At the very least I need to ensure it's clear that `datasette.utils` is not part of the public API unless explicitly marked as such.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions", https://github.com/simonw/datasette/issues/1176#issuecomment-754952146,https://api.github.com/repos/simonw/datasette/issues/1176,754952146,MDEyOklzc3VlQ29tbWVudDc1NDk1MjE0Ng==,9599,simonw,2021-01-05T22:57:26Z,2021-01-05T22:57:26Z,OWNER,Known public APIs might be worth adding type annotations to as well.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions", https://github.com/simonw/datasette/issues/1176#issuecomment-754952040,https://api.github.com/repos/simonw/datasette/issues/1176,754952040,MDEyOklzc3VlQ29tbWVudDc1NDk1MjA0MA==,9599,simonw,2021-01-05T22:57:09Z,2021-01-05T22:57:09Z,OWNER,It might be neater to move all of the non-public functions into a separate module - `datasette.utils.internal` perhaps.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions", https://github.com/simonw/datasette/issues/1176#issuecomment-754951786,https://api.github.com/repos/simonw/datasette/issues/1176,754951786,MDEyOklzc3VlQ29tbWVudDc1NDk1MTc4Ng==,9599,simonw,2021-01-05T22:56:27Z,2021-01-05T22:56:43Z,OWNER,"Idea: introduce a `@documented` decorator which marks specific functions as part of the public, documented API. The unit tests can then confirm that anything with that decorator is both documented and tested. ```python @documented def escape_css_string(s): return _css_re.sub( lambda m: ""\\"" + (f""{ord(m.group()):X}"".zfill(6)), s.replace(""\r\n"", ""\n""), ) ``` Or maybe `@public`?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779691739,"Policy on documenting ""public"" datasette.utils functions", https://github.com/simonw/datasette/issues/1171#issuecomment-754911290,https://api.github.com/repos/simonw/datasette/issues/1171,754911290,MDEyOklzc3VlQ29tbWVudDc1NDkxMTI5MA==,59874,rcoup,2021-01-05T21:31:15Z,2021-01-05T21:31:15Z,NONE,"We did this for [Sno](https://sno.earth) under macOS — it's a PyInstaller binary/setup which uses [Packages](http://s.sudre.free.fr/Software/Packages/about.html) for packaging. * [Building & Signing](https://github.com/koordinates/sno/blob/master/platforms/Makefile#L67-L95) * [Packaging & Notarizing](https://github.com/koordinates/sno/blob/master/platforms/Makefile#L121-L215) * [Github Workflow](https://github.com/koordinates/sno/blob/master/.github/workflows/build.yml#L228-L269) has the CI side of it FYI (if you ever get to it) for Windows you need to get a code signing certificate. And if you want automated CI, you'll want to get an ""EV CodeSigning for HSM"" certificate from GlobalSign, which then lets you put the certificate into Azure Key Vault. Which you can use with [azuresigntool](https://github.com/vcsjones/AzureSignTool) to sign your code & installer. (Non-EV certificates are a waste of time, the user still gets big warnings at install time). ","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables, https://github.com/dogsheep/twitter-to-sqlite/issues/54#issuecomment-754729035,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/54,754729035,MDEyOklzc3VlQ29tbWVudDc1NDcyOTAzNQ==,21148,jacobian,2021-01-05T16:03:29Z,2021-01-05T16:03:29Z,CONTRIBUTOR,"I was able to fix this, at least enough to get _my_ archive to import. Not sure if there's more work to be done here or not.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779088071,Archive import appears to be broken on recent exports, https://github.com/dogsheep/twitter-to-sqlite/pull/55#issuecomment-754728696,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/55,754728696,MDEyOklzc3VlQ29tbWVudDc1NDcyODY5Ng==,21148,jacobian,2021-01-05T16:02:55Z,2021-01-05T16:02:55Z,CONTRIBUTOR,"This now works for me, though I'm entirely ensure if it's a just-my-export thing or a wider issue. Also, this doesn't contain any tests. So I'm not sure if there's more work to be done here, or if this is good enough.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779211940,Fix archive imports, https://github.com/dogsheep/twitter-to-sqlite/issues/54#issuecomment-754721153,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/54,754721153,MDEyOklzc3VlQ29tbWVudDc1NDcyMTE1Mw==,21148,jacobian,2021-01-05T15:51:09Z,2021-01-05T15:51:09Z,CONTRIBUTOR,"Correction: the failure is on `lists-member.js` (I was thrown by the `block` variable name, but that's just a coincidence)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779088071,Archive import appears to be broken on recent exports, https://github.com/simonw/datasette/issues/1175#issuecomment-754696725,https://api.github.com/repos/simonw/datasette/issues/1175,754696725,MDEyOklzc3VlQ29tbWVudDc1NDY5NjcyNQ==,9599,simonw,2021-01-05T15:12:30Z,2021-01-05T15:12:30Z,OWNER,Some tips here: https://github.com/tiangolo/fastapi/issues/78,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779156520,Use structlog for logging, https://github.com/simonw/datasette/issues/1167#issuecomment-754619930,https://api.github.com/repos/simonw/datasette/issues/1167,754619930,MDEyOklzc3VlQ29tbWVudDc1NDYxOTkzMA==,3637,benpickles,2021-01-05T12:57:57Z,2021-01-05T12:57:57Z,CONTRIBUTOR,"Not sure where exactly to put the actual docs (presumably somewhere in [docs/contributing.rst](https://github.com/simonw/datasette/blob/main/docs/contributing.rst)) but I've made a slight change to make it easier to run locally (copying [the approach in excalidraw](https://github.com/excalidraw/excalidraw/blob/ade2565f497243a5e428f4906d8ed80c872fd981/package.json#L90-L94)): https://github.com/simonw/datasette/compare/main...benpickles:prettier-docs ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777145954,Add Prettier to contributing documentation, https://github.com/simonw/datasette/issues/859#issuecomment-647922203,https://api.github.com/repos/simonw/datasette/issues/859,647922203,MDEyOklzc3VlQ29tbWVudDY0NzkyMjIwMw==,3243482,abdusco,2020-06-23T05:44:58Z,2021-01-05T08:22:43Z,CONTRIBUTOR,"I'm seeing the problem on database page. Index page and table page runs quite fast. - Tables have <10 columns (`id`, `url`, `title`, `body_html`, `date`, `author`, `meta` (for keeping unstructured json)). I've added index on `date` columns (using `sqlite-utils`) in addition to the index present on `id` columns. - All tables have FTS enabled on `text` and `varchar` columns (`title`, `body_html` etc) to speed up searching. - There are couple of tables related with foreign keys (think a thread in a forum and posts in that thread, related with `thread_id`) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",642572841,Database page loads too slowly with many large tables (due to table counts), https://github.com/simonw/datasette/issues/1173#issuecomment-754463845,https://api.github.com/repos/simonw/datasette/issues/1173,754463845,MDEyOklzc3VlQ29tbWVudDc1NDQ2Mzg0NQ==,9599,simonw,2021-01-05T07:41:43Z,2021-01-05T07:41:43Z,OWNER,"https://github.com/oleksis/pyinstaller-manylinux looks useful, via https://twitter.com/oleksis/status/1346341987876823040","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778682317,GitHub Actions workflow to build manylinux binary, https://github.com/simonw/datasette/issues/1171#issuecomment-754296761,https://api.github.com/repos/simonw/datasette/issues/1171,754296761,MDEyOklzc3VlQ29tbWVudDc1NDI5Njc2MQ==,9599,simonw,2021-01-04T23:55:44Z,2021-01-04T23:55:44Z,OWNER,Bit uncomfortable that it looks like you need to include your Apple ID username and password in the CI configuration to do this. I'll use GitHub Secrets for this but I don't like it - I'll definitely setup a dedicated code signing account that's not my access-to-everything AppleID for this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables, https://github.com/simonw/datasette/issues/1171#issuecomment-754295380,https://api.github.com/repos/simonw/datasette/issues/1171,754295380,MDEyOklzc3VlQ29tbWVudDc1NDI5NTM4MA==,9599,simonw,2021-01-04T23:54:32Z,2021-01-04T23:54:32Z,OWNER,"https://github.com/search?l=YAML&q=gon+json&type=Code reveals some examples of people using `gon` in workflows. These look useful: * https://github.com/coherence/hub-server/blob/3b7e9c7c5bce9e244b14b854f1f89d66f53a5a39/.github/workflows/release_build.yml * https://github.com/simoncozens/pilcrow/blob/5abc145e7fb9577086afe47b48fd730cb8195386/.github/workflows/buildapp.yaml","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables, https://github.com/simonw/datasette/issues/1171#issuecomment-754287882,https://api.github.com/repos/simonw/datasette/issues/1171,754287882,MDEyOklzc3VlQ29tbWVudDc1NDI4Nzg4Mg==,9599,simonw,2021-01-04T23:40:10Z,2021-01-04T23:42:32Z,OWNER,"This looks VERY useful: https://github.com/mitchellh/gon - "" Sign, notarize, and package macOS CLI tools and applications written in any language. Available as both a CLI and a Go library."" And it installs like this: brew install mitchellh/gon/gon","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables, https://github.com/simonw/datasette/issues/1171#issuecomment-754286783,https://api.github.com/repos/simonw/datasette/issues/1171,754286783,MDEyOklzc3VlQ29tbWVudDc1NDI4Njc4Mw==,9599,simonw,2021-01-04T23:38:18Z,2021-01-04T23:38:18Z,OWNER,Oh wow maybe I need to Notarize it too? https://developer.apple.com/documentation/xcode/notarizing_macos_software_before_distribution,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables, https://github.com/simonw/datasette/issues/1171#issuecomment-754286618,https://api.github.com/repos/simonw/datasette/issues/1171,754286618,MDEyOklzc3VlQ29tbWVudDc1NDI4NjYxOA==,9599,simonw,2021-01-04T23:37:45Z,2021-01-04T23:37:45Z,OWNER,https://github.com/actions/virtual-environments/issues/1820#issuecomment-719549887 looks useful - not sure if those notes are for iOS or macOS though.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables, https://github.com/simonw/datasette/issues/93#issuecomment-754285795,https://api.github.com/repos/simonw/datasette/issues/93,754285795,MDEyOklzc3VlQ29tbWVudDc1NDI4NTc5NQ==,9599,simonw,2021-01-04T23:35:13Z,2021-01-04T23:35:13Z,OWNER,Next step is to automate this all!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/1152#issuecomment-754285588,https://api.github.com/repos/simonw/datasette/issues/1152,754285588,MDEyOklzc3VlQ29tbWVudDc1NDI4NTU4OA==,9599,simonw,2021-01-04T23:34:30Z,2021-01-04T23:34:30Z,OWNER,"I think the way to do this is to have a new plugin hook that returns two SQL where clauses: one returning a list of resources that the user should be able to access (the allow-list) and one returning a list of resources they are explicitly forbidden from accessing (the deny-list). Either of these can be blank. Datasette can then combine those into a full SQL query and use it to answer the question ""show me a list of resources that the user is allowed to perform action X on"". It can also answer the existing question, ""is user X allowed to perform action Y on resource Z""?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770598024,Efficiently calculate list of databases/tables a user can view, https://github.com/simonw/datasette/issues/93#issuecomment-754233960,https://api.github.com/repos/simonw/datasette/issues/93,754233960,MDEyOklzc3VlQ29tbWVudDc1NDIzMzk2MA==,9599,simonw,2021-01-04T21:35:37Z,2021-01-04T21:35:37Z,OWNER,"I tested it by running a `tmate` session on the GitHub macOS machines, and it worked! ``` Mac-1609795972770:tmp runner$ wget 'https://github.com/simonw/datasette/releases/download/0.53/datasette-0.53-macos-binary.zip' --2021-01-04 21:34:10-- https://github.com/simonw/datasette/releases/download/0.53/datasette-0.53-macos-binary.zip Resolving github.com (github.com)... 140.82.114.4 Connecting to github.com (github.com)|140.82.114.4|:443... connected. HTTP request sent, awaiting response... 302 Found Location: https://github-production-release-asset-2e65be.s3.amazonaws.com/107914493/74658700-4e90-11eb-8f3b-ee77e6dfad90?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20210104%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20210104T213414Z&X-Amz-Expires=300&X-Amz-Signature=6f3c54211077092553590b33a7c36cd052895c9d4619607ad1df094782f64acf&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=107914493&response-content-disposition=attachment%3B%20filename%3Ddatasette-0.53-macos-binary.zip&response-content-type=application%2Foctet-stream [following] --2021-01-04 21:34:14-- https://github-production-release-asset-2e65be.s3.amazonaws.com/107914493/74658700-4e90-11eb-8f3b-ee77e6dfad90?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20210104%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20210104T213414Z&X-Amz-Expires=300&X-Amz-Signature=6f3c54211077092553590b33a7c36cd052895c9d4619607ad1df094782f64acf&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=107914493&response-content-disposition=attachment%3B%20filename%3Ddatasette-0.53-macos-binary.zip&response-content-type=application%2Foctet-stream Resolving github-production-release-asset-2e65be.s3.amazonaws.com (github-production-release-asset-2e65be.s3.amazonaws.com)... 52.217.43.164 Connecting to github-production-release-asset-2e65be.s3.amazonaws.com (github-production-release-asset-2e65be.s3.amazonaws.com)|52.217.43.164|:443... connected. HTTP request sent, awaiting response... 200 OK Length: 8297283 (7.9M) [application/octet-stream] Saving to: ‘datasette-0.53-macos-binary.zip’ datasette-0.53-maco 100%[===================>] 7.91M --.-KB/s in 0.1s 2021-01-04 21:34:14 (73.4 MB/s) - ‘datasette-0.53-macos-binary.zip’ saved [8297283/8297283] Mac-1609795972770:tmp runner$ unzip datasette-0.53-macos-binary.zip Archive: datasette-0.53-macos-binary.zip creating: datasette-0.53-macos-binary/ inflating: datasette-0.53-macos-binary/datasette Mac-1609795972770:tmp runner$ datasette-0.53-macos-binary/datasette --help Usage: datasette [OPTIONS] COMMAND [ARGS]... Datasette! Options: --version Show the version and exit. --help Show this message and exit. Commands: serve* Serve up specified SQLite database files with a web UI inspect install Install Python packages - e.g. package Package specified SQLite files into a new datasette Docker... plugins List currently available plugins publish Publish specified SQLite database files to the internet along... uninstall Uninstall Python packages (e.g. Mac-1609795972770:tmp runner$ datasette-0.53-macos-binary/datasette --get /-/versions.json {""python"": {""version"": ""3.9.1"", ""full"": ""3.9.1 (default, Dec 10 2020, 10:36:35) \n[Clang 12.0.0 (clang-1200.0.32.27)]""}, ""datasette"": {""version"": ""0.53""}, ""asgi"": ""3.0"", ""uvicorn"": ""0.13.3"", ""sqlite"": {""version"": ""3.34.0"", ""fts_versions"": [""FTS5"", ""FTS4"", ""FTS3""], ""extensions"": {""json1"": null}, ""compile_options"": [""COMPILER=clang-12.0.0"", ""ENABLE_COLUMN_METADATA"", ""ENABLE_FTS3"", ""ENABLE_FTS3_PARENTHESIS"", ""ENABLE_FTS4"", ""ENABLE_FTS5"", ""ENABLE_GEOPOLY"", ""ENABLE_JSON1"", ""ENABLE_PREUPDATE_HOOK"", ""ENABLE_RTREE"", ""ENABLE_SESSION"", ""MAX_VARIABLE_NUMBER=250000"", ""THREADSAFE=1""]}} Mac-1609795972770:tmp runner$ ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754229977,https://api.github.com/repos/simonw/datasette/issues/93,754229977,MDEyOklzc3VlQ29tbWVudDc1NDIyOTk3Nw==,9599,simonw,2021-01-04T21:28:01Z,2021-01-04T21:28:01Z,OWNER,"As an experiment, I put the macOS one in a zip file and attached it to the latest release: ``` mkdir datasette-0.53-macos-binary cp dist/datasette datasette-0.53-macos-binary zip -r datasette-0.53-macos-binary.zip datasette-0.53-macos-binary ``` It's available here: https://github.com/simonw/datasette/releases/tag/0.53 - download URL is https://github.com/simonw/datasette/releases/download/0.53/datasette-0.53-macos-binary.zip","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754227543,https://api.github.com/repos/simonw/datasette/issues/93,754227543,MDEyOklzc3VlQ29tbWVudDc1NDIyNzU0Mw==,9599,simonw,2021-01-04T21:23:13Z,2021-01-04T21:23:13Z,OWNER,"``` (pyinstaller-venv) root@dogsheep:/tmp/pyinstaller-venv# dist/datasette --get /-/databases.json [{""name"": "":memory:"", ""path"": null, ""size"": 0, ""is_mutable"": true, ""is_memory"": true, ""hash"": null}] (pyinstaller-venv) root@dogsheep:/tmp/pyinstaller-venv# ls -lah dist/datasette -rwxr-xr-x 1 root root 8.9M Jan 4 21:05 dist/datasette ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754219002,https://api.github.com/repos/simonw/datasette/issues/93,754219002,MDEyOklzc3VlQ29tbWVudDc1NDIxOTAwMg==,9599,simonw,2021-01-04T21:06:49Z,2021-01-04T21:22:27Z,OWNER,"Works on Linux/Ubuntu too, except I had to do `export BASE=` on a separate line. I also did this: ``` apt-get install python3 python3-venv python3 -m venv pyinstaller-venv source pyinstaller-venv/bin/activate pip install wheel pip install datasette pyinstaller export DATASETTE_BASE=$(python -c 'import os; print(os.path.dirname(__import__(""datasette"").__file__))') pyinstaller -F \ --add-data ""$DATASETTE_BASE/templates:datasette/templates"" \ --add-data ""$DATASETTE_BASE/static:datasette/static"" \ --hidden-import datasette.publish \ --hidden-import datasette.publish.heroku \ --hidden-import datasette.publish.cloudrun \ --hidden-import datasette.facets \ --hidden-import datasette.sql_functions \ --hidden-import datasette.actor_auth_cookie \ --hidden-import datasette.default_permissions \ --hidden-import datasette.default_magic_parameters \ --hidden-import datasette.blob_renderer \ --hidden-import datasette.default_menu_links \ --hidden-import uvicorn \ --hidden-import uvicorn.logging \ --hidden-import uvicorn.loops \ --hidden-import uvicorn.loops.auto \ --hidden-import uvicorn.protocols \ --hidden-import uvicorn.protocols.http \ --hidden-import uvicorn.protocols.http.auto \ --hidden-import uvicorn.protocols.websockets \ --hidden-import uvicorn.protocols.websockets.auto \ --hidden-import uvicorn.lifespan \ --hidden-import uvicorn.lifespan.on \ $(which datasette) ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754218545,https://api.github.com/repos/simonw/datasette/issues/93,754218545,MDEyOklzc3VlQ29tbWVudDc1NDIxODU0NQ==,9599,simonw,2021-01-04T21:05:57Z,2021-01-04T21:05:57Z,OWNER,That BASE= trick seems to work with `zsh` but not with `bash`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754215392,https://api.github.com/repos/simonw/datasette/issues/93,754215392,MDEyOklzc3VlQ29tbWVudDc1NDIxNTM5Mg==,9599,simonw,2021-01-04T20:59:20Z,2021-01-04T21:03:14Z,OWNER,"Updated `pyinstaller` recipe - lots of hidden imports needed now: ``` pip install wheel pip install datasette pyinstaller BASE=$(python -c 'import os; print(os.path.dirname(__import__(""datasette"").__file__))') \ pyinstaller -F \ --add-data ""$BASE/templates:datasette/templates"" \ --add-data ""$BASE/static:datasette/static"" \ --hidden-import datasette.publish \ --hidden-import datasette.publish.heroku \ --hidden-import datasette.publish.cloudrun \ --hidden-import datasette.facets \ --hidden-import datasette.sql_functions \ --hidden-import datasette.actor_auth_cookie \ --hidden-import datasette.default_permissions \ --hidden-import datasette.default_magic_parameters \ --hidden-import datasette.blob_renderer \ --hidden-import datasette.default_menu_links \ --hidden-import uvicorn \ --hidden-import uvicorn.logging \ --hidden-import uvicorn.loops \ --hidden-import uvicorn.loops.auto \ --hidden-import uvicorn.protocols \ --hidden-import uvicorn.protocols.http \ --hidden-import uvicorn.protocols.http.auto \ --hidden-import uvicorn.protocols.websockets \ --hidden-import uvicorn.protocols.websockets.auto \ --hidden-import uvicorn.lifespan \ --hidden-import uvicorn.lifespan.on \ $(which datasette) ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754215793,https://api.github.com/repos/simonw/datasette/issues/93,754215793,MDEyOklzc3VlQ29tbWVudDc1NDIxNTc5Mw==,9599,simonw,2021-01-04T21:00:14Z,2021-01-04T21:00:14Z,OWNER,"``` (pyinstaller-datasette) pyinstaller-datasette % file dist/datasette dist/datasette: Mach-O 64-bit executable x86_64 (pyinstaller-datasette) pyinstaller-datasette % ls -lah dist/datasette -rwxr-xr-x 1 simon wheel 8.0M Jan 4 12:58 dist/datasette ``` I'm surprised it's only 8MB!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/983#issuecomment-754210356,https://api.github.com/repos/simonw/datasette/issues/983,754210356,MDEyOklzc3VlQ29tbWVudDc1NDIxMDM1Ng==,222245,carlmjohnson,2021-01-04T20:49:05Z,2021-01-04T20:49:05Z,NONE,"For reasons [I've written about elsewhere](https://blog.carlmjohnson.net/post/2020/time-to-kill-ie11/), I'm in favor of modules. It has several beneficial effects. One, old browsers just ignore it all together. Two, if you include the same plain script on the page more than once, it will be executed twice, but if you include the same module script on a page twice, it will only execute once. Three, you get a module local namespace, instead of having to use the global window namespace or a function private namespace. OTOH, if you are going to use an old style script, the code from before isn't ideal, because you wipe out your registry if the script it included more than once. Also you may as well use object methods and splat arguments. The event based architecture probably makes more sense though. Just make up some event names prefixed with `datasette:` and listen for them on the root. The only concern with that approach is it can sometimes be tricky to make sure your plugins are run after datasette has run. Maybe ```js function mycallback(){ // whatever } if (window.datasette) { window.datasette.init(mycallback); } else { document.addEventListener('datasette:init', mycallback); } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/668#issuecomment-754194996,https://api.github.com/repos/simonw/datasette/issues/668,754194996,MDEyOklzc3VlQ29tbWVudDc1NDE5NDk5Ng==,9599,simonw,2021-01-04T20:18:39Z,2021-01-04T20:18:39Z,OWNER,I fixed this in #1115 - you can run `--load-extension=spatialite` now and it will look for the extension in common places.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",563347679,Make it easier to load SpatiaLite, https://github.com/simonw/datasette/issues/436#issuecomment-754193501,https://api.github.com/repos/simonw/datasette/issues/436,754193501,MDEyOklzc3VlQ29tbWVudDc1NDE5MzUwMQ==,9599,simonw,2021-01-04T20:15:41Z,2021-01-04T20:15:41Z,OWNER,"Sadly `publish.datasettes.com` was broken by changes to Zeit, and I don't think I'll be bringing it back.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",435819321,400 Error when trying to register new user via https://publish.datasettes.com/, https://github.com/simonw/datasette/issues/371#issuecomment-754192873,https://api.github.com/repos/simonw/datasette/issues/371,754192873,MDEyOklzc3VlQ29tbWVudDc1NDE5Mjg3Mw==,9599,simonw,2021-01-04T20:14:28Z,2021-01-04T20:14:28Z,OWNER,"Now that Digital Ocean has App Platform this is less necessary, especially since the documentation covers how to use App Platform here: https://docs.datasette.io/en/stable/deploying.html#deploying-using-buildpacks","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377156339,datasette publish digitalocean plugin, https://github.com/simonw/datasette/issues/102#issuecomment-754192267,https://api.github.com/repos/simonw/datasette/issues/102,754192267,MDEyOklzc3VlQ29tbWVudDc1NDE5MjI2Nw==,9599,simonw,2021-01-04T20:13:19Z,2021-01-04T20:13:19Z,OWNER,"I'm more likely to do Lambda than Elastic Beanstalk, especially now the size limit for Lambdas has been increased as part of their support for Docker.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274264175,datasette publish elasticbeanstalk, https://github.com/simonw/datasette/issues/221#issuecomment-754191699,https://api.github.com/repos/simonw/datasette/issues/221,754191699,MDEyOklzc3VlQ29tbWVudDc1NDE5MTY5OQ==,9599,simonw,2021-01-04T20:12:14Z,2021-01-04T20:12:14Z,OWNER,I'm going to close this. Plugins can register their own CLI tools (see https://github.com/simonw/click-app) if they need to.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315142414,Allow plugins to add new cli sub commands , https://github.com/simonw/datasette/issues/221#issuecomment-754190952,https://api.github.com/repos/simonw/datasette/issues/221,754190952,MDEyOklzc3VlQ29tbWVudDc1NDE5MDk1Mg==,9599,simonw,2021-01-04T20:10:51Z,2021-01-04T20:10:51Z,OWNER,Is this still a good idea? I don't have any pressing need for it at the moment.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315142414,Allow plugins to add new cli sub commands , https://github.com/simonw/datasette/issues/221#issuecomment-754190814,https://api.github.com/repos/simonw/datasette/issues/221,754190814,MDEyOklzc3VlQ29tbWVudDc1NDE5MDgxNA==,9599,simonw,2021-01-04T20:10:34Z,2021-01-04T20:10:34Z,OWNER,"For the `csvs-to-sqlite` case I'm going with `datasette insert` instead, see #1160.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315142414,Allow plugins to add new cli sub commands , https://github.com/simonw/datasette/issues/18#issuecomment-754188383,https://api.github.com/repos/simonw/datasette/issues/18,754188383,MDEyOklzc3VlQ29tbWVudDc1NDE4ODM4Mw==,9599,simonw,2021-01-04T20:05:48Z,2021-01-04T20:05:48Z,OWNER,"I'm not using Sanic any more, but this is still very feasible. If I ever do it I'll write a plugin.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267739593,See if I can get a websockets interface working, https://github.com/simonw/datasette/issues/103#issuecomment-754188099,https://api.github.com/repos/simonw/datasette/issues/103,754188099,MDEyOklzc3VlQ29tbWVudDc1NDE4ODA5OQ==,9599,simonw,2021-01-04T20:05:14Z,2021-01-04T20:05:14Z,OWNER,"Wontfix, Cloud Run is already implemented and is a better fit for Datasette.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274265878,datasette publish appengine, https://github.com/simonw/datasette/issues/913#issuecomment-754187520,https://api.github.com/repos/simonw/datasette/issues/913,754187520,MDEyOklzc3VlQ29tbWVudDc1NDE4NzUyMA==,9599,simonw,2021-01-04T20:04:10Z,2021-01-04T20:04:10Z,OWNER,That's pretty elegant: each plugin gets its own namespace and can register new settings.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",670209331,Mechanism for passing additional options to `datasette my.db` that affect plugins, https://github.com/simonw/datasette/issues/913#issuecomment-754187326,https://api.github.com/repos/simonw/datasette/issues/913,754187326,MDEyOklzc3VlQ29tbWVudDc1NDE4NzMyNg==,9599,simonw,2021-01-04T20:03:50Z,2021-01-04T20:03:50Z,OWNER,"I renamed `--config` to `--setting` and changed it to work like this: datasette --setting sql_time_limit_ms 1000 Note the lack of colons. This actually makes colons cleaner to use for plugins - I could support this: datasette --setting datasette-insert:unsafe 1","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",670209331,Mechanism for passing additional options to `datasette my.db` that affect plugins, https://github.com/simonw/datasette/issues/1111#issuecomment-754184287,https://api.github.com/repos/simonw/datasette/issues/1111,754184287,MDEyOklzc3VlQ29tbWVudDc1NDE4NDI4Nw==,9599,simonw,2021-01-04T19:57:53Z,2021-01-04T19:57:53Z,OWNER,Relevant new feature in sqlite-utils: the ability to use triggers to maintain fast counts. This optimization could help a lot here. https://github.com/simonw/sqlite-utils/issues/212,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",751195017,Accessing a database's `.json` is slow for very large SQLite files, https://github.com/simonw/datasette/issues/1164#issuecomment-754182058,https://api.github.com/repos/simonw/datasette/issues/1164,754182058,MDEyOklzc3VlQ29tbWVudDc1NDE4MjA1OA==,9599,simonw,2021-01-04T19:53:31Z,2021-01-04T19:53:31Z,OWNER,This will be helped by the new `package.json` added in #1170.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette, https://github.com/simonw/datasette/pull/1170#issuecomment-754181646,https://api.github.com/repos/simonw/datasette/issues/1170,754181646,MDEyOklzc3VlQ29tbWVudDc1NDE4MTY0Ng==,9599,simonw,2021-01-04T19:52:40Z,2021-01-04T19:52:40Z,OWNER,Thank you very much!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778126516,Install Prettier via package.json, https://github.com/simonw/datasette/issues/983#issuecomment-754181647,https://api.github.com/repos/simonw/datasette/issues/983,754181647,MDEyOklzc3VlQ29tbWVudDc1NDE4MTY0Nw==,11941245,jussiarpalahti,2021-01-04T19:52:40Z,2021-01-04T19:52:40Z,NONE,"I was thinking JavaScript plugins going with server side template extensions custom HTML. Attach my own widgets on there and listen for Datasette events to refresh when user interacts with main UI. Like a map view or table that updates according to selected column. There's certainly other ways to look at this. Perhaps you could list possible hooks or high level design doc on what would be possible with the plugin system? Re: modules. I would like to see modules supported at least in development. The developer experience is so much better than what JavaScript coding has been in the past. With large parts of NPM at your disposal I’d imagine even less experienced coder can whisk a custom plugin in no time. Proper production build system (like one you get with Pika or Parcel) could package everything up into bundles that older browsers can understand. Though that does come with performance and size penalties alongside the added complexity. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/1169#issuecomment-754007242,https://api.github.com/repos/simonw/datasette/issues/1169,754007242,MDEyOklzc3VlQ29tbWVudDc1NDAwNzI0Mg==,3637,benpickles,2021-01-04T14:29:57Z,2021-01-04T14:29:57Z,CONTRIBUTOR,I somewhat share your reluctance to add a package.json to seemingly every project out there but ultimately if they're project dependencies it's important they're managed within the codebase.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777677671,Prettier package not actually being cached, https://github.com/simonw/datasette/pull/1170#issuecomment-754004715,https://api.github.com/repos/simonw/datasette/issues/1170,754004715,MDEyOklzc3VlQ29tbWVudDc1NDAwNDcxNQ==,3637,benpickles,2021-01-04T14:25:44Z,2021-01-04T14:25:44Z,CONTRIBUTOR,I was going to re-add the filter to only run Prettier when there have been changes in `datasette/static` but that would mean it wouldn't run when the package is updated. That plus the fact that [the last run of the job took only 8 seconds](https://github.com/benpickles/datasette/runs/1640121514) is why I decided not to re-add the filter.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778126516,Install Prettier via package.json, https://github.com/simonw/datasette/pull/1170#issuecomment-754002859,https://api.github.com/repos/simonw/datasette/issues/1170,754002859,MDEyOklzc3VlQ29tbWVudDc1NDAwMjg1OQ==,22429695,codecov[bot],2021-01-04T14:22:52Z,2021-01-04T14:22:52Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=h1) Report > Merging [#1170](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=desc) (a5761cc) into [main](https://codecov.io/gh/simonw/datasette/commit/1e8fa3ac7cb2d6e516c47c306c86ed2334fc3dc0?el=desc) (1e8fa3a) will **not change** coverage. > The diff coverage is `n/a`. [![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1170/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1170 +/- ## ======================================= Coverage 91.55% 91.55% ======================================= Files 32 32 Lines 3932 3932 ======================================= Hits 3600 3600 Misses 332 332 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=footer). Last update [1e8fa3a...a5761cc](https://codecov.io/gh/simonw/datasette/pull/1170?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778126516,Install Prettier via package.json, https://github.com/simonw/datasette/issues/983#issuecomment-753690280,https://api.github.com/repos/simonw/datasette/issues/983,753690280,MDEyOklzc3VlQ29tbWVudDc1MzY5MDI4MA==,9599,simonw,2021-01-03T23:13:30Z,2021-01-03T23:13:30Z,OWNER,"Oh that's interesting, I hadn't thought about plugins firing events - just responding to events fired by the rest of the application.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/sqlite-utils/issues/219#issuecomment-753671902,https://api.github.com/repos/simonw/sqlite-utils/issues/219,753671902,MDEyOklzc3VlQ29tbWVudDc1MzY3MTkwMg==,9599,simonw,2021-01-03T20:31:04Z,2021-01-03T20:32:13Z,OWNER,A `table.has_count_triggers` property.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777707544,reset_counts() method and command, https://github.com/simonw/sqlite-utils/issues/219#issuecomment-753671235,https://api.github.com/repos/simonw/sqlite-utils/issues/219,753671235,MDEyOklzc3VlQ29tbWVudDc1MzY3MTIzNQ==,9599,simonw,2021-01-03T20:25:10Z,2021-01-03T20:25:10Z,OWNER,"To detect tables, look at the names of the triggers - `{table}{counts_table}_insert` and `{table}{counts_table}_delete`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777707544,reset_counts() method and command, https://github.com/simonw/sqlite-utils/issues/219#issuecomment-753671009,https://api.github.com/repos/simonw/sqlite-utils/issues/219,753671009,MDEyOklzc3VlQ29tbWVudDc1MzY3MTAwOQ==,9599,simonw,2021-01-03T20:22:53Z,2021-01-03T20:22:53Z,OWNER,I think this should be accompanied by a `sqlite-utils reset-counts` command.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777707544,reset_counts() method and command, https://github.com/simonw/sqlite-utils/issues/219#issuecomment-753670833,https://api.github.com/repos/simonw/sqlite-utils/issues/219,753670833,MDEyOklzc3VlQ29tbWVudDc1MzY3MDgzMw==,9599,simonw,2021-01-03T20:20:54Z,2021-01-03T20:20:54Z,OWNER,"This is a little tricky. We should assume that the existing values in the `_counts` table cannot be trusted at all when this method is called - so we should probably clear that table entirely and then re-populate it. But that means we need to figure out which tables in the database have the counts triggers defined.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777707544,reset_counts() method and command, https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753668099,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753668099,MDEyOklzc3VlQ29tbWVudDc1MzY2ODA5OQ==,9599,simonw,2021-01-03T19:55:53Z,2021-01-03T19:55:53Z,OWNER,So if you instantiate the `Database()` constructor with `use_counts_table=True` any access to the `.count` properties will go through this table - otherwise regular `count(*)` queries will be executed.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753665521,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753665521,MDEyOklzc3VlQ29tbWVudDc1MzY2NTUyMQ==,9599,simonw,2021-01-03T19:31:33Z,2021-01-03T19:31:33Z,OWNER,"I'm having second thoughts about this being the default behaviour. It's pretty weird. I feel like HUGE databases that need this are rare, so having it on by default doesn't make sense.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753662490,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753662490,MDEyOklzc3VlQ29tbWVudDc1MzY2MjQ5MA==,9599,simonw,2021-01-03T19:05:53Z,2021-01-03T19:05:53Z,OWNER,Idea: a `.execute_count()` method that never uses the cache.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753661292,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753661292,MDEyOklzc3VlQ29tbWVudDc1MzY2MTI5Mg==,9599,simonw,2021-01-03T18:56:06Z,2021-01-03T18:56:23Z,OWNER,"Another option: on creation of the `Database()` object, check to see if the `_counts` table exists and use that as the default for a `use_counts_table` property. Also flip that property to `True` if the user calls `.enable_counts()` at any time.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753661158,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753661158,MDEyOklzc3VlQ29tbWVudDc1MzY2MTE1OA==,9599,simonw,2021-01-03T18:55:16Z,2021-01-03T18:55:16Z,OWNER,"Alternative implementation: provided `db.should_trust_counts` is `True`, try running the query: ```sql select count from _counts where [table] = ? ``` If the query fails to return a result OR throws an error because the table doesn't exist, run the `count(*)` query.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753660814,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753660814,MDEyOklzc3VlQ29tbWVudDc1MzY2MDgxNA==,9599,simonw,2021-01-03T18:53:05Z,2021-01-03T18:53:05Z,OWNER,"Here's the current `.count` property: https://github.com/simonw/sqlite-utils/blob/036ec6d32313487527c66dea613a3e7118b97459/sqlite_utils/db.py#L597-L609 It's implemented on `Queryable` which means it's available on both `Table` and `View` - the optimization doesn't make sense for views. I'm a bit cautious about making that property so much more complex. In order to decide if it should try the `_counts` table first it needs to know: - Should it be trusting the counts? I'm thinking a `.should_trust_counts` property on `Database` which defaults to `True` would be good - then advanced users can turn that off if they know the counts should not be trusted. - Does the `_counts` table exist? - Are the triggers defined? Then it can do the query, and if the query fails it can fall back on the `count(*)`. That's quite a lot of extra activity though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753660379,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753660379,MDEyOklzc3VlQ29tbWVudDc1MzY2MDM3OQ==,9599,simonw,2021-01-03T18:50:15Z,2021-01-03T18:50:15Z,OWNER,"```python def cached_counts(self, tables=None): sql = ""select [table], count from {}"".format(self._counts_table_name) if tables: sql += "" where [table] in ({})"".format("", "".join(""?"" for table in tables)) return {r[0]: r[1] for r in self.execute(sql, tables).fetchall()} ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/206#issuecomment-753659260,https://api.github.com/repos/simonw/sqlite-utils/issues/206,753659260,MDEyOklzc3VlQ29tbWVudDc1MzY1OTI2MA==,9599,simonw,2021-01-03T18:42:01Z,2021-01-03T18:42:01Z,OWNER,"``` % sqlite-utils insert blah.db blah global_power_plant_database.csv Error: Invalid JSON - use --csv for CSV or --tsv for TSV files ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",761915790,sqlite-utils should suggest --csv if JSON parsing fails, https://github.com/simonw/datasette/issues/1169#issuecomment-753657180,https://api.github.com/repos/simonw/datasette/issues/1169,753657180,MDEyOklzc3VlQ29tbWVudDc1MzY1NzE4MA==,9599,simonw,2021-01-03T18:23:30Z,2021-01-03T18:23:30Z,OWNER,"Also welcome in that PR would be a bit of documentation for contributors, see #1167 - but no problem if you leave that out, I'm happy to add it later.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777677671,Prettier package not actually being cached, https://github.com/simonw/datasette/issues/1169#issuecomment-753653260,https://api.github.com/repos/simonw/datasette/issues/1169,753653260,MDEyOklzc3VlQ29tbWVudDc1MzY1MzI2MA==,9599,simonw,2021-01-03T17:54:40Z,2021-01-03T17:54:40Z,OWNER,And @benpickles yes I would land that pull request straight away as-is. Thanks!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777677671,Prettier package not actually being cached, https://github.com/simonw/datasette/issues/1169#issuecomment-753653033,https://api.github.com/repos/simonw/datasette/issues/1169,753653033,MDEyOklzc3VlQ29tbWVudDc1MzY1MzAzMw==,9599,simonw,2021-01-03T17:52:53Z,2021-01-03T17:52:53Z,OWNER,"Oh that's so frustrating! I was worried about that - I spotted a few runs that seemed faster and hoped that it meant that the package was coming out of the `~/.npm` cache, but evidently that's not the case. You've convinced me that Datasette itself should have a `package.json` - the Dependabot argument is a really good one. But... I'd really love to figure out a general pattern for using `npx` scripts in GitHub Actions workflows in a cache-friendly way. I have plenty of other projects that I'd love to run Prettier or Uglify or `puppeteer-cli` in without adding a `package.json` to them. Any ideas? The best I can think of is for the workflow itself to write out a `package.json` file (using `echo '{ ... }' > package.json`) as part of the run - that way the cache should work (I think) but I don't get a misleading `package.json` file sitting in the repo.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777677671,Prettier package not actually being cached, https://github.com/simonw/datasette/issues/983#issuecomment-753600999,https://api.github.com/repos/simonw/datasette/issues/983,753600999,MDEyOklzc3VlQ29tbWVudDc1MzYwMDk5OQ==,475613,MarkusH,2021-01-03T11:11:21Z,2021-01-03T11:11:21Z,NONE,"With regards to JS/Browser events, given your example of menu items that plugins could add, I could imagine this code to work: ```js // as part of datasette datasette.events.AddMenuItem = 'DatasetteAddMenuItemEvent'; document.addEventListener(datasette.events.AddMenuItem, (e) => { // do whatever is needed to add the menu item. Data comes from `e` alert(e.title + ' ' + e.link); }); // as part of a plugin const event = new Event(datasette.events.AddMenuItem, {link: '/foo/bar', title: 'Go somewhere'}); Document.dispatchEvent(event) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-753587963,https://api.github.com/repos/simonw/datasette/issues/983,753587963,MDEyOklzc3VlQ29tbWVudDc1MzU4Nzk2Mw==,154364,dracos,2021-01-03T09:02:50Z,2021-01-03T10:00:05Z,NONE,"> but I'm already commited to requiring support for () => {} arrow functions Don't think you are :) (e.g. gzipped, using arrow functions in my example saves 2 bytes over spelling out function). On FMS, past month, looking at popular browsers, looks like we'd have 95.41% arrow support, 94.19% module support, and 4.58% (mostly IE9/IE11/Safari 9) supporting neither.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-753570710,https://api.github.com/repos/simonw/datasette/issues/983,753570710,MDEyOklzc3VlQ29tbWVudDc1MzU3MDcxMA==,9599,simonw,2021-01-03T05:29:56Z,2021-01-03T05:29:56Z,OWNER,"I thought about using browser events, but they don't quite match the API that I'm looking to provide. In particular, the great thing about Pluggy is that if you have multiple handlers registered for a specific plugin hook each of those handlers can return a value, and Pluggy will combine those values into a list of replies. This is great for things like plugin hooks that add extra menu items - each plugin can return a menu item (maybe as a label/URL/click-callback object) and the calling code can then add all of those items to the menu. See https://docs.datasette.io/en/stable/plugin_hooks.html#table-actions-datasette-actor-database-table for a Python example. I'm on the fence about relying on JavaScript modules. I need to think about browser compatibility for them - but I'm already commited to requiring support for `() => {}` arrow functions so maybe I'm committed to module support too already?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/1160#issuecomment-753568428,https://api.github.com/repos/simonw/datasette/issues/1160,753568428,MDEyOklzc3VlQ29tbWVudDc1MzU2ODQyOA==,9599,simonw,2021-01-03T05:02:32Z,2021-01-03T05:02:32Z,OWNER,"Should this command include a `--fts` option for configuring full-text search on one-or-more columns? I thought about doing that for `sqlite-utils insert` in https://github.com/simonw/sqlite-utils/issues/202 and decided not to because of the need to include extra options covering the FTS version, porter stemming options and whether or not to create triggers. But maybe I can set sensible defaults for that with `datasette insert ... -f title -f body`? Worth thinking about a bit more.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",775666296,"""datasette insert"" command and plugin hook", https://github.com/simonw/sqlite-utils/issues/202#issuecomment-753568264,https://api.github.com/repos/simonw/sqlite-utils/issues/202,753568264,MDEyOklzc3VlQ29tbWVudDc1MzU2ODI2NA==,9599,simonw,2021-01-03T05:00:24Z,2021-01-03T05:00:24Z,OWNER,"I'm not going to implement this, because it actually needs several additional options that already exist on `sqlite-utils enable-fts`: ``` --fts4 Use FTS4 --fts5 Use FTS5 --tokenize TEXT Tokenizer to use, e.g. porter --create-triggers Create triggers to update the FTS tables when the parent table changes. ``` I'd rather not add all four of those options to `sqlite-utils insert` just to support this shortcut.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",738514367,sqlite-utils insert -f colname - for configuring full-text search, https://github.com/simonw/sqlite-utils/issues/202#issuecomment-753567969,https://api.github.com/repos/simonw/sqlite-utils/issues/202,753567969,MDEyOklzc3VlQ29tbWVudDc1MzU2Nzk2OQ==,9599,simonw,2021-01-03T04:55:17Z,2021-01-03T04:55:43Z,OWNER,"The long version of this can be `--fts`, same as in `csvs-to-sqlite`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",738514367,sqlite-utils insert -f colname - for configuring full-text search, https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567932,https://api.github.com/repos/simonw/sqlite-utils/issues/203,753567932,MDEyOklzc3VlQ29tbWVudDc1MzU2NzkzMg==,9599,simonw,2021-01-03T04:54:43Z,2021-01-03T04:54:43Z,OWNER,"Another option: expand the `ForeignKey` object to have `.columns` and `.other_columns` properties in addition to the existing `.column` and `.other_column` properties. These new plural properties would always return a tuple, which would be a one-item tuple for a non-compound-foreign-key. The question then is what should `.column` and `.other_column` return for compound foreign keys? I'd be inclined to say they should return `None` - which would trigger errors in code that encounters a compound foreign key for the first time, but those errors would at least be a strong indicator as to what had gone wrong. We can label `.column` and `.other_column` as deprecated and then remove them in `sqlite-utils 4.0`. Since this would still be a breaking change in some minor edge-cases I'm thinking maybe 4.0 needs to happen in order to land this feature. I'm not opposed to doing that, I was just hoping it might be avoidable.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743384829,changes to allow for compound foreign keys, https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567744,https://api.github.com/repos/simonw/sqlite-utils/issues/203,753567744,MDEyOklzc3VlQ29tbWVudDc1MzU2Nzc0NA==,9599,simonw,2021-01-03T04:51:44Z,2021-01-03T04:51:44Z,OWNER,"One way that this could avoid a breaking change would be to have `fk.column` and `fk.other_column` remain as strings for non-compound-foreign-keys, but turn into tuples for a compound foreign key. This is a bit of an ugly API design, and it could still break existing code that encounters a compound foreign key for the first time - but it would leave code working for the more common case of a non-compound-foreign-key.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743384829,changes to allow for compound foreign keys, https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567508,https://api.github.com/repos/simonw/sqlite-utils/issues/203,753567508,MDEyOklzc3VlQ29tbWVudDc1MzU2NzUwOA==,9599,simonw,2021-01-03T04:48:17Z,2021-01-03T04:48:17Z,OWNER,"Sorry for taking so long to review this! This approach looks great to me - being able to optionally pass a tuple anywhere the API currently expects a column is smart, and it's consistent with how the `pk=` parameter works elsewhere. There's just one problem I can see with this: the way it changes the `ForeignKey(...)` interface to always return a tuple for `.column` and `.other_column`, even if that tuple only contains a single item. This represents a breaking change to the existing API - any code that expects `ForeignKey.column` to be a single string (which is any code that has been written against that) will break. As such, I'd have to bump the major version of `sqlite-utils` to `4.0` in order to ship this. Ideally I'd like to make this change in a way that doesn't represent an API compatibility break. I need to think a bit harder about how that might be achieved.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743384829,changes to allow for compound foreign keys, https://github.com/simonw/sqlite-utils/issues/217#issuecomment-753566184,https://api.github.com/repos/simonw/sqlite-utils/issues/217,753566184,MDEyOklzc3VlQ29tbWVudDc1MzU2NjE4NA==,9599,simonw,2021-01-03T04:27:38Z,2021-01-03T04:27:38Z,OWNER,Documented here: https://sqlite-utils.datasette.io/en/latest/python-api.html#quoting-strings-for-use-in-sql,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777543336,Rename .escape() to .quote(), https://github.com/simonw/sqlite-utils/issues/216#issuecomment-753566156,https://api.github.com/repos/simonw/sqlite-utils/issues/216,753566156,MDEyOklzc3VlQ29tbWVudDc1MzU2NjE1Ng==,9599,simonw,2021-01-03T04:27:14Z,2021-01-03T04:27:14Z,OWNER,Documented here: https://sqlite-utils.datasette.io/en/latest/python-api.html#introspection,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777540352,database.triggers_dict introspection property, https://github.com/simonw/sqlite-utils/issues/218#issuecomment-753563757,https://api.github.com/repos/simonw/sqlite-utils/issues/218,753563757,MDEyOklzc3VlQ29tbWVudDc1MzU2Mzc1Nw==,9599,simonw,2021-01-03T03:49:51Z,2021-01-03T03:49:51Z,OWNER,Documentation: https://sqlite-utils.datasette.io/en/latest/cli.html#listing-triggers,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777560474,"""sqlite-utils triggers"" command", https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753545757,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753545757,MDEyOklzc3VlQ29tbWVudDc1MzU0NTc1Nw==,9599,simonw,2021-01-02T23:58:07Z,2021-01-02T23:58:07Z,OWNER,"Thought: maybe there should be a `.reset_counts()` method too, for if the table gets out of date with the triggers. One way that could happen is if a table is dropped and recreated - the counts in the `_counts` table would likely no longer match the number of rows in that table.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/215#issuecomment-753545381,https://api.github.com/repos/simonw/sqlite-utils/issues/215,753545381,MDEyOklzc3VlQ29tbWVudDc1MzU0NTM4MQ==,9599,simonw,2021-01-02T23:52:52Z,2021-01-02T23:52:52Z,OWNER,Idea: a `db.cached_counts()` method that returns a dictionary of data from the `_counts` table. Call it with a list of tables to get back the counts for just those tables.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777535402,Use _counts to speed up counts, https://github.com/simonw/sqlite-utils/issues/217#issuecomment-753544914,https://api.github.com/repos/simonw/sqlite-utils/issues/217,753544914,MDEyOklzc3VlQ29tbWVudDc1MzU0NDkxNA==,9599,simonw,2021-01-02T23:47:42Z,2021-01-02T23:47:42Z,OWNER,https://github.com/simonw/sqlite-utils/blob/9a5c92b63e7917c93cc502478493c51c781b2ecc/sqlite_utils/db.py#L231-L239,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777543336,Rename .escape() to .quote(), https://github.com/simonw/sqlite-utils/issues/213#issuecomment-753535488,https://api.github.com/repos/simonw/sqlite-utils/issues/213,753535488,MDEyOklzc3VlQ29tbWVudDc1MzUzNTQ4OA==,9599,simonw,2021-01-02T22:03:48Z,2021-01-02T22:03:48Z,OWNER,"I got this error while prototyping this: too many levels of trigger recursion It looks like that's because SQLite doesn't like triggers on a table that themselves then update that table - so I'm going to exclude the `_counts` table from this mechanism.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777529979,db.enable_counts() method, https://github.com/simonw/sqlite-utils/issues/213#issuecomment-753533775,https://api.github.com/repos/simonw/sqlite-utils/issues/213,753533775,MDEyOklzc3VlQ29tbWVudDc1MzUzMzc3NQ==,9599,simonw,2021-01-02T21:47:10Z,2021-01-02T21:47:10Z,OWNER,"I'm going to skip virtual tables, which I can identify using this property: https://github.com/simonw/sqlite-utils/blob/1cad7fad3e7a5b734088f5cc545b69a055e636da/sqlite_utils/db.py#L720-L726","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777529979,db.enable_counts() method, https://github.com/simonw/datasette/issues/1012#issuecomment-753531657,https://api.github.com/repos/simonw/datasette/issues/1012,753531657,MDEyOklzc3VlQ29tbWVudDc1MzUzMTY1Nw==,45380,bollwyvl,2021-01-02T21:25:36Z,2021-01-02T21:25:36Z,CONTRIBUTOR,"Actually, on more research, I found out this is handled by the [trove-classifiers package](https://github.com/pypa/trove-classifiers/blob/master/src/trove_classifiers/__init__.py#L2) now, so it's just a one-liner pr instead of fire-up-a-docker-container-and-do-some-migrations","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",718540751,For 1.0 update trove classifier in setup.py, https://github.com/simonw/datasette/issues/1168#issuecomment-753524779,https://api.github.com/repos/simonw/datasette/issues/1168,753524779,MDEyOklzc3VlQ29tbWVudDc1MzUyNDc3OQ==,9599,simonw,2021-01-02T20:19:26Z,2021-01-02T20:19:26Z,OWNER,Idea: version the metadata scheme. If the table is called `_metadata_v1` it gives me a clear path to designing a new scheme in the future.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/sqlite-utils/issues/212#issuecomment-753422324,https://api.github.com/repos/simonw/sqlite-utils/issues/212,753422324,MDEyOklzc3VlQ29tbWVudDc1MzQyMjMyNA==,9599,simonw,2021-01-02T03:00:34Z,2021-01-02T03:00:34Z,OWNER,"Here's a prototype: ```python with db.conn: db.conn.executescript("""""" CREATE TABLE IF NOT EXISTS [_counts] ([table] TEXT PRIMARY KEY, [count] INTEGER DEFAULT 0); CREATE TRIGGER IF NOT EXISTS [Street_Tree_List_counts_ai] AFTER INSERT ON [Street_Tree_List] BEGIN INSERT OR REPLACE INTO _counts VALUES ('Street_Tree_List', COALESCE( (SELECT count FROM _counts WHERE [table]='Street_Tree_List'), 0) + 1); END; CREATE TRIGGER IF NOT EXISTS [Street_Tree_List_counts_ad] AFTER DELETE ON [Street_Tree_List] BEGIN INSERT OR REPLACE INTO _counts VALUES ('Street_Tree_List', COALESCE( (SELECT count FROM _counts WHERE [table]='Street_Tree_List'), 0) - 1); END; INSERT OR REPLACE INTO _counts VALUES ('Street_Tree_List', (select count(*) from [Street_Tree_List])); """""") ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777392020,Mechanism for maintaining cache of table counts using triggers, https://github.com/simonw/sqlite-utils/issues/210#issuecomment-753406744,https://api.github.com/repos/simonw/sqlite-utils/issues/210,753406744,MDEyOklzc3VlQ29tbWVudDc1MzQwNjc0NA==,9599,simonw,2021-01-02T00:02:39Z,2021-01-02T00:02:39Z,OWNER,"It looks like https://github.com/ofajardo/pyreadr is a good library for this. I won't add this to `sqlite-utils` because it's quite a bulky dependency for a relatively small feature. Normally I'd write a `rdata-to-sqlite` tool similar to https://pypi.org/project/dbf-to-sqlite/ - but I'm actually working on a new plugin hook for Datasette that might be an even better fit for this. The idea is to allow Datasette plugins to define input formats - such as RData - which would then result in being able to import them on the command-line with `datasette insert my.db file.rdata` or by uploading a file through the Datasette web interface. That work is happening over here: https://github.com/simonw/datasette/issues/1160 - I'll close this issue in favour of a sometime-in-the-future `datasette-import-rdata` plugin.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",767685961,Support of RData files, https://github.com/simonw/sqlite-utils/issues/209#issuecomment-753405835,https://api.github.com/repos/simonw/sqlite-utils/issues/209,753405835,MDEyOklzc3VlQ29tbWVudDc1MzQwNTgzNQ==,9599,simonw,2021-01-01T23:52:06Z,2021-01-01T23:52:06Z,OWNER,I just hit this one too. Such a weird bug!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",766156875,Test failure with sqlite 3.34 in test_cli.py::test_optimize, https://github.com/simonw/datasette/issues/1168#issuecomment-753402423,https://api.github.com/repos/simonw/datasette/issues/1168,753402423,MDEyOklzc3VlQ29tbWVudDc1MzQwMjQyMw==,9599,simonw,2021-01-01T23:16:05Z,2021-01-01T23:16:05Z,OWNER,"One catch: solving the ""show me all metadata for everything in this Datasette instance"" problem. Ideally there would be a SQLite table that can be queried for this. But the need to resolve the potentially complex set of precedence rules means that table would be difficult if not impossible to provide at run-time. Ideally a denormalized table would be available that featured the results of running those precedence rule calculations. But how to handle keeping this up-to-date? It would need to be recalculated any time a `_metadata` table in any of the attached databases had an update. This is a much larger problem - but one potential fix would be to use triggers to maintain a ""version number"" for the `_metadata` table - similar to SQLite's own built-in `schema_version` mechanism. Triggers could increment a counter any time a record in that table was added, deleted or updated. Such a mechanism would have applications outside of just this `_metadata` system. The ability to attach a version number to any table and have it automatically incremented when that table changes (via triggers) could help with all kinds of other Datasette-at-scale problems, including things like cached table counts.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753401001,https://api.github.com/repos/simonw/datasette/issues/1168,753401001,MDEyOklzc3VlQ29tbWVudDc1MzQwMTAwMQ==,9599,simonw,2021-01-01T23:01:45Z,2021-01-01T23:01:45Z,OWNER,I need to prototype this. Could I do that as a plugin? I think so - I could try out the algorithm for loading metadata and display it on pages using some custom templates.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753400420,https://api.github.com/repos/simonw/datasette/issues/1168,753400420,MDEyOklzc3VlQ29tbWVudDc1MzQwMDQyMA==,9599,simonw,2021-01-01T22:53:58Z,2021-01-01T22:53:58Z,OWNER,"Precedence idea: - First priority is non-_internal metadata from other databases - if those conflict then pick then the alphabetically-ordered-first database name wins - Next priority: `_internal` metadata, which should have been loaded from `metadata.json` - Last priority: the `_metadata` table from that database itself, i.e. the default ""baked in"" metadata","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753400306,https://api.github.com/repos/simonw/datasette/issues/1168,753400306,MDEyOklzc3VlQ29tbWVudDc1MzQwMDMwNg==,9599,simonw,2021-01-01T22:52:44Z,2021-01-01T22:52:44Z,OWNER,"Also: probably load column metadata as part of the table metadata rather than loading column metadata individually, since it's going to be rare to want the metadata for a single column rather than for an entire table full of columns.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753400265,https://api.github.com/repos/simonw/datasette/issues/1168,753400265,MDEyOklzc3VlQ29tbWVudDc1MzQwMDI2NQ==,9599,simonw,2021-01-01T22:52:09Z,2021-01-01T22:52:09Z,OWNER,"From an implementation perspective, I think the way this works is SQL queries read the relevant metadata from ALL available metadata tables, then Python code solves the precedence rules to produce the final, combined metadata for a database/table/column.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753399635,https://api.github.com/repos/simonw/datasette/issues/1168,753399635,MDEyOklzc3VlQ29tbWVudDc1MzM5OTYzNQ==,9599,simonw,2021-01-01T22:45:21Z,2021-01-01T22:50:21Z,OWNER,"Would also need to figure out the precedence rules: - What happens if the database has a `_metadata` table with data that conflicts with a remote metadata record from another database? I think the other database should win, because that allows plugins to over-ride the default metadata for something. - Do JSON values get merged together? So if one table provides a description and another provides a title do both values get returned? - If a database has a `license`, does that ""cascade"" down to the tables? What about `source` and `about`? - What if there are two databases (or more) that provide conflicting metadata for a table in some other database? Also, `_internal` may have loaded data from `metadata.json` that conflicts with some other remote table metadata definition.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753399428,https://api.github.com/repos/simonw/datasette/issues/1168,753399428,MDEyOklzc3VlQ29tbWVudDc1MzM5OTQyOA==,9599,simonw,2021-01-01T22:43:14Z,2021-01-01T22:43:22Z,OWNER,"Could this use a compound primary key on `database, table, column`? Does that work with null values?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753399366,https://api.github.com/repos/simonw/datasette/issues/1168,753399366,MDEyOklzc3VlQ29tbWVudDc1MzM5OTM2Ng==,9599,simonw,2021-01-01T22:42:37Z,2021-01-01T22:42:37Z,OWNER,"So what would the database schema for this look like? I'm leaning towards a single table called `_metadata`, because that's a neater fit for baking the metadata into the database file along with the data that it is describing. Alternatively I could have multiple tables sharing that prefix - `_metadata_database` and `_metadata_tables` and `_metadata_columns` perhaps. If it's just a single `_metadata` table, the schema could look like this: | database | table | column | metadata | | --- | --- | --- | --- | | | mytable | | {""title"": ""My Table"" } | | | mytable | mycolumn | {""description"": ""Column description"" } | | otherdb | othertable | | {""description"": ""Table in another DB"" } | If the `database` column is `null` it means ""this is describing a table in the same database file as this `_metadata` table"". The alternative to the `metadata` JSON column would be separate columns for each potential metadata value - `license`, `source`, `about`, `about_url` etc. But that makes it harder for people to create custom metadata fields.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753398542,https://api.github.com/repos/simonw/datasette/issues/1168,753398542,MDEyOklzc3VlQ29tbWVudDc1MzM5ODU0Mg==,9599,simonw,2021-01-01T22:37:24Z,2021-01-01T22:37:24Z,OWNER,"The direction I'm leaning in now is the following: - Metadata always lives in SQLite tables - These tables can be co-located with the database they describe (same DB file) - ... or they can be in a different DB file and reference the other database that they are describing - Metadata provided on startup in a `metadata.json` file is loaded into an in-memory metadata table using that same mechanism Plugins that want to provide metadata can do so by populating a table. They could even maintain their own in-memory database for this, or they could write to the `_internal` in-memory database, or they could write to a table in a database on disk.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753392102,https://api.github.com/repos/simonw/datasette/issues/1168,753392102,MDEyOklzc3VlQ29tbWVudDc1MzM5MjEwMg==,9599,simonw,2021-01-01T22:06:33Z,2021-01-01T22:06:33Z,OWNER,"Some SQLite databases include SQL comments in the schema definition which tell you what each column means: ```sql CREATE TABLE User -- A table comment ( uid INTEGER, -- A field comment flags INTEGER -- Another field comment ); ``` The problem with these is that they're not exposed to SQLite in any mechanism other than parsing the `CREATE TABLE` statement from the `sqlite_master` table to extract those columns. I had an idea to build a plugin that could return these. That would be easy with a ""get metadata for this column"" plugin hook - in the absence of one a plugin could still run that reads the schemas on startup and uses them to populate a metadata database table somewhere.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753391869,https://api.github.com/repos/simonw/datasette/issues/1168,753391869,MDEyOklzc3VlQ29tbWVudDc1MzM5MTg2OQ==,9599,simonw,2021-01-01T22:04:30Z,2021-01-01T22:04:30Z,OWNER,"The sticking point here seems to be the plugin hook. Allowing plugins to over-ride the way the question ""give me the metadata for this database/table/column"" is answered makes the database-backed metadata mechanisms much more complicated to think about. What if plugins didn't get to over-ride metadata in this way, but could instead update the metadata in a persistent Datasette-managed storage mechanism? Then maybe Datasette could do the following: - Maintain metadata in `_internal` that has been loaded from `metadata.json` - Know how to check a database for baked-in metadata (maybe in a `_metadata` table) - Know how to fall back on the `_internal` metadata if no baked-in metadata is available If database files were optionally allowed to store metadata about tables that live in another database file this could perhaps solve the plugin needs - since an ""edit metadata"" plugin would be able to edit records in a separate, dedicated `metadata.db` database to store new information about tables in other files.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753390791,https://api.github.com/repos/simonw/datasette/issues/1168,753390791,MDEyOklzc3VlQ29tbWVudDc1MzM5MDc5MQ==,9599,simonw,2021-01-01T22:00:42Z,2021-01-01T22:00:42Z,OWNER,"Here are the requirements I'm currently trying to satisfy: - It should be possible to query the metadata for ALL attached tables in one place, potentially with pagination and filtering - Metadata should be able to exist in the current `metadata.json` file - It should also be possible to bundle metadata in a table in the SQLite database files themselves - Plugins should be able to define their own special mechanisms for metadata. This is particularly interesting for providing a UI that allows users to edit the metadata for their existing tables.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753390262,https://api.github.com/repos/simonw/datasette/issues/1168,753390262,MDEyOklzc3VlQ29tbWVudDc1MzM5MDI2Mg==,9599,simonw,2021-01-01T21:58:11Z,2021-01-01T21:58:11Z,OWNER,"One possibility: plugins could write directly to that in-memory database table. But how would they know to write again should the server restart? Maybe they would write to it once when called by the `startup` plugin hook, and then update it (and their own backing store) when metadata changes for some reason. Feels a bit messy though. Also: if I want to support metadata optionally living in a `_metadata` table colocated with the data in a SQLite database file itself, how would that affect the `metadata` columns in `_internal`? How often would Datasette denormalize and copy data across from the on-disk `_metadata` tables to the `_internal` in-memory columns?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753389938,https://api.github.com/repos/simonw/datasette/issues/1168,753389938,MDEyOklzc3VlQ29tbWVudDc1MzM4OTkzOA==,9599,simonw,2021-01-01T21:54:15Z,2021-01-01T21:54:15Z,OWNER,"So what if the `databases`, `tables` and `columns` tables in `_internal` each grew a new `metadata` text column? These columns could be populated by Datasette on startup through reading the `metadata.json` file. But how would plugins interact with them?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753389477,https://api.github.com/repos/simonw/datasette/issues/1168,753389477,MDEyOklzc3VlQ29tbWVudDc1MzM4OTQ3Nw==,9599,simonw,2021-01-01T21:49:57Z,2021-01-01T21:49:57Z,OWNER,"What if metadata was stored in a JSON text column in the existing `_internal` tables? This would allow for users to invent additional metadata fields in the future beyond the current `license`, `license_url` etc fields - without needing a schema change. The downside of JSON columns generally is that they're harder to run indexed queries against. For metadata I don't think that matters - even with 10,000 tables each with their own metadata a SQL query asking for e.g. ""everything that has Apache 2 as the license"" would return in just a few ms.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753388809,https://api.github.com/repos/simonw/datasette/issues/1168,753388809,MDEyOklzc3VlQ29tbWVudDc1MzM4ODgwOQ==,9599,simonw,2021-01-01T21:47:51Z,2021-01-01T21:47:51Z,OWNER,"A database that exposes metadata will have the same restriction as the new `_internal` database that exposes columns and tables, in that it needs to take permissions into account. A user should not be able to view metadata for tables that they are not able to see. As such, I'd rather bundle any metadata tables into the existing `_internal` database so I don't have to solve that permissions problem in two places.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/1168#issuecomment-753366024,https://api.github.com/repos/simonw/datasette/issues/1168,753366024,MDEyOklzc3VlQ29tbWVudDc1MzM2NjAyNA==,9599,simonw,2021-01-01T18:48:34Z,2021-01-01T18:48:34Z,OWNER,Also: in #188 I proposed bundling metadata in the SQLite database itself alongside the data. This is a great way of ensuring metadata travels with the data when it is downloaded as a SQLite `.db` file. But how would that play with the idea of an in-memory `_metadata` table? Could that table perhaps offer views that join data across multiple attached physical databases?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777333388,Mechanism for storing metadata in _metadata tables, https://github.com/simonw/datasette/issues/983#issuecomment-753224999,https://api.github.com/repos/simonw/datasette/issues/983,753224999,MDEyOklzc3VlQ29tbWVudDc1MzIyNDk5OQ==,11941245,jussiarpalahti,2020-12-31T23:29:36Z,2020-12-31T23:29:36Z,NONE,"I have yet to build Datasette plugin and am unfamiliar with Pluggy. Since browsers have event handling builtin Datasette could communicate with plugins through it. Handlers register as listeners for custom Datasette events and Datasette's JS can then trigger said events. I was also wondering if you had looked at Javascript Modules for JS plugins? With services like Skypack (https://www.skypack.dev) NPM libraries can be loaded directly into browser, no build step needed. Same goes for local JS if you adhere to ES Module spec. If minification is required then tools such as Snowpack (https://www.snowpack.dev) could fit better. It uses https://github.com/evanw/esbuild for bundling and minification. On plugins you'd simply: ```javascript import {register} from '/assets/js/datasette' register.on({'click' : my_func}) ``` In Datasette HTML pages' head you'd merely import these files as modules one by one.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/1166#issuecomment-753224351,https://api.github.com/repos/simonw/datasette/issues/1166,753224351,MDEyOklzc3VlQ29tbWVudDc1MzIyNDM1MQ==,9599,simonw,2020-12-31T23:23:29Z,2020-12-31T23:23:29Z,OWNER,I should configure the action to only run if changes have been made within the `datasette/static` directory.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting, https://github.com/simonw/datasette/issues/983#issuecomment-753221646,https://api.github.com/repos/simonw/datasette/issues/983,753221646,MDEyOklzc3VlQ29tbWVudDc1MzIyMTY0Ng==,9599,simonw,2020-12-31T22:58:47Z,2020-12-31T22:58:47Z,OWNER,"https://github.com/mishoo/UglifyJS/issues/1905#issuecomment-300485490 says: > `sourceMappingURL` aren't added by default in `3.x` due to one of the feature requests not to - some users are putting them within HTTP response headers instead. > > So the command line for that would be: > > ```js > $ uglifyjs main.js -cmo main.min.js --source-map url=main.min.js.map > ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/1164#issuecomment-753221362,https://api.github.com/repos/simonw/datasette/issues/1164,753221362,MDEyOklzc3VlQ29tbWVudDc1MzIyMTM2Mg==,9599,simonw,2020-12-31T22:55:57Z,2020-12-31T22:55:57Z,OWNER,"I had to add this as the first line in `table.min.js` for the source mapping to work: ``` //# sourceMappingURL=/-/static/table.min.js.map ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette, https://github.com/simonw/datasette/issues/1164#issuecomment-753220665,https://api.github.com/repos/simonw/datasette/issues/1164,753220665,MDEyOklzc3VlQ29tbWVudDc1MzIyMDY2NQ==,9599,simonw,2020-12-31T22:49:36Z,2020-12-31T22:49:36Z,OWNER,"I started with a 7K `table.js` file. `npx uglifyjs table.js --source-map -o table.min.js` gave me a 5.6K `table.min.js` file. `npx uglifyjs table.js --source-map -o table.min.js --compress --mangle` gave me 4.5K.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette, https://github.com/simonw/datasette/issues/1164#issuecomment-753220412,https://api.github.com/repos/simonw/datasette/issues/1164,753220412,MDEyOklzc3VlQ29tbWVudDc1MzIyMDQxMg==,9599,simonw,2020-12-31T22:47:36Z,2020-12-31T22:47:36Z,OWNER,"I'm trying to minify `table.js` and I ran into a problem: Uglification failed. Unexpected character '`' It turns out `uglify-js` doesn't support ES6 syntax! But `uglify-es` does: npm install uglify-es Annoyingly it looks like `uglify-es` uses the same CLI command, `uglifyjs`. So after installing it this seemed to work: npx uglifyjs table.js --source-map -o table.min.js I really don't like how `npx uglifyjs` could mean different things depending on which package was installed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776634318,Mechanism for minifying JavaScript that ships with Datasette, https://github.com/simonw/datasette/issues/983#issuecomment-753219521,https://api.github.com/repos/simonw/datasette/issues/983,753219521,MDEyOklzc3VlQ29tbWVudDc1MzIxOTUyMQ==,9599,simonw,2020-12-31T22:39:52Z,2020-12-31T22:39:52Z,OWNER,For inlining the `plugins.min.js` file into the Jinja templates I could use the trick described here: https://stackoverflow.com/a/41404611 - which adds a `{{ include_file('file.txt') }}` function to Jinja.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712260429,JavaScript plugin hooks mechanism similar to pluggy, https://github.com/simonw/datasette/issues/983#issuecomment-753219407,https://api.github.com/repos/simonw/datasette/issues/983,753219407,MDEyOklzc3VlQ29tbWVudDc1MzIxOTQwNw==,9599,simonw,2020-12-31T22:38:45Z,2020-12-31T22:39:10Z,OWNER,"You'll be able to add JavaScript plugins using a bunch of different mechanisms: - In a custom template, dropping the code in to a ` {% endblock %} ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729017519,Add template block prior to extra URL loaders, https://github.com/simonw/datasette/pull/1049#issuecomment-718341542,https://api.github.com/repos/simonw/datasette/issues/1049,718341542,MDEyOklzc3VlQ29tbWVudDcxODM0MTU0Mg==,9599,simonw,2020-10-29T03:48:12Z,2020-10-29T03:48:12Z,OWNER,"You could use Datasette's new `{{ urls.static_plugins(...) }}` template option - see https://docs.datasette.io/en/latest/internals.html#internals-datasette-urls - to generate a link to code that was bundled with the plugin: ```html+jinja {% block extra_head %} {% endblock %} ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729017519,Add template block prior to extra URL loaders, https://github.com/simonw/datasette/issues/1050#issuecomment-718317997,https://api.github.com/repos/simonw/datasette/issues/1050,718317997,MDEyOklzc3VlQ29tbWVudDcxODMxNzk5Nw==,283343,thadk,2020-10-29T02:24:50Z,2020-10-29T02:29:24Z,NONE,"Unsolicited feedback for an unreleased feature of the [current](https://github.com/simonw/datasette/commit/5e0b72247ecab4ce0fcec599b77a83d73a480872) unreleased GitHub version (I casually wanted to access a blob row) – the existing #1036 route doesn't support special characters in database or table names (e.g. `@()` ). Maybe this is motivation for your new idea here. Also I got this error/crash with my blob and wasn't able to get the file: https://gist.github.com/thadk/28ac32af0e88747ce9056c90b0b19d34","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729057388,Switch to .blob render extension for BLOB downloads, https://github.com/simonw/datasette/pull/1060#issuecomment-718243062,https://api.github.com/repos/simonw/datasette/issues/1060,718243062,MDEyOklzc3VlQ29tbWVudDcxODI0MzA2Mg==,22429695,codecov[bot],2020-10-28T22:23:33Z,2020-10-28T22:23:33Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1060?src=pr&el=h1) Report > Merging [#1060](https://codecov.io/gh/simonw/datasette/pull/1060?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/abcf0222496d8148b2e585ffa0ff192270a04b06?el=desc) will **increase** coverage by `6.42%`. > The diff coverage is `100.00%`. [![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1060/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1060?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1060 +/- ## ========================================== + Coverage 84.71% 91.13% +6.42% ========================================== Files 28 27 -1 Lines 3957 3677 -280 ========================================== - Hits 3352 3351 -1 + Misses 605 326 -279 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1060?src=pr&el=tree) | Coverage Δ | | |---|---|---| | [datasette/cli.py](https://codecov.io/gh/simonw/datasette/pull/1060/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2NsaS5weQ==) | `73.63% <100.00%> (+0.13%)` | :arrow_up: | | [datasette/version.py](https://codecov.io/gh/simonw/datasette/pull/1060/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZlcnNpb24ucHk=) | `100.00% <100.00%> (ø)` | | ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1060?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1060?src=pr&el=footer). Last update [abcf022...4725d46](https://codecov.io/gh/simonw/datasette/pull/1060?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",731827081,New explicit versioning mechanism, https://github.com/simonw/sqlite-utils/issues/191#issuecomment-718170295,https://api.github.com/repos/simonw/sqlite-utils/issues/191,718170295,MDEyOklzc3VlQ29tbWVudDcxODE3MDI5NQ==,9599,simonw,2020-10-28T19:50:16Z,2020-10-28T19:50:16Z,OWNER,"I think I made a mistake when I designed the initial decorator. I should have had it work like this: ```python @db.register_function() def reverse_string(s): return """".join(reversed(list(s))) ``` As this leaves open the option to add new parameters in the future. To avoid breaking backwards compatibility I'll use the hack that detects the argument this time, but in the future I'll try to remember to always design decorators to be called like `@decorator()`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",731740458,Idea: @db.register_function(deterministic=True), https://github.com/simonw/sqlite-utils/issues/191#issuecomment-718168730,https://api.github.com/repos/simonw/sqlite-utils/issues/191,718168730,MDEyOklzc3VlQ29tbWVudDcxODE2ODczMA==,9599,simonw,2020-10-28T19:47:20Z,2020-10-28T19:47:20Z,OWNER,"https://stackoverflow.com/a/3931903 looks useful: ```python def trace(*args): def _trace(func): def wrapper(*args, **kwargs): print enter_string func(*args, **kwargs) print exit_string return wrapper if len(args) == 1 and callable(args[0]): # No arguments, this is the decorator # Set default values for the arguments enter_string = 'entering' exit_string = 'exiting' return _trace(args[0]) else: # This is just returning the decorator enter_string, exit_string = args return _trace ``` Can improve that code with `functools.wraps`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",731740458,Idea: @db.register_function(deterministic=True), https://github.com/simonw/datasette/pull/1059#issuecomment-718078447,https://api.github.com/repos/simonw/datasette/issues/1059,718078447,MDEyOklzc3VlQ29tbWVudDcxODA3ODQ0Nw==,9599,simonw,2020-10-28T17:07:59Z,2020-10-28T17:08:14Z,OWNER,"> #### 0.6.0 (2020-10-27) > > - aiofiles is now tested on ppc64le. > - Added name and mode properties to async file objects. [#82](https://github.com/Tinche/aiofiles/pull/82) > - Fixed a DeprecationWarning internally. [#75](https://github.com/Tinche/aiofiles/pull/75) > - Python 3.9 support and tests.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",731445447,"Update aiofiles requirement from <0.6,>=0.4 to >=0.4,<0.7", https://github.com/simonw/datasette/pull/1059#issuecomment-717938992,https://api.github.com/repos/simonw/datasette/issues/1059,717938992,MDEyOklzc3VlQ29tbWVudDcxNzkzODk5Mg==,22429695,codecov[bot],2020-10-28T13:38:46Z,2020-10-28T13:38:46Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1059?src=pr&el=h1) Report > Merging [#1059](https://codecov.io/gh/simonw/datasette/pull/1059?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/7d9fedc176717a7e3d22a96575ae0aada5a65440?el=desc) will **not change** coverage. > The diff coverage is `n/a`. [![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1059/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1059?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1059 +/- ## ======================================= Coverage 84.71% 84.71% ======================================= Files 28 28 Lines 3957 3957 ======================================= Hits 3352 3352 Misses 605 605 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1059?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1059?src=pr&el=footer). Last update [7d9fedc...e46327a](https://codecov.io/gh/simonw/datasette/pull/1059?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",731445447,"Update aiofiles requirement from <0.6,>=0.4 to >=0.4,<0.7", https://github.com/simonw/datasette/issues/1057#issuecomment-717531272,https://api.github.com/repos/simonw/datasette/issues/1057,717531272,MDEyOklzc3VlQ29tbWVudDcxNzUzMTI3Mg==,9599,simonw,2020-10-27T20:51:09Z,2020-10-27T20:51:09Z,OWNER,"That works! ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",730797787,--cors should enable /fixtures.db CORS access, https://github.com/simonw/datasette/issues/1058#issuecomment-717527606,https://api.github.com/repos/simonw/datasette/issues/1058,717527606,MDEyOklzc3VlQ29tbWVudDcxNzUyNzYwNg==,9599,simonw,2020-10-27T20:44:06Z,2020-10-27T20:44:06Z,OWNER,Example: https://github.com/simonw/datasette/blob/5a1519796037105bc20bcf2f91a76e022926c204/datasette/views/database.py#L26-L32,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",730802994,Database download should implement cascading permissions, https://github.com/simonw/datasette/pull/1056#issuecomment-717489501,https://api.github.com/repos/simonw/datasette/issues/1056,717489501,MDEyOklzc3VlQ29tbWVudDcxNzQ4OTUwMQ==,22429695,codecov[bot],2020-10-27T19:39:41Z,2020-10-27T19:39:41Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1056?src=pr&el=h1) Report > Merging [#1056](https://codecov.io/gh/simonw/datasette/pull/1056?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/26bb4a268127da2c38f4241abe45444b2a6f7874?el=desc) will **not change** coverage. > The diff coverage is `n/a`. [![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1056/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1056?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1056 +/- ## ======================================= Coverage 84.70% 84.70% ======================================= Files 28 28 Lines 3955 3955 ======================================= Hits 3350 3350 Misses 605 605 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1056?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1056?src=pr&el=footer). Last update [26bb4a2...a7b2aab](https://codecov.io/gh/simonw/datasette/pull/1056?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",730752399,"Radical new colour scheme and base styles, courtesy of @natbat", https://github.com/simonw/sqlite-utils/pull/189#issuecomment-717361487,https://api.github.com/repos/simonw/sqlite-utils/issues/189,717361487,MDEyOklzc3VlQ29tbWVudDcxNzM2MTQ4Nw==,9599,simonw,2020-10-27T16:24:04Z,2020-10-27T16:24:04Z,OWNER,"This is great, thank you very much.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729818242,Allow iterables other than Lists in m2m records, https://github.com/simonw/sqlite-utils/pull/189#issuecomment-717359145,https://api.github.com/repos/simonw/sqlite-utils/issues/189,717359145,MDEyOklzc3VlQ29tbWVudDcxNzM1OTE0NQ==,35681,adamwolf,2020-10-27T16:20:32Z,2020-10-27T16:20:32Z,CONTRIBUTOR,"No problem. I added a test. Let me know if it looks sufficient or if you want me to to tweak something! If you don't mind, would you tag this PR as ""hacktoberfest-accepted""? If you do mind, no problem and I'm sorry for asking :) My kiddos like the shirts.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729818242,Allow iterables other than Lists in m2m records, https://github.com/simonw/datasette/issues/1054#issuecomment-717051707,https://api.github.com/repos/simonw/datasette/issues/1054,717051707,MDEyOklzc3VlQ29tbWVudDcxNzA1MTcwNw==,9599,simonw,2020-10-27T07:41:21Z,2020-10-27T07:41:21Z,OWNER,Essentially it's this problem: https://github.com/python-versioneer/python-versioneer/issues/140,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",730199464,Switch from versioneer to concrete version in setup.py, https://github.com/simonw/datasette/issues/1054#issuecomment-717050585,https://api.github.com/repos/simonw/datasette/issues/1054,717050585,MDEyOklzc3VlQ29tbWVudDcxNzA1MDU4NQ==,9599,simonw,2020-10-27T07:38:50Z,2020-10-27T07:38:50Z,OWNER,"Maybe imitate how Django does this, e.g. https://github.com/django/django/commit/6b9b2af7352908d40ca4d31bdb1b80c013cab29a","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",730199464,Switch from versioneer to concrete version in setup.py, https://github.com/simonw/sqlite-utils/pull/189#issuecomment-716756103,https://api.github.com/repos/simonw/sqlite-utils/issues/189,716756103,MDEyOklzc3VlQ29tbWVudDcxNjc1NjEwMw==,9599,simonw,2020-10-26T18:56:19Z,2020-10-26T18:56:19Z,OWNER,"This is a great fix, thanks! If you add a unit test somewhere in here I'll merge the PR: https://github.com/simonw/sqlite-utils/blob/main/tests/test_m2m.py","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729818242,Allow iterables other than Lists in m2m records, https://github.com/simonw/datasette/issues/1051#issuecomment-716681602,https://api.github.com/repos/simonw/datasette/issues/1051,716681602,MDEyOklzc3VlQ29tbWVudDcxNjY4MTYwMg==,9599,simonw,2020-10-26T16:51:58Z,2020-10-26T16:51:58Z,OWNER,"I still need to improve the current binary display on the query page though, where it outputs a Python `b'...'` literal.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729096595,Better display of binary data on arbitrary query results page, https://github.com/simonw/datasette/issues/1051#issuecomment-716681167,https://api.github.com/repos/simonw/datasette/issues/1051,716681167,MDEyOklzc3VlQ29tbWVudDcxNjY4MTE2Nw==,9599,simonw,2020-10-26T16:51:15Z,2020-10-26T16:51:15Z,OWNER,"Crazy idea: generate a signed URL containing a base64 of the gzip of the binary content (to try and reduce size). No: this will blow through URL limits in various hosting providers and possibly even browsers. It could be made to work a little bit more reliably with some extra JavaScript that turns it into a download on the browser-side, but that would be hideously complicated. Also the signed bit doesn't prevent people from generating SQL queries that generate nasty binary blobs for download. I'm beginning to think that restricting this feature to just table view, not query view, is a better idea. Query view can still get at the binary using JSON and base64.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729096595,Better display of binary data on arbitrary query results page, https://github.com/simonw/datasette/issues/976#issuecomment-716305890,https://api.github.com/repos/simonw/datasette/issues/976,716305890,MDEyOklzc3VlQ29tbWVudDcxNjMwNTg5MA==,9599,simonw,2020-10-26T05:07:10Z,2020-10-26T05:07:10Z,OWNER,"I used the new `datasette.urls` methods to handle escaping table names. https://github.com/simonw/datasette/blob/f5dbe61a4568c0915ec6be820095c2960cf0857c/datasette/utils/__init__.py#L996-L1008","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",708289783,Idea: -o could open to a more convenient location, https://github.com/simonw/datasette/issues/1052#issuecomment-716265360,https://api.github.com/repos/simonw/datasette/issues/1052,716265360,MDEyOklzc3VlQ29tbWVudDcxNjI2NTM2MA==,9599,simonw,2020-10-26T02:17:58Z,2020-10-26T02:17:58Z,OWNER,"The default z-index values for Leaflet are defined here: https://github.com/Leaflet/Leaflet/blob/b346bb8bf7bb80899baa1f4fc1536bae58e7e3e6/dist/leaflet.css#L81-L91 ```css .leaflet-pane { z-index: 400; } .leaflet-tile-pane { z-index: 200; } .leaflet-overlay-pane { z-index: 400; } .leaflet-shadow-pane { z-index: 500; } .leaflet-marker-pane { z-index: 600; } .leaflet-tooltip-pane { z-index: 650; } .leaflet-popup-pane { z-index: 700; } .leaflet-map-pane canvas { z-index: 100; } .leaflet-map-pane svg { z-index: 200; } ``` So a `z-index` of 1000 on the menu should fix this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729183332,Column action menu overlapped by Leaflet maps, https://github.com/simonw/datasette/pull/1043#issuecomment-716237524,https://api.github.com/repos/simonw/datasette/issues/1043,716237524,MDEyOklzc3VlQ29tbWVudDcxNjIzNzUyNA==,45380,bollwyvl,2020-10-26T00:14:57Z,2020-10-26T00:14:57Z,CONTRIBUTOR,"Sorry, I was out of the loop this weekend. The missing sdists were in some the `datasette-*` plugins... i'll capture my findings more concretely in one spot when i have a chance...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727915394,Include LICENSE in sdist, https://github.com/simonw/datasette/issues/1051#issuecomment-716204271,https://api.github.com/repos/simonw/datasette/issues/1051,716204271,MDEyOklzc3VlQ29tbWVudDcxNjIwNDI3MQ==,9599,simonw,2020-10-25T20:08:04Z,2020-10-25T20:08:04Z,OWNER,"This is bad though, because if I want to provide binary data in CSV as requested in #1034 I need some way of providing that data. Which suggests to me that the base64 option is the only one that can make sense for arbitrary SQL queries represented as CSV. Download links won't work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729096595,Better display of binary data on arbitrary query results page, https://github.com/simonw/datasette/issues/1051#issuecomment-716204090,https://api.github.com/repos/simonw/datasette/issues/1051,716204090,MDEyOklzc3VlQ29tbWVudDcxNjIwNDA5MA==,9599,simonw,2020-10-25T20:06:42Z,2020-10-25T20:06:42Z,OWNER,"Providing a binary download link here is actually extremely difficult. The problem is that the SQL query itself represents data that can change from one moment to the next. It's no good showing a ""Binary: 55 bytes"" message that links to that same SQL query but with a `.blob` extension and arguments to select the particular result, because the data may change in a way that causes that query to return a different row - at which point the download link will give you the wrong data, not the 55 bytes you asked for. So providing a download link risks being misleading.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729096595,Better display of binary data on arbitrary query results page, https://github.com/simonw/datasette/issues/1050#issuecomment-716174203,https://api.github.com/repos/simonw/datasette/issues/1050,716174203,MDEyOklzc3VlQ29tbWVudDcxNjE3NDIwMw==,9599,simonw,2020-10-25T16:27:39Z,2020-10-25T16:53:27Z,OWNER,"Idea: `.blob` output rendererer, where you tell it which column you want using `?_blob_column=x`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729057388,Switch to .blob render extension for BLOB downloads, https://github.com/simonw/datasette/issues/1050#issuecomment-716175236,https://api.github.com/repos/simonw/datasette/issues/1050,716175236,MDEyOklzc3VlQ29tbWVudDcxNjE3NTIzNg==,9599,simonw,2020-10-25T16:35:20Z,2020-10-25T16:35:20Z,OWNER,"This is clearly a better solution than the one I implemented in #1040 - I don't have to add a new route, I don't have to implement permission checks, it reuses mechanism.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729057388,Switch to .blob render extension for BLOB downloads, https://github.com/simonw/datasette/pull/1049#issuecomment-716146238,https://api.github.com/repos/simonw/datasette/issues/1049,716146238,MDEyOklzc3VlQ29tbWVudDcxNjE0NjIzOA==,22429695,codecov[bot],2020-10-25T13:13:32Z,2020-10-25T13:13:32Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1049?src=pr&el=h1) Report > Merging [#1049](https://codecov.io/gh/simonw/datasette/pull/1049?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/42f4851e3e7885f1092f104d6c883cea40b12f02?el=desc) will **not change** coverage. > The diff coverage is `n/a`. [![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1049/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1049?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1049 +/- ## ======================================= Coverage 84.72% 84.72% ======================================= Files 28 28 Lines 3942 3942 ======================================= Hits 3340 3340 Misses 602 602 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1049?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1049?src=pr&el=footer). Last update [42f4851...50a743a](https://codecov.io/gh/simonw/datasette/pull/1049?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729017519,Add template block prior to extra URL loaders, https://github.com/simonw/datasette/issues/838#issuecomment-716123598,https://api.github.com/repos/simonw/datasette/issues/838,716123598,MDEyOklzc3VlQ29tbWVudDcxNjEyMzU5OA==,82988,psychemedia,2020-10-25T10:20:12Z,2020-10-25T10:53:24Z,CONTRIBUTOR,"I'm trying to [run something behind a MyBinder proxy](https://github.com/ouseful-testing/nbsearch), but seem to have something set up incorrectly and not sure what the fix is? I'm starting datasette with jupyter-server-proxy setup: ``` # __init__.py def setup_nbsearch(): return { ""command"": [ ""datasette"", ""serve"", f""{_NBSEARCH_DB_PATH}"", ""-p"", ""{port}"", ""--config"", ""base_url:{base_url}nbsearch/"" ], ""absolute_url"": True, # The following needs a the labextension installing. # eg in postBuild: jupyter labextension install jupyterlab-server-proxy ""launcher_entry"": { ""enabled"": True, ""title"": ""nbsearch"", }, } ``` where the `base_url` gets automatically populated by the server-proxy. I define the loaders as: ``` # __init__.py from datasette import hookimpl @hookimpl def extra_css_urls(database, table, columns, view_name, datasette): return [ ""/-/static-plugins/nbsearch/prism.css"", ""/-/static-plugins/nbsearch/nbsearch.css"", ] ``` but these seem to also need a base_url prefix set somehow? Currently, the generated HTML loads properly but internal links are incorrect; eg they take the form `` which resolves to eg `https://notebooks.gesis.org/hub/-/static-plugins/nbsearch/prism.css` rather than required URL of form `https://notebooks.gesis.org/binder/jupyter/user/ouseful-testing-nbsearch-0fx1mx67/nbsearch/-/static-plugins/nbsearch/prism.css`. The main css is loaded correctly: ``","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",637395097,Incorrect URLs when served behind a proxy with base_url set, https://github.com/simonw/datasette/issues/1034#issuecomment-716078777,https://api.github.com/repos/simonw/datasette/issues/1034,716078777,MDEyOklzc3VlQ29tbWVudDcxNjA3ODc3Nw==,9599,simonw,2020-10-25T01:25:11Z,2020-10-25T01:25:11Z,OWNER,"SQLite actually has APIs that could help here: https://www.sqlite.org/c3ref/column_database_name.html - for any given SQL query they identify the origin/table/column that is the source of each resulting column. Those aren't exposed in the Python `sqlite3` module though, so using them could be extremely tricky.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-716078605,https://api.github.com/repos/simonw/datasette/issues/1034,716078605,MDEyOklzc3VlQ29tbWVudDcxNjA3ODYwNQ==,9599,simonw,2020-10-25T01:22:22Z,2020-10-25T01:22:22Z,OWNER,For arbitrary CSV the only solution I can think of is to embed the base64 value.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-716078512,https://api.github.com/repos/simonw/datasette/issues/1034,716078512,MDEyOklzc3VlQ29tbWVudDcxNjA3ODUxMg==,9599,simonw,2020-10-25T01:21:11Z,2020-10-25T01:21:11Z,OWNER,"What should happen for CSV export of arbitrary SQL queries, where there's no obvious BLOB to link to?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-716078420,https://api.github.com/repos/simonw/datasette/issues/1034,716078420,MDEyOklzc3VlQ29tbWVudDcxNjA3ODQyMA==,9599,simonw,2020-10-25T01:20:00Z,2020-10-25T01:20:00Z,OWNER,That documentation: https://docs.datasette.io/en/latest/internals.html#absolute-url-request-path,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-716077541,https://api.github.com/repos/simonw/datasette/issues/1034,716077541,MDEyOklzc3VlQ29tbWVudDcxNjA3NzU0MQ==,9599,simonw,2020-10-25T01:09:38Z,2020-10-25T01:10:04Z,OWNER,I should turn `datasette.absolute_url(...)` into a documented internal API on https://docs.datasette.io/en/stable/internals.html#datasette-class,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-716077508,https://api.github.com/repos/simonw/datasette/issues/1034,716077508,MDEyOklzc3VlQ29tbWVudDcxNjA3NzUwOA==,9599,simonw,2020-10-25T01:09:17Z,2020-10-25T01:09:17Z,OWNER,Here's how those absolute `next_url` values are generated: https://github.com/simonw/datasette/blob/5db7ae3ce165ded57c7fb1cfbdb3258b1cf06c10/datasette/views/table.py#L774-L776,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-716077436,https://api.github.com/repos/simonw/datasette/issues/1034,716077436,MDEyOklzc3VlQ29tbWVudDcxNjA3NzQzNg==,9599,simonw,2020-10-25T01:08:35Z,2020-10-25T01:08:42Z,OWNER,"This is actually a bit tricky to implement, for a few reasons: - Need to generate a full URL, including the `https://host/` bit. I've done this for `next_url` in the JSON output before, thankfully. - This only makes sense for CSV output for tables. If it's the CSV output of an arbitrary query there's no `/db/table/-/blob/pk/column.blob` page for me to link to. - Need to generate those `/.../-/blob/...` URLs for the data that is being output as CSV.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-713277810,https://api.github.com/repos/simonw/datasette/issues/1034,713277810,MDEyOklzc3VlQ29tbWVudDcxMzI3NzgxMA==,9599,simonw,2020-10-21T03:40:50Z,2020-10-25T01:01:23Z,OWNER,Blocked awaiting #1036 (update: now unblocked),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1046#issuecomment-716071507,https://api.github.com/repos/simonw/datasette/issues/1046,716071507,MDEyOklzc3VlQ29tbWVudDcxNjA3MTUwNw==,9599,simonw,2020-10-25T00:06:47Z,2020-10-25T00:06:47Z,OWNER,"I used https://primer.style/octicons/download-16 instead. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",728895193,Link to blob downloads in the right places, https://github.com/simonw/datasette/pull/1040#issuecomment-713920562,https://api.github.com/repos/simonw/datasette/issues/1040,713920562,MDEyOklzc3VlQ29tbWVudDcxMzkyMDU2Mg==,22429695,codecov[bot],2020-10-21T22:44:12Z,2020-10-24T23:08:14Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1040?src=pr&el=h1) Report > Merging [#1040](https://codecov.io/gh/simonw/datasette/pull/1040?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/bf82b3d6a605c9ddadd5fb739249dfe6defaf635?el=desc) will **increase** coverage by `0.10%`. > The diff coverage is `100.00%`. [![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1040/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1040?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1040 +/- ## ========================================== + Coverage 84.65% 84.76% +0.10% ========================================== Files 28 28 Lines 3924 3938 +14 ========================================== + Hits 3322 3338 +16 + Misses 602 600 -2 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1040?src=pr&el=tree) | Coverage Δ | | |---|---|---| | [datasette/views/index.py](https://codecov.io/gh/simonw/datasette/pull/1040/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL2luZGV4LnB5) | `98.18% <ø> (+1.69%)` | :arrow_up: | | [datasette/views/special.py](https://codecov.io/gh/simonw/datasette/pull/1040/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL3NwZWNpYWwucHk=) | `92.70% <ø> (-0.82%)` | :arrow_down: | | [datasette/app.py](https://codecov.io/gh/simonw/datasette/pull/1040/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2FwcC5weQ==) | `96.37% <100.00%> (+0.17%)` | :arrow_up: | | [datasette/views/base.py](https://codecov.io/gh/simonw/datasette/pull/1040/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL2Jhc2UucHk=) | `93.77% <100.00%> (ø)` | | | [datasette/views/table.py](https://codecov.io/gh/simonw/datasette/pull/1040/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL3RhYmxlLnB5) | `96.07% <100.00%> (+0.22%)` | :arrow_up: | ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1040?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1040?src=pr&el=footer). Last update [bf82b3d...4f3165f](https://codecov.io/gh/simonw/datasette/pull/1040?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726910999,/db/table/-/blob/pk/column.blob download URL, https://github.com/simonw/datasette/issues/1046#issuecomment-716066342,https://api.github.com/repos/simonw/datasette/issues/1046,716066342,MDEyOklzc3VlQ29tbWVudDcxNjA2NjM0Mg==,9599,simonw,2020-10-24T23:02:07Z,2020-10-24T23:02:25Z,OWNER,"A download icon would be nice for the links in the table display. I like this one https://primer.style/octicons/download-24 ```svg ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",728895193,Link to blob downloads in the right places, https://github.com/simonw/datasette/issues/1033#issuecomment-716066000,https://api.github.com/repos/simonw/datasette/issues/1033,716066000,MDEyOklzc3VlQ29tbWVudDcxNjA2NjAwMA==,82988,psychemedia,2020-10-24T22:58:33Z,2020-10-24T22:58:33Z,CONTRIBUTOR,"From [the docs](https://docs.datasette.io/en/latest/internals.html#datasette-urls), I note: ``` datasette.urls.instance() Returns the URL to the Datasette instance root page. This is usually ""/"" ``` What about the proxy case? Eg if I am using jupyter-server-proxy on a MyBinder or local Jupyter notebook server site, `https://example.com:PORT/weirdpath/datasette`, what does `datasette.urls.instance()` refer to? - [ ] `https://example.com:PORT/weirdpath/datasette` - [ ] `https://example.com:PORT/weirdpath/` - [ ] `https://example.com:PORT/` - [ ] `https://example.com` - [ ] something else?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725099777,datasette.urls.static_plugins(...) method, https://github.com/simonw/datasette/issues/1033#issuecomment-716048564,https://api.github.com/repos/simonw/datasette/issues/1033,716048564,MDEyOklzc3VlQ29tbWVudDcxNjA0ODU2NA==,9599,simonw,2020-10-24T20:08:31Z,2020-10-24T20:08:31Z,OWNER,Documentation here: https://docs.datasette.io/en/latest/internals.html#datasette-urls,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725099777,datasette.urls.static_plugins(...) method, https://github.com/simonw/datasette/issues/575#issuecomment-716048199,https://api.github.com/repos/simonw/datasette/issues/575,716048199,MDEyOklzc3VlQ29tbWVudDcxNjA0ODE5OQ==,9599,simonw,2020-10-24T20:05:44Z,2020-10-24T20:05:44Z,OWNER,https://docs.datasette.io/en/latest/writing_plugins.html#static-assets,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",497162288,Plugin documentation should cover how to bundle static/templates in setup.py, https://github.com/simonw/datasette/issues/1042#issuecomment-715643763,https://api.github.com/repos/simonw/datasette/issues/1042,715643763,MDEyOklzc3VlQ29tbWVudDcxNTY0Mzc2Mw==,9599,simonw,2020-10-24T00:34:31Z,2020-10-24T00:34:52Z,OWNER,I'm going to rename that to template variable from `select_templates` to `templates_considered` too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-715643646,https://api.github.com/repos/simonw/datasette/issues/1042,715643646,MDEyOklzc3VlQ29tbWVudDcxNTY0MzY0Ng==,9599,simonw,2020-10-24T00:33:46Z,2020-10-24T00:33:46Z,OWNER,"I'd like to do this all in the `datasette.render_template()` method to ensure it's available to plugins as well, not just core code that uses the `BaseView` class. This code is the problem: https://github.com/simonw/datasette/blob/d3e9b0aecb6f8e9b2befd9c654ccb7ce852db3e7/datasette/views/base.py#L114-L133 I think I'll fix this by moving the `select_templates` mechanism into `datasette.render_templates()`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1045#issuecomment-715641183,https://api.github.com/repos/simonw/datasette/issues/1045,715641183,MDEyOklzc3VlQ29tbWVudDcxNTY0MTE4Mw==,9599,simonw,2020-10-24T00:19:29Z,2020-10-24T00:19:29Z,OWNER,"It turns out it already does that: https://github.com/simonw/datasette/blob/976e5f74aae1fa0d406df6691dc8b5feeebe8788/datasette/app.py#L710-L720 But the documentation doesn't reflect that: > `template` - string > > > The template file to be rendered, e.g. `my_plugin.html`. Datasette will search for this file first in the `--template-dir=` location, if it was specified - then in the plugin's bundled templates and finally in Datasette's set of default templates.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",728600048,"Document that datasette.render_template(template, ...) also accepts a list of templates", https://github.com/simonw/datasette/issues/1042#issuecomment-715618333,https://api.github.com/repos/simonw/datasette/issues/1042,715618333,MDEyOklzc3VlQ29tbWVudDcxNTYxODMzMw==,9599,simonw,2020-10-23T22:33:24Z,2020-10-23T22:33:24Z,OWNER,"It wouldn't be a disaster if template-loading plugins were unable to hook into the `/{slug1}/{slug2}.html` custom page mechanism, since plugins can define their own pages already using `register_routes()`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-715618077,https://api.github.com/repos/simonw/datasette/issues/1042,715618077,MDEyOklzc3VlQ29tbWVudDcxNTYxODA3Nw==,9599,simonw,2020-10-23T22:32:24Z,2020-10-23T22:32:24Z,OWNER,"Another option: the first version of the plugin hook could accept only the template filename. Subsequent releases could add more arguments, since Pluggy allows new arguments to be added without breaking backwards compatibility.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-715617830,https://api.github.com/repos/simonw/datasette/issues/1042,715617830,MDEyOklzc3VlQ29tbWVudDcxNTYxNzgzMA==,9599,simonw,2020-10-23T22:31:26Z,2020-10-23T22:31:26Z,OWNER,"So maybe this should be a `register_template_loader` mechanism that returns a Jinja loader after all? That would mean that only the template filename could be used as the input to the plugin, which doesn't seem as useful as emulating the `extra_template_vars()` interface.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-715617405,https://api.github.com/repos/simonw/datasette/issues/1042,715617405,MDEyOklzc3VlQ29tbWVudDcxNTYxNzQwNQ==,9599,simonw,2020-10-23T22:29:53Z,2020-10-23T22:29:53Z,OWNER,"Also consider that `DatasetteRouter` uses `.list_templates()` to gather together `{slug}.html` style templates for the custom page templates mechanism: https://github.com/simonw/datasette/blob/976e5f74aae1fa0d406df6691dc8b5feeebe8788/datasette/app.py#L949-L967 For that to work with the new plugin hook, custom template providing plugins will need a way to provide a list of templates that they know about.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-715616757,https://api.github.com/repos/simonw/datasette/issues/1042,715616757,MDEyOklzc3VlQ29tbWVudDcxNTYxNjc1Nw==,9599,simonw,2020-10-23T22:27:28Z,2020-10-23T22:27:28Z,OWNER,"Almost all of the core template loading happens in the `BaseView.render` method: https://github.com/simonw/datasette/blob/976e5f74aae1fa0d406df6691dc8b5feeebe8788/datasette/views/base.py#L114-L133 The one exception is the 404 handling code here: https://github.com/simonw/datasette/blob/976e5f74aae1fa0d406df6691dc8b5feeebe8788/datasette/app.py#L1034-L1042","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-715614971,https://api.github.com/repos/simonw/datasette/issues/1042,715614971,MDEyOklzc3VlQ29tbWVudDcxNTYxNDk3MQ==,9599,simonw,2020-10-23T22:20:14Z,2020-10-23T22:23:51Z,OWNER,"Alternative plugin hook idea: ```python @hookspec def load_template(template, database, table, columns, view_name, request, datasette): ""Load the specified template, returning the template code as a string"" ``` Imitating the existing `extra_template_vars` family of hooks: https://docs.datasette.io/en/stable/plugin_hooks.html#extra-template-vars-template-database-table-columns-view-name-request-datasette","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/pull/1040#issuecomment-715587715,https://api.github.com/repos/simonw/datasette/issues/1040,715587715,MDEyOklzc3VlQ29tbWVudDcxNTU4NzcxNQ==,9599,simonw,2020-10-23T21:01:07Z,2020-10-23T21:03:10Z,OWNER,"A download icon would be nice for the links in the table display. I like this one https://primer.style/octicons/download-24","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726910999,/db/table/-/blob/pk/column.blob download URL, https://github.com/simonw/datasette/pull/1043#issuecomment-715586711,https://api.github.com/repos/simonw/datasette/issues/1043,715586711,MDEyOklzc3VlQ29tbWVudDcxNTU4NjcxMQ==,9599,simonw,2020-10-23T20:58:26Z,2020-10-23T20:58:26Z,OWNER,I misunderstood - `asgi-csrf` already has an sdist.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727915394,Include LICENSE in sdist, https://github.com/simonw/datasette/pull/1043#issuecomment-715585140,https://api.github.com/repos/simonw/datasette/issues/1043,715585140,MDEyOklzc3VlQ29tbWVudDcxNTU4NTE0MA==,9599,simonw,2020-10-23T20:54:29Z,2020-10-23T20:54:29Z,OWNER,Thanks. I'll push a source release of `asgi-csrf`.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727915394,Include LICENSE in sdist, https://github.com/simonw/datasette/pull/1044#issuecomment-715584579,https://api.github.com/repos/simonw/datasette/issues/1044,715584579,MDEyOklzc3VlQ29tbWVudDcxNTU4NDU3OQ==,9599,simonw,2020-10-23T20:53:01Z,2020-10-23T20:53:01Z,OWNER,Thanks for this!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727916744,Add minimum supported python, https://github.com/simonw/datasette/issues/745#issuecomment-715556545,https://api.github.com/repos/simonw/datasette/issues/745,715556545,MDEyOklzc3VlQ29tbWVudDcxNTU1NjU0NQ==,9599,simonw,2020-10-23T19:47:10Z,2020-10-23T19:47:10Z,OWNER,Dupe of #647 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",608613033,Extract the hash-URL mechanism out into a plugin, https://github.com/simonw/datasette/issues/1042#issuecomment-715497419,https://api.github.com/repos/simonw/datasette/issues/1042,715497419,MDEyOklzc3VlQ29tbWVudDcxNTQ5NzQxOQ==,9599,simonw,2020-10-23T18:12:40Z,2020-10-23T18:12:40Z,OWNER,Maybe the template loader can optionally return some extra context to pass to the template. That could be used to solve the templates considered comment.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-715496859,https://api.github.com/repos/simonw/datasette/issues/1042,715496859,MDEyOklzc3VlQ29tbWVudDcxNTQ5Njg1OQ==,9599,simonw,2020-10-23T18:11:27Z,2020-10-23T18:11:27Z,OWNER,"When loading a template the filename is required, but you can optionally also send a set of extra arguments which the template loader can take into consideration.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-715490532,https://api.github.com/repos/simonw/datasette/issues/1042,715490532,MDEyOklzc3VlQ29tbWVudDcxNTQ5MDUzMg==,9599,simonw,2020-10-23T17:57:34Z,2020-10-23T17:57:34Z,OWNER,"A better version of this hook would be passed the database, table and query name depending on what was being rendered. This would require some re-thinking of how core templates are loaded, especially since I would want the templates considered comment to continue working.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/pull/1044#issuecomment-714916127,https://api.github.com/repos/simonw/datasette/issues/1044,714916127,MDEyOklzc3VlQ29tbWVudDcxNDkxNjEyNw==,22429695,codecov[bot],2020-10-23T05:12:52Z,2020-10-23T05:12:52Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1044?src=pr&el=h1) Report > Merging [#1044](https://codecov.io/gh/simonw/datasette/pull/1044?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/d0cc6f4c32e1f89238ddec782086b3122f445bd4?el=desc) will **not change** coverage. > The diff coverage is `n/a`. [![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1044/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1044?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1044 +/- ## ======================================= Coverage 84.65% 84.65% ======================================= Files 28 28 Lines 3924 3924 ======================================= Hits 3322 3322 Misses 602 602 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1044?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1044?src=pr&el=footer). Last update [d0cc6f4...6453ab1](https://codecov.io/gh/simonw/datasette/pull/1044?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727916744,Add minimum supported python, https://github.com/simonw/datasette/pull/1043#issuecomment-714915025,https://api.github.com/repos/simonw/datasette/issues/1043,714915025,MDEyOklzc3VlQ29tbWVudDcxNDkxNTAyNQ==,22429695,codecov[bot],2020-10-23T05:09:09Z,2020-10-23T05:09:09Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1043?src=pr&el=h1) Report > Merging [#1043](https://codecov.io/gh/simonw/datasette/pull/1043?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/d0cc6f4c32e1f89238ddec782086b3122f445bd4?el=desc) will **not change** coverage. > The diff coverage is `n/a`. [![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1043/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1043?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1043 +/- ## ======================================= Coverage 84.65% 84.65% ======================================= Files 28 28 Lines 3924 3924 ======================================= Hits 3322 3322 Misses 602 602 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1043?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1043?src=pr&el=footer). Last update [d0cc6f4...dc4129c](https://codecov.io/gh/simonw/datasette/pull/1043?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727915394,Include LICENSE in sdist, https://github.com/simonw/datasette/issues/1012#issuecomment-714908859,https://api.github.com/repos/simonw/datasette/issues/1012,714908859,MDEyOklzc3VlQ29tbWVudDcxNDkwODg1OQ==,45380,bollwyvl,2020-10-23T04:49:20Z,2020-10-23T04:49:20Z,CONTRIBUTOR,"Good luck on 1.0! It may also be worth lobbying for a `Framework::Datasette::1.0` classifier. This would be a nice way to allow the ecosystem to self-document a bit more [discoverably](https://pypi.org/search/?q=&o=&c=Framework+%3A%3A+Datasette%3A%3A+1.0). I was surprised to see the [PR for `Framework::Jupyter`](https://github.com/pypa/warehouse/pull/1905/files) is a... database migration! Of course, there may be more workflow to it!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",718540751,For 1.0 update trove classifier in setup.py, https://github.com/simonw/datasette/issues/1042#issuecomment-714868867,https://api.github.com/repos/simonw/datasette/issues/1042,714868867,MDEyOklzc3VlQ29tbWVudDcxNDg2ODg2Nw==,9599,simonw,2020-10-23T02:31:17Z,2020-10-23T02:31:17Z,OWNER,I'll build this in conjunction with a plugin that supports editing templates stored in SQLite.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-714868624,https://api.github.com/repos/simonw/datasette/issues/1042,714868624,MDEyOklzc3VlQ29tbWVudDcxNDg2ODYyNA==,9599,simonw,2020-10-23T02:30:27Z,2020-10-23T02:30:37Z,OWNER,Maybe `register_template_loader(datasette)` which returns an object which is added in at the beginning of the list passed to `ChoiceLoader` here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/datasette/issues/1042#issuecomment-714868207,https://api.github.com/repos/simonw/datasette/issues/1042,714868207,MDEyOklzc3VlQ29tbWVudDcxNDg2ODIwNw==,9599,simonw,2020-10-23T02:29:12Z,2020-10-23T02:29:12Z,OWNER,Relevant code: https://github.com/simonw/datasette/blob/d0cc6f4c32e1f89238ddec782086b3122f445bd4/datasette/app.py#L288-L311,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727802081,Plugin hook for loading templates, https://github.com/simonw/sqlite-utils/issues/173#issuecomment-714758139,https://api.github.com/repos/simonw/sqlite-utils/issues/173,714758139,MDEyOklzc3VlQ29tbWVudDcxNDc1ODEzOQ==,9599,simonw,2020-10-22T20:57:56Z,2020-10-22T20:57:56Z,OWNER,"I could use `ijson` to provide a progress bar for JSON arrays too. I'd prefer to keep that as an optional dependency though, since `sqlite-utils` is a library dependency for many other projects and it would be using `ijson` purely for the CLI component. Here's how to iterate through a list of objects being read from a file: ```python import json parser = ijson.items(open( ""/tmp/list.json"" ), ""item"") for object in parser: # ... ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",707478649,Progress bar for sqlite-utils insert, https://github.com/simonw/datasette/issues/1041#issuecomment-714683801,https://api.github.com/repos/simonw/datasette/issues/1041,714683801,MDEyOklzc3VlQ29tbWVudDcxNDY4MzgwMQ==,9599,simonw,2020-10-22T18:37:47Z,2020-10-22T18:37:47Z,OWNER,"I think I'll do this by looking for URLs that start with `/` - since it's also possible to have full `https://...` URLs in that setting. ```json { ""extra_css_urls"": [ ""/static/styles.css"" ], ""extra_js_urls"": [ ""/static/app.js"" ] } ``` I need to think about the `extra_css_urls` and `extra_js_urls` plugin hooks too: https://docs.datasette.io/en/stable/plugin_hooks.html#extra-css-urls-template-database-table-columns-view-name-request-datasette","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727627923,extra_js_urls and extra_css_urls should respect base_url setting, https://github.com/simonw/datasette/issues/1041#issuecomment-714682825,https://api.github.com/repos/simonw/datasette/issues/1041,714682825,MDEyOklzc3VlQ29tbWVudDcxNDY4MjgyNQ==,9599,simonw,2020-10-22T18:36:10Z,2020-10-22T18:36:10Z,OWNER,I'll need to update these docs once there's a solution for this in place: https://docs.datasette.io/en/latest/custom_templates.html#serving-static-files,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727627923,extra_js_urls and extra_css_urls should respect base_url setting, https://github.com/simonw/datasette/issues/1041#issuecomment-714682288,https://api.github.com/repos/simonw/datasette/issues/1041,714682288,MDEyOklzc3VlQ29tbWVudDcxNDY4MjI4OA==,9599,simonw,2020-10-22T18:35:15Z,2020-10-22T18:35:15Z,OWNER,"@psychemedia said: https://github.com/simonw/datasette/issues/1033#issuecomment-714657366 > How does `/-/static` relate to [current guidance docs around `static`](https://docs.datasette.io/en/latest/custom_templates.html?highlight=static#serving-static-files) regarding the `--static option` and metadata formulations such as `""extra_js_urls"": [ ""/static/app.js""]` (I've not managed to get this to work in a Jupyter server proxied set up; the [datasette / jupyter server proxy repo](https://github.com/simonw/jupyterserverproxy-datasette-demo) may provide a useful test example, eg via MyBinder, for folk to crib from?)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727627923,extra_js_urls and extra_css_urls should respect base_url setting, https://github.com/simonw/datasette/issues/1033#issuecomment-714681365,https://api.github.com/repos/simonw/datasette/issues/1033,714681365,MDEyOklzc3VlQ29tbWVudDcxNDY4MTM2NQ==,9599,simonw,2020-10-22T18:33:48Z,2020-10-22T18:33:48Z,OWNER,"That's a good question - I hadn't considered that. I'm going to open a new issue to have `extra_js_urls` respect the `base_url` setting, since the static files will be served from a different location.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725099777,datasette.urls.static_plugins(...) method, https://github.com/simonw/datasette/issues/1033#issuecomment-714657366,https://api.github.com/repos/simonw/datasette/issues/1033,714657366,MDEyOklzc3VlQ29tbWVudDcxNDY1NzM2Ng==,82988,psychemedia,2020-10-22T17:51:29Z,2020-10-22T17:51:29Z,CONTRIBUTOR,"How does `/-/static` relate to [current guidance docs around `static`](https://docs.datasette.io/en/latest/custom_templates.html?highlight=static#serving-static-files) regarding the `--static option` and metadata formulations such as `""extra_js_urls"": [ ""/static/app.js""]` (I've not managed to get this to work in a Jupyter server proxied set up; the [datasette / jupyter server proxy repo](https://github.com/simonw/jupyterserverproxy-datasette-demo) may provide a useful test example, eg via MyBinder, for folk to crib from?) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725099777,datasette.urls.static_plugins(...) method, https://github.com/simonw/datasette/pull/1031#issuecomment-714289680,https://api.github.com/repos/simonw/datasette/issues/1031,714289680,MDEyOklzc3VlQ29tbWVudDcxNDI4OTY4MA==,299380,frankier,2020-10-22T07:23:52Z,2020-10-22T07:23:52Z,NONE,"The bug is that currently when there are databases passed in, but no -i flag, e.g. in configuration directory mode, inclusion in inspect-data.json does not automatically cause databases to be considered immutable, as described in the documentation. The reason is that the -i flag is specified multiple=True, which means when it is not passed in we will get an empty list [], rather than None. So the current code decides that no databases are immutable rather than falling back to inspect-data.json -- as is presumably intended.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724369025,Fallback to databases in inspect-data.json when no -i options are passed, https://github.com/simonw/sqlite-utils/issues/171#issuecomment-714219725,https://api.github.com/repos/simonw/sqlite-utils/issues/171,714219725,MDEyOklzc3VlQ29tbWVudDcxNDIxOTcyNQ==,649467,mhalle,2020-10-22T04:38:35Z,2020-10-22T04:38:35Z,NONE,"Thanks. As I said, I think the result (being able to query tree structures like ancestors and descendants) is more important than the implementation, and I agree that this particular sqlite extension is too obscure. Just providing an sqlite utility to build or rebuild a transitive closure table might be more generically useful. I find that hierarchical data shows up pretty frequently in some data science problems.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",707407567,Idea: transitive closure tables for tree structures, https://github.com/simonw/sqlite-utils/issues/171#issuecomment-714208848,https://api.github.com/repos/simonw/sqlite-utils/issues/171,714208848,MDEyOklzc3VlQ29tbWVudDcxNDIwODg0OA==,9599,simonw,2020-10-22T04:07:14Z,2020-10-22T04:07:14Z,OWNER,"I made the `--load-extension` command much more widely supported in #137 - which should be useful for anyone who wants to use this extension. It's a bit too obscure for me to want to add direct Python library support relating to that extension though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",707407567,Idea: transitive closure tables for tree structures, https://github.com/simonw/datasette/pull/1031#issuecomment-714206875,https://api.github.com/repos/simonw/datasette/issues/1031,714206875,MDEyOklzc3VlQ29tbWVudDcxNDIwNjg3NQ==,9599,simonw,2020-10-22T04:01:19Z,2020-10-22T04:01:19Z,OWNER,I don't fully understand the bug you're fixing here. Could you provide a bit more explanation?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724369025,Fallback to databases in inspect-data.json when no -i options are passed, https://github.com/simonw/datasette/pull/1040#issuecomment-714206533,https://api.github.com/repos/simonw/datasette/issues/1040,714206533,MDEyOklzc3VlQ29tbWVudDcxNDIwNjUzMw==,9599,simonw,2020-10-22T04:00:25Z,2020-10-22T04:00:25Z,OWNER,I've decided not to offer a configuration option to turn this off. I'll reconsider if someone asks for it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726910999,/db/table/-/blob/pk/column.blob download URL, https://github.com/simonw/datasette/issues/998#issuecomment-714205783,https://api.github.com/repos/simonw/datasette/issues/998,714205783,MDEyOklzc3VlQ29tbWVudDcxNDIwNTc4Mw==,9599,simonw,2020-10-22T03:58:13Z,2020-10-22T03:58:13Z,OWNER,This is now live here: https://global-power-plants.datasettes.com/global-power-plants/global-power-plants,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",717699884,Wide tables should scroll horizontally within the page, https://github.com/simonw/datasette/issues/998#issuecomment-714117534,https://api.github.com/repos/simonw/datasette/issues/998,714117534,MDEyOklzc3VlQ29tbWVudDcxNDExNzUzNA==,9599,simonw,2020-10-22T01:12:06Z,2020-10-22T01:12:06Z,OWNER,"Demo: ![table-scroll](https://user-images.githubusercontent.com/9599/96806421-e74e7e00-13c8-11eb-95fe-44d01e4c2eb3.gif) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",717699884,Wide tables should scroll horizontally within the page, https://github.com/simonw/datasette/issues/998#issuecomment-714092002,https://api.github.com/repos/simonw/datasette/issues/998,714092002,MDEyOklzc3VlQ29tbWVudDcxNDA5MjAwMg==,9599,simonw,2020-10-22T00:55:10Z,2020-10-22T00:55:10Z,OWNER,This isn't blocked on #987 - it just means that `datasette-cluster-map` will need to learn to look for `.table-wrapper` first and fall back on the table.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",717699884,Wide tables should scroll horizontally within the page, https://github.com/simonw/datasette/issues/998#issuecomment-714090965,https://api.github.com/repos/simonw/datasette/issues/998,714090965,MDEyOklzc3VlQ29tbWVudDcxNDA5MDk2NQ==,9599,simonw,2020-10-22T00:54:30Z,2020-10-22T00:54:30Z,OWNER,"Easiest fix for the column action menu positioning - hide them when the user scrolls the containing div: ```javascript document.querySelector('.table-wrapper').addEventListener( 'scroll', () => document.querySelector('.dropdown-menu').style.display = 'none' ); ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",717699884,Wide tables should scroll horizontally within the page, https://github.com/simonw/datasette/pull/1038#issuecomment-713920461,https://api.github.com/repos/simonw/datasette/issues/1038,713920461,MDEyOklzc3VlQ29tbWVudDcxMzkyMDQ2MQ==,9599,simonw,2020-10-21T22:43:51Z,2020-10-21T22:43:51Z,OWNER,Thanks for spotting this!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726154220,DOC: Fix syntax error, https://github.com/simonw/datasette/issues/1036#issuecomment-713899530,https://api.github.com/repos/simonw/datasette/issues/1036,713899530,MDEyOklzc3VlQ29tbWVudDcxMzg5OTUzMA==,9599,simonw,2020-10-21T21:55:00Z,2020-10-21T21:55:00Z,OWNER,"This code needs these permission checks: https://github.com/simonw/datasette/blob/bf82b3d6a605c9ddadd5fb739249dfe6defaf635/datasette/views/table.py#L911-L913","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713821656,https://api.github.com/repos/simonw/datasette/issues/1036,713821656,MDEyOklzc3VlQ29tbWVudDcxMzgyMTY1Ng==,9599,simonw,2020-10-21T19:22:45Z,2020-10-21T19:41:48Z,OWNER,"So for https://latest.datasette.io/fixtures/binary_data the BLOB download URLs would be: `https://latest.datasette.io/fixtures/-/blob/binary_data/1/data.blob` - that last bit after the primary key is to indicate the `data` column With these headers: - `Content-Disposition: attachment; filename=""binary_data-1-data.blob""` - `X-Content-Type-Options: nosniff` - `Content-Type: application/binary`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713830842,https://api.github.com/repos/simonw/datasette/issues/1036,713830842,MDEyOklzc3VlQ29tbWVudDcxMzgzMDg0Mg==,9599,simonw,2020-10-21T19:41:20Z,2020-10-21T19:41:20Z,OWNER,Another useful demo database: https://datasette-render-images-demo.datasette.io/favicons/favicons - see https://datasette-render-images-demo.datasette.io/favicons/favicons.csv,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713829629,https://api.github.com/repos/simonw/datasette/issues/1036,713829629,MDEyOklzc3VlQ29tbWVudDcxMzgyOTYyOQ==,9599,simonw,2020-10-21T19:38:43Z,2020-10-21T19:38:43Z,OWNER,"Should this work just for BLOB columns, or should it work for other columns too? For the moment I'm going to restrict it to BLOBs, since data from other columns is available through the UI whereas BLOB columns are not.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713818817,https://api.github.com/repos/simonw/datasette/issues/1036,713818817,MDEyOklzc3VlQ29tbWVudDcxMzgxODgxNw==,9599,simonw,2020-10-21T19:17:01Z,2020-10-21T19:17:01Z,OWNER,Actually I like `.blob`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713818178,https://api.github.com/repos/simonw/datasette/issues/1036,713818178,MDEyOklzc3VlQ29tbWVudDcxMzgxODE3OA==,9599,simonw,2020-10-21T19:15:38Z,2020-10-21T19:16:34Z,OWNER,"What should the suggested filename be? I think something that includes the table name, primary key and the name of the column would work. How about a file extension? I guess `.binary`, then let the user rename it? Or `.raw`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1039#issuecomment-713754844,https://api.github.com/repos/simonw/datasette/issues/1039,713754844,MDEyOklzc3VlQ29tbWVudDcxMzc1NDg0NA==,9599,simonw,2020-10-21T17:58:27Z,2020-10-21T17:58:27Z,OWNER,"Now live: https://latest.datasette.io/fixtures/roadside_attraction_characteristics ![anim](https://user-images.githubusercontent.com/9599/96759016-55288480-138c-11eb-8ba0-d8e0f6dd8b1f.gif) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726687572,Add an animation to the column actions menu, https://github.com/simonw/datasette/pull/1038#issuecomment-713320666,https://api.github.com/repos/simonw/datasette/issues/1038,713320666,MDEyOklzc3VlQ29tbWVudDcxMzMyMDY2Ng==,22429695,codecov[bot],2020-10-21T05:50:38Z,2020-10-21T05:50:38Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1038?src=pr&el=h1) Report > Merging [#1038](https://codecov.io/gh/simonw/datasette/pull/1038?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/66120a7a1cb592e8a21164cf537f62a4d7ab1dfc?el=desc) will **not change** coverage. > The diff coverage is `n/a`. [![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1038/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1038?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1038 +/- ## ======================================= Coverage 84.65% 84.65% ======================================= Files 28 28 Lines 3924 3924 ======================================= Hits 3322 3322 Misses 602 602 ``` ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1038?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1038?src=pr&el=footer). Last update [66120a7...7fc0cce](https://codecov.io/gh/simonw/datasette/pull/1038?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726154220,DOC: Fix syntax error, https://github.com/simonw/datasette/issues/1036#issuecomment-713278349,https://api.github.com/repos/simonw/datasette/issues/1036,713278349,MDEyOklzc3VlQ29tbWVudDcxMzI3ODM0OQ==,9599,simonw,2020-10-21T03:42:29Z,2020-10-21T03:42:29Z,OWNER,Possible URL for this: `/db/table/-/blob/primary-keys` - this would use the `/db/table/-/` namespace proposed in #296.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/998#issuecomment-713269155,https://api.github.com/repos/simonw/datasette/issues/998,713269155,MDEyOklzc3VlQ29tbWVudDcxMzI2OTE1NQ==,9599,simonw,2020-10-21T03:17:07Z,2020-10-21T03:17:07Z,OWNER,"This may require updates to the column action menu JavaScript too, since it was not built with scrolling sideways in mind.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",717699884,Wide tables should scroll horizontally within the page, https://github.com/simonw/datasette/issues/1037#issuecomment-713268905,https://api.github.com/repos/simonw/datasette/issues/1037,713268905,MDEyOklzc3VlQ29tbWVudDcxMzI2ODkwNQ==,9599,simonw,2020-10-21T03:16:36Z,2020-10-21T03:16:36Z,OWNER,Dupe of #998.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726094754,Add horizontal scrollbar to tables, https://github.com/simonw/datasette/issues/1037#issuecomment-713268498,https://api.github.com/repos/simonw/datasette/issues/1037,713268498,MDEyOklzc3VlQ29tbWVudDcxMzI2ODQ5OA==,9599,simonw,2020-10-21T03:15:44Z,2020-10-21T03:15:44Z,OWNER,"This may require updates to the column action menu JavaScript too, since it was not built with scrolling sideways in mind.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726094754,Add horizontal scrollbar to tables, https://github.com/simonw/datasette/issues/1037#issuecomment-713267989,https://api.github.com/repos/simonw/datasette/issues/1037,713267989,MDEyOklzc3VlQ29tbWVudDcxMzI2Nzk4OQ==,9599,simonw,2020-10-21T03:14:34Z,2020-10-21T03:14:34Z,OWNER,"This is particularly relevant to the `datasette-cluster-map` plugin - the map is much nicer to use if the table itself can be scrolled. That plugin also makes this harder to build, because the plugin inserts the map as the direct predecessor of the `` element and hence breaks if you try to wrap that in a `
`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",726094754,Add horizontal scrollbar to tables, https://github.com/simonw/datasette/issues/1036#issuecomment-713226726,https://api.github.com/repos/simonw/datasette/issues/1036,713226726,MDEyOklzc3VlQ29tbWVudDcxMzIyNjcyNg==,9599,simonw,2020-10-21T01:04:25Z,2020-10-21T01:04:25Z,OWNER,"Extra security idea: a `blob_download_host` setting which can be used to indicate a host that should be used for downloads - for example `datasettestatic.com`. If this setting is populated then binary downloads are served from paths on that host only, and no other Datasette URLs from that host will be served.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/262#issuecomment-713208667,https://api.github.com/repos/simonw/datasette/issues/262,713208667,MDEyOklzc3VlQ29tbWVudDcxMzIwODY2Nw==,9599,simonw,2020-10-21T00:03:18Z,2020-10-21T00:03:18Z,OWNER,"I think I should prioritize the facets component of this, since that could have significant performance wins while also supporting `datasette-graphql`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-713200782,https://api.github.com/repos/simonw/datasette/issues/262,713200782,MDEyOklzc3VlQ29tbWVudDcxMzIwMDc4Mg==,9599,simonw,2020-10-20T23:41:30Z,2020-10-20T23:41:30Z,OWNER,This is now blocking https://github.com/simonw/datasette-graphql/issues/61 because that issue needs a way to turn off suggested facets when retrieving the results of a table query.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/1034#issuecomment-713191819,https://api.github.com/repos/simonw/datasette/issues/1034,713191819,MDEyOklzc3VlQ29tbWVudDcxMzE5MTgxOQ==,9599,simonw,2020-10-20T23:12:58Z,2020-10-20T23:12:58Z,OWNER,"Enzo has a great solution here: https://twitter.com/enzo_mdd/status/1318685442976436226 > Or maybe an option for a url. This keeps the CSV small but allows scripts to download binary data as needed. In #1036 I'm planning on adding a way for users to access BLOB data. I can include that URL in the CSV output.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1036#issuecomment-713186189,https://api.github.com/repos/simonw/datasette/issues/1036,713186189,MDEyOklzc3VlQ29tbWVudDcxMzE4NjE4OQ==,9599,simonw,2020-10-20T22:56:33Z,2020-10-20T22:56:33Z,OWNER,I think this plus the binary-CSV stuff in #1034 will justify a dedicated section of the documentation to talk about how Datasette handles binary BLOB columns.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713185871,https://api.github.com/repos/simonw/datasette/issues/1036,713185871,MDEyOklzc3VlQ29tbWVudDcxMzE4NTg3MQ==,9599,simonw,2020-10-20T22:55:36Z,2020-10-20T22:55:36Z,OWNER,I can also use a `Content-Disposition` header to force a download. I'm reasonably confident that the combination of `Content-Disposition` and `X-Content-Type-Options: nosniff` and `application/binary` will let me allow users to download the contents of arbitrary BLOB columns without any XSS risk.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713185173,https://api.github.com/repos/simonw/datasette/issues/1036,713185173,MDEyOklzc3VlQ29tbWVudDcxMzE4NTE3Mw==,9599,simonw,2020-10-20T22:53:41Z,2020-10-20T22:53:41Z,OWNER,"https://security.stackexchange.com/questions/12896/does-x-content-type-options-really-prevent-content-sniffing-attacks says: > In Tangled Web Michal Zalewski says: > > > Refrain from using Content-Type: application/octet-stream and use application/binary instead, especially for unknown document types. Refrain from returning Content-Type: text/plain. > > > > For example, any code-hosting platform must exercise caution when returning executables or source archives as application/octet-stream, because there is a risk they may be misinterpreted as HTML and displayed inline.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713184374,https://api.github.com/repos/simonw/datasette/issues/1036,713184374,MDEyOklzc3VlQ29tbWVudDcxMzE4NDM3NA==,9599,simonw,2020-10-20T22:51:22Z,2020-10-20T22:51:22Z,OWNER,"From https://hackerone.com/reports/126197: > archive.uber.com mirrors pypi. When downloading `.tar.gz` files from archive.uber.com, the MIME type is `application/octet-stream`. Injecting `` into the start of the `.tar.gz` causes an XSS in Internet Explorer due to MIME sniffing. So you do have to be careful not to open accidental XSS holes with `application/octet-stream` thanks to (presumably older) versions of IE. From that thread it looks like the solution is to add a `X-Content-Type-Options: nosniff` header.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1036#issuecomment-713183306,https://api.github.com/repos/simonw/datasette/issues/1036,713183306,MDEyOklzc3VlQ29tbWVudDcxMzE4MzMwNg==,9599,simonw,2020-10-20T22:48:10Z,2020-10-20T22:48:10Z,OWNER,Twitter thread: https://twitter.com/dancow/status/1318681053347840005,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI, https://github.com/simonw/datasette/issues/1034#issuecomment-713176082,https://api.github.com/repos/simonw/datasette/issues/1034,713176082,MDEyOklzc3VlQ29tbWVudDcxMzE3NjA4Mg==,9599,simonw,2020-10-20T22:27:33Z,2020-10-20T22:27:33Z,OWNER,"This feels good to me - it's consistent with how other features in Datasette work, and it means users who need the binary data in CSV (for whatever reason) can get it if they want to.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-713175741,https://api.github.com/repos/simonw/datasette/issues/1034,713175741,MDEyOklzc3VlQ29tbWVudDcxMzE3NTc0MQ==,9599,simonw,2020-10-20T22:26:45Z,2020-10-20T22:26:45Z,OWNER,"> New idea: since binary in CSV doesn't make sense anyway, emulate Datasette's HTML UI default and output this: > > id,title,data > 1,Some title, > 2,Other title, > > Then allow users to add ?_base64=1 to the URL to get base64 instead > https://twitter.com/simonw/status/1318679950635888641","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-713174690,https://api.github.com/repos/simonw/datasette/issues/1034,713174690,MDEyOklzc3VlQ29tbWVudDcxMzE3NDY5MA==,9599,simonw,2020-10-20T22:23:50Z,2020-10-20T22:23:50Z,OWNER,Or... default to `` and support a `?_base64=1` option which outputs in base64 instead.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-713174341,https://api.github.com/repos/simonw/datasette/issues/1034,713174341,MDEyOklzc3VlQ29tbWVudDcxMzE3NDM0MQ==,9599,simonw,2020-10-20T22:22:53Z,2020-10-20T22:23:14Z,OWNER,"An even easier option: do what the Datasette UI does and output `` for that CSV cell, as seen on https://latest.datasette.io/fixtures/binary_data","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-713172901,https://api.github.com/repos/simonw/datasette/issues/1034,713172901,MDEyOklzc3VlQ29tbWVudDcxMzE3MjkwMQ==,9599,simonw,2020-10-20T22:19:10Z,2020-10-20T22:20:28Z,OWNER,"I could go with the same format as `datasette-render-binary` but using `0x00` as the format for the hex bytes. 0x15 0x1C 0x02 0xC7 JFIF 0x00 0x01 Problem with this is that it's ambiguous: if the ASCII characters `0x15` occur in the text they will be indistinguishable from those hex bytes. But since representing binary data in CSV fundamentally doesn't make sense I'm not sure if that really matters.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/741#issuecomment-713171742,https://api.github.com/repos/simonw/datasette/issues/741,713171742,MDEyOklzc3VlQ29tbWVudDcxMzE3MTc0Mg==,9599,simonw,2020-10-20T22:16:25Z,2020-10-20T22:16:25Z,OWNER,See also #992 which will rename `--config` to `--setting`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",607223136,"Replace ""datasette publish --extra-options"" with ""--setting""", https://github.com/simonw/datasette/issues/262#issuecomment-713170979,https://api.github.com/repos/simonw/datasette/issues/262,713170979,MDEyOklzc3VlQ29tbWVudDcxMzE3MDk3OQ==,9599,simonw,2020-10-20T22:14:37Z,2020-10-20T22:14:37Z,OWNER,"I think it's worth having a plugin hook for this - it can be same hook that is used internally. Maybe `register_extra` - it lets you return one or more `extra` implementations, each with a name and an async function that gets called. Things like suggested facets will become `register_extra` hooks. Maybe actual facets too?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-713170284,https://api.github.com/repos/simonw/datasette/issues/262,713170284,MDEyOklzc3VlQ29tbWVudDcxMzE3MDI4NA==,9599,simonw,2020-10-20T22:13:01Z,2020-10-20T22:13:01Z,OWNER,In the documentation for `?_extra=` I think I'll emphasize the comma-separated version of it. Also: there will be `?_extra=` values which act as aliases for collection combinations - e.g. `?_extra=full` will toggle everything.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/782#issuecomment-712986115,https://api.github.com/repos/simonw/datasette/issues/782,712986115,MDEyOklzc3VlQ29tbWVudDcxMjk4NjExNQ==,9599,simonw,2020-10-20T16:28:46Z,2020-10-20T16:29:51Z,OWNER,"I think this all comes down to how the `?_extras=` mechanism works (see #262), as first hinted at in a30c5b220c15360d575e94b0e67f3255e120b916 (see commit message) when I added this long-forgotten undocumented feature: https://latest.datasette.io/fixtures/attraction_characteristic/2.json?_extras=foreign_key_tables Extras need to be able to execute additional SQL, since that would solve the problem we have now where the expensive ""suggested facets"" code runs on all `.json` output even when its results are not being shown.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879,Redesign default .json format, https://github.com/simonw/datasette/issues/1035#issuecomment-712976314,https://api.github.com/repos/simonw/datasette/issues/1035,712976314,MDEyOklzc3VlQ29tbWVudDcxMjk3NjMxNA==,9599,simonw,2020-10-20T16:21:42Z,2020-10-20T16:21:42Z,OWNER,"Makes me question if `datasette.urls` should grow functionality equivalent to the other path and querystring manipulation methods in `datasette.utils`: https://github.com/simonw/datasette/blob/66120a7a1cb592e8a21164cf537f62a4d7ab1dfc/datasette/utils/__init__.py#L216-L279","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725743755,"datasette.urls.table(..., format=""json"") argument", https://github.com/simonw/datasette/issues/1035#issuecomment-712965574,https://api.github.com/repos/simonw/datasette/issues/1035,712965574,MDEyOklzc3VlQ29tbWVudDcxMjk2NTU3NA==,9599,simonw,2020-10-20T16:13:57Z,2020-10-20T16:13:57Z,OWNER,"That `renderers[key] = path_with_format(` is in a base class which can be used for both arbitrary queries, canned queries and the table view. I think that's OK, but it means that the `format=""json""` argument on `datasette.urls.table()` won't be used by Datasette internally, it will just be available for plugins.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725743755,"datasette.urls.table(..., format=""json"") argument", https://github.com/simonw/datasette/issues/1035#issuecomment-712963959,https://api.github.com/repos/simonw/datasette/issues/1035,712963959,MDEyOklzc3VlQ29tbWVudDcxMjk2Mzk1OQ==,9599,simonw,2020-10-20T16:11:25Z,2020-10-20T16:11:25Z,OWNER,"Relevant code: https://github.com/simonw/datasette/blob/091441a4449beae559a8c0d007376dc85d3aa624/datasette/utils/__init__.py#L681-L696 Only used here: https://github.com/simonw/datasette/blob/091441a4449beae559a8c0d007376dc85d3aa624/datasette/views/base.py#L498-L502","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725743755,"datasette.urls.table(..., format=""json"") argument", https://github.com/simonw/datasette/issues/1026#issuecomment-712962517,https://api.github.com/repos/simonw/datasette/issues/1026,712962517,MDEyOklzc3VlQ29tbWVudDcxMjk2MjUxNw==,9599,simonw,2020-10-20T16:09:12Z,2020-10-20T16:09:12Z,OWNER,"That `datasette.urls.table(""db"", ""table"") + "".json""` example is bad because if the table name contains a `.` it should be `?_format=json` instead. Maybe `.table()` should have a `format=""json""` option that knows how to do this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722738988,How should datasette.client interact with base_url, https://github.com/simonw/datasette/issues/1026#issuecomment-712959034,https://api.github.com/repos/simonw/datasette/issues/1026,712959034,MDEyOklzc3VlQ29tbWVudDcxMjk1OTAzNA==,9599,simonw,2020-10-20T16:03:33Z,2020-10-20T16:03:55Z,OWNER,"Reconsidering this: I think the `.get()` etc methods should automatically add the `base_url` prefix for you, since these APIs are only intended to make internal calls. The clincher on this is when I went to add a section to the `datasette.client` documentation recommending you use `datasette.urls.path()` for every call to them that you make. But there's a problem: to handle table name escaping users are likely to want to use `datasette.urls.table()` anyway, like this: response = await datasette.client.get(datasette.urls.table(""db"", ""table"") + "".json"") This risks adding the `base_url` prefix twice. Maybe the `.table()` method could return a string-like object that is marked as already having the `base_url` prefix added, so the `client.get()` method knows not to add it again. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722738988,How should datasette.client interact with base_url, https://github.com/simonw/datasette/issues/991#issuecomment-712855389,https://api.github.com/repos/simonw/datasette/issues/991,712855389,MDEyOklzc3VlQ29tbWVudDcxMjg1NTM4OQ==,24740,furilo,2020-10-20T13:36:41Z,2020-10-20T13:36:41Z,NONE,"Here is one quick sketch (done in Figma :P) for an idea: a possible filter to switch between showing all tables from all databases, or grouping tables by database. (the switch is interactive) All tables: https://www.figma.com/proto/BjFrMroEtmVx6EeRjvSrox/Datasette-test?node-id=1%3A2&viewport=536%2C348%2C0.5&scaling=min-zoom Grouped: https://www.figma.com/proto/BjFrMroEtmVx6EeRjvSrox/Datasette-test?node-id=3%3A974&viewport=536%2C348%2C0.5&scaling=min-zoom When only 1 database: https://www.figma.com/proto/BjFrMroEtmVx6EeRjvSrox/Datasette-test?node-id=1%3A162&viewport=536%2C348%2C0.5&scaling=min-zoom Is this is useful, I can send some more suggestions/sketches. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",714377268,Redesign application homepage, https://github.com/simonw/datasette/issues/1023#issuecomment-712607608,https://api.github.com/repos/simonw/datasette/issues/1023,712607608,MDEyOklzc3VlQ29tbWVudDcxMjYwNzYwOA==,9599,simonw,2020-10-20T05:47:42Z,2020-10-20T05:47:42Z,OWNER,Requested alpha testers in https://github.com/simonw/datasette/issues/838#issuecomment-712604364,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722673818,Fix issues relating to base_url, https://github.com/simonw/datasette/issues/1026#issuecomment-712607227,https://api.github.com/repos/simonw/datasette/issues/1026,712607227,MDEyOklzc3VlQ29tbWVudDcxMjYwNzIyNw==,9599,simonw,2020-10-20T05:46:44Z,2020-10-20T05:46:44Z,OWNER,We have a solution for this now: `datasette.urls` from #1033 can be used by plugins to assemble the correct URLs to pass to `.get()` and friends.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722738988,How should datasette.client interact with base_url, https://github.com/simonw/datasette/issues/1023#issuecomment-712604541,https://api.github.com/repos/simonw/datasette/issues/1023,712604541,MDEyOklzc3VlQ29tbWVudDcxMjYwNDU0MQ==,9599,simonw,2020-10-20T05:39:44Z,2020-10-20T05:39:44Z,OWNER,Here's the alpha with most of this work ready for people to preview: https://github.com/simonw/datasette/releases/tag/0.51a0,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722673818,Fix issues relating to base_url, https://github.com/simonw/datasette/issues/838#issuecomment-712604364,https://api.github.com/repos/simonw/datasette/issues/838,712604364,MDEyOklzc3VlQ29tbWVudDcxMjYwNDM2NA==,9599,simonw,2020-10-20T05:39:15Z,2020-10-20T05:39:15Z,OWNER,"OK, I've made a ton of improvements to how the `base_url` setting works - see tickets linked from #1023. I've just pushed out an alpha release with those changes in it: https://github.com/simonw/datasette/releases/tag/0.51a0 @tsibley @tballison @ChristopherWilks I'd really appreciate your help testing this alpha! You can install it with: pip install datasette==0.51a0 It should work with just `ProxyPass`, without needing the `ProxyPassReverse` setting.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",637395097,Incorrect URLs when served behind a proxy with base_url set, https://github.com/simonw/datasette/issues/865#issuecomment-712597762,https://api.github.com/repos/simonw/datasette/issues/865,712597762,MDEyOklzc3VlQ29tbWVudDcxMjU5Nzc2Mg==,9599,simonw,2020-10-20T05:22:59Z,2020-10-20T05:22:59Z,OWNER,"OK, this is definitely working now.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",644582921,"base_url doesn't seem to work when adding criteria and clicking ""apply""", https://github.com/simonw/datasette/issues/1025#issuecomment-712593790,https://api.github.com/repos/simonw/datasette/issues/1025,712593790,MDEyOklzc3VlQ29tbWVudDcxMjU5Mzc5MA==,9599,simonw,2020-10-20T05:12:36Z,2020-10-20T05:12:36Z,OWNER,"I'm going to leave the cookies code setting cookies to default to the `""/""` top level.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722724086,"Fix last remaining links to ""/"" that do not respect base_url", https://github.com/simonw/datasette/issues/782#issuecomment-712590398,https://api.github.com/repos/simonw/datasette/issues/782,712590398,MDEyOklzc3VlQ29tbWVudDcxMjU5MDM5OA==,9599,simonw,2020-10-20T05:03:46Z,2020-10-20T05:04:09Z,OWNER,"OK, https://latest-with-plugins.datasette.io/ is running that now - e.g. https://latest-with-plugins.datasette.io/fixtures/roadside_attractions.json-preview or https://latest-with-plugins.datasette.io/fixtures/compound_three_primary_keys.json-preview ```json { ""rows"": [ { ""pk"": 1, ""name"": ""The Mystery Spot"", ""address"": ""465 Mystery Spot Road, Santa Cruz, CA 95065"", ""latitude"": 37.0167, ""longitude"": -122.0024 }, { ""pk"": 2, ""name"": ""Winchester Mystery House"", ""address"": ""525 South Winchester Boulevard, San Jose, CA 95128"", ""latitude"": 37.3184, ""longitude"": -121.9511 }, { ""pk"": 3, ""name"": ""Burlingame Museum of PEZ Memorabilia"", ""address"": ""214 California Drive, Burlingame, CA 94010"", ""latitude"": 37.5793, ""longitude"": -122.3442 }, { ""pk"": 4, ""name"": ""Bigfoot Discovery Museum"", ""address"": ""5497 Highway 9, Felton, CA 95018"", ""latitude"": 37.0414, ""longitude"": -122.0725 } ], ""total"": 4, ""next_url"": null } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879,Redesign default .json format, https://github.com/simonw/datasette/issues/782#issuecomment-712585921,https://api.github.com/repos/simonw/datasette/issues/782,712585921,MDEyOklzc3VlQ29tbWVudDcxMjU4NTkyMQ==,9599,simonw,2020-10-20T04:48:01Z,2020-10-20T04:48:01Z,OWNER,I'll update `datasette-json-preview` with that now.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879,Redesign default .json format, https://github.com/simonw/datasette/issues/782#issuecomment-712585687,https://api.github.com/repos/simonw/datasette/issues/782,712585687,MDEyOklzc3VlQ29tbWVudDcxMjU4NTY4Nw==,9599,simonw,2020-10-20T04:47:02Z,2020-10-20T04:47:12Z,OWNER,"Great point about CORS, I hadn't considered that. I think I'm going to keep the `Link:` header (added in #1014) because I quite enjoy using it with GitHub and WordPress, but I'm not going to have it be the default way of doing pagination. For the default shape I'm now leaning towards this: ```json { ""total"": 36, ""rows"": [{""id"": 1, ""name"": ""Cleo""}], ""next_url"": ""https://latest-with-plugins.datasette.io/fixtures/facetable.json?_next=5"" } ``` So three keys: `total`, `rows` and `next_url`. Then extra keys can be added using `?_extra=` with various named bundles.","{""total_count"": 3, ""+1"": 3, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879,Redesign default .json format, https://github.com/simonw/datasette/issues/1034#issuecomment-712582699,https://api.github.com/repos/simonw/datasette/issues/1034,712582699,MDEyOklzc3VlQ29tbWVudDcxMjU4MjY5OQ==,9599,simonw,2020-10-20T04:36:04Z,2020-10-20T04:36:14Z,OWNER,Asked for ideas on Twitter: https://twitter.com/simonw/status/1318409558805467136,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-712581994,https://api.github.com/repos/simonw/datasette/issues/1034,712581994,MDEyOklzc3VlQ29tbWVudDcxMjU4MTk5NA==,9599,simonw,2020-10-20T04:33:28Z,2020-10-20T04:33:28Z,OWNER,"The [datasette-render-binary](https://github.com/simonw/datasette-render-binary) plugin does this, which I really like - but without the different coloured fonts I'm not sure how readable it would be as just plain text: ![image](https://user-images.githubusercontent.com/9599/96540435-9c125f00-1252-11eb-85aa-5fc8d0e63728.png) Really the goal here is to find the most human-friendly option, so that people looking at the output have a vague idea what's going on. That's why I'm not leaping at the chance to use base64.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1034#issuecomment-712580976,https://api.github.com/repos/simonw/datasette/issues/1034,712580976,MDEyOklzc3VlQ29tbWVudDcxMjU4MDk3Ng==,9599,simonw,2020-10-20T04:29:23Z,2020-10-20T04:29:23Z,OWNER,Most obvious option is base64. Any other potential solutions I'm missing?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725184645,Better way of representing binary data in .csv output, https://github.com/simonw/datasette/issues/1025#issuecomment-712579674,https://api.github.com/repos/simonw/datasette/issues/1025,712579674,MDEyOklzc3VlQ29tbWVudDcxMjU3OTY3NA==,9599,simonw,2020-10-20T04:24:10Z,2020-10-20T04:24:10Z,OWNER,"Changed my mind, `error.html` needs access to `urls` in order to link to its CSS file. Passing it after all (it already got passed `ds.config(""base_url"")` so `ds` was available previously).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722724086,"Fix last remaining links to ""/"" that do not respect base_url", https://github.com/simonw/datasette/issues/782#issuecomment-712569695,https://api.github.com/repos/simonw/datasette/issues/782,712569695,MDEyOklzc3VlQ29tbWVudDcxMjU2OTY5NQ==,222245,carlmjohnson,2020-10-20T03:45:48Z,2020-10-20T03:46:14Z,NONE,"I vote against headers. It has a lot of strikes against it: poor discoverability, new developers often don’t know how to use them, makes CORS harder, makes it hard to use eg with JQ, needs ad hoc specification for each bit of metadata, etc. The only advantage of headers is that you don’t need to do .rows, but that’s actually good as a data validation step anyway—if .rows is missing assume there’s an error and do your error handling path instead of parsing the rest.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879,Redesign default .json format, https://github.com/simonw/datasette/issues/1025#issuecomment-712481127,https://api.github.com/repos/simonw/datasette/issues/1025,712481127,MDEyOklzc3VlQ29tbWVudDcxMjQ4MTEyNw==,9599,simonw,2020-10-19T22:40:37Z,2020-10-20T01:21:36Z,OWNER,Was blocked on #904 - now unblocked.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722724086,"Fix last remaining links to ""/"" that do not respect base_url", https://github.com/simonw/datasette/issues/1033#issuecomment-712529413,https://api.github.com/repos/simonw/datasette/issues/1033,712529413,MDEyOklzc3VlQ29tbWVudDcxMjUyOTQxMw==,9599,simonw,2020-10-20T01:21:12Z,2020-10-20T01:21:12Z,OWNER,Also refs #1023,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725099777,datasette.urls.static_plugins(...) method, https://github.com/simonw/datasette/issues/1025#issuecomment-712525557,https://api.github.com/repos/simonw/datasette/issues/1025,712525557,MDEyOklzc3VlQ29tbWVudDcxMjUyNTU1Nw==,9599,simonw,2020-10-20T01:07:02Z,2020-10-20T01:07:02Z,OWNER,"I fixed the `queries.html` one. I'm not going to fix these two: ``` datasette/templates/error.html: home datasette/templates/patterns.html: home / ``` Because the `error.html` one does not get passed a context (which makes sense since an error has occurred) and the pattern portfolio doesn't need to link to anywhere in particular.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722724086,"Fix last remaining links to ""/"" that do not respect base_url", https://github.com/simonw/datasette/issues/904#issuecomment-712524699,https://api.github.com/repos/simonw/datasette/issues/904,712524699,MDEyOklzc3VlQ29tbWVudDcxMjUyNDY5OQ==,9599,simonw,2020-10-20T01:04:12Z,2020-10-20T01:04:12Z,OWNER,Documentation is https://docs.datasette.io/en/latest/writing_plugins.html#building-urls-within-plugins and https://docs.datasette.io/en/latest/internals.html#internals-datasette-urls,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",663228985,"datasette.urls.table() / .instance() / .database() methods for constructing URLs, also exposed to templates", https://github.com/simonw/datasette/issues/904#issuecomment-712483066,https://api.github.com/repos/simonw/datasette/issues/904,712483066,MDEyOklzc3VlQ29tbWVudDcxMjQ4MzA2Ng==,9599,simonw,2020-10-19T22:46:48Z,2020-10-19T22:46:48Z,OWNER,"OK, I'm committing to `datasette.urls.table()` and friends, which will be available to (and documented for use in) plugins and will be exposed to templates as `{{ urls.table(...) }}`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",663228985,"datasette.urls.table() / .instance() / .database() methods for constructing URLs, also exposed to templates", https://github.com/simonw/datasette/issues/1020#issuecomment-712482504,https://api.github.com/repos/simonw/datasette/issues/1020,712482504,MDEyOklzc3VlQ29tbWVudDcxMjQ4MjUwNA==,9599,simonw,2020-10-19T22:45:01Z,2020-10-19T22:45:01Z,OWNER,"I'm having trouble coming up with the syntax for this. Here's one option: ```python response = await datasette.client.get(path, request=request) ``` But this feels confusing to me. We're not using the `request=` argument as a request - we're using it as a source of some default request values (the cookies and incoming headers, but not the path). We're essentially combining that request with the other arguments passed to `.get()`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",721068929,Method for datasette.client() to forward on authentication, https://github.com/simonw/datasette/issues/1020#issuecomment-712482015,https://api.github.com/repos/simonw/datasette/issues/1020,712482015,MDEyOklzc3VlQ29tbWVudDcxMjQ4MjAxNQ==,9599,simonw,2020-10-19T22:43:24Z,2020-10-19T22:43:24Z,OWNER,"... unless I want to support authentication mechanisms that work based on incoming IP address instead, in which case there's an argument for copying more over from the incoming request. Tailscale is a good example of a system where authentication based on IP address can actually work well, so this is worth doing. Also, there might be authentication mechanisms which work by setting a custom header on the incoming request (not to mention the `Authorization` header).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",721068929,Method for datasette.client() to forward on authentication, https://github.com/simonw/datasette/issues/1020#issuecomment-712481568,https://api.github.com/repos/simonw/datasette/issues/1020,712481568,MDEyOklzc3VlQ29tbWVudDcxMjQ4MTU2OA==,9599,simonw,2020-10-19T22:41:59Z,2020-10-19T22:41:59Z,OWNER,"It turns out this works just fine: ```python response = await datasette.client.get(path, cookies=request.cookies) ``` So I don't need a mechanism for this. I'm going to add this to the documentation instead.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",721068929,Method for datasette.client() to forward on authentication, https://github.com/simonw/datasette/issues/1028#issuecomment-712480866,https://api.github.com/repos/simonw/datasette/issues/1028,712480866,MDEyOklzc3VlQ29tbWVudDcxMjQ4MDg2Ng==,9599,simonw,2020-10-19T22:39:51Z,2020-10-19T22:39:51Z,OWNER,Documentation: https://docs.datasette.io/en/latest/spatialite.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",723803777,--load-extension=spatialite shortcut, https://github.com/simonw/datasette/issues/1032#issuecomment-712397537,https://api.github.com/repos/simonw/datasette/issues/1032,712397537,MDEyOklzc3VlQ29tbWVudDcxMjM5NzUzNw==,236498,saulpw,2020-10-19T19:37:55Z,2020-10-19T19:37:55Z,NONE,"python-dateutil is awesome, but it can only guess at one date at a time. So if you have a column of dates that are (presumably) in the same format, it can't use the full set of dates to deduce the format. Also, once it has parsed a date, you can't get the format it used, whether to parse or render other dates. These limitations prevent it from being a silver bullet for date parsing, though they're not enough for me to stop using it!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724878151,Bring date parsing into Datasette core, https://github.com/simonw/datasette/issues/1032#issuecomment-712367285,https://api.github.com/repos/simonw/datasette/issues/1032,712367285,MDEyOklzc3VlQ29tbWVudDcxMjM2NzI4NQ==,9599,simonw,2020-10-19T18:39:32Z,2020-10-19T18:39:32Z,OWNER,https://github.com/digital-land/brownfield-land-collection/blob/a09ddf9960a6af59e72dc02448f7b645e59bf227/bin/harmonise.py#L217-L247 is a beautiful example of this problem.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724878151,Bring date parsing into Datasette core, https://github.com/simonw/datasette/issues/1032#issuecomment-712365439,https://api.github.com/repos/simonw/datasette/issues/1032,712365439,MDEyOklzc3VlQ29tbWVudDcxMjM2NTQzOQ==,9599,simonw,2020-10-19T18:35:50Z,2020-10-19T18:37:57Z,OWNER,"Maybe I don't need to add `dateutil` as a dependency here? `pendulum` and `arrow` both depend on it so it's pretty deeply embedded in the Python date ecosystem. Adding it as a dependency seems reasonable.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724878151,Bring date parsing into Datasette core, https://github.com/simonw/datasette/issues/1032#issuecomment-712365236,https://api.github.com/repos/simonw/datasette/issues/1032,712365236,MDEyOklzc3VlQ29tbWVudDcxMjM2NTIzNg==,9599,simonw,2020-10-19T18:35:25Z,2020-10-19T18:35:25Z,OWNER,"Since I'm dealing with tables of data I usually have a whole column of examples, so heuristics that check for numbers-greater-than-12 could actually work well for many cases.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724878151,Bring date parsing into Datasette core, https://github.com/simonw/datasette/issues/1032#issuecomment-712364532,https://api.github.com/repos/simonw/datasette/issues/1032,712364532,MDEyOklzc3VlQ29tbWVudDcxMjM2NDUzMg==,9599,simonw,2020-10-19T18:34:10Z,2020-10-19T18:34:10Z,OWNER,Lots of great example date data in https://biglocal.datasettes.com/COVID_WARN_Notices,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724878151,Bring date parsing into Datasette core, https://github.com/simonw/datasette/issues/1032#issuecomment-712364317,https://api.github.com/repos/simonw/datasette/issues/1032,712364317,MDEyOklzc3VlQ29tbWVudDcxMjM2NDMxNw==,9599,simonw,2020-10-19T18:33:42Z,2020-10-19T18:33:42Z,OWNER,"Related challenge: timezones. I think I'll punt on those for the moment and just concentrate on dates, but I should keep them in mind.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724878151,Bring date parsing into Datasette core, https://github.com/simonw/datasette/issues/1032#issuecomment-712363825,https://api.github.com/repos/simonw/datasette/issues/1032,712363825,MDEyOklzc3VlQ29tbWVudDcxMjM2MzgyNQ==,9599,simonw,2020-10-19T18:32:43Z,2020-10-19T18:32:43Z,OWNER,"Horrible thought: I bet there is data out there that uses more than one date format in the same table! So this needs to be exposed as a visible per-column setting. Column action menu can help here, although it's not yet the full solution because it isn't yet visible on mobile.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724878151,Bring date parsing into Datasette core, https://github.com/simonw/datasette/issues/1032#issuecomment-712363344,https://api.github.com/repos/simonw/datasette/issues/1032,712363344,MDEyOklzc3VlQ29tbWVudDcxMjM2MzM0NA==,9599,simonw,2020-10-19T18:31:46Z,2020-10-19T18:31:46Z,OWNER,"As always with dates, the challenge is America. `mm/dd/yy` is indistinguishable from the more sensible `dd/mm/yy`. It's crucially important to visibly tell people which format you assumed, otherwise all kinds of weird invisible bugs can manifest.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724878151,Bring date parsing into Datasette core, https://github.com/simonw/datasette/issues/904#issuecomment-712355877,https://api.github.com/repos/simonw/datasette/issues/904,712355877,MDEyOklzc3VlQ29tbWVudDcxMjM1NTg3Nw==,9599,simonw,2020-10-19T18:17:22Z,2020-10-19T18:17:22Z,OWNER,I quite like `datasette.urls.table()` but I'm not sure why.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",663228985,"datasette.urls.table() / .instance() / .database() methods for constructing URLs, also exposed to templates", https://github.com/simonw/datasette/issues/904#issuecomment-712355706,https://api.github.com/repos/simonw/datasette/issues/904,712355706,MDEyOklzc3VlQ29tbWVudDcxMjM1NTcwNg==,9599,simonw,2020-10-19T18:17:03Z,2020-10-19T18:17:03Z,OWNER,"Options: - `datasette.urls.instance()` (or `datasette.urls.root()`), `datasette.urls.table()` etc - `datasette.url_for.instance()`... - `datasette.url.instance()`... - `datasette.root_url()`, `datasette.table_url()`...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",663228985,"datasette.urls.table() / .instance() / .database() methods for constructing URLs, also exposed to templates", https://github.com/simonw/datasette/issues/904#issuecomment-712354600,https://api.github.com/repos/simonw/datasette/issues/904,712354600,MDEyOklzc3VlQ29tbWVudDcxMjM1NDYwMA==,9599,simonw,2020-10-19T18:15:03Z,2020-10-19T18:15:39Z,OWNER,"Related: #1026 (How should datasette.client interact with base_url) Also this comment from https://github.com/simonw/datasette/issues/943#issuecomment-675752436 > One thing to consider here: Datasette's table and database name escaping rules can be a little bit convoluted. > > If a plugin wants to get back the first five rows of a table, it will need to construct a URL `/dbname/tablename?_size=5` - but it will need to know how to turn the database and table names into the correctly escaped `dbname` and `tablename` values.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",663228985,"datasette.urls.table() / .instance() / .database() methods for constructing URLs, also exposed to templates", https://github.com/simonw/datasette/issues/904#issuecomment-712324077,https://api.github.com/repos/simonw/datasette/issues/904,712324077,MDEyOklzc3VlQ29tbWVudDcxMjMyNDA3Nw==,9599,simonw,2020-10-19T17:39:38Z,2020-10-19T17:39:38Z,OWNER,"If I do these methods I think this should be available on the `datasette` object too, as an internal API for plugins to use to construct redirects and suchlike.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",663228985,"datasette.urls.table() / .instance() / .database() methods for constructing URLs, also exposed to templates", https://github.com/simonw/datasette/issues/904#issuecomment-710487083,https://api.github.com/repos/simonw/datasette/issues/904,710487083,MDEyOklzc3VlQ29tbWVudDcxMDQ4NzA4Mw==,9599,simonw,2020-10-16T19:34:10Z,2020-10-19T17:39:04Z,OWNER,"Alternatively, I could expose a single object that knows how to construct all kinds of URLs. Something like this: ``` {{ urls.instance() }} {{ urls.database(database_name) }} {{ urls.table(database_name, table_name) }} etc ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",663228985,"datasette.urls.table() / .instance() / .database() methods for constructing URLs, also exposed to templates", https://github.com/simonw/datasette/issues/1027#issuecomment-712320103,https://api.github.com/repos/simonw/datasette/issues/1027,712320103,MDEyOklzc3VlQ29tbWVudDcxMjMyMDEwMw==,9599,simonw,2020-10-19T17:35:04Z,2020-10-19T17:35:04Z,OWNER,Still need to configure proxying though. https://www.netnea.com/cms/apache-tutorial-9_setting-up-a-reverse-proxy/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722758132,Add documentation on serving Datasette behind a proxy using base_url, https://github.com/simonw/datasette/issues/991#issuecomment-712317638,https://api.github.com/repos/simonw/datasette/issues/991,712317638,MDEyOklzc3VlQ29tbWVudDcxMjMxNzYzOA==,9599,simonw,2020-10-19T17:30:56Z,2020-10-19T17:30:56Z,OWNER,https://biglocal.datasettes.com/ is one of my larger Datasettes in terms of number of databases.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",714377268,Redesign application homepage, https://github.com/dogsheep/dogsheep-beta/issues/29#issuecomment-712266834,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/29,712266834,MDEyOklzc3VlQ29tbWVudDcxMjI2NjgzNA==,9599,simonw,2020-10-19T16:01:23Z,2020-10-19T16:01:23Z,MEMBER,Might just be a documented pattern for how to configure this in YAML templates.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724759588,Add search highlighting snippets, https://github.com/simonw/datasette/pull/1030#issuecomment-711407607,https://api.github.com/repos/simonw/datasette/issues/1030,711407607,MDEyOklzc3VlQ29tbWVudDcxMTQwNzYwNw==,22429695,codecov[bot],2020-10-18T19:31:31Z,2020-10-19T08:01:51Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1030?src=pr&el=h1) Report > Merging [#1030](https://codecov.io/gh/simonw/datasette/pull/1030?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/568bd7bbf590861687db8c318f3d8cfcd1dfb47a?el=desc) will **decrease** coverage by `0.10%`. > The diff coverage is `68.75%`. [![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1030/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1030?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1030 +/- ## ========================================== - Coverage 84.63% 84.53% -0.11% ========================================== Files 28 28 Lines 3892 3905 +13 ========================================== + Hits 3294 3301 +7 - Misses 598 604 +6 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1030?src=pr&el=tree) | Coverage Δ | | |---|---|---| | [datasette/utils/\_\_init\_\_.py](https://codecov.io/gh/simonw/datasette/pull/1030/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3V0aWxzL19faW5pdF9fLnB5) | `93.35% <68.75%> (-0.79%)` | :arrow_down: | | [datasette/views/index.py](https://codecov.io/gh/simonw/datasette/pull/1030/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL2luZGV4LnB5) | `96.49% <0.00%> (-1.76%)` | :arrow_down: | ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1030?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1030?src=pr&el=footer). Last update [568bd7b...e082533](https://codecov.io/gh/simonw/datasette/pull/1030?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",723982480,Make `package` command deal with a configuration directory argument, https://github.com/simonw/datasette/pull/1031#issuecomment-711792622,https://api.github.com/repos/simonw/datasette/issues/1031,711792622,MDEyOklzc3VlQ29tbWVudDcxMTc5MjYyMg==,22429695,codecov[bot],2020-10-19T07:57:17Z,2020-10-19T07:57:17Z,NONE,"# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1031?src=pr&el=h1) Report > Merging [#1031](https://codecov.io/gh/simonw/datasette/pull/1031?src=pr&el=desc) into [main](https://codecov.io/gh/simonw/datasette/commit/568bd7bbf590861687db8c318f3d8cfcd1dfb47a?el=desc) will **decrease** coverage by `0.02%`. > The diff coverage is `n/a`. [![Impacted file tree graph](https://codecov.io/gh/simonw/datasette/pull/1031/graphs/tree.svg?width=650&height=150&src=pr&token=eSahVY7kw1)](https://codecov.io/gh/simonw/datasette/pull/1031?src=pr&el=tree) ```diff @@ Coverage Diff @@ ## main #1031 +/- ## ========================================== - Coverage 84.63% 84.60% -0.03% ========================================== Files 28 28 Lines 3892 3892 ========================================== - Hits 3294 3293 -1 - Misses 598 599 +1 ``` | [Impacted Files](https://codecov.io/gh/simonw/datasette/pull/1031?src=pr&el=tree) | Coverage Δ | | |---|---|---| | [datasette/cli.py](https://codecov.io/gh/simonw/datasette/pull/1031/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL2NsaS5weQ==) | `74.22% <ø> (ø)` | | | [datasette/views/index.py](https://codecov.io/gh/simonw/datasette/pull/1031/diff?src=pr&el=tree#diff-ZGF0YXNldHRlL3ZpZXdzL2luZGV4LnB5) | `96.49% <0.00%> (-1.76%)` | :arrow_down: | ------ [Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1031?src=pr&el=continue). > **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta) > `Δ = absolute (impact)`, `ø = not affected`, `? = missing data` > Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1031?src=pr&el=footer). Last update [568bd7b...7e7eaa4](https://codecov.io/gh/simonw/datasette/pull/1031?src=pr&el=lastupdated). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724369025,Fallback to databases in inspect-data.json when no -i options are passed, https://github.com/dogsheep/github-to-sqlite/issues/50#issuecomment-711569063,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/50,711569063,MDEyOklzc3VlQ29tbWVudDcxMTU2OTA2Mw==,9599,simonw,2020-10-19T05:01:29Z,2020-10-19T05:01:29Z,MEMBER,"Demo of `--accept`: github-to-sqlite get /repos/simonw/datasette/readme --accept 'application/vnd.github.VERSION.html' ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",703218756,Commands for making authenticated API calls, https://github.com/dogsheep/dogsheep-beta/issues/28#issuecomment-711089647,https://api.github.com/repos/dogsheep/dogsheep-beta/issues/28,711089647,MDEyOklzc3VlQ29tbWVudDcxMTA4OTY0Nw==,9599,simonw,2020-10-17T22:43:13Z,2020-10-17T22:43:13Z,MEMBER,"Since my personal Dogsheep uses Datasette authentication, I'm going to need to pass through cookies. https://github.com/simonw/datasette/issues/1020 will solve that in the future but for now I need to solve it explicitly.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",723861683,Switch to using datasette.client, https://github.com/dogsheep/healthkit-to-sqlite/issues/11#issuecomment-711083698,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/11,711083698,MDEyOklzc3VlQ29tbWVudDcxMTA4MzY5OA==,572,jarib,2020-10-17T21:39:15Z,2020-10-17T21:39:15Z,NONE,Nice! Works perfectly. Thanks for the quick response and great tooling in general.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",723838331,export.xml file name varies with different language settings, https://github.com/dogsheep/healthkit-to-sqlite/issues/11#issuecomment-711081703,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/11,711081703,MDEyOklzc3VlQ29tbWVudDcxMTA4MTcwMw==,9599,simonw,2020-10-17T21:18:35Z,2020-10-17T21:18:35Z,MEMBER,"OK, if you upgrade to the just-released 1.0 this should work (it worked against my Spanish export).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",723838331,export.xml file name varies with different language settings, https://github.com/dogsheep/healthkit-to-sqlite/issues/11#issuecomment-711079760,https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/11,711079760,MDEyOklzc3VlQ29tbWVudDcxMTA3OTc2MA==,9599,simonw,2020-10-17T21:00:05Z,2020-10-17T21:00:05Z,MEMBER,Checking for either ` "" TEMPLATE = """"""
{}
"""""".strip() EN_MEDIA_SCRIPT = """""" Array.from(document.querySelectorAll('en-media')).forEach(el => { let hash = el.getAttribute('hash'); let type = el.getAttribute('type'); let path = `/evernote/resources_data/${hash}.json?_shape=array`; fetch(path).then(r => r.json()).then(rows => { let b64 = rows[0].data.encoded; let data = `data:${type};base64,${b64}`; el.innerHTML = ``; }); }); """""" @hookimpl def render_cell(value, table): if not table: # Don't render content from arbitrary SQL queries, could be XSS hole return if not value or not isinstance(value, str): return value = value.strip() if value.startswith(START) and value.endswith(END): trimmed = value[len(START) : -len(END)] trimmed = trimmed.split("">"", 1)[1] # Replace those horrible double newlines trimmed = trimmed.replace(""

"", ""
"") return jinja2.Markup(TEMPLATE.format(trimmed)) @hookimpl def extra_body_script(): return EN_MEDIA_SCRIPT ``` It works! It does however demonstrate that Evernote's ""clip this webpage"" feature means there is a LOT of weird HTML that can get into a note. It looks like they've filtered out the scripts but I wouldn't bet on it - they certainly don't filter out many of the inline styles. So running Bleach is almost certainly a good idea.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",718938889,Figure out how to display images from tags inline in Datasette, https://github.com/simonw/sqlite-utils/issues/49#issuecomment-710461468,https://api.github.com/repos/simonw/sqlite-utils/issues/49,710461468,MDEyOklzc3VlQ29tbWVudDcxMDQ2MTQ2OA==,9599,simonw,2020-10-16T19:18:19Z,2020-10-16T19:18:19Z,OWNER,"Reconsidering: #89 was a feature request that relates to this, so maybe this is worth implementing after all.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",472115381,extracts= should support multiple-column extracts, https://github.com/simonw/sqlite-utils/issues/89#issuecomment-710460242,https://api.github.com/repos/simonw/sqlite-utils/issues/89,710460242,MDEyOklzc3VlQ29tbWVudDcxMDQ2MDI0Mg==,9599,simonw,2020-10-16T19:17:27Z,2020-10-16T19:17:50Z,OWNER,"I came up with potential syntax for that here: https://github.com/simonw/sqlite-utils/issues/49#issuecomment-710393550 - based on how `table.extract(...)` works: ```python fresh_db.table(""tree"", extracts=[Extract( columns=(""CommonName"", ""LatinName""), table=""Species"", fk_column=""species_id"", rename={""CommonName"": ""name"", ""LatinName"": ""latin""} )]) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",573578548,Ability to customize columns used by extracts= feature, https://github.com/simonw/sqlite-utils/issues/187#issuecomment-710456981,https://api.github.com/repos/simonw/sqlite-utils/issues/187,710456981,MDEyOklzc3VlQ29tbWVudDcxMDQ1Njk4MQ==,9599,simonw,2020-10-16T19:15:13Z,2020-10-16T19:15:13Z,OWNER,This is a duplicate of #79.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",723460107,Maybe: Utility method / CLI tool for initializing SpatiaLite, https://github.com/simonw/sqlite-utils/issues/187#issuecomment-710440853,https://api.github.com/repos/simonw/sqlite-utils/issues/187,710440853,MDEyOklzc3VlQ29tbWVudDcxMDQ0MDg1Mw==,9599,simonw,2020-10-16T19:04:19Z,2020-10-16T19:04:19Z,OWNER,I split this off from #136.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",723460107,Maybe: Utility method / CLI tool for initializing SpatiaLite, https://github.com/simonw/sqlite-utils/issues/136#issuecomment-710428802,https://api.github.com/repos/simonw/sqlite-utils/issues/136,710428802,MDEyOklzc3VlQ29tbWVudDcxMDQyODgwMg==,9599,simonw,2020-10-16T18:56:47Z,2020-10-16T18:56:47Z,OWNER,"To keep the code cleaner, I'm tempted to support this instead: --load-extension=spatialite Where `spatialite` is a special shortcut value that triggers a search for that module in known locations. Users could still load a module in a file called `spatialite` in the current directory using: --load-extension=./spatialite In fact, `--load-extension=spatialite` could handle that case too by always checking for a file called `spatialite` before attempting to search for it in known locations.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",683812642,--load-extension=spatialite shortcut option, https://github.com/simonw/sqlite-utils/issues/69#issuecomment-710405658,https://api.github.com/repos/simonw/sqlite-utils/issues/69,710405658,MDEyOklzc3VlQ29tbWVudDcxMDQwNTY1OA==,9599,simonw,2020-10-16T18:42:48Z,2020-10-16T18:42:48Z,OWNER,"Did some work on this for #134 in 7e9aad7e1c09d1cf80d0b4d17d6157212a4b857d I still need to add `--load-extension` to other CLI methods, see #137. Closing this issue in favour of that one.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",534507142,Feature request: enable extensions loading, https://github.com/simonw/sqlite-utils/issues/48#issuecomment-710402331,https://api.github.com/repos/simonw/sqlite-utils/issues/48,710402331,MDEyOklzc3VlQ29tbWVudDcxMDQwMjMzMQ==,9599,simonw,2020-10-16T18:41:06Z,2020-10-16T18:41:06Z,OWNER,I could use this demo from JupyterCon 2020 https://gist.github.com/simonw/656c21b5800d5e4624dec9930f00e093,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",471818939,"Jupyter notebook demo of the library, launchable on Binder", https://github.com/simonw/sqlite-utils/issues/58#issuecomment-710399593,https://api.github.com/repos/simonw/sqlite-utils/issues/58,710399593,MDEyOklzc3VlQ29tbWVudDcxMDM5OTU5Mw==,9599,simonw,2020-10-16T18:39:31Z,2020-10-16T18:39:31Z,OWNER,I don't think this is valuable enough to justify adding to the library - especially since you can execute FTS search against views by joining to an FTS table built against an underlying table.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",488293926,Support enabling FTS on views, https://github.com/simonw/sqlite-utils/issues/49#issuecomment-710397574,https://api.github.com/repos/simonw/sqlite-utils/issues/49,710397574,MDEyOklzc3VlQ29tbWVudDcxMDM5NzU3NA==,9599,simonw,2020-10-16T18:38:21Z,2020-10-16T18:38:21Z,OWNER,"I'm not going to implement this. I'll leave `extract=...` as it is right now, suitable for quick simple single-column operations on input, but if users want to do something more complicated involving multiple columns they should use the `table.extract()` method after the initial insert instead.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",472115381,extracts= should support multiple-column extracts, https://github.com/simonw/sqlite-utils/issues/49#issuecomment-710395444,https://api.github.com/repos/simonw/sqlite-utils/issues/49,710395444,MDEyOklzc3VlQ29tbWVudDcxMDM5NTQ0NA==,9599,simonw,2020-10-16T18:37:10Z,2020-10-16T18:37:10Z,OWNER,"But this begins to feel too complicated, given that `table.extract()` can already be used to achieve the same thing.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",472115381,extracts= should support multiple-column extracts, https://github.com/simonw/sqlite-utils/issues/49#issuecomment-710393550,https://api.github.com/repos/simonw/sqlite-utils/issues/49,710393550,MDEyOklzc3VlQ29tbWVudDcxMDM5MzU1MA==,9599,simonw,2020-10-16T18:35:57Z,2020-10-16T18:36:39Z,OWNER,"If I want to support that most complicated example, I think the option to pass a `Extracts()` object to `extracts=` is the best way to do it: ```python fresh_db.table(""tree"", extracts=[Extract( columns=(""CommonName"", ""LatinName""), table=""Species"", fk_column=""species_id"", rename={""CommonName"": ""name"", ""LatinName"": ""latin""} )]) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",472115381,extracts= should support multiple-column extracts, https://github.com/simonw/sqlite-utils/issues/49#issuecomment-710390915,https://api.github.com/repos/simonw/sqlite-utils/issues/49,710390915,MDEyOklzc3VlQ29tbWVudDcxMDM5MDkxNQ==,9599,simonw,2020-10-16T18:34:26Z,2020-10-16T18:34:50Z,OWNER,"Here's the most complex example of `.extracts()`: ```python db[""Trees""].extract( [""CommonName"", ""LatinName""], table=""Species"", fk_column=""species_id"", rename={""CommonName"": ""name"", ""LatinName"": ""latin""} ) ``` Resulting in: ```sql CREATE TABLE [Species] ( [id] INTEGER PRIMARY KEY, [name] TEXT, [latin] TEXT ) ``` From https://sqlite-utils.readthedocs.io/en/stable/python-api.html#extracting-columns-into-a-separate-table","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",472115381,extracts= should support multiple-column extracts, https://github.com/simonw/sqlite-utils/issues/49#issuecomment-710364942,https://api.github.com/repos/simonw/sqlite-utils/issues/49,710364942,MDEyOklzc3VlQ29tbWVudDcxMDM2NDk0Mg==,9599,simonw,2020-10-16T18:18:48Z,2020-10-16T18:18:48Z,OWNER,"I think there is. It's a nice existing feature, and I don't think adding tuple support to it would be a huge lift.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",472115381,extracts= should support multiple-column extracts, https://github.com/simonw/sqlite-utils/issues/49#issuecomment-710363789,https://api.github.com/repos/simonw/sqlite-utils/issues/49,710363789,MDEyOklzc3VlQ29tbWVudDcxMDM2Mzc4OQ==,9599,simonw,2020-10-16T18:18:05Z,2020-10-16T18:18:05Z,OWNER,I wonder if there's value in extending the `extracts=` option at all given the existence of `table.extract()`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",472115381,extracts= should support multiple-column extracts, https://github.com/simonw/sqlite-utils/issues/49#issuecomment-710359724,https://api.github.com/repos/simonw/sqlite-utils/issues/49,710359724,MDEyOklzc3VlQ29tbWVudDcxMDM1OTcyNA==,9599,simonw,2020-10-16T18:15:31Z,2020-10-16T18:15:31Z,OWNER,"Using a tuple would work: ```python fresh_db.table(""tree"", extracts=[(""common_name"", ""latin_name"")]) ``` Or to define a custom name: ```python fresh_db.table(""tree"", extracts={(""common_name"", ""latin_name""): ""names""}) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",472115381,extracts= should support multiple-column extracts, https://github.com/simonw/sqlite-utils/issues/49#issuecomment-710346830,https://api.github.com/repos/simonw/sqlite-utils/issues/49,710346830,MDEyOklzc3VlQ29tbWVudDcxMDM0NjgzMA==,9599,simonw,2020-10-16T18:08:52Z,2020-10-16T18:09:21Z,OWNER,"The new `.extract()` method can handle multiple columns: https://github.com/simonw/sqlite-utils/blob/2c541fac352632e23e40b0d21e3f233f7a744a57/tests/test_extract.py#L70-L87","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",472115381,extracts= should support multiple-column extracts, https://github.com/simonw/sqlite-utils/issues/182#issuecomment-710258736,https://api.github.com/repos/simonw/sqlite-utils/issues/182,710258736,MDEyOklzc3VlQ29tbWVudDcxMDI1ODczNg==,9599,simonw,2020-10-16T17:20:41Z,2020-10-16T17:20:41Z,OWNER,Documentation: https://sqlite-utils.readthedocs.io/en/latest/cli.html#inserting-csv-or-tsv-data,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",711649325,"Better handling of encodings other than utf-8 for ""sqlite-utils insert""", https://github.com/simonw/sqlite-utils/issues/186#issuecomment-710198162,https://api.github.com/repos/simonw/sqlite-utils/issues/186,710198162,MDEyOklzc3VlQ29tbWVudDcxMDE5ODE2Mg==,9599,simonw,2020-10-16T16:41:00Z,2020-10-16T16:41:00Z,OWNER,"Failing test: ```python def test_extract_null_values(fresh_db): fresh_db[""species""].insert({""id"": 1, ""species"": ""Wolf""}, pk=""id"") fresh_db[""individuals""].insert_all( [ {""id"": 10, ""name"": ""Terriana"", ""species"": ""Fox""}, {""id"": 11, ""name"": ""Spenidorm"", ""species"": None}, {""id"": 12, ""name"": ""Grantheim"", ""species"": ""Wolf""}, {""id"": 13, ""name"": ""Turnutopia"", ""species"": None}, {""id"": 14, ""name"": ""Wargal"", ""species"": ""Wolf""}, ], pk=""id"", ) fresh_db[""individuals""].extract(""species"") assert fresh_db[""species""].schema == ( ""CREATE TABLE [species] (\n"" "" [id] INTEGER PRIMARY KEY,\n"" "" [species] TEXT\n"" "")"" ) assert fresh_db[""individuals""].schema == ( 'CREATE TABLE ""individuals"" (\n' "" [id] INTEGER PRIMARY KEY,\n"" "" [name] TEXT,\n"" "" [species_id] INTEGER,\n"" "" FOREIGN KEY(species_id) REFERENCES species(id)\n"" "")"" ) assert list(fresh_db[""species""].rows) == [ {""id"": 1, ""species"": ""Wolf""}, {""id"": 2, ""species"": ""Fox""}, ] assert list(fresh_db[""individuals""].rows) == [ {""id"": 10, ""name"": ""Terriana"", ""species_id"": 2}, {""id"": 11, ""name"": ""Spenidorm"", ""species_id"": None}, {""id"": 12, ""name"": ""Grantheim"", ""species_id"": 1}, {""id"": 13, ""name"": ""Turnutopia"", ""species_id"": None}, {""id"": 14, ""name"": ""Wargal"", ""species_id"": 1}, ] ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722816436,.extract() shouldn't extract null values, https://github.com/simonw/sqlite-utils/issues/182#issuecomment-710178871,https://api.github.com/repos/simonw/sqlite-utils/issues/182,710178871,MDEyOklzc3VlQ29tbWVudDcxMDE3ODg3MQ==,9599,simonw,2020-10-16T16:27:39Z,2020-10-16T16:28:14Z,OWNER,"The file is opened for me by `click.File()`, which also handles things like `-` for stdin. But i neee to be able to switch the encoding used to read from that based on the `--encoding` option. I think the way to do that is to open the file in binary mode and then wrap it in a codec reader: ```python fp = codecs.getreader(encoding)(fp) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",711649325,"Better handling of encodings other than utf-8 for ""sqlite-utils insert""", https://github.com/simonw/sqlite-utils/issues/186#issuecomment-709706260,https://api.github.com/repos/simonw/sqlite-utils/issues/186,709706260,MDEyOklzc3VlQ29tbWVudDcwOTcwNjI2MA==,9599,simonw,2020-10-16T03:17:02Z,2020-10-16T03:17:17Z,OWNER,Actually I think this should be an option to `.extract()` which controls if nulls are extracted or left alone. Maybe called `extract_nulls=True/False`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722816436,.extract() shouldn't extract null values, https://github.com/simonw/sqlite-utils/issues/186#issuecomment-709706065,https://api.github.com/repos/simonw/sqlite-utils/issues/186,709706065,MDEyOklzc3VlQ29tbWVudDcwOTcwNjA2NQ==,9599,simonw,2020-10-16T03:16:22Z,2020-10-16T03:16:22Z,OWNER,Either way I think I'm going to need to add some SQL which uses `where a = b or (a is null and b is null)`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722816436,.extract() shouldn't extract null values, https://github.com/simonw/sqlite-utils/issues/186#issuecomment-709705885,https://api.github.com/repos/simonw/sqlite-utils/issues/186,709705885,MDEyOklzc3VlQ29tbWVudDcwOTcwNTg4NQ==,9599,simonw,2020-10-16T03:15:39Z,2020-10-16T03:15:39Z,OWNER,The alternative solution here would be that a single `null` value DOES get extracted. To implement this I would need to add some logic that uses `is null` instead of `=`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722816436,.extract() shouldn't extract null values, https://github.com/simonw/sqlite-utils/issues/186#issuecomment-709705624,https://api.github.com/repos/simonw/sqlite-utils/issues/186,709705624,MDEyOklzc3VlQ29tbWVudDcwOTcwNTYyNA==,9599,simonw,2020-10-16T03:14:39Z,2020-10-16T03:14:39Z,OWNER,"How should this work with extractions covering multiple columns? If there's a single column then it makes sense that a `null` value would not be extracted into the lookup table, but would instead become stay as `null`. For a multiple column extraction, provided at least one of those columns is not null It should map to a record in the lookup table. Only if ALL of the extracted columns are null should the lookup value stay null.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722816436,.extract() shouldn't extract null values, https://github.com/simonw/datasette/issues/1027#issuecomment-709647525,https://api.github.com/repos/simonw/datasette/issues/1027,709647525,MDEyOklzc3VlQ29tbWVudDcwOTY0NzUyNQ==,9599,simonw,2020-10-15T23:49:51Z,2020-10-15T23:51:39Z,OWNER,"I'll install Apache on macOS to figure this out using https://formulae.brew.sh/formula/httpd `brew install httpd` output this at the end: ``` ==> httpd DocumentRoot is /usr/local/var/www. The default ports have been set in /usr/local/etc/httpd/httpd.conf to 8080 and in /usr/local/etc/httpd/extra/httpd-ssl.conf to 8443 so that httpd can run without sudo. To have launchd start httpd now and restart at login: brew services start httpd Or, if you don't want/need a background service you can just run: apachectl start ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722758132,Add documentation on serving Datasette behind a proxy using base_url, https://github.com/simonw/datasette/issues/1027#issuecomment-709646865,https://api.github.com/repos/simonw/datasette/issues/1027,709646865,MDEyOklzc3VlQ29tbWVudDcwOTY0Njg2NQ==,9599,simonw,2020-10-15T23:47:08Z,2020-10-15T23:47:08Z,OWNER,It should cover both nginx and Apache. nginx config is here: https://github.com/simonw/datasette/issues/1024#issuecomment-709598324,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722758132,Add documentation on serving Datasette behind a proxy using base_url, https://github.com/simonw/datasette/issues/1026#issuecomment-709636372,https://api.github.com/repos/simonw/datasette/issues/1026,709636372,MDEyOklzc3VlQ29tbWVudDcwOTYzNjM3Mg==,9599,simonw,2020-10-15T23:09:34Z,2020-10-15T23:09:34Z,OWNER,"I'm inclined to say that internal requests should ignore `base_url` - since that seems like the right thing for plugins that need to access default Datasette APIs. The one catch here is plugins that might want to proxy the current incoming URL for some reason - where that incoming `request.path` could include the `base_url`. Actually those should be fine - because it will have been stripped off earlier: https://github.com/simonw/datasette/blob/4f7c0ebd85ccd8c1853d7aa0147628f7c1b749cc/datasette/app.py#L963-L968","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722738988,How should datasette.client interact with base_url, https://github.com/simonw/datasette/issues/904#issuecomment-709635276,https://api.github.com/repos/simonw/datasette/issues/904,709635276,MDEyOklzc3VlQ29tbWVudDcwOTYzNTI3Ng==,9599,simonw,2020-10-15T23:05:54Z,2020-10-15T23:05:54Z,OWNER,Could have `instance_url()` take an optional path argument which is then turned into the correct path.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",663228985,"datasette.urls.table() / .instance() / .database() methods for constructing URLs, also exposed to templates", https://github.com/simonw/datasette/issues/904#issuecomment-709635021,https://api.github.com/repos/simonw/datasette/issues/904,709635021,MDEyOklzc3VlQ29tbWVudDcwOTYzNTAyMQ==,9599,simonw,2020-10-15T23:05:11Z,2020-10-15T23:05:11Z,OWNER,"I think this should be a family of functions: - `instance_url()` - the root URL of the instance (usually `/` unless `base_url` is set) - `database_url(database_name)` - already got this - `table_url(database_name, table_name)` - `row_url(database_name, table_name, row)` - not sure about this one. The idea would be for `row` to be correctly turned into a URL by introspecting the primary keys for that table, then pulling those values out of the SQLite `row` object. Might not be necessary though. I also need a way for plugins to link to e.g. `/-/configure-fts` - or even `/-/configure-fts/database-name/table-name`. What should that look like?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",663228985,"datasette.urls.table() / .instance() / .database() methods for constructing URLs, also exposed to templates", https://github.com/simonw/datasette/issues/904#issuecomment-709634261,https://api.github.com/repos/simonw/datasette/issues/904,709634261,MDEyOklzc3VlQ29tbWVudDcwOTYzNDI2MQ==,9599,simonw,2020-10-15T23:02:43Z,2020-10-15T23:02:43Z,OWNER,"Here's the current implementation of `database_url` - on the `BaseView` class, but only because it needs access to a `datasette` instance (to read `base_url`): https://github.com/simonw/datasette/blob/8f97b9b58e77f82fef1f10e9c9f6754b993544b6/datasette/views/base.py#L102-L108","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",663228985,"datasette.urls.table() / .instance() / .database() methods for constructing URLs, also exposed to templates", https://github.com/simonw/datasette/issues/904#issuecomment-709633823,https://api.github.com/repos/simonw/datasette/issues/904,709633823,MDEyOklzc3VlQ29tbWVudDcwOTYzMzgyMw==,9599,simonw,2020-10-15T23:01:13Z,2020-10-15T23:01:13Z,OWNER,Tracking ticket: #1023,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",663228985,"datasette.urls.table() / .instance() / .database() methods for constructing URLs, also exposed to templates", https://github.com/simonw/datasette/issues/988#issuecomment-709633762,https://api.github.com/repos/simonw/datasette/issues/988,709633762,MDEyOklzc3VlQ29tbWVudDcwOTYzMzc2Mg==,9599,simonw,2020-10-15T23:01:01Z,2020-10-15T23:01:01Z,OWNER,This is a dupe of https://github.com/simonw/datasette/issues/904,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",713209404,Mechanism for plugins to construct URLs that respect base_url, https://github.com/simonw/datasette/issues/865#issuecomment-709633080,https://api.github.com/repos/simonw/datasette/issues/865,709633080,MDEyOklzc3VlQ29tbWVudDcwOTYzMzA4MA==,9599,simonw,2020-10-15T22:58:51Z,2020-10-15T22:58:51Z,OWNER,"It looks like there are places where Datasette might return a redirect that doesn't take `base_url` into account - I'm planning on fixing those here, after which I think `ProxyPassReverse` should no longer be necessary. https://github.com/simonw/datasette/issues/1025#issuecomment-709632136","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",644582921,"base_url doesn't seem to work when adding criteria and clicking ""apply""", https://github.com/simonw/datasette/issues/900#issuecomment-709632765,https://api.github.com/repos/simonw/datasette/issues/900,709632765,MDEyOklzc3VlQ29tbWVudDcwOTYzMjc2NQ==,9599,simonw,2020-10-15T22:57:55Z,2020-10-15T22:57:55Z,OWNER,"I believe this particular bug has been fixed, based on my testing here: https://github.com/simonw/datasette/issues/1024#issuecomment-709622973 Please re-open the ticket if you are still experiencing it. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",661605489,Some links don't honor base_url, https://github.com/simonw/datasette/issues/1025#issuecomment-709632314,https://api.github.com/repos/simonw/datasette/issues/1025,709632314,MDEyOklzc3VlQ29tbWVudDcwOTYzMjMxNA==,9599,simonw,2020-10-15T22:56:25Z,2020-10-15T22:56:34Z,OWNER,"That `utils/asgi.py` line is the default path for setting cookies. That should likely take `base_url` into account too: https://github.com/simonw/datasette/blob/4f7c0ebd85ccd8c1853d7aa0147628f7c1b749cc/datasette/utils/asgi.py#L331-L342","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722724086,"Fix last remaining links to ""/"" that do not respect base_url", https://github.com/simonw/datasette/issues/1025#issuecomment-709632136,https://api.github.com/repos/simonw/datasette/issues/1025,709632136,MDEyOklzc3VlQ29tbWVudDcwOTYzMjEzNg==,9599,simonw,2020-10-15T22:55:44Z,2020-10-15T22:55:44Z,OWNER,"It looks like there are also some generated redirect responses that don't take `base_url` into account: ``` datasette % git grep '""/' -- '*.py' ':(exclude)*test_*.py' ':(exclude)datasette/app.py' datasette/_version.py: for i in cfg.versionfile_source.split(""/""): datasette/utils/asgi.py: path=""/"", datasette/views/base.py: should_redirect = ""/{}-{}"".format(name, expected) datasette/views/base.py: should_redirect += ""/"" + urllib.parse.quote_plus(kwargs[""table""]) datasette/views/base.py: should_redirect += ""/"" + kwargs[""pk_path""] datasette/views/special.py: response = Response.redirect(""/"") datasette/views/special.py: return Response.redirect(""/"") datasette/views/special.py: response = Response.redirect(""/"") datasette/views/special.py: return Response.redirect(""/"") ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722724086,"Fix last remaining links to ""/"" that do not respect base_url", https://github.com/simonw/datasette/issues/1025#issuecomment-709629920,https://api.github.com/repos/simonw/datasette/issues/1025,709629920,MDEyOklzc3VlQ29tbWVudDcwOTYyOTkyMA==,9599,simonw,2020-10-15T22:48:20Z,2020-10-15T22:48:20Z,OWNER,"Also these: ``` datasette % git grep '""/' -- '*.html' ':(exclude)*/patterns.html' datasette/templates/allow_debug.html:
datasette/templates/base.html: datasette/templates/error.html: home datasette/templates/logout.html: datasette/templates/messages_debug.html: datasette/templates/query.html: home / ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722724086,"Fix last remaining links to ""/"" that do not respect base_url", https://github.com/simonw/datasette/issues/865#issuecomment-709626786,https://api.github.com/repos/simonw/datasette/issues/865,709626786,MDEyOklzc3VlQ29tbWVudDcwOTYyNjc4Ng==,9599,simonw,2020-10-15T22:38:38Z,2020-10-15T22:38:38Z,OWNER,"I managed to recreate proxying using `nginx` in #1024 - but I could not replicate this bug. I did NOT use `ProxyPassReverse` though. I think that may be what caused the problem. I'll add a section to the documentation about this shortly.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",644582921,"base_url doesn't seem to work when adding criteria and clicking ""apply""", https://github.com/simonw/datasette/issues/1024#issuecomment-709625063,https://api.github.com/repos/simonw/datasette/issues/1024,709625063,MDEyOklzc3VlQ29tbWVudDcwOTYyNTA2Mw==,9599,simonw,2020-10-15T22:33:22Z,2020-10-15T22:33:22Z,OWNER,"Of those errors... `http://localhost:8000/robots.txt` 404 is fine. `http://localhost:8000/datasette/%5C%22https://www.openstreetmap.org/copyright%5C%22` looks to me like a `wget` parsing bug where it got confused by this JavaScript: ``` window.DATASETTE_CLUSTER_MAP_TILE_LAYER_OPTIONS = {""maxZoom"": 19, ""detectRetina"": true, ""attribution"": ""© OpenStreetMap contributors""}; ``` `http://localhost:8000/-/static-plugins/datasette_cluster_map/datasette-cluster-map.js` is a real bug. It's a bug in `datasette-cluster-map` but also requires me to solve #988 - mechanism for plugins to construct URLs that obey `base_url`. I'm not sure why I'm getting a hit to `http://localhost:8000/` since I wouldn't expect to link to `/` anywhere.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722674708,Figure out how to run an environment that exercises the base_url proxy setting, https://github.com/simonw/datasette/issues/1024#issuecomment-709622973,https://api.github.com/repos/simonw/datasette/issues/1024,709622973,MDEyOklzc3VlQ29tbWVudDcwOTYyMjk3Mw==,9599,simonw,2020-10-15T22:27:31Z,2020-10-15T22:27:31Z,OWNER,"Here's how I tested it: ``` time wget -r 'http://localhost:8000/datasette/' 2>&1 | grep -i -C 5 ""failed\|error"" > /tmp/errors.txt ``` This wrote out any errors (plus context) to the `errors.txt` log - and reported that the full crawl took 33s. Here's what I got in `errors.txt`: ``` 0K . 71.6M=0s 2020-10-15 15:23:09 (71.6 MB/s) - ‘localhost:8000/datasette/index.html’ saved [1276] Loading robots.txt; please ignore errors. --2020-10-15 15:23:09-- http://localhost:8000/robots.txt Reusing existing connection to localhost:8000. HTTP request sent, awaiting response... 404 Not Found -- --2020-10-15 15:23:09-- http://localhost:8000/robots.txt Reusing existing connection to localhost:8000. HTTP request sent, awaiting response... 404 Not Found 2020-10-15 15:23:09 ERROR 404: Not Found. --2020-10-15 15:23:09-- http://localhost:8000/datasette/-/static/app.css?b576be Reusing existing connection to localhost:8000. HTTP request sent, awaiting response... 200 OK Length: 8563 (8.4K) [text/css] -- -- 2020-10-15 15:23:13 (7.90 MB/s) - ‘localhost:8000/datasette/fixtures/primary_key_multiple_columns_explicit_label.json?_shape=object’ saved [58] --2020-10-15 15:23:13-- http://localhost:8000/-/static-plugins/datasette_cluster_map/datasette-cluster-map.js Reusing existing connection to localhost:8000. HTTP request sent, awaiting response... 404 Not Found 2020-10-15 15:23:13 ERROR 404: Not Found. --2020-10-15 15:23:13-- http://localhost:8000/datasette/fixtures?sql=select+pk%2C+name%2C+address%2C+latitude%2C+longitude+from+roadside_attractions+order+by+pk+limit+101 Reusing existing connection to localhost:8000. HTTP request sent, awaiting response... 200 OK Length: unspecified [text/html] -- -- 2020-10-15 15:23:13 (84.3 MB/s) - ‘localhost:8000/datasette/fixtures/roadside_attractions.json?_shape=object’ saved [619] --2020-10-15 15:23:13-- http://localhost:8000/datasette/fixtures/%5C%22https://www.openstreetmap.org/copyright%5C%22 Reusing existing connection to localhost:8000. HTTP request sent, awaiting response... 404 Not Found 2020-10-15 15:23:13 ERROR 404: Not Found. --2020-10-15 15:23:13-- http://localhost:8000/datasette/fixtures?sql=select+pk%2C+text1%2C+text2%2C+%5Bname+with+.+and+spaces%5D+from+searchable+order+by+pk+limit+101 Reusing existing connection to localhost:8000. HTTP request sent, awaiting response... 200 OK Length: unspecified [text/html] -- -- 2020-10-15 15:23:14 (28.6 MB/s) - ‘localhost:8000/datasette/fixtures/searchable_view_configured_by_metadata.json?_shape=array&_nl=on’ saved [180] --2020-10-15 15:23:14-- http://localhost:8000/ Reusing existing connection to localhost:8000. HTTP request sent, awaiting response... 404 Not Found 2020-10-15 15:23:14 ERROR 404: Not Found. --2020-10-15 15:23:14-- http://localhost:8000/datasette/fixtures?sql=select+pk1%2C+pk2%2C+pk3%2C+content+from+compound_three_primary_keys+order+by+pk1%2C+pk2%2C+pk3+limit+101&_hide_sql=1 Reusing existing connection to localhost:8000. HTTP request sent, awaiting response... 200 OK Length: unspecified [text/html] -- -- 2020-10-15 15:23:21 (64.1 MB/s) - ‘localhost:8000/datasette/fixtures.csv?sql=select+pk,+name,+address,+latitude,+longitude+from+roadside_attractions+order+by+pk+limit+101&_size=max’ saved [403] --2020-10-15 15:23:21-- http://localhost:8000/datasette/%5C%22https://www.openstreetmap.org/copyright%5C%22 Reusing existing connection to localhost:8000. HTTP request sent, awaiting response... 404 Not Found 2020-10-15 15:23:21 ERROR 404: Not Found. --2020-10-15 15:23:21-- http://localhost:8000/datasette/fixtures?sql=select+pk%2C+name%2C+address%2C+latitude%2C+longitude+from+roadside_attractions+order+by+pk+desc+limit+101 Reusing existing connection to localhost:8000. HTTP request sent, awaiting response... 200 OK Length: unspecified [text/html] ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722674708,Figure out how to run an environment that exercises the base_url proxy setting, https://github.com/simonw/datasette/issues/1024#issuecomment-709600335,https://api.github.com/repos/simonw/datasette/issues/1024,709600335,MDEyOklzc3VlQ29tbWVudDcwOTYwMDMzNQ==,9599,simonw,2020-10-15T21:28:02Z,2020-10-15T22:25:43Z,OWNER,"This is working OK so far: I'll try crawling it with `wget -r` to see if I get any errors.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722674708,Figure out how to run an environment that exercises the base_url proxy setting, https://github.com/simonw/datasette/issues/1024#issuecomment-709598324,https://api.github.com/repos/simonw/datasette/issues/1024,709598324,MDEyOklzc3VlQ29tbWVudDcwOTU5ODMyNA==,9599,simonw,2020-10-15T21:23:33Z,2020-10-15T21:26:55Z,OWNER,"Combining these two examples, here's the config file I am going to use for this. I'll save this as `nginx.conf`: ``` daemon off; events { worker_connections 1024; } http { server { listen 8000; location /datasette { proxy_pass http://127.0.0.1:8001; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; } } } ``` Then start the server with: ``` nginx -p `pwd` -c `pwd`/nginx.conf ``` And start Datasette like this: ``` datasette fixtures.db --config base_url:/datasette/ ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722674708,Figure out how to run an environment that exercises the base_url proxy setting, https://github.com/simonw/datasette/issues/1024#issuecomment-709597589,https://api.github.com/repos/simonw/datasette/issues/1024,709597589,MDEyOklzc3VlQ29tbWVudDcwOTU5NzU4OQ==,9599,simonw,2020-10-15T21:21:53Z,2020-10-15T21:23:25Z,OWNER,"Here's a recipe for running nginx against a custom config file: https://gist.github.com/simonw/35f0ebf9c1d6df158759 ``` daemon off; events { worker_connections 1024; } http { access_log /dev/stdout; error_log /dev/stderr; types { text/html html htm shtml; text/css css; image/gif gif; image/jpeg jpeg jpg; application/javascript js; } server { listen 8002; index index.html; root app; } } ``` ``` nginx -p `pwd` -c `pwd`/nginx.conf ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722674708,Figure out how to run an environment that exercises the base_url proxy setting, https://github.com/simonw/datasette/issues/1024#issuecomment-709595960,https://api.github.com/repos/simonw/datasette/issues/1024,709595960,MDEyOklzc3VlQ29tbWVudDcwOTU5NTk2MA==,9599,simonw,2020-10-15T21:18:14Z,2020-10-15T21:18:14Z,OWNER,Typing `nginx` starts it running as a daemon listening on port `http-alt` aka 8080. It uses the config file from ` /usr/local/etc/nginx/nginx.conf`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722674708,Figure out how to run an environment that exercises the base_url proxy setting, https://github.com/simonw/datasette/issues/1024#issuecomment-709590941,https://api.github.com/repos/simonw/datasette/issues/1024,709590941,MDEyOklzc3VlQ29tbWVudDcwOTU5MDk0MQ==,9599,simonw,2020-10-15T21:07:47Z,2020-10-15T21:07:47Z,OWNER,On macOS I ran `brew install nginx`. I'm going to try running it on port 8000 so I don't have to run it as root.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722674708,Figure out how to run an environment that exercises the base_url proxy setting, https://github.com/simonw/datasette/issues/1024#issuecomment-709590337,https://api.github.com/repos/simonw/datasette/issues/1024,709590337,MDEyOklzc3VlQ29tbWVudDcwOTU5MDMzNw==,9599,simonw,2020-10-15T21:06:24Z,2020-10-15T21:07:19Z,OWNER,"From https://stackoverflow.com/questions/32549684/nginx-proxy-and-remove-proxy-pass-prefix/32550251 it looks like the config I should use is: ``` server { listen 80; server_name example.com; location /datasette/ { proxy_pass http://127.0.0.1:8001; proxy_set_header X-Real-IP $remote_addr; proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for; proxy_set_header X-Forwarded-Proto $scheme; proxy_read_timeout 90; } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722674708,Figure out how to run an environment that exercises the base_url proxy setting, https://github.com/simonw/datasette/issues/1024#issuecomment-709589297,https://api.github.com/repos/simonw/datasette/issues/1024,709589297,MDEyOklzc3VlQ29tbWVudDcwOTU4OTI5Nw==,9599,simonw,2020-10-15T21:04:31Z,2020-10-15T21:04:31Z,OWNER,I think nginx or Apache would be the best tools for this. I'm inclined to try with nginx first since I know it better.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",722674708,Figure out how to run an environment that exercises the base_url proxy setting, https://github.com/simonw/datasette/issues/838#issuecomment-709588425,https://api.github.com/repos/simonw/datasette/issues/838,709588425,MDEyOklzc3VlQ29tbWVudDcwOTU4ODQyNQ==,9599,simonw,2020-10-15T21:02:37Z,2020-10-15T21:02:37Z,OWNER,Tracking ticket: #1023,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",637395097,Incorrect URLs when served behind a proxy with base_url set, https://github.com/simonw/datasette/issues/865#issuecomment-709588373,https://api.github.com/repos/simonw/datasette/issues/865,709588373,MDEyOklzc3VlQ29tbWVudDcwOTU4ODM3Mw==,9599,simonw,2020-10-15T21:02:31Z,2020-10-15T21:02:31Z,OWNER,Tracking ticket: #1023,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",644582921,"base_url doesn't seem to work when adding criteria and clicking ""apply""", https://github.com/simonw/datasette/issues/900#issuecomment-709588322,https://api.github.com/repos/simonw/datasette/issues/900,709588322,MDEyOklzc3VlQ29tbWVudDcwOTU4ODMyMg==,9599,simonw,2020-10-15T21:02:26Z,2020-10-15T21:02:26Z,OWNER,Tracking ticket: #1023,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",661605489,Some links don't honor base_url, https://github.com/simonw/datasette/issues/988#issuecomment-709588290,https://api.github.com/repos/simonw/datasette/issues/988,709588290,MDEyOklzc3VlQ29tbWVudDcwOTU4ODI5MA==,9599,simonw,2020-10-15T21:02:21Z,2020-10-15T21:02:21Z,OWNER,Tracking ticket: #1023,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",713209404,Mechanism for plugins to construct URLs that respect base_url, https://github.com/simonw/datasette/issues/894#issuecomment-709575818,https://api.github.com/repos/simonw/datasette/issues/894,709575818,MDEyOklzc3VlQ29tbWVudDcwOTU3NTgxOA==,9599,simonw,2020-10-15T20:35:03Z,2020-10-15T20:35:03Z,OWNER,"Prototype so far: ```diff diff --git a/datasette/views/table.py b/datasette/views/table.py index ea11a51..d61f8bd 100644 --- a/datasette/views/table.py +++ b/datasette/views/table.py @@ -497,17 +497,32 @@ class TableView(RowTableShared): if sort and sort_desc: raise DatasetteError(""Cannot use _sort and _sort_desc at the same time"") + def parse_sort(sort): + if ""~"" in sort: + if sort.endswith(""~default""): + col = sort.rsplit(""~"", 1)[0] + return col, escape_sqlite(col) + elif sort.endswith(""~numeric""): + col = sort.rsplit(""~"", 1)[0] + return col, ""cast(nullif({}, '') as real)"".format(escape_sqlite(col)) + else: + return sort, escape_sqlite(sort) + else: + return sort, escape_sqlite(sort) + if sort: - if sort not in sortable_columns: - raise DatasetteError(""Cannot sort table by {}"".format(sort)) + sort_column, sort_clause = parse_sort(sort) + if sort_column not in sortable_columns: + raise DatasetteError(""Cannot sort table by {}"".format(sort_column)) - order_by = escape_sqlite(sort) + order_by = sort_clause if sort_desc: - if sort_desc not in sortable_columns: - raise DatasetteError(""Cannot sort table by {}"".format(sort_desc)) + sort_column, sort_clause = parse_sort(sort_desc) + if sort_column not in sortable_columns: + raise DatasetteError(""Cannot sort table by {}"".format(sort_column)) - order_by = ""{} desc"".format(escape_sqlite(sort_desc)) + order_by = ""{} desc"".format(sort_clause) from_sql = ""from {table_name} {where}"".format( table_name=escape_sqlite(table), ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",657572753,?sort=colname~numeric to sort by by column cast to real, https://github.com/simonw/datasette/issues/894#issuecomment-709572425,https://api.github.com/repos/simonw/datasette/issues/894,709572425,MDEyOklzc3VlQ29tbWVudDcwOTU3MjQyNQ==,9599,simonw,2020-10-15T20:28:18Z,2020-10-15T20:28:18Z,OWNER,"Also need to rethink this template logic that decides if to show a column as sorted or not: https://github.com/simonw/datasette/blob/4f7c0ebd85ccd8c1853d7aa0147628f7c1b749cc/datasette/templates/_table.html#L10-L14","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",657572753,?sort=colname~numeric to sort by by column cast to real, https://github.com/simonw/datasette/issues/894#issuecomment-709571143,https://api.github.com/repos/simonw/datasette/issues/894,709571143,MDEyOklzc3VlQ29tbWVudDcwOTU3MTE0Mw==,9599,simonw,2020-10-15T20:25:35Z,2020-10-15T20:25:35Z,OWNER,"`cast(nullif(colname, '') as real)` can fix this - it will treat `''` the same as `null`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",657572753,?sort=colname~numeric to sort by by column cast to real, https://github.com/simonw/datasette/issues/894#issuecomment-709569951,https://api.github.com/repos/simonw/datasette/issues/894,709569951,MDEyOklzc3VlQ29tbWVudDcwOTU2OTk1MQ==,9599,simonw,2020-10-15T20:23:02Z,2020-10-15T20:23:02Z,OWNER,"Something to watch out for: `""""` empty strings cast to `0.0`: `select cast(""100"" as real), ""100"", cast(null as real), cast("""" as real)` cast(""100"" as real) | ""100"" | cast(null as real) | cast("""" as real) -- | -- | -- | -- 100.0 | 100 |   | 0.0 https://latest.datasette.io/fixtures?sql=select+cast%28%22100%22+as+real%29%2C+%22100%22%2C+cast%28null+as+real%29%2C+cast%28%22%22+as+real%29","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",657572753,?sort=colname~numeric to sort by by column cast to real, https://github.com/simonw/datasette/issues/894#issuecomment-709562940,https://api.github.com/repos/simonw/datasette/issues/894,709562940,MDEyOklzc3VlQ29tbWVudDcwOTU2Mjk0MA==,9599,simonw,2020-10-15T20:08:16Z,2020-10-15T20:08:16Z,OWNER,Relevant code: https://github.com/simonw/datasette/blob/4f7c0ebd85ccd8c1853d7aa0147628f7c1b749cc/datasette/views/table.py#L485-L510,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",657572753,?sort=colname~numeric to sort by by column cast to real, https://github.com/simonw/datasette/issues/894#issuecomment-709546976,https://api.github.com/repos/simonw/datasette/issues/894,709546976,MDEyOklzc3VlQ29tbWVudDcwOTU0Njk3Ng==,9599,simonw,2020-10-15T19:35:55Z,2020-10-15T19:36:38Z,OWNER,"Much easier solution: if the suffix is `~numeric` then treat it as the column name sorted numerically. If the suffix is missing OR the suffix is `~default`, sort without casting. Only add the `~default` suffix if the column name itself contains at least one `~` symbol. Using `~` because it doesn't need to be URL-encoded.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",657572753,?sort=colname~numeric to sort by by column cast to real, https://github.com/simonw/datasette/issues/894#issuecomment-709539257,https://api.github.com/repos/simonw/datasette/issues/894,709539257,MDEyOklzc3VlQ29tbWVudDcwOTUzOTI1Nw==,9599,simonw,2020-10-15T19:19:29Z,2020-10-15T19:34:07Z,OWNER,"Urgh this isn't going to work. `%7E~%7E` gets decoded as `~~~` so I wouldn't be able to tell the difference. I could use double-percentage-encoding here instead. I feel like there's a simpler solution that I'm missing (and that may well be in use within Datasette already, I'm not doing great thinking this morning).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",657572753,?sort=colname~numeric to sort by by column cast to real, https://github.com/simonw/datasette/issues/894#issuecomment-709534197,https://api.github.com/repos/simonw/datasette/issues/894,709534197,MDEyOklzc3VlQ29tbWVudDcwOTUzNDE5Nw==,9599,simonw,2020-10-15T19:08:53Z,2020-10-15T19:17:55Z,OWNER,"Even better solution: use URL encoding in the parameter details. This is consistent with how `?_next=` tokens work, e.g. `?_next=0.291861560261786%2Ce%2Cj`. So the format can be: - `mycolumn` - `urlencoded-mycolumn$castname` For most columns this will look like: `?_sort=score$numeric` For columns with a `$` in their name it will be `?_sort=score%24hasdollar$numeric` Problem: both `$` and `,` are usually URL encoded anyway. I need a character which isn't encoded by default, so that I can use its encoded form to show it is part of the column name and its un-encoded form to split the cast indicator. `_` is a candidate here - not encoded by default, but can be encoded as `%5F`. The other unreserved non-alphanumeric characters are `-`, `.`, `_`, `~`. Of these, `~` is least likely to show up in a column name. So I'll use that. - `mycolumn` - `mycolumn~numeric` - `mycolumn%7Ewith%7Etildes~numeric`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",657572753,?sort=colname~numeric to sort by by column cast to real, https://github.com/simonw/datasette/issues/894#issuecomment-709532369,https://api.github.com/repos/simonw/datasette/issues/894,709532369,MDEyOklzc3VlQ29tbWVudDcwOTUzMjM2OQ==,9599,simonw,2020-10-15T19:05:07Z,2020-10-15T19:07:35Z,OWNER,"Simpler option: `?_sort=` column values look like this: - `mycolumn` - for sort by column - `mycolumn$numeric` - for sort by column after cast to float - `mycolumn$morename$default` - for the edge case where the column name itself contains a $ symbol","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",657572753,?sort=colname~numeric to sort by by column cast to real, https://github.com/simonw/datasette/issues/894#issuecomment-709531343,https://api.github.com/repos/simonw/datasette/issues/894,709531343,MDEyOklzc3VlQ29tbWVudDcwOTUzMTM0Mw==,9599,simonw,2020-10-15T19:03:12Z,2020-10-15T19:03:12Z,OWNER,"The Sort by `