html_url,issue_url,id,node_id,user,user_label,created_at,updated_at,author_association,body,reactions,issue,issue_label,performed_via_github_app
https://github.com/simonw/datasette/issues/1175#issuecomment-762488336,https://api.github.com/repos/simonw/datasette/issues/1175,762488336,MDEyOklzc3VlQ29tbWVudDc2MjQ4ODMzNg==,758858,hannseman,2021-01-18T21:59:28Z,2021-01-18T22:00:31Z,NONE,"I encountered your issue when trying to find a solution and came up with the following, maybe it can help.

```python
import logging.config
from typing import Tuple

import structlog
import uvicorn

from example.config import settings

shared_processors: Tuple[structlog.types.Processor, ...] = (
    structlog.contextvars.merge_contextvars,
    structlog.stdlib.add_logger_name,
    structlog.stdlib.add_log_level,
    structlog.processors.TimeStamper(fmt=""iso""),
)

logging_config = {
    ""version"": 1,
    ""disable_existing_loggers"": False,
    ""formatters"": {
        ""json"": {
            ""()"": structlog.stdlib.ProcessorFormatter,
            ""processor"": structlog.processors.JSONRenderer(),
            ""foreign_pre_chain"": shared_processors,
        },
        ""console"": {
            ""()"": structlog.stdlib.ProcessorFormatter,
            ""processor"": structlog.dev.ConsoleRenderer(),
            ""foreign_pre_chain"": shared_processors,
        },
        **uvicorn.config.LOGGING_CONFIG[""formatters""],
    },
    ""handlers"": {
        ""default"": {
            ""level"": ""DEBUG"",
            ""class"": ""logging.StreamHandler"",
            ""formatter"": ""json"" if not settings.debug else ""console"",
        },
        ""uvicorn.access"": {
            ""level"": ""INFO"",
            ""class"": ""logging.StreamHandler"",
            ""formatter"": ""access"",
        },
        ""uvicorn.default"": {
            ""level"": ""INFO"",
            ""class"": ""logging.StreamHandler"",
            ""formatter"": ""default"",
        },
    },
    ""loggers"": {
        """": {""handlers"": [""default""], ""level"": ""INFO""},
        ""uvicorn.error"": {
            ""handlers"": [""default"" if not settings.debug else ""uvicorn.default""],
            ""level"": ""INFO"",
            ""propagate"": False,
        },
        ""uvicorn.access"": {
            ""handlers"": [""default"" if not settings.debug else ""uvicorn.access""],
            ""level"": ""INFO"",
            ""propagate"": False,
        },
    },
}


def setup_logging() -> None:
    structlog.configure(
        processors=[
            structlog.stdlib.filter_by_level,
            *shared_processors,
            structlog.stdlib.PositionalArgumentsFormatter(),
            structlog.processors.StackInfoRenderer(),
            structlog.processors.format_exc_info,
            structlog.stdlib.ProcessorFormatter.wrap_for_formatter,
        ],
        context_class=dict,
        logger_factory=structlog.stdlib.LoggerFactory(),
        wrapper_class=structlog.stdlib.AsyncBoundLogger,
        cache_logger_on_first_use=True,
    )
    logging.config.dictConfig(logging_config)
```

And then it'll be run on the startup event:

```python
@app.on_event(""startup"")
async def startup_event() -> None:
    setup_logging()
```

It depends on a setting called `debug` which controls if it should output the regular uvicorn logging or json.","{""total_count"": 15, ""+1"": 7, ""-1"": 0, ""laugh"": 1, ""hooray"": 1, ""confused"": 0, ""heart"": 5, ""rocket"": 1, ""eyes"": 0}",779156520,Use structlog for logging,
https://github.com/simonw/datasette/issues/1036#issuecomment-762391426,https://api.github.com/repos/simonw/datasette/issues/1036,762391426,MDEyOklzc3VlQ29tbWVudDc2MjM5MTQyNg==,4997607,philshem,2021-01-18T17:45:00Z,2021-01-18T17:45:00Z,NONE,"It might be possible with this library: https://docs.python.org/3/library/imghdr.html

quick test of the downloaded blob:

```
>>> import imghdr
>>> imghdr.what('material_culture-1-image.blob')
'jpeg'
```

The output plugin would be cool. I'll look into making my first datasette plugin.

I'm also imagining displaying the image in the browser -- but that would be a step 2.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI,
https://github.com/simonw/datasette/issues/1194#issuecomment-762390568,https://api.github.com/repos/simonw/datasette/issues/1194,762390568,MDEyOklzc3VlQ29tbWVudDc2MjM5MDU2OA==,9599,simonw,2021-01-18T17:43:03Z,2021-01-18T17:43:03Z,OWNER,Should I just blanket copy over any query string argument that starts with an underscore? Any reason _not_ to do that?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",788447787,?_size= argument is not persisted by hidden form fields in the table filters,
https://github.com/simonw/datasette/issues/1194#issuecomment-762390401,https://api.github.com/repos/simonw/datasette/issues/1194,762390401,MDEyOklzc3VlQ29tbWVudDc2MjM5MDQwMQ==,9599,simonw,2021-01-18T17:42:38Z,2021-01-18T17:42:38Z,OWNER,"Relevant code: https://github.com/simonw/datasette/blob/a882d679626438ba0d809944f06f239bcba8ee96/datasette/views/table.py#L815-L827

It looks like there are other arguments that may not be persisted too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",788447787,?_size= argument is not persisted by hidden form fields in the table filters,
https://github.com/simonw/datasette/issues/1036#issuecomment-762387875,https://api.github.com/repos/simonw/datasette/issues/1036,762387875,MDEyOklzc3VlQ29tbWVudDc2MjM4Nzg3NQ==,9599,simonw,2021-01-18T17:36:36Z,2021-01-18T17:36:36Z,OWNER,"As you can see, I'm pretty paranoid about serving content with `Content-Type` HTTP headers because I'm so worried about execution vulnerabilities. I'm much more comfortable exploring that kind of thing in plugins, since that way people can opt-in to riskier features.

You found `datasette-media` which is my most comprehensive exploration of that idea so far - but there's definitely lots of room for more plugins along those lines. Maybe even an output plugin? `.jpg` as an export format which returns the `BLOB` column for a row as a JPEG image with the correct `content-type` header (but first verifies that the binary content does indeed look like a real JPEG) could be interesting.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI,
https://github.com/simonw/datasette/issues/1036#issuecomment-762385981,https://api.github.com/repos/simonw/datasette/issues/1036,762385981,MDEyOklzc3VlQ29tbWVudDc2MjM4NTk4MQ==,4997607,philshem,2021-01-18T17:32:13Z,2021-01-18T17:34:50Z,NONE,"Hi Simon

Just finding this old issue regarding downloading blobs. Nice work!

As a feature request, maybe it would be possible to assign a blob column as a certain data type (e.g. `.jpg`) and then each blob could be downloaded as that type of file (perhaps if the file types were constrained to normal blobs that people store in sqlite databases, this could avoid the execution stuff mentioned above).

I guess the column blob-type definition could fit into this dropdown selection:

Let me know if I should open a new issue with a feature request. (This could slowly go in the direction of displaying image blob-types in the browser.)

Thanks for the great tool!

---

edit: just reading the rest of the twitter thread: https://twitter.com/simonw/status/1318685933256855552

perhaps this is already possible in some form with the plugin datasette-media: https://github.com/simonw/datasette-media","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI,
https://github.com/dogsheep/swarm-to-sqlite/issues/11#issuecomment-761967094,https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/11,761967094,MDEyOklzc3VlQ29tbWVudDc2MTk2NzA5NA==,9599,simonw,2021-01-18T04:11:13Z,2021-01-18T04:11:13Z,MEMBER,"I just got a similar error:

```
  File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/swarm_to_sqlite/utils.py"", line 79, in save_checkin
    checkins_table.m2m(""users"", user, m2m_table=""with"", pk=""id"")
  File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 2048, in m2m
    id = other_table.insert(record, pk=pk, replace=True).last_pk
  File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1781, in insert
    return self.insert_all(
  File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1899, in insert_all
    self.insert_chunk(
  File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 1709, in insert_chunk
    result = self.db.execute(query, params)
  File ""/home/dogsheep/datasette-venv/lib/python3.8/site-packages/sqlite_utils/db.py"", line 226, in execute
    return self.conn.execute(sql, parameters)
pysqlite3.dbapi2.OperationalError: table users has no column named countryCode
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743400216,Error thrown: sqlite3.OperationalError: table users has no column named lastName,