github
html_url | issue_url | id | node_id | user | created_at | updated_at | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
https://github.com/simonw/datasette/issues/357#issuecomment-558461851 | https://api.github.com/repos/simonw/datasette/issues/357 | 558461851 | MDEyOklzc3VlQ29tbWVudDU1ODQ2MTg1MQ== | 9599 | 2019-11-26T05:05:21Z | 2019-11-26T05:05:21Z | OWNER | Here's an example plugin I set up using the experimental hook in d11fd2cbaa6b31933b1319f81b5d1520726cb0b6:

```python
import json
from datasette import hookimpl
import threading
import requests
import time


def change_over_time(m, metadata_value):
    while True:
        print(metadata_value)
        fetched = requests.get(metadata_value).json()
        counter = m["counter"]
        m.clear()
        m["counter"] = counter + 1
        m.update(fetched)
        m["counter"] += 1
        m["title"] = "{} {}".format(m.get("title", ""), m["counter"])
        time.sleep(10)


@hookimpl(trylast=True)
def load_metadata(metadata_value):
    m = {
        "counter": 0,
    }
    x = threading.Thread(
        target=change_over_time, args=(m, metadata_value), daemon=True
    )
    x.start()
    x.setName("datasette-metadata-counter")
    return m
```

It runs a separate thread that fetches the provided URL every 10 seconds:

```
datasette -m metadata.json --memory -p 8069 -m https://gist.githubusercontent.com/simonw/e8e4fcd7c0a9c951f7dd976921992157/raw/b702d18a6a078a0fb94ef1cee62e11a3396e0336/demo-metadata.json
```

I learned a bunch of things from this prototype.

First, this is the wrong place to run the code: https://github.com/simonw/datasette/blob/d11fd2cbaa6b31933b1319f81b5d1520726cb0b6/datasette/cli.py#L337-L343

I wanted the plugin hook to be able to receive a `datasette` instance, so implementations could potentially run their own database queries. Calling the hook in the CLI function here happens BEFORE the `Datasette()` instance is created, so that doesn't work.

I wanted to build a demo of a plugin that would load metadata periodically from an external URL (see #238) - but this threaded implementation is pretty naive. It results in a hit every 10 seconds even if no-one is using Datasette!
A smarter implementation would be to fetch and cache the results - then only re-fetch them if more than 10 seconds have passed since the last time the metadata was accessed. But... doing t… | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
348043884 | |
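The "fetch and cache, then only re-fetch on stale access" idea from the comment above could be sketched roughly like this. This is a hypothetical helper, not anything Datasette ships; the `fetch` callable and `CachedMetadata` name are made up for illustration:

```python
import time


class CachedMetadata:
    """Re-fetch metadata only when it is accessed AND the cache is stale.

    Unlike the background-thread prototype, no network traffic happens
    while nobody is using Datasette: the staleness check runs on access.
    """

    def __init__(self, fetch, ttl=10):
        self.fetch = fetch      # callable returning a metadata dict
        self.ttl = ttl          # seconds before a cached copy goes stale
        self._cached = {}
        self._fetched_at = None

    def get(self):
        now = time.monotonic()
        if self._fetched_at is None or now - self._fetched_at > self.ttl:
            self._cached = self.fetch()
            self._fetched_at = now
        return self._cached
```

One wrinkle this sketch ignores (and the comment hints at with "But... doing t…"): the re-fetch here blocks the caller, which is exactly the asyncio friction discussed in the next comment.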
https://github.com/simonw/datasette/issues/357#issuecomment-558459823 | https://api.github.com/repos/simonw/datasette/issues/357 | 558459823 | MDEyOklzc3VlQ29tbWVudDU1ODQ1OTgyMw== | 9599 | 2019-11-26T04:55:44Z | 2019-11-26T04:56:24Z | OWNER | This needs to play nicely with `asyncio` - which means that the plugin hook needs to be able to interact with the event loop somehow. That said... I don't particularly want to change everywhere that accesses metadata into an `await` call. So this is tricky. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
348043884 | |
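One way to square the circle described above - refresh happens on the event loop, but reads stay synchronous - is to run the refresh as an asyncio task that mutates a plain dict in place. This is a sketch of the idea only, not a Datasette API; `fetch` is an assumed coroutine that returns a metadata dict:

```python
import asyncio


async def refresh_metadata_periodically(m, fetch, interval=10):
    # Runs as a task on the event loop and mutates the shared dict
    # in place. Synchronous code keeps a reference to `m` and reads
    # it directly - no `await` needed at the access sites.
    while True:
        m.update(await fetch())
        await asyncio.sleep(interval)
```

The trade-off is the same as the threaded prototype's: the refresh runs on a timer whether or not anyone is reading the metadata.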
https://github.com/simonw/datasette/issues/357#issuecomment-558446045 | https://api.github.com/repos/simonw/datasette/issues/357 | 558446045 | MDEyOklzc3VlQ29tbWVudDU1ODQ0NjA0NQ== | 9599 | 2019-11-26T03:43:17Z | 2019-11-26T03:43:17Z | OWNER | I think only one plugin gets to work at a time. The plugin can return a dictionary which is used for live lookups of metadata every time it's accessed - which means the plugin can itself mutate that dictionary. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
348043884 | |
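The "live lookup" contract described above - call the hook once, but consult the returned dict on every access so the plugin can mutate it later - can be sketched like this. `MetadataHolder` is a hypothetical name for illustration, not part of Datasette:

```python
class MetadataHolder:
    """Hold the dict a plugin returned and read it live on each access."""

    def __init__(self, plugin_metadata):
        # Keep the plugin's own dict - deliberately NOT a copy - so any
        # mutation the plugin makes later is visible to every lookup.
        self._m = plugin_metadata

    def metadata(self, key, default=None):
        return self._m.get(key, default)
```

This is exactly what lets the threaded prototype work: the background thread mutates the same dict object that `load_metadata` returned.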
https://github.com/simonw/datasette/issues/641#issuecomment-558443464 | https://api.github.com/repos/simonw/datasette/issues/641 | 558443464 | MDEyOklzc3VlQ29tbWVudDU1ODQ0MzQ2NA== | 9599 | 2019-11-26T03:30:02Z | 2019-11-26T03:30:02Z | OWNER | https://datasette.readthedocs.io/en/latest/custom_templates.html#serving-static-files | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
528442126 | |
https://github.com/simonw/datasette/issues/639#issuecomment-558439989 | https://api.github.com/repos/simonw/datasette/issues/639 | 558439989 | MDEyOklzc3VlQ29tbWVudDU1ODQzOTk4OQ== | 9599 | 2019-11-26T03:14:27Z | 2019-11-26T03:14:27Z | OWNER | @jacobian does this sound like something that could work? | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
527670799 | |
https://github.com/simonw/datasette/issues/357#issuecomment-558432963 | https://api.github.com/repos/simonw/datasette/issues/357 | 558432963 | MDEyOklzc3VlQ29tbWVudDU1ODQzMjk2Mw== | 9599 | 2019-11-26T02:40:31Z | 2019-11-26T02:40:31Z | OWNER | A plugin hook for this would enable #639. Renaming this issue. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
348043884 | |
https://github.com/simonw/datasette/issues/639#issuecomment-558432868 | https://api.github.com/repos/simonw/datasette/issues/639 | 558432868 | MDEyOklzc3VlQ29tbWVudDU1ODQzMjg2OA== | 9599 | 2019-11-26T02:40:06Z | 2019-11-26T02:40:06Z | OWNER | Unfortunately I don't think it's possible to do this with Heroku. Heroku treats all deployments as total replacements - that's part of how they achieve zero-downtime deployments, since they run the new deployment at the same time as the old deployment and then switch traffic over at the load balancer. I did have one idea that's relevant here: #238 - which would provide a mechanism for `metadata.json` to be hosted on a separate URL (e.g. a gist) and have Datasette periodically fetch a new copy. I closed that in favour of #357 - a plugin hook for loading metadata. That's still something I'm interested in exploring. | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
527670799 |