issue_comments
7 rows where "created_at" is on date 2019-11-26, reactions = "{"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}" and user = 9599 sorted by updated_at descending
id | html_url | issue_url | node_id | user | created_at | updated_at ▲ | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
558461851 | https://github.com/simonw/datasette/issues/357#issuecomment-558461851 | https://api.github.com/repos/simonw/datasette/issues/357 | MDEyOklzc3VlQ29tbWVudDU1ODQ2MTg1MQ== | simonw 9599 | 2019-11-26T05:05:21Z | 2019-11-26T05:05:21Z | OWNER | Here's an example plugin I set up using the experimental hook in d11fd2cbaa6b31933b1319f81b5d1520726cb0b6

```python
import json
from datasette import hookimpl
import threading
import requests
import time


def change_over_time(m, metadata_value):
    while True:
        print(metadata_value)
        fetched = requests.get(metadata_value).json()
        counter = m["counter"]
        m.clear()
        m["counter"] = counter + 1
        m.update(fetched)
        m["counter"] += 1
        m["title"] = "{} {}".format(m.get("title", ""), m["counter"])
        time.sleep(10)


@hookimpl(trylast=True)
def load_metadata(metadata_value):
    m = {
        "counter": 0,
    }
    x = threading.Thread(
        target=change_over_time, args=(m, metadata_value), daemon=True
    )
    x.start()
    x.setName("datasette-metadata-counter")
    return m
```

It runs a separate thread that fetches the provided URL every 10 seconds:
First, this is the wrong place to run the code: I wanted the plugin hook to be able to receive a

I wanted to build a demo of a plugin that would load metadata periodically from an external URL (see #238) - but this threaded implementation is pretty naive. It results in a hit every 10 seconds even if no-one is using Datasette!

A smarter implementation would be to fetch and cache the results - then only re-fetch them if more than 10 seconds have passed since the last time the metadata was accessed.

But... doing this neatly requires asyncio - and the plugin isn't running inside an event loop (since

I could try and refactor everything so that all calls to read from

Or maybe I could set it up so the plugin can start itself running in the event loop and call back to the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin hook for loading metadata.json 348043884 | |
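The "fetch and cache, then only re-fetch on access" idea described in the comment above can be sketched as follows. This is a minimal standalone illustration, not Datasette's actual implementation; `CachedMetadata` and its `fetch` parameter are hypothetical names, and the real version would need to be async-aware as the comment notes:

```python
import time


class CachedMetadata:
    """Hypothetical sketch of lazy TTL-based metadata refresh: re-fetch
    only when metadata is actually accessed and the cached copy is stale,
    instead of polling on a background thread every 10 seconds."""

    def __init__(self, fetch, ttl=10):
        self.fetch = fetch        # callable returning a metadata dict
        self.ttl = ttl            # seconds before the cache is stale
        self._cached = None
        self._fetched_at = 0.0

    def get(self):
        now = time.monotonic()
        if self._cached is None or now - self._fetched_at > self.ttl:
            self._cached = self.fetch()
            self._fetched_at = now
        return self._cached
```

With this shape, an idle Datasette instance makes no outbound requests at all; the cost moves to the first access after the TTL expires.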
558459823 | https://github.com/simonw/datasette/issues/357#issuecomment-558459823 | https://api.github.com/repos/simonw/datasette/issues/357 | MDEyOklzc3VlQ29tbWVudDU1ODQ1OTgyMw== | simonw 9599 | 2019-11-26T04:55:44Z | 2019-11-26T04:56:24Z | OWNER | This needs to play nicely with

That said... I don't particularly want to change everywhere that accesses metadata into a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin hook for loading metadata.json 348043884 | |
558446045 | https://github.com/simonw/datasette/issues/357#issuecomment-558446045 | https://api.github.com/repos/simonw/datasette/issues/357 | MDEyOklzc3VlQ29tbWVudDU1ODQ0NjA0NQ== | simonw 9599 | 2019-11-26T03:43:17Z | 2019-11-26T03:43:17Z | OWNER | I think only one plugin gets to work at a time. The plugin can return a dictionary which is used for live lookups of metadata every time it's accessed - which means the plugin can itself mutate that dictionary. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin hook for loading metadata.json 348043884 | |
558443464 | https://github.com/simonw/datasette/issues/641#issuecomment-558443464 | https://api.github.com/repos/simonw/datasette/issues/641 | MDEyOklzc3VlQ29tbWVudDU1ODQ0MzQ2NA== | simonw 9599 | 2019-11-26T03:30:02Z | 2019-11-26T03:30:02Z | OWNER | { "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Better documentation for --static option 528442126 | ||
558439989 | https://github.com/simonw/datasette/issues/639#issuecomment-558439989 | https://api.github.com/repos/simonw/datasette/issues/639 | MDEyOklzc3VlQ29tbWVudDU1ODQzOTk4OQ== | simonw 9599 | 2019-11-26T03:14:27Z | 2019-11-26T03:14:27Z | OWNER | @jacobian does this sound like something that could work? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
updating metadata.json without recreating the app 527670799 | |
558432963 | https://github.com/simonw/datasette/issues/357#issuecomment-558432963 | https://api.github.com/repos/simonw/datasette/issues/357 | MDEyOklzc3VlQ29tbWVudDU1ODQzMjk2Mw== | simonw 9599 | 2019-11-26T02:40:31Z | 2019-11-26T02:40:31Z | OWNER | A plugin hook for this would enable #639. Renaming this issue. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin hook for loading metadata.json 348043884 | |
558432868 | https://github.com/simonw/datasette/issues/639#issuecomment-558432868 | https://api.github.com/repos/simonw/datasette/issues/639 | MDEyOklzc3VlQ29tbWVudDU1ODQzMjg2OA== | simonw 9599 | 2019-11-26T02:40:06Z | 2019-11-26T02:40:06Z | OWNER | Unfortunately I don't think it's possible to do this with Heroku. Heroku treats all deployments as total replacements - that's part of how they achieve zero-downtime deployments, since they run the new deployment at the same time as the old deployment and then switch traffic over at the load balancer. I did have one idea that's relevant here: #238 - which would provide a mechanism for |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
updating metadata.json without recreating the app 527670799 |
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);
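The page's filter ("created_at on 2019-11-26, user = 9599, sorted by updated_at descending") can be reproduced as a plain SQL query against this schema. A minimal sketch using an in-memory database with one sample row (a real export would query the SQLite file behind this Datasette instance; foreign-key targets are omitted for brevity):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE issue_comments (
    html_url TEXT, issue_url TEXT, id INTEGER PRIMARY KEY, node_id TEXT,
    user INTEGER, created_at TEXT, updated_at TEXT, author_association TEXT,
    body TEXT, reactions TEXT, issue INTEGER, performed_via_github_app TEXT
)""")
conn.execute(
    "INSERT INTO issue_comments (id, user, created_at, updated_at) "
    "VALUES (?, ?, ?, ?)",
    (558461851, 9599, "2019-11-26T05:05:21Z", "2019-11-26T05:05:21Z"),
)

# SQLite's date() parses the ISO 8601 timestamps stored as TEXT,
# so the "created_at is on date" facet is a date() comparison.
rows = conn.execute(
    "SELECT id FROM issue_comments "
    "WHERE user = ? AND date(created_at) = ? "
    "ORDER BY updated_at DESC",
    (9599, "2019-11-26"),
).fetchall()
```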