issue_comments
6 rows where author_association = "CONTRIBUTOR", issue = 930807135 and user = 2670795 sorted by updated_at descending
Issue: Plugin hook for dynamic metadata (930807135) · 6 comments
1066222323 | brandonrobertz (2670795) | CONTRIBUTOR | created 2022-03-14T00:36:42Z | updated 2022-03-14T00:36:42Z
https://github.com/simonw/datasette/issues/1384#issuecomment-1066222323

All good. Report back any issues you find with this stuff. Metadata/dynamic config hasn't been tested widely outside of what I've done AFAIK. If you find a strong use case for async meta, it's going to be better to know sooner rather than later!

Reactions: 👍 1
1066169718 | brandonrobertz (2670795) | CONTRIBUTOR | created 2022-03-13T19:48:49Z | updated 2022-03-13T19:48:49Z
https://github.com/simonw/datasette/issues/1384#issuecomment-1066169718

You shouldn't need to do this, as I mentioned previously. The code inside …

Reactions: none
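The hook this thread is about is Datasette's synchronous `get_metadata` plugin hook. As a minimal sketch, assuming the documented `get_metadata(datasette, key, database, table)` signature, a plugin can return metadata dynamically like this (the database and table names below are placeholders, not taken from the thread):

```python
from datasette import hookimpl

@hookimpl
def get_metadata(datasette, key, database, table):
    # Datasette merges the returned dict into its metadata. Because the
    # hook is synchronous, any I/O performed here (file read, sqlite
    # query) must also be synchronous.
    return {
        "databases": {
            "content": {  # placeholder database name
                "tables": {
                    "articles": {"title": "Articles (live)"}  # placeholder table
                }
            }
        }
    }
```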
1066006292 | brandonrobertz (2670795) | CONTRIBUTOR | created 2022-03-13T02:09:44Z | updated 2022-03-13T02:09:44Z
https://github.com/simonw/datasette/issues/1384#issuecomment-1066006292

Reading from sqlite DBs is pretty quick and I didn't notice significant performance issues when I was benchmarking. I tested on very large Datasette deployments (hundreds of DBs, millions of rows). See "Many small queries are efficient in sqlite" for more information on the rationale here. Also note that in the datasette-live-config reference plugin, the DB connection is cached, so that eliminated most of the performance worries we had. If you need to ensure fresh metadata is being read inside of a …

Yes correct, the datasette-remote-metadata plugin doesn't do that. But the datasette-live-config plugin does. It supports a … Good luck!

Reactions: none
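To make the cached-connection point concrete, here is a rough sketch of the pattern described above: keep one persistent sqlite connection and answer each metadata lookup with a small indexed query. The file name and key/value table layout are assumptions for illustration, not the actual datasette-live-config schema.

```python
import sqlite3

_conn = None  # persistent connection, created once and reused

def _connection():
    global _conn
    if _conn is None:
        # check_same_thread=False because, in this sketch, the cached
        # connection may be reused across worker threads.
        _conn = sqlite3.connect("live_config.db", check_same_thread=False)
    return _conn

def lookup_metadata(key):
    # Each lookup is a cheap single-row SELECT against the cached
    # connection: the "many small queries" pattern.
    row = _connection().execute(
        "SELECT value FROM metadata WHERE key = ?", (key,)
    ).fetchone()
    return row[0] if row else None
```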
1065940779 | brandonrobertz (2670795) | CONTRIBUTOR | created 2022-03-12T18:49:29Z | updated 2022-03-12T18:50:07Z
https://github.com/simonw/datasette/issues/1384#issuecomment-1065940779

Hello! Just wanted to chime in and note that there's a plugin to have Datasette watch for updates to an external metadata.yaml/json and update the internal settings accordingly, so I think the cache/poll use case is already covered.

@khusmann If you don't need truly dynamic metadata then what you've come up with, or the plugin, ought to work fine. Making the `get_metadata` hook async won't improve the situation by itself, as only some of the code paths accessing metadata use that hook; the other paths use the internal metadata dict. Trying to force all paths through an async hook would have performance ramifications, and making everything use the internal metadata dict will cause problems for users who need changes to take effect immediately. This is why I came to the non-async solution, as it was the path of least change within Datasette. As always, open to new ideas, etc!

Reactions: none
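A minimal sketch of the watch/poll-and-cache approach described above: re-read an external metadata file only when its modification time changes. The file name and caching strategy here are assumptions for illustration, not the behaviour of the plugin mentioned in the comment.

```python
import json
import os

_cache = {"mtime": None, "metadata": {}}

def load_metadata(path="metadata.json"):
    # Reload the file only when its mtime changes; otherwise serve the
    # cached copy, so repeated metadata reads stay cheap.
    mtime = os.path.getmtime(path)
    if mtime != _cache["mtime"]:
        with open(path) as f:
            _cache["metadata"] = json.load(f)
        _cache["mtime"] = mtime
    return _cache["metadata"]
```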
869074701 | brandonrobertz (2670795) | CONTRIBUTOR | created 2021-06-26T23:45:18Z | updated 2021-06-26T23:45:37Z
https://github.com/simonw/datasette/issues/1384#issuecomment-869074701

I think you're right. I can't think of a reason why the plugin would care about the …

Reactions: none
869074182 | brandonrobertz (2670795) | CONTRIBUTOR | created 2021-06-26T23:37:42Z | updated 2021-06-26T23:37:42Z
https://github.com/simonw/datasette/issues/1384#issuecomment-869074182

Ideally this hook would be asynchronous, but when I started down that path I quickly realized how large of a change this would be, since metadata gets used synchronously across the entire Datasette codebase. (And calling async code from sync is non-trivial.) In my live-configuration implementation I use synchronous reads using a persistent sqlite connection. This works pretty well in practice, but I agree it's limiting. My thinking around this was to go with the path of least change as …

Reactions: none
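As a small illustration of why "calling async code from sync is non-trivial": once an event loop is already running in the process (as it is while Datasette serves a request), a synchronous call path cannot simply run a coroutine to completion on that loop. The function names below are hypothetical.

```python
import asyncio

async def fetch_metadata():
    # Hypothetical async source of metadata.
    return {"title": "dynamic"}

def sync_caller():
    # Works when no event loop is running, but raises
    # "RuntimeError: asyncio.run() cannot be called from a running event
    # loop" if invoked from code already executing inside a loop, which
    # is why the synchronous read path described above sidesteps the
    # problem entirely.
    return asyncio.run(fetch_metadata())
```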
Table schema:

CREATE TABLE [issue_comments] (
    [html_url] TEXT,
    [issue_url] TEXT,
    [id] INTEGER PRIMARY KEY,
    [node_id] TEXT,
    [user] INTEGER REFERENCES [users]([id]),
    [created_at] TEXT,
    [updated_at] TEXT,
    [author_association] TEXT,
    [body] TEXT,
    [reactions] TEXT,
    [issue] INTEGER REFERENCES [issues]([id]),
    [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);