issue_comments
5 rows where user = 167160 sorted by updated_at descending
id | html_url | issue_url | node_id | user | created_at | updated_at ▲ | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
1066194130 | https://github.com/simonw/datasette/issues/1384#issuecomment-1066194130 | https://api.github.com/repos/simonw/datasette/issues/1384 | IC_kwDOBm6k_c4_jNTS | khusmann 167160 | 2022-03-13T22:23:04Z | 2022-03-13T22:23:04Z | NONE | Ah, sorry, I didn't get what you were saying the first time. Using _metadata_local in that way makes total sense -- I agree, refreshing metadata for each cell seemed quite excessive. Now I'm on the same page! :) |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin hook for dynamic metadata 930807135 | |
1066143991 | https://github.com/simonw/datasette/issues/1384#issuecomment-1066143991 | https://api.github.com/repos/simonw/datasette/issues/1384 | IC_kwDOBm6k_c4_jBD3 | khusmann 167160 | 2022-03-13T17:13:09Z | 2022-03-13T17:13:09Z | NONE | Thanks for taking the time to reply @brandonrobertz, this is really helpful info.
Ah, that's nifty! Yeah, then caching on the Python side is likely a waste :) I'm new to working with SQLite, so it's super good to know that the many-small-queries approach is a common pattern.
For my reference, did you include a (If you didn't test this specific situation, no worries -- I'm just trying to calibrate my intuition on this and can do my own benchmarks at some point; there's a rough benchmark sketch after this row.)
Yeah, getting metadata (and static pages as well, for that matter) from internal tables definitely has my vote for inclusion as a standard feature! It's really nice to be able to distribute a single *.db with all the metadata and static pages bundled. My metadata are sufficiently complex/domain-specific that it makes sense to continue with my own plugin for now, but I'll be thinking about more general parts I can spin off as possible contributions to liveconfig (if you're open to them) or other plugins in this ecosystem. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin hook for dynamic metadata 930807135 | |
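A rough sketch of the kind of benchmark mentioned above (not part of the original comment), comparing one small query per rendered cell against a single query cached in a dict up front. It uses the plain sqlite3 module rather than Datasette's async database wrapper, and the meta table, column names, and row counts are made up for illustration.

```python
import sqlite3
import time

# Toy metadata table standing in for a per-database _metadata table.
conn = sqlite3.connect(":memory:")
conn.execute("create table meta (key text primary key, value text)")
conn.executemany(
    "insert into meta values (?, ?)",
    [(f"col{i}", f"label {i}") for i in range(100)],
)

N = 10_000  # pretend we are rendering 10k cells

# Option 1: one small query per cell (the many-small-queries pattern).
start = time.perf_counter()
for i in range(N):
    conn.execute(
        "select value from meta where key = ?", (f"col{i % 100}",)
    ).fetchone()
print("per-cell queries:", time.perf_counter() - start)

# Option 2: single query up front, cached in a dict, then plain lookups.
start = time.perf_counter()
cache = dict(conn.execute("select key, value from meta"))
for i in range(N):
    cache[f"col{i % 100}"]
print("pre-cached dict: ", time.perf_counter() - start)
```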
1065951744 | https://github.com/simonw/datasette/issues/1384#issuecomment-1065951744 | https://api.github.com/repos/simonw/datasette/issues/1384 | IC_kwDOBm6k_c4_iSIA | khusmann 167160 | 2022-03-12T19:47:17Z | 2022-03-12T19:47:17Z | NONE | Awesome, thanks @brandonrobertz! The plugin is close, but it looks like it only grabs remote metadata, is that right? Instead, what I want is to grab metadata embedded in the attached databases. At this point I've realized I need a lot more flexibility in metadata for my data model (especially around formatting cell values and custom file exports), so rather than extending that plugin I'll continue working on a plugin specific to my app. If I'm understanding your plugin code correctly, you query the db using the sync handle every time.
I agree. That leaves your app, where it sounds like you want changes made by the user in the browser to be immediately reflected, rather than having to wait for the next metadata refresh. In this case I wonder if you could have your app make a sync write to the datasette object so the change takes immediate effect, but then have a separate async polling mechanism that eventually writes the change out to the database for long-term persistence (there's a rough sketch of this after this row). Then you'd have the best of both worlds, I think? But probably not worth the trouble if your use cases are small (and/or you're not reading metadata/config from tight loops like render_cell). |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin hook for dynamic metadata 930807135 | |
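A minimal sketch of that "write synchronously to the datasette object, persist asynchronously later" idea (not from the original thread). It assumes Datasette's documented startup plugin hook and Database.execute_write(); the database name ("mydata"), the __config table, the attribute names, and the set_config() helper are made-up placeholders.

```python
import asyncio

from datasette import hookimpl

REFRESH_SECONDS = 30  # hypothetical; "adjustable with a config setting"


def set_config(datasette, key, value):
    # Synchronous, in-memory write: the change is visible to the very
    # next request, with no database round trip.
    if not hasattr(datasette, "_live_config"):
        datasette._live_config = {}
        datasette._dirty_keys = set()
    datasette._live_config[key] = value
    datasette._dirty_keys.add(key)


async def _flush_loop(datasette):
    # Background task: eventually persist in-memory changes for
    # long-term storage.
    db = datasette.get_database("mydata")  # assumed database name
    while True:
        await asyncio.sleep(REFRESH_SECONDS)
        for key in list(getattr(datasette, "_dirty_keys", set())):
            await db.execute_write(
                "insert or replace into __config (key, value) values (?, ?)",
                [key, datasette._live_config[key]],
            )
            datasette._dirty_keys.discard(key)


@hookimpl
def startup(datasette):
    async def inner():
        # Kick off the polling/flush task once the event loop is running.
        datasette._flush_task = asyncio.create_task(_flush_loop(datasette))

    return inner
```

The in-memory write gives the immediacy, while the background flush bounds how often the database is touched; whether that trade-off is worth it depends, as the comment says, on how hot the read paths are.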
1065929510 | https://github.com/simonw/datasette/issues/1384#issuecomment-1065929510 | https://api.github.com/repos/simonw/datasette/issues/1384 | IC_kwDOBm6k_c4_iMsm | khusmann 167160 | 2022-03-12T17:49:59Z | 2022-03-12T17:49:59Z | NONE | Ok, I'm taking a slightly different approach, which I think is sort of close to the in-memory _metadata table idea. I'm using a startup hook to load metadata / other info from the database, which I store in the datasette object for later (see the sketch after this row).
Then I can use this in other plugins (also covered in that sketch).
For my app I don't need anything to update dynamically, so it's fine to pre-populate everything on startup. It's also good to have things precached, especially for a hook like render_cell, which would otherwise require a ton of redundant db queries. Makes me wonder if we could take a similar caching approach with the internal _metadata table. Like have a little watchdog that queries all of the attached dbs for their _metadata tables every 5 min or so and merges the results into the in-memory _metadata table, which plugins could then access synchronously, or something like that. For most of the use cases I can think of, live updates don't need to take effect immediately; refreshing a cache every 5 min or on some other trigger (adjustable with a config setting) would be just fine. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin hook for dynamic metadata 930807135 | |
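The code blocks from this comment aren't reproduced above; what follows is a minimal sketch of the pattern it describes, assuming Datasette's documented startup and render_cell plugin hooks. The database name ("mydata"), the __custom_metadata table, the "table.column" key convention, and the _custom_metadata attribute are made-up placeholders, not anything from the original plugin.

```python
from datasette import hookimpl


@hookimpl
def startup(datasette):
    # startup() may return an awaitable, which Datasette awaits before
    # it begins serving requests.
    async def inner():
        db = datasette.get_database("mydata")  # assumed database name
        results = await db.execute(
            "select key, value from __custom_metadata"  # assumed table
        )
        # Stash everything on the datasette object for later sync access.
        # Keys are assumed to look like "table.column".
        datasette._custom_metadata = dict(results.rows)

    return inner


@hookimpl
def render_cell(value, column, table, database, datasette):
    # Because the cache was pre-populated at startup, this hook can read
    # it synchronously -- no per-cell database query needed.
    meta = getattr(datasette, "_custom_metadata", {})
    label = meta.get(f"{table}.{column}")
    if label is not None and value is not None:
        return f"{value} ({label})"
    return None  # fall back to Datasette's default rendering
```

The periodic-refresh ("watchdog") variant mentioned in the comment could reuse the same idea: the startup hook would additionally schedule a task that re-runs the query every few minutes and swaps in the new dict, much like the polling sketch after the previous row.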
1062124485 | https://github.com/simonw/datasette/issues/1384#issuecomment-1062124485 | https://api.github.com/repos/simonw/datasette/issues/1384 | IC_kwDOBm6k_c4_TrvF | khusmann 167160 | 2022-03-08T19:26:32Z | 2022-03-08T19:26:32Z | NONE | Looks like I'm late to the party here, but wanted to join the convo if there's still time before this interface is solidified in v1.0. My plugin use case is for education / social science data, which is metadata-heavy in the documentation of measurement scales, instruments, collection procedures, etc. that I want to connect to columns, tables, and dbs (and render in static pages, but it looks like I can do that with the jinja plugin hook). I'm still digging in and I think @brandonrobertz's approach will work for me at least for now, but I want to bump this thread in the meantime -- are there still plans for an async metadata hook at some point in the future? (or are you considering other directions?) |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin hook for dynamic metadata 930807135 |
CREATE TABLE [issue_comments] (
  [html_url] TEXT,
  [issue_url] TEXT,
  [id] INTEGER PRIMARY KEY,
  [node_id] TEXT,
  [user] INTEGER REFERENCES [users]([id]),
  [created_at] TEXT,
  [updated_at] TEXT,
  [author_association] TEXT,
  [body] TEXT,
  [reactions] TEXT,
  [issue] INTEGER REFERENCES [issues]([id]),
  [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);