html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,issue,performed_via_github_app
https://github.com/simonw/datasette/issues/1384#issuecomment-1065929510,https://api.github.com/repos/simonw/datasette/issues/1384,1065929510,IC_kwDOBm6k_c4_iMsm,167160,2022-03-12T17:49:59Z,2022-03-12T17:49:59Z,NONE,"Ok, I'm taking a slightly different approach, which I think is sort of close to the in-memory _metadata table idea. I'm using a startup hook to load metadata and other info from the database, which I store on the datasette object for later:

```
from datasette import hookimpl


@hookimpl
def startup(datasette):
    async def inner():
        # Run the db query once at startup and cache the results
        datasette._mypluginmetadata = await datasette.get_database().execute(
            'select * from my_plugin_metadata'  # whatever query the plugin needs
        )

    return inner
```

Then, I can use this in other plugins:

```
from datasette import hookimpl


@hookimpl
def render_cell(value, column, table, database, datasette):
    # Use the precached datasette._mypluginmetadata instead of
    # issuing a redundant db query for every cell
    metadata = datasette._mypluginmetadata
    return None  # None falls back to the default rendering
```

For my app, I don't need anything to update dynamically, so it's fine to pre-populate everything on startup. It's also good to have things precached, especially for a hook like render_cell, which would otherwise require a ton of redundant db queries.

Makes me wonder if we could take a similar caching approach with the internal _metadata table: have a little watchdog that queries all of the attached dbs for their _metadata tables every 5 minutes or so, then merges the results into the in-memory _metadata table, which plugins could access synchronously, or something like that. For most of the use cases I can think of, live updates don't need to take effect immediately; refreshing the cache every 5 minutes, or on some other trigger (adjustable with a config setting), would be just fine.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",930807135,
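A minimal sketch of the watchdog idea from the comment above, written as a standalone plugin. The plugin name metadata-watchdog, the refresh_seconds setting, the datasette._metadata_cache attribute, and the key/value columns on the per-database _metadata tables are all assumptions for illustration; the actual _metadata schema was still being designed in this issue.

```
import asyncio

from datasette import hookimpl


async def _refresh_metadata_cache(datasette, every):
    # Re-query each attached database for its _metadata table on a timer,
    # then swap the merged results into a single in-memory dict.
    while True:
        merged = {}
        for name, db in datasette.databases.items():
            try:
                results = await db.execute("select key, value from _metadata")
            except Exception:
                continue  # this database has no _metadata table
            for row in results:
                merged[(name, row["key"])] = row["value"]
        # Plugins such as render_cell can now read this synchronously
        datasette._metadata_cache = merged
        await asyncio.sleep(every)


@hookimpl
def startup(datasette):
    async def inner():
        # Hypothetical config setting, per the "adjustable with a
        # config setting" idea in the comment
        config = datasette.plugin_config("metadata-watchdog") or {}
        every = config.get("refresh_seconds", 300)  # "every 5 minutes or so"
        # Fire-and-forget background task; startup is not blocked
        asyncio.create_task(_refresh_metadata_cache(datasette, every))

    return inner
```

With something like this running, a hook like render_cell could read datasette._metadata_cache directly without issuing a query per cell, and edits to a database's _metadata table would show up within one refresh interval.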