issues: 1733198948
column | value |
---|---|
id | 1733198948 |
node_id | I_kwDOCGYnMM5nToRk |
number | 555 |
title | Filter table by a large bunch of ids |
user | 10843208 |
state | open |
locked | 0 |
assignee |  |
milestone |  |
comments | 1 |
created_at | 2023-05-31T00:29:51Z |
updated_at | 2023-06-14T22:01:57Z |
closed_at |  |
author_association | NONE |
pull_request |  |
repo | 140912432 |
type | issue |
active_lock_reason |  |
performed_via_github_app |  |
reactions | { "url": "https://api.github.com/repos/simonw/sqlite-utils/issues/555/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
draft |  |
state_reason |  |

body:

Hi! This might be a question related to both SQLite & sqlite-utils, and you might be more experienced with them.

I have a large bunch of ids, and I'm wondering which is the best way to query them, in terms of performance and, if possible, simplicity.

The naive approach would be something like passing all the ids straight into a single query (see the first sketch below).

Another approach might be creating a temp table, or an in-memory db table, inserting all the ids into that table and then joining it with the target one. I failed to attach an in-memory db both using sqlite-utils and plain SQL's execute(), so my closest approach is something like the second sketch below.
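The two code blocks referenced in the body did not survive the export, so the snippets below are hedged reconstructions rather than the author's original code. First, a minimal sketch of the naive approach, assuming a hypothetical `videos.db` database with a `videos` table filtered on a text `id` column (none of these names appear in the issue text):

```python
from sqlite_utils import Database

# Hypothetical names: the issue does not say which database, table or
# column is being filtered.
db = Database("videos.db")
ids = ["a1", "b2", "c3"]  # imagine thousands of these

# One "?" placeholder per id. SQLite caps the number of bound parameters
# per statement (999 before SQLite 3.32.0, 32766 from 3.32.0 onwards),
# so a very large id list would have to be chunked.
placeholders = ", ".join("?" for _ in ids)
rows = list(db.query(f"select * from videos where id in ({placeholders})", ids))
```

And a minimal sketch of the temp-table variant, under the same assumptions; the `tmp_ids` table name is likewise made up:

```python
from sqlite_utils import Database

db = Database("videos.db")  # hypothetical; assumed to hold a "videos" table
ids = ["a1", "b2", "c3"]    # imagine a much larger list

# TEMPORARY tables live in the temp schema and are dropped automatically
# when the connection closes; created here with raw SQL.
db.execute("create temp table if not exists tmp_ids (id text primary key)")

# Bulk-load the ids through the underlying sqlite3 connection.
db.conn.executemany(
    "insert or ignore into tmp_ids (id) values (?)", ((i,) for i in ids)
)

# Join the id list against the target table; the primary key on
# tmp_ids.id gives the join an index to use.
matching = list(
    db.query("select videos.* from videos join tmp_ids on videos.id = tmp_ids.id")
)

db.execute("drop table if exists tmp_ids")
```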
That kinda worked, but I couldn't find an option in sqlite-utils's …