id: 695377804
node_id: MDU6SXNzdWU2OTUzNzc4MDQ=
number: 153
title: table.optimize() should delete junk rows from *_fts_docsize
user: 9599
state: closed
locked: 0
assignee:
milestone:
comments: 3
created_at: 2020-09-07T20:31:09Z
updated_at: 2020-09-24T20:35:46Z
closed_at: 2020-09-07T21:16:33Z
author_association: OWNER
pull_request:
repo: 140912432
type: issue
active_lock_reason:
performed_via_github_app:
draft:
state_reason: completed
reactions: {"url": "https://api.github.com/repos/simonw/sqlite-utils/issues/153/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0}

body:

> The second challenge here is cleaning up all of those junk rows in existing `*_fts_docsize` tables. Doing that just to the demo database from https://github-to-sqlite.dogsheep.net/github.db dropped its size from 22MB to 16MB! Here's the SQL:
>
> ```sql
> DELETE FROM [licenses_fts_docsize] WHERE id NOT IN (
>     SELECT rowid FROM [licenses_fts]);
> ```
>
> I can do that as part of the existing `table.optimize()` method, which optimizes FTS tables.

_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/issues/149#issuecomment-688501064_
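The cleanup quoted above generalizes beyond the `licenses_fts_docsize` example: every `*_fts_docsize` shadow table can be pruned of rows whose `id` no longer matches a `rowid` in its parent `*_fts` table. Below is a minimal sketch of that idea using only Python's standard `sqlite3` module; it is not the actual `table.optimize()` implementation in sqlite-utils, and the `cleanup_fts_docsize` helper name and the `github.db` path are illustrative assumptions.

```python
# Sketch only, not the sqlite-utils implementation: prune junk rows from every
# *_fts_docsize shadow table, then run the FTS5 'optimize' command on each
# parent *_fts table.
import sqlite3


def cleanup_fts_docsize(db_path):
    conn = sqlite3.connect(db_path)
    # Find candidate shadow tables by name.
    docsize_tables = [
        row[0]
        for row in conn.execute(
            "SELECT name FROM sqlite_master "
            "WHERE type = 'table' AND name LIKE '%_fts_docsize'"
        )
    ]
    for docsize in docsize_tables:
        # e.g. licenses_fts_docsize -> licenses_fts
        fts = docsize[: -len("_docsize")]
        # Delete docsize rows with no matching row in the FTS table.
        conn.execute(
            f"DELETE FROM [{docsize}] WHERE id NOT IN (SELECT rowid FROM [{fts}])"
        )
        # Ask FTS5 to merge and optimize its internal b-trees as well.
        conn.execute(f"INSERT INTO [{fts}]([{fts}]) VALUES ('optimize')")
    conn.commit()
    conn.close()


if __name__ == "__main__":
    cleanup_fts_docsize("github.db")  # assumed path for the demo database
```

Running this against a database like the demo `github.db`, followed by `VACUUM`, should reclaim the space freed by the deleted docsize rows.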