Comments on simonw/datasette issue #859

**simonw (OWNER)** commented at 2020-06-23T03:47:19Z (https://github.com/simonw/datasette/issues/859#issuecomment-647890378):

I generated a 600MB database using [sqlite-generate](https://github.com/simonw/sqlite-generate) just now - with 100 tables at 100,000 rows and 3 tables at 1,000,000 rows - and performance of the database page was fine: 250ms.

Those tables only had 4 columns each, though.

You said "200k+, 50+ rows in a couple of tables" - does that mean 50+ columns? I'll try with larger numbers of columns and see what difference that makes.

**simonw (OWNER)** commented at 2020-06-23T03:48:21Z (https://github.com/simonw/datasette/issues/859#issuecomment-647890619):

    sqlite-generate many-cols.db --tables 2 --rows 200000 --columns 50

Looks like that will take 35 minutes to run (it's not a particularly fast tool).

**simonw (OWNER)** commented at 2020-06-23T04:07:59Z (https://github.com/simonw/datasette/issues/859#issuecomment-647894903):

Just to check: are you seeing the problem on this page: https://latest.datasette.io/fixtures (the database page) - or on this page: https://latest.datasette.io/fixtures/compound_three_primary_keys (the table page)?

If it's the table page, then the problem may well be #862.

**simonw (OWNER)** commented at 2020-06-23T13:52:23Z (https://github.com/simonw/datasette/issues/859#issuecomment-648163272):

I'm chunking inserts 100 at a time right now: https://github.com/simonw/sqlite-utils/blob/4d9a3204361d956440307a57bd18c829a15861db/sqlite_utils/db.py#L1030

I think the performance is more down to using Faker to create the test data: generating millions of entirely fake, randomized records takes a fair bit of time.

**simonw (OWNER)** commented at 2020-06-23T15:22:51Z (https://github.com/simonw/datasette/issues/859#issuecomment-648234787):

I wonder if this is a SQLite caching issue, then?

Datasette has a configuration option for this, but I haven't spent much time experimenting with it, so I don't know how much of an impact it can have: https://datasette.readthedocs.io/en/stable/config.html#cache-size-kb
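For reference, a minimal sketch of what that setting controls, assuming Datasette's `cache_size_kb` option maps onto SQLite's `cache_size` PRAGMA (where a negative value is a size in KiB rather than a page count); the database filename and the 10MB figure here are illustrative:

```python
import sqlite3

# Open the generated database and enlarge the per-connection page
# cache. Negative cache_size values are KiB, so -10000 asks SQLite
# for roughly a 10MB cache.
conn = sqlite3.connect("many-cols.db")
conn.execute("PRAGMA cache_size = -10000")

# Queries on this connection can now keep more pages in memory
# instead of re-reading them from disk.
```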
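And a sketch of the chunked-insert pattern from the 13:52 comment, combining Faker with sqlite-utils' `insert_all()`; the table and column names are made up for illustration, and `batch_size=100` mirrors the chunk size in the `db.py` line linked above:

```python
from faker import Faker
from sqlite_utils import Database

fake = Faker()
db = Database("fake-data.db")

def fake_people(count):
    # Most of the generation time goes into the Faker calls here,
    # not the SQLite inserts themselves.
    for _ in range(count):
        yield {
            "name": fake.name(),
            "address": fake.address(),
            "email": fake.email(),
        }

# insert_all() writes rows in chunks of batch_size per INSERT.
db["people"].insert_all(fake_people(100_000), batch_size=100)
```

Passing a generator rather than a list keeps memory usage flat no matter how many rows are generated.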
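Finally, a quick way to reproduce the kind of timing quoted in the 03:47 comment (250ms for the database page), assuming a Datasette instance is serving the generated database locally on the default port:

```python
import time
import urllib.request

# Assumes `datasette many-cols.db` is running locally, so the
# database page lives at /many-cols on port 8001.
url = "http://127.0.0.1:8001/many-cols"

start = time.monotonic()
with urllib.request.urlopen(url) as response:
    response.read()
print(f"Database page: {(time.monotonic() - start) * 1000:.0f}ms")
```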