issue_comments: 1247149969

Comment by simonw (OWNER) on simonw/sqlite-utils issue #297, posted 2022-09-14:
https://github.com/simonw/sqlite-utils/issues/297#issuecomment-1247149969

As an aside, https://avi.im/blag/2021/fast-sqlite-inserts/ inspired me to try PyPy, since that article claimed a 2.5x speedup using PyPy compared to regular Python for a CSV import script.

Setup:

```
brew install pypy3
cd /tmp
pypy3 -m venv venv
source venv/bin/activate
pip install sqlite-utils
```

I grabbed the first 760M of that https://static.openfoodfacts.org/data/en.openfoodfacts.org.products.csv file (didn't wait for the whole thing to download).

Then:

```
time sqlite-utils insert pypy.db t en.openfoodfacts.org.products.csv --csv
[###################################-]  99%
11.76s user 2.26s system 93% cpu 14.981 total
```

Compared to regular Python sqlite-utils doing the same thing:

```
time sqlite-utils insert py.db t en.openfoodfacts.org.products.csv --csv
[###################################-]  99%
11.36s user 2.06s system 93% cpu 14.341 total
```

So no perceptible performance difference.
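For anyone who wants to reproduce this kind of comparison without the CLI, here is a minimal sketch of the same measurement done directly with the standard library: it generates a hypothetical in-memory CSV (standing in for the Open Food Facts file, which is not bundled here) and times a batched insert. Running it under `pypy3` versus `python3` isolates interpreter overhead the same way the two CLI runs above do. The table name `t` matches the commands above; everything else is my own illustration.

```python
# Illustrative benchmark sketch, not part of the original comment:
# time a bulk CSV insert into SQLite using only the stdlib.
import csv
import io
import sqlite3
import time

# Hypothetical in-memory CSV standing in for the real products file.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["code", "product_name", "brands"])
for i in range(100_000):
    writer.writerow([str(i), f"product {i}", f"brand {i % 50}"])
buf.seek(0)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (code TEXT, product_name TEXT, brands TEXT)")

reader = csv.reader(buf)
next(reader)  # skip the header row
start = time.perf_counter()
with conn:  # wrap the whole batch in a single transaction
    conn.executemany("INSERT INTO t VALUES (?, ?, ?)", reader)
elapsed = time.perf_counter() - start

count = conn.execute("SELECT count(*) FROM t").fetchone()[0]
print(f"inserted {count} rows in {elapsed:.2f}s")
```

The single-transaction `executemany()` matters: committing per row is the classic SQLite insert bottleneck, so without it the timing would mostly measure fsync overhead rather than interpreter speed.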
