issue_comments
13 rows where "updated_at" is on date 2020-06-23 and user = 9599 sorted by updated_at descending
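The filter described above can be sketched as a SQLite query. This is a minimal sketch assuming the `issue_comments` schema shown at the bottom of the page (only the relevant columns; the third sample row is made up to show the filter excluding it):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE [issue_comments] ("
    "[id] INTEGER PRIMARY KEY, [user] INTEGER, [updated_at] TEXT)"
)
conn.executemany(
    "INSERT INTO issue_comments VALUES (?, ?, ?)",
    [
        (648442511, 9599, "2020-06-23T21:39:41Z"),
        (648440634, 9599, "2020-06-23T21:35:16Z"),
        (123456789, 1234, "2020-06-22T09:00:00Z"),  # different user, different day
    ],
)

# "updated_at is on date 2020-06-23" expressed as a lexicographic range over
# the ISO-8601 strings; this matches the same rows as comparing date(updated_at)
# but can also use an index on updated_at.
rows = conn.execute(
    "SELECT id, updated_at FROM issue_comments "
    "WHERE [user] = 9599 "
    "AND updated_at >= '2020-06-23' AND updated_at < '2020-06-24' "
    "ORDER BY updated_at DESC"
).fetchall()
print(rows)
```

Because ISO-8601 timestamps sort lexicographically, the range comparison and the descending `ORDER BY` both work on the raw TEXT column.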
All 13 comments are by simonw (user 9599, author_association OWNER); on every row created_at equals updated_at, the reactions JSON is the all-zero default, and performed_via_github_app is empty. Newest first:

- 648442511 · 2020-06-23T21:39:41Z · Support for compound (composite) foreign keys (644161221)
  https://github.com/simonw/sqlite-utils/issues/117#issuecomment-648442511
  > So there are two sides to supporting this:

- 648440634 · 2020-06-23T21:35:16Z · Support for compound (composite) foreign keys (644161221)
  https://github.com/simonw/sqlite-utils/issues/117#issuecomment-648440634
  > Relevant discussion: https://github.com/simonw/sqlite-generate/issues/8#issuecomment-648438056

- 648440525 · 2020-06-23T21:35:01Z · Support for compound (composite) foreign keys (644161221)
  https://github.com/simonw/sqlite-utils/issues/117#issuecomment-648440525
  > Here's what's missing:
  > The first two columns returned by

- 648434885 · 2020-06-23T21:21:33Z · Documentation for table.pks introspection property (644122661)
  https://github.com/simonw/sqlite-utils/issues/116#issuecomment-648434885

- 648403834 · 2020-06-23T20:36:29Z · Documentation for table.pks introspection property (644122661)
  https://github.com/simonw/sqlite-utils/issues/116#issuecomment-648403834
  > Should go in this section https://sqlite-utils.readthedocs.io/en/stable/python-api.html#introspection - under

- 648234787 · 2020-06-23T15:22:51Z · Database page loads too slowly with many large tables (due to table counts) (642572841)
  https://github.com/simonw/datasette/issues/859#issuecomment-648234787
  > I wonder if this is a SQLite caching issue then? Datasette has a configuration option for this but I haven't spent much time experimenting with it so I don't know how much of an impact it can have: https://datasette.readthedocs.io/en/stable/config.html#cache-size-kb

- 648163272 · 2020-06-23T13:52:23Z · Database page loads too slowly with many large tables (due to table counts) (642572841)
  https://github.com/simonw/datasette/issues/859#issuecomment-648163272
  > I'm chunking inserts at 100 at a time right now: https://github.com/simonw/sqlite-utils/blob/4d9a3204361d956440307a57bd18c829a15861db/sqlite_utils/db.py#L1030 I think the performance is more down to using Faker to create the test data - generating millions of entirely fake, randomized records takes a fair bit of time.

- 647894903 · 2020-06-23T04:07:59Z · Database page loads too slowly with many large tables (due to table counts) (642572841)
  https://github.com/simonw/datasette/issues/859#issuecomment-647894903
  > Just to check: are you seeing the problem on this page: https://latest.datasette.io/fixtures (the database page) - or this page (the table page): https://latest.datasette.io/fixtures/compound_three_primary_keys If it's the table page then the problem may well be #862.

- 647893140 · 2020-06-23T03:59:51Z · Handle really wide tables better (507454958)
  https://github.com/simonw/datasette/issues/596#issuecomment-647893140
  > Related: #862 - a time limit on the total time spent considering suggested facets for a table.

- 647892930 · 2020-06-23T03:58:48Z · Set an upper limit on total facet suggestion time for a page (643510821)
  https://github.com/simonw/datasette/issues/862#issuecomment-647892930
  > Should this be controlled by a separate configuration setting? I'm inclined to say no - I think instead I'll set the limit to be 10 * whatever

- 647890619 · 2020-06-23T03:48:21Z · Database page loads too slowly with many large tables (due to table counts) (642572841)
  https://github.com/simonw/datasette/issues/859#issuecomment-647890619
  > Looks like that will take 35 minutes to run (it's not a particularly fast tool).

- 647890378 · 2020-06-23T03:47:19Z · Database page loads too slowly with many large tables (due to table counts) (642572841)
  https://github.com/simonw/datasette/issues/859#issuecomment-647890378
  > I generated a 600MB database using sqlite-generate just now - with 100 tables at 100,000 rows and 3 tables at 1,000,000 rows - and performance of the database page was fine, 250ms. Those tables only had 4 columns each though. You said "200k+, 50+ rows in a couple of tables" - does that mean 50+ columns? I'll try with larger numbers of columns and see what difference that makes.

- 647889674 · 2020-06-23T03:44:17Z · Script to generate larger SQLite test files (642652808)
  https://github.com/simonw/datasette/issues/861#issuecomment-647889674
  > https://github.com/simonw/sqlite-generate is now ready to be used - see also https://pypi.org/project/sqlite-generate/
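One comment above mentions chunking inserts at 100 rows at a time. The batching idea can be sketched with plain sqlite3 (this is not sqlite-utils' actual code; the `items` table and sample rows are made up):

```python
import sqlite3

def insert_in_chunks(conn, sql, rows, chunk_size=100):
    """Insert rows in batches of chunk_size, one transaction per batch."""
    for start in range(0, len(rows), chunk_size):
        with conn:  # each batch commits when the block exits cleanly
            conn.executemany(sql, rows[start:start + chunk_size])

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE [items] ([value] TEXT)")
insert_in_chunks(conn, "INSERT INTO items VALUES (?)",
                 [(f"row {i}",) for i in range(250)])
count = conn.execute("SELECT count(*) FROM items").fetchone()[0]
print(count)  # 250
```

Committing per batch keeps any single transaction small while still avoiding the overhead of one commit per row.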
CREATE TABLE [issue_comments] (
[html_url] TEXT,
[issue_url] TEXT,
[id] INTEGER PRIMARY KEY,
[node_id] TEXT,
[user] INTEGER REFERENCES [users]([id]),
[created_at] TEXT,
[updated_at] TEXT,
[author_association] TEXT,
[body] TEXT,
[reactions] TEXT,
[issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
ON [issue_comments] ([user]);
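The schema's two single-column foreign keys (the kind the compound-foreign-key issue above wants to generalize) can be inspected with SQLite's own pragma. A minimal sketch, recreating the schema plus stub `users` and `issues` tables that this page doesn't show:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE [users] ([id] INTEGER PRIMARY KEY);
CREATE TABLE [issues] ([id] INTEGER PRIMARY KEY);
CREATE TABLE [issue_comments] (
   [html_url] TEXT, [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY, [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT, [updated_at] TEXT,
   [author_association] TEXT, [body] TEXT, [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
""")

# Each foreign_key_list row is (id, seq, table, from, to, on_update,
# on_delete, match). A compound foreign key would appear as several rows
# sharing one id with increasing seq values.
fks = [
    (table, frm, to)
    for _id, _seq, table, frm, to, *_ in
    conn.execute("PRAGMA foreign_key_list(issue_comments)")
]
print(sorted(fks))
```

This is the same pragma that introspection tools read when rendering foreign-key links between tables.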