
issue_comments

10 rows where "created_at" is on date 2021-12-20, "updated_at" is on date 2021-12-20 and user = 9599 sorted by updated_at descending
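
As a hedged sketch of how to reproduce this filtered view programmatically: the query below assumes a local github.db created by github-to-sqlite and uses the sqlite-utils Python API; the filename and the surrounding script are illustrative, not part of this page.

```python
# Sketch: the same filter as this page, run with the sqlite-utils Python API.
import sqlite_utils

db = sqlite_utils.Database("github.db")  # hypothetical local database
rows = db.query(
    """
    select id, issue_url, created_at, updated_at, author_association
    from issue_comments
    where date(created_at) = :day
      and date(updated_at) = :day
      and user = :user_id
    order by updated_at desc
    """,
    {"day": "2021-12-20", "user_id": 9599},
)
for row in rows:
    print(row["id"], row["updated_at"], row["issue_url"])
```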


issue 5

  • `sqlite-utils insert --convert` option 5
  • Writable canned queries fail to load custom templates 2
  • Idea: conversions= could take Python functions 1
  • TableView refactor 1
  • __call__() got an unexpected keyword argument 'specname' 1

user 1

  • simonw · 10

author_association 1

  • OWNER 10
id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions issue performed_via_github_app
998354538 https://github.com/simonw/datasette/pull/1554#issuecomment-998354538 https://api.github.com/repos/simonw/datasette/issues/1554 IC_kwDOBm6k_c47ga5q simonw 9599 2021-12-20T23:52:04Z 2021-12-20T23:52:04Z OWNER

Abandoning this since it didn't work how I wanted.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
TableView refactor 1079129258  
997514220 https://github.com/simonw/datasette/issues/1547#issuecomment-997514220 https://api.github.com/repos/simonw/datasette/issues/1547 IC_kwDOBm6k_c47dNvs simonw 9599 2021-12-20T01:26:25Z 2021-12-20T01:26:25Z OWNER

OK, this should hopefully fix that for you:

pip install https://github.com/simonw/datasette/archive/f36e010b3b69ada104b79d83c7685caf9359049e.zip
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Writable canned queries fail to load custom templates 1076388044  
997513369 https://github.com/simonw/datasette/issues/1547#issuecomment-997513369 https://api.github.com/repos/simonw/datasette/issues/1547 IC_kwDOBm6k_c47dNiZ simonw 9599 2021-12-20T01:24:43Z 2021-12-20T01:24:43Z OWNER

@wragge thanks, that's a bug! Working on that in #1575.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Writable canned queries fail to load custom templates 1076388044  
997513177 https://github.com/simonw/datasette/issues/1575#issuecomment-997513177 https://api.github.com/repos/simonw/datasette/issues/1575 IC_kwDOBm6k_c47dNfZ simonw 9599 2021-12-20T01:24:25Z 2021-12-20T01:24:25Z OWNER

Looks like specname is new in Pluggy 1.0: https://github.com/pytest-dev/pluggy/blob/main/CHANGELOG.rst#pluggy-100-2021-08-25
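
For context, a minimal sketch of the pluggy feature involved: specname= lets a hook implementation register under a different hook name than the decorated function, and it only exists in pluggy 1.0+. The plugin function below is an invented illustration, not code from this thread.

```python
# Sketch (assumes pluggy >= 1.0): specname= registers this function as an
# implementation of the "prepare_connection" hook even though the Python
# function has a different name. On older pluggy versions the decorator call
# fails with: TypeError: __call__() got an unexpected keyword argument 'specname'
import pluggy

hookimpl = pluggy.HookimplMarker("datasette")

@hookimpl(specname="prepare_connection")
def prepare_connection_for_my_plugin(conn):
    # Hypothetical plugin body.
    conn.execute("select 1")
```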

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
__call__() got an unexpected keyword argument 'specname' 1084257842  
997507074 https://github.com/simonw/sqlite-utils/issues/356#issuecomment-997507074 https://api.github.com/repos/simonw/sqlite-utils/issues/356 IC_kwDOCGYnMM47dMAC simonw 9599 2021-12-20T01:10:06Z 2021-12-20T01:16:11Z OWNER

Work-in-progress improved help:

```
Usage: sqlite-utils insert [OPTIONS] PATH TABLE FILE

  Insert records from FILE into a table, creating the table if it does not
  already exist.

  By default the input is expected to be a JSON array of objects. Or:

  - Use --nl for newline-delimited JSON objects
  - Use --csv or --tsv for comma-separated or tab-separated input
  - Use --lines to write each incoming line to a column called "line"
  - Use --all to write the entire input to a column called "all"

  You can also use --convert to pass a fragment of Python code that will be
  used to convert each input.

  Your Python code will be passed a "row" variable representing the imported
  row, and can return a modified row.

  If you are using --lines your code will be passed a "line" variable, and
  for --all an "all" variable.

Options:
  --pk TEXT                 Columns to use as the primary key, e.g. id
  --flatten                 Flatten nested JSON objects, so {"a": {"b": 1}} becomes {"a_b": 1}
  --nl                      Expect newline-delimited JSON
  -c, --csv                 Expect CSV input
  --tsv                     Expect TSV input
  --lines                   Treat each line as a single value called 'line'
  --all                     Treat input as a single value called 'all'
  --convert TEXT            Python code to convert each item
  --import TEXT             Python modules to import
  --delimiter TEXT          Delimiter to use for CSV files
  --quotechar TEXT          Quote character to use for CSV/TSV
  --sniff                   Detect delimiter and quote character
  --no-headers              CSV file has no header row
  --batch-size INTEGER      Commit every X records
  --alter                   Alter existing table to add any missing columns
  --not-null TEXT           Columns that should be created as NOT NULL
  --default <TEXT TEXT>...  Default value that should be set for a column
  --encoding TEXT           Character encoding for input, defaults to utf-8
  -d, --detect-types        Detect types for columns in CSV/TSV data
  --load-extension TEXT     SQLite extensions to load
  --silent                  Do not show progress bar
  --ignore                  Ignore records if pk already exists
  --replace                 Replace records if pk already exists
  --truncate                Truncate table before inserting records, if table already exists
  -h, --help                Show this message and exit.
```
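
To make the --lines/--convert behaviour described in that help concrete, here is a rough equivalent using the sqlite-utils Python API; the database file, table name, and sample lines are invented, and the CLI design above was still work in progress when this comment was written.

```python
# Sketch: the effect of `--lines --convert ...`, expressed via the Python API.
# Each incoming line is reshaped by a Python function before insertion.
import sqlite_utils

db = sqlite_utils.Database("demo.db")  # hypothetical database file

lines = [
    "2021-12-20 datasette 0.60",
    "2021-12-20 sqlite-utils 3.19",
]

def convert(line):
    # Plays the role of the "line" variable a --convert fragment would see.
    day, project, version = line.split()
    return {"day": day, "project": project, "version": version}

db["releases"].insert_all(convert(line) for line in lines)
print(list(db.query("select * from releases")))
```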

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`sqlite-utils insert --convert` option 1077431957  
997508728 https://github.com/simonw/sqlite-utils/issues/356#issuecomment-997508728 https://api.github.com/repos/simonw/sqlite-utils/issues/356 IC_kwDOCGYnMM47dMZ4 simonw 9599 2021-12-20T01:14:43Z 2021-12-20T01:14:43Z OWNER

(This makes me want --extract from #352 even more.)

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`sqlite-utils insert --convert` option 1077431957  
997502242 https://github.com/simonw/sqlite-utils/issues/163#issuecomment-997502242 https://api.github.com/repos/simonw/sqlite-utils/issues/163 IC_kwDOCGYnMM47dK0i simonw 9599 2021-12-20T00:56:45Z 2021-12-20T00:56:52Z OWNER

Maybe sqlite-utils should absorb all of the functionality from sqlite-transform - having two separate tools doesn't necessarily make sense.

I implemented that in:

  • #251

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Idea: conversions= could take Python functions 706001517  
997497262 https://github.com/simonw/sqlite-utils/issues/356#issuecomment-997497262 https://api.github.com/repos/simonw/sqlite-utils/issues/356 IC_kwDOCGYnMM47dJmu simonw 9599 2021-12-20T00:40:15Z 2021-12-20T00:40:15Z OWNER

--flatten could do with a better description too.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`sqlite-utils insert --convert` option 1077431957  
997496931 https://github.com/simonw/sqlite-utils/issues/356#issuecomment-997496931 https://api.github.com/repos/simonw/sqlite-utils/issues/356 IC_kwDOCGYnMM47dJhj simonw 9599 2021-12-20T00:39:14Z 2021-12-20T00:39:52Z OWNER

```
% sqlite-utils insert --help
Usage: sqlite-utils insert [OPTIONS] PATH TABLE JSON_FILE

  Insert records from JSON file into a table, creating the table if it does
  not already exist.

  Input should be a JSON array of objects, unless --nl or --csv is used.

Options:
  --pk TEXT                 Columns to use as the primary key, e.g. id
  --nl                      Expect newline-delimited JSON
  --flatten                 Flatten nested JSON objects
  -c, --csv                 Expect CSV
  --tsv                     Expect TSV
  --convert TEXT            Python code to convert each item
  --import TEXT             Python modules to import
  --delimiter TEXT          Delimiter to use for CSV files
  --quotechar TEXT          Quote character to use for CSV/TSV
  --sniff                   Detect delimiter and quote character
  --no-headers              CSV file has no header row
  --batch-size INTEGER      Commit every X records
  --alter                   Alter existing table to add any missing columns
  --not-null TEXT           Columns that should be created as NOT NULL
  --default <TEXT TEXT>...  Default value that should be set for a column
  --encoding TEXT           Character encoding for input, defaults to utf-8
  -d, --detect-types        Detect types for columns in CSV/TSV data
  --load-extension TEXT     SQLite extensions to load
  --silent                  Do not show progress bar
  --ignore                  Ignore records if pk already exists
  --replace                 Replace records if pk already exists
  --truncate                Truncate table before inserting records, if table already exists
  -h, --help                Show this message and exit.
```

I can add a bunch of extra help at the top there to explain all of this stuff. That "Input should be a JSON array of objects" bit could be expanded to several paragraphs.
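
For the default case that help describes ("Input should be a JSON array of objects"), a tiny illustrative sketch using the Python API; the records and table name are invented.

```python
# Sketch: the default insert path - a JSON array of objects becomes rows.
import json
import sqlite_utils

db = sqlite_utils.Database("demo.db")  # hypothetical database file
records = json.loads('[{"id": 1, "name": "datasette"}, {"id": 2, "name": "sqlite-utils"}]')
db["projects"].insert_all(records, pk="id")
```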

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`sqlite-utils insert --convert` option 1077431957  
997492872 https://github.com/simonw/sqlite-utils/issues/356#issuecomment-997492872 https://api.github.com/repos/simonw/sqlite-utils/issues/356 IC_kwDOCGYnMM47dIiI simonw 9599 2021-12-20T00:23:31Z 2021-12-20T00:23:31Z OWNER

I think this should work on JSON, or CSV, or individual lines, or the entire content at once.

So I'll require --lines --convert ... to import individual lines, or --all --convert to run the conversion against the entire input at once.

What would --lines or --all do without --convert? Maybe insert records as {"line": "line of text"} or {"all": "whole input"}.
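
A short sketch of that proposed fallback behaviour (rows shaped as {"line": ...} or {"all": ...}), again via the Python API with invented input:

```python
# Sketch: proposed behaviour of --lines and --all when --convert is not used.
import sqlite_utils

db = sqlite_utils.Database("demo.db")  # hypothetical database file
text = "first line\nsecond line"

# --lines: one row per input line, in a column called "line"
db["lines_demo"].insert_all({"line": line} for line in text.splitlines())

# --all: a single row holding the entire input, in a column called "all"
db["all_demo"].insert({"all": text})
```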

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
`sqlite-utils insert --convert` option 1077431957  


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);
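
The user and issue columns above are declared as foreign keys to the users and issues tables, so a join like the following recovers the labels shown in this view. This is a sketch against a hypothetical local github.db, assuming the users.login and issues.title columns that github-to-sqlite creates.

```python
# Sketch: follow the foreign keys declared in the schema above.
import sqlite_utils

db = sqlite_utils.Database("github.db")  # hypothetical local database
rows = db.query(
    """
    select issue_comments.id, users.login, issues.title
    from issue_comments
    join users on users.id = issue_comments.user
    join issues on issues.id = issue_comments.issue
    order by issue_comments.updated_at desc
    limit 10
    """
)
for row in rows:
    print(row)
```
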
Powered by Datasette · Queries took 1269.374ms · About: github-to-sqlite