
issue_comments


8 rows where "updated_at" is on date 2022-03-24 sorted by updated_at descending


issue 5

  • Document how to use a `--convert` function that runs initialization code first 4
  • when hashed urls are turned on, the _memory db has improperly long-lived cache expiry 1
  • don't set far expiry if hash is '000' 1
  • Make it easier to insert geometries, with documentation and maybe code 1
  • Mechanism for disabling faceting on large tables only 1

user 3

  • simonw 4
  • fgregg 3
  • eyeseast 1

author_association 2

  • CONTRIBUTOR 4
  • OWNER 4
id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions issue performed_via_github_app
1078343231 https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1078343231 https://api.github.com/repos/simonw/sqlite-utils/issues/420 IC_kwDOCGYnMM5ARjY_ simonw 9599 2022-03-24T21:16:10Z 2022-03-24T21:17:20Z OWNER

Aha! This may be possible already: https://github.com/simonw/sqlite-utils/blob/396f80fcc60da8dd844577114f7920830a2e5403/sqlite_utils/utils.py#L311-L316

And yes, this does indeed work - you can do something like this:

``` echo '{"name": "harry"}' | sqlite-utils insert db.db people - --convert ' import time

Simulate something expensive

time.sleep(1)

def convert(row): row["upper"] = row["name"].upper() ' And after running that: sqlite-utils dump db.db BEGIN TRANSACTION; CREATE TABLE [people] ( [name] TEXT, [upper] TEXT ); INSERT INTO "people" VALUES('harry','HARRY'); COMMIT; ``` So this is a documentation issue - there's a trick for it but I didn't know what the trick was!

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Document how to use a `--convert` function that runs initialization code first 1178546862  
1078328774 https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1078328774 https://api.github.com/repos/simonw/sqlite-utils/issues/420 IC_kwDOCGYnMM5ARf3G simonw 9599 2022-03-24T21:12:33Z 2022-03-24T21:12:33Z OWNER

Here's how the _compile_code() mechanism works at the moment: https://github.com/simonw/sqlite-utils/blob/396f80fcc60da8dd844577114f7920830a2e5403/sqlite_utils/utils.py#L308-L342

At the end it does this:

```python
return locals["fn"]
```

So it's already building and then returning a function.

The question is if there's a sensible way to allow people to further customize that function by executing some code first, in a way that's easy to explain.
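For illustration, here is a rough sketch of how an exec-and-return-a-function mechanism like that can work. This is not the actual sqlite-utils source; the `compile_code` helper name and its wrapping strategy are simplified assumptions:

```python
# Simplified illustration (not the real _compile_code): wrap the supplied code
# in a function definition, exec it, and hand back the resulting callable.
def compile_code(code: str):
    globals_ = {}
    locals_ = {}
    wrapped = "def fn(value):\n" + "\n".join(
        "    " + line for line in code.splitlines()
    )
    exec(wrapped, globals_, locals_)
    return locals_["fn"]

fn = compile_code("return value.upper()")
print(fn("harry"))  # HARRY
```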

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Document how to use a `--convert` function that runs initialization code first 1178546862  
1078322301 https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1078322301 https://api.github.com/repos/simonw/sqlite-utils/issues/420 IC_kwDOCGYnMM5AReR9 simonw 9599 2022-03-24T21:10:52Z 2022-03-24T21:10:52Z OWNER

I can think of three ways forward:

  • Figure out a pattern that gets that local file import workaround to work
  • Add another option such as --convert-init that lets you pass code that will be executed once at the start
  • Come up with a pattern where the --convert code can run some initialization code and then return a function which will be called against each value

I quite like the idea of that third option - I'm going to prototype it and see if I can work something out.
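To make that third option concrete, here is a hypothetical `--convert` body written in that style (the pyenchant usage and column names are illustrative assumptions, not part of the issue): the module-level lines would run once as initialization, and the defined function would then be called for each row.

```python
# Hypothetical --convert body for option three: top-level code is one-time
# initialization, and convert() is invoked once per row.
import enchant

d = enchant.Dict("en_US")  # expensive setup, runs a single time

def convert(row):
    row["spelled_ok"] = d.check(row["word"])
    return row
```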

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Document how to use a `--convert` function that runs initialization code first 1178546862  
1078315922 https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1078315922 https://api.github.com/repos/simonw/sqlite-utils/issues/420 IC_kwDOCGYnMM5ARcuS simonw 9599 2022-03-24T21:09:27Z 2022-03-24T21:09:27Z OWNER

Yeah, this is WAY harder than it should be.

There's a clumsy workaround you could use which looks something like this: create a file `my_enchant.py` containing:

```python
import enchant

d = enchant.Dict("en_US")

def check(word):
    return d.check(word)
```

Then run `sqlite-utils` like this:

```
PYTHONPATH=. cat items.json | jq '.data' | sqlite-utils insert listings.db listings - --convert 'my_enchant.check(value)' --import my_enchant
```

Except I tried that and it doesn't work! I don't know the right pattern for getting `--import` to work with modules in the same directory.

So yeah, this is definitely a big feature gap.
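One possible angle on that local-import gap (an editorial assumption, not something tried in the thread: it relies on the `--convert` code being ordinary Python executed inside the sqlite-utils process) is to add the working directory to `sys.path` from within the conversion code itself:

```python
# Hypothetical workaround: make the current directory importable from inside
# the --convert code so the local my_enchant.py module can be found, instead
# of depending on PYTHONPATH being set for the right process in the pipeline.
import os
import sys

sys.path.insert(0, os.getcwd())

import my_enchant  # the local module shown above

def convert(row):
    row["spelled_ok"] = my_enchant.check(row["word"])  # hypothetical columns
    return row
```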

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Document how to use a `--convert` function that runs initialization code first 1178546862  
1078126065 https://github.com/simonw/datasette/issues/1684#issuecomment-1078126065 https://api.github.com/repos/simonw/datasette/issues/1684 IC_kwDOBm6k_c5AQuXx fgregg 536941 2022-03-24T20:08:56Z 2022-03-24T20:13:19Z CONTRIBUTOR

would be nice if the behavior was

  1. try to facet all the columns
  2. for bigger tables try to facet the indexed columns
  3. for the biggest tables, turn off autofaceting completely

This is based on my assumption that what determines autofaceting is the rarity of unique values. Which may not be true!
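For the "facet only the indexed columns" idea above, a minimal sketch of how indexed columns can be discovered in SQLite; this uses plain `sqlite3` and `PRAGMA` calls and assumes nothing about Datasette's internals:

```python
# Sketch: list the columns of a table that are covered by an index, using
# SQLite's PRAGMA index_list / index_info. A faceting heuristic could limit
# itself to these columns on larger tables.
import sqlite3

def indexed_columns(conn, table):
    columns = set()
    for _, index_name, *_ in conn.execute(f"PRAGMA index_list('{table}')"):
        for _, _, column_name in conn.execute(f"PRAGMA index_info('{index_name}')"):
            columns.add(column_name)
    return columns

conn = sqlite3.connect("github.db")
print(indexed_columns(conn, "issue_comments"))  # e.g. {'issue', 'user'}
```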

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Mechanism for disabling faceting on large tables only 1179998071  
1077671779 https://github.com/simonw/sqlite-utils/issues/399#issuecomment-1077671779 https://api.github.com/repos/simonw/sqlite-utils/issues/399 IC_kwDOCGYnMM5AO_dj eyeseast 25778 2022-03-24T14:11:33Z 2022-03-24T14:11:43Z CONTRIBUTOR

Coming back to this. I was about to add a utility function to datasette-geojson to convert lat/lng columns to geometries. Thankfully I googled first. There's a SpatiaLite function for this: MakePoint.

```sql
select MakePoint(longitude, latitude) as geometry from places;
```

I'm not sure if that would work with conversions, since it needs two columns, but it's an option for tables that already have latitude, longitude columns.
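Since MakePoint needs both columns at once, here is a sketch of applying it as a single SQL UPDATE driven from Python rather than as a per-value conversion. It assumes SpatiaLite is installed and loadable as `mod_spatialite`, and a `places` table with `longitude` and `latitude` columns; the SRID and extension path are environment-dependent assumptions.

```python
# Sketch: populate a SpatiaLite POINT geometry from existing longitude and
# latitude columns. Assumes the mod_spatialite extension is available.
import sqlite3

conn = sqlite3.connect("places.db")
conn.enable_load_extension(True)
conn.load_extension("mod_spatialite")

conn.executescript(
    """
    SELECT InitSpatialMetadata(1);
    SELECT AddGeometryColumn('places', 'geometry', 4326, 'POINT', 'XY');
    UPDATE places SET geometry = MakePoint(longitude, latitude, 4326);
    """
)
conn.commit()
```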

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Make it easier to insert geometries, with documentation and maybe code 1124731464  
1077047295 https://github.com/simonw/datasette/issues/1581#issuecomment-1077047295 https://api.github.com/repos/simonw/datasette/issues/1581 IC_kwDOBm6k_c5AMm__ fgregg 536941 2022-03-24T04:08:18Z 2022-03-24T04:08:18Z CONTRIBUTOR

this has been addressed by the datasette-hashed-urls plugin

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
when hashed urls are turned on, the _memory db has improperly long-lived cache expiry 1089529555  
1077047152 https://github.com/simonw/datasette/pull/1582#issuecomment-1077047152 https://api.github.com/repos/simonw/datasette/issues/1582 IC_kwDOBm6k_c5AMm9w fgregg 536941 2022-03-24T04:07:58Z 2022-03-24T04:07:58Z CONTRIBUTOR

this has been obviated by the datasette-hashed-urls plugin

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
don't set far expiry if hash is '000' 1090055810  


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);