issue_comments

18 rows where "updated_at" is on date 2021-02-12 sorted by updated_at descending

user (8 values)

  • simonw 8
  • aborruso 2
  • leafgarland 2
  • bobwhitelock 2
  • robmarkcole 1
  • codecov[bot] 1
  • daniel-butler 1
  • RhetTbull 1

issue (8 values)

  • photo-to-sqlite: command not found 4
  • Installing datasette via docker: Path 'fixtures.db' does not exist 4
  • Support SSL/TLS directly 3
  • sqlite-utils insert: options for column types 2
  • --no-headers option for CSV and TSV 2
  • Use Data from SQLite in other commands 1
  • Add compile option to Dockerfile to fix failing test (fixes #696) 1
  • Error reading csv files with large column data 1

author_association (3 values)

  • OWNER 8
  • NONE 6
  • CONTRIBUTOR 4
id html_url issue_url node_id user created_at updated_at ▲ author_association body reactions issue performed_via_github_app
778511347 https://github.com/simonw/sqlite-utils/issues/228#issuecomment-778511347 https://api.github.com/repos/simonw/sqlite-utils/issues/228 MDEyOklzc3VlQ29tbWVudDc3ODUxMTM0Nw== simonw 9599 2021-02-12T23:27:50Z 2021-02-12T23:27:50Z OWNER

For the moment, a workaround can be to cat an additional row onto the start of the file.

echo "name,url,description" | cat - missing_headings.csv | sqlite-utils insert blah.db table - --csv
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
--no-headers option for CSV and TSV 807437089  
778510528 https://github.com/simonw/sqlite-utils/issues/131#issuecomment-778510528 https://api.github.com/repos/simonw/sqlite-utils/issues/131 MDEyOklzc3VlQ29tbWVudDc3ODUxMDUyOA== simonw 9599 2021-02-12T23:25:06Z 2021-02-12T23:25:06Z OWNER

If -c isn't available, maybe -t or --type would work for specifying column types:

    sqlite-utils insert db.db images images.tsv \
        --tsv \
        --type id int \
        --type score float

or

    sqlite-utils insert db.db images images.tsv \
        --tsv \
        -t id int \
        -t score float

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
sqlite-utils insert: options for column types 675753042  
778508887 https://github.com/simonw/sqlite-utils/issues/131#issuecomment-778508887 https://api.github.com/repos/simonw/sqlite-utils/issues/131 MDEyOklzc3VlQ29tbWVudDc3ODUwODg4Nw== simonw 9599 2021-02-12T23:20:11Z 2021-02-12T23:20:11Z OWNER

Annoyingly -c is currently a shortcut for --csv - so I'd have to do a major version bump to use that.

https://github.com/simonw/sqlite-utils/blob/726219c3503e77440975cd15b74d006639feb0f8/sqlite_utils/cli.py#L601-L603

Particularly annoying because I attempted to remove the -c shortcut in https://github.com/simonw/sqlite-utils/commit/2c00567aac6d9c79087cfff0d054f64922b1473d#diff-76294b3d4afeb27e74e738daa01c26dd4dc9ccb6f4477451483a2ece1095902eL48 but forgot to remove it from the input options (I removed it from the output options).
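For context, this is roughly what declaring a Click flag with a single-letter alias looks like; a minimal sketch rather than the actual sqlite-utils source, illustrating why -c is already spoken for on the insert command:

```python
import click


@click.command()
@click.option("-c", "--csv", "use_csv", is_flag=True, help="Treat input as CSV")
def insert(use_csv):
    """Stand-in for the insert command's option parsing (not the real code)."""
    # Because "-c" is registered here as an alias for --csv, reusing it for a
    # column-type option would be a backwards-incompatible change.
    click.echo(f"csv mode: {use_csv}")


if __name__ == "__main__":
    insert()
```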

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
sqlite-utils insert: options for column types 675753042  
778467759 https://github.com/simonw/datasette/issues/1220#issuecomment-778467759 https://api.github.com/repos/simonw/datasette/issues/1220 MDEyOklzc3VlQ29tbWVudDc3ODQ2Nzc1OQ== aborruso 30607 2021-02-12T21:35:17Z 2021-02-12T21:35:17Z NONE

Thank you

{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Installing datasette via docker: Path 'fixtures.db' does not exist 806743116  
778439617 https://github.com/simonw/datasette/issues/1220#issuecomment-778439617 https://api.github.com/repos/simonw/datasette/issues/1220 MDEyOklzc3VlQ29tbWVudDc3ODQzOTYxNw== bobwhitelock 7476523 2021-02-12T20:33:27Z 2021-02-12T20:33:27Z CONTRIBUTOR

That Docker command will mount your current directory inside the Docker container at /mnt - so you shouldn't need to change anything locally, just run

docker run -p 8001:8001 -v `pwd`:/mnt \
    datasetteproject/datasette \
    datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db

and it will use the fixtures.db file within your current directory

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Installing datasette via docker: Path 'fixtures.db' does not exist 806743116  
770069864 https://github.com/dogsheep/github-to-sqlite/issues/60#issuecomment-770069864 https://api.github.com/repos/dogsheep/github-to-sqlite/issues/60 MDEyOklzc3VlQ29tbWVudDc3MDA2OTg2NA== daniel-butler 22578954 2021-01-29T21:52:05Z 2021-02-12T18:29:43Z CONTRIBUTOR

For the purposes below I am assuming the organization I would get all the repositories and their related commits from is called gh-organization. The GitHub owner id of gh-organization is 123456789.

    github-to-sqlite repos github.db gh-organization

I'm on a Windows computer running Git Bash to be able to use the | command. This works for me:

    sqlite3 github.db "SELECT full_name FROM repos WHERE owner = '123456789';" | tr '\n\r' ' ' | xargs | { read repos; github-to-sqlite commits github.db $repos; }

On a pure Linux system I think this would work, because the newline character is normally \n:

    sqlite3 github.db "SELECT full_name FROM repos WHERE owner = '123456789';" | tr '\n' ' ' | xargs | { read repos; github-to-sqlite commits github.db $repos; }

As expected, I ran into rate limit issues #51

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Use Data from SQLite in other commands 797097140  
778349672 https://github.com/simonw/sqlite-utils/issues/228#issuecomment-778349672 https://api.github.com/repos/simonw/sqlite-utils/issues/228 MDEyOklzc3VlQ29tbWVudDc3ODM0OTY3Mg== simonw 9599 2021-02-12T18:00:43Z 2021-02-12T18:00:43Z OWNER

I could combine this with #131 to allow types to be specified in addition to column names.

Probably need an option that means "ignore the existing heading row and use this one instead".
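A rough sketch of that behaviour using only the standard library csv module (the column names here are hypothetical and this is not a sqlite-utils API): read past the file's own heading row and substitute the supplied one:

```python
import csv
import io

raw = "wrong_a,wrong_b,wrong_c\n1,Example,https://example.com\n"
replacement_headings = ["id", "name", "url"]  # hypothetical replacement headings

reader = csv.reader(io.StringIO(raw))
next(reader)  # ignore the existing heading row
rows = [dict(zip(replacement_headings, values)) for values in reader]
print(rows)  # [{'id': '1', 'name': 'Example', 'url': 'https://example.com'}]
```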

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
--no-headers option for CSV and TSV 807437089  
778349142 https://github.com/simonw/sqlite-utils/issues/227#issuecomment-778349142 https://api.github.com/repos/simonw/sqlite-utils/issues/227 MDEyOklzc3VlQ29tbWVudDc3ODM0OTE0Mg== simonw 9599 2021-02-12T17:59:35Z 2021-02-12T17:59:35Z OWNER

It looks like I can at least bump this size limit up to the maximum allowed by Python - I'll take a look at that.
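The limit in question is the csv module's field size limit, which defaults to 131072 characters. A common workaround, and roughly what "bump this up to the maximum allowed by Python" looks like in practice (a sketch, not necessarily the fix that shipped):

```python
import csv
import sys

# csv.field_size_limit() returns the current limit (131072 by default)
# and, when given an argument, sets a new one.
print(csv.field_size_limit())

# Raise it to the largest value the platform will accept.
max_int = sys.maxsize
while True:
    try:
        csv.field_size_limit(max_int)
        break
    except OverflowError:
        # On some platforms a C long is smaller than sys.maxsize.
        max_int = int(max_int / 10)
```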

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Error reading csv files with large column data 807174161  
778246347 https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-778246347 https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33 MDEyOklzc3VlQ29tbWVudDc3ODI0NjM0Nw== RhetTbull 41546558 2021-02-12T15:00:43Z 2021-02-12T15:00:43Z CONTRIBUTOR

Yes, the Big Sur Photos database doesn't have a ZGENERICASSET table. PR #31 will fix this.
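For anyone hitting this locally, a quick way to see which asset table a given Photos database actually has; a minimal sketch that assumes the newer schema calls the table ZASSET (that table name is the assumption here):

```python
import sqlite3


def asset_table_name(photos_db_path):
    """Return the name of the asset table this Photos database uses."""
    conn = sqlite3.connect(photos_db_path)
    try:
        names = {
            row[0]
            for row in conn.execute(
                "SELECT name FROM sqlite_master WHERE type = 'table'"
            )
        }
    finally:
        conn.close()
    # Older databases use ZGENERICASSET; newer ones are assumed to use ZASSET.
    for candidate in ("ZGENERICASSET", "ZASSET"):
        if candidate in names:
            return candidate
    raise ValueError("No known asset table found")


# Example: asset_table_name("/path/to/Photos.sqlite")
```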

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
photo-to-sqlite: command not found 803338729  
778014990 https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-778014990 https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33 MDEyOklzc3VlQ29tbWVudDc3ODAxNDk5MA== leafgarland 675335 2021-02-12T06:54:14Z 2021-02-12T06:54:14Z NONE

Ahh, that might be because macOS Big Sur has changed the structure of the photos db. Might need to wait for a later release; there is a PR which adds support for Big Sur.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
photo-to-sqlite: command not found 803338729  
778008752 https://github.com/simonw/datasette/issues/1220#issuecomment-778008752 https://api.github.com/repos/simonw/datasette/issues/1220 MDEyOklzc3VlQ29tbWVudDc3ODAwODc1Mg== aborruso 30607 2021-02-12T06:37:34Z 2021-02-12T06:37:34Z NONE

I have used my path; I'm running it from the folder in which I have the db.

Must I use an absolute path?

Must I create exactly that folder?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Installing datasette via docker: Path 'fixtures.db' does not exist 806743116  
778002092 https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-778002092 https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33 MDEyOklzc3VlQ29tbWVudDc3ODAwMjA5Mg== robmarkcole 11855322 2021-02-12T06:19:32Z 2021-02-12T06:19:32Z NONE

hi @leafgarland that results in a new error:

    (venv) (base) Robins-MacBook:datasette robin$ dogsheep-photos apple-photos photos.db
    Traceback (most recent call last):
      File "/Users/robin/datasette/venv/bin/dogsheep-photos", line 8, in <module>
        sys.exit(cli())
      File "/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py", line 829, in __call__
        return self.main(*args, **kwargs)
      File "/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py", line 782, in main
        rv = self.invoke(ctx)
      File "/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py", line 1259, in invoke
        return _process_result(sub_ctx.command.invoke(sub_ctx))
      File "/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py", line 1066, in invoke
        return ctx.invoke(self.callback, **ctx.params)
      File "/Users/robin/datasette/venv/lib/python3.8/site-packages/click/core.py", line 610, in invoke
        return callback(*args, **kwargs)
      File "/Users/robin/datasette/venv/lib/python3.8/site-packages/dogsheep_photos/cli.py", line 206, in apple_photos
        db.conn.execute(
    sqlite3.OperationalError: no such table: attached.ZGENERICASSET

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
photo-to-sqlite: command not found 803338729  
777951854 https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-777951854 https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33 MDEyOklzc3VlQ29tbWVudDc3Nzk1MTg1NA== leafgarland 675335 2021-02-12T03:54:39Z 2021-02-12T03:54:39Z NONE

I think that is a typo in the docs; you can use

> dogsheep-photos apple-photos photos.db
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
photo-to-sqlite: command not found 803338729  
777949755 https://github.com/simonw/datasette/pull/1223#issuecomment-777949755 https://api.github.com/repos/simonw/datasette/issues/1223 MDEyOklzc3VlQ29tbWVudDc3Nzk0OTc1NQ== codecov[bot] 22429695 2021-02-12T03:45:31Z 2021-02-12T03:45:31Z NONE

Codecov Report

Merging #1223 (d1cd1f2) into main (9603d89) will not change coverage. The diff coverage is n/a.

```diff
@@        Coverage Diff        @@
           main    #1223   +/-
=================================
  Coverage   91.42%   91.42%
=================================
  Files          32       32
  Lines        3955     3955
=================================
  Hits         3616     3616
  Misses        339      339
```


Continue to review full report at Codecov.

Legend - Click here to learn more Δ = absolute <relative> (impact), ø = not affected, ? = missing data Powered by Codecov. Last update 9603d89...d1cd1f2. Read the comment docs.

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Add compile option to Dockerfile to fix failing test (fixes #696) 806918878  
777927946 https://github.com/simonw/datasette/issues/1220#issuecomment-777927946 https://api.github.com/repos/simonw/datasette/issues/1220 MDEyOklzc3VlQ29tbWVudDc3NzkyNzk0Ng== bobwhitelock 7476523 2021-02-12T02:29:54Z 2021-02-12T02:29:54Z CONTRIBUTOR

According to https://github.com/simonw/datasette/blob/master/docs/installation.rst#using-docker it should be

docker run -p 8001:8001 -v `pwd`:/mnt \
    datasetteproject/datasette \
    datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db

This uses /mnt/fixtures.db whereas you're using fixtures.db - did you try using this path instead?

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Installing datasette via docker: Path 'fixtures.db' does not exist 806743116  
777901052 https://github.com/simonw/datasette/issues/1221#issuecomment-777901052 https://api.github.com/repos/simonw/datasette/issues/1221 MDEyOklzc3VlQ29tbWVudDc3NzkwMTA1Mg== simonw 9599 2021-02-12T01:09:54Z 2021-02-12T01:09:54Z OWNER

I also tested this manually. I generated certificate files like so:

cd /tmp
python -m trustme

This created /tmp/server.pem, /tmp/client.pem and /tmp/server.key

Then I started Datasette like this:

datasette --memory --ssl-keyfile=/tmp/server.key --ssl-certfile=/tmp/server.pem

And exercised it using curl like so:

/tmp % curl --cacert /tmp/client.pem 'https://localhost:8001/_memory.json'
{"database": "_memory", "path": "/_memory", "size": 0, "tables": [], "hidden_count": 0, "views": [], "queries": [],
"private": false, "allow_execute_sql": true, "query_ms": 0.8843200000114848}

Note that without the --cacert option I get an error:

```
/tmp % curl 'https://localhost:8001/_memory.json'
curl: (60) SSL certificate problem: Invalid certificate chain
More details here: https://curl.haxx.se/docs/sslcerts.html

curl failed to verify the legitimacy of the server and therefore
could not establish a secure connection to it. To learn more about this
situation and how to fix it, please visit the web page mentioned above.
```

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Support SSL/TLS directly 806849424  
777887190 https://github.com/simonw/datasette/issues/1221#issuecomment-777887190 https://api.github.com/repos/simonw/datasette/issues/1221 MDEyOklzc3VlQ29tbWVudDc3Nzg4NzE5MA== simonw 9599 2021-02-12T00:29:18Z 2021-02-12T00:29:18Z OWNER

I can use this recipe to start a datasette server in a sub-process during the pytest run and exercise it with real HTTP requests: https://til.simonwillison.net/pytest/subprocess-server
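The linked TIL boils down to a session-scoped fixture along these lines; a minimal sketch (the port and timeout are arbitrary choices), not the recipe verbatim:

```python
import socket
import subprocess
import time

import pytest


@pytest.fixture(scope="session")
def datasette_server():
    # Launch a real datasette process serving an in-memory database
    # on an arbitrary local port.
    process = subprocess.Popen(["datasette", "--memory", "-p", "8041"])
    try:
        # Poll until the port accepts connections (give up after ~5 seconds).
        for _ in range(50):
            try:
                socket.create_connection(("127.0.0.1", 8041), timeout=0.1).close()
                break
            except OSError:
                time.sleep(0.1)
        yield "http://127.0.0.1:8041"
    finally:
        process.terminate()
        process.wait()
```

A test that requests this fixture can then make real HTTP requests against the returned base URL.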

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Support SSL/TLS directly 806849424  
777883452 https://github.com/simonw/datasette/issues/1221#issuecomment-777883452 https://api.github.com/repos/simonw/datasette/issues/1221 MDEyOklzc3VlQ29tbWVudDc3Nzg4MzQ1Mg== simonw 9599 2021-02-12T00:19:30Z 2021-02-12T00:19:40Z OWNER

Uvicorn supports these options: https://www.uvicorn.org/#command-line-options

```
--ssl-keyfile TEXT             SSL key file
--ssl-certfile TEXT            SSL certificate file
--ssl-keyfile-password TEXT    SSL keyfile password
--ssl-version INTEGER          SSL version to use (see stdlib ssl module's)
                               [default: 2]
--ssl-cert-reqs INTEGER        Whether client certificate is required (see
                               stdlib ssl module's)  [default: 0]
--ssl-ca-certs TEXT            CA certificates file
--ssl-ciphers TEXT             Ciphers to use (see stdlib ssl module's)
                               [default: TLSv1]
```

For the moment I'm going to support just --ssl-keyfile and --ssl-certfile as arguments to datasette serve. I'll add other options if people ask for them.
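Those two flags correspond to keyword arguments that uvicorn.run() accepts directly; a minimal sketch of how they could be wired through, using a stand-in ASGI app rather than Datasette's real serve code:

```python
import uvicorn


async def app(scope, receive, send):
    # Tiny ASGI app standing in for the Datasette application.
    assert scope["type"] == "http"
    await send({"type": "http.response.start", "status": 200, "headers": []})
    await send({"type": "http.response.body", "body": b"ok"})


if __name__ == "__main__":
    uvicorn.run(
        app,
        host="0.0.0.0",
        port=8001,
        ssl_keyfile="/tmp/server.key",   # value of --ssl-keyfile
        ssl_certfile="/tmp/server.pem",  # value of --ssl-certfile
    )
```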

{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
Support SSL/TLS directly 806849424  

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);
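Given that schema, the filter shown at the top of this page corresponds to a simple query; a sketch using Python's sqlite3 module against a local copy of the database (the github.db path is an assumption):

```python
import sqlite3

conn = sqlite3.connect("github.db")  # assumed local copy of this database
conn.row_factory = sqlite3.Row

rows = conn.execute(
    """
    SELECT id, user, updated_at, body
    FROM issue_comments
    WHERE date(updated_at) = '2021-02-12'
    ORDER BY updated_at DESC
    """
).fetchall()

for row in rows:
    print(row["id"], row["updated_at"], row["body"][:60])
```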