

issue_comments


4 rows where user = 813732 sorted by updated_at descending




Facets:
  • issue (2 values): Serve all db files in a folder (3), Deploy a live instance of demos/apache-proxy (1)
  • user (1 value): glasnt (4)
  • author_association (1 value): CONTRIBUTOR (4)
Columns: id, html_url, issue_url, node_id, user, created_at, updated_at (sort column, descending), author_association, body, reactions, issue, performed_via_github_app
id: 976117989
html_url: https://github.com/simonw/datasette/issues/1522#issuecomment-976117989
issue_url: https://api.github.com/repos/simonw/datasette/issues/1522
node_id: IC_kwDOBm6k_c46LmDl
user: glasnt (813732)
created_at: 2021-11-23T03:00:34Z
updated_at: 2021-11-23T03:00:34Z
author_association: CONTRIBUTOR

I tried deploying the most recent version of the Dockerfile in this thread (link to comment), and after trying a few different combinations, I was only successful when I used --no-cpu-throttling ("CPU is always allocated" in the UI).
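
A minimal sketch of the kind of deploy command this describes, with --no-cpu-throttling set; the service name, image path, and region below are placeholders, and the flag may require a recent gcloud (or the beta component) depending on the version installed:

# --no-cpu-throttling corresponds to "CPU is always allocated" in the Cloud Run UI
gcloud run deploy datasette-apache-proxy \
    --image gcr.io/my-project/datasette-apache-proxy \
    --region us-central1 \
    --allow-unauthenticated \
    --no-cpu-throttling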

Using this method, I ran into a very similar issue to yours: the first time I'd load the site I'd get a 503, but after that first load I didn't get the issue again. It would recur whenever the service started from a cold boot.

I suspect this is a race condition in the supervisord configuration. The errors I got were the same "Connection refused: AH00957: http: attempt to connect to 127.0.0.1:8001 (127.0.0.1) failed", which seems to indicate that datasette hadn't yet started.

Looking at the order in which the logs came back, the processes reported that they had successfully finished loading only after the first 503 had been returned, which is what makes me think this is a race condition.

I can replicate this locally if I docker run the image and request localhost:5000/prefix before the datasette entered RUNNING state message appears. Cloud Run wakes up when requests are received, so this test semi-replicates that; local Docker is the equivalent of a persistent process, which is why it doesn't normally exhibit the same issue.
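
A rough sketch of that local replication; the image tag, port mapping, and container id are assumptions about how the demo container is built and run:

# Build and run the apache-proxy demo image locally (name and ports are assumptions)
docker build -t datasette-apache-proxy .
docker run --rm -d -p 5000:5000 datasette-apache-proxy

# Watch the supervisord output for "datasette entered RUNNING state":
docker logs -f <container-id>

# Requesting the proxied prefix before that message appears reproduces the 503:
curl -i http://localhost:5000/prefix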

Unfortunately, supervisor/supervisor issue 122 (not linking, so as to prevent cross-project link spam) seems to say that dependency chaining is a feature that has been requested for a long time but hasn't been implemented. You could try some of the suggestions in that thread.
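
One workaround pattern (a sketch of the general idea, not something taken from that thread) is to wrap the Apache command in a small script that blocks until Datasette answers on 127.0.0.1:8001 before starting; the file name, the use of curl inside the container, and the final Apache command are all assumptions:

#!/bin/sh
# wait-for-datasette.sh: poll until datasette responds on 127.0.0.1:8001,
# then exec whatever command supervisord would otherwise run for Apache.
until curl -sf http://127.0.0.1:8001/ > /dev/null; do
    sleep 0.5
done
exec apachectl -D FOREGROUND   # assumed Apache start command; adjust to match the Dockerfile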

reactions:
{
    "total_count": 1,
    "+1": 1,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Deploy a live instance of demos/apache-proxy (1058896236)
id: 967747190
html_url: https://github.com/simonw/datasette/issues/1380#issuecomment-967747190
issue_url: https://api.github.com/repos/simonw/datasette/issues/1380
node_id: IC_kwDOBm6k_c45rqZ2
user: glasnt (813732)
created_at: 2021-11-13T00:47:26Z
updated_at: 2021-11-13T00:47:26Z
author_association: CONTRIBUTOR

Would it make sense to run datasette with fswatch/inotifywait on the folder, then?
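
As a rough sketch of that idea (the data directory and port are placeholders; inotifywait comes from inotify-tools, with fswatch as the macOS equivalent):

# Restart datasette whenever database files in the folder change
while true; do
    datasette serve /data --port 8001 &
    DATASETTE_PID=$!
    # Block until something is added, removed, moved, or written in /data
    inotifywait -e create -e delete -e move -e close_write /data
    kill "$DATASETTE_PID"
done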

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Serve all db files in a folder (924748955)
id: 953366110
html_url: https://github.com/simonw/datasette/issues/1380#issuecomment-953366110
issue_url: https://api.github.com/repos/simonw/datasette/issues/1380
node_id: IC_kwDOBm6k_c440zZe
user: glasnt (813732)
created_at: 2021-10-27T22:48:55Z
updated_at: 2021-10-27T22:48:55Z
author_association: CONTRIBUTOR

It looks like if the files argument is a directory, config_dir is set, but the files in that folder are only loaded into self.files when the Datasette class is initialised.

I tried to see if I could get --reload to work, but I'm running into issues using that option when specifying a directory, as the serve command itself ends up in the files list(?):

$ datasette serve . --reload
Error: Invalid value for '[FILES]...': Path 'serve' does not exist.

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Serve all db files in a folder (924748955)
id: 953334718
html_url: https://github.com/simonw/datasette/issues/1380#issuecomment-953334718
issue_url: https://api.github.com/repos/simonw/datasette/issues/1380
node_id: IC_kwDOBm6k_c440ru-
user: glasnt (813732)
created_at: 2021-10-27T21:45:04Z
updated_at: 2021-10-27T21:45:04Z
author_association: CONTRIBUTOR

I am also getting this issue, using the most recent version of datasette:

$ datasette --version
datasette, version 0.59.1

If I run datasette from within a folder of database files:

$ datasette serve .

Adding new files while datasette is running doesn't make them show up, and removing files causes datasette to return 500 errors:

home
Error 500
[Errno 2] No such file or directory: 'mydatabase.db'
Powered by Datasette
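
A rough reconstruction of those reproduction steps (file and table names are just examples):

mkdir demo && cd demo
sqlite3 mydatabase.db "CREATE TABLE t (id INTEGER PRIMARY KEY);"
datasette serve .      # the index page lists mydatabase.db

# In another terminal, while datasette is still running:
sqlite3 other.db "CREATE TABLE t (id INTEGER PRIMARY KEY);"   # never shows up
rm mydatabase.db                                              # the index page now returns the 500 above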

reactions:
{
    "total_count": 0,
    "+1": 0,
    "-1": 0,
    "laugh": 0,
    "hooray": 0,
    "confused": 0,
    "heart": 0,
    "rocket": 0,
    "eyes": 0
}
issue: Serve all db files in a folder (924748955)

CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);
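
To reproduce this page's filter directly against the underlying database, a query along these lines should return the same four rows (the github.db file name is an assumption about how github-to-sqlite was invoked):

sqlite3 github.db "
    SELECT id, issue, created_at, updated_at
    FROM issue_comments
    WHERE user = 813732
    ORDER BY updated_at DESC;
"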