issue_comments
9 rows where issue = 570101428, "updated_at" is on date 2020-02-24 and user = 9599 sorted by updated_at descending
| id | html_url | issue_url | node_id | user | created_at | updated_at ▲ | author_association | body | reactions | issue | performed_via_github_app |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 590608228 | https://github.com/simonw/datasette/pull/683#issuecomment-590608228 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDYwODIyOA== | simonw 9599 | 2020-02-24T23:52:35Z | 2020-02-24T23:52:35Z | OWNER | I'm going to punt on the ability to introspect the write queue and poll for completion using a UUID for the moment. Can add those later. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
| 590607385 | https://github.com/simonw/datasette/pull/683#issuecomment-590607385 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDYwNzM4NQ== | simonw 9599 | 2020-02-24T23:49:37Z | 2020-02-24T23:49:37Z | OWNER | Here's the code:

    class UploadApp(HTTPEndpoint):
        def __init__(self, scope, receive, send, datasette):
            self.datasette = datasette
            super().__init__(scope, receive, send)

    @hookimpl
    def asgi_wrapper(datasette):
        def wrap_with_asgi_auth(app):
            async def wrapped_app(scope, recieve, send):
                if scope["path"] == "/-/upload-csv":
                    await UploadApp(scope, recieve, send, datasette)
                else:
                    await app(scope, recieve, send)
| {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
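The `asgi_wrapper` snippet in the comment above is cut off: it never returns `wrapped_app` or `wrap_with_asgi_auth`, and the body of `UploadApp` is not shown. A minimal sketch of the complete routing pattern being described, assuming the standard `asgi_wrapper` plugin hook contract (`UploadApp` here is the endpoint class from the comment above, still omitted):

```python
from datasette import hookimpl


@hookimpl
def asgi_wrapper(datasette):
    def wrap_with_asgi_auth(app):
        async def wrapped_app(scope, receive, send):
            # Route requests for the plugin's own path to the custom
            # endpoint; pass everything else through to Datasette.
            if scope["path"] == "/-/upload-csv":
                await UploadApp(scope, receive, send, datasette)
            else:
                await app(scope, receive, send)

        return wrapped_app

    return wrap_with_asgi_auth
```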
| 590606825 | https://github.com/simonw/datasette/pull/683#issuecomment-590606825 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDYwNjgyNQ== | simonw 9599 | 2020-02-24T23:47:38Z | 2020-02-24T23:47:38Z | OWNER | Another demo plugin:

    class DeleteTableApp(HTTPEndpoint):
        def __init__(self, scope, receive, send, datasette):
            self.datasette = datasette
            super().__init__(scope, receive, send)

    @hookimpl
    def asgi_wrapper(datasette):
        def wrap_with_asgi_auth(app):
            async def wrapped_app(scope, recieve, send):
                if scope["path"] == "/-/delete-table":
                    await DeleteTableApp(scope, recieve, send, datasette)
                else:
                    await app(scope, recieve, send)

    {% block content %}
    <form action="/-/delete-table" method="POST"></form>
    {{ super() }}
    {% endblock %}

(Needs CSRF protection added) I ran Datasette like this: […]
Result: I can delete tables!
| {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
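The comment above shows the wrapper and the template for `DeleteTableApp` but not the endpoint's POST handler. A rough sketch of what that handler could look like, pushing the `DROP TABLE` through the new `execute_write()` API; the form field name, the hard-coded `"data"` database name and the `block=True` keyword are assumptions for illustration, and (as the comment notes) there is no CSRF protection here:

```python
from starlette.endpoints import HTTPEndpoint
from starlette.responses import RedirectResponse


class DeleteTableApp(HTTPEndpoint):
    def __init__(self, scope, receive, send, datasette):
        self.datasette = datasette
        super().__init__(scope, receive, send)

    async def post(self, request):
        formdata = await request.form()
        table = formdata["table"]  # hypothetical form field
        db = self.datasette.databases["data"]  # assumed database name
        # Queue the DROP TABLE on the write thread and wait for it to finish.
        await db.execute_write(
            "DROP TABLE [{}]".format(table), block=True
        )
        return RedirectResponse("/data", status_code=302)
```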
| 590599257 | https://github.com/simonw/datasette/pull/683#issuecomment-590599257 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDU5OTI1Nw== | simonw 9599 | 2020-02-24T23:21:56Z | 2020-02-24T23:22:35Z | OWNER | Also: are UUIDs really necessary here or could I use a simpler form of task identifier? Like an in-memory counter variable that starts at 0 and increments every time this instance of Datasette issues a new task ID? The neat thing about UUIDs is that I don't have to worry if there are multiple Datasette instances accepting writes behind a load balancer. That seems pretty unlikely (especially considering SQLite databases encourage only one process to be writing at a time)... but I am experimenting with PostgreSQL support in #670 so it's probably worth ensuring these task IDs really are globally unique. I'm going to stick with UUIDs. They're short-lived enough that their size doesn't really matter. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
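To make the trade-off in the comment above concrete, here is a small illustrative sketch of the two task-identifier schemes being weighed (the function names are made up for this example): an in-memory counter is simpler but only unique within a single process, while a UUID stays unique even with several Datasette instances accepting writes behind a load balancer.

```python
import itertools
import uuid

# Option 1: in-memory counter -- cheap, but only unique within this
# one Datasette process.
_task_counter = itertools.count()


def next_counter_task_id():
    return next(_task_counter)


# Option 2: UUID -- globally unique, so it still works if multiple
# instances are issuing task IDs at the same time.
def next_uuid_task_id():
    return str(uuid.uuid4())
```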
| 590598689 | https://github.com/simonw/datasette/pull/683#issuecomment-590598689 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDU5ODY4OQ== | simonw 9599 | 2020-02-24T23:20:11Z | 2020-02-24T23:20:11Z | OWNER | I think […] But is it weird having a function that returns different types depending on if you passed […]? I'm OK with the […] | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
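The comment above is truncated, but the question it raises - should one method return different types depending on a `block=` flag - can be illustrated with a small self-contained sketch. This is purely hypothetical and far simpler than Datasette's actual write queue: with `block=True` the caller gets the queued function's return value, with `block=False` it just gets a task ID back immediately.

```python
import asyncio
import uuid


class TinyWriteQueue:
    # Hypothetical illustration: the same coroutine returns either the
    # queued function's result (block=True) or a task ID (block=False).

    def __init__(self):
        self._queue = asyncio.Queue()

    async def execute_write_fn(self, fn, block=False):
        task_id = str(uuid.uuid4())
        future = asyncio.get_running_loop().create_future()
        await self._queue.put((fn, future))
        if block:
            return await future  # whatever fn() returned
        return task_id  # caller could poll for completion later

    async def worker(self):
        # A single consumer applies the queued writes one at a time.
        while True:
            fn, future = await self._queue.get()
            future.set_result(fn())
```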
| 590598248 | https://github.com/simonw/datasette/pull/683#issuecomment-590598248 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDU5ODI0OA== | simonw 9599 | 2020-02-24T23:18:50Z | 2020-02-24T23:18:50Z | OWNER | I'm not convinced by the return value of the […]. Do I really need that […]? | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
| 590593120 | https://github.com/simonw/datasette/pull/683#issuecomment-590593120 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDU5MzEyMA== | simonw 9599 | 2020-02-24T23:02:30Z | 2020-02-24T23:02:30Z | OWNER | I'm going to muck around with a couple more demo plugins - in particular one derived from datasette-upload-csvs - to make sure I'm comfortable with this API - then add a couple of tests and merge it with documentation that warns "this is still an experimental feature and may change". | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
| 590592581 | https://github.com/simonw/datasette/pull/683#issuecomment-590592581 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDU5MjU4MQ== | simonw 9599 | 2020-02-24T23:00:44Z | 2020-02-24T23:01:09Z | OWNER | I've been testing this out by running one-off demo plugins. I saved the following in a file called […]:

    class AsgiLogToSqliteViaWriteQueue:
        lookup_columns = (
            "path",
            "user_agent",
            "referer",
            "accept_language",
            "content_type",
            "query_string",
        )

    def header(d, name):
        return d.get(name.encode("utf8"), b"").decode("utf8") or None

    def lookup(db, table, value):
        return db[table].lookup({"name": value}) if value else None

    @hookimpl
    def asgi_wrapper(datasette):
        def wrap_with_class(app):
            return AsgiLogToSqliteViaWriteQueue(
                app, next(iter(datasette.databases.values()))
            )
| {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
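The `AsgiLogToSqliteViaWriteQueue` fragment above omits the middleware's `__call__` method and the code that actually records a request. A rough sketch of how such a logger might hand each request off to the write queue via `execute_write_fn()`; the table name, the columns logged and the use of `sqlite_utils` inside the write function are assumptions based on the fragment, not the full original plugin:

```python
import time

import sqlite_utils


class AsgiLogToSqliteViaWriteQueue:
    def __init__(self, app, db):
        self.app = app
        self.db = db  # a datasette Database object exposing execute_write_fn()

    async def __call__(self, scope, receive, send):
        start = time.time()
        await self.app(scope, receive, send)
        duration_ms = (time.time() - start) * 1000
        path = scope.get("path", "")

        def write_log_row(conn):
            # Runs on the dedicated write thread with a sqlite3 connection;
            # wrap it in sqlite_utils for a convenient insert API.
            sqlite_utils.Database(conn)["requests"].insert(
                {"path": path, "duration_ms": duration_ms}
            )

        # Queue the insert without waiting for it to be applied.
        await self.db.execute_write_fn(write_log_row, block=False)
```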
| 590518182 | https://github.com/simonw/datasette/pull/683#issuecomment-590518182 | https://api.github.com/repos/simonw/datasette/issues/683 | MDEyOklzc3VlQ29tbWVudDU5MDUxODE4Mg== | simonw 9599 | 2020-02-24T19:53:12Z | 2020-02-24T19:53:12Z | OWNER | Next steps are from comment https://github.com/simonw/datasette/issues/682#issuecomment-590517338 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | .execute_write() and .execute_write_fn() methods on Database 570101428 | |
CREATE TABLE [issue_comments] (
[html_url] TEXT,
[issue_url] TEXT,
[id] INTEGER PRIMARY KEY,
[node_id] TEXT,
[user] INTEGER REFERENCES [users]([id]),
[created_at] TEXT,
[updated_at] TEXT,
[author_association] TEXT,
[body] TEXT,
[reactions] TEXT,
[issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
ON [issue_comments] ([user]);
