issues
8 rows where state = "closed" and user = 2181410 sorted by updated_at descending
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | pull_request | body | repo | type | active_lock_reason | performed_via_github_app | reactions | draft | state_reason |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
539590148 | MDU6SXNzdWU1Mzk1OTAxNDg= | 651 | fts5 syntax error when using punctuation | clausjuhl 2181410 | closed | 0 | | | 3 | 2019-12-18T10:25:35Z | 2021-07-14T19:26:06Z | 2019-12-30T06:42:55Z | NONE | | Hi Simon. I get a syntax error when using punctuation or special characters in a full-text search (using fts5). I created the virtual table using sqlite-utils' "enable-fts" command. The same error appears on Niche Museums https://www.niche-museums.com/browse/search?q=park., but searching works fine in most of your other Datasette examples, e.g. register-of-members-interests https://register-of-members-interests.datasettes.com/regmem-98dc8b7/items?_search=mins. What am I doing wrong? Many thanks! | datasette 107914493 | issue | | | { "url": "https://api.github.com/repos/simonw/datasette/issues/651/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
760312579 | MDU6SXNzdWU3NjAzMTI1Nzk= | 1134 | "_searchmode=raw" throws an index out of range error when combined with "_search_COLUMN" | clausjuhl 2181410 | closed | 0 | | | 4 | 2020-12-09T13:05:37Z | 2020-12-10T05:57:17Z | 2020-12-09T19:56:55Z | NONE | | Hi Simon! Maybe it's just me, but when using _searchmode=raw (trying to enable wildcard searching) in combination with the "_search_COLUMN" table argument, I get a list index out of range error. When combining it with the simpler "_search" argument everything works, including wildcard searches. Here's the traceback (a short illustration of how this failure can arise appears below the table): ``` Traceback (most recent call last): File "/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/utils/asgi.py", line 122, in route_path return await view(new_scope, receive, send) File "/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/utils/asgi.py", line 196, in view request, scope["url_route"]["kwargs"] File "/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/base.py", line 204, in get request, database, hash, correct_hash_provided, kwargs File "/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/base.py", line 342, in view_get request, database, hash, **kwargs File "/Users/cjk/.local/share/virtualenvs/minutes-jMDZ8Ssk/lib/python3.7/site-packages/datasette/views/table.py", line 393, in data search_col = key.split("search", 1)[1] IndexError: list index out of range ``` | datasette 107914493 | issue | | | { "url": "https://api.github.com/repos/simonw/datasette/issues/1134/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
612378203 | MDU6SXNzdWU2MTIzNzgyMDM= | 757 | Question: Any fixed date for the release with the uft8-encoding fix? | clausjuhl 2181410 | closed | 0 | | | 3 | 2020-05-05T06:51:20Z | 2020-05-06T18:41:29Z | 2020-05-06T18:41:29Z | NONE | | Just a little impatient :) | datasette 107914493 | issue | | | { "url": "https://api.github.com/repos/simonw/datasette/issues/757/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
611835285 | MDU6SXNzdWU2MTE4MzUyODU= | 752 | Non-utf8 encoding in exceptionhandlers and custom-pages | clausjuhl 2181410 | closed | 0 | | | 1 | 2020-05-04T12:24:42Z | 2020-05-04T17:42:20Z | 2020-05-04T17:42:20Z | NONE | | Hi Simon. Whenever a response is not piped through a router view, the template is encoded in latin-1 (I think). This is especially a problem (for me) with the new custom_pages functionality, but it is also problematic with the 404 and 500 handlers. Thanks! | datasette 107914493 | issue | | | { "url": "https://api.github.com/repos/simonw/datasette/issues/752/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
569317377 | MDU6SXNzdWU1NjkzMTczNzc= | 681 | Cashe-header missing in http-response | clausjuhl 2181410 | closed | 0 | | | 4 | 2020-02-22T10:50:45Z | 2020-02-24T20:53:57Z | 2020-02-24T20:53:56Z | NONE | | Hi Simon. I need some help with both understanding and adding http headers. If I call datasette on localhost with --config default_cache_ttl:120 and --cors, I only get the following response headers: access-control-allow-origin: *, content-type: text/html; charset=utf-8, date: Sat, 22 Feb 2020 10:32:15 GMT, referrer-policy: no-referrer, server: uvicorn, transfer-encoding: chunked. CORS works, but no caching header is set? The same thing happens if I use the command in a Dockerfile and run datasette with Docker. Second, how can one add headers to uvicorn? I've tried to add uvicorn commands to the Dockerfile, before the final datasette command, but it doesn't work. Is there any way to add headers to the uvicorn.run() command in datasette? In particular, I would like to add some of the missing security headers (see the asgi_wrapper sketch below the table for one way to add response headers). Thank you for a great product! | datasette 107914493 | issue | | | { "url": "https://api.github.com/repos/simonw/datasette/issues/681/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
556814876 | MDU6SXNzdWU1NTY4MTQ4NzY= | 662 | Escape_fts5_query-hookimplementation does not work with queries to standard tables | clausjuhl 2181410 | closed | 0 | | | 5 | 2020-01-29T11:56:03Z | 2020-01-30T00:30:20Z | 2020-01-30T00:30:19Z | NONE | | Hi Simon. Thank you for adding the escape function, but it does not work on my datasette installation (0.33). I've added the following file to my datasette dir, plugins/sql_functions.py: `from datasette import hookimpl def escape_fts_query(query): bits = query.split() return ' '.join('"{}"'.format(bit.replace('"', '')) for bit in bits) @hookimpl def prepare_connection(conn): conn.create_function("escape_fts_query", 1, escape_fts_query)` (a cleaned-up, runnable version of this plugin is reproduced below the table). It has no effect on the standard queries to the tables though, as they still produce errors when including any characters like '-', '/', '+' or '?'. Does the function only work when using custom queries, where I can include the escape_fts function explicitly in the SQL query? PS. I'm calling datasette with --plugins=plugins, and my other plugins work just fine. PPS. The fts5 virtual table is created with 'sqlite3' like so: Thanks! Originally posted by @clausjuhl in https://github.com/simonw/datasette/issues/651#issuecomment-579675357 | datasette 107914493 | issue | | | { "url": "https://api.github.com/repos/simonw/datasette/issues/662/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
459936585 | MDU6SXNzdWU0NTk5MzY1ODU= | 527 | Unable to use rank when fts-table generated with csvs-to-sqlite | clausjuhl 2181410 | closed | 0 | | | 3 | 2019-06-24T14:49:48Z | 2019-06-24T15:21:18Z | 2019-06-24T15:09:10Z | NONE | | Hi Simon. If I generate a fts table with csvs-to-sqlite's -f option, I'm unable to use (in datasette's GUI) the internal ranking of the table for sorting or viewing, but if I generate the fts table with the enable-fts argument from sqlite-utils, everything works ok. E.g.: datasette, version 0.28; sqlite-utils, version 1.2.1; csvs-to-sqlite, version 0.9. No column named rank with these commands: $ csvs-to-sqlite minutes.csv minutes.db -f text_data $ datasette -i minutes.db select rank, * from minutes_fts where minutes_fts match 'dog' Everything ok with these commands: $ csvs-to-sqlite minutes.csv minutes.db $ sqlite-utils enable-fts minutes.db text_data $ datasette -i minutes.db select rank, * from minutes_fts where minutes_fts match 'dog' Am I doing something wrong? Thank you for a great application! | datasette 107914493 | issue | | | { "url": "https://api.github.com/repos/simonw/datasette/issues/527/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
448189298 | MDU6SXNzdWU0NDgxODkyOTg= | 486 | Ability to add extra routes and related templates | clausjuhl 2181410 | closed | 0 | | | 2 | 2019-05-24T14:04:25Z | 2019-05-24T14:43:28Z | 2019-05-24T14:43:09Z | NONE | | Hi Simon. Thanks for an excellent job! Datasette is such an obviously good idea (once you have that idea!) and so well done. The only thing that I miss is the ability to add extra routes (with associated jinja2 templates). For most of the datasets that I would like to publish, I would also like at least a page that describes the data (semantics, provenance, biases...) and a page explaining our cookie and privacy policies (which would allow us to use something like Google Analytics). | datasette 107914493 | issue | | | { "url": "https://api.github.com/repos/simonw/datasette/issues/486/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } | | completed |
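
Issues 651 and 662 above both concern FTS5 treating punctuation ('-', '/', '+', '?') as query syntax. The plugin quoted inline in the body of issue 662 is reproduced here as a runnable file; this is a cleaned-up sketch of what that body appears to contain, and the path plugins/sql_functions.py comes from the issue itself:

```python
# plugins/sql_functions.py — cleaned-up rendering of the plugin quoted in issue 662.
# It registers a SQL function that wraps each whitespace-separated term in double
# quotes, so FTS5 no longer interprets punctuation such as - / + ? as query syntax.
from datasette import hookimpl


def escape_fts_query(query):
    bits = query.split()
    return " ".join('"{}"'.format(bit.replace('"', "")) for bit in bits)


@hookimpl
def prepare_connection(conn):
    # Make the function callable from SQL, e.g.
    #   select * from minutes_fts where minutes_fts match escape_fts_query(:search)
    conn.create_function("escape_fts_query", 1, escape_fts_query)
```

As the reporter suspected, registering a SQL function this way only helps queries that call it explicitly (custom SQL); it does not change the match expression the table view builds for _search, which is consistent with the behaviour described in issue 662.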
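The traceback in issue 1134 ends in search_col = key.split(...)[1] raising IndexError; the underscores in the split string were likely stripped by the markdown rendering of this export. A minimal, hypothetical illustration of how that kind of failure arises, assuming the table view extracts column names by splitting query-string keys on a "_search_" prefix:

```python
# Hypothetical reproduction of the IndexError in issue 1134. If the loop that
# handles "_search_COLUMN" arguments also sees "_searchmode", splitting on
# "_search_" yields a one-element list, so indexing [1] fails.
args = {"_search_title": "dog*", "_searchmode": "raw"}

for key in args:
    parts = key.split("_search_", 1)
    if len(parts) == 2:
        print(f"{key!r} -> column {parts[1]!r}")
    else:
        # "_searchmode" does not contain "_search_", so parts == ["_searchmode"]
        # and parts[1] would raise IndexError: list index out of range.
        print(f"{key!r} -> would raise IndexError")
```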
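Issue 681 asks how to attach extra response headers (caching and security headers) when running Datasette under uvicorn. One option in later Datasette versions is the asgi_wrapper plugin hook; a minimal sketch, assuming a Datasette version that provides that hook — the header names and values below are illustrative only:

```python
# plugins/add_headers.py — sketch using Datasette's asgi_wrapper hook to add
# response headers; the specific headers here are examples, not recommendations.
from functools import wraps

from datasette import hookimpl


@hookimpl
def asgi_wrapper(datasette):
    def wrap(app):
        @wraps(app)
        async def app_with_headers(scope, receive, send):
            async def send_with_headers(event):
                if event["type"] == "http.response.start":
                    headers = list(event.get("headers", []))
                    headers.append((b"cache-control", b"max-age=120"))
                    headers.append((b"x-content-type-options", b"nosniff"))
                    event = {**event, "headers": headers}
                await send(event)

            await app(scope, receive, send_with_headers)

        return app_with_headers

    return wrap
```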
```sql
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT,
   [active_lock_reason] TEXT,
   [performed_via_github_app] TEXT,
   [reactions] TEXT,
   [draft] INTEGER,
   [state_reason] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);
```
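For reference, the listing at the top of this page corresponds to a straightforward query against this schema. A sketch using Python's sqlite3 module; the database filename github.db is an assumption, not something stated on this page:

```python
# Sketch: reproduce the "state = 'closed' and user = 2181410, sorted by updated_at
# descending" listing above. The filename "github.db" is assumed; adjust as needed.
import sqlite3

conn = sqlite3.connect("github.db")
conn.row_factory = sqlite3.Row

rows = conn.execute(
    """
    select number, title, created_at, updated_at, closed_at
    from issues
    where state = 'closed' and [user] = 2181410
    order by updated_at desc
    """
).fetchall()

for row in rows:
    print(row["number"], row["title"], row["closed_at"])
```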