{"html_url": "https://github.com/simonw/datasette/issues/243#issuecomment-391030083", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/243", "id": 391030083, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MTAzMDA4Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-22T15:17:10Z", "updated_at": "2018-05-22T15:17:10Z", "author_association": "OWNER", "body": "See also #278", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 318737808, "label": "--spatialite option for datasette publish commands"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/276#issuecomment-391025841", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/276", "id": 391025841, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MTAyNTg0MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-22T15:06:36Z", "updated_at": "2018-05-22T15:06:36Z", "author_association": "OWNER", "body": "The other reason I mention plugins is that I have an idea to outlaw JavaScript entirely from Datasette core and instead encourage ALL JavaScript functionality to move into plugins.right now that just means CodeMirror. I may set up some of those plugins (like CodeMirror) as default dependencies so you get them from \"pip install datasette\".\r\n\r\nI like the neatness of saying that core Datasette is a very simple JSON + HTML application, then encouraging people to go completely wild with JavaScript in the plugins.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324835838, "label": "Handle spatialite geometry columns better"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/278#issuecomment-390993861", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/278", "id": 390993861, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MDk5Mzg2MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-22T13:40:14Z", "updated_at": "2018-05-22T14:38:05Z", "author_association": "OWNER", "body": "If we can't get `import sqlite3` to load the latest version but we can get `import pysqlite3` to work that's fine too - I can teach Datasette to import the best available version.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 325294102, "label": "Build smallest possible Docker image with Datasette plus recent SQLite (with json1) plus Spatialite 4.4.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/272#issuecomment-391011268", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/272", "id": 391011268, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MTAxMTI2OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-22T14:28:12Z", "updated_at": "2018-05-22T14:28:12Z", "author_association": "OWNER", "body": "I think I can do this almost entirely within my existing BaseView class structure.\r\n\r\nFirst, decouple the async data() methods by teaching them to take a querystring object as an argument instead of a Sanic request object. 
The get() method can then send that new object instead of a request.\r\n\r\nNext teach the base class how to obey the ASGI protocol.\r\n\r\nI should be able to get support for both Sanic and uvicorn/daphne working in the same codebase, which will make it easy to compare their performance. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324188953, "label": "Port Datasette to ASGI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/273#issuecomment-391003285", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/273", "id": 391003285, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MTAwMzI4NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-22T14:06:40Z", "updated_at": "2018-05-22T14:06:40Z", "author_association": "OWNER", "body": "That looks great. I don't think it's possible to derive the current commit version from the .zip downloaded directly from GitHub, so needing to pip install via git+https feels reasonable to me.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324451322, "label": "Figure out a way to have /-/version return current git commit hash"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/276#issuecomment-391000659", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/276", "id": 391000659, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MTAwMDY1OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-22T13:59:27Z", "updated_at": "2018-05-22T13:59:27Z", "author_association": "OWNER", "body": "Right now the plugin stuff is early enough that I'd like to get as many potential plugin hooks as possible crafted out. It's much easier to judge if they should be added as actual hooks if we have a working branch prototype of them.\r\n\r\nSome kind of mechanism for custom column display is already needed - eg there are columns where I want to say \"render this as markdown\" or \"URLify any links in this text\" - or even \"use this date format\" or \"add commas to this integer\".\r\n\r\nYou can do it with a custom template but a lower-level mechanism would be nicer. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324835838, "label": "Handle spatialite geometry columns better"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/255#issuecomment-390999055", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/255", "id": 390999055, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MDk5OTA1NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-22T13:54:55Z", "updated_at": "2018-05-22T13:54:55Z", "author_association": "OWNER", "body": "This shipped in Datasette 0.22. 
Here's my blog post about it: https://simonwillison.net/2018/May/20/datasette-facets/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322477187, "label": "Facets"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/278#issuecomment-390993397", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/278", "id": 390993397, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MDk5MzM5Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-22T13:38:57Z", "updated_at": "2018-05-22T13:38:57Z", "author_association": "OWNER", "body": "Useful GitHub code search: https://github.com/search?utf8=\u2713&q=%22libspatialite-4.4.0%22+%22RC0%22&type=Code\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 325294102, "label": "Build smallest possible Docker image with Datasette plus recent SQLite (with json1) plus Spatialite 4.4.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/278#issuecomment-390991640", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/278", "id": 390991640, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MDk5MTY0MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-22T13:33:46Z", "updated_at": "2018-05-22T13:33:46Z", "author_association": "OWNER", "body": "For SpatiaLite this example may be useful - though it's building 4.3.0 and not 4.4.0: https://github.com/terranodo/spatialite-docker/blob/master/Dockerfile", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 325294102, "label": "Build smallest possible Docker image with Datasette plus recent SQLite (with json1) plus Spatialite 4.4.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/277#issuecomment-390804333", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/277", "id": 390804333, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MDgwNDMzMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-21T22:40:16Z", "updated_at": "2018-05-21T22:43:50Z", "author_association": "OWNER", "body": "We should merge this before refactoring the tests though, because that way we don't couple the new tests to the verification of this change.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324836533, "label": "Refactor inspect logic"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/276#issuecomment-390795067", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/276", "id": 390795067, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MDc5NTA2Nw==", "user": {"value": 45057, "label": "russss"}, "created_at": "2018-05-21T21:55:57Z", "updated_at": "2018-05-21T21:55:57Z", "author_association": "CONTRIBUTOR", "body": "Well, we do have the capability to detect spatialite so my intention certainly wasn't to require it. \r\n\r\nI can see the advantage of having it as a plugin but it does touch a number of points in the code. 
I think I'm going to attack this by refactoring the necessary bits and seeing where that leads (which was my plan anyway).\r\n\r\nI think my main concern is - if I add certain plugin hooks for this, is anything else ever going to use them? I'm not sure I have an answer to that question yet, either way.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324835838, "label": "Handle spatialite geometry columns better"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/276#issuecomment-390707760", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/276", "id": 390707760, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MDcwNzc2MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-21T16:30:35Z", "updated_at": "2018-05-21T16:30:35Z", "author_association": "OWNER", "body": "This probably needs to be in a plugin simply because getting Spatialite compiled and installed is a bit of a pain.\r\n\r\nIt's a great opportunity to expand the plugin hooks in useful ways though.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324835838, "label": "Handle spatialite geometry columns better"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/277#issuecomment-390707183", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/277", "id": 390707183, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MDcwNzE4Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-21T16:28:39Z", "updated_at": "2018-05-21T16:28:39Z", "author_association": "OWNER", "body": "This is definitely a big improvement.\r\n\r\nI'd like to refactor the unit tests that cover .inspect() too - currently they are a huge ugly blob at the top of test_api.py", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324836533, "label": "Refactor inspect logic"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/247#issuecomment-390689406", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/247", "id": 390689406, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MDY4OTQwNg==", "user": {"value": 11912854, "label": "jsancho-gpl"}, "created_at": "2018-05-21T15:29:31Z", "updated_at": "2018-05-21T15:29:31Z", "author_association": "NONE", "body": "I've changed my mind about the way to support external connectors aside of SQLite and I'm working in a more simple style that respects the original Datasette, i.e. less refactoring. 
I present you [a version of Datasette wich supports other database connectors](https://github.com/jsancho-gpl/datasette/tree/external-connectors) and [a Datasette connector for HDF5/PyTables files](https://github.com/jsancho-gpl/datasette-pytables).", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 319449852, "label": "SQLite code decoupled from Datasette"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/258#issuecomment-390577711", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/258", "id": 390577711, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MDU3NzcxMQ==", "user": {"value": 247131, "label": "philroche"}, "created_at": "2018-05-21T07:38:15Z", "updated_at": "2018-05-21T07:38:15Z", "author_association": "NONE", "body": "Excellent, I was not aware of the auto redirect to the new hash. My bad\r\n\r\nThis solves my use case.\r\n\r\nI do agree that your suggested --no-url-hash approach is much neater. I will investigate ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322741659, "label": "Add new metadata key persistent_urls which removes the hash from all database urls"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/274#issuecomment-390496376", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/274", "id": 390496376, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MDQ5NjM3Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-20T17:04:55Z", "updated_at": "2018-05-20T17:04:55Z", "author_association": "OWNER", "body": "http://datasette.readthedocs.io/en/latest/config.html", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324652142, "label": "Rename --limit to --config, add --help-config"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/274#issuecomment-390433040", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/274", "id": 390433040, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MDQzMzA0MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-19T21:12:42Z", "updated_at": "2018-05-20T16:01:03Z", "author_association": "OWNER", "body": "Could also support these as optional environment variables - `DATASETTE_NAMEOFSETTING`", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324652142, "label": "Rename --limit to --config, add --help-config"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/273#issuecomment-390250253", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/273", "id": 390250253, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MDI1MDI1Mw==", "user": {"value": 198537, "label": "rgieseke"}, "created_at": "2018-05-18T15:49:52Z", "updated_at": "2018-05-18T15:49:52Z", "author_association": "CONTRIBUTOR", "body": "Shouldn't [versioneer](https://github.com/warner/python-versioneer) do that?\r\n\r\nE.g. 
0.21+2.g1076c97\r\n\r\nYou'd need to install via `pip install git+https://github.com/simonw/datasette.git` though, this does a temp git clone.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324451322, "label": "Figure out a way to have /-/version return current git commit hash"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/264#issuecomment-390105943", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/264", "id": 390105943, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MDEwNTk0Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-18T06:18:00Z", "updated_at": "2018-05-18T06:18:00Z", "author_association": "OWNER", "body": "Docs: http://datasette.readthedocs.io/en/latest/limits.html#default-facet-size", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 323673899, "label": "Make it possible to customize various facet settings"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/270#issuecomment-390105147", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/270", "id": 390105147, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MDEwNTE0Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-18T06:13:07Z", "updated_at": "2018-05-18T06:13:07Z", "author_association": "OWNER", "body": "I'm going to add a `/-/limits` page that shows the current limits.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 323830051, "label": "--limit= CLI option for setting limits"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/271#issuecomment-389989615", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/271", "id": 389989615, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTk4OTYxNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-17T19:56:13Z", "updated_at": "2018-05-17T19:56:13Z", "author_association": "OWNER", "body": "From https://www.sqlite.org/c3ref/open.html\r\n\r\n> **immutable**: The immutable parameter is a boolean query parameter that indicates that the database file is stored on read-only media. When immutable is set, SQLite assumes that the database file cannot be changed, even by a process with higher privilege, and so the database is opened read-only and all locking and change detection is disabled. Caution: Setting the immutable property on a database file that does in fact change can result in incorrect query results and/or SQLITE_CORRUPT errors. See also: SQLITE_IOCAP_IMMUTABLE.\r\n\r\nSo this would probably have to be a new mode, `datasette serve --detect-db-changes`, which no longer opens in immutable mode. 
Or maybe current behavior becomes not-the-default and you opt into it with `datasette serve --immutable`", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324162476, "label": "Mechanism for automatically picking up changes when on-disk .db file changes"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/271#issuecomment-389989015", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/271", "id": 389989015, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTk4OTAxNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-17T19:54:10Z", "updated_at": "2018-05-17T19:54:10Z", "author_association": "OWNER", "body": "This is a departure from how Datasette has been designed so far, and it may turn out that it's not feasible or it requires too many philosophical changes to be worthwhile.\r\n\r\nIf we CAN do it though it would mean Datasette could stay running pointed at a directory on disk and new SQLite databases could be dropped into that directory by another process and served directly as they become available.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324162476, "label": "Mechanism for automatically picking up changes when on-disk .db file changes"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/266#issuecomment-389894382", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/266", "id": 389894382, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTg5NDM4Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-17T14:51:13Z", "updated_at": "2018-05-17T14:53:23Z", "author_association": "OWNER", "body": "I should definitely sanity check if the `_next=` route really is the most efficient way to build this. It may turn out that iterating over a SQLite cursor with a million rows in it is super-efficient and would provide much more reliable performance (plus solve the problem for retrieving full custom SQL queries where we can't do keyset pagination).\r\n\r\nProblem here is that we run SQL queries in a thread pool. A query that returns millions of rows would presumably tie up a SQL thread until it has finished, which could block the server. This may be a reason to stick with `_next=` keyset pagination - since it ensures each SQL thread yields back again after each 1,000 rows.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 323681589, "label": "Export to CSV"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/266#issuecomment-389893810", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/266", "id": 389893810, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTg5MzgxMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-17T14:49:35Z", "updated_at": "2018-05-17T14:49:35Z", "author_association": "OWNER", "body": "Idea: add a `supports_csv = False` property to `BaseView` and over-ride it to `True` just on the view classes that should support CSV (Table and Row). Slight subtlety: the `DatabaseView` class only supports CSV in the `custom_sql()` path. 
Maybe that needs to be refactored a bit.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 323681589, "label": "Export to CSV"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/266#issuecomment-389626715", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/266", "id": 389626715, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTYyNjcxNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-16T18:50:46Z", "updated_at": "2018-05-16T18:50:46Z", "author_association": "OWNER", "body": "> I\u2019d recommend using the Windows-1252 encoding for maximum compatibility, unless you have any characters not in that set, in which case use UTF8 with a byte order mark. Bit of a pain, but some progams (eg various versions of Excel) don\u2019t read UTF8.\r\n**frankieroberto** https://twitter.com/frankieroberto/status/996823071947460616\r\n\r\n> There is software that consumes CSV and doesn't speak UTF8!? Huh. Well I can't just use Windows-1252 because I need to support the full UTF8 range of potential data - maybe I should support an optional ?_encoding=windows-1252 argument\r\n**simonw** https://twitter.com/simonw/status/996824677245857793", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 323681589, "label": "Export to CSV"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/266#issuecomment-389608473", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/266", "id": 389608473, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTYwODQ3Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-16T17:52:35Z", "updated_at": "2018-05-16T17:54:11Z", "author_association": "OWNER", "body": "There are some code examples in this issue which should help with the streaming part: https://github.com/channelcat/sanic/issues/1067\r\n\r\nAlso https://github.com/channelcat/sanic/blob/master/docs/sanic/streaming.md#response-streaming", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 323681589, "label": "Export to CSV"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/266#issuecomment-389592566", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/266", "id": 389592566, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTU5MjU2Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-16T17:01:29Z", "updated_at": "2018-05-16T17:02:21Z", "author_association": "OWNER", "body": "Let's provide a CSV Dialect definition too: https://frictionlessdata.io/specs/csv-dialect/ - via https://twitter.com/drewdaraabrams/status/996794915680997382", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 323681589, "label": "Export to CSV"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/266#issuecomment-389579762", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/266", "id": 389579762, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTU3OTc2Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-16T16:21:12Z", "updated_at": 
"2018-05-16T16:21:12Z", "author_association": "OWNER", "body": "> I basically want someone to tell me which arguments I can pass to Python's csv.writer() function that will result in the least complaints from people who try to parse the results :)\r\nhttps://twitter.com/simonw/status/996786815938977792", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 323681589, "label": "Export to CSV"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/266#issuecomment-389579363", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/266", "id": 389579363, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTU3OTM2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-16T16:20:06Z", "updated_at": "2018-05-16T16:20:06Z", "author_association": "OWNER", "body": "I started a thread on Twitter discussing various CSV output dialects: https://twitter.com/simonw/status/996783395504979968 - I want to pick defaults which will work as well as possible for whatever tools people might be using to consume the data.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 323681589, "label": "Export to CSV"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/266#issuecomment-389572201", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/266", "id": 389572201, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTU3MjIwMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-16T15:58:43Z", "updated_at": "2018-05-16T16:00:47Z", "author_association": "OWNER", "body": "This will likely be implemented in the `BaseView` class, which needs to know how to spot the `.csv` extension, call the underlying JSON generating function and then return the `columns` and `rows` as correctly formatted CSV.\r\n\r\nhttps://github.com/simonw/datasette/blob/9959a9e4deec8e3e178f919e8b494214d5faa7fd/datasette/views/base.py#L201-L207\r\n\r\nThis means it will take ALL arguments that are available to the `.json` view. It may ignore some (e.g. `_facet=` makes no sense since CSV tables don't have space to show the facet results).\r\n\r\nIn streaming mode, things will behave a little bit differently - in particular, if `_stream=1` then `_next=` will be forbidden.\r\n\r\nIt can't include a length header because we don't know how many bytes it will be\r\n\r\nCSV output will throw an error if the endpoint doesn't have rows and columns keys eg `/-/inspect.json`\r\n\r\nSo the implementation...\r\n\r\n- looks for the `.csv` extension\r\n- internally fetches the `.json` data instead\r\n- If no `_stream` it just transposes that JSON to CSV with the correct content type header\r\n- If `_stream=1` - checks for `_next=` and throws an error if it was provided\r\n- Otherwise... fetch first page and emit CSV header and first set of rows\r\n- Then start async looping, emitting more CSV rows and following the `_next=` internal reference until done\r\n\r\nI like that this takes advantage of efficient pagination. It may not work so well for views which use offset/limit though.\r\n\r\nIt won't work at all for custom SQL because custom SQL doesn't support _next= pagination. That's fine.\r\n\r\nFor views... easiest fix is to cut off after first X000 records. That seems OK. 
View JSON would need to include a property that the mechanism can identify.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 323681589, "label": "Export to CSV"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/265#issuecomment-389566147", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/265", "id": 389566147, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTU2NjE0Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-16T15:41:42Z", "updated_at": "2018-05-16T15:41:42Z", "author_association": "OWNER", "body": "An official demo instance of Datasette dedicated to this use-case would be useful, especially if it was automatically deployed by Travis for every commit to master that passes the tests.\r\n\r\nMaybe there should be a permanent version of it deployed for each released version too?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 323677499, "label": "Add links to example Datasette instances to appropiate places in docs"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/263#issuecomment-389563719", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/263", "id": 389563719, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTU2MzcxOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-16T15:34:46Z", "updated_at": "2018-05-16T15:34:46Z", "author_association": "OWNER", "body": "The underlying mechanics for the `_extras` mechanism described in #262 may help with this.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 323671577, "label": "Facets should not execute for ?shape=array|object"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/255#issuecomment-389562708", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/255", "id": 389562708, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTU2MjcwOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-16T15:32:12Z", "updated_at": "2018-05-16T15:32:12Z", "author_association": "OWNER", "body": "This is now landed in master, ready for the next release.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322477187, "label": "Facets"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/255#issuecomment-389546040", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/255", "id": 389546040, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTU0NjA0MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-16T14:47:34Z", "updated_at": "2018-05-16T14:47:34Z", "author_association": "OWNER", "body": "Latest demo - now with multiple columns: https://datasette-suggested-facets-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List?_facet=qCaretaker&_facet=qCareAssistant&_facet=qLegalStatus\r\n\r\n![2018-05-16 at 7 47 am](https://user-images.githubusercontent.com/9599/40124418-63e680ba-58dd-11e8-8063-9686826abb8e.png)\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, 
\"eyes\": 0}", "issue": {"value": 322477187, "label": "Facets"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/258#issuecomment-389536870", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/258", "id": 389536870, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTUzNjg3MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-16T14:22:31Z", "updated_at": "2018-05-16T14:22:31Z", "author_association": "OWNER", "body": "The principle benefit provided by the hash URLs is that Datasette can set a far-future cache expiry header on every response. This is particularly useful for JavaScript API work as it makes fantastic use of the browser's cache. It also means that if you are serving your API from behind a caching proxy like Cloudflare you get a fantastic cache hit rate.\r\n\r\nAn option to serve without persistent hashes would also need to turn off the cache headers.\r\n\r\nMaybe the option should support both? If you hit a page with the hash in the URL you still get the cache headers, but hits to the URL without the hash serve uncashed content directly.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322741659, "label": "Add new metadata key persistent_urls which removes the hash from all database urls"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/255#issuecomment-389397457", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/255", "id": 389397457, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTM5NzQ1Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-16T05:20:04Z", "updated_at": "2018-05-16T05:20:04Z", "author_association": "OWNER", "body": "Maybe `suggested_facets` should only be calculated for the HTML view.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322477187, "label": "Facets"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/255#issuecomment-389386919", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/255", "id": 389386919, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTM4NjkxOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-16T03:57:47Z", "updated_at": "2018-05-16T03:58:30Z", "author_association": "OWNER", "body": "I updated that demo to demonstrate the new foreign key label expansions: https://datasette-suggested-facets-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List?_facet=qLegalStatus\r\n\r\n![2018-05-15 at 8 58 pm](https://user-images.githubusercontent.com/9599/40095806-b645026a-5882-11e8-8100-76136df50212.png)\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322477187, "label": "Facets"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/258#issuecomment-389386142", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/258", "id": 389386142, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTM4NjE0Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-16T03:51:13Z", "updated_at": "2018-05-16T03:51:13Z", "author_association": "OWNER", "body": "The URL does persist across deployments already, in that you can use the URL without the hash and it will 
redirect to the current location. Here's an example of that: https://san-francisco.datasettes.com/sf-trees/Street_Tree_List.json\r\n\r\nThis also works if you attempt to hit the incorrect hash, e.g. if you have deployed a new version of the database with an updated hash. The old hash will redirect, e.g. https://san-francisco.datasettes.com/sf-trees-c4b972c/Street_Tree_List.json\r\n\r\nIf you serve Datasette from a HTTP/2 proxy (I've been using Cloudflare for this) you won't even have to pay the cost of the redirect - Datasette sends a `Link: ; rel=preload` header with those redirects, which causes Cloudflare to push out the redirected source as part of that HTTP/2 request. You can fire up the Chrome DevTools to watch this happen.\r\n\r\nhttps://github.com/simonw/datasette/blob/2b79f2bdeb1efa86e0756e741292d625f91cb93d/datasette/views/base.py#L91\r\n\r\nAll of that said... I'm not at all opposed to this feature. For consistency with other Datasette options (e.g. `--cors`) I'd prefer to do this as an optional argument to the `datasette serve` command - something like this:\r\n\r\n datasette serve mydb.db --no-url-hash", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322741659, "label": "Add new metadata key persistent_urls which removes the hash from all database urls"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/255#issuecomment-389147608", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/255", "id": 389147608, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTE0NzYwOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-15T12:24:46Z", "updated_at": "2018-05-15T12:24:46Z", "author_association": "OWNER", "body": "New demo (published with `datasette publish now --branch=suggested-facets fivethirtyeight.db sf-trees.db --name=datastte-suggested-facets-demo`): https://datasette-suggested-facets-demo.now.sh/fivethirtyeight-2628db9/comic-characters%2Fmarvel-wikia-data\r\n\r\nAfter turning on a couple of suggested facets... 
https://datasette-suggested-facets-demo.now.sh/fivethirtyeight-2628db9/comic-characters%2Fmarvel-wikia-data?_facet=SEX&_facet=ID\r\n\r\n![2018-05-15 at 7 24 am](https://user-images.githubusercontent.com/9599/40056411-fa265d16-5810-11e8-89ec-e38fe29ffb2c.png)\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322477187, "label": "Facets"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/255#issuecomment-389145872", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/255", "id": 389145872, "node_id": "MDEyOklzc3VlQ29tbWVudDM4OTE0NTg3Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-15T12:17:52Z", "updated_at": "2018-05-15T12:17:52Z", "author_association": "OWNER", "body": "Activity has now moved to this branch: https://github.com/simonw/datasette/commits/suggested-facets", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322477187, "label": "Facets"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/251#issuecomment-388987044", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/251", "id": 388987044, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODk4NzA0NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-14T22:47:55Z", "updated_at": "2018-05-14T22:47:55Z", "author_association": "OWNER", "body": "This work is now happening in the facets branch. Closing this in favor of #255.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 320592643, "label": "Explore \"distinct values for column\" in inspect()"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/259#issuecomment-388797919", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/259", "id": 388797919, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODc5NzkxOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-14T12:23:11Z", "updated_at": "2018-05-14T12:23:11Z", "author_association": "OWNER", "body": "For M2M to work we will need a mechanism for applying IN queries to the table view, so you can select multiple M2M filters. Maybe this would work:\r\n\r\n ?_m2m_category=123&_m2m_category=865", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322787470, "label": "inspect() should detect many-to-many relationships"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/255#issuecomment-388784787", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/255", "id": 388784787, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODc4NDc4Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-14T11:28:05Z", "updated_at": "2018-05-14T11:28:05Z", "author_association": "OWNER", "body": "To decide which facets to suggest: for each column, is the unique value count less than the number of rows matching the current query or is it less than 20 (if we are showing more than 20 rows)?\r\n\r\nMaybe only do this if there are less than ten non-float columns. 
Or always try for foreign keys and booleans, then if there are none of those try indexed text and integer fields, then finally try non-indexed text and integer fields but only if there are less than ten.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322477187, "label": "Facets"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/255#issuecomment-388784063", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/255", "id": 388784063, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODc4NDA2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-14T11:25:00Z", "updated_at": "2018-05-14T11:25:15Z", "author_association": "OWNER", "body": "Can I get facets working across many2many relationships?\r\n\r\nThis would be fiendishly useful, but the querystring and `metadata.json` syntax is non-obvious.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322477187, "label": "Facets"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/255#issuecomment-388686463", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/255", "id": 388686463, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODY4NjQ2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-14T03:23:44Z", "updated_at": "2018-05-14T03:25:22Z", "author_association": "OWNER", "body": "It would be neat if there was a mechanism for calculating aggregates per facet - e.g. calculating the sum() of specific columns against each facet result on https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/nba-elo%2Fnbaallelo?_facet=lg_id&_facet=fran_id&lg_id=ABA&_facet=team_id", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322477187, "label": "Facets"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/256#issuecomment-388684356", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/256", "id": 388684356, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODY4NDM1Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-14T03:05:37Z", "updated_at": "2018-05-14T03:05:37Z", "author_association": "OWNER", "body": "I just landed pull request #257 - I haven't refactored the tests, I may do that later if it looks worthwhile.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322551723, "label": "Break up app.py into separate view modules"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/255#issuecomment-388645828", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/255", "id": 388645828, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODY0NTgyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-13T18:18:56Z", "updated_at": "2018-05-13T18:20:02Z", "author_association": "OWNER", "body": "I may be able to run the SQL for all of the facet counts in one go using a WITH CTE query - will have to microbenchmark this to make sure it is worthwhile: 
https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9?sql=with+blah+as+%28select+*+from+%5Bcollege-majors%2Fall-ages%5D%29%0D%0Aselect+*+from+%28select+%22Major_category%22%2C+Major_category%2C+count%28*%29+as+n+from%0D%0Ablah+group+by+Major_category+order+by+n+desc+limit+10%29%0D%0Aunion+all%0D%0Aselect+*+from+%28select+%22Major_category2%22%2C+Major_category%2C+count%28*%29+as+n+from%0D%0Ablah+group+by+Major_category+order+by+n+desc+limit+10%29", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322477187, "label": "Facets"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/257#issuecomment-388628966", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/257", "id": 388628966, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODYyODk2Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-13T14:00:47Z", "updated_at": "2018-05-13T14:06:35Z", "author_association": "OWNER", "body": "Running specific tests:\r\n```\r\nvenv35/bin/pip install pytest beautifulsoup4 aiohttp\r\nvenv35/bin/pytest tests/test_utils.py\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322591993, "label": "Refactor views"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/257#issuecomment-388627281", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/257", "id": 388627281, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODYyNzI4MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-13T13:36:21Z", "updated_at": "2018-05-13T13:36:21Z", "author_association": "OWNER", "body": "https://github.com/rtfd/readthedocs.org/issues/3812#issuecomment-373780860 suggests Python 3.5.2 may have the fix.\r\n\r\nYup, that worked:\r\n\r\n```\r\npyenv install 3.5.2\r\nrm -rf venv35\r\n/Users/simonw/.pyenv/versions/3.5.2/bin/python -mvenv venv35\r\nsource venv35/bin/activate\r\n# Not sure why I need this in my local environment but I do:\r\npip install datasette_plugin_demos\r\npython setup.py test\r\n```\r\n\r\nThis is now giving me the same test failure locally that I am seeing in Travis.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322591993, "label": "Refactor views"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/257#issuecomment-388626804", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/257", "id": 388626804, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODYyNjgwNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-13T13:28:20Z", "updated_at": "2018-05-13T13:28:20Z", "author_association": "OWNER", "body": "Unfortunately, running `python setup.py test` on my laptop using Python 3.5.0 in that virtualenv results in a flow of weird Sanic-related errors:\r\n\r\n```\r\n File \"/Users/simonw/Dropbox/Development/datasette/venv35/lib/python3.5/site-packages/sanic-0.7.0-py3.5.egg/sanic/testing.py\", line 16, in _local_request\r\n import aiohttp\r\n File \"/Users/simonw/Dropbox/Development/datasette/.eggs/aiohttp-2.3.2-py3.5-macosx-10.13-x86_64.egg/aiohttp/__init__.py\", line 6, in \r\n from .client import * # noqa\r\n File 
\"/Users/simonw/Dropbox/Development/datasette/.eggs/aiohttp-2.3.2-py3.5-macosx-10.13-x86_64.egg/aiohttp/client.py\", line 13, in \r\n from yarl import URL\r\n File \"/Users/simonw/Dropbox/Development/datasette/.eggs/yarl-1.2.4-py3.5-macosx-10.13-x86_64.egg/yarl/__init__.py\", line 11, in \r\n from .quoting import _Quoter, _Unquoter\r\n File \"/Users/simonw/Dropbox/Development/datasette/.eggs/yarl-1.2.4-py3.5-macosx-10.13-x86_64.egg/yarl/quoting.py\", line 3, in \r\n from typing import Optional, TYPE_CHECKING, cast\r\nImportError: cannot import name 'TYPE_CHECKING'\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322591993, "label": "Refactor views"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/257#issuecomment-388626721", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/257", "id": 388626721, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODYyNjcyMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-13T13:27:04Z", "updated_at": "2018-05-13T13:27:04Z", "author_association": "OWNER", "body": "I managed to get Python 3.5.0 running on my laptop using [pyenv](https://github.com/pyenv/pyenv). Here's the incantation I used:\r\n\r\n```\r\n# Install pyenv using homebrew (turns out I already had it)\r\nbrew install pyenv\r\n# Check which versions of Python I have installed\r\npyenv versions\r\n# Install Python 3.5.0\r\npyenv install 3.5.0\r\n# Figure out where pyenv has been installing things\r\npyenv root\r\n# Check I can run my newly installed Python 3.5.0\r\n/Users/simonw/.pyenv/versions/3.5.0/bin/python\r\n# Use it to create a new virtualenv\r\n/Users/simonw/.pyenv/versions/3.5.0/bin/python -mvenv venv35\r\nsource venv35/bin/activate\r\n# Install datasette into that virtualenv\r\npython setup.py install\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322591993, "label": "Refactor views"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/257#issuecomment-388625703", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/257", "id": 388625703, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODYyNTcwMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-13T13:10:09Z", "updated_at": "2018-05-13T13:10:09Z", "author_association": "OWNER", "body": "I'm still seeing intermittent Python 3.5 failures due to dictionary ordering differences.\r\n\r\nhttps://travis-ci.org/simonw/datasette/jobs/378356802\r\n\r\n```\r\n> assert expected_facet_results == facet_results\r\nE AssertionError: assert {'city': [{'c...alue': 'MI'}]} == {'city': [{'co...alue': 'MI'}]}\r\nE Omitting 1 identical items, use -vv to show\r\nE Differing items:\r\nE {'city': [{'count': 4, 'toggle_url': '_facet=state&_facet=city&state=MI&city=Detroit', 'value': 'Detroit'}]} != {'city': [{'count': 4, 'toggle_url': 'state=MI&_facet=state&_facet=city&city=Detroit', 'value': 'Detroit'}]}\r\nE Use -v to get the full diff\r\n```\r\n\r\nTo solve these cleanly I need to be able to run Python 3.5 on my local laptop rather than relying on Travis every time.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322591993, "label": "Refactor views"}, 
"performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/255#issuecomment-388588998", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/255", "id": 388588998, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODU4ODk5OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-12T22:57:30Z", "updated_at": "2018-05-12T23:00:24Z", "author_association": "OWNER", "body": "A few demos:\r\n\r\n* https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/college-majors%2Fall-ages?_facet=Major_category\r\n* https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/congress-age%2Fcongress-terms?_facet=chamber&_facet=state&_facet=party&_facet=incumbent\r\n* https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/bechdel%2Fmovies?_facet=binary&_facet=test", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322477187, "label": "Facets"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/255#issuecomment-388589072", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/255", "id": 388589072, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODU4OTA3Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-12T22:59:07Z", "updated_at": "2018-05-12T22:59:07Z", "author_association": "OWNER", "body": "I need to decide how to display these. They currently look like this:\r\n\r\nhttps://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/congress-age%2Fcongress-terms?_facet=chamber&_facet=state&_facet=party&_facet=incumbent&state=MO\r\n\r\n![2018-05-12 at 7 58 pm](https://user-images.githubusercontent.com/9599/39962230-e7bf9e10-561e-11e8-80a7-0941b8991318.png)\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322477187, "label": "Facets"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/255#issuecomment-388588011", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/255", "id": 388588011, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODU4ODAxMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-12T22:33:39Z", "updated_at": "2018-05-12T22:33:39Z", "author_association": "OWNER", "body": "Initial documentation: http://datasette.readthedocs.io/en/latest/facets.html", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322477187, "label": "Facets"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/255#issuecomment-388587855", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/255", "id": 388587855, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODU4Nzg1NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-12T22:30:23Z", "updated_at": "2018-05-12T22:30:23Z", "author_association": "OWNER", "body": "Adding some TODOs to the original description (so they show up as a todo progress bar)", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322477187, "label": "Facets"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/253#issuecomment-388550742", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/253", "id": 388550742, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODU1MDc0Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-12T12:09:02Z", "updated_at": "2018-05-12T12:09:02Z", "author_association": "OWNER", "body": "http://datasette.readthedocs.io/en/latest/full_text_search.html", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 321631020, "label": "Documentation explaining how to use SQLite FTS with Datasette"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/255#issuecomment-388525357", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/255", "id": 388525357, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODUyNTM1Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-12T03:01:14Z", "updated_at": "2018-05-12T03:01:14Z", "author_association": "OWNER", "body": "Facet counts will be generated by extra SQL queries with their own aggressive time limit.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322477187, "label": "Facets"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/254#issuecomment-388360255", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/254", "id": 388360255, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODM2MDI1NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-11T13:16:09Z", "updated_at": "2018-05-11T22:45:31Z", "author_association": "OWNER", "body": "Do you have an example I can look at?\r\n\r\nI think I have a possible route for fixing this, but it's pretty tricky (it involves adding a full SQL statement parser, but that's needed for some other potential improvements as well).\r\n\r\nIn the meantime, is this causing actual errors for you or is it more of an inconvenience (form fields being displayed that don't actually do anything)?\r\n\r\nAnother potential solution here could be to allow canned queries to optionally declare their parameters in metadata.json", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322283067, "label": "Escaping named parameters in canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/254#issuecomment-388497467", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/254", "id": 388497467, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODQ5NzQ2Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-11T22:06:00Z", "updated_at": "2018-05-11T22:06:34Z", "author_association": "OWNER", "body": "Got it, this seems to trigger the problem: https://datasette-zkcvlwdrhl.now.sh/simplestreams-270f20c?sql=select+*+from+cloudimage+where+%22content_id%22+%3D+%22com.ubuntu.cloud%3Areleased%3Adownload%22+order+by+id+limit+10", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322283067, "label": "Escaping named parameters in canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/254#issuecomment-388367027", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/254", "id": 388367027, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODM2NzAyNw==", "user": {"value": 247131, "label": "philroche"}, "created_at": "2018-05-11T13:41:46Z", "updated_at": "2018-05-11T13:41:46Z", "author_association": "NONE", "body": "An example deployment @ https://datasette-zkcvlwdrhl.now.sh/simplestreams-270f20c/cloudimage?content_id__exact=com.ubuntu.cloud%3Areleased%3Adownload\r\n\r\nIt is not causing errors, more of an inconvenience. I have worked around it using a `like` query instead. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322283067, "label": "Escaping named parameters in canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/251#issuecomment-386879878", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/251", "id": 386879878, "node_id": "MDEyOklzc3VlQ29tbWVudDM4Njg3OTg3OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-06T13:34:57Z", "updated_at": "2018-05-06T13:34:57Z", "author_association": "OWNER", "body": "If I'm going to expand column introspection in this way it would be useful to also capture column type information.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 320592643, "label": "Explore \"distinct values for column\" in inspect()"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/251#issuecomment-386879840", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/251", "id": 386879840, "node_id": "MDEyOklzc3VlQ29tbWVudDM4Njg3OTg0MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-06T13:34:24Z", "updated_at": "2018-05-06T13:34:24Z", "author_association": "OWNER", "body": "Here's a quick demo of that exploration: https://datasette-distinct-column-values.now.sh/-/inspect\r\n\r\nExample output:\r\n\r\n```\r\n{\r\n \"antiquities-act/actions_under_antiquities_act\": {\r\n \"columns\": [\r\n \"current_name\",\r\n \"states\",\r\n \"original_name\",\r\n \"current_agency\",\r\n \"action\",\r\n \"date\",\r\n \"year\",\r\n \"pres_or_congress\",\r\n \"acres_affected\"\r\n ],\r\n \"count\": 344,\r\n \"distinct_values_by_column\": {\r\n \"acres_affected\": null,\r\n \"action\": null,\r\n \"current_agency\": [\r\n \"NPS\",\r\n \"State of Montana\",\r\n \"BLM\",\r\n \"State of Arizona\",\r\n \"USFS\",\r\n \"State of North Dakota\",\r\n \"NPS, BLM\",\r\n \"State of South Carolina\",\r\n \"State of New York\",\r\n \"FWS\",\r\n \"FWS, NOAA\",\r\n \"NPS, FWS\",\r\n \"NOAA\",\r\n \"BLM, USFS\",\r\n \"NOAA, FWS\"\r\n ],\r\n \"current_name\": null,\r\n \"date\": null,\r\n \"original_name\": null,\r\n \"pres_or_congress\": null,\r\n \"states\": null,\r\n \"year\": null\r\n },\r\n \"foreign_keys\": {\r\n \"incoming\": [],\r\n \"outgoing\": []\r\n },\r\n \"fts_table\": null,\r\n \"hidden\": false,\r\n \"label_column\": null,\r\n \"name\": \"antiquities-act/actions_under_antiquities_act\",\r\n \"primary_keys\": []\r\n }\r\n}\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 320592643, "label": "Explore \"distinct values for column\" in inspect()"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/simonw/datasette/issues/251#issuecomment-386879509", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/251", "id": 386879509, "node_id": "MDEyOklzc3VlQ29tbWVudDM4Njg3OTUwOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-06T13:29:26Z", "updated_at": "2018-05-06T13:29:26Z", "author_association": "OWNER", "body": "We can solve this using the `sqlite_timelimit(conn, 20)` helper, which can tell SQLite to give up after 20ms. We can wrap that around the following SQL:\r\n\r\n select distinct COLUMN from TABLE limit 21;\r\n\r\nThen we look at the number of rows returned. If it's 21 or more we know that this table had more than 21 distinct values, so we'll treat it as \"unlimited\". Likewise, if the SQL times out before 20ms is up we will skip this introspection.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 320592643, "label": "Explore \"distinct values for column\" in inspect()"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/237#issuecomment-386840806", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/237", "id": 386840806, "node_id": "MDEyOklzc3VlQ29tbWVudDM4Njg0MDgwNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-05T22:56:42Z", "updated_at": "2018-05-05T22:56:42Z", "author_association": "OWNER", "body": "Demo:\r\n\r\n datasette publish now ../datasettes/san-francisco/sf-film-locations.db --branch=master --name datasette-column-search-demo\r\n\r\nhttps://datasette-column-search-demo.now.sh/sf-film-locations/Film_Locations_in_San_Francisco?_search_Locations=justin", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 317475156, "label": "Support for ?_search_colname=blah searches"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/237#issuecomment-386840307", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/237", "id": 386840307, "node_id": "MDEyOklzc3VlQ29tbWVudDM4Njg0MDMwNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-05T22:45:45Z", "updated_at": "2018-05-05T22:45:45Z", "author_association": "OWNER", "body": "Documented here: http://datasette.readthedocs.io/en/latest/json_api.html#special-table-arguments", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 317475156, "label": "Support for ?_search_colname=blah searches"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/249#issuecomment-386692534", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/249", "id": 386692534, "node_id": "MDEyOklzc3VlQ29tbWVudDM4NjY5MjUzNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-04T18:26:30Z", "updated_at": "2018-05-04T18:26:30Z", "author_association": "OWNER", "body": "Demo: https://datasette-plugins-and-max-size-demo.now.sh/sf-trees/Street_Tree_List.json?_size=max", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 320090329, "label": "?_size=max argument "}, "performed_via_github_app": null} {"html_url": 
"https://github.com/simonw/datasette/issues/248#issuecomment-386692333", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/248", "id": 386692333, "node_id": "MDEyOklzc3VlQ29tbWVudDM4NjY5MjMzMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-04T18:25:40Z", "updated_at": "2018-05-04T18:25:40Z", "author_association": "OWNER", "body": "Demo: https://datasette-plugins-and-max-size-demo.now.sh/-/plugins", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 319954545, "label": "/-/plugins should show version of each installed plugin"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/248#issuecomment-386357645", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/248", "id": 386357645, "node_id": "MDEyOklzc3VlQ29tbWVudDM4NjM1NzY0NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-03T16:36:59Z", "updated_at": "2018-05-03T16:36:59Z", "author_association": "OWNER", "body": "Even better: use `plugin_manager.list_plugin_distinfo()` from pluggy to get back a list of tuples, the second item in each tuple is a `pkg_resources.DistInfoDistribution` with a `.version` attribute.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 319954545, "label": "/-/plugins should show version of each installed plugin"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/245#issuecomment-386310149", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/245", "id": 386310149, "node_id": "MDEyOklzc3VlQ29tbWVudDM4NjMxMDE0OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-03T14:14:33Z", "updated_at": "2018-05-03T14:14:33Z", "author_association": "OWNER", "body": "Demos:\r\n\r\n* https://datasette-versions-and-shape-demo.now.sh/sf-trees-02c8ef1/qSpecies.json?_shape=array\r\n* https://datasette-versions-and-shape-demo.now.sh/sf-trees-02c8ef1/qSpecies.json?_shape=object\r\n* https://datasette-versions-and-shape-demo.now.sh/sf-trees-02c8ef1/qSpecies.json?_shape=arrays\r\n* https://datasette-versions-and-shape-demo.now.sh/sf-trees-02c8ef1/qSpecies.json?_shape=objects", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 319358200, "label": "?_shape=array option"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/244#issuecomment-386309928", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/244", "id": 386309928, "node_id": "MDEyOklzc3VlQ29tbWVudDM4NjMwOTkyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-03T14:13:49Z", "updated_at": "2018-05-03T14:13:49Z", "author_association": "OWNER", "body": "Demo: https://datasette-versions-and-shape-demo.now.sh/-/versions", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 318738000, "label": "/-/versions page"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/125#issuecomment-384678319", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/125", "id": 384678319, "node_id": 
"MDEyOklzc3VlQ29tbWVudDM4NDY3ODMxOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-26T15:14:31Z", "updated_at": "2018-04-26T15:14:31Z", "author_association": "OWNER", "body": "I shipped this last week as the first plugin: https://simonwillison.net/2018/Apr/20/datasette-plugins/\r\n\r\nDemo: https://datasette-cluster-map-demo.datasettes.com/polar-bears-455fe3a/USGS_WC_eartags_output_files_2009-2011-Status\r\n\r\nPlugin: https://github.com/simonw/datasette-cluster-map", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 275135393, "label": "Plot rows on a map with Leaflet and Leaflet.markercluster"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/44#issuecomment-384676488", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/44", "id": 384676488, "node_id": "MDEyOklzc3VlQ29tbWVudDM4NDY3NjQ4OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-26T15:09:57Z", "updated_at": "2018-04-26T15:09:57Z", "author_association": "OWNER", "body": "Remaining work for this is tracked in #150", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 269731374, "label": "?_group_count=country - return counts by specific column(s)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/79#issuecomment-384675792", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/79", "id": 384675792, "node_id": "MDEyOklzc3VlQ29tbWVudDM4NDY3NTc5Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-26T15:08:13Z", "updated_at": "2018-04-26T15:08:13Z", "author_association": "OWNER", "body": "Docs now live at http://datasette.readthedocs.io/\r\n\r\nI still need to document a few more parts of the API before closing this.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273569068, "label": "Add more detailed API documentation to the README"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/229#issuecomment-384512192", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/229", "id": 384512192, "node_id": "MDEyOklzc3VlQ29tbWVudDM4NDUxMjE5Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-26T04:49:46Z", "updated_at": "2018-04-26T04:49:46Z", "author_association": "OWNER", "body": "Documentation: http://datasette.readthedocs.io/en/latest/json_api.html#special-table-arguments", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 316123256, "label": "Table view should support ?_size=400 parameter"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/239#issuecomment-384503873", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/239", "id": 384503873, "node_id": "MDEyOklzc3VlQ29tbWVudDM4NDUwMzg3Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-26T03:45:11Z", "updated_at": "2018-04-26T03:45:11Z", "author_association": "OWNER", "body": "Documentation: http://datasette.readthedocs.io/en/latest/metadata.html#hiding-tables", "reactions": "{\"total_count\": 0, 
\"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 317760361, "label": "Support for hidden tables in metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/239#issuecomment-384500327", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/239", "id": 384500327, "node_id": "MDEyOklzc3VlQ29tbWVudDM4NDUwMDMyNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-26T03:18:12Z", "updated_at": "2018-04-26T03:18:20Z", "author_association": "OWNER", "body": "```\r\n{\r\n \"databases\": {\r\n \"database1\": {\r\n \"tables\": {\r\n \"example_table\": {\r\n \"hidden\": true\r\n }\r\n }\r\n }\r\n }\r\n}\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 317760361, "label": "Support for hidden tables in metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/238#issuecomment-384362028", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/238", "id": 384362028, "node_id": "MDEyOklzc3VlQ29tbWVudDM4NDM2MjAyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-25T17:07:11Z", "updated_at": "2018-04-25T17:07:11Z", "author_association": "OWNER", "body": "On further thought: this is actually only an issue for immutable deployments to platforms like Zeit Now and Heroku.\r\n\r\nAs such, adding it to `datasette serve` feels clumsy. Maybe `datasette publish` should instead gain the ability to optionally install an extra mechanism that periodically pulls a fresh copy of `metadata.json` from a URL.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 317714268, "label": "External metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/235#issuecomment-383764533", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/235", "id": 383764533, "node_id": "MDEyOklzc3VlQ29tbWVudDM4Mzc2NDUzMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-24T00:30:02Z", "updated_at": "2018-04-24T00:30:02Z", "author_association": "OWNER", "body": "The `resource` module in he standard library has the ability to set limits on memory usage for the current process: https://pymotw.com/2/resource/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 316621102, "label": "Add limit on the size in KB of data returned from a single query"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/235#issuecomment-383727973", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/235", "id": 383727973, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MzcyNzk3Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-23T21:23:59Z", "updated_at": "2018-04-23T21:23:59Z", "author_association": "OWNER", "body": "There might also be something clever we can do here with PRAGMA statements: https://stackoverflow.com/questions/14146881/limit-the-maximum-amount-of-memory-sqlite3-uses\r\n\r\nAnd https://www.sqlite.org/pragma.html", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, 
\"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 316621102, "label": "Add limit on the size in KB of data returned from a single query"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/231#issuecomment-383315348", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/231", "id": 383315348, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MzMxNTM0OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-21T17:37:50Z", "updated_at": "2018-04-22T23:06:04Z", "author_association": "OWNER", "body": "I could also have an `\"autodetect\": false` option for that plugin to turn off autodetecting entirely.\r\n\r\nWould be useful if the plugin didn't append its JavaScript in pages that it wasn't used for - that might require making the `extra_js_urls()` hook optionally aware of the columns and table and metadata.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 316323336, "label": "metadata.json support for plugin configuration options"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/234#issuecomment-383410146", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/234", "id": 383410146, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MzQxMDE0Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-22T20:32:30Z", "updated_at": "2018-04-22T20:47:02Z", "author_association": "OWNER", "body": "I built this wrong: my implementation is looking for the `label_column` on the table-being-displayed, but it should be looking for it on the table-the-foreign-key-links-to.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 316526433, "label": "label_column option in metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/234#issuecomment-383399762", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/234", "id": 383399762, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MzM5OTc2Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-22T17:54:39Z", "updated_at": "2018-04-22T17:54:39Z", "author_association": "OWNER", "body": "Docs here: http://datasette.readthedocs.io/en/latest/metadata.html#specifying-the-label-column-for-a-table", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 316526433, "label": "label_column option in metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/234#issuecomment-383398182", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/234", "id": 383398182, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MzM5ODE4Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-22T17:31:12Z", "updated_at": "2018-04-22T17:31:12Z", "author_association": "OWNER", "body": "```{\r\n \"databases\": {\r\n \"database1\": {\r\n \"tables\": {\r\n \"example_table\": {\r\n \"label_column\": \"name\"\r\n }\r\n }\r\n }\r\n }\r\n}\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 316526433, "label": "label_column option in metadata.json"}, "performed_via_github_app": null} 
{"html_url": "https://github.com/simonw/datasette/pull/232#issuecomment-383252624", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/232", "id": 383252624, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MzI1MjYyNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-21T00:19:00Z", "updated_at": "2018-04-21T00:19:00Z", "author_association": "OWNER", "body": "Thanks!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 316365426, "label": "Fix a typo"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/14#issuecomment-383140111", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/14", "id": 383140111, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MzE0MDExMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-20T15:52:33Z", "updated_at": "2018-04-20T15:52:33Z", "author_association": "OWNER", "body": "Here's a link demonstrating my new plugin: https://datasette-cluster-map-demo.now.sh/polar-bears-455fe3a/USGS_WC_eartags_output_files_2009-2011-Status", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 267707940, "label": "Datasette Plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/14#issuecomment-383139889", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/14", "id": 383139889, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MzEzOTg4OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-20T15:51:47Z", "updated_at": "2018-04-20T15:51:47Z", "author_association": "OWNER", "body": "I released everything we have so far in [Datasette 0.20](https://github.com/simonw/datasette/releases/tag/0.20) and built and released an example plugin, [datasette-cluster-map](https://pypi.org/project/datasette-cluster-map/). 
Here's my blog entry about it: https://simonwillison.net/2018/Apr/20/datasette-plugins/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 267707940, "label": "Datasette Plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/230#issuecomment-383109984", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/230", "id": 383109984, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MzEwOTk4NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-20T14:15:39Z", "updated_at": "2018-04-20T14:15:39Z", "author_association": "OWNER", "body": "Refs #229", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 316128955, "label": "Setting page size AND max returned rows to 1000 doesn't seem to work"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/227#issuecomment-382967238", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/227", "id": 382967238, "node_id": "MDEyOklzc3VlQ29tbWVudDM4Mjk2NzIzOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-20T03:58:09Z", "updated_at": "2018-04-20T03:58:09Z", "author_association": "OWNER", "body": "Maybe prepare_table_data() vs prepare_table_context()", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 315960272, "label": "prepare_context() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/227#issuecomment-382966604", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/227", "id": 382966604, "node_id": "MDEyOklzc3VlQ29tbWVudDM4Mjk2NjYwNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-20T03:54:56Z", "updated_at": "2018-04-20T03:54:56Z", "author_association": "OWNER", "body": "Should this differentiate between preparing the data to be sent back as JSON and preparing the context for the template?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 315960272, "label": "prepare_context() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/227#issuecomment-382964794", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/227", "id": 382964794, "node_id": "MDEyOklzc3VlQ29tbWVudDM4Mjk2NDc5NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-20T03:45:18Z", "updated_at": "2018-04-20T03:45:18Z", "author_association": "OWNER", "body": "What if the context needs to make await calls?\r\n\r\nOne possible option: plugins can either manipulate the context in place OR they can return an awaitable. 
If they do that, the caller will await it.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 315960272, "label": "prepare_context() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/227#issuecomment-382959857", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/227", "id": 382959857, "node_id": "MDEyOklzc3VlQ29tbWVudDM4Mjk1OTg1Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-20T03:21:43Z", "updated_at": "2018-04-20T03:21:43Z", "author_association": "OWNER", "body": "Plus a generic prepare_context() hook called in the common render method.\r\n\r\nprepare_context_table(), prepare_context_row() etc\r\n\r\nArguments are context, request, self (hence can access self.ds)\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 315960272, "label": "prepare_context() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/227#issuecomment-382958693", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/227", "id": 382958693, "node_id": "MDEyOklzc3VlQ29tbWVudDM4Mjk1ODY5Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-20T03:15:52Z", "updated_at": "2018-04-20T03:15:52Z", "author_association": "OWNER", "body": "A better way to do this would be with many different plugin hooks, one for each view.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 315960272, "label": "prepare_context() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/228#issuecomment-382924910", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/228", "id": 382924910, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MjkyNDkxMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-20T00:35:48Z", "updated_at": "2018-04-20T00:35:48Z", "author_association": "OWNER", "body": "Hiding tables with the `idx_` prefix should be good enough here, since false positives aren't very harmful.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 316031566, "label": "If spatialite detected, mark idx_XXX_Geometry tables as hidden"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/227#issuecomment-382808266", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/227", "id": 382808266, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MjgwODI2Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-19T16:59:23Z", "updated_at": "2018-04-19T16:59:23Z", "author_association": "OWNER", "body": "Maybe this should have a second argument indicating which codepath was being handled. 
That way plugins could say \"only inject this extra context variable on the row page\".", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 315960272, "label": "prepare_context() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/224#issuecomment-382616527", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/224", "id": 382616527, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MjYxNjUyNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-19T05:40:28Z", "updated_at": "2018-04-19T05:40:28Z", "author_association": "OWNER", "body": "No need to use `PackageLoader` after all, we can use the same mechanism we used for the static path:\r\n\r\nhttps://github.com/simonw/datasette/blob/b55809a1e20986bb2e638b698815a77902e8708d/datasette/utils.py#L694-L695", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 315517578, "label": "Ability for plugins to bundle templates"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/223#issuecomment-382413121", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/223", "id": 382413121, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MjQxMzEyMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-18T14:47:18Z", "updated_at": "2018-04-18T14:47:18Z", "author_association": "OWNER", "body": "And tested `datasette package` - this time exercising the ability to pass more than one `--install` option:\r\n\r\n```\r\n$ datasette package sortable.db --branch=master --install requests --install datasette-plugin-demos\r\nSending build context to Docker daemon 125.4kB\r\nStep 1/7 : FROM python:3\r\n ---> 79e1dc9af1c1\r\nStep 2/7 : COPY . /app\r\n ---> 6e8e40bce378\r\nStep 3/7 : WORKDIR /app\r\nRemoving intermediate container 7cdc9ab20d09\r\n ---> f42258c2211f\r\nStep 4/7 : RUN pip install https://github.com/simonw/datasette/archive/master.zip requests datasette-plugin-demos\r\n ---> Running in a0f17cec08a4\r\nCollecting ...\r\nRemoving intermediate container a0f17cec08a4\r\n ---> beea84e73271\r\nStep 5/7 : RUN datasette inspect sortable.db --inspect-file inspect-data.json\r\n ---> Running in 4daa28792348\r\nRemoving intermediate container 4daa28792348\r\n ---> c60312d21b99\r\nStep 6/7 : EXPOSE 8001\r\n ---> Running in fa728468482d\r\nRemoving intermediate container fa728468482d\r\n ---> 8f219a61fddc\r\nStep 7/7 : CMD [\"datasette\", \"serve\", \"--host\", \"0.0.0.0\", \"sortable.db\", \"--cors\", \"--port\", \"8001\", \"--inspect-file\", \"inspect-data.json\"]\r\n ---> Running in cd4eaeb2ce9e\r\nRemoving intermediate container cd4eaeb2ce9e\r\n ---> 066e257c7c44\r\nSuccessfully built 066e257c7c44\r\n(venv) datasette $ docker run -p 8081:8001 066e257c7c44\r\nServe! 
files=('sortable.db',) on port 8001\r\n[2018-04-18 14:40:18 +0000] [1] [INFO] Goin' Fast @ http://0.0.0.0:8001\r\n[2018-04-18 14:40:18 +0000] [1] [INFO] Starting worker [1]\r\n[2018-04-18 14:46:01 +0000] - (sanic.access)[INFO][1:7]: GET http://localhost:8081/-/static-plugins/datasette_plugin_demos/plugin.js 200 16\r\n``` ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 315327860, "label": "datasette publish --install=name-of-plugin"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/223#issuecomment-382409989", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/223", "id": 382409989, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MjQwOTk4OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-18T14:38:08Z", "updated_at": "2018-04-18T14:38:08Z", "author_association": "OWNER", "body": "Tested on Heroku as well.\r\n\r\n datasette publish heroku sortable.db --install datasette-plugin-demos --branch=master\r\n\r\nhttps://morning-tor-45944.herokuapp.com/-/static-plugins/datasette_plugin_demos/plugin.js\r\n\r\nhttps://morning-tor-45944.herokuapp.com/sortable-4bbaa6f?sql=select+random_integer%280%2C+10%29", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 315327860, "label": "datasette publish --install=name-of-plugin"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/223#issuecomment-382408128", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/223", "id": 382408128, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MjQwODEyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-18T14:33:09Z", "updated_at": "2018-04-18T14:33:09Z", "author_association": "OWNER", "body": "Demo:\r\n\r\n datasette publish now sortable.db --install datasette-plugin-demos --branch=master\r\n\r\nProduced this deployment, with both the `random_integer()` function and the static file from https://github.com/simonw/datasette-plugin-demos/tree/0.2\r\n\r\nhttps://datasette-issue-223.now.sh/-/static-plugins/datasette_plugin_demos/plugin.js\r\n\r\nhttps://datasette-issue-223.now.sh/sortable-4bbaa6f?sql=select+random_integer%280%2C+10%29\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 315327860, "label": "datasette publish --install=name-of-plugin"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/14#issuecomment-382256729", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/14", "id": 382256729, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MjI1NjcyOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-04-18T04:29:29Z", "updated_at": "2018-04-18T04:30:14Z", "author_association": "OWNER", "body": "I added a mechanism for plugins to serve static files and define custom CSS and JS URLs in #214 - see new documentation on http://datasette.readthedocs.io/en/latest/plugins.html#static-assets and http://datasette.readthedocs.io/en/latest/plugins.html#extra-css-urls", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 267707940, "label": "Datasette Plugins"}, 
"performed_via_github_app": null}