rowid,message,commits_fts,rank
1,Added pinboard-to-sqlite,14047,
2,Added goodreads-to-sqlite,14047,
3,List of all current Dogsheep tools,14047,
4,Initial commit,14047,
5,First release,14047,
6,"Just pull for dogsheep repos + sqlite-utils and datasette I accidentally started pulling everything from the dependent repos as well. Commit messages with REFRESH_DB in now trigger a rebuild from scratch.",14047,
7,--install=datasette-json-html,14047,
8,"Fetch previous copy of database Also installed missing bs4 dependency",14047,
9,"Scrape dependents demo, refs #34",14047,
10,"Release 2.1, refs #34",14047,
11,"Install test dependencies, refs #34",14047,
12,"New scrape-dependents command, refs #34",14047,
13,Default milestones facets are now repo and state,14047,
14,"Release 2.0 Backwards incompatible schema change, refs #31",14047,
15,"Removed debug output, refs #32",14047,
16,"Added missing issue-comments.json, refs #32",14047,
17,"Debug list of files in tests, for #32",14047,
18,"Fix for issue_comments bug #32 Refs #31",14047,
19,"Fix for creator foreign key on milestones, refs #31",14047,
20,"Foreign keys for milestones table, refs #31",14047,
21,"Use foreign key to repos table on issues and milestones, refs #31",14047,
22,"milestones now has FK to creator, plus repo column - closes #29",14047,
23,"Ensure issues.milestone/assignee are integers, closes #30",14047,
24,Ignore *.json and *.db and .DS_Store,14047,
25,Release 1.1,14047,
26,Note that demo includes datasette and sqlite-utils now,14047,
27,Demo also pulls datasette and sqlite-utils,14047,
28,"Handle 204 No Content from GitHub API, refs #28",14047,
29,"New contributors command, refs #28",14047,
30,"Use INTEGER for organization column, fixes #27",14047,
31,"Updated foreign keys test, refs #27",14047,
32,"Extract organizaion to users table, refs #27",14047,
33,Add datasette-pretty-json to demo,14047,
34,"Release 1.0.1 With bug fix for #26",14047,
35,"Send topic Accept header in fetch_repo() too, closes #26",14047,
36,"Also pull issue comments, refs #25",14047,
37,--install=py-gfm,14047,
38,--service github-to-sqlite,14047,
39,"Configure demo with demo-metadata.json This includes datasette-render-markdown",14047,
40,"Release 1.0, refs #23",14047,
41,"Link to demo from README, refs #13 and #23",14047,
42,"datasette-search-all plugin, refs #13",14047,
43,Install datasette (for datasette publish) - refs #13,14047,
44,"Install sqlite3 in action, refs #13",14047,
45,Redact email addresses before publishing,14047,
46,"Explicit title/description columns on milestone, refs #13",14047,
47,"Handle repos with no commits, closes #22 Refs #21",14047,
48,"Raise GitHub API errors as exceptions, refs #21",14047,
49,"Fixed bad bash syntax, refs #13",14047,
50,"Expose GITHUB_ACCESS_TOKEN env variable, refs #13",14047,
51,"Cat auth.json - revoke token immediately after this run, refs #13",14047,
52,"Cat auth.json - revoke token immediately after this run, refs #13",14047,
53,"Removed some debugging, refs #13",14047,
54,"No need for explicit auth.json if I get the key right, refs #13",14047,
55,"Revert ""More debugging for actions, refs #13"" This reverts commit aca2823f1987fafd1dfead79a275ce3819168f2a.",14047,
56,"More debugging for actions, refs #13",14047,
57,"Debug assertion, refs #13",14047,
58,"Just run against dogsheep repos, refs #13 Otherwise the demo will leak my private simonw repos",14047,
59,"Explicit auth.json, refs #13",14047,
60,"More action debugging, refs #13",14047,
61,"Some actions debugging output, refs #13",14047,
62,"Write auth.json with plain echo, refs #13",14047,
63,"Use jq to create auth.json Refs #13",14047,
64,"Deploy demo using Actions, refs #13",14047,
65,"raw_authors plus handle null authors, closes #18",14047,
66,"Upgrade to sqlite-utils 2.x, closes #20",14047,
67,"assets in a separate table, closes #15",14047,
68,"Full-text search for more tables, closes #19",14047,
69,Release 0.7,14047,
70,"Docs for commits command, closes #17",14047,
71,"commits now only gets new commits unless --all, refs #17",14047,
72,"github-to-sqlite commits command, refs #17",14047,
73,Release 0.6,14047,
74,"--auth is now optional, closes #9",14047,
75,'github-to-sqlite repos' now accepts multiple usernames,14047,
76,Fetch repo topics using Accept header,14047,
77,"'releases' command to fetch releases, closes #11",14047,
78,Rename test file so it actually runs,14047,
79,Release 0.5,14047,
80,"issue-comments command, closes #7 Also added --issue option to issues command, for fetching one specific issue.",14047,
81,alter=True when upserting users,14047,
82,Release 0.4,14047,
83,"'github-to-sqlite repos' command, closes #3",14047,
84,Release 0.3,14047,
85,"Extract license from repos table, closes #2",14047,
86,Release 0.2,14047,
87,"github-to-sqlite starred' command, closes #1",14047,
88,Release 0.1.1,14047,
89,Removed accidental debugging code,14047,
90,Implemented 'issues' OAuthed data fetching,14047,
91,Fixed README title,14047,
92,"Initial working version * ""github-to-sqlite auth"" command saves a token to auth.json * ""github-to-sqlite issues"" command only works with --load=",14047,
93,Release 0.2,14047,
94,Order of files in zip does not matter,14047,
95,Implemented location-history import command,14047,
96,Adjust sort order in test,14047,
97,Use pytest -vv,14047,
98,Initial working version,14047,
99,clarification in README,14047,
100,"Release 0.3.1 Updated README for PyPI",14047,
101,"Added ""Browsing your data with Datasette"" section",14047,
102,Release 0.3,14047,
103,"Configure full-text search, closes #1",14047,
104,"Compatibility with sqlite-utils 1.x, release 0.2a",14047,
105,Typo,14047,
106,Applied Black,14047,
107,Added requests-mock test dependency,14047,
108,user and trees commands,14047,
109,Release 0.5,14047,
110,Bump to sqlite-utils 2.4.4,14047,
111,Release 0.4,14047,
112,"Fixed workout points import for iOS 13 Workout lat/lon tracks are now stored in .gpx files. Closes #10",14047,
113,"sqlite-utils~=1.12.1 To get the bugfix for insert_all([])",14047,
114,Release 0.3.2,14047,
115,Fix for #9 - too many SQL variables,14047,
116,Release 0.3.1,14047,
117,"Use less RAM (#8) * Call el.clear() for each element * Clear root element each time Memory profile graphs here: https://github.com/dogsheep/healthkit-to-sqlite/issues/7",14047,
118,Release 0.3,14047,
119,Documentation for progress bar / --silent,14047,
120,"Break records out into separate tables, closes #6",14047,
121,"Better progress bar label, refs #5",14047,
122,"Added progress bar, --xml and --silent options --xml lets you pass path to an XML file - I used this to add some unit tests for the CLI itself. --silent means ""don't show a progress bar"" Closes #5",14047,
123,Release 0.2,14047,
124,"Use hash_id in less places I was getting import errors when duplicate hash_id was generated.",14047,
125,Fixed URLs in README,14047,
126,"Import records, closes #4",14047,
127,"Import workouts, closes #2",14047,
128,"Export activity summaries, closes #3",14047,
129,Usage instructions,14047,
130,"Added export.xml test fixture Also utils.find_all_tags() utility function",14047,
131,Initial framework,14047,
132,Release 0.2,14047,
133,Fixed bug with conservation_status column,14047,
134,Updated to sqlite-utils 2.x,14047,
135,Preparing release 0.1a,14047,
136,"Added observations_with_photos view Optimized for use with datasette-json-html",14047,
137,Guess medium_url from /square.jpg regular url,14047,
138,Removed obsolete import,14047,
139,"Removed obsolete code, applied black",14047,
140,First working version,14047,
141,Changelog badge,14047,
142,Added the --dry-run option,14047,
143,"Upload photos in a thread pool, closes #11",14047,
144,"Use thread pool for hashing This speeds it up a ton. Closes #10",14047,
145,Release 0.2a,14047,
146,"Only upload photos not already in S3, refs #9",14047,
147,Release 0.1a,14047,
148,"Add progress bar to upload command, closes #6",14047,
149,"Ask for bucket in s3-auth, refs #5",14047,
150,"photos-to-sqlite upload photos.db dirname command, closes #4",14047,
151,Ignore my auth.json,14047,
152,"Test for s3-auth command Also renamed JSON keys to have photos_s3 prefix - refs #5",14047,
153,"Initial checkn + s3-auth command, closes #5",14047,
154,Release 0.2,14047,
155,"Documentation for fetch, --all and --silent Closes #2",14047,
156,Fixed --silent option,14047,
157,"Pagination, progress bar and --since support Closes #1 Refs #2 - still needs README update Also upgraded to sqlite-utils 2.x",14047,
158,"Use pocket_ prefix in auth.json, closes #4",14047,
159,Fixed link to Pocket,14047,
160,Implemented 'fetch' command,14047,
161,Initial save_items() utility plus tests,14047,
162,Initial setup plus 'pocket-to-sqlite auth' command,14047,
163,Ran Black,14047,
164,Release 0.3.1,14047,
165,"Don't break if source is missing (#6) This broke for very old checkins from 2010 with no source set. Thanks, @mfa!",14047,
166,Release 0.3,14047,
167,"Upgraded to sqlite-utils 2.x, closes #7",14047,
168,Release 0.2,14047,
169,Applied black,14047,
170,"Added --since option, closes #3",14047,
171,"Link to your-foursquare-oauth-token tool, close #4",14047,
172,"Treat Foursquare timestamps as UTC, closes #5",14047,
173,Run pytest with -vv in Circle CI,14047,
174,"Implemented --save option, closes #2 Also added usage instructions to README.",14047,
175,"Can now fetch data from Foursquare API, closes #1 Also made createdAt field the original unix timestamp and added a new created field which is the ISO formatted version.",14047,
176,Use group_concat(distinct categories.name) for venue_details view,14047,
177,Added venue_categories to venue_details view,14047,
178,"Added venue_details view Includes date of first and last checkin plus count of checkins",14047,
179,"Require at least sqlite-utils 1.10 We need it for the view introspection methods",14047,
180,Added checkin_details SQL view,14047,
181,Added support for stickers,14047,
182,Implemented events and posts,14047,
183,"Initial working version Only supports loading checkins from a JSON file on disk: swarm-to-sqlite swarm.db -f checkins.json",14047,
184,Removed rogue parenthesis,14047,
185,Added changelog badge,14047,
186,0.21.1 bugfix release,14047,
187,"Fix for since_ids bug, closes #46",14047,
188,Release 0.21,14047,
189,"Better error for non-existing user, closes #37",14047,
190,"Fix for TypeError, closes #42",14047,
191,"New command: twitter-to-sqlite lists, refs #43",14047,
192,"Handle blank tweet[source], closes #44",14047,
193,"Release 0.20.1 Refs #41",14047,
194,"Fix for None, None since_id bug, closes #41",14047,
195,Updated tests for new tables in #40,14047,
196,Release 0.20,14047,
197,"New feature: track history of various counts, closes #40",14047,
198,"Improved --since= for searc, refs #39 Also fixed bug from the sqlite-utils 2.x upgrade caused by checking db[table].exists instead of db[table].exists()",14047,
199,"Improved --since= for home-timeline, refs #39",14047,
200,"Improved --since= for mentions-timeline, refs #39",14047,
201,"Improved --since= for user-timeline, refs #39",14047,
202,"Create tables for --since tracking, refs #39",14047,
203,Release 0.19,14047,
204,"followers/friends --sql/--attach options, closes #36",14047,
205,"Better progress bar formatting for user-timeline Closes #38",14047,
206,Release 0.18,14047,
207,"Docs for new user-timeline options, closes #35",14047,
208,Fixed bug in #35,14047,
209,"user-timeline now takes --sql/--attach/--ids and multiple identifiers Refs #35, refs #8. Still needs documentation.",14047,
210,Release 0.17,14047,
211,Fix #34 by upgrading sqlite-utils,14047,
212,Release 0.16,14047,
213,"Documentation for friends, closes #31",14047,
214,"Documented favorites, closes #32",14047,
215,"friends command, refs #31",14047,
216,Release 0.15,14047,
217,"Add indexes to following table, closes #28",14047,
218,"No need to save user twice, fixes #30",14047,
219,Reformatted with Black,14047,
220,"Fixed #29: import command fails on empty files By bumping sqlite-utils dependency to get this fix: https://github.com/simonw/sqlite-utils/issues/52",14047,
221,"Added example Twitter developer account application email Thanks to Jacob Kaplan-Moss: https://twitter.com/jacobian/status/1192510111719313408",14047,
222,Release 0.14,14047,
223,"Documentation for search command, closes #3",14047,
224,"Added --since and --since_id to search, refs #3",14047,
225,Release 0.13,14047,
226,"Documentation for mentions-timeline command, refs #26",14047,
227,"New mentions-timeline command, refs #26",14047,
228,Better auth.json explanation,14047,
229,Release 0.12,14047,
230,"Initial implementation of search command, refs #3",14047,
231,"Don't create index/foreign key that already exists, fixes #25",14047,
232,"Tweet source extraction and new migration system (#24) Closes #12 and #23",14047,
233,"get_profile() now saves user to DB This ensures we don't accidentally fail to create a user record for the currently authenticated user.",14047,
234,Instructions on updating favorited_by table with imported likes,14047,
235,Release 0.11.1,14047,
236,"Fix bugs running --since from scratch If tables were missing, script would throw an error.",14047,
237,Release 0.11,14047,
238,"Added --since_id and --since to user-timeline, refs #20",14047,
239,"--since and --since_id options for user-timeline, closes #19 Refs #20 Also added some initial rate limit error handling code.",14047,
240,"Removed unneccessary test file I moved this test into test_import.py in the previous commit.",14047,
241,"import command now works on files and directories, closes #22",14047,
242,Release 0.10,14047,
243,"favorites --stop_after option, refs #20",14047,
244,"Store unescaped full_text of Tweet, closes #21",14047,
245,"favorites command now populates favorited_by table, closes #14",14047,
246,Use archive_ in README,14047,
247,"Archive tables use _ not - Tables with hyphens in the name are harder to query because you have to remember to [escape-them].",14047,
248,"home-timeline command, closes #18",14047,
249,"twitter-to-sqlite import recreates archive- tables, closes #17",14047,
250,Release 0.8,14047,
251,"twitter-to-sqlite import command, closes #4",14047,
252,Release 0.7,14047,
253,"statuses-lookup command, closes #13",14047,
254,README tweaks,14047,
255,Release 0.6,14047,
256,"Documentation for follow/track commands, closes #11",14047,
257,"follow command now takes screen names, supports --sql and --ids refs #11",14047,
258,"Experimental follow/track commands, refs #11",14047,
259,Slightly better error handling,14047,
260,"Docs for --sql and --attach, refs #8",14047,
261,Fixed copy,14047,
262,Release 0.5,14047,
263,Fixed incorrect header in README,14047,
264,"followers-ids and friends-ids subcommands Closes #9",14047,
265,Added missing --ids in README,14047,
266,Release 0.4,14047,
267,Added list-members subcommand,14047,
268,"--attach and --sql for users-lookup, refs #8",14047,
269,"users-lookup command, closes #7",14047,
270,New --stop_after option for user-timeline,14047,
271,Release 0.2,14047,
272,"Extract media to separate table, closes #6",14047,
273,Extract places into separate table,14047,
274,Release 0.2,14047,
275,"Note max 3,200 tweets for other people's accounts",14047,
276,Test for new FTS tables,14047,
277,"Fix for bug where tweets were not saved This is a messy fix, need to dig in more",14047,
278,Enable FTS on tweets full_text,14047,
279,Release 0.1,14047,
280,user-timeline documentation,14047,
281,"Save followers in following m2m table, closes #1",14047,
282,Removed dead code,14047,
283,Added python-dateutil dependency,14047,
284,Circle CI now runs pytest,14047,
285,"Implemented favorites and user-timeline commands Plus tests",14047,
286,"Added twitter-to-sqlite fetch URL command Useful development tool - makes it easy to make authenticated API requests on the command-line. Also started the Design notes docs",14047,
287,Better heading,14047,
288,Added help text for --auth option,14047,
289,twitter-to-sqlite followers --auth option,14047,
290,twitter-to-sqlite auth -a my-auth.json option,14047,
291,Added note about followers command being rate limited,14047,
292,Break loop after last page,14047,
293,Documentation for auth and followers commands,14047,
294,Fixed regex,14047,
295,Deploy releases tagged with alpha or beta,14047,
296,"Release 0.1a Mainly doing this to reserve twitter-to-sqlite on PyPI",14047,
297,twitter-to-sqlite auth and followers commands,14047,
298,"Re-arranged full-text search docs Also documented ?_searchmode=raw - closes #748",14047,
299,Documentation for #747,14047,
300,"Directory configuration mode supports metadata.yaml, closes #747",14047,
301,Changelog badge,14047,
302,"Remove 'Serve!' line from serve CLI output It wasn't adding anything, and it was confusing when run in conjunction with the new config directory mode from #731",14047,
303,"403 for static directory listing, closes #740",14047,
304,"Configuration directory mode, closes #731",14047,
305,"Make request available when rendering custom pages, closes #738",14047,
306,"Mechanism for creating custom pages using templates Closes #648",14047,
307,Added more example plugins,14047,
308,Fixed a couple of spelling errors,14047,
309,Release Datasette 0.40,14047,
310,"Extra body CSS class for canned queries, closes #727",14047,
311,"Smarter merging of metadata and extra_metadata, closes #724",14047,
312,"Fixed bug with Templates considered comment, closes #689",14047,
313,"Expose extra_template_vars in _contex=1, refs #693",14047,
314,"Fix for missing view_name bug, closes #716",14047,
315,"Removed Zeit Now v1 support, closes #710",14047,
316,"Refactored .custom_sql() method to new QueryView class Refs #698",14047,
317,"dedent SQL for neighborhood_search fixture Makes this page a little prettier: https://latest.datasette.io/fixtures/neighborhood_search",14047,
318,--metadata accepts YAML as well as JSON - closes #713,14047,
319,"Refactor template setup into Datasette constructor Closes #707",14047,
320,"Run base_url tests against /fixtures/facetable too, refs #712",14047,
321,Fixed RST bug,14047,
322,Release 0.39,14047,
323,"Fixed typo in GitHub Action configuration, refs #705",14047,
324,"Deploy latest.datasett.io to Cloud Run, refs #705",14047,
325,"Removed deploy to Zeit Now, refs #705",14047,
326,"base_url configuration setting, closes #394 * base_url configuration setting * base_url works for static assets as well",14047,
327,"Fix for input type=search Webkit styling, closes #701",14047,
328,"Removed documentation for Zeit Now v1, refs #710",14047,
329,"Added datasette-publish-fly plugin to docs, closes #704",14047,
330,"Added example plugins to plugin hooks docs, closes #709",14047,
331,"Fix bug with over-riding default sort, closes #702",14047,
332,"""sort"" and ""sort_desc"" metadata properties, closes #702",14047,
333,Bump to click 7.1.1 to fix flaky tests,14047,
334,Updated documentation formatting,14047,
335,"Show sort arrow on primary key by default Closes #677. Refs #702.",14047,
336,Updated publish_subcommand example,14047,
337,"await Request(scope, receive).post_vars() method, closes #700 Needed for #698",14047,
338,Release 0.38,14047,
339,"Do not look for templates_path in default plugins Closes #697",14047,
340,Link to Datasette Writes blog entry,14047,
341,"Upgrade Dockerfile to SQLite 3.31.1, closes #695",14047,
342,"Fixes for new --memory option, refs #694",14047,
343,"--memory option for publish cloudrun, refs #694",14047,
344,Changelog for 0.37.1,14047,
345,Print exceptions if they occur in the write thread,14047,
346,"Handle scope path if it is a string I ran into this while running a unit test with httpx.AsyncClient",14047,
347,RST fix,14047,
348,Improved extra_template_vars documentation,14047,
349,"Don't count rows on homepage for DBs > 100MB (#688) Closes #649.",14047,
350,Documentation fix,14047,
351,Fixed typo,14047,
352,News and release notes for 0.37,14047,
353,Fixed incorrect target name,14047,
354,"Use inspect-file, if possible, for total row count (#666) For large tables, counting the number of rows in the table can take a significant amount of time. Instead, where an inspect-file is provided for an immutable database, look up the row-count for a plain count(*). Thanks, @kevindkeogh",14047,
355,?_searchmode=raw option (#686),14047,
356,".execute_write() and .execute_write_fn() methods on Database (#683) Closes #682.",14047,
357,"Only --reload on changes to immutable databases, closes #494",14047,
358,Updated README news for 0.36,14047,
359,"Release notes for 0.36, refs #679",14047,
360,"Added shapefile-to-sqlite, datasette-mask-columns, datasette-auth-existing-cookies, datasette-auth-existing-cookies Refs #679",14047,
361,"Fix db-to-sqlite command in ecosystem doc page (#669) Thanks, @adipasquale",14047,
362,"Better tests for prepare_connection() plugin hook, refs #678",14047,
363,"prepare_connection() now takes datasette and database args, refs #678",14047,
364,"Refactored run_sanity_checks to check_connection(conn), refs #674",14047,
365,Replaced self.ds.execute with db.execute in more places,14047,
366,"Docs for .render_template(), refs #577 Also improved parameter documentation for other methods, refs #576",14047,
367,".add_database() and .remove_database() methods, refs #671 Also made a start on the Datasette class documentation, refs #576",14047,
368,"Run black against everything, not just tests and datasette dirs",14047,
369,"Apply Black, update copyright to be 2017-2020",14047,
370,"More reliable tie-break ordering for facet results I was seeing a weird bug where the order of results running tests on my laptop was inconsistent, causing pytest failures even though the order of tests in Travis CI was fine. I think the fix is to explicitly state how facet ordering ties on the count should be resolved.",14047,
371,Reformatted with black,14047,
372,Updated release notes with #653,14047,
373,"Allow leading comments in SQL input field (#653) Thanks, @jaywgraves!",14047,
374,Release notes for 0.35,14047,
375,Added a bunch more plugins to the Ecosystem page,14047,
376,"Datasette.render_template() method, closes #577 Pull request #664.",14047,
377,geojson-to-sqlite,14047,
378,"Release notes for Datasette 0.34, plus news updates",14047,
379,"--port argument for datasette package, plus tests - closes #661 From pull request #663",14047,
380,"gcloud run is now GA, s/beta// (#660) Thanks, @glasnt",14047,
381,"_search= queries now correctly escaped, fixes #651 Queries with reserved words or characters according to the SQLite FTS5 query language could cause errors. Queries are now escaped like so: dog cat => ""dog"" ""cat""",14047,
382,Release 0.33,14047,
383,Link to JSK Medium post from news,14047,
384,"Added template_debug setting, closes #654",14047,
385,Documentation for --port=0,14047,
386,Apply black,14047,
387,Bump to uvicorn 0.11,14047,
388,Better handling of corrupted database files,14047,
389,Include asyncio task information in /-/threads debug page,14047,
390,Added Niche Museums to News,14047,
391,Examples of things you can do with plugins,14047,
392,Added datasette-haversine to plugins list,14047,
393,"Better documentation for --static, closes #641 https://datasette.readthedocs.io/en/stable/custom_templates.html#serving-static-files",14047,
394,index view is also important for plugin hooks,14047,
395,"Display 0 results, closes #637",14047,
396,"Suggest column facet only if at least one count > 1 Fixes #638",14047,
397,How to upgrade using Docker,14047,
398,"Include rowid in filter select, closes #636",14047,
399,"Move .execute() from Datasette to Database Refs #569 - I split this change out from #579",14047,
400,Datasette 0.32 and datasette-template-sql in news,14047,
401,Release notes for 0.32,14047, 402,"Render templates using Jinja async mode Closes #628",14047, 403,Release notes for 0.31.2,14047, 404,"Fix ""publish heroku"" + upgrade to use Python 3.8.0 Closes #633. Closes #632.",14047, 405,"Fix for datasette publish with just --source_url (#631) Closes #572",14047, 406,Badge linking to datasette on hub.docker.com,14047, 407,Fixed typo in release notes,14047, 408,Release notes for 0.31.1,14047, 409,"datasette publish uses python:3.8 base Docker image, closes #629",14047, 410,ReST fix,14047, 411,Final steps: build stable branch of Read The Docs,14047, 412,"Removed code that conditionally installs black Since we no longer support Python 3.5 we don't need this any more.",14047, 413,Datasette 0.31 in news section,14047, 414,Release notes for 0.31,14047, 415,"Support Python 3.8, stop supporting Python 3.5 (#627) * Upgrade to uvicorn 0.10.4 * Drop support for Python 3.5 * Bump all dependencies to latest releases * Update docs to reflect we no longer support 3.5 * Removed code that skipped black unit test on 3.5 Closes #622",14047, 416,"Include uvicorn version in /-/versions, refs #622",14047, 417,"Bump pint to 0.9 (#624) This fixes 2 deprecation warnings in Python 3.8 - refs #623 #622",14047, 418,"Test against Python 3.8 in Travis (#623) * Test against Python 3.8 in Travis * Avoid current_task warnings in Python 3.8",14047, 419,"CREATE INDEX statements on table page, closes #618",14047, 420,"datasette-csvs on Glitch now uses sqlite-utils It previously used csvs-to-sqlite but that had heavy dependencies. 
See https://support.glitch.com/t/can-you-upgrade-python-to-latest-version/7980/33",14047, 421,"Improved documentation for ""publish cloudrun""",14047, 422,"Improved UI for publish cloudrun, closes #608",14047, 423,Removed unused special_args_lists variable,14047, 424,"Removed _group_count=col feature, closes #504",14047, 425,"Handle spaces in DB names (#590) Closes #503 - thanks, @rixx",14047, 426,"Use select colnames, not select * for table view - refs #615",14047, 427,"pk__notin= filter, closes #614",14047, 428,"Offer to format readonly SQL (#602) Following discussion in #601, this PR adds a ""Format SQL"" button to read-only SQL (if the SQL actually differs from the formatting result). It also removes a console error on readonly SQL queries. Thanks, @rixx!",14047, 429,"Fix CSV export for nullable foreign keys, closes #612",14047, 430,Release notes for 0.30.2,14047, 431,"Don't show 'None' as label for nullable foreign key, closes #406",14047, 432,"Plugin static assets support both hyphens and underscores in names Closes #611",14047, 433,"Better documentation of --host, closes #574",14047, 434,"Don't suggest array facet if column is only [], closes #610",14047, 435,Only inspect first 100 records for #562,14047, 436,Only suggest array facet for arrays of strings - closes #562,14047, 437,"Use distinfo.project_name for plugin name if available, closes #606",14047, 438,Fixed dumb error,14047, 439,Release 0.30.1,14047, 440,"Persist _where= in hidden fields, closes #604",14047, 441,Update to latest black (#609),14047, 442,"Always pop as_format off args dict (#603) Closes #563. Thanks, @chris48s",14047, 443,Update news in README,14047, 444,Release 0.30,14047, 445,"Don't auto-format SQL on page load (#601) Closes #600",14047, 446,"Fix for /foo v.s. 
/foo-bar issue, closes #597 Pull request #599",14047, 447,"Fixed bug returning non-ascii characters in CSV, closes #584",14047, 448,"Use --platform=managed for publish cloudrun, closes #587",14047, 449,Add Python versions badge,14047, 450,"Display metadata footer on custom SQL queries (#589) Closes #408 - thanks, @rixx!",14047, 451,"Sort databases on homepage by argument order - #591 Closes #585 - thanks, @rixx!",14047, 452,"Button to format SQL, closes #136 SQL code will be formatted on page load, and can additionally be formatted by clicking the ""Format SQL"" button. Thanks, @rixx!",14047, 453,Allow EXPLAIN WITH... - closes #583,14047, 454,Added /-/threads debugging page,14047, 455,Changelog for 0.29.3 release,14047, 456,"detect_fts now works with alternative table escaping (#571) Fixes #570. See also https://github.com/simonw/sqlite-utils/pull/57",14047, 457,Refactored connection logic to database.connect(),14047, 458,"Fix numerous typos (#561) Thanks, @minho42!",14047, 459,"Fixed CodeMirror on database page, closes #560",14047, 460,Release 0.9.2,14047, 461,"Fix plus test for unicode characters in custom query name, closes #558",14047, 462,Fixed breadcrumbs on custom query page,14047, 463,News: Single sign-on against GitHub using ASGI middleware,14047, 464,"Bump to uvicorn 0.8.4 (#559) https://github.com/encode/uvicorn/commits/0.8.4 Query strings will now be included in log files: https://github.com/encode/uvicorn/pull/384",14047, 465,Updated release notes,14047, 466,Release 0.29.1,14047, 467,Removed unused variable,14047, 468,"Fix static mounts using relative paths and prevent traversal exploits (#554) Thanks, @abdusco! 
Closes #555",14047, 469,"Add support for running datasette as a module (#556) python -m datasette Thanks, @abdusco",14047, 470,"News: Datasette 0.29, datasette-auth-github, datasette-cors",14047, 471,Changelog for 0.29 release,14047, 472,"--plugin-secret option for datasette publish Closes #543 Also added new --show-files option to publish now and publish cloudrun - handy for debugging.",14047, 473,"Added datasette-auth-github and datasette-cors plugins to Ecosystem Closes #548",14047, 474,"Removed facet-by-m2m from docs, refs #550 Will bring this back in #551",14047, 475,"Removed ManyToManyFacet for the moment, closes #550",14047, 476,"Updated custom facet docs, closes #482",14047, 477,"Fix nav display on 500 page, closes #545",14047, 478,"white-space: pre-wrap for table SQL, closes #505",14047, 479,"min-height on .hd Now it should be the same size on the homepage as it is on pages with breadcrumbs",14047, 480,"Split pypi and docker travis tasks (#480) Thanks @glasnt!",14047, 481,"extra_template_vars plugin hook (#542) * extra_template_vars plugin hook Closes #541 * Workaround for cwd bug Based on https://github.com/pytest-dev/pytest/issues/1235#issuecomment-175295691",14047, 482,"Refactor templates for better top nav customization, refs #540",14047, 483,Better robustness in face of missing raw_path,14047, 484,Black,14047, 485,"Fix for accidentally leaking secrets in /-/metadata, closes #538",14047, 486,"Secret plugin configuration options (#539) Closes #538",14047, 487,"Switch to ~= dependencies, closes #532 (#536) * Switch to ~= dependencies, closes #532 * Bump click and click-default-group * imp. 
is deprecated, use types.ModuleType instead - thanks https://stackoverflow.com/a/32175781 * Upgrade to pytest 5",14047, 488,"Added asgi_wrapper plugin hook, closes #520",14047, 489,"Updated custom template docs, refs #521",14047, 490,"Unit test for _table custom template, refs #521",14047, 491,"Rename _rows_and_columns.html to _table.html, refs #521",14047, 492,"Default to raw value, use Row.display(key) for display, refs #521",14047, 493,"New experimental Row() for templates, refs #521",14047, 494,Typo,14047, 495,pip install -e .[docs] for docs dependencies,14047, 496,"Better coverage of sqlite-utils in FTS docs, closes #525",14047, 497,"Porting Datasette to ASGI, and Turtles all the way down",14047, 498,Added datasette-doublemetaphone to list of plugins,14047, 499,"Install test dependencies so deploy can work python tests/fixtures.py needs asgiref or it fails with an error",14047, 500,"Port Datasette from Sanic to ASGI + Uvicorn (#518) Datasette now uses ASGI internally, and no longer depends on Sanic. It now uses Uvicorn as the underlying HTTP server. This was thirteen months in the making... for full details see the issue: https://github.com/simonw/datasette/issues/272 And for a full sequence of commits plus commentary, see the pull request: https://github.com/simonw/datasette/pull/518",14047, 501,"Revert ""New encode/decode_path_component functions"" Refs #272 This reverts commit 9fdb47ca952b93b7b60adddb965ea6642b1ff523. 
Now that ASGI supports raw_path we don't need our own encoding scheme!",14047, 502,"Refactored view class hierarchy, refs #272 See https://github.com/simonw/datasette/issues/272#issuecomment-502393107",14047, 503,Fix typo in install step: should be install -e (#500),14047, 504, Added datasette-render-binary plugin to ecosystem,14047, 505,Added datasette-bplist plugin to ecosystem,14047, 506,"Upgrade pytest to 4.6.1, pluggy to 0.12.0 (#497)",14047, 507,Added datasette-jq plugin to ecosystem,14047, 508,Tidy up with Black,14047, 509,"Fix pagination when sorted by expanded foreign key Closes #489",14047, 510,Removed obsolete __init__ method,14047, 511,Fixed duplicate function name,14047, 512,"Start of unit tests for Database class, refs #485",14047, 513,"Rename InterruptedError => QueryInterrupted, closes #490",14047, 514,"Database.get_outbound_foreign_keys() refactor Following this, the only module that ever makes calls to the low-level execute_against_connection_in_thread() method is datasette/database.py",14047, 515,"Databse.primary_keys(table) / fts_table(table) refactor, closes #488 Also cleaned up some unused imports spotted by the linter.",14047, 516,Typo,14047, 517,"Refactored ConnectedDatabase to datasette/database.py Closes #487",14047, 518,"Refactor Datasette methods to ConnectedDatabase Refs #487",14047, 519,Sort keys to past tests in Python 3.5,14047, 520,"Don't use -v with pytest in Travis It seems to slow things down more than I expected.",14047, 521,"Better label detection, refs #485 This needs unit tests.",14047, 522,"Facet by many-to-many, closes #365",14047, 523,Travis now uses pytest -v,14047, 524,"Added ?_through= table argument, closes #355 Also added much more interesting many-to-many fixtures - roadside attractions!",14047, 525,Stack Overflow survey added to news,14047, 526,Link to blog post about 0.28,14047, 527,"Facet by date, closes #481",14047, 528,Updated tests for date(...) 
lookup,14047, 529,Fix ?col__date= for columns with spaces,14047, 530,Doc typo fix (#479),14047, 531,Typo fix,14047, 532,"Use -i with datasette publish, closes #469",14047, 533,New setup.py description,14047, 534,README for Datasette 0.28 release,14047, 535,"Tidy up README, reducing duplication with docs Refs #451",14047, 536,Release notes for 0.28 - closes #463,14047, 537,"Docs for facet-by-JSON-array, closes #477",14047, 538,Do not allow downloads of mutable databases - closes #474,14047, 539,"Source, license and about docs - closes #475",14047, 540,"Removed 'datasette skeleton', closes #476",14047, 541,Fixed some links,14047, 542,Replaced a straggling 'datasette publish now' reference,14047, 543,"Rename ""datasette publish now"" to ""datasette publish nowv1"" Also added an alias so ""datasette publish now"" continues to work. Closes #472",14047, 544,"New performance documentation, closes #421",14047, 545,"?_hash=1 no longer respected for mutable databases Closes #471, refs #419",14047, 546,"/-/databases sorts alphabetically Should fix test failure in Python 3.5",14047, 547,"New introspection endpoint: /-/databases - closes #470 Refs #419 and #465",14047, 548,"serve --inspect-file=X now populates cached table counts Closes #462",14047, 549,"Removed .inspect() and /-/inspect.json Refs #462 /-/inspect.json may return in some shape in #465",14047, 550,Removed accidental R,14047, 551,Another link to Glitch,14047, 552,It's 2019 now,14047, 553,Added inline contents for installation page,14047, 554,Improved introduction copy on Plugins docs page,14047, 555,Re-ordered documentation index page,14047, 556,Fixed broken link to global-power-plants demo,14047, 557,Wording tweaks,14047, 558,Docs on how to use sphinx-autobuild,14047, 559,"'Try Datasette without installing anything using Glitch' Also new 'Play with a live demo' section, both at the top of the Getting Started documentation page. 
https://datasette.readthedocs.io/en/latest/getting_started.html Closes #464",14047, 560,"publish heroku now uses Python 3.6.8 Also refactored temporary_heroku_directory out of utils.py",14047, 561,"Table counts now handles SQL Logic Error too I tried running Datasette against 22 database files at once and ran into a weird error where the table counts broke with an SQL Logic Error exception. Easy fix: catch that exception too and treat it the same as a regular Interrupted error.",14047, 562,Removed rogue print(),14047, 563,"Sometimes sort tables by number of relationships, closes #460",14047, 564,"Index page only shows row counts for smaller databases The index page now only shows row counts for immutable databases OR for databases with less than 30 tables provided it could get a count for each of those tables in less than 10ms. Closes #467, Refs #460",14047, 565,"Row count fix + sort index page databases alphabetically Sorting alphabetically should fix a test failure in Python 3.5 Refs #460",14047, 566,"Include views on homepage, fix table counts If we have less than 5 tables we now also show one or more views in the summary on the homepage. Also corrected the logic for the row counts - we now count hidden and visible tables separately. Closes #373, Refs #460",14047, 567,"Don't show hidden tables on index page, closes #455 Refs #460. Also bulked out HTML index page unit tests.",14047, 568,"Run sanity checks, not .inspect(), on startup Also fixes tests that did NOT like a call to run_until_complete in the Datasette() constructor.",14047, 569,"New run_sanity_checks mechanism, for SpatiaLite Moved VirtualSpatialIndex check into a new mechanism that should allow us to add further sanity checks in the future. To test this I've had to commit a binary sample SpatiaLite database to the repository. I included a build script for creating that database.
Closes #466",14047, 570,Fix test ordering,14047, 571,Black + fix broken test,14047, 572,"""datasette inspect foo.db"" now just calculates table counts Refs #462 * inspect command now just outputs table counts * test_inspect.py is now only tests for that CLI command * Updated some relevant documentation * Removed docs for /-/inspect since that is about to change",14047, 573,Fixed tests relating to #459,14047, 574,"Pass --token to now alias, refs #459",14047, 575,"Fixed ""datasette publish now ... --alias=x"" The --alias argument can now be passed more than once. Also updated our Travis configuration to use this. Fixes #459",14047, 576,"Finished implementation of ?_trace=1 debug tool I redesigned the JSON output and added a handy ""traceback"" key showing three relevant lines of the current traceback for each logged query. Closes #435",14047, 577,Changelog for 0.27.1,14047, 578,"Use now --target production instead of now alias Fix for this error: $ now alias --token=$NOW_TOKEN > WARN! The `now alias` command (no arguments) was deprecated in favour of `now --target production`. > Error! Couldn't find a deployment to alias. Please provide one as an argument. The command ""now alias --token=$NOW_TOKEN"" exited with 1. https://travis-ci.org/simonw/datasette/jobs/530597261",14047, 579,Fixed 500 error on homepage,14047, 580,Added some things to .gitignore,14047, 581,setup: add tests to package exclusion (#458),14047, 582,Run black and update docs for #457,14047, 583,"datasette publish cloudrun --service=x, closes #457",14047, 584,"tests/fixtures.py can now write out plugins too This command: python tests/fixtures.py \ fixtures.db \ metadata.json \ fixtures-plugins/ Will now create the fixtures.db and metadata.json files, AND create a folder called fixtures-plugins/ containing two test plugins. 
You can then run it like this: datasette fixtures.db \ -m metadata.json --plugins-dir=fixtures-plugins/",14047, 585,"New encode/decode_path_component functions ASGI cannot differentiate between / and %2F in a URL, so we need an alternative scheme for encoding the names of tables that contain special characters such as / For background, see https://github.com/django/asgiref/issues/51#issuecomment-450603464 Some examples: ""table/and/slashes"" => ""tableU+002FandU+002Fslashes"" ""~table"" => ""U+007Etable"" ""+bobcats!"" => ""U+002Bbobcats!"" ""U+007Etable"" => ""UU+002B007Etable""",14047, 586,"Promote Glitch instead of Datasette Publish Datasette Publish is currently broken due to Zeit API and platform changes.",14047, 587,Fixed crash on /:memory: page,14047, 588,"Removed pointless return variable handle_request() always returns None anyway.",14047, 589,"Respect --cors for error pages, closes #453",14047, 590,Added Code style: black badge,14047, 591,"Apply black to everything, enforce via unit tests (#449) I've run the black code formatting tool against everything: black tests datasette setup.py I also added a new unit test, in tests/test_black.py, which will fail if the code does not conform to black's exacting standards. This unit test only runs on Python 3.6 or higher, because black itself doesn't run on 3.5.",14047, 592,"Mark codemirror files as vendored (#367) This should stop GitHub from incorrectly stating that Datasette is 46% JavaScript.",14047, 593,Fixed 500 error on /-/metadata page,14047, 594,"""python3 -m pip"" is clearer (thanks @jaap3) (#368)",14047, 595,Use dist: xenial and python: 3.7 on Travis (#447),14047, 596,"Unit test for binary data display, refs #442",14047, 597,"Suppress rendering of binary data - thanks @russss (#442) Binary columns (including spatialite geographies) get shown as ugly binary strings in the HTML by default. Nobody wants to see that mess. Show the size of the column in bytes instead. 
If you want to decode the binary data, you can use a plugin to do it.",14047, 598,"Docs for 'datasette publish cloudrun', refs #434",14047, 599,"datasette publish cloudrun (#434) - thanks, @rprimet New publish subcommand that publishes using the new Google Cloud Run platform. datasette publish cloudrun database.db",14047, 600,"Pass view_name to extra_body_script hook (#443) At the moment it's not easy to tell whether the hook is being called in (for example) the row or table view, as in both cases the `database` and `table` parameters are provided. This passes the `view_name` added in #441 to the `extra_body_script` hook.",14047, 601,"Add a max-line-length setting for flake8 (#444) This stops my automatic editor linting from flagging lines which are too long. It's been lingering in my checkout for ages. 160 is an arbitrary large number - we could alter it if we have any opinions (but I find the line length limit to be my least favourite part of PEP8).",14047, 602,"Implemented ArrayFacet, closes #359",14047, 603,"Extract facet code out into a new plugin hook, closes #427 (#445) Datasette previously only supported one type of faceting: exact column value counting. With this change, faceting logic is extracted out into one or more separate classes which can implement other patterns of faceting - this is discussed in #427, but potential upcoming facet types include facet-by-date, facet-by-JSON-array, facet-by-many-2-many and more. A new plugin hook, register_facet_classes, can be used by plugins to add in additional facet classes. Each class must implement two methods: suggest(), which scans columns in the table to decide if they might be worth suggesting for faceting, and facet_results(), which executes the facet operation and returns results ready to be displayed in the UI.",14047, 604,"Entirely removed table_rows_count table property We were not displaying this anywhere, and it is now expensive to calculate. 
Refs #419, #420",14047, 605,"Show 'many rows' if count times out, refs #420",14047, 606,"Added missing file, refs #438",14047, 607,"Don't load setuptools plugins during test runs Uses pattern from https://docs.pytest.org/en/latest/example/simple.html#detect-if-running-from-within-a-pytest-run Closes #438",14047, 608,"DatabaseView no longer uses .inspect(), closes #420",14047, 609,Fixed bug where metadata.json hidden tables were ignored,14047, 610,"Index page no longer uses inspect data - refs #420 Also introduced a mechanism whereby table counts are calculated against a time limit but immutable databases have their table counts calculated on server startup.",14047, 611,Include request duration in traces,14047, 612,"New plugin hook: register_output_renderer hook (#441) Thanks @russss! * Add register_output_renderer hook This changeset refactors out the JSON renderer and then adds a hook and dispatcher system to allow custom output renderers to be registered. The CSV output renderer is untouched because supporting streaming renderers through this system would be significantly more complex, and probably not worthwhile. We can't simply allow hooks to be called at request time because we need a list of supported file extensions when the request is being routed in order to resolve ambiguous database/table names. So, renderers need to be registered at startup. I've tried to make this API independent of Sanic's request/response objects so that this can remain stable during the switch to ASGI. I'm using dictionaries to keep it simple and to make adding additional options in the future easy. Fixes #440",14047, 613,"Ensure sqlite_timelimit correctly clears handler If an error occurred inside the block the progress handler (used to enforce a time limit) was not being correctly cleared, resulting in timeout errors potentially occurring during subsequent SQL queries. 
The fix is described here: https://docs.python.org/3/library/contextlib.html#contextlib.contextmanager",14047, 614,"Fix for Python 3.5, refs #435",14047, 615,"Note that trace data format is very likely to change, refs #435",14047, 616,"Test for ?_trace=1, refs #435",14047, 617,"?_trace=1 now adds SQL trace info to JSON/HTML response Also added documentation for it. Refs #435",14047, 618,"Added ?_trace=1 option to trace SQL Currently just dumps all SQL statements out on the console.",14047, 619,"Added some guidelines Mainly to remind me that master needs to be releasable at all times!",14047, 620,"New ConnectedDatabase.mtime_ns property I plan to use this for some clever table count caching tricks",14047, 621,"Support multiple filters of the same type Closes #288",14047, 622,New ?column__date=yyyy-mm-dd filter,14047, 623,"New colname__in=x,y,z filter, closes #433",14047, 624,"Documentation for filters, plus new documentation unit test https://simonwillison.net/2018/Jul/28/documentation-unit-tests/",14047, 625,"Extract and refactor filters into filters.py This will help in implementing __in as a filter, refs #433",14047, 626,Slightly more interesting example link,14047, 627,Removed accidental extra default plugins module,14047, 628,Cleaned up pylint warnings,14047, 629,Moved BaseView.absolute_url() to Datasette,14047, 630,Moved expand_foreign_keys() from TableView to Datasette,14047, 631,Fixed broken link in documentation,14047, 632,"?_where= parameter on table views, closes #429 From pull request #430",14047, 633,"Persist show/hide state better, closes #425",14047, 634,"?_fts_table= and ?_fts_pk= arguments, closes #428",14047, 635,Upgrade to Jinja2==2.10.1 (#426),14047, 636,"New ?tags__arraycontains=tag lookup against JSON fields Part one of supporting facet-by-JSON-array, refs #359",14047, 637,"TableView.data() no longer uses .inspect, refs #420 BUT... it does a count(*) against the whole table which may take unbounded time. 
Fixing this is part of #422",14047, 638,"expandable_columns() no longer uses inspect, refs #420",14047, 639,"foreign_key_tables no longer uses inspect, refs #420",14047, 640,RowView.data() no longer uses inspect refs #420,14047, 641,"display_columns_and_rows() no longer uses inspect, refs #420",14047, 642,"expand_foreign_keys() no longer uses inspect, refs #420",14047, 643,"sortable_columns_for_table() no longer uses inspect() Refs #420",14047, 644,Removed rogue print(),14047, 645,"DatabaseDownload no longer uses .inspect(), refs #420",14047, 646,".database_url(database) no longer needs inspect, refs #420",14047, 647,".resolve_db_name() and .execute() work without inspect Refs #420",14047, 648,"table_exists() now uses async SQL, refs #420",14047, 649,"Fix for TypeError File ""../datasette/app.py"", line 138, in __init__ self.files = files + immutables TypeError: can only concatenate tuple (not ""list"") to tuple",14047, 650,"'datasette serve -i immutable.db' option, refs #419",14047, 651,"URL hashing is now off by default - closes #418 Prior to this commit Datasette would calculate the content hash of every database and redirect to a URL containing that hash, like so: https://v0-27.datasette.io/fixtures => https://v0-27.datasette.io/fixtures-dd88475 This assumed that all databases were opened in immutable mode and were not expected to change. This will be changing as a result of #419 - so this commit takes the first step in implementing that change by changing this default behaviour. Datasette will now only redirect hash-free URLs under two circumstances: * The new `hash_urls` config option is set to true (it defaults to false). 
* The user passes `?_hash=1` in the URL",14047, 652,"show/hide link for SQL on custom query page Closes #415",14047, 653,"Update spatialite.rst (#413) a line of sql added to create the idx_ in the python recipe",14047, 654,Fix for test failure with Click 7.0,14047, 655,"Allow more recent versions of Click Closes #414",14047, 656,"Support for :memory: databases If you start Datasette with no files, it will connect to :memory: instead. When starting it with files you can add --memory to also get a :memory: database.",14047, 657,about and about_url metadata options,14047, 658,Added datasette-jellyfish,14047, 659,Link to sqlite-utils blog entry,14047, 660,Added sqlite-utils blog entry to news section,14047, 661,"Added socrata2sql to the ecosystem page A fantastic new tool created by @achavez at the Dallas Morning News.",14047, 662,Expanded section on db-to-sqlite,14047, 663,"Show size of database file next to download link, closes #172",14047, 664,"Heroku --include-vcs-ignore (#407) Means `datasette publish heroku` can work under Travis, unlike this failure: https://travis-ci.org/simonw/fivethirtyeight-datasette/builds/488047550 ``` 2.25s$ datasette publish heroku fivethirtyeight.db -m metadata.json -n fivethirtyeight-datasette tar: unrecognized option '--exclude-vcs-ignores' Try 'tar --help' or 'tar --usage' for more information. ▸ Command failed: tar cz -C /tmp/tmpuaxm7i8f --exclude-vcs-ignores --exclude ▸ .git --exclude .gitmodules . > ▸ /tmp/f49440e0-1bf3-4d3f-9eb0-fbc2967d1fd4.tar.gz ▸ tar: unrecognized option '--exclude-vcs-ignores' ▸ Try 'tar --help' or 'tar --usage' for more information. ▸ The command ""datasette publish heroku fivethirtyeight.db -m metadata.json -n fivethirtyeight-datasette"" exited with 0. 
``` The fix for that issue is to call the heroku command like this: heroku builds:create -a app_name --include-vcs-ignore",14047, 665,Datasette 0.27,14047, 666,"Added documentation on the Datasette Ecosystem https://datasette.readthedocs.io/en/latest/ecosystem.html",14047, 667,Export option: _shape=array&_nl=on for newline-delimited JSON,14047, 668,New 'datasette plugins' command to list installed plugins,14047, 669,Python 3.7.2 as base for Docker image,14047, 670,"Expose current git tag to Docker build, closes #399",14047, 671,app_client() fixture doesn't need to take **kwargs,14047, 672,Release 0.26.1,14047, 673,"Dockerfile now builds SQLite 3.26.0, closes #397",14047, 674,Bump aiohttp to 3.5.3 to fix a warning,14047, 675,"compile_options output in /-/versions, closes #396",14047, 676,Corrected import path in plugin docs,14047, 677,Suppress pytest warnings from 3rd party modules,14047, 678,Switch to using PYPI_PASSWORD env var in Travis,14047, 679,Datasette 0.26 release notes,14047, 680,"Fix CSV export hidden form fields, closes #393",14047, 681,"Pass --token to 'now alias', if provided",14047, 682,"datasette publish now --alias option You can now use --alias to attempt to alias after you deploy. Also updated now.json to use version: 1",14047, 683,Fix some regex DeprecationWarnings (#392),14047, 684,--reload now also reloads if databases change on disk,14047, 685,Link to new tutorial from the README,14047, 686,Updated notes on FTS5 vs.
FTS4,14047, 687,"Better example commit This one updates the README news section as well",14047, 688,Improved release process documentation,14047, 689,"Tiny typo in customization docs (#390) Thanks, @jaywgraves",14047, 690,Release 0.25.2,14047, 691,Upgrade pytest to 4.0.2,14047, 692,Added docs on updating docs + the release process,14047, 693,Use python-3.6.7 runtime for Heroku deploys,14047, 694,"New make_app_client() pattern Because next version of pytest discourages calling fixture factories as functions",14047, 695,How to activate your virtual environment,14047, 696,"Travis to use Python 3.7-dev for a little longer 3.7 produces a 403 forbidden error: https://travis-ci.org/simonw/datasette/jobs/450716231#L6",14047, 697,Compatible with Python 3.7,14047, 698,Release 0.25.1,14047, 699,"Use Zeit cloud v1 to avoid 100MB image limit Closes #366 - thanks @slygent",14047, 700,More human friendly 'what is Datasette' intro text,14047, 701,Link to dev environment instructions from installation guide,14047, 702,"Started contributing docs: setting up a dev environment https://datasette.readthedocs.io/en/latest/contributing.html",14047, 703,"Link to ""The interesting ideas in Datasette""",14047, 704,"Travis applies :latest tag to Docker release, refs #329",14047, 705,Releasing Datasette 0.25,14047, 706,Fix small doc typo - thanks @jaywgraves (#365),14047, 707,"Fix json.loads in Python 3.5 3.5 requires a str, not a bytes https://travis-ci.org/simonw/datasette/jobs/421660555",14047, 708,Better docs for publish_subcommand() plugin hook,14047, 709,"extra_css_urls(template, database, table, datasette) The extra_css_urls and extra_js_urls hooks now take additional optional parameters. Also refactored them out of the Datasette class and into RenderMixin. 
Plus improved plugin documentation to explicitly list parameters.",14047, 710,"Refactoring: renamed ""name"" variable to ""database""",14047, 711,"render_cell(value, column, table, database, datasette) The render_cell plugin hook previously was only passed value. It is now passed (value, column, table, database, datasette).",14047, 712,Corrected indentation in metadata.rst,14047, 713,New plugin hook: extra_body_script,14047, 714,Added plugin_config() method,14047, 715,New ds.metadata() method,14047, 716,Do not show default plugins on /-/plugins,14047, 717," Import pysqlite3 if available, closes #360 (#361)",14047, 718,Refactor to use new datasette.config(key) method,14047, 719,"Bump versions of pytest, pluggy and beautifulsoup4 (#358)",14047, 720,Hide 'view and edit SQL' if config.allow_sql turned off,14047, 721,"fts_table and fts_pk metadata configs, available for both tables and views",14047, 722,sortable_columns also now works with views,14047, 723,"render_cell(value) plugin hook, closes #352 New plugin hook for customizing the way cell values are rendered in HTML. The first full example of this hook in use is https://github.com/simonw/datasette-json-html",14047, 724,"Renamed variable, since docs are not written in Markdown",14047, 725,"Only run documented_views() fixture once per session Speeds up tests, because previously it ran once per view class.",14047, 726,"Docs for IndexView, TableView, RowView, closes #299 Also removed xfail from test_view_classes_are_documented, so any future *View classes that are added without documentation will cause the tests to fail.",14047, 727,"Added labels so unit tests can detect docs, refs #299",14047, 728,"xfail documentation unit tests for view classes, refs #299 More documentation unit tests. These ones check that every single **View class imported into the datasette/app.py module are covered by our documentation. Just one problem: they aren't documented yet.
So I'm using the xfail pytest decorator to mark these tests as allowed-to-fail. When you run the test suite you now get a report of how many views still need to be documented, but it doesn't fail the tests. The output looks something like this: $ pytest tests/test_docs.py collected 31 items tests/test_docs.py ..........................XXXxx. [100%] ============ 26 passed, 2 xfailed, 3 xpassed in 1.06 seconds ============ Once I have documented all the views I will remove the xfail so any future views that are added without documentation will cause a test failure. We can detect that a view is documented by looking for ReST label in the docs, for example: .. _IndexView: Some view classes can be used to power multiple URLs - the JsonDataView class for example is used to power /-/metadata and /-/config and /-/plugins In this case, the second part of the label can indicate the variety of page, e.g: .. _JsonDataView_metadata: The test will pass as long as there is at least one label that starts with _JsonDataView.",14047, 729,Unit tests for publish now/heroku - closes #348,14047, 730,"""datasette publish heroku"" improvements * Fixed bug where --title= didn't work if -m not provided * Now using Python 3.6.6 instead of Python 3.6.3",14047, 731,"publish_subcommand hook + default plugins mechanism, used for publish heroku/now (#349) This change introduces a new plugin hook, publish_subcommand, which can be used to implement new subcommands for the ""datasette publish"" command family. I've used this new hook to refactor out the ""publish now"" and ""publish heroku"" implementations into separate modules. I've also added unit tests for these two publishers, mocking the subprocess.call and subprocess.check_output functions. As part of this, I introduced a mechanism for loading default plugins. 
These are defined in the new ""default_plugins"" list inside datasette/app.py Closes #217 (Plugin support for datasette publish) Closes #348 (Unit tests for ""datasette publish"") Refs #14, #59, #102, #103, #146, #236, #347",14047, 732,Unit test confirming all plugin hooks are documented,14047, 733,"Fix for Python 3.5 https://stackoverflow.com/a/42694113/6083",14047, 734,Removed unnecessary print statements from tests,14047, 735,"'Usage: datasette', not 'Usage: cli' - refs #336",14047, 736,"Ensure --help examples in docs are always up to date, closes #336 Unit tests now check that docs/*.txt help examples are all up-to-date. I ran into a problem here in that the terminal_width needed to be more accurately defined - so I replaced update-docs-help.sh with update-docs-help.py which hard-codes the terminal width.",14047, 737,"Removed docker -e flag docker -e flag is now deprecated: https://docs.docker.com/engine/deprecated/#-e-and---email-flags-on-docker-login",14047, 738,Release notes for 0.24 release,14047, 739,"Build and push new tagged releases to Docker Hub Based on method described in https://sebest.github.io/post/using-travis-ci-to-build-docker-images/",14047, 740,"URLify URLs in custom SQL queries, closes #298",14047, 741,"Unit tests for advanced export box HTML, closes #320",14047, 742,?_json_infinity=1 for handling Infinity/-Infinity - fixes #332,14047, 743,'publish now' uses force_https_urls:on - closes #333,14047, 744,"New force_https_urls option, refs #333",14047, 745,Removed unused imports,14047, 746,"Support title/description for canned queries, closes #342 Demo here: https://latest.datasette.io/fixtures/neighborhood_search",14047, 747,"Allow app names for `datasette publish heroku` Lets you supply the `-n` parameter for Heroku deploys, which also lets you update existing Heroku deployments.",14047, 748,"Bump aiohttp to fix compatibility with Python 3.7 Tests failed here: https://travis-ci.org/simonw/datasette/jobs/403223333",14047, 749,"Run Travis CI
against Python 3.7 as well,14047, 750,"Docs for datasette publish and package, closes #337 Also introduced a new mechanism for ensuring the --help examples in the documentation reflect the current output of the --help commands, via a new update-docs-help.sh script. Closes #336",14047, 751,New tagline: 'A tool for exploring and publishing data',14047, 752,"New truncate_cells_html config for truncating cells, closes #330",14047, 753,"Show custom error message if SpatiaLite needed, closes #331",14047, 754,"datasette publish heroku now supports --extra-options, closes #334",14047, 755,Release notes for 0.23.2,14047, 756,"Fix for row pages for tables with / in, closes #325",14047, 757,Removed rogue print statement left over from #309,14047, 758,Cleaned up view constructors to accept just a datasette instance,14047, 759,"Fix for weird nested exception in RequestTimeout I saw this error: sanic.exceptions.RequestTimeout: Request Timeout During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/Users/simonw/Dropbox/Development/datasette/venv/lib/python3.6/site-packages/sanic/handlers.py"", line 82, in response response = handler(request=request, exception=exception) File ""/Users/simonw/Dropbox/Development/datasette/datasette/app.py"", line 512, in on_exception if request.path.split(""?"")[0].endswith("".json""): AttributeError: 'NoneType' object has no attribute 'path' Strangely ""if request and request.path..."" did not work here, because the Sanic Request class extends builtins.dict and hence evaluates to False if it has no headers.",14047, 760,Added datasette-vega to news section,14047, 761,Documentation typo,14047, 762,Documentation tweaks,14047, 763,"Installation instructions, including docker image - closes #328",14047, 764,"Speed up Travis by reusing pip wheel cache across builds (#324) * Cache pip wheels between runs in Travis, refs #323 * Run pytest manually - ""python setup.py test"" appeared to still download 
a bunch of stuff: https://travis-ci.org/simonw/datasette/jobs/395306188 * Use extras_require so pip can install test dependencies: https://github.com/pypa/pip/issues/1197#issuecomment-228939212",14047, 765,"CSV export now respects --cors, fixes #326",14047, 766,Link to 0.23.1 release notes from news,14047, 767,Changelog for 0.23.1,14047, 768,"Updated Travis password, refs #317",14047, 769,"Correctly display empty strings in HTML table, closes #314",14047, 770,"Allow ""."" in database filenames, closes #302",14047, 771,"404s ending in slash redirect to remove that slash, closes #309",14047, 772,"Fixed incorrect display of compound primary keys with foreign key references Closes #319",14047, 773,"Docs + example of canned SQL query using || concatenation Closes #321",14047, 774,"Correctly display facets with value of 0 - fixes #318 Also added comprehensive unit test for facet display HTML.",14047, 775,Default 'expand labels' to checked in CSV advanced export,14047, 776,Release notes for 0.23,14047, 777,Advanced export box now obeys allow_csv_stream config - refs #266,14047, 778,"Renamed 'stream all records' to 'stream all rows', refs #266",14047, 779,"Docs for CSV export, refs #266",14047, 780,"Don't link to #export on custom query results The advanced CSV export options don't work for custom SQL queries. Refs #266",14047, 781,"Improved UI for CSV/JSON export, closes #266",14047, 782,"Streaming mode for downloading all rows as a CSV (#315) * table.csv?_stream=1 to download all rows - refs #266 This option causes Datasette to serve ALL rows in the table, by internally following the _next= pagination links and serving everything out as a stream. Also added new config option, allow_csv_stream, which can be used to disable this feature. 
* New config option max_csv_mb limiting size of CSV export",14047, 783,"Default to _labels=on on JSON/CSV links with foreign keys, refs #266",14047, 784,"Release tagged versions to PyPI and now alias When a new tagged version is pushed, this should: * bdist_wheel it and release to https://pypi.python.org/pypi/datasette * Set an alias of v0-22-1.datasette.io for the deployed demo",14047, 785,Link to latest.datasette.io from README,14047, 786,"Set Now deployment name with datasette publish, refs #313",14047, 787,"Try using a different name for each Now deploy Refs #313",14047, 788,"Deploy latest.datasette.io on commit to master - #313 If the tests pass in Travis CI, deploy an instance containing Datasette's test fixtures to https://latest.datasette.io/ Also set up an alias of truncated-commit-hash.datasette.io",14047, 789,"--version-note for datasette, datasette publish and datasette package This is a relatively obscure new command-line argument that helps solve the problem of showing accurate version information in deployed instances of Datasette even if they were deployed directly from source code. You can pass --version-note to datasette publish and package and it will then in turn be passed to datasette when it starts: datasette --version-note=hello fixtures.db Now if you visit /-/versions.json you will see this: { ""datasette"": { ""note"": ""hello"", ""version"": ""0+unknown"" }, ""python"": { ""full"": ""3.6.5 (default, Jun 6 2018, 19:19:24) \n[GCC 6.3.0 20170516]"", ""version"": ""3.6.5"" }, ... } I plan to use this in some Travis CI configuration, refs #313",14047, 790,"datasette publish now --token=X argument Lets you specify the auth token to use when deploying to Now. Tokens can be created at https://zeit.co/account/tokens",14047, 791,"Improved fixtures to support publication The fixtures database created by our unit tests makes for a good ""live"" demo of Datasette in action. I've improved the metadata it ships with to better support this use-case. 
I've also improved the mechanism for writing out fixtures: you can do this: python tests/fixtures.py fixtures.db To get just the fixtures database written out... or you can do this: python tests/fixtures.py fixtures.db fixtures.json To get metadata which you can then serve like so: datasette fixtures.db -m fixtures.json Refs #313",14047, 792,Renamed test_tables.db to fixtures.db in unit tests,14047, 793,"?_labels= and ?_label=COL to expand foreign keys in JSON/CSV These new querystring arguments can be used to request expanded foreign keys in both JSON and CSV formats. ?_labels=on turns on expansions for ALL foreign key columns ?_label=COLUMN1&_label=COLUMN2 can be used to pick specific columns to expand e.g. `Street_Tree_List.json?_label=qSpecies&_label=qLegalStatus` { ""rowid"": 233, ""TreeID"": 121240, ""qLegalStatus"": { ""value"": 2, ""label"": ""Private"" }, ""qSpecies"": { ""value"": 16, ""label"": ""Sycamore"" }, ""qAddress"": ""91 Commonwealth Ave"", ... } The labels option also works for the HTML and CSV views. HTML defaults to `?_labels=on`, so if you pass `?_labels=off` you can disable foreign key expansion entirely - or you can use `?_label=COLUMN` to request just specific columns. If you expand labels on CSV you get additional columns in the output: `/Street_Tree_List.csv?_label=qLegalStatus` rowid,TreeID,qLegalStatus,qLegalStatus_label... 1,141565,1,Permitted Site... 2,232565,2,Undocumented... I also refactored the existing foreign key expansion code. Closes #233. Refs #266.",14047, 794,"Cleaned up view_definition/table_definition code in table view Also moved those out of standard JSON into just the HTML template context",14047, 795,Extract string-to-bool logic into utils.py,14047, 796,"Switch back from python:3.6-slim-stretch to python:3.6 Turns out slim-stretch doesn't include gcc which means it can't build various Sanic dependencies. So `datasette publish now ...` was broken.
Fixes #310",14047, 797,"Fixed CSV tests - Python 3.6.5 and 3.6.3 apparently differ The test used to expect CSV to come back like this: hello world """" With the final blank value encoded in quotes. Judging by Travis failures, this behaviour changed between Python 3.6.3 and 3.6.5: https://travis-ci.org/simonw/datasette/jobs/392586661",14047, 798,"Basic CSV export, refs #266 Tables and custom SQL query results can now be exported as CSV. The easiest way to do this is to use the .csv extension, e.g. /test_tables/facet_cities.csv By default this is served as Content-Type: text/plain so you can see it in your browser. If you want to download the file (using text/csv and with an appropriate Content-Disposition: attachment header) you can do so like this: /test_tables/facet_cities.csv?_dl=1 We link to the CSV and downloadable CSV URLs from the table and query pages. The links use ?_size=max and so by default will return 1,000 rows. Also fixes #303 - table names ending in .json or .csv are now detected and URLs are generated that look like this instead: /test_tables/table%2Fwith%2Fslashes.csv?_format=csv The ?_format= option is available for everything else too, but we link to the .csv / .json versions in most cases because they are aesthetically pleasing.",14047, 799,"Moved JsonDataView into views/special.py",14047, 800,"Test client wrapper removing need for gather_request - refs #272 As part of decoupling from Sanic, this will make it easier to run tests against ASGI instead.",14047, 801,Documented how to set multiple --config at once,14047, 802,"New cache_size_kb config for SQLite, closes #304",14047, 803,Fixed tests I broke in b18e45158,14047, 804,"Hide sort select on larger screens, closes #300",14047, 805,"Show error on 'datasette publish heroku --spatialite', refs #301",14047, 806,"datasette publish/package --spatialite, closes #243 New command-line argument which causes SpatiaLite to be installed and configured for the published Datasette.
datasette publish now --spatialite mydb.db",14047, 807,"Upgrade pytest to 3.6.0 https://github.com/pytest-dev/pytest/issues/1875 made it impossible to declare a function as a fixture multiple times, which we were doing across different modules. The fix was to move our @pytest.fixture calls into decorators in the tests/fixtures.py module.",14047, 808,SQL syntax highlighting in docs,14047, 809,Fixed broken :ref:,14047, 810,Fixed broken test introduced in b0a95da96,14047, 811,Expanded SpatiaLite docs to cover GeoJSON plus lat-lon spatial indexes,14047, 812,"Show more useful error message for SQL interrupted, closes #142",14047, 813,Moved plugin HTML tests from test_html to test_plugins,14047, 814,Missing half of last commit fefb0db8ae15,14047, 815,"Unit test for 02870e57, closes #291",14047, 816,"Use scope='session' for all fixtures This means they will only be executed once which makes sense since the database they create is immutable.",14047, 817,"New ?_json=colname argument for returning unescaped JSON Also extracted docs for special JSON arguments into a new section. 
Closes #31",14047, 818,"Filter out duplicate JS/CSS URLs, refs #291 (testme)",14047, 819,Added docs on Importing shapefiles into SpatiaLite,14047, 820,"Added num_sql_threads config option, closes #285",14047, 821,"?_shape=arrayfirst, closes #287",14047, 822,"?_ttl= parameter and default_cache_ttl config Refs #285, Closes #289",14047, 823,Test that ensures all config options are documented,14047, 824,Hyperlink to www.srihash.org,14047, 825,Fixed documentation typo,14047, 826,"Initial documentation on using SpatiaLite https://datasette.readthedocs.io/en/latest/spatialite.html",14047, 827,Fix for 500 error on /db?sql=x,14047, 828,"boolean --config are now case insensitive, refs #284",14047, 829,"allow_sql config option to disable custom SQL, closes #284",14047, 830,"allow_facet, allow_download, suggest_facets boolean --config Refs #284",14047, 831,"Moved .execute() method from BaseView to Datasette class Also introduced new Results() class with results.truncated, results.description, results.rows",14047, 832,Set theme jekyll-theme-architect,14047, 833,Set theme jekyll-theme-leap-day,14047, 834,"Build Dockerfile with recent Sqlite + Spatialite (#280) Closes #278 ```bash $ docker run --rm -it datasette spatialite SpatiaLite version ..: 4.4.0-RC0 Supported Extensions: - 'VirtualShape' [direct Shapefile access] - 'VirtualDbf' [direct DBF access] - 'VirtualXL' [direct XLS access] - 'VirtualText' [direct CSV/TXT access] - 'VirtualNetwork' [Dijkstra shortest path] - 'RTree' [Spatial Index - R*Tree] - 'MbrCache' [Spatial Index - MBR cache] - 'VirtualSpatialIndex' [R*Tree metahandler] - 'VirtualElementary' [ElemGeoms metahandler] - 'VirtualKNN' [K-Nearest Neighbors metahandler] - 'VirtualXPath' [XML Path Language - XPath] - 'VirtualFDO' [FDO-OGR interoperability] - 'VirtualGPKG' [OGC GeoPackage interoperability] - 'VirtualBBox' [BoundingBox tables] - 'SpatiaLite' [Spatial SQL - OGC] PROJ.4 version ......: Rel. 
4.9.3, 15 August 2016 GEOS version ........: 3.5.1-CAPI-1.9.1 r4246 TARGET CPU ..........: x86_64-linux-gnu the SPATIAL_REF_SYS table already contains some row(s) SQLite version ......: 3.23.1 Enter "".help"" for instructions SQLite version 3.23.1 2018-04-10 17:39:29 Enter "".help"" for instructions Enter SQL statements terminated with a "";"" spatialite> ``` ```bash $ docker run --rm -it datasette python -c ""import sqlite3; print(sqlite3.sqlite_version)"" 3.23.1 ``` Also updates the query used to check for FTS5 as the old version wasn't detecting FTS5 for some reason.",14047, 835,0.22.1 bugfix release,14047, 836,"Faceting no longer breaks pagination, fixes #282",14047, 837,Move version info back to separate module,14047, 838,"Add `__version_info__` derived from `__version__` This might be tuple of more than two values (major and minor version) if commits have been made after a release.",14047, 839,"Add version number support with Versioneer Repo: https://github.com/warner/python-versioneer Versioneer Licence: Public Domain (CC0-1.0) Closes #273",14047, 840,Refactor inspect logic,14047, 841,Link to /-/plugins example,14047, 842,Datasette 0.22: Datasette Facets,14047, 843,Typo fix,14047, 844,"Added docs for introspection endpoints https://datasette.readthedocs.io/en/latest/introspection.html",14047, 845,Only apply responsive table CSS to .rows-and-columns,14047, 846,"Renamed --limit to --config, added --help-config, closes #274 Removed the --page_size= argument to datasette serve in favour of: datasette serve --config default_page_size:50 mydb.db Added new help section: $ datasette --help-config Config options: default_page_size Default page size for the table view (default=100) max_returned_rows Maximum rows that can be returned from a table or custom query (default=1000) sql_time_limit_ms Time limit for a SQL query in milliseconds (default=1000) default_facet_size Number of values to return for requested facets (default=30) facet_time_limit_ms Time limit for 
calculating a requested facet (default=200) facet_suggest_time_limit_ms Time limit for calculating a suggested facet (default=50)",14047, 847,"Only apply responsive table styles to .rows-and-columns Otherwise they interfere with tables in the description, e.g. on https://fivethirtyeight.datasettes.com/fivethirtyeight/nba-elo%2Fnbaallelo",14047, 848,"Suggested facets now use #fragment links Useful for pages with large amounts of content at the top like on https://fivethirtyeight.datasettes.com/fivethirtyeight-469e30d/nba-elo%2Fnbaallelo",14047, 849,Updated default facet limits in docs,14047, 850,"Added /-/limits and /-/limits.json, closes #270",14047, 851,"Show facets that timed out using new InterruptedError If the user requests some _facet= options that do not successfully execute in the configured facet_time_limit_ms, we now show a warning message like this: These facets timed out: rowid, Title To build this I had to clean up our SQLite interrupted logic. We now raise a custom InterruptedError exception when SQLite terminates due to exceeding a time limit. In implementing this I found and fixed a logic error where invalid SQL was being generated in some cases for our faceting calculations but the resulting sqlite3.OperationalError had been incorrectly captured and treated as a timeout. Refs #255 Closes #269",14047, 852,"--limit= mechanism plus new limits for facets Replaced the --max_returned_rows and --sql_time_limit_ms options to ""datasette serve"" with a new --limit option, which supports a larger list of limits.
Example usage: datasette serve --limit max_returned_rows:1000 \ --limit sql_time_limit_ms:2500 \ --limit default_facet_size:50 \ --limit facet_time_limit_ms:1000 \ --limit facet_suggest_time_limit_ms:500 New docs: https://datasette.readthedocs.io/en/latest/limits.html Closes #270 Closes #264",14047, 853,"Empty string """" facets can now be selected in UI, refs #255",14047, 854,Docs: Speeding up facets with indexes,14047, 855,"Display currently selected facets better, refs #255",14047, 856,Facet documentation tweaks,14047, 857,"Added screenshots to facets and full_text_search docs, refs #255",14047, 858,Typo fix,14047, 859,"Clarified relationship between metadata and _facet= facets, updated docs - refs #255",14047, 860,"Reliable sort order for facets in Python 3.5, fixing test - refs #255",14047, 861,Hide facet button is now a ✖ - refs #255,14047, 862,"class=""suggested-facets""",14047, 863,"Show enabled facets in flexbox columns, refs #255",14047, 864,"Foreign key facets are now expanded to labels, refs #255",14047, 865,Use escape_sqlite() more consistently,14047, 866,Undid some slightly weird code formatting by 'black',14047, 867,"1,442 format for facet counts, refs #255",14047, 868,"_facet selections persist through table form, refs #255",14047, 869,Fix bug with toggle_url on integer facets,14047, 870,"Facets can now be toggled off again, refs #255",14047, 871,Removed unused variable,14047, 872,"Facet results now have ""truncated"" field To indicate if there were more than 20 distinct values.
Refs #255",14047, 873,_sort/_next links now use new path_with_replaced_args method,14047, 874,Never suggest a facet if it only results in one option,14047, 875,Facets no longer consider null values,14047, 876,"Initial implementation of suggested facets Causes tests to break at the moment",14047, 877,path_with_added_args now works with multiple existing args,14047, 878,"Facet toggling now works for integer columns, refs #255",14047, 879,"Facet ""selected"" key and toggle_url now toggles, refs #255",14047, 880,path_with_added_args now preserves order in Python 3.5,14047, 881,Extract /-/plugins view into a method,14047, 882,Used isort to re-order my imports,14047, 883,Ran black source formatting tool against new views/ and app.py,14047, 884,"Refactored views into new views/ modules, refs #256",14047, 885,"Case insensitive querystring comparison, fix Python 3.5",14047, 886,"Initial implementation of facets, plus tests and docs Refs #255",14047, 887,"utils.path_with_added_args() improvements * Now covered by unit tests * Preserves original order * Can handle multiple args of the same name, e.g. ?bar=1&bar=2",14047, 888,Update conf.py,14047, 889,"Documentation for SQLite full-text search support, closes #253",14047, 890,"/-/versions now includes SQLite fts_versions, closes #252",14047, 891,Slight simplification of /-/inspect,14047, 892,Added Datasette 0.21 to News,14047, 893,"Revert ""Travis should now deploy new tags to PyPI if tests pass"" This reverts commit d39b2e357e34469728f300273ab07c3904ea7a2b. It failed with this error: https://travis-ci.org/simonw/datasette/jobs/375398977 Uploading distributions to https://upload.pypi.org/legacy/ Uploading datasette-0.21-py3-none-any.whl 100% 182k/182k [00:00<00:00, 694kB/s] HTTPError: 403 Client Error: Invalid or non-existent authentication information.
for url: https://upload.pypi.org/legacy/",14047, 894,Release Datasette 0.21,14047, 895,Travis should now deploy new tags to PyPI if tests pass,14047, 896,"Default tests to using a longer timelimit Every now and then a test will fail in Travis CI on Python 3.5 because it hit the default 20ms SQL time limit. Test fixtures now default to a 200ms time limit, and we only use the 20ms time limit for the specific test that tests query interruption. This should make our tests on Python 3.5 in Travis much more stable.",14047, 897,"Support _search_COLUMN=text searches, closes #237",14047, 898,"Unit tests for _search= feature, refs #237",14047, 899,"Show version on /-/plugins page, closes #248",14047, 900,"?_size=max option, closes #249",14047, 901,"Added /-/versions and /-/versions.json, closes #244 Sample output: { ""python"": { ""version"": ""3.6.3"", ""full"": ""3.6.3 (default, Oct 4 2017, 06:09:38) \n[GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.37)]"" }, ""datasette"": { ""version"": ""0.20"" }, ""sqlite"": { ""version"": ""3.23.1"", ""extensions"": { ""json1"": null, ""spatialite"": ""4.3.0a"" } } }",14047, 902,"Bump up time limit for test_paginate_tables_and_views It was intermittently failing in Travis CI on Python 3.5: https://travis-ci.org/simonw/datasette/jobs/373713476",14047, 903,"Renamed ?_sql_time_limit_ms= to ?_timelimit, closes #242",14047, 904,"New ?_shape=array option + tweaks to _shape, closes #245 * Default is now ?_shape=arrays (renamed from lists) * New ?_shape=array returns an array of objects as the root object * Changed ?_shape=object to return the object as the root * Updated docs",14047, 905,?_shape=array experimental feature,14047, 906,"FTS tables now detected by inspect(), closes #240",14047, 907,"New ?_size=XXX querystring parameter for table view, closes #229 Also added documentation for all of the _special arguments. 
Plus deleted some duplicate logic implementing _group_count.",14047, 908,"If max_returned_rows==page_size, increment max_returned_rows Fixes #230, where if the two were equal pagination didn't work correctly.",14047, 909,"New hidden: True option for table metadata, closes #239",14047, 910,"Hide idx_* tables if spatialite detected, closes #228",14047, 911,Added class=rows-and-columns to custom query results table,14047, 912,Link to register-of-members-interests tutorial,14047, 913,Added CSS class rows-and-columns to main table,14047, 914,"label_column now defined on the table-being-linked-to, fixes #234",14047, 915,label_column option in metadata.json - closes #234,14047, 916,Link to documentation from README,14047, 917,Fix a typo (#232),14047, 918,Added missing hyphen,14047, 919,Added datasette-cluster-map blog entry to news,14047, 920,Added Datasette 0.20 to news,14047, 921,Datasette 0.20: static assets and templates for plugins,14047, 922,Add col-X classes to HTML table on custom query page,14047, 923,Fixed outdated template in documentation,14047, 924,"Plugins can now bundle custom templates, closes #224 Refs #14",14047, 925,"Added /-/metadata /-/plugins /-/inspect, closes #225",14047, 926,Thanks to #214 JavaScript is no longer 'soon',14047, 927,"Documentation for --install option, refs #223",14047, 928,"datasette publish/package --install option, closes #223 Allows you to specify one or more additional packages to be installed, useful for deploying plugins.",14047, 929,"Ran black against datasette/cli.py https://pypi.org/project/black/ cli.py was getting a bit untidy due to all of the heavily annotated click function methods - used black to clean it up and make it easier to read.",14047, 930,Formatting tweak,14047, 931,"Fix for plugins in Python 3.5 (#222) ModuleNotFoundError is not a thing in Python 3.5, so catch KeyError/ImportError instead.",14047, 932,"New plugin hooks: extra_css_urls() and extra_js_urls() Closes #214",14047, 933,Fixed example HTML in CSS
docs,14047, 934,"/-/static-plugins/PLUGIN_NAME/ now serves static/ from plugins Refs #214",14047, 935,"<td> now gets class=""col-X"" - plus added col-X documentation Refs #209",14047, 936,"Use to_css_class for table cell column classes This ensures that columns with spaces in the name will still generate usable CSS class names. Refs #209",14047, 937,"Add column name classes to <td>s, make PK bold",14047, 938,Additional test asserts,14047, 939,"Don't duplicate simple primary keys in the link column When there's a simple (single-column) primary key, it looks weird to duplicate it in the link column. This change removes the second PK column and treats the link column as if it were the PK column from a header/sorting perspective.",14047, 940,Correct escaping for HTML display of row links,14047, 941,"Longer time limit for test_paginate_compound_keys It was failing intermittently in Travis - see #209",14047, 942,"Use application/octet-stream for downloadable databases I'd also like to send the Content-Length here but that's not currently possible in Sanic - see bug report here: https://github.com/channelcat/sanic/issues/1194",14047, 943,Updated PyPI classifiers,14047, 944,Updated PyPI link to pypi.org,14047, 945,Datasette 0.19: plugin preview (with release notes),14047, 946,"Working implementation of #216 which passes the tests Reverted commit 5364fa7f3357f2de24fd45c85832205377642f19 (where I removed the code that didn't work). Added primary keys to order-by clause for sorting to get tests to pass",14047, 947,"Revert #216 until I can get tests to pass in Travis Revert ""Fix for _sort_desc=sortable_with_nulls test, refs #216"" This reverts commit 07fc2d113e462bfd8d7d56152c0d1fc55e0fdbe9.
Revert ""Fixed #216 - paginate correctly when sorting by nullable column"" This reverts commit 2abe539a0f9f967ec0de6894774cb7ee83c4b3b9.",14047, 948,"Fix for _sort_desc=sortable_with_nulls test, refs #216",14047, 949,Fixed #216 - paginate correctly when sorting by nullable column,14047, 950,Apache 2.0 license badge,14047, 951,Removed rogue print() call,14047, 952,"Initial documentation for plugins, closes #213 https://datasette.readthedocs.io/en/latest/plugins.html",14047, 953,"New --plugins-dir=plugins/ option (#212) * New --plugins-dir=plugins/ option New option causing Datasette to load and evaluate all of the Python files in the specified directory and register any plugins that are defined in those files. This new option is available for the following commands: datasette serve mydb.db --plugins-dir=plugins/ datasette publish now/heroku mydb.db --plugins-dir=plugins/ datasette package mydb.db --plugins-dir=plugins/ * Unit tests for --plugins-dir=plugins/ Closes #211",14047, 954,Better fix for setup.py version,14047, 955,"Start of the plugin system, based on pluggy (#210) Uses https://pluggy.readthedocs.io/ originally created for the py.test project We're starting with two plugin hooks: prepare_connection(conn) This is called when a new SQLite connection is created. It can be used to register custom SQL functions. prepare_jinja2_environment(env) This is called with the Jinja2 environment. It can be used to register custom template tags and filters. An example plugin which uses these two hooks can be found at https://github.com/simonw/datasette-plugin-demos or installed using `pip install datasette-plugin-demos` Refs #14",14047, 956,"Return HTTP 405 on InvalidUsage rather than 500 This also stops it filling up the logs. This happens for HEAD requests at the moment - which perhaps should be handled better, but that's a different issue.",14047, 957,Added 0.18 to news,14047, 958,"Releasing v0.18 - support for units! 
Refs #203",14047, 959,"Don't attempt to deploy new tags to PyPI This isn't working through Travis at the moment, so I'm disabling it and switching back to manual deploys.",14047, 960,"Merge ""Support filtering with units"" from #205",14047, 961,Update number of expected tables,14047, 962,Unit test for unlabelled foreign keys from #207,14047, 963,"Link foreign keys which don't have labels This renders unlabeled FKs as simple links. I can't see why this would cause any major problems. Also includes bonus fixes for two minor issues: * In foreign key link hrefs the primary key was escaped using HTML escaping rather than URL escaping. This broke some non-integer PKs. * Print tracebacks to console when handling 500 errors.",14047, 964,"Added unit test for foreign key links in HTML Needed to add a further unit test for #207",14047, 965,"Fix sqlite error when loading rows with no incoming FKs This fixes `ERROR: conn=, sql = 'select ', params = {'id': '1'}` caused by an invalid query when loading incoming FKs. The error was ignored due to async but it still got printed to the console.",14047, 966,Add link to pint custom units page to docs,14047, 967,Tests for unit filtering,14047, 968,Allow custom units to be registered with Pint,14047, 969,Support units in filters,14047, 970,"Tidy up units support * Add units to exported JSON * Units key in metadata skeleton * Docs",14047, 971,"Initial units support Add support for specifying units for a column in metadata.json and rendering them on display using [pint](https://pint.readthedocs.io/en/latest/). 
ref #203",14047, 972,"Release 0.17 to fix issues with PyPI See https://twitter.com/simonw/status/984862976447414272",14047, 973,Releasing v0.16,14047, 974,Removed pathlib dependency (incompatible with Python 3.5),14047, 975,"Better mechanism for handling errors; 404s for missing table/database New error mechanism closes #193 404s for missing tables/databases closes #184 Makes pull request #202 unnecessary.",14047, 976,long_description in markdown for the new PyPI,14047, 977,"Hide Spatialite system tables They were getting on my nerves.",14047, 978,"Allow explain select / explain query plan select Closes #201",14047, 979,Datasette Publish in readme,14047, 980,"datasette inspect now finds primary_keys Closes #195",14047, 981,"Ability to sort using form fields (for mobile portrait mode) We now display sort options as a select box plus a descending checkbox, which means you can apply sort orders even in portrait mode on a mobile phone where the column headers are hidden. Closes #199",14047, 982,Added Datasette 0.15 to news,14047, 983,Releasing v0.15,14047, 984,"table_rows => table_rows_count, filtered_table_rows => filtered_table_rows_count Renamed properties.
Closes #194",14047, 985,"Fixed bug with human filter description, refs #189 We were showing this: 201 rows where sorted by sortable_with_nulls We now show this: 201 rows sorted by sortable_with_nulls",14047, 986,"Removed unnecessary enumerate template helper I made this obsolete in d1756d773685ca4f9c5b57fb40e1aa743bc95525 Refs #189",14047, 987,"New sortable_columns option in metadata.json to control sort options You can now explicitly set which columns in a table can be used for sorting using the _sort and _sort_desc arguments using metadata.json: { ""databases"": { ""database1"": { ""tables"": { ""example_table"": { ""sortable_columns"": [ ""height"", ""weight"" ] } } } } } Refs #189",14047, 988,"Error handling for ?_sort and ?_sort_desc Verifies that they match an existing column, and only one or the other option is provided - refs #189 Uses a new DatasetteError exception that closes #193",14047, 989,Correctly escape sort-by columns in SQL (refs #189),14047, 990,Column headers now link to sort/desc sort - refs #189,14047, 991,"Current sort order now reflected in human filter description Plus renamed human_description to human_description_en Refs #189",14047, 992,"_sort and _sort_desc parameters for table views Allows for paginated sorted results based on a specified column. Refs #189",14047, 993,Total row count now correct even if _next= applied,14047, 994,Use .custom_sql() for _group_count implementation (refs #150),14047, 995,"make html title more readable in query template (#180) tiny tweak to make this easier to visually parse—I think it matches your style in other templates",14047, 996,"New ?_shape=objects/object/lists param for JSON API (#192) New _shape= parameter replacing old .jsono extension Now instead of this: /database/table.jsono We use the _shape parameter like this: /database/table.json?_shape=objects Also introduced a new _shape called 'object' which looks like this: /database/table.json?_shape=object Returning an object for the rows key: ...
""rows"": { ""pk1"": { ... }, ""pk2"": { ... } } Refs #122",14047, 997,"Utility for writing test database fixtures to a .db file python tests/fixtures.py /tmp/hello.db This is useful for making a SQLite database of the test fixtures for interactive exploration.",14047, 998,"escape_sqlite_table_name => escape_sqlite, handles reserved words It can be used for column names as well as table names. Reserved word list from https://www.sqlite.org/lang_keywords.html",14047, 999,Three more news items,14047, 1000,"Compound primary key _next= now plays well with extra filters Closes #190",14047,
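The rows above (rowid, message, commits_fts, rank) look like the output of a full-text MATCH query against an FTS index of commit messages. A minimal sketch of reproducing that kind of query with Python's sqlite3 and SQLite's FTS5 extension; the `commits_fts` table/column names here are assumptions modelled on this dump, not taken from any actual schema:

```python
import sqlite3

# Build a small in-memory FTS5 index over a few commit messages from the
# dump, then run the kind of MATCH query that produces rows shaped like
# (rowid, message, rank). Requires SQLite compiled with FTS5.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE VIRTUAL TABLE commits_fts USING fts5(message)")
messages = [
    "Basic CSV export, refs #266",
    "New ?_shape=array option + tweaks to _shape, closes #245",
    "Initial implementation of facets, plus tests and docs Refs #255",
]
conn.executemany(
    "INSERT INTO commits_fts (message) VALUES (?)",
    [(m,) for m in messages],
)

# 'rank' is FTS5's built-in relevance column; ordering by it puts the
# best matches first.
rows = conn.execute(
    "SELECT rowid, message, rank FROM commits_fts "
    "WHERE commits_fts MATCH ? ORDER BY rank",
    ("facets",),
).fetchall()
for rowid, message, rank in rows:
    print(rowid, message)
```

Only the third message contains the term "facets", so the query returns that single row (rowid 3).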