issue_comments
421 rows where author_association = "NONE" sorted by updated_at descending
issue (comment counts)
- base_url configuration setting 13
- link_or_copy_directory() error - Invalid cross-device link 13
- .json and .csv exports fail to apply base_url 11
- Documentation with recommendations on running Datasette in production without using Docker 9
- JavaScript plugin hooks mechanism similar to pluggy 9
- Add GraphQL endpoint 8
- Full text search of all tables at once? 7
- Populate "endpoint" key in ASGI scope 7
- Figure out some interesting example SQL queries 6
- Metadata should be a nested arbitrary KV store 5
- Windows installation error 5
- Ways to improve fuzzy search speed on larger data sets? 5
- Redesign default JSON format in preparation for Datasette 1.0 5
- Port Datasette to ASGI 4
- Wildcard support in query parameters 4
- "Stream all rows" is not at all obvious 4
- Installing datasette via docker: Path 'fixtures.db' does not exist 4
- Package as standalone binary 3
- Plugin that adds an authentication layer of some sort 3
- Datasette serve should accept paths/URLs to CSVs and other file formats 3
- make uvicorn optional dependency (because not ok on windows python yet) 3
- bump uvicorn to 0.9.0 to be Python-3.8 friendly 3
- Handle really wide tables better 3
- updating metadata.json without recreating the app 3
- upsert_all() throws issue when upserting to empty table 3
- Incorrect URLs when served behind a proxy with base_url set 3
- base_url doesn't seem to work when adding criteria and clicking "apply" 3
- Fallback to databases in inspect-data.json when no -i options are passed 3
- Improve the display of facets information 3
- Feature Request: Gmail 3
- ...
id | html_url | issue_url | node_id | user | created_at | updated_at ▲ | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
785485597 | https://github.com/simonw/datasette/pull/1243#issuecomment-785485597 | https://api.github.com/repos/simonw/datasette/issues/1243 | MDEyOklzc3VlQ29tbWVudDc4NTQ4NTU5Nw== | codecov[bot] 22429695 | 2021-02-25T00:28:30Z | 2021-02-25T00:28:30Z | NONE | Codecov Report
Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
fix small typo 815955014 | |
784638394 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-784638394 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | MDEyOklzc3VlQ29tbWVudDc4NDYzODM5NA== | UtahDave 306240 | 2021-02-24T00:36:18Z | 2021-02-24T00:36:18Z | NONE | I noticed that @simonw is using black for formatting. I ran black on my additions in this PR. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
784347646 | https://github.com/simonw/datasette/issues/1241#issuecomment-784347646 | https://api.github.com/repos/simonw/datasette/issues/1241 | MDEyOklzc3VlQ29tbWVudDc4NDM0NzY0Ng== | Kabouik 7107523 | 2021-02-23T16:55:26Z | 2021-02-23T16:57:39Z | NONE |
Absolutely, that's why I thought my corner case with |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
[Feature request] Button to copy URL 814595021 | |
784312460 | https://github.com/simonw/datasette/issues/1240#issuecomment-784312460 | https://api.github.com/repos/simonw/datasette/issues/1240 | MDEyOklzc3VlQ29tbWVudDc4NDMxMjQ2MA== | Kabouik 7107523 | 2021-02-23T16:07:10Z | 2021-02-23T16:08:28Z | NONE | Likewise, while answering to another issue regarding the Vega plugin, I realized that there is no such way of linking rows after a custom query, I only get this "Link" column with individual URLs for the default SQL view: Or is it there and I am just missing the option in my custom queries? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Allow facetting on custom queries 814591962 | |
784157345 | https://github.com/simonw/datasette/issues/1218#issuecomment-784157345 | https://api.github.com/repos/simonw/datasette/issues/1218 | MDEyOklzc3VlQ29tbWVudDc4NDE1NzM0NQ== | soobrosa 1244799 | 2021-02-23T12:12:17Z | 2021-02-23T12:12:17Z | NONE | Topline this fixed the same problem for me.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
/usr/local/opt/python3/bin/python3.6: bad interpreter: No such file or directory 803356942 | |
783794520 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-783794520 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5 | MDEyOklzc3VlQ29tbWVudDc4Mzc5NDUyMA== | UtahDave 306240 | 2021-02-23T01:13:54Z | 2021-02-23T01:13:54Z | NONE | Also, @simonw I created a test based off the existing tests. I think it's working correctly |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
WIP: Add Gmail takeout mbox import 813880401 | |
783688547 | https://github.com/dogsheep/google-takeout-to-sqlite/issues/4#issuecomment-783688547 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/4 | MDEyOklzc3VlQ29tbWVudDc4MzY4ODU0Nw== | UtahDave 306240 | 2021-02-22T21:31:28Z | 2021-02-22T21:31:28Z | NONE | @Btibert3 I've opened a PR with my initial attempt at this. Would you be willing to give this a try? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Feature Request: Gmail 778380836 | |
783662968 | https://github.com/simonw/sqlite-utils/issues/220#issuecomment-783662968 | https://api.github.com/repos/simonw/sqlite-utils/issues/220 | MDEyOklzc3VlQ29tbWVudDc4MzY2Mjk2OA== | mhalle 649467 | 2021-02-22T20:44:51Z | 2021-02-22T20:44:51Z | NONE | Actually, coming back to this, I have a clearer use case for enabling fts generation for views: making it easier to bring in text from lookup tables and other joins. The datasette documentation describes populating an fts table like so:
Alternatively if you have fts support in sqlite_utils for views (which sqlite and fts5 support), you can do the same thing just by creating a view that captures the above joins as columns, then creating an fts table from that view. Such an fts table can be created using sqlite_utils, where one created with your method can't. The resulting fts table can then be used by a whole family of related tables and views in the manner you described earlier in this issue. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Better error message for *_fts methods against views 783778672 | |
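A minimal sketch of the workflow the comment above describes (the original code block is not preserved, and the database, view, and column names here are hypothetical): create a view that joins lookup-table text into one place, then build an FTS5 external-content table over that view with raw SQL run through sqlite-utils, since the library's *_fts helpers only target tables.

```python
import sqlite_utils

db = sqlite_utils.Database("example.db")  # hypothetical database file

# A view that pulls searchable text from a join (names are illustrative)
db.execute(
    """
    CREATE VIEW IF NOT EXISTS searchable AS
    SELECT articles.id AS id, articles.title AS title, tags.name AS tag
    FROM articles JOIN tags ON tags.id = articles.tag_id
    """
)

# FTS5 external-content tables can point at a view as well as a table
db.execute(
    """
    CREATE VIRTUAL TABLE IF NOT EXISTS searchable_fts
    USING fts5(title, tag, content='searchable', content_rowid='id')
    """
)

# Populate the index from the view's current contents
db.execute("INSERT INTO searchable_fts(searchable_fts) VALUES ('rebuild')")
db.conn.commit()
```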
783560017 | https://github.com/simonw/datasette/issues/1166#issuecomment-783560017 | https://api.github.com/repos/simonw/datasette/issues/1166 | MDEyOklzc3VlQ29tbWVudDc4MzU2MDAxNw== | thorn0 94334 | 2021-02-22T18:00:57Z | 2021-02-22T18:13:11Z | NONE | Hi! I don't think Prettier supports this syntax for globs: Tested it. Apparently, it works as a negated character class in regexes (like
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Adopt Prettier for JavaScript code formatting 777140799 | |
783265830 | https://github.com/simonw/datasette/issues/782#issuecomment-783265830 | https://api.github.com/repos/simonw/datasette/issues/782 | MDEyOklzc3VlQ29tbWVudDc4MzI2NTgzMA== | frankieroberto 30665 | 2021-02-22T10:21:14Z | 2021-02-22T10:21:14Z | NONE | @simonw:
Interesting! Although I don't think it matters too much what the underlying implementation is - I more meant that |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Redesign default JSON format in preparation for Datasette 1.0 627794879 | |
782756398 | https://github.com/simonw/datasette/issues/782#issuecomment-782756398 | https://api.github.com/repos/simonw/datasette/issues/782 | MDEyOklzc3VlQ29tbWVudDc4Mjc1NjM5OA== | simonrjones 601316 | 2021-02-20T22:05:48Z | 2021-02-20T22:05:48Z | NONE |
I agree it is more predictable if the top level item is an object with a rows or data object that contains an array of data, which then allows for other top-level meta data. I can see the argument for removing this and just using an array for convenience - but I think that's OK as an option (as you have now). Rather than have lots of top-level keys you could have a "meta" object to contain non-data stuff. You could use something like "links" for API endpoint URLs (or use a standard like HAL). Which would then leave the top level a bit cleaner - if that's what you want. Have you had much feedback from users who use the Datasette API a lot? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Redesign default JSON format in preparation for Datasette 1.0 627794879 | |
782746755 | https://github.com/simonw/datasette/issues/782#issuecomment-782746755 | https://api.github.com/repos/simonw/datasette/issues/782 | MDEyOklzc3VlQ29tbWVudDc4Mjc0Njc1NQ== | frankieroberto 30665 | 2021-02-20T20:44:05Z | 2021-02-20T20:44:05Z | NONE | Minor suggestion: rename I like the idea of specifying a limit of 0 if you don’t want any rows data - and returning an empty array under the Have you given any thought as to whether to pretty print (format with spaces) the output or not? Can be useful for debugging/exploring in a browser or other basic tools which don’t parse the JSON. Could be default (can’t be much bigger with gzip?) or opt-in. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Redesign default JSON format in preparation for Datasette 1.0 627794879 | |
782745199 | https://github.com/simonw/datasette/issues/782#issuecomment-782745199 | https://api.github.com/repos/simonw/datasette/issues/782 | MDEyOklzc3VlQ29tbWVudDc4Mjc0NTE5OQ== | frankieroberto 30665 | 2021-02-20T20:32:03Z | 2021-02-20T20:32:03Z | NONE | I think it’s a good idea if the top level item of the response JSON is always an object, rather than an array, at least as the default. Mainly because it allows you to add extra keys in a backwards-compatible way. Also just seems more expected somehow. The API design guidance for the UK government also recommends this: https://www.gov.uk/guidance/gds-api-technical-and-data-standards#use-json I also strongly dislike having versioned APIs (eg with a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Redesign default JSON format in preparation for Datasette 1.0 627794879 | |
782430028 | https://github.com/simonw/datasette/issues/1212#issuecomment-782430028 | https://api.github.com/repos/simonw/datasette/issues/1212 | MDEyOklzc3VlQ29tbWVudDc4MjQzMDAyOA== | kbaikov 4488943 | 2021-02-19T22:54:13Z | 2021-02-19T22:54:13Z | NONE | I will close this issue since it appears only in my particular setup. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Tests are very slow. 797651831 | |
782053455 | https://github.com/simonw/datasette/pull/1229#issuecomment-782053455 | https://api.github.com/repos/simonw/datasette/issues/1229 | MDEyOklzc3VlQ29tbWVudDc4MjA1MzQ1NQ== | camallen 295329 | 2021-02-19T12:47:19Z | 2021-02-19T12:47:19Z | NONE | I believe this pr and #1031 are related and fix the same issue. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
ensure immutable databases when starting in configuration directory mode with 810507413 | |
781599929 | https://github.com/simonw/datasette/pull/1232#issuecomment-781599929 | https://api.github.com/repos/simonw/datasette/issues/1232 | MDEyOklzc3VlQ29tbWVudDc4MTU5OTkyOQ== | codecov[bot] 22429695 | 2021-02-18T19:59:54Z | 2021-02-18T22:06:42Z | NONE | Codecov Report
Impacted Files | Coverage Δ | |
---|---|---|
datasette/app.py | 95.68% <100.00%> (+0.06%) | :arrow_up: |
datasette/cli.py | 76.62% <100.00%> (+0.36%) | :arrow_up: |
datasette/views/database.py | 97.19% <100.00%> (+0.01%) | :arrow_up: |
Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
--crossdb option for joining across databases 811407131 | |
781451701 | https://github.com/dogsheep/google-takeout-to-sqlite/issues/4#issuecomment-781451701 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/4 | MDEyOklzc3VlQ29tbWVudDc4MTQ1MTcwMQ== | Btibert3 203343 | 2021-02-18T16:06:21Z | 2021-02-18T16:06:21Z | NONE | Awesome! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Feature Request: Gmail 778380836 | |
781330466 | https://github.com/simonw/datasette/issues/1230#issuecomment-781330466 | https://api.github.com/repos/simonw/datasette/issues/1230 | MDEyOklzc3VlQ29tbWVudDc4MTMzMDQ2Ng== | Kabouik 7107523 | 2021-02-18T13:06:22Z | 2021-02-18T15:22:15Z | NONE | [Edit] Oh, I just saw the "Load all" button under the cluster map as well as the setting to alter the max number of results. So I guess this issue is only about the Vega charts.
Note that datasette-cluster-map also seems to be limited to 998 displayed points:

|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Vega charts are plotted only for rows on the visible page, cluster maps only for rows in the remaining pages 811054000 | |
780991910 | https://github.com/simonw/datasette/issues/283#issuecomment-780991910 | https://api.github.com/repos/simonw/datasette/issues/283 | MDEyOklzc3VlQ29tbWVudDc4MDk5MTkxMA== | rayvoelker 9308268 | 2021-02-18T02:13:56Z | 2021-02-18T02:13:56Z | NONE | I was going to ask you about this issue when we talk during your office-hours schedule this Friday, but was there any support ever added for doing this cross-database joining? I have a use-case where it could be pretty neat to do analysis using this tool on time-specific databases from snapshots https://ilsweb.cincinnatilibrary.org/collection-analysis/ and thanks again for such an amazing tool! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support cross-database joins 325958506 | |
780830464 | https://github.com/simonw/datasette/pull/1229#issuecomment-780830464 | https://api.github.com/repos/simonw/datasette/issues/1229 | MDEyOklzc3VlQ29tbWVudDc4MDgzMDQ2NA== | codecov[bot] 22429695 | 2021-02-17T20:24:30Z | 2021-02-17T20:24:30Z | NONE | Codecov Report
Impacted Files | Coverage Δ | |
---|---|---|
datasette/app.py | 95.61% <100.00%> (ø) | |
Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
ensure immutable databases when starting in configuration directory mode with 810507413 | |
780817596 | https://github.com/dogsheep/google-takeout-to-sqlite/issues/4#issuecomment-780817596 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/4 | MDEyOklzc3VlQ29tbWVudDc4MDgxNzU5Ng== | UtahDave 306240 | 2021-02-17T20:01:35Z | 2021-02-17T20:01:35Z | NONE | I've got this almost working. Just needs some polish |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Feature Request: Gmail 778380836 | |
779785638 | https://github.com/simonw/sqlite-utils/issues/227#issuecomment-779785638 | https://api.github.com/repos/simonw/sqlite-utils/issues/227 | MDEyOklzc3VlQ29tbWVudDc3OTc4NTYzOA== | camallen 295329 | 2021-02-16T11:48:03Z | 2021-02-16T11:48:03Z | NONE | Thank you @simonw |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Error reading csv files with large column data 807174161 | |
778467759 | https://github.com/simonw/datasette/issues/1220#issuecomment-778467759 | https://api.github.com/repos/simonw/datasette/issues/1220 | MDEyOklzc3VlQ29tbWVudDc3ODQ2Nzc1OQ== | aborruso 30607 | 2021-02-12T21:35:17Z | 2021-02-12T21:35:17Z | NONE | Thank you |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Installing datasette via docker: Path 'fixtures.db' does not exist 806743116 | |
778439617 | https://github.com/simonw/datasette/issues/1220#issuecomment-778439617 | https://api.github.com/repos/simonw/datasette/issues/1220 | MDEyOklzc3VlQ29tbWVudDc3ODQzOTYxNw== | bobwhitelock 7476523 | 2021-02-12T20:33:27Z | 2021-02-12T20:33:27Z | NONE | That Docker command will mount your current directory inside the Docker container at
and it will use the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Installing datasette via docker: Path 'fixtures.db' does not exist 806743116 | |
770069864 | https://github.com/dogsheep/github-to-sqlite/issues/60#issuecomment-770069864 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/60 | MDEyOklzc3VlQ29tbWVudDc3MDA2OTg2NA== | daniel-butler 22578954 | 2021-01-29T21:52:05Z | 2021-02-12T18:29:43Z | NONE | For the purposes below I am assuming the organization I would get all the repositories and their related commits from is called
I'm on a windows computer running git bash to be able to use the
On a pure linux system I think this would work because the new line character is normally
As expected I ran into rate limit issues #51 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Use Data from SQLite in other commands 797097140 | |
778014990 | https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-778014990 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33 | MDEyOklzc3VlQ29tbWVudDc3ODAxNDk5MA== | leafgarland 675335 | 2021-02-12T06:54:14Z | 2021-02-12T06:54:14Z | NONE | Ahh, that might be because macOS Big Sur has changed the structure of the photos db. Might need to wait for a later release, there is a PR which adds support for Big Sur. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
photo-to-sqlite: command not found 803338729 | |
778008752 | https://github.com/simonw/datasette/issues/1220#issuecomment-778008752 | https://api.github.com/repos/simonw/datasette/issues/1220 | MDEyOklzc3VlQ29tbWVudDc3ODAwODc1Mg== | aborruso 30607 | 2021-02-12T06:37:34Z | 2021-02-12T06:37:34Z | NONE | I have used my path, I'm running it from the folder in which I have the db. Must I use an absolute path? Must I create exactly that folder? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Installing datasette via docker: Path 'fixtures.db' does not exist 806743116 | |
778002092 | https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-778002092 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33 | MDEyOklzc3VlQ29tbWVudDc3ODAwMjA5Mg== | robmarkcole 11855322 | 2021-02-12T06:19:32Z | 2021-02-12T06:19:32Z | NONE | hi @leafgarland that results in a new error:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
photo-to-sqlite: command not found 803338729 | |
777951854 | https://github.com/dogsheep/dogsheep-photos/issues/33#issuecomment-777951854 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/33 | MDEyOklzc3VlQ29tbWVudDc3Nzk1MTg1NA== | leafgarland 675335 | 2021-02-12T03:54:39Z | 2021-02-12T03:54:39Z | NONE | I think that is a typo in the docs, you can use
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
photo-to-sqlite: command not found 803338729 | |
777949755 | https://github.com/simonw/datasette/pull/1223#issuecomment-777949755 | https://api.github.com/repos/simonw/datasette/issues/1223 | MDEyOklzc3VlQ29tbWVudDc3Nzk0OTc1NQ== | codecov[bot] 22429695 | 2021-02-12T03:45:31Z | 2021-02-12T03:45:31Z | NONE | Codecov Report
Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add compile option to Dockerfile to fix failing test (fixes #696) 806918878 | |
777927946 | https://github.com/simonw/datasette/issues/1220#issuecomment-777927946 | https://api.github.com/repos/simonw/datasette/issues/1220 | MDEyOklzc3VlQ29tbWVudDc3NzkyNzk0Ng== | bobwhitelock 7476523 | 2021-02-12T02:29:54Z | 2021-02-12T02:29:54Z | NONE | According to https://github.com/simonw/datasette/blob/master/docs/installation.rst#using-docker it should be
This uses |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Installing datasette via docker: Path 'fixtures.db' does not exist 806743116 | |
777690332 | https://github.com/dogsheep/evernote-to-sqlite/issues/11#issuecomment-777690332 | https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/11 | MDEyOklzc3VlQ29tbWVudDc3NzY5MDMzMg== | dskrad 3613583 | 2021-02-11T18:16:01Z | 2021-02-11T18:16:01Z | NONE | I solved this issue by modifying line 31 of utils.py
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
XML parse error 792851444 | |
777132761 | https://github.com/simonw/datasette/issues/1200#issuecomment-777132761 | https://api.github.com/repos/simonw/datasette/issues/1200 | MDEyOklzc3VlQ29tbWVudDc3NzEzMjc2MQ== | bobwhitelock 7476523 | 2021-02-11T00:29:52Z | 2021-02-11T00:29:52Z | NONE | I'm probably missing something but what's the use case here - what would this offer over adding |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
?_size=10 option for the arbitrary query page would be useful 792890765 | |
774730656 | https://github.com/dogsheep/pocket-to-sqlite/issues/9#issuecomment-774730656 | https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/9 | MDEyOklzc3VlQ29tbWVudDc3NDczMDY1Ng== | merwok 635179 | 2021-02-07T18:45:04Z | 2021-02-07T18:45:04Z | NONE | That URL uses TLS 1.3, but maybe only if the client supports it. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
SSL Error 801780625 | |
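A quick way to check which protocol the local Python/OpenSSL stack actually negotiates, relevant to the TLS point above (a sketch; getpocket.com is assumed here as the endpoint in question):

```python
import socket
import ssl

hostname = "getpocket.com"  # assumed endpoint for illustration
context = ssl.create_default_context()
with socket.create_connection((hostname, 443)) as sock:
    with context.wrap_socket(sock, server_hostname=hostname) as tls:
        # Prints the negotiated protocol, e.g. "TLSv1.2" or "TLSv1.3"
        print(tls.version())
```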
774726123 | https://github.com/dogsheep/pocket-to-sqlite/issues/9#issuecomment-774726123 | https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/9 | MDEyOklzc3VlQ29tbWVudDc3NDcyNjEyMw== | jfeiwell 12669260 | 2021-02-07T18:21:08Z | 2021-02-07T18:21:08Z | NONE | @simonw any ideas here? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
SSL Error 801780625 | |
774528913 | https://github.com/simonw/datasette/issues/1217#issuecomment-774528913 | https://api.github.com/repos/simonw/datasette/issues/1217 | MDEyOklzc3VlQ29tbWVudDc3NDUyODkxMw== | virtadpt 639730 | 2021-02-06T19:23:41Z | 2021-02-06T19:23:41Z | NONE | I've had a lot of success running it as an OpenFaaS lambda. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Possible to deploy as a python app (for Rstudio connect server)? 802513359 | |
774385092 | https://github.com/simonw/datasette/issues/1217#issuecomment-774385092 | https://api.github.com/repos/simonw/datasette/issues/1217 | MDEyOklzc3VlQ29tbWVudDc3NDM4NTA5Mg== | pavopax 6165713 | 2021-02-06T02:49:11Z | 2021-02-06T02:49:11Z | NONE | A good reference seems to be the note to run |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Possible to deploy as a python app (for Rstudio connect server)? 802513359 | |
774286962 | https://github.com/simonw/datasette/issues/1208#issuecomment-774286962 | https://api.github.com/repos/simonw/datasette/issues/1208 | MDEyOklzc3VlQ29tbWVudDc3NDI4Njk2Mg== | kbaikov 4488943 | 2021-02-05T21:02:39Z | 2021-02-05T21:02:39Z | NONE | @simonw could you please take a look at the PR 1211 that fixes this issue? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
A lot of open(file) functions are used without a context manager thus producing ResourceWarning: unclosed file <_io.TextIOWrapper 794554881 | |
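A minimal before/after sketch of the fix that PR 1211 describes (the file name is illustrative): wrap open() in a context manager so the handle is always closed and the ResourceWarning goes away.

```python
import json

# Before: the file handle is never explicitly closed, which triggers
# "ResourceWarning: unclosed file <_io.TextIOWrapper ...>" when warnings are enabled.
# metadata = json.load(open("metadata.json"))

# After: the with-block closes the file even if json.load() raises.
with open("metadata.json") as fp:
    metadata = json.load(fp)
```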
774217792 | https://github.com/simonw/sqlite-utils/pull/203#issuecomment-774217792 | https://api.github.com/repos/simonw/sqlite-utils/issues/203 | MDEyOklzc3VlQ29tbWVudDc3NDIxNzc5Mg== | drkane 1049910 | 2021-02-05T18:44:13Z | 2021-02-05T18:44:13Z | NONE | Thanks for looking at this - home schooling kids has prevented me from replying. I'd struggled with how to adapt the API for the foreign keys too - I definitely tried the String/Tuple approach. I hadn't considered the breaking changes that would introduce though. I can take a look at this and try and make the change - see which of your options works best. I've got a workaround for the use-case I was looking at this for, so it wouldn't be a problem for me if it was put on the back burner until a hypothetical v4.0 anyway. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
changes to allow for compound foreign keys 743384829 | |
773977128 | https://github.com/simonw/datasette/issues/1210#issuecomment-773977128 | https://api.github.com/repos/simonw/datasette/issues/1210 | MDEyOklzc3VlQ29tbWVudDc3Mzk3NzEyOA== | heyarne 525780 | 2021-02-05T11:30:34Z | 2021-02-05T11:30:34Z | NONE | Thanks for your quick reply! Having changed my |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Immutable Database w/ Canned Queries 796234313 | |
772408273 | https://github.com/dogsheep/twitter-to-sqlite/issues/56#issuecomment-772408273 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56 | MDEyOklzc3VlQ29tbWVudDc3MjQwODI3Mw== | gsajko 42315895 | 2021-02-03T10:36:36Z | 2021-02-03T10:36:36Z | NONE | I figured it out. So if someone quote tweeted a quote tweet, the second quote tweet won't have |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Not all quoted statuses get fetched? 796736607 | |
772007663 | https://github.com/simonw/datasette/issues/1212#issuecomment-772007663 | https://api.github.com/repos/simonw/datasette/issues/1212 | MDEyOklzc3VlQ29tbWVudDc3MjAwNzY2Mw== | kbaikov 4488943 | 2021-02-02T21:36:56Z | 2021-02-02T21:36:56Z | NONE | How do you get 4-5 minutes? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Tests are very slow. 797651831 | |
771127458 | https://github.com/simonw/datasette/pull/1211#issuecomment-771127458 | https://api.github.com/repos/simonw/datasette/issues/1211 | MDEyOklzc3VlQ29tbWVudDc3MTEyNzQ1OA== | kbaikov 4488943 | 2021-02-01T20:13:39Z | 2021-02-01T20:13:39Z | NONE | Ping @simonw |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Use context manager instead of plain open 797649915 | |
770865698 | https://github.com/simonw/datasette/pull/1159#issuecomment-770865698 | https://api.github.com/repos/simonw/datasette/issues/1159 | MDEyOklzc3VlQ29tbWVudDc3MDg2NTY5OA== | lovasoa 552629 | 2021-02-01T13:42:29Z | 2021-02-01T13:42:29Z | NONE | @simonw : Could you have a look at this ? I think this really improves readability. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Improve the display of facets information 774332247 | |
770343684 | https://github.com/simonw/datasette/pull/1211#issuecomment-770343684 | https://api.github.com/repos/simonw/datasette/issues/1211 | MDEyOklzc3VlQ29tbWVudDc3MDM0MzY4NA== | codecov[bot] 22429695 | 2021-01-31T08:03:40Z | 2021-01-31T08:03:40Z | NONE | Codecov Report
Impacted Files | Coverage Δ | |
---|---|---|
datasette/cli.py | 77.29% <66.66%> (-0.31%) | :arrow_down: |
datasette/app.py | 95.62% <100.00%> (+<0.01%) | :arrow_up: |
datasette/publish/cloudrun.py | 96.96% <100.00%> (+0.09%) | :arrow_up: |
datasette/publish/heroku.py | 87.73% <100.00%> (+0.60%) | :arrow_up: |
datasette/utils/__init__.py | 94.13% <100.00%> (+0.02%) | :arrow_up: |
Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Use context manager instead of plain open 797649915 | |
770150526 | https://github.com/dogsheep/github-to-sqlite/issues/51#issuecomment-770150526 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/51 | MDEyOklzc3VlQ29tbWVudDc3MDE1MDUyNg== | daniel-butler 22578954 | 2021-01-30T03:44:19Z | 2021-01-30T03:47:24Z | NONE | I don't have much experience with github's rate limiting. In my day job we use the tenacity library to handle http errors we get. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
github-to-sqlite should handle rate limits better 703246031 | |
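A loose illustration of the tenacity pattern mentioned above; none of this is from github-to-sqlite itself, and the exception class, backoff values, and helper name are made up for the sketch.

```python
import requests
from tenacity import retry, retry_if_exception_type, stop_after_attempt, wait_exponential


class RateLimited(Exception):
    pass


@retry(
    retry=retry_if_exception_type(RateLimited),
    wait=wait_exponential(multiplier=2, max=120),  # exponential backoff between attempts
    stop=stop_after_attempt(8),
)
def fetch(url, token):
    response = requests.get(url, headers={"Authorization": f"token {token}"})
    # GitHub signals an exhausted rate limit with a 403 and a zeroed remaining count
    if response.status_code == 403 and response.headers.get("X-RateLimit-Remaining") == "0":
        raise RateLimited(url)
    response.raise_for_status()
    return response.json()
```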
770112248 | https://github.com/dogsheep/github-to-sqlite/issues/60#issuecomment-770112248 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/60 | MDEyOklzc3VlQ29tbWVudDc3MDExMjI0OA== | daniel-butler 22578954 | 2021-01-30T00:01:03Z | 2021-01-30T01:14:42Z | NONE | Yes that would be cool! I wouldn't mind helping. Is this the meat of it? https://github.com/dogsheep/twitter-to-sqlite/blob/21fc1cad6dd6348c67acff90a785b458d3a81275/twitter_to_sqlite/utils.py#L512 It looks like the cli option is added with this decorator : https://github.com/dogsheep/twitter-to-sqlite/blob/21fc1cad6dd6348c67acff90a785b458d3a81275/twitter_to_sqlite/cli.py#L14 I looked a bit at utils.py in the GitHub repository. I was surprised at the amount of manual mapping of the API response you had to do to get this to work. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Use Data from SQLite in other commands 797097140 | |
769973212 | https://github.com/dogsheep/twitter-to-sqlite/issues/56#issuecomment-769973212 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56 | MDEyOklzc3VlQ29tbWVudDc2OTk3MzIxMg== | gsajko 42315895 | 2021-01-29T18:29:02Z | 2021-01-29T18:31:55Z | NONE | I think it was with from cron tab |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Not all quoted statuses get fetched? 796736607 | |
767888743 | https://github.com/dogsheep/twitter-to-sqlite/issues/54#issuecomment-767888743 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/54 | MDEyOklzc3VlQ29tbWVudDc2Nzg4ODc0Mw== | henry501 19328961 | 2021-01-26T23:07:41Z | 2021-01-26T23:07:41Z | NONE | My import got much further with the applied fixes than 0.21.3, but not 100%. I do appear to have all of the tweets imported at least. Here's my output:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Archive import appears to be broken on recent exports 779088071 | |
766589070 | https://github.com/simonw/datasette/pull/1206#issuecomment-766589070 | https://api.github.com/repos/simonw/datasette/issues/1206 | MDEyOklzc3VlQ29tbWVudDc2NjU4OTA3MA== | codecov[bot] 22429695 | 2021-01-25T06:50:30Z | 2021-01-25T17:31:11Z | NONE | Codecov Report
Impacted Files | Coverage Δ | |
---|---|---|
datasette/version.py | 100.00% <100.00%> (ø) | |
Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Release 0.54 793086333 | |
765678057 | https://github.com/simonw/sqlite-utils/pull/224#issuecomment-765678057 | https://api.github.com/repos/simonw/sqlite-utils/issues/224 | MDEyOklzc3VlQ29tbWVudDc2NTY3ODA1Nw== | polyrand 37962604 | 2021-01-22T20:53:06Z | 2021-01-23T20:13:27Z | NONE | I'm using the FTS methods in sqlite-utils for this website: drwn.io. I wanted to get pagination to have some kind of infinite scrolling in the landing page, and I ended up using that. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add fts offset docs. 792297010 | |
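A rough sketch of the pagination pattern described above, using plain SQL with LIMIT/OFFSET against an FTS5 table via sqlite-utils; the database, table, and column names are hypothetical.

```python
import sqlite_utils

db = sqlite_utils.Database("content.db")  # hypothetical database
PAGE_SIZE = 20


def search_page(query, page=0):
    # Each "infinite scroll" request fetches the next page of matches
    return list(
        db.execute(
            """
            SELECT rowid, title FROM documents_fts
            WHERE documents_fts MATCH :q
            ORDER BY rank
            LIMIT :limit OFFSET :offset
            """,
            {"q": query, "limit": PAGE_SIZE, "offset": page * PAGE_SIZE},
        )
    )


first_page = search_page("datasette")
second_page = search_page("datasette", page=1)
```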
765639968 | https://github.com/simonw/datasette/issues/1196#issuecomment-765639968 | https://api.github.com/repos/simonw/datasette/issues/1196 | MDEyOklzc3VlQ29tbWVudDc2NTYzOTk2OA== | QAInsights 2826376 | 2021-01-22T19:37:15Z | 2021-01-22T19:37:15Z | NONE | I tried deployment in WSL. It is working fine https://jmeter.vercel.app/ |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Access Denied Error in Windows 791237799 | |
765525338 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765525338 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDc2NTUyNTMzOA== | cobiadigital 25372415 | 2021-01-22T16:22:44Z | 2021-01-22T16:22:44Z | NONE | rs1333049 associated with coronary artery disease
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out some interesting example SQL queries 496415321 | |
765523517 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765523517 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDc2NTUyMzUxNw== | cobiadigital 25372415 | 2021-01-22T16:20:25Z | 2021-01-22T16:20:25Z | NONE | rs53576: the oxytocin receptor (OXTR) gene
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out some interesting example SQL queries 496415321 | |
765506901 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765506901 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDc2NTUwNjkwMQ== | cobiadigital 25372415 | 2021-01-22T15:58:41Z | 2021-01-22T15:58:58Z | NONE | Both rs10757274 and rs2383206 can both indicate higher risks of heart disease
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out some interesting example SQL queries 496415321 | |
765502845 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765502845 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDc2NTUwMjg0NQ== | cobiadigital 25372415 | 2021-01-22T15:53:19Z | 2021-01-22T15:53:19Z | NONE | rs7903146 Influences risk of Type-2 diabetes
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out some interesting example SQL queries 496415321 | |
765498984 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765498984 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDc2NTQ5ODk4NA== | cobiadigital 25372415 | 2021-01-22T15:48:25Z | 2021-01-22T15:49:33Z | NONE | The "Warrior Gene" https://www.snpedia.com/index.php/Rs4680
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out some interesting example SQL queries 496415321 | |
765495861 | https://github.com/dogsheep/genome-to-sqlite/issues/1#issuecomment-765495861 | https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1 | MDEyOklzc3VlQ29tbWVudDc2NTQ5NTg2MQ== | cobiadigital 25372415 | 2021-01-22T15:44:00Z | 2021-01-22T15:44:00Z | NONE | Risk of autoimmune disorders: https://www.snpedia.com/index.php/Genotype
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Figure out some interesting example SQL queries 496415321 | |
762488336 | https://github.com/simonw/datasette/issues/1175#issuecomment-762488336 | https://api.github.com/repos/simonw/datasette/issues/1175 | MDEyOklzc3VlQ29tbWVudDc2MjQ4ODMzNg== | hannseman 758858 | 2021-01-18T21:59:28Z | 2021-01-18T22:00:31Z | NONE | I encountered your issue when trying to find a solution and came up with the following, maybe it can help.
And then it'll be run on the startup event:
It depends on a setting called |
{ "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Use structlog for logging 779156520 | |
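The code blocks from that comment are not preserved above. As a rough reconstruction of the general shape only (the plugin name and the "json_logs" setting key below are placeholders, not the elided original), structlog can be configured from Datasette's startup() plugin hook:

```python
import logging

import structlog
from datasette import hookimpl


@hookimpl
def startup(datasette):
    # Read per-plugin configuration; the key name here is purely illustrative
    plugin_config = datasette.plugin_config("my-logging-plugin") or {}
    if plugin_config.get("json_logs"):
        renderer = structlog.processors.JSONRenderer()
    else:
        renderer = structlog.dev.ConsoleRenderer()
    structlog.configure(
        processors=[
            structlog.processors.TimeStamper(fmt="iso"),
            structlog.stdlib.add_log_level,
            renderer,
        ],
        wrapper_class=structlog.make_filtering_bound_logger(logging.INFO),
    )
```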
762391426 | https://github.com/simonw/datasette/issues/1036#issuecomment-762391426 | https://api.github.com/repos/simonw/datasette/issues/1036 | MDEyOklzc3VlQ29tbWVudDc2MjM5MTQyNg== | philshem 4997607 | 2021-01-18T17:45:00Z | 2021-01-18T17:45:00Z | NONE | It might be possible with this library: https://docs.python.org/3/library/imghdr.html quick test of the downloaded blob:
The output plugin would be cool. I'll look into making my first datasette plugin. I'm also imagining displaying the image in the browser -- but that would be a step 2. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Make it possible to download BLOB data from the Datasette UI 725996507 | |
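A tiny sketch of the "quick test" idea above (the code from the original comment is not preserved; the file path is illustrative): the standard-library imghdr module guesses an image type from the blob's bytes.

```python
import imghdr

path = "downloaded.blob"  # illustrative path to a BLOB downloaded from Datasette
print(imghdr.what(path))  # e.g. "jpeg", "png", or None if not a recognized image
```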
762385981 | https://github.com/simonw/datasette/issues/1036#issuecomment-762385981 | https://api.github.com/repos/simonw/datasette/issues/1036 | MDEyOklzc3VlQ29tbWVudDc2MjM4NTk4MQ== | philshem 4997607 | 2021-01-18T17:32:13Z | 2021-01-18T17:34:50Z | NONE | Hi Simon Just finding this old issue regarding downloading blobs. Nice work!
As a feature request, maybe it would be possible to assign a blob column as a certain data type (e.g. I guess the column blob-type definition could fit into this dropdown selection:
Let me know if I should open a new issue with a feature request. (This could slowly go in the direction of displaying image blob-types in the browser.) Thanks for the great tool! edit: just reading the rest of the twitter thread: https://twitter.com/simonw/status/1318685933256855552 perhaps this is already possible in some form with the plugin datasette-media: https://github.com/simonw/datasette-media |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Make it possible to download BLOB data from the Datasette UI 725996507 | |
761015218 | https://github.com/simonw/sqlite-utils/issues/220#issuecomment-761015218 | https://api.github.com/repos/simonw/sqlite-utils/issues/220 | MDEyOklzc3VlQ29tbWVudDc2MTAxNTIxOA== | mhalle 649467 | 2021-01-15T15:40:08Z | 2021-01-15T15:40:08Z | NONE | Makes sense. If you're coming from the sqlite3 side of things, rather than the datasette side, wanting the fts methods to work for views makes more sense. sqlite3 allows fts5 tables on views, so I was looking for CLI functionality to build the fts virtual tables. Ultimately, though, sharing fts virtual tables across tables and derivative views is likely more efficient. Maybe an explicit error message like "fts is not supported for views", rather than just throwing an exception that the method doesn't exist, might be helpful. Not critical though. Thanks. |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Better error message for *_fts methods against views 783778672 | |
760950128 | https://github.com/dogsheep/twitter-to-sqlite/pull/55#issuecomment-760950128 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/55 | MDEyOklzc3VlQ29tbWVudDc2MDk1MDEyOA== | jacobian 21148 | 2021-01-15T13:44:52Z | 2021-01-15T13:44:52Z | NONE | I found and fixed another bug, this one around importing the tweets table. @simonw let me know if you'd prefer this broken out into multiple PRs, happy to do that if it makes review/merging easier. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Fix archive imports 779211940 | |
759306228 | https://github.com/simonw/datasette/pull/1159#issuecomment-759306228 | https://api.github.com/repos/simonw/datasette/issues/1159 | MDEyOklzc3VlQ29tbWVudDc1OTMwNjIyOA== | lovasoa 552629 | 2021-01-13T08:59:31Z | 2021-01-13T08:59:31Z | NONE | @simonw : Did you have the time to take a look at this ? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Improve the display of facets information 774332247 | |
758668359 | https://github.com/simonw/datasette/issues/1091#issuecomment-758668359 | https://api.github.com/repos/simonw/datasette/issues/1091 | MDEyOklzc3VlQ29tbWVudDc1ODY2ODM1OQ== | tballison 6739646 | 2021-01-12T13:52:29Z | 2021-01-12T13:52:29Z | NONE | Y, thank you to both of you! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
.json and .csv exports fail to apply base_url 742011049 | |
758448525 | https://github.com/simonw/datasette/issues/1091#issuecomment-758448525 | https://api.github.com/repos/simonw/datasette/issues/1091 | MDEyOklzc3VlQ29tbWVudDc1ODQ0ODUyNQ== | henry501 19328961 | 2021-01-12T06:55:08Z | 2021-01-12T06:55:08Z | NONE | Great, really happy I could help! Reverse proxies get tricky. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
.json and .csv exports fail to apply base_url 742011049 | |
758280611 | https://github.com/simonw/datasette/issues/1091#issuecomment-758280611 | https://api.github.com/repos/simonw/datasette/issues/1091 | MDEyOklzc3VlQ29tbWVudDc1ODI4MDYxMQ== | tballison 6739646 | 2021-01-11T23:06:10Z | 2021-01-11T23:06:10Z | NONE | +1 Yep! Fixes it. If I navigate to https://corpora.tika.apache.org/datasette, I get a 404 (database not found: datasette), but if I navigate to https://corpora.tika.apache.org/datasette/file_profiles/, everything WORKS! Thank you! |
{ "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 1, "eyes": 0 } |
.json and .csv exports fail to apply base_url 742011049 | |
756425587 | https://github.com/simonw/datasette/issues/1091#issuecomment-756425587 | https://api.github.com/repos/simonw/datasette/issues/1091 | MDEyOklzc3VlQ29tbWVudDc1NjQyNTU4Nw== | henry501 19328961 | 2021-01-07T22:27:19Z | 2021-01-07T22:27:19Z | NONE | I found this issue while troubleshooting the same behavior with an nginx reverse proxy. The solution was to make sure I set:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
.json and .csv exports fail to apply base_url 742011049 | |
754911290 | https://github.com/simonw/datasette/issues/1171#issuecomment-754911290 | https://api.github.com/repos/simonw/datasette/issues/1171 | MDEyOklzc3VlQ29tbWVudDc1NDkxMTI5MA== | rcoup 59874 | 2021-01-05T21:31:15Z | 2021-01-05T21:31:15Z | NONE | We did this for Sno under macOS — it's a PyInstaller binary/setup which uses Packages for packaging.
FYI (if you ever get to it) for Windows you need to get a code signing certificate. And if you want automated CI, you'll want to get an "EV CodeSigning for HSM" certificate from GlobalSign, which then lets you put the certificate into Azure Key Vault. Which you can use with azuresigntool to sign your code & installer. (Non-EV certificates are a waste of time, the user still gets big warnings at install time). |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
GitHub Actions workflow to build and sign macOS binary executables 778450486 | |
754729035 | https://github.com/dogsheep/twitter-to-sqlite/issues/54#issuecomment-754729035 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/54 | MDEyOklzc3VlQ29tbWVudDc1NDcyOTAzNQ== | jacobian 21148 | 2021-01-05T16:03:29Z | 2021-01-05T16:03:29Z | NONE | I was able to fix this, at least enough to get my archive to import. Not sure if there's more work to be done here or not. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Archive import appears to be broken on recent exports 779088071 | |
754728696 | https://github.com/dogsheep/twitter-to-sqlite/pull/55#issuecomment-754728696 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/55 | MDEyOklzc3VlQ29tbWVudDc1NDcyODY5Ng== | jacobian 21148 | 2021-01-05T16:02:55Z | 2021-01-05T16:02:55Z | NONE | This now works for me, though I'm entirely unsure if it's a just-my-export thing or a wider issue. Also, this doesn't contain any tests. So I'm not sure if there's more work to be done here, or if this is good enough. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Fix archive imports 779211940 | |
754721153 | https://github.com/dogsheep/twitter-to-sqlite/issues/54#issuecomment-754721153 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/54 | MDEyOklzc3VlQ29tbWVudDc1NDcyMTE1Mw== | jacobian 21148 | 2021-01-05T15:51:09Z | 2021-01-05T15:51:09Z | NONE | Correction: the failure is on |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Archive import appears to be broken on recent exports 779088071 | |
754210356 | https://github.com/simonw/datasette/issues/983#issuecomment-754210356 | https://api.github.com/repos/simonw/datasette/issues/983 | MDEyOklzc3VlQ29tbWVudDc1NDIxMDM1Ng== | carlmjohnson 222245 | 2021-01-04T20:49:05Z | 2021-01-04T20:49:05Z | NONE | For reasons I've written about elsewhere, I'm in favor of modules. It has several beneficial effects. One, old browsers just ignore it altogether. Two, if you include the same plain script on the page more than once, it will be executed twice, but if you include the same module script on a page twice, it will only execute once. Three, you get a module local namespace, instead of having to use the global window namespace or a function private namespace. OTOH, if you are going to use an old style script, the code from before isn't ideal, because you wipe out your registry if the script is included more than once. Also you may as well use object methods and splat arguments. The event based architecture probably makes more sense though. Just make up some event names prefixed with
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
JavaScript plugin hooks mechanism similar to pluggy 712260429 | |
754181647 | https://github.com/simonw/datasette/issues/983#issuecomment-754181647 | https://api.github.com/repos/simonw/datasette/issues/983 | MDEyOklzc3VlQ29tbWVudDc1NDE4MTY0Nw== | jussiarpalahti 11941245 | 2021-01-04T19:52:40Z | 2021-01-04T19:52:40Z | NONE | I was thinking JavaScript plugins going with server side template extensions custom HTML. Attach my own widgets on there and listen for Datasette events to refresh when user interacts with main UI. Like a map view or table that updates according to selected column. There's certainly other ways to look at this. Perhaps you could list possible hooks or high level design doc on what would be possible with the plugin system? Re: modules. I would like to see modules supported at least in development. The developer experience is so much better than what JavaScript coding has been in the past. With large parts of NPM at your disposal I’d imagine even less experienced coder can whisk a custom plugin in no time. Proper production build system (like one you get with Pika or Parcel) could package everything up into bundles that older browsers can understand. Though that does come with performance and size penalties alongside the added complexity. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
JavaScript plugin hooks mechanism similar to pluggy 712260429 | |
754002859 | https://github.com/simonw/datasette/pull/1170#issuecomment-754002859 | https://api.github.com/repos/simonw/datasette/issues/1170 | MDEyOklzc3VlQ29tbWVudDc1NDAwMjg1OQ== | codecov[bot] 22429695 | 2021-01-04T14:22:52Z | 2021-01-04T14:22:52Z | NONE | Codecov Report
Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Install Prettier via package.json 778126516 | |
753600999 | https://github.com/simonw/datasette/issues/983#issuecomment-753600999 | https://api.github.com/repos/simonw/datasette/issues/983 | MDEyOklzc3VlQ29tbWVudDc1MzYwMDk5OQ== | MarkusH 475613 | 2021-01-03T11:11:21Z | 2021-01-03T11:11:21Z | NONE | With regards to JS/Browser events, given your example of menu items that plugins could add, I could imagine this code to work:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
JavaScript plugin hooks mechanism similar to pluggy 712260429 | |
753587963 | https://github.com/simonw/datasette/issues/983#issuecomment-753587963 | https://api.github.com/repos/simonw/datasette/issues/983 | MDEyOklzc3VlQ29tbWVudDc1MzU4Nzk2Mw== | dracos 154364 | 2021-01-03T09:02:50Z | 2021-01-03T10:00:05Z | NONE |
Don't think you are :) (e.g. gzipped, using arrow functions in my example saves 2 bytes over spelling out function). On FMS, past month, looking at popular browsers, looks like we'd have 95.41% arrow support, 94.19% module support, and 4.58% (mostly IE9/IE11/Safari 9) supporting neither. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
JavaScript plugin hooks mechanism similar to pluggy 712260429 | |
753224999 | https://github.com/simonw/datasette/issues/983#issuecomment-753224999 | https://api.github.com/repos/simonw/datasette/issues/983 | MDEyOklzc3VlQ29tbWVudDc1MzIyNDk5OQ== | jussiarpalahti 11941245 | 2020-12-31T23:29:36Z | 2020-12-31T23:29:36Z | NONE | I have yet to build Datasette plugin and am unfamiliar with Pluggy. Since browsers have event handling builtin Datasette could communicate with plugins through it. Handlers register as listeners for custom Datasette events and Datasette's JS can then trigger said events. I was also wondering if you had looked at Javascript Modules for JS plugins? With services like Skypack (https://www.skypack.dev) NPM libraries can be loaded directly into browser, no build step needed. Same goes for local JS if you adhere to ES Module spec. If minification is required then tools such as Snowpack (https://www.snowpack.dev) could fit better. It uses https://github.com/evanw/esbuild for bundling and minification. On plugins you'd simply:
In Datasette HTML pages' head you'd merely import these files as modules one by one. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
JavaScript plugin hooks mechanism similar to pluggy 712260429 | |
753218817 | https://github.com/simonw/datasette/issues/983#issuecomment-753218817 | https://api.github.com/repos/simonw/datasette/issues/983 | MDEyOklzc3VlQ29tbWVudDc1MzIxODgxNw== | yozlet 173848 | 2020-12-31T22:32:25Z | 2020-12-31T22:32:25Z | NONE | Amazing work! And you've put in far more work than I'd expect to reduce the payload (which is admirable). So, to add a plugin with the current design, it goes in (a) the template or (b) a bookmarklet, right? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
JavaScript plugin hooks mechanism similar to pluggy 712260429 | |
753033121 | https://github.com/simonw/datasette/issues/1165#issuecomment-753033121 | https://api.github.com/repos/simonw/datasette/issues/1165 | MDEyOklzc3VlQ29tbWVudDc1MzAzMzEyMQ== | dracos 154364 | 2020-12-31T19:33:47Z | 2020-12-31T19:33:47Z | NONE | Sorry to go on about it, but it's my only example ;) And thought it might be of interest/use. Here is FixMyStreet's Cypress workflow https://github.com/mysociety/fixmystreet/blob/master/.github/workflows/cypress.yml with the master script that sets up server etc at https://github.com/mysociety/fixmystreet/blob/master/bin/browser-tests (that has features such as working inside/outside Vagrant, and can do JS code coverage) and then the tests are at https://github.com/mysociety/fixmystreet/tree/master/.cypress/cypress/integration |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Mechanism for executing JavaScript unit tests 776635426 | |
752882797 | https://github.com/simonw/datasette/issues/983#issuecomment-752882797 | https://api.github.com/repos/simonw/datasette/issues/983 | MDEyOklzc3VlQ29tbWVudDc1Mjg4Mjc5Nw== | dracos 154364 | 2020-12-31T08:07:59Z | 2020-12-31T15:04:32Z | NONE | If you're using arrow functions, you can presumably use default parameters, not much difference in support. That would save you 9 bytes. But OTOH you need Your latest 250-byte one, with use strict, gzips to 199 bytes. The following might be 292 bytes, but compresses to 204, basically the same, and works in any browser (well, IE9+) at all:
Source for that is below; I replaced the [fn,parameters] because closure-compiler includes a polyfill for that, and I ran
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
JavaScript plugin hooks mechanism similar to pluggy 712260429 | |
752888552 | https://github.com/simonw/datasette/issues/983#issuecomment-752888552 | https://api.github.com/repos/simonw/datasette/issues/983 | MDEyOklzc3VlQ29tbWVudDc1Mjg4ODU1Mg== | dracos 154364 | 2020-12-31T08:33:11Z | 2020-12-31T08:34:27Z | NONE | If you could say that all hook functions had to accept one options parameter (and could use object destructuring if they wished to only see a subset), you could have this, which minifies (to all-browser-JS) to 200 bytes, gzips to 146, and works practically the same:
Called the same, definitions tiny bit different:
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
JavaScript plugin hooks mechanism similar to pluggy 712260429 | |
751504136 | https://github.com/simonw/datasette/issues/417#issuecomment-751504136 | https://api.github.com/repos/simonw/datasette/issues/417 | MDEyOklzc3VlQ29tbWVudDc1MTUwNDEzNg== | drewda 212369 | 2020-12-27T19:02:06Z | 2020-12-27T19:02:06Z | NONE | Very much looking forward to seeing this functionality come together. This is probably out-of-scope for an initial release, but in the future it could be useful to also think of how to run this in a containerized context. For example, an immutable datasette container that points to an S3 bucket of SQLite DBs or CSVs. Or an immutable datasette container pointing to a NFS volume elsewhere on a Kubernetes cluster. |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Datasette Library 421546944 | |
751476406 | https://github.com/simonw/datasette/issues/1150#issuecomment-751476406 | https://api.github.com/repos/simonw/datasette/issues/1150 | MDEyOklzc3VlQ29tbWVudDc1MTQ3NjQwNg== | noklam 18221871 | 2020-12-27T14:51:39Z | 2020-12-27T14:51:39Z | NONE | I like the idea of _internal, it's a nice way to get a data catalog quickly. I wonder if this trick applies to db other than SQLite. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Maintain an in-memory SQLite table of connected databases and their tables 770436876 | |
751375487 | https://github.com/dogsheep/github-to-sqlite/pull/59#issuecomment-751375487 | https://api.github.com/repos/dogsheep/github-to-sqlite/issues/59 | MDEyOklzc3VlQ29tbWVudDc1MTM3NTQ4Nw== | frosencrantz 631242 | 2020-12-26T17:08:44Z | 2020-12-26T17:08:44Z | NONE | Hi @simonw, do I need to do anything else for this PR to be considered to be included? I've tried using this project and it is quite nice to be able to explore a repository, but noticed that a couple commands don't allow you to use authorization from the environment variable. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Remove unneeded exists=True for -a/--auth flag. 771872303 | |
751127384 | https://github.com/simonw/datasette/issues/417#issuecomment-751127384 | https://api.github.com/repos/simonw/datasette/issues/417 | MDEyOklzc3VlQ29tbWVudDc1MTEyNzM4NA== | dyllan-to-yu 1279360 | 2020-12-24T22:56:48Z | 2020-12-24T22:56:48Z | NONE | Instead of scanning the directory every 10s, have you considered listening for native system events to notify you of updates? I think Python has a nice module to do this for you called watchdog. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Datasette Library 421546944 | |
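A hedged sketch of the watchdog suggestion in the comment above — reacting to SQLite files appearing or changing in a directory instead of re-scanning it every 10 seconds. The directory path, the `*.db` pattern and the `print` placeholders are assumptions, not Datasette code:

```python
import time

from watchdog.events import PatternMatchingEventHandler
from watchdog.observers import Observer

class DatabaseDirHandler(PatternMatchingEventHandler):
    """React to *.db files appearing or changing, instead of polling."""

    def on_created(self, event):
        print(f"new database file: {event.src_path}")  # e.g. attach it here

    def on_modified(self, event):
        print(f"database file changed: {event.src_path}")

observer = Observer()
observer.schedule(DatabaseDirHandler(patterns=["*.db"]), path="./dbs", recursive=False)
observer.start()
try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    observer.stop()
observer.join()
```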
751125270 | https://github.com/dogsheep/dogsheep-photos/issues/28#issuecomment-751125270 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/28 | MDEyOklzc3VlQ29tbWVudDc1MTEyNTI3MA== | jmelloy 129786 | 2020-12-24T22:26:22Z | 2020-12-24T22:26:22Z | NONE | This comes up if you’ve run the photo export without running an S3 upload. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Invalid SQL no such table: main.uploads 624490929 | |
750849460 | https://github.com/simonw/datasette/pull/1159#issuecomment-750849460 | https://api.github.com/repos/simonw/datasette/issues/1159 | MDEyOklzc3VlQ29tbWVudDc1MDg0OTQ2MA== | codecov[bot] 22429695 | 2020-12-24T11:07:35Z | 2020-12-24T11:29:21Z | NONE | Codecov Report
Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Improve the display of facets information 774332247 | |
750373496 | https://github.com/simonw/datasette/pull/1158#issuecomment-750373496 | https://api.github.com/repos/simonw/datasette/issues/1158 | MDEyOklzc3VlQ29tbWVudDc1MDM3MzQ5Ng== | codecov[bot] 22429695 | 2020-12-23T16:26:06Z | 2020-12-23T16:26:06Z | NONE | Codecov Report
| Impacted Files | Coverage Δ |
| --- | --- |
| datasette/cli.py | 77.41% <ø> (ø) |
| datasette/facets.py | 89.04% <ø> (ø) |
| datasette/filters.py | 94.35% <ø> (ø) |
| datasette/hookspecs.py | 100.00% <ø> (ø) |
| datasette/inspect.py | 36.11% <ø> (ø) |
| datasette/renderer.py | 94.02% <ø> (ø) |
| datasette/views/base.py | 95.01% <50.00%> (ø) |
| datasette/app.py | 95.85% <100.00%> (ø) |
| datasette/utils/__init__.py | 94.11% <100.00%> (ø) |
| datasette/utils/asgi.py | 92.13% <100.00%> (ø) |
| ... and 1 more | |
Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Modernize code to Python 3.6+ 773913793 | |
748436115 | https://github.com/dogsheep/dogsheep-photos/issues/15#issuecomment-748436115 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/15 | MDEyOklzc3VlQ29tbWVudDc0ODQzNjExNQ== | nickvazz 8573886 | 2020-12-19T07:43:38Z | 2020-12-19T07:47:36Z | NONE | Hey Simon! I really enjoy datasette so far, just started trying it out today following your iPhone photos example. I am not sure if you had run into this or not, but it seems like they might have changed one of the column names from |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Expose scores from ZCOMPUTEDASSETATTRIBUTES 612151767 | |
748436453 | https://github.com/dogsheep/twitter-to-sqlite/issues/53#issuecomment-748436453 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/53 | MDEyOklzc3VlQ29tbWVudDc0ODQzNjQ1Mw== | anotherjesse 27 | 2020-12-19T07:47:01Z | 2020-12-19T07:47:01Z | NONE | I think this should probably be closed as won't fix. Attempting to make a patch for this, I realized that since_id would limit to tweets posted since that since_id, not to when they were favorited. So favoriting something older would be missed if you used it. Better to just use |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
--since support for favorites 771324837 | |
748436195 | https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-748436195 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21 | MDEyOklzc3VlQ29tbWVudDc0ODQzNjE5NQ== | nickvazz 8573886 | 2020-12-19T07:44:32Z | 2020-12-19T07:44:49Z | NONE | I have also run into this a bit. Would it be possible to post your |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
bpylist.archiver.CircularReference: archive has a cycle with uid(13) 615474990 | |
747130908 | https://github.com/dogsheep/google-takeout-to-sqlite/issues/2#issuecomment-747130908 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/2 | MDEyOklzc3VlQ29tbWVudDc0NzEzMDkwOA== | khimaros 231498 | 2020-12-17T00:47:04Z | 2020-12-17T00:47:43Z | NONE | It looks like almost all of the memory consumption is coming from
Another direction here may be to use the new "Semantic Location History" data, which is already broken down by year and month. It also provides much more interesting data, such as estimated address, form of travel, etc. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
killed by oomkiller on large location-history 769376447 | |
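The exact call the commenter measured was stripped from this export, but the memory problem described — loading the whole Location History JSON in one go — can be sidestepped by streaming the file. A hedged sketch using ijson (my choice of library, not necessarily what the project adopted); field names vary between Takeout versions:

```python
import ijson  # assumption: a streaming JSON parser, not what the tool currently uses

def iter_locations(path):
    """Yield one location record at a time instead of loading gigabytes at once."""
    with open(path, "rb") as f:
        # Google Takeout's Location History JSON has a top-level "locations" array
        yield from ijson.items(f, "locations.item")

for i, location in enumerate(iter_locations("Location History.json")):
    if i >= 3:
        break
    print(location.get("latitudeE7"), location.get("longitudeE7"), location.get("timestampMs"))
```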
745162571 | https://github.com/simonw/datasette/issues/1142#issuecomment-745162571 | https://api.github.com/repos/simonw/datasette/issues/1142 | MDEyOklzc3VlQ29tbWVudDc0NTE2MjU3MQ== | nitinpaul 6622733 | 2020-12-15T09:22:58Z | 2020-12-15T09:22:58Z | NONE | You're right, it's probably more straightforward to have the links for JSON. I was imagining toggling the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"Stream all rows" is not at all obvious 763361458 | |
744618787 | https://github.com/simonw/datasette/issues/1143#issuecomment-744618787 | https://api.github.com/repos/simonw/datasette/issues/1143 | MDEyOklzc3VlQ29tbWVudDc0NDYxODc4Nw== | yurivish 114388 | 2020-12-14T18:15:00Z | 2020-12-15T02:21:53Z | NONE | From a quick look at the README, it does seem to do everything I need, thanks! I think the argument for inclusion in core is to lower the chances of unwanted data access. A local server can be accessed by anybody who can make an HTTP request to your computer regardless of CORS rules, but the default That's probably not what people typically intend, particularly when the data is of a sensitive nature. A default of requiring the user to specify the origin (allowing |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
More flexible CORS support in core, to encourage good security practices 764059235 | |
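To make the "require the user to specify the origin" idea in the comment above concrete, here is a minimal ASGI middleware sketch that only echoes back explicitly allowed origins instead of sending a blanket `Access-Control-Allow-Origin: *`. This is an illustration under my own assumptions, not Datasette's actual `--cors` implementation:

```python
class RestrictedCORSMiddleware:
    """Only emit Access-Control-Allow-Origin for origins on an allow list."""

    def __init__(self, app, allowed_origins):
        self.app = app
        self.allowed = set(allowed_origins)

    async def __call__(self, scope, receive, send):
        if scope["type"] != "http":
            return await self.app(scope, receive, send)
        request_headers = dict(scope.get("headers") or [])
        origin = request_headers.get(b"origin", b"").decode("latin-1")

        async def send_wrapper(message):
            if message["type"] == "http.response.start" and origin in self.allowed:
                headers = list(message.get("headers", []))
                headers.append((b"access-control-allow-origin", origin.encode("latin-1")))
                message["headers"] = headers
            await send(message)

        await self.app(scope, receive, send_wrapper)

# usage sketch: app = RestrictedCORSMiddleware(app, ["https://example.com"])
```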
744522099 | https://github.com/simonw/datasette/issues/1142#issuecomment-744522099 | https://api.github.com/repos/simonw/datasette/issues/1142 | MDEyOklzc3VlQ29tbWVudDc0NDUyMjA5OQ== | nitinpaul 6622733 | 2020-12-14T15:37:47Z | 2020-12-14T15:37:47Z | NONE | Alright, I could give it a try! This might be a stupid question, but can you tell me how to run the server from my fork, so that I can test the changes? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"Stream all rows" is not at all obvious 763361458 | |
744489028 | https://github.com/simonw/datasette/issues/1144#issuecomment-744489028 | https://api.github.com/repos/simonw/datasette/issues/1144 | MDEyOklzc3VlQ29tbWVudDc0NDQ4OTAyOA== | MarkusH 475613 | 2020-12-14T14:47:11Z | 2020-12-14T14:47:11Z | NONE | Thanks for opening the issue, @simonw. Let me elaborate on my Tweets. datasette-chartjs provides drop-down lists to pick the chart visualization (e.g. bar, line, doughnut, pie, ...) as well as the column used for the "x axis" (e.g. time). A user can change the values on demand, and the chart will be redrawn without querying the database again. However, if a user wants to change the underlying query, they will use the SQL field provided by datasette or any of the other datasette built-in features to amend a query. In order to maintain a user's selections for the plugin, datasette-chartjs copies some parts of datasette-vega which persist the chosen visualization and column in the hash part of a URL (the stuff behind the
Additionally, datasette-vega and datasette-chartjs need to make sure to include the hash in all links and forms that cause a reload of the page, so that the config persists between clicks. This ticket is about moving the parts that provide this functionality into datasette. This includes:
There's another, optional, feature that we might want to think about during the design phase: the scope of the config. Links within a datasette instance have 1 of 3 scopes:
When updating the links and forms as pointed out in 3. above, it might be worth considering which links need to be updated. I could imagine a plugin that wants to persist some setting across all tables within a database but another setting only within a table. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
JavaScript to help plugins interact with the fragment part of the URL 765637324 | |
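The persistence trick described in the comment above — stashing plugin settings in the URL fragment so they survive reloads — boils down to encoding and decoding a small key/value map. A language-neutral sketch in Python (the language used for the other sketches in this export), with made-up keys, just to illustrate the round trip the proposed JavaScript helper would own:

```python
from urllib.parse import parse_qs, urlencode, urlsplit, urlunsplit

def set_fragment_config(url, **config):
    """Store plugin settings (e.g. chart type, x column) in the #fragment."""
    scheme, netloc, path, query, fragment = urlsplit(url)
    settings = {k: v[0] for k, v in parse_qs(fragment).items()}
    settings.update({k: str(v) for k, v in config.items()})
    return urlunsplit((scheme, netloc, path, query, urlencode(settings)))

def get_fragment_config(url):
    """Read the settings back out of the fragment."""
    return {k: v[0] for k, v in parse_qs(urlsplit(url).fragment).items()}

url = set_fragment_config("https://example.com/db/table?sql=select+1", chart="bar", x="date")
print(url)                       # ...?sql=select+1#chart=bar&x=date
print(get_fragment_config(url))  # {'chart': 'bar', 'x': 'date'}
```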
744475543 | https://github.com/simonw/datasette/pull/1145#issuecomment-744475543 | https://api.github.com/repos/simonw/datasette/issues/1145 | MDEyOklzc3VlQ29tbWVudDc0NDQ3NTU0Mw== | codecov[bot] 22429695 | 2020-12-14T14:26:25Z | 2020-12-14T14:26:25Z | NONE | Codecov Report
Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update pytest requirement from <6.2.0,>=5.2.2 to >=5.2.2,<6.3.0 766494367 | |
744461856 | https://github.com/simonw/datasette/issues/276#issuecomment-744461856 | https://api.github.com/repos/simonw/datasette/issues/276 | MDEyOklzc3VlQ29tbWVudDc0NDQ2MTg1Ng== | robintw 296686 | 2020-12-14T14:04:57Z | 2020-12-14T14:04:57Z | NONE | I'm looking into using datasette with a database with spatialite geometry columns, and came across this issue. Has there been any progress on this since 2018? In one of my tables I'm just storing lat/lon points in a spatialite point geometry, and I've managed to make datasette-cluster-map display the points by extracting the lat and lon in SQL - using something like |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Handle spatialite geometry columns better 324835838 | |
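The commenter's exact SQL was stripped by this export. A hedged sketch of the same idea — pulling plain lat/lon floats out of a SpatiaLite point geometry so a plugin like datasette-cluster-map can use them — where the database path, table name and column name are assumptions, and the extension name varies by platform:

```python
import sqlite3

conn = sqlite3.connect("spatial.db")
conn.enable_load_extension(True)          # requires a Python build that allows extension loading
conn.load_extension("mod_spatialite")     # name/path differs across platforms

# SpatiaLite's X()/Y() functions return plain floats from a point geometry,
# which can be exposed as longitude/latitude columns in a query.
rows = conn.execute(
    """
    SELECT id, Y(geom) AS latitude, X(geom) AS longitude
    FROM places
    LIMIT 5
    """
).fetchall()
print(rows)
```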
744003454 | https://github.com/simonw/datasette/pull/1031#issuecomment-744003454 | https://api.github.com/repos/simonw/datasette/issues/1031 | MDEyOklzc3VlQ29tbWVudDc0NDAwMzQ1NA== | frankier 299380 | 2020-12-13T12:52:56Z | 2020-12-13T12:52:56Z | NONE | Please let me know if there's anything I can do to help get this merged. This is causing problems for me because it means when I build my Docker image my databases aren't considered immutable, which I would like them to be so that a download link is produced. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Fallback to databases in inspect-data.json when no -i options are passed 724369025 |
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);