issue_comments
996 rows where author_association = "NONE" sorted by updated_at descending
created_at (date) 419
- 2021-01-22 8
- 2020-05-07 6
- 2021-02-12 6
- 2021-03-04 6
- 2021-08-24 6
- 2019-12-18 5
- 2020-12-14 5
- 2020-12-31 5
- 2021-07-22 5
- 2018-06-21 4
- 2020-02-24 4
- 2020-10-09 4
- 2020-12-09 4
- 2021-02-18 4
- 2021-02-22 4
- 2021-02-23 4
- 2021-03-09 4
- 2021-03-10 4
- 2021-04-02 4
- 2021-06-12 4
- 2021-08-12 4
- 2021-11-29 4
- 2021-12-08 4
- 2022-02-02 4
- 2018-03-27 3
- 2018-04-09 3
- 2018-05-29 3
- 2019-06-18 3
- 2019-06-22 3
- 2019-06-25 3
- 2019-07-08 3
- 2019-07-20 3
- 2019-10-14 3
- 2020-01-29 3
- 2020-03-23 3
- 2020-04-28 3
- 2020-04-29 3
- 2020-05-21 3
- 2020-05-28 3
- 2020-06-24 3
- 2020-10-17 3
- 2020-11-12 3
- 2020-11-30 3
- 2020-12-01 3
- 2020-12-19 3
- 2020-12-24 3
- 2021-01-04 3
- 2021-01-18 3
- 2021-02-20 3
- 2021-03-05 3
- 2021-03-07 3
- 2021-03-18 3
- 2021-03-31 3
- 2021-05-12 3
- 2021-05-17 3
- 2021-05-21 3
- 2021-06-14 3
- 2021-06-17 3
- 2021-10-12 3
- 2021-10-30 3
- 2021-12-17 3
- 2022-02-09 3
- 2022-02-16 3
- 2022-03-13 3
- 2017-11-14 2
- 2017-11-15 2
- 2017-11-26 2
- 2018-01-09 2
- 2018-05-21 2
- 2018-06-18 2
- 2018-07-12 2
- 2019-03-14 2
- 2019-05-21 2
- 2019-06-05 2
- 2019-06-23 2
- 2019-10-10 2
- 2019-10-12 2
- 2019-10-27 2
- 2019-10-30 2
- 2019-11-26 2
- 2019-12-03 2
- 2020-02-12 2
- 2020-05-03 2
- 2020-05-11 2
- 2020-09-23 2
- 2020-09-24 2
- 2020-10-12 2
- 2020-10-19 2
- 2020-10-20 2
- 2020-10-21 2
- 2020-10-22 2
- 2020-10-23 2
- 2020-10-28 2
- 2020-10-29 2
- 2020-10-30 2
- 2020-11-13 2
- 2020-11-17 2
- 2020-11-20 2
- 2020-12-04 2
- 2020-12-10 2
- 2020-12-13 2
- 2020-12-27 2
- 2021-01-03 2
- 2021-01-12 2
- 2021-02-05 2
- 2021-02-06 2
- 2021-02-07 2
- 2021-02-17 2
- 2021-03-03 2
- 2021-03-13 2
- 2021-03-21 2
- 2021-04-03 2
- 2021-04-20 2
- 2021-04-28 2
- 2021-04-29 2
- 2021-05-24 2
- 2021-05-27 2
- 2021-06-10 2
- 2021-06-11 2
- 2021-07-08 2
- 2021-07-11 2
- 2021-07-18 2
- 2021-08-03 2
- 2021-08-10 2
- 2021-08-26 2
- 2021-09-08 2
- 2021-09-28 2
- 2021-09-29 2
- 2021-10-18 2
- 2021-10-26 2
- 2021-11-01 2
- 2021-11-03 2
- 2021-11-16 2
- 2021-11-20 2
- 2021-12-26 2
- 2022-01-06 2
- 2022-01-12 2
- 2022-02-17 2
- 2022-02-24 2
- 2022-03-05 2
- 2022-03-06 2
- 2022-03-12 2
- 2022-03-15 2
- 2022-03-19 2
- 2022-03-20 2
- 2022-03-29 2
- 2017-11-16 1
- 2017-11-19 1
- 2017-11-22 1
- 2017-11-27 1
- 2017-11-29 1
- 2017-11-30 1
- 2017-12-07 1
- 2017-12-08 1
- 2018-01-05 1
- 2018-01-08 1
- 2018-01-23 1
- 2018-02-26 1
- 2018-03-05 1
- 2018-03-21 1
- 2018-04-03 1
- 2018-04-06 1
- 2018-04-15 1
- 2018-04-16 1
- 2018-05-11 1
- 2018-06-20 1
- 2018-06-27 1
- 2018-08-13 1
- 2018-08-21 1
- 2018-08-31 1
- 2018-09-05 1
- 2018-09-11 1
- 2018-10-05 1
- 2018-10-08 1
- 2018-10-22 1
- 2018-11-15 1
- 2018-11-16 1
- 2019-01-04 1
- 2019-01-18 1
- 2019-01-19 1
- 2019-02-15 1
- 2019-02-16 1
- 2019-02-22 1
- 2019-03-15 1
- 2019-04-07 1
- 2019-04-14 1
- 2019-04-22 1
- 2019-05-04 1
- 2019-05-07 1
- 2019-05-23 1
- 2019-05-29 1
- 2019-05-31 1
- 2019-06-09 1
- 2019-06-13 1
- 2019-06-24 1
- 2019-06-26 1
- 2019-06-29 1
- 2019-07-04 1
- 2019-07-15 1
- 2019-07-17 1
- 2019-07-18 1
- 2019-07-19 1
- 2019-07-22 1
- 2019-07-24 1
- 2019-07-26 1
- 2019-08-03 1
- 2019-08-07 1
- 2019-08-23 1
- 2019-10-08 1
- 2019-10-13 1
- 2019-10-21 1
- 2019-10-28 1
- 2019-10-29 1
- 2019-10-31 1
- 2019-11-07 1
- 2019-11-08 1
- 2019-11-11 1
- 2019-11-12 1
- 2019-11-30 1
- 2020-01-06 1
- 2020-01-07 1
- 2020-01-10 1
- 2020-01-16 1
- 2020-01-17 1
- 2020-01-21 1
- 2020-01-30 1
- 2020-01-31 1
- 2020-02-07 1
- 2020-02-10 1
- 2020-02-11 1
- 2020-02-16 1
- 2020-02-22 1
- 2020-02-29 1
- 2020-03-01 1
- 2020-03-22 1
- 2020-03-24 1
- 2020-03-25 1
- 2020-03-28 1
- 2020-04-05 1
- 2020-04-15 1
- 2020-04-16 1
- 2020-04-21 1
- 2020-04-26 1
- 2020-05-01 1
- 2020-05-04 1
- 2020-05-06 1
- 2020-05-08 1
- 2020-05-10 1
- 2020-05-24 1
- 2020-05-26 1
- 2020-05-27 1
- 2020-06-10 1
- 2020-06-11 1
- 2020-06-12 1
- 2020-06-14 1
- 2020-06-17 1
- 2020-06-22 1
- 2020-06-23 1
- 2020-06-26 1
- 2020-06-27 1
- 2020-07-01 1
- 2020-07-03 1
- 2020-07-21 1
- 2020-07-24 1
- 2020-07-26 1
- 2020-07-29 1
- 2020-08-09 1
- 2020-08-12 1
- 2020-08-15 1
- 2020-08-16 1
- 2020-09-12 1
- 2020-09-16 1
- 2020-09-21 1
- 2020-09-27 1
- 2020-09-28 1
- 2020-10-01 1
- 2020-10-05 1
- 2020-10-06 1
- 2020-10-14 1
- 2020-10-18 1
- 2020-10-25 1
- 2020-10-27 1
- 2020-11-02 1
- 2020-11-04 1
- 2020-11-15 1
- 2020-11-18 1
- 2020-11-28 1
- 2020-11-29 1
- 2020-12-02 1
- 2020-12-08 1
- 2020-12-12 1
- 2020-12-15 1
- 2020-12-17 1
- 2020-12-23 1
- 2021-01-05 1
- 2021-01-07 1
- 2021-01-11 1
- 2021-01-13 1
- 2021-01-15 1
- 2021-01-25 1
- 2021-01-26 1
- 2021-01-29 1
- 2021-01-31 1
- 2021-02-01 1
- 2021-02-03 1
- 2021-02-11 1
- 2021-02-16 1
- 2021-02-24 1
- 2021-02-25 1
- 2021-02-27 1
- 2021-03-14 1
- 2021-03-19 1
- 2021-03-22 1
- 2021-03-23 1
- 2021-03-24 1
- 2021-03-25 1
- 2021-03-27 1
- 2021-03-29 1
- 2021-04-05 1
- 2021-04-12 1
- 2021-04-14 1
- 2021-04-19 1
- 2021-04-21 1
- 2021-04-22 1
- 2021-04-26 1
- 2021-04-27 1
- 2021-04-30 1
- 2021-05-03 1
- 2021-05-05 1
- 2021-05-11 1
- 2021-05-18 1
- 2021-05-19 1
- 2021-05-20 1
- 2021-05-26 1
- 2021-05-28 1
- 2021-06-02 1
- 2021-06-05 1
- 2021-06-06 1
- 2021-06-09 1
- 2021-06-15 1
- 2021-06-16 1
- 2021-06-22 1
- 2021-06-25 1
- 2021-06-27 1
- 2021-06-28 1
- 2021-07-02 1
- 2021-07-19 1
- 2021-07-23 1
- 2021-07-28 1
- 2021-07-30 1
- 2021-08-01 1
- 2021-08-07 1
- 2021-08-13 1
- 2021-08-18 1
- 2021-08-23 1
- 2021-08-30 1
- 2021-09-02 1
- 2021-09-14 1
- 2021-09-16 1
- 2021-09-21 1
- 2021-09-22 1
- 2021-09-26 1
- 2021-10-05 1
- 2021-10-07 1
- 2021-10-11 1
- 2021-10-13 1
- 2021-10-16 1
- 2021-10-17 1
- 2021-10-19 1
- 2021-10-28 1
- 2021-11-05 1
- 2021-11-12 1
- 2021-11-13 1
- 2021-11-15 1
- 2021-11-17 1
- 2021-11-18 1
- 2021-11-23 1
- 2021-11-24 1
- 2021-11-25 1
- 2021-11-30 1
- 2021-12-01 1
- 2021-12-02 1
- 2021-12-04 1
- 2021-12-06 1
- 2021-12-10 1
- 2021-12-13 1
- 2021-12-14 1
- 2021-12-21 1
- 2021-12-29 1
- 2021-12-31 1
- 2022-01-08 1
- 2022-01-09 1
- 2022-01-11 1
- 2022-01-13 1
- 2022-01-19 1
- 2022-01-20 1
- 2022-01-25 1
- 2022-01-28 1
- 2022-01-31 1
- 2022-02-03 1
- 2022-02-04 1
- 2022-02-06 1
- 2022-02-07 1
- 2022-02-11 1
- 2022-02-15 1
- 2022-02-26 1
- 2022-03-04 1
- 2022-03-08 1
- 2022-03-11 1
- 2022-03-21 1
- 2022-03-23 1
- 2022-03-25 1
- 2022-03-28 1
- 2022-03-31 1
- 2022-04-08 1
- 2022-04-14 1
- 2022-04-15 1
- 2022-04-21 1
- 2022-04-23 1
user 223
- codecov[bot] 141
- aborruso 19
- chrismp 18
- carlmjohnson 14
- tballison 13
- psychemedia 10
- terrycojones 10
- stonebig 10
- maxhawkins 9
- clausjuhl 9
- rayvoelker 9
- 20after4 8
- UtahDave 8
- tomchristie 8
- bsilverm 8
- dracos 7
- mhalle 7
- zeluspudding 7
- cobiadigital 7
- frafra 6
- zaneselvans 6
- tsibley 5
- khusmann 5
- khimaros 5
- MarkusH 5
- dazzag24 5
- SteadBytes 5
- Btibert3 4
- dholth 4
- lovasoa 4
- jungle-boogie 4
- ColinMaudry 4
- nitinpaultifr 4
- Kabouik 4
- henry501 4
- frankieroberto 3
- fs111 3
- obra 3
- janimo 3
- atomotic 3
- pkoppstein 3
- yozlet 3
- yschimke 3
- philroche 3
- wsxiaoys 3
- xrotwang 3
- robroc 3
- dufferzafar 3
- Florents-Tselai 3
- ashishdotme 3
- Segerberg 3
- jsancho-gpl 3
- learning4life 3
- FabianHertwig 3
- polyrand 3
- pjamargh 3
- garethr 2
- jayvdb 2
- ftrain 2
- chrishas35 2
- coleifer 2
- gavinband 2
- aviflax 2
- tholo 2
- frankier 2
- lchski 2
- tmaier 2
- eads 2
- leafgarland 2
- strada 2
- Mjboothaus 2
- eelkevdbos 2
- ligurio 2
- n8henrie 2
- soobrosa 2
- nathancahill 2
- betatim 2
- bsmithgall 2
- willingc 2
- nattaylor 2
- durkie 2
- wulfmann 2
- philshem 2
- bram2000 2
- zzeleznick 2
- plpxsk 2
- swyxio 2
- nickvazz 2
- aaronyih1 2
- jussiarpalahti 2
- lagolucas 2
- chekos 2
- ad-si 2
- smithdc1 2
- gsajko 2
- null92 2
- rachelmarconi 2
- tunguyenatwork 2
- LVerneyPEReN 2
- anotherjesse 1
- jarib 1
- jokull 1
- danp 1
- dsisnero 1
- gijs 1
- blaine 1
- gravis 1
- nkirsch 1
- mrchrisadams 1
- dkam 1
- harperreed 1
- furilo 1
- prabhur 1
- dmd 1
- Uninen 1
- carsonyl 1
- nryberg 1
- step21 1
- stefanocudini 1
- tannewt 1
- rcoup 1
- scoates 1
- hpk42 1
- annapowellsmith 1
- thorn0 1
- yurivish 1
- jmelloy 1
- Krazybug 1
- dvhthomas 1
- phubbard 1
- sethvincent 1
- aitoehigie 1
- michaelmcandrew 1
- drewda 1
- stiles 1
- saulpw 1
- thadk 1
- camallen 1
- robintw 1
- astrojuanlu 1
- ipmb 1
- steren 1
- aidansteele 1
- 0x1997 1
- davidszotten 1
- kevboh 1
- eaubin 1
- yunzheng 1
- karlcow 1
- heyarne 1
- simonrjones 1
- justinpinkney 1
- merwok 1
- virtadpt 1
- snth 1
- joshmgrant 1
- bcongdon 1
- nickdirienzo 1
- hannseman 1
- kaihendry 1
- urbas 1
- brimstone 1
- adamchainz 1
- PabloLerma 1
- heussd 1
- RayBB 1
- limar 1
- drkane 1
- Gagravarr 1
- agguser 1
- dyllan-to-you 1
- justinallen 1
- jordaneremieff 1
- wdccdw 1
- progpow 1
- ltrgoddard 1
- costrouc 1
- jratike80 1
- ccorcos 1
- qqilihq 1
- QAInsights 1
- secretGeek 1
- fkuhn 1
- jameslittle230 1
- dskrad 1
- kwladyka 1
- Carib0u 1
- fatihky 1
- phoenixjun 1
- JesperTreetop 1
- bapowell 1
- chris48s 1
- ChristopherWilks 1
- Maltazar 1
- henrikek 1
- wuhland 1
- foscoj 1
- dvot197007 1
- kokes 1
- csusanu 1
- metab0t 1
- luxint 1
- spdkils 1
- sturzl 1
- robmarkcole 1
- jfeiwell 1
- coisnepe 1
- chmaynard 1
- noklam 1
- GmGniap 1
- rdtq 1
- AnkitKundariya 1
- LucasElArruda 1
- duarteocarmo 1
- sarcasticadmin 1
- patricktrainer 1
- justmars 1
- miuku 1
- jcmkk3 1
- publicmatt 1
- thisismyfuckingusername 1
- kirajano 1
- knowledgecamp12 1
id | html_url | issue_url | node_id | user | created_at | updated_at ▲ | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
1107459446 | https://github.com/simonw/datasette/pull/1717#issuecomment-1107459446 | https://api.github.com/repos/simonw/datasette/issues/1717 | IC_kwDOBm6k_c5CAn12 | codecov[bot] 22429695 | 2022-04-23T11:56:36Z | 2022-04-23T11:56:36Z | NONE | Codecov Report
```diff
@@           Coverage Diff           @@
##             main    #1717   +/-   ##
=======================================
  Coverage   91.75%   91.75%
=======================================
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| datasette/publish/cloudrun.py | | |

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add timeout option to Cloudrun build 1213281044 | |
1105464661 | https://github.com/simonw/datasette/pull/1574#issuecomment-1105464661 | https://api.github.com/repos/simonw/datasette/issues/1574 | IC_kwDOBm6k_c5B5A1V | dholth 208018 | 2022-04-21T16:51:24Z | 2022-04-21T16:51:24Z | NONE | tfw you have more ephemeral storage than upstream bandwidth

```
FROM python:3.10-slim AS base
RUN apt update && apt -y install zstd
ENV DATASETTE_SECRET 'sosecret'
RUN --mount=type=cache,target=/root/.cache/pip pip install -U datasette datasette-pretty-json datasette-graphql
ENV PORT 8080
EXPOSE 8080

FROM base AS pack
COPY . /app
WORKDIR /app
RUN datasette inspect --inspect-file inspect-data.json
RUN zstd --rm *.db

FROM base AS unpack
COPY --from=pack /app /app
WORKDIR /app
CMD ["/bin/bash", "-c", "shopt -s nullglob && zstd --rm -d *.db.zst && datasette serve --host 0.0.0.0 --cors --inspect-file inspect-data.json --metadata metadata.json --create --port $PORT *.db"]
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
introduce new option for datasette package to use a slim base image 1084193403 | |
1100243987 | https://github.com/simonw/datasette/pull/1159#issuecomment-1100243987 | https://api.github.com/repos/simonw/datasette/issues/1159 | IC_kwDOBm6k_c5BlGQT | lovasoa 552629 | 2022-04-15T17:24:43Z | 2022-04-15T17:24:43Z | NONE | @simonw: do you think this could be merged? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Improve the display of facets information 774332247 | |
1099443468 | https://github.com/simonw/datasette/issues/1713#issuecomment-1099443468 | https://api.github.com/repos/simonw/datasette/issues/1713 | IC_kwDOBm6k_c5BiC0M | rayvoelker 9308268 | 2022-04-14T17:26:27Z | 2022-04-14T17:26:27Z | NONE | What would be an awesome feature as a plugin would be to be able to save a query (and possibly even results) to a github gist. Being able to share results that way would be super fantastic. Possibly even in Jupyter Notebook format (since github and github gists nicely render those)! I know there's the handy datasette-saved-queries plugin, but a button that could export stuff out and then even possibly import stuff back in (I'm sort of thinking the way that Google Colab allows you to save to github, and then pull the notebook back in is a really great workflow https://github.com/cincinnatilibrary/collection-analysis/blob/master/reports/colab_datasette_example.ipynb ) |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Datasette feature for publishing snapshots of query results 1203943272 | |
1092850719 | https://github.com/simonw/datasette/pull/1703#issuecomment-1092850719 | https://api.github.com/repos/simonw/datasette/issues/1703 | IC_kwDOBm6k_c5BI5Qf | codecov[bot] 22429695 | 2022-04-08T13:18:04Z | 2022-04-08T13:18:04Z | NONE | Codecov Report
```diff
@@           Coverage Diff           @@
##             main    #1703   +/-   ##
=======================================
  Coverage   91.75%   91.75%
=======================================
```

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update beautifulsoup4 requirement from <4.11.0,>=4.8.1 to >=4.8.1,<4.12.0 1197298420 | |
1084216224 | https://github.com/simonw/datasette/pull/1574#issuecomment-1084216224 | https://api.github.com/repos/simonw/datasette/issues/1574 | IC_kwDOBm6k_c5An9Og | fs111 33631 | 2022-03-31T07:45:25Z | 2022-03-31T07:45:25Z | NONE | @simonw I like that you want to go "slim by default". Do you want another PR for that or should I just wait? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
introduce new option for datasette package to use a slim base image 1084193403 | |
1082476727 | https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1082476727 | https://api.github.com/repos/simonw/sqlite-utils/issues/420 | IC_kwDOCGYnMM5AhUi3 | strada 770231 | 2022-03-29T23:52:38Z | 2022-03-29T23:52:38Z | NONE | @simonw Thanks for looking into it and documenting the solution! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Document how to use a `--convert` function that runs initialization code first 1178546862 | |
1081860312 | https://github.com/simonw/datasette/pull/1694#issuecomment-1081860312 | https://api.github.com/repos/simonw/datasette/issues/1694 | IC_kwDOBm6k_c5Ae-DY | codecov[bot] 22429695 | 2022-03-29T13:17:30Z | 2022-03-29T13:17:30Z | NONE | Codecov Report
```diff
@@           Coverage Diff           @@
##             main    #1694   +/-   ##
=======================================
  Coverage   91.74%   91.74%
=======================================
```

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update click requirement from <8.1.0,>=7.1.1 to >=7.1.1,<8.2.0 1184850675 | |
1081079506 | https://github.com/simonw/sqlite-utils/issues/421#issuecomment-1081079506 | https://api.github.com/repos/simonw/sqlite-utils/issues/421 | IC_kwDOCGYnMM5Ab_bS | learning4life 24938923 | 2022-03-28T19:58:55Z | 2022-03-28T20:05:57Z | NONE | Sure, it is from the documentation example:

```
# Extracting columns into a separate table
sqlite-utils insert global.db power_plants \
    'global_power_plant_database.csv?raw=true' --csv

# Extract those columns:
sqlite-utils extract global.db power_plants country country_long \
    --table countries \
    --fk-column country_id \
    --rename country_long name
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
"Error: near "(": syntax error" when using sqlite-utils indexes CLI 1180427792 | |
1079018557 | https://github.com/simonw/datasette/pull/1685#issuecomment-1079018557 | https://api.github.com/repos/simonw/datasette/issues/1685 | IC_kwDOBm6k_c5AUIQ9 | codecov[bot] 22429695 | 2022-03-25T13:16:48Z | 2022-03-25T13:16:48Z | NONE | Codecov Report
```diff
@@           Coverage Diff           @@
##             main    #1685   +/-   ##
=======================================
  Coverage   91.74%   91.74%
=======================================
```

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update jinja2 requirement from <3.1.0,>=2.10.3 to >=2.10.3,<3.2.0 1180778860 | |
1076662556 | https://github.com/simonw/sqlite-utils/pull/419#issuecomment-1076662556 | https://api.github.com/repos/simonw/sqlite-utils/issues/419 | IC_kwDOCGYnMM5ALJEc | codecov[bot] 22429695 | 2022-03-23T18:12:47Z | 2022-03-23T18:12:47Z | NONE | Codecov Report
```diff
@@           Coverage Diff           @@
##             main     #419   +/-   ##
=======================================
  Coverage   96.55%   96.55%
=======================================
```

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ignore common generated files 1178484369 | |
1074256603 | https://github.com/simonw/sqlite-utils/issues/417#issuecomment-1074256603 | https://api.github.com/repos/simonw/sqlite-utils/issues/417 | IC_kwDOCGYnMM5AB9rb | blaine 9954 | 2022-03-21T18:19:41Z | 2022-03-21T18:19:41Z | NONE | That makes sense; just a little hint that points folks towards doing the right thing might be helpful! fwiw, the reason I was using jq in the first place was just a quick way to extract one attribute from an actual JSON array. When I initially imported it, I got a table with a bunch of embedded JSON values, rather than a native table, because each array entry had two attributes, one with the data I actually wanted. Not sure how common a use-case this is, though (and easily fixed, aside from the jq weirdness!) |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
insert fails on JSONL with whitespace 1175744654 | |
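The preprocessing described in the comment above (pulling one attribute out of each element of a JSON array before inserting, rather than piping through jq) can be sketched in plain Python; the key names here are hypothetical, not the commenter's actual data:

```python
import json

# Each array entry has two attributes; only the "data" attribute holds
# the rows we actually want to insert (key names are illustrative).
raw = '[{"meta": "a", "data": {"id": 1}}, {"meta": "b", "data": {"id": 2}}]'
records = [entry["data"] for entry in json.loads(raw)]
print(records)  # [{'id': 1}, {'id': 2}]
```

The resulting list can then be serialized with `json.dumps(records)` and fed to `sqlite-utils insert`, avoiding a table of embedded-JSON values.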
1073152522 | https://github.com/dogsheep/google-takeout-to-sqlite/issues/10#issuecomment-1073152522 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/10 | IC_kwDODFE5qs4_9wIK | csusanu 9290214 | 2022-03-20T02:38:07Z | 2022-03-20T02:38:07Z | NONE | This line needs to say |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
sqlite3.OperationalError: no such table: main.my_activity 1123393829 | |
1073139067 | https://github.com/dogsheep/healthkit-to-sqlite/issues/14#issuecomment-1073139067 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/14 | IC_kwDOC8tyDs4_9s17 | lchski 343884 | 2022-03-20T00:54:18Z | 2022-03-20T00:54:18Z | NONE | Update: this appears to be because of running the command twice without clearing the DB in between. Tries to insert a Workout that already exists, causing a collision on the (auto-generated) |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
UNIQUE constraint failed: workouts.id 771608692 | |
1073123231 | https://github.com/dogsheep/healthkit-to-sqlite/issues/14#issuecomment-1073123231 | https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/14 | IC_kwDOC8tyDs4_9o-f | lchski 343884 | 2022-03-19T22:39:29Z | 2022-03-19T22:39:29Z | NONE | I have this issue, too, with a fresh export. None of my When I run the script, a
Are there maybe duplicate workouts in the data, which’d cause multiple rows to share the same |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
UNIQUE constraint failed: workouts.id 771608692 | |
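The failure mode discussed in this thread — re-running an import and colliding on an already-inserted primary key — can be reproduced and sidestepped with a SQLite upsert. The schema below is a minimal illustration, not the actual healthkit-to-sqlite schema:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE workouts (id TEXT PRIMARY KEY, calories REAL)")
conn.execute("INSERT INTO workouts VALUES (?, ?)", ("w1", 350.0))

# A second plain INSERT with the same id raises sqlite3.IntegrityError:
# "UNIQUE constraint failed: workouts.id"
try:
    conn.execute("INSERT INTO workouts VALUES (?, ?)", ("w1", 360.0))
except sqlite3.IntegrityError as e:
    print(e)

# An upsert (SQLite >= 3.24) updates the existing row instead of failing.
conn.execute(
    "INSERT INTO workouts VALUES (?, ?) "
    "ON CONFLICT(id) DO UPDATE SET calories = excluded.calories",
    ("w1", 360.0),
)
print(conn.execute("SELECT calories FROM workouts").fetchone()[0])  # 360.0
```

Clearing the database between runs, as the commenter did, is the other workaround.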
1072954795 | https://github.com/simonw/datasette/issues/1228#issuecomment-1072954795 | https://api.github.com/repos/simonw/datasette/issues/1228 | IC_kwDOBm6k_c4_8_2r | Kabouik 7107523 | 2022-03-19T06:44:40Z | 2022-03-19T06:44:40Z | NONE |
Exactly, that's highly likely even though I can't double check from this computer just now. Thanks! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
500 error caused by faceting if a column called `n` exists 810397025 | |
1068193035 | https://github.com/simonw/datasette/pull/1659#issuecomment-1068193035 | https://api.github.com/repos/simonw/datasette/issues/1659 | IC_kwDOBm6k_c4_q1UL | codecov[bot] 22429695 | 2022-03-15T16:28:25Z | 2022-03-15T17:56:09Z | NONE | Codecov Report
```diff
@@            Coverage Diff             @@
##             main    #1659      +/-   ##
==========================================
+ Coverage   92.06%   92.10%   +0.03%
==========================================
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| datasette/app.py | | |

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Tilde encoding 1169895600 | |
1068154183 | https://github.com/simonw/datasette/pull/1656#issuecomment-1068154183 | https://api.github.com/repos/simonw/datasette/issues/1656 | IC_kwDOBm6k_c4_qr1H | codecov[bot] 22429695 | 2022-03-15T15:55:34Z | 2022-03-15T15:55:34Z | NONE | Codecov Report
```diff
@@           Coverage Diff           @@
##             main    #1656   +/-   ##
=======================================
  Coverage   92.06%   92.06%
=======================================
```

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update pytest requirement from <7.1.0,>=5.2.2 to >=5.2.2,<7.2.0 1168357113 | |
1066194130 | https://github.com/simonw/datasette/issues/1384#issuecomment-1066194130 | https://api.github.com/repos/simonw/datasette/issues/1384 | IC_kwDOBm6k_c4_jNTS | khusmann 167160 | 2022-03-13T22:23:04Z | 2022-03-13T22:23:04Z | NONE | Ah, sorry, I didn't get what you were saying the first time. Using _metadata_local in that way makes total sense -- I agree, refreshing metadata on each cell was seeming quite excessive. Now I'm on the same page! :) |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin hook for dynamic metadata 930807135 | |
1066143991 | https://github.com/simonw/datasette/issues/1384#issuecomment-1066143991 | https://api.github.com/repos/simonw/datasette/issues/1384 | IC_kwDOBm6k_c4_jBD3 | khusmann 167160 | 2022-03-13T17:13:09Z | 2022-03-13T17:13:09Z | NONE | Thanks for taking the time to reply @brandonrobertz , this is really helpful info.
Ah, that's nifty! Yeah, then caching on the python side is likely a waste :) I'm new to working with sqlite so this is super good to know the many-small-queries is a common pattern
For my reference, did you include a (If you didn't test this specific situation, no worries -- I'm just trying to calibrate my intuition on this and can do my own benchmarks at some point.)
Yeah, getting metadata (and static pages as well for that matter) from internal tables definitely has my vote for including as a standard feature! Its really nice to be able to distribute a single *.db with all the metadata and static pages bundled. My metadata are sufficiently complex/domain specific that it makes sense to continue on my own plugin for now, but I'll be thinking about more general parts I can spin off as possible contributions to liveconfig (if you're open to them) or other plugins in this ecosystem. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin hook for dynamic metadata 930807135 | |
1066139147 | https://github.com/simonw/sqlite-utils/issues/408#issuecomment-1066139147 | https://api.github.com/repos/simonw/sqlite-utils/issues/408 | IC_kwDOCGYnMM4_i_4L | learning4life 24938923 | 2022-03-13T16:45:00Z | 2022-03-13T16:54:09Z | NONE | @simonw Now I get this:
Dockerfile

```
FROM centos/python-38-centos7
USER root
RUN yum update -y
RUN yum upgrade -y

# epel
RUN yum -y install epel-release && yum clean all

# SQLite
RUN yum -y install zlib-devel geos geos-devel proj proj-devel freexl freexl-devel libxml2-devel
WORKDIR /build/
COPY sqlite-autoconf-3360000.tar.gz ./
RUN tar -zxf sqlite-autoconf-3360000.tar.gz
WORKDIR /build/sqlite-autoconf-3360000
RUN ./configure
RUN make
RUN make install
RUN /opt/app-root/bin/python3.8 -m pip install --upgrade pip
RUN pip install sqlite-utils
```
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
`deterministic=True` fails on versions of SQLite prior to 3.8.3 1145882578 | |
1065951744 | https://github.com/simonw/datasette/issues/1384#issuecomment-1065951744 | https://api.github.com/repos/simonw/datasette/issues/1384 | IC_kwDOBm6k_c4_iSIA | khusmann 167160 | 2022-03-12T19:47:17Z | 2022-03-12T19:47:17Z | NONE | Awesome, thanks @brandonrobertz ! The plugin is close, but looks like it only grabs remote metadata, is that right? Instead what I'm wanting is to grab metadata embedded in the attached databases. Rather than extending that plugin, at this point I've realized I need a lot more flexibility in metadata for my data model (esp around formatting cell values and custom file exports) so rather than extending that I'll continue working on a plugin specific to my app. If I'm understanding your plugin code correctly, you query the db using the sync handle every time
I agree -- because things like That leaves your app, where it sounds like you want changes made by the user in the browser in to be immediately reflected, rather than have to wait for the next metadata refresh. In this case I wonder if you could have your app make a sync write to the datasette object so the change would have the immediate effect, but then have a separate async polling mechanism to eventually write that change out to the database for long-term persistence. Then you'd have the best of both worlds, I think? But probably not worth the trouble if your use cases are small (and/or you're not reading metadata/config from tight loops like render_cell). |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin hook for dynamic metadata 930807135 | |
1065929510 | https://github.com/simonw/datasette/issues/1384#issuecomment-1065929510 | https://api.github.com/repos/simonw/datasette/issues/1384 | IC_kwDOBm6k_c4_iMsm | khusmann 167160 | 2022-03-12T17:49:59Z | 2022-03-12T17:49:59Z | NONE | Ok, I'm taking a slightly different approach, which I think is sort of close to the in-memory _metadata table idea. I'm using a startup hook to load metadata / other info from the database, which I store in the datasette object for later:
Then, I can use this in other plugins:
For my app I don't need anything to update dynamically so it's fine to pre-populate everything on startup. It's also good to have things precached especially for a hook like render_cell, which would otherwise require a ton of redundant db queries. Makes me wonder if we could take a sort of similar caching approach with the internal _metadata table. Like have a little watchdog that could query all of the attached dbs for their _metadata tables every 5min or so, which then could be merged into the in memory _metadata table which then could be accessed sync by the plugins, or something like that. For most the use cases I can think of, live updates don't need to take into effect immediately; refreshing a cache every 5min or on some other trigger (adjustable w a config setting) would be just fine. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin hook for dynamic metadata 930807135 | |
1065334891 | https://github.com/simonw/datasette/issues/1634#issuecomment-1065334891 | https://api.github.com/repos/simonw/datasette/issues/1634 | IC_kwDOBm6k_c4_f7hr | dholth 208018 | 2022-03-11T17:38:08Z | 2022-03-11T17:38:08Z | NONE | I noticed the image was large when using fly. Is it possible to use a -slim base? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update Dockerfile generated by `datasette publish` 1131295060 | |
1062124485 | https://github.com/simonw/datasette/issues/1384#issuecomment-1062124485 | https://api.github.com/repos/simonw/datasette/issues/1384 | IC_kwDOBm6k_c4_TrvF | khusmann 167160 | 2022-03-08T19:26:32Z | 2022-03-08T19:26:32Z | NONE | Looks like I'm late to the party here, but wanted to join the convo if there's still time before this interface is solidified in v1.0. My plugin use case is for education / social science data, which is meta-data heavy in the documentation of measurement scales, instruments, collection procedures, etc. that I want to connect to columns, tables, and dbs (and render in static pages, but looks like I can do that with the jinja plugin hook). I'm still digging in and I think @brandonrobertz 's approach will work for me at least for now, but I want to bump this thread in the meantime -- are there still plans for an async metadata hook at some point in the future? (or are you considering other directions?) |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Plugin hook for dynamic metadata 930807135 | |
1059823151 | https://github.com/simonw/datasette/pull/1648#issuecomment-1059823151 | https://api.github.com/repos/simonw/datasette/issues/1648 | IC_kwDOBm6k_c4_K54v | codecov[bot] 22429695 | 2022-03-05T19:56:41Z | 2022-03-07T15:38:08Z | NONE | Codecov Report
```diff
@@            Coverage Diff             @@
##             main    #1648      +/-   ##
==========================================
+ Coverage   92.03%   92.05%   +0.02%
==========================================
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| datasette/url_builder.py | | |

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Use dash encoding for table names and row primary keys in URLs 1160432941 | |
1060021753 | https://github.com/simonw/datasette/pull/1649#issuecomment-1060021753 | https://api.github.com/repos/simonw/datasette/issues/1649 | IC_kwDOBm6k_c4_LqX5 | codecov[bot] 22429695 | 2022-03-06T19:13:09Z | 2022-03-06T19:13:09Z | NONE | Codecov Report
```diff
@@           Coverage Diff           @@
##             main    #1649   +/-   ##
=======================================
  Coverage   92.03%   92.03%
=======================================
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| datasette/utils/__init__.py | | |

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add /opt/homebrew to where spatialite extension can be found 1160677684 | |
1016456784 | https://github.com/simonw/datasette/pull/1602#issuecomment-1016456784 | https://api.github.com/repos/simonw/datasette/issues/1602 | IC_kwDOBm6k_c48leZQ | codecov[bot] 22429695 | 2022-01-19T13:17:24Z | 2022-03-06T01:30:46Z | NONE | Codecov Report
```diff
@@            Coverage Diff             @@
##             main    #1602      +/-   ##
==========================================
+ Coverage   92.03%   92.16%   +0.13%
==========================================
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| datasette/tracer.py | | |

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update pytest-timeout requirement from <2.1,>=1.4.2 to >=1.4.2,<2.2 1108084641 | |
1059863997 | https://github.com/simonw/datasette/issues/1439#issuecomment-1059863997 | https://api.github.com/repos/simonw/datasette/issues/1439 | IC_kwDOBm6k_c4_LD29 | karlcow 505230 | 2022-03-06T00:57:57Z | 2022-03-06T00:57:57Z | NONE | Probably too late… but I have just seen this because http://simonwillison.net/2022/Mar/5/dash-encoding/#atom-everything And it reminded me of comma tools at W3C. http://www.w3.org/,tools Example, the text version of W3C homepage https://www.w3.org/,text
I haven't checked all the cases in the thread. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Rethink how .ext formats (v.s. ?_format=) works before 1.0 973139047 | |
1059652834 | https://github.com/simonw/sqlite-utils/issues/412#issuecomment-1059652834 | https://api.github.com/repos/simonw/sqlite-utils/issues/412 | IC_kwDOCGYnMM4_KQTi | zaneselvans 596279 | 2022-03-05T02:14:40Z | 2022-03-05T02:14:40Z | NONE | We do a lot of |
{ "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Optional Pandas integration 1160182768 | |
1059097969 | https://github.com/simonw/sqlite-utils/issues/408#issuecomment-1059097969 | https://api.github.com/repos/simonw/sqlite-utils/issues/408 | IC_kwDOCGYnMM4_II1x | learning4life 24938923 | 2022-03-04T11:55:21Z | 2022-03-04T11:55:21Z | NONE | Thanks @simonw I will test it after my vacation 👍 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
`deterministic=True` fails on versions of SQLite prior to 3.8.3 1145882578 | |
1051473892 | https://github.com/simonw/datasette/issues/260#issuecomment-1051473892 | https://api.github.com/repos/simonw/datasette/issues/260 | IC_kwDOBm6k_c4-rDfk | zaneselvans 596279 | 2022-02-26T02:24:15Z | 2022-02-26T02:24:15Z | NONE | Is there already functionality that can be used to validate the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Validate metadata.json on startup 323223872 | |
1050123919 | https://github.com/dogsheep/twitter-to-sqlite/issues/62#issuecomment-1050123919 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62 | IC_kwDODEm0Qs4-l56P | swyxio 6764957 | 2022-02-24T18:10:18Z | 2022-02-24T18:10:18Z | NONE | gonna close this for now since i'm not actively working on it. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
KeyError: 'created_at' for private accounts? 1088816961 | |
1049775451 | https://github.com/dogsheep/twitter-to-sqlite/issues/62#issuecomment-1049775451 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62 | IC_kwDODEm0Qs4-kk1b | miuku 43036882 | 2022-02-24T11:43:31Z | 2022-02-24T11:43:31Z | NONE | i seem to have fixed this issue by applying for elevated API access |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
KeyError: 'created_at' for private accounts? 1088816961 | |
1043626870 | https://github.com/simonw/datasette/issues/327#issuecomment-1043626870 | https://api.github.com/repos/simonw/datasette/issues/327 | IC_kwDOBm6k_c4-NHt2 | dholth 208018 | 2022-02-17T23:37:24Z | 2022-02-17T23:37:24Z | NONE | On second thought any kind of quick-to-decompress-on-startup could be helpful if we're paying for the container registry and deployment bandwidth but not ephemeral storage. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Explore if SquashFS can be used to shrink size of packaged Docker containers 335200136 | |
1043609198 | https://github.com/simonw/datasette/issues/327#issuecomment-1043609198 | https://api.github.com/repos/simonw/datasette/issues/327 | IC_kwDOBm6k_c4-NDZu | dholth 208018 | 2022-02-17T23:21:36Z | 2022-02-17T23:33:01Z | NONE | On fly.io. This particular database goes from 1.4GB to 200M. Slower, part of that might be having no
```
$ datasette publish fly ... --generate-dir /tmp/deploy-this
...
$ mksquashfs large.db large.squashfs
$ rm large.db  # don't accidentally put it in the image
$ cat Dockerfile
FROM python:3.8
COPY . /app
WORKDIR /app
ENV DATASETTE_SECRET 'xyzzy'
RUN pip install -U datasette
RUN datasette inspect large.db --inspect-file inspect-data.json
ENV PORT 8080
EXPOSE 8080
CMD mount -o loop -t squashfs large.squashfs /mnt; datasette serve --host 0.0.0.0 -i /mnt/large.db --cors --port $PORT
```
It would also be possible to copy the file onto the ~6GB available on the ephemeral container filesystem on startup. A little against the spirit of the thing? On this example the whole docker image is 2.42 GB and the squashfs version is 1.14 GB. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Explore if SquashFS can be used to shrink size of packaged Docker containers 335200136 | |
1041363433 | https://github.com/simonw/sqlite-utils/issues/406#issuecomment-1041363433 | https://api.github.com/repos/simonw/sqlite-utils/issues/406 | IC_kwDOCGYnMM4-EfHp | psychemedia 82988 | 2022-02-16T10:57:03Z | 2022-02-16T10:57:19Z | NONE | Wondering if this actually relates to https://github.com/simonw/sqlite-utils/issues/402 ? I also wonder if this would be a sensible approach for eg registering |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Creating tables with custom datatypes 1128466114 | |
1041325398 | https://github.com/simonw/sqlite-utils/issues/402#issuecomment-1041325398 | https://api.github.com/repos/simonw/sqlite-utils/issues/402 | IC_kwDOCGYnMM4-EV1W | psychemedia 82988 | 2022-02-16T10:12:48Z | 2022-02-16T10:18:55Z | NONE |
Other possible pairs: unconventional date/datetime and timezone pairs eg |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Advanced class-based `conversions=` mechanism 1125297737 | |
1041313679 | https://github.com/simonw/sqlite-utils/issues/406#issuecomment-1041313679 | https://api.github.com/repos/simonw/sqlite-utils/issues/406 | IC_kwDOCGYnMM4-ES-P | psychemedia 82988 | 2022-02-16T09:59:51Z | 2022-02-16T10:00:10Z | NONE | The When creating the table, you could then error, or at least warn, if someone wasn't setting a column on a |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Creating tables with custom datatypes 1128466114 | |
1040519196 | https://github.com/simonw/sqlite-utils/pull/407#issuecomment-1040519196 | https://api.github.com/repos/simonw/sqlite-utils/issues/407 | IC_kwDOCGYnMM4-BRAc | codecov[bot] 22429695 | 2022-02-15T16:52:21Z | 2022-02-15T18:12:03Z | NONE | Codecov Report
```diff
@@ Coverage Diff @@
##   main   #407   +/-
==========================================
+ Coverage   95.91%   96.62%   +0.71%
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| sqlite_utils/cli.py | | |

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add SpatiaLite helpers to CLI 1138948786 | |
1035717429 | https://github.com/dogsheep/dogsheep-photos/pull/31#issuecomment-1035717429 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/31 | IC_kwDOD079W849u8s1 | harperreed 18504 | 2022-02-11T01:55:38Z | 2022-02-11T01:55:38Z | NONE | I would love this merged! |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update for Big Sur 771511344 | |
1034222709 | https://github.com/simonw/datasette/issues/1633#issuecomment-1034222709 | https://api.github.com/repos/simonw/datasette/issues/1633 | IC_kwDOBm6k_c49pPx1 | henrikek 6613091 | 2022-02-09T21:47:02Z | 2022-02-09T21:47:02Z | NONE | Is this the correct solution to add the base_url row to url_builder.py? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
base_url or prefix does not work with _exact match 1129052172 | |
1033772902 | https://github.com/simonw/datasette/issues/236#issuecomment-1033772902 | https://api.github.com/repos/simonw/datasette/issues/236 | IC_kwDOBm6k_c49nh9m | jordaneremieff 1376648 | 2022-02-09T13:40:52Z | 2022-02-09T13:40:52Z | NONE | Hi @simonw, I've received some inquiries over the last year or so about Datasette and how it might be supported by Mangum. I maintain Mangum which is, as far as I know, the only project that provides support for ASGI applications in AWS Lambda. If there is anything that I can help with here, please let me know because I think what Datasette provides to the community (even beyond OSS) is noble and worthy of special consideration. |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
datasette publish lambda plugin 317001500 | |
1033641009 | https://github.com/simonw/sqlite-utils/pull/203#issuecomment-1033641009 | https://api.github.com/repos/simonw/sqlite-utils/issues/203 | IC_kwDOCGYnMM49nBwx | psychemedia 82988 | 2022-02-09T11:06:18Z | 2022-02-09T11:06:18Z | NONE | Is there any progress elsewhere on the handling of compound / composite foreign keys, or is this PR still effectively open? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
changes to allow for compound foreign keys 743384829 | |
1031463789 | https://github.com/simonw/datasette/pull/1631#issuecomment-1031463789 | https://api.github.com/repos/simonw/datasette/issues/1631 | IC_kwDOBm6k_c49euNt | codecov[bot] 22429695 | 2022-02-07T13:21:48Z | 2022-02-07T13:21:48Z | NONE | Codecov Report
```diff
@@ Coverage Diff @@
##   main   #1631   +/-
=======================================
  Coverage   92.19%   92.19%
```

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update pytest-asyncio requirement from <0.17,>=0.10 to >=0.10,<0.19 1125973221 | |
1030807433 | https://github.com/simonw/sqlite-utils/issues/399#issuecomment-1030807433 | https://api.github.com/repos/simonw/sqlite-utils/issues/399 | IC_kwDOCGYnMM49cN-J | chris48s 6025893 | 2022-02-06T10:54:09Z | 2022-02-06T10:54:09Z | NONE |
The reason the ewkt/ewkb ones don't accept an SRID is that ewkt encodes the SRID in the string, so you would do this with a wkt string:
but for ewkt it would be
The specs for KML and GeoJSON specify a Coordinate Reference System for the format
GML can specify the SRID in the XML at feature level, e.g.:
There are a few more obscure formats in there, but broadly I think it is safe to assume an SRID param exists on the function for cases where the SRID is not implied by or specified in the input format. |
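To make the distinction above concrete, a minimal Python sketch (the coordinates are made up) showing how an ewkt string carries its own SRID while plain wkt needs it passed as a separate argument:

```python
# Plain WKT carries no SRID, so functions in the GeomFromText style
# take it as a separate argument, e.g. GeomFromText(wkt, srid).
wkt = "POINT(-2.2 53.5)"
srid = 4326

# EWKT embeds the SRID in the string itself, so a GeomFromEWKT-style
# function needs no extra parameter.
ewkt = f"SRID={srid};{wkt}"
print(ewkt)  # SRID=4326;POINT(-2.2 53.5)
```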
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Make it easier to insert geometries, with documentation and maybe code 1124731464 | |
1029980337 | https://github.com/simonw/datasette/pull/1629#issuecomment-1029980337 | https://api.github.com/repos/simonw/datasette/issues/1629 | IC_kwDOBm6k_c49ZECx | codecov[bot] 22429695 | 2022-02-04T13:21:09Z | 2022-02-04T13:21:09Z | NONE | Codecov Report
```diff
@@ Coverage Diff @@
##   main   #1629   +/-
=======================================
  Coverage   92.16%   92.16%
```

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update pytest requirement from <6.3.0,>=5.2.2 to >=5.2.2,<7.1.0 1124191982 | |
1029177700 | https://github.com/simonw/sqlite-utils/pull/385#issuecomment-1029177700 | https://api.github.com/repos/simonw/sqlite-utils/issues/385 | IC_kwDOCGYnMM49WAFk | codecov[bot] 22429695 | 2022-02-03T16:38:45Z | 2022-02-04T05:52:39Z | NONE | Codecov Report
```diff
@@ Coverage Diff @@
##   main   #385   +/-
==========================================
- Coverage   96.52%   95.91%   -0.62%
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| sqlite_utils/cli.py | | |

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add new spatialite helper methods 1102899312 | |
1028419517 | https://github.com/simonw/datasette/pull/1617#issuecomment-1028419517 | https://api.github.com/repos/simonw/datasette/issues/1617 | IC_kwDOBm6k_c49TG-9 | codecov[bot] 22429695 | 2022-02-02T22:30:26Z | 2022-02-03T01:36:07Z | NONE | Codecov Report
```diff
@@ Coverage Diff @@
##   main   #1617   +/-
==========================================
+ Coverage   92.09%   92.16%   +0.06%
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| datasette/app.py | | |

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Ensure template_path always uses "/" to match jinja 1120990806 | |
1028423514 | https://github.com/simonw/datasette/pull/1626#issuecomment-1028423514 | https://api.github.com/repos/simonw/datasette/issues/1626 | IC_kwDOBm6k_c49TH9a | codecov[bot] 22429695 | 2022-02-02T22:36:37Z | 2022-02-02T22:39:52Z | NONE | Codecov Report
```diff
@@ Coverage Diff @@
##   main   #1626   +/-
=======================================
  Coverage   92.16%   92.16%
```

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Try test suite against macOS and Windows 1122451096 | |
1028387529 | https://github.com/simonw/datasette/pull/1622#issuecomment-1028387529 | https://api.github.com/repos/simonw/datasette/issues/1622 | IC_kwDOBm6k_c49S_LJ | codecov[bot] 22429695 | 2022-02-02T21:45:21Z | 2022-02-02T21:45:21Z | NONE | Codecov Report
```diff
@@ Coverage Diff @@
##   main   #1622   +/-
=======================================
  Coverage   92.11%   92.11%
```

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Test against Python 3.11-dev 1122414274 | |
1028294089 | https://github.com/simonw/datasette/issues/1618#issuecomment-1028294089 | https://api.github.com/repos/simonw/datasette/issues/1618 | IC_kwDOBm6k_c49SoXJ | strada 770231 | 2022-02-02T19:42:03Z | 2022-02-02T19:42:03Z | NONE | Thanks for looking into this. It might have been nice if
```
sqlite-utils fixtures.db 'explain query plan select * from pragma_function_list(), pragma_database_list(), pragma_module_list()' -t
  id    parent    notused    detail
   4         0          0    SCAN pragma_function_list VIRTUAL TABLE INDEX 0:
   8         0          0    SCAN pragma_database_list VIRTUAL TABLE INDEX 0:
  12         0          0    SCAN pragma_module_list VIRTUAL TABLE INDEX 0:
```
```
sqlite-utils fixtures.db 'explain query plan select * from pragma_function_list() as fl, pragma_database_list() as dl, pragma_module_list() as ml' -t
  id    parent    notused    detail
   4         0          0    SCAN fl VIRTUAL TABLE INDEX 0:
   8         0          0    SCAN dl VIRTUAL TABLE INDEX 0:
  12         0          0    SCAN ml VIRTUAL TABLE INDEX 0:
```
 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Reconsider policy on blocking queries containing the string "pragma" 1121121305 | |
1025732071 | https://github.com/simonw/datasette/pull/1616#issuecomment-1025732071 | https://api.github.com/repos/simonw/datasette/issues/1616 | IC_kwDOBm6k_c49I23n | codecov[bot] 22429695 | 2022-01-31T13:20:18Z | 2022-01-31T13:20:18Z | NONE | Codecov Report
```diff
@@ Coverage Diff @@
##   main   #1616   +/-
=======================================
  Coverage   92.09%   92.09%
```

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Bump black from 21.12b0 to 22.1.0 1119413338 | |
1023997327 | https://github.com/simonw/datasette/issues/1615#issuecomment-1023997327 | https://api.github.com/repos/simonw/datasette/issues/1615 | IC_kwDOBm6k_c49CPWP | aidansteele 369053 | 2022-01-28T08:37:36Z | 2022-01-28T08:37:36Z | NONE | Oops, it feels like this should perhaps be migrated to GitHub Discussions - sorry! I don't think I have the ability to do that 😅 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Potential simplified publishing mechanism 1117132741 | |
1021264135 | https://github.com/dogsheep/dogsheep.github.io/pull/6#issuecomment-1021264135 | https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/6 | IC_kwDODMzF1s4830EH | ligurio 1151557 | 2022-01-25T14:52:40Z | 2022-01-25T14:52:40Z | NONE | @simonw, could you review? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add testres-db tool 842765105 | |
1017993482 | https://github.com/simonw/datasette/issues/1608#issuecomment-1017993482 | https://api.github.com/repos/simonw/datasette/issues/1608 | IC_kwDOBm6k_c48rVkK | astrojuanlu 316517 | 2022-01-20T22:46:16Z | 2022-01-20T22:46:16Z | NONE | Or you can use https://sphinx-version-warning.readthedocs.io/! 😄 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Documentation should clarify /stable/ vs /latest/ 1109808154 | |
1012128696 | https://github.com/simonw/datasette/pull/1593#issuecomment-1012128696 | https://api.github.com/repos/simonw/datasette/issues/1593 | IC_kwDOBm6k_c48U9u4 | codecov[bot] 22429695 | 2022-01-13T13:18:35Z | 2022-01-13T13:18:35Z | NONE | Codecov Report
```diff
@@ Coverage Diff @@
##   main   #1593   +/-
=======================================
  Coverage   92.09%   92.09%
```

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update pytest-asyncio requirement from <0.17,>=0.10 to >=0.10,<0.18 1101705012 | |
1010559681 | https://github.com/simonw/datasette/issues/1590#issuecomment-1010559681 | https://api.github.com/repos/simonw/datasette/issues/1590 | IC_kwDOBm6k_c48O-rB | eelkevdbos 1001306 | 2022-01-12T02:10:20Z | 2022-01-12T02:10:20Z | NONE | In my example, path matching happens at the application layer (being the Django channels URLRouter). That might be a somewhat exotic solution that would normally be solved by a proxy like Apache or Nginx. However, in my specific use case, this is a "feature" enabling me to do simple management of databases and metadata from within a Django admin app instance mapped in that same router. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Table+query JSON and CSV links broken when using `base_url` setting 1099723916 | |
1010556333 | https://github.com/simonw/datasette/issues/1590#issuecomment-1010556333 | https://api.github.com/repos/simonw/datasette/issues/1590 | IC_kwDOBm6k_c48O92t | eelkevdbos 1001306 | 2022-01-12T02:03:59Z | 2022-01-12T02:03:59Z | NONE | Thank you for the quick reply! Just a quick observation, I am running this locally without a proxy, whereas your fly example seems to be running behind an apache proxy (if the name is accurate). Can it be that the apache proxy strips the prefix before it passes on the request to the daphne backend? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Table+query JSON and CSV links broken when using `base_url` setting 1099723916 | |
1009531863 | https://github.com/simonw/sqlite-utils/pull/377#issuecomment-1009531863 | https://api.github.com/repos/simonw/sqlite-utils/issues/377 | IC_kwDOCGYnMM48LDvX | codecov[bot] 22429695 | 2022-01-11T02:03:00Z | 2022-01-11T02:03:00Z | NONE | Codecov Report
```diff
@@ Coverage Diff @@
##   main   #377   +/-
==========================================
- Coverage   96.52%   96.50%   -0.02%
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| sqlite_utils/cli.py | | |

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
`sqlite-utils bulk` command 1097477582 | |
1008279307 | https://github.com/simonw/datasette/pull/1574#issuecomment-1008279307 | https://api.github.com/repos/simonw/datasette/issues/1574 | IC_kwDOBm6k_c48GR8L | fs111 33631 | 2022-01-09T11:26:06Z | 2022-01-09T11:26:06Z | NONE | @fgregg my thinking was backwards compatibility. I don't know what people do to their builds, I just wanted a smaller image for my use case. @simonw any chance to take a look at this? If there is no interest, feel free to close the PR |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
introduce new option for datasette package to use a slim base image 1084193403 | |
1008158799 | https://github.com/simonw/sqlite-utils/pull/367#issuecomment-1008158799 | https://api.github.com/repos/simonw/sqlite-utils/issues/367 | IC_kwDOCGYnMM48F0hP | codecov[bot] 22429695 | 2022-01-08T21:36:55Z | 2022-01-09T02:34:44Z | NONE | Codecov Report
```diff
@@ Coverage Diff @@
##   main   #367   +/-
==========================================
- Coverage   96.44%   96.24%   -0.21%
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| sqlite_utils/db.py | | |

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Initial prototype of .analyze() methods 1097041471 | |
1006708046 | https://github.com/dogsheep/dogsheep-photos/pull/36#issuecomment-1006708046 | https://api.github.com/repos/dogsheep/dogsheep-photos/issues/36 | IC_kwDOD079W848ASVO | scoates 71983 | 2022-01-06T16:04:46Z | 2022-01-06T16:04:46Z | NONE | This one got me, today, too. 👍 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Correct naming of tool in readme 988493790 | |
1006219956 | https://github.com/simonw/sqlite-utils/pull/361#issuecomment-1006219956 | https://api.github.com/repos/simonw/sqlite-utils/issues/361 | IC_kwDOCGYnMM47-bK0 | codecov[bot] 22429695 | 2022-01-06T01:51:54Z | 2022-01-06T06:22:25Z | NONE | Codecov Report
```diff
@@ Coverage Diff @@
##   main   #361   +/-
==========================================
- Coverage   96.49%   96.44%   -0.06%
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| sqlite_utils/cli.py | | |

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
--lines and --text and --convert and --import 1094890366 | |
1003437288 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/8#issuecomment-1003437288 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/8 | IC_kwDODFE5qs47zzzo | maxhawkins 28565 | 2021-12-31T19:06:20Z | 2021-12-31T19:06:20Z | NONE |
Shouldn't be hard. The easiest way is probably to remove the |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add Gmail takeout mbox import (v2) 954546309 | |
1002735370 | https://github.com/dogsheep/google-takeout-to-sqlite/pull/8#issuecomment-1002735370 | https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/8 | IC_kwDODFE5qs47xIcK | Btibert3 203343 | 2021-12-29T18:58:23Z | 2021-12-29T18:58:23Z | NONE | @maxhawkins how hard would it be to add an entry to the table that includes the HTML version of the email, if it exists? I just attempted the PR branch on a very small mbox file, and it worked great. My use case is a research project and I need to access more than just the body plain text. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add Gmail takeout mbox import (v2) 954546309 | |
1001222213 | https://github.com/dogsheep/twitter-to-sqlite/issues/62#issuecomment-1001222213 | https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62 | IC_kwDODEm0Qs47rXBF | swyxio 6764957 | 2021-12-26T17:59:25Z | 2021-12-26T17:59:25Z | NONE | just confirmed that this error does not occur when i use my public main account. gets more interesting! |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
KeyError: 'created_at' for private accounts? 1088816961 | |
1001115286 | https://github.com/simonw/sqlite-utils/issues/228#issuecomment-1001115286 | https://api.github.com/repos/simonw/sqlite-utils/issues/228 | IC_kwDOCGYnMM47q86W | agguser 1206106 | 2021-12-26T07:01:31Z | 2021-12-26T07:01:31Z | NONE |
b 2 ``` |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
--no-headers option for CSV and TSV 807437089 | |
998999230 | https://github.com/simonw/datasette/issues/1181#issuecomment-998999230 | https://api.github.com/repos/simonw/datasette/issues/1181 | IC_kwDOBm6k_c47i4S- | rayvoelker 9308268 | 2021-12-21T18:25:15Z | 2021-12-21T18:25:15Z | NONE | I wonder if I'm encountering the same bug (or something related). I had previously been using the .csv feature to run queries and then fetch results for the pandas Datasette v0.59.4 |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Certain database names results in 404: "Database not found: None" 781262510 | |
996482595 | https://github.com/simonw/sqlite-utils/issues/358#issuecomment-996482595 | https://api.github.com/repos/simonw/sqlite-utils/issues/358 | IC_kwDOCGYnMM47ZR4j | luxint 11597658 | 2021-12-17T06:57:51Z | 2021-12-17T23:24:16Z | NONE |
I'm using them myself for the first time as well, this is a tutorial of how to use (and change) them in sqlite: https://www.sqlitetutorial.net/sqlite-check-constraint/ |
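A quick illustration of the behaviour the tutorial above covers, using only Python's stdlib `sqlite3` (the table and values here are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A CHECK constraint rejects any row whose expression evaluates false at
# insert/update time.
conn.execute("CREATE TABLE contacts (name TEXT, age INTEGER CHECK (age >= 0))")
conn.execute("INSERT INTO contacts VALUES ('alice', 30)")  # passes the check
try:
    conn.execute("INSERT INTO contacts VALUES ('bob', -1)")  # fails the check
except sqlite3.IntegrityError as exc:
    failed = str(exc)  # message mentions the failed CHECK constraint
```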
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Support for CHECK constraints 1082651698 | |
996289541 | https://github.com/simonw/datasette/pull/1559#issuecomment-996289541 | https://api.github.com/repos/simonw/datasette/issues/1559 | IC_kwDOBm6k_c47YiwF | codecov[bot] 22429695 | 2021-12-17T00:07:42Z | 2021-12-17T17:28:54Z | NONE | Codecov Report
```diff
@@ Coverage Diff @@
##   main   #1559   +/-
==========================================
+ Coverage   91.96%   92.05%   +0.09%
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| datasette/plugins.py | | |

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
filters_from_request plugin hook, now used in TableView 1082743068 | |
996716158 | https://github.com/simonw/datasette/pull/1562#issuecomment-996716158 | https://api.github.com/repos/simonw/datasette/issues/1562 | IC_kwDOBm6k_c47aK5- | codecov[bot] 22429695 | 2021-12-17T13:18:49Z | 2021-12-17T13:18:49Z | NONE | Codecov Report
```diff
@@ Coverage Diff @@
##   main   #1562   +/-
=======================================
  Coverage   91.96%   91.96%
```

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update janus requirement from <0.8,>=0.6.2 to >=0.6.2,<1.1 1083246400 | |
993876599 | https://github.com/simonw/datasette/issues/1423#issuecomment-993876599 | https://api.github.com/repos/simonw/datasette/issues/1423 | IC_kwDOBm6k_c47PVp3 | plpxsk 6165713 | 2021-12-14T18:48:09Z | 2021-12-14T18:48:09Z | NONE | Great feature. But what is the right way to enable this to show up? Currently, it seems I need to edit the URL to add, in the right place, Is there another (easier) way to enable this feature? |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Show count of facet values if ?_facet_size=max 962391325 | |
993002933 | https://github.com/simonw/datasette/pull/1554#issuecomment-993002933 | https://api.github.com/repos/simonw/datasette/issues/1554 | IC_kwDOBm6k_c47MAW1 | codecov[bot] 22429695 | 2021-12-13T23:22:58Z | 2021-12-13T23:22:58Z | NONE | Codecov Report
```diff
@@ Coverage Diff @@
##   main   #1554   +/-
=======================================
  Coverage   91.84%   91.84%
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| datasette/views/table.py | | |

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
TableView refactor 1079129258 | |
982123183 | https://github.com/simonw/sqlite-utils/pull/347#issuecomment-982123183 | https://api.github.com/repos/simonw/sqlite-utils/issues/347 | IC_kwDOCGYnMM46igKv | codecov[bot] 22429695 | 2021-11-29T23:20:35Z | 2021-12-11T01:02:19Z | NONE | Codecov Report
```diff
@@ Coverage Diff @@
##   main   #347   +/-
=======================================
  Coverage   96.51%   96.52%
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| sqlite_utils/cli.py | | |

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Test against pysqlite3 running SQLite 3.37 1066603133 | |
990967417 | https://github.com/simonw/datasette/pull/1548#issuecomment-990967417 | https://api.github.com/repos/simonw/datasette/issues/1548 | IC_kwDOBm6k_c47EPZ5 | codecov[bot] 22429695 | 2021-12-10T13:19:00Z | 2021-12-10T13:19:00Z | NONE | Codecov Report
```diff
@@ Coverage Diff @@
##   main   #1548   +/-
=======================================
  Coverage   91.84%   91.84%
```

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update pytest-xdist requirement from <2.5,>=2.2.1 to >=2.2.1,<2.6 1076834768 | |
988468238 | https://github.com/simonw/datasette/issues/1528#issuecomment-988468238 | https://api.github.com/repos/simonw/datasette/issues/1528 | IC_kwDOBm6k_c466tQO | 20after4 30934 | 2021-12-08T03:35:45Z | 2021-12-08T03:35:45Z | NONE | FWIW I implemented something similar with a bit of plugin code:

```python
from pathlib import Path
from typing import Mapping

from datasette import hookimpl
from datasette.app import Datasette


@hookimpl
def canned_queries(datasette: Datasette, database: str) -> Mapping[str, str]:
    # load "canned queries" from the filesystem under
    # www/sql/db/query_name.sql
    queries = {}
    # sketch continuation (the original snippet was truncated here):
    # read each .sql file into the mapping, keyed by its filename stem
    sql_dir = Path("www/sql") / database
    if sql_dir.is_dir():
        for sql_file in sql_dir.glob("*.sql"):
            queries[sql_file.stem] = sql_file.read_text()
    return queries
```
 |
{ "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0 } |
Add new `"sql_file"` key to Canned Queries in metadata? 1060631257 | |
988463455 | https://github.com/simonw/datasette/issues/1304#issuecomment-988463455 | https://api.github.com/repos/simonw/datasette/issues/1304 | IC_kwDOBm6k_c466sFf | 20after4 30934 | 2021-12-08T03:23:14Z | 2021-12-08T03:23:14Z | NONE | I actually think it would be a useful thing to add support for in datasette. It wouldn't be difficult to unwind an array of params and add the placeholders automatically. |
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Document how to send multiple values for "Named parameters" 863884805 | |
988461884 | https://github.com/simonw/datasette/issues/1304#issuecomment-988461884 | https://api.github.com/repos/simonw/datasette/issues/1304 | IC_kwDOBm6k_c466rs8 | 20after4 30934 | 2021-12-08T03:20:26Z | 2021-12-08T03:20:26Z | NONE | The easiest or most straightforward thing to do is to use named parameters like:
And simply construct the list of placeholders dynamically based on the number of values. Doing this is possible with datasette if you forgo "canned queries" and just use the raw query endpoint and pass the query sql, along with p1, p2 ... in the request. |
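A minimal sketch of the placeholder-construction approach described above, with hypothetical table and parameter names, building `:p0, :p1, …` named placeholders from a Python list:

```python
import sqlite3


def select_in(conn, values):
    # One named placeholder per value, bound via a dict of parameters
    params = {f"p{i}": v for i, v in enumerate(values)}
    placeholders = ", ".join(f":{name}" for name in params)
    sql = f"SELECT id FROM items WHERE id IN ({placeholders})"
    return conn.execute(sql, params).fetchall()


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (id INTEGER)")
conn.executemany("INSERT INTO items VALUES (?)", [(1,), (2,), (3,), (4,)])
rows = select_in(conn, [2, 4])
```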
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Document how to send multiple values for "Named parameters" 863884805 | |
988459453 | https://github.com/simonw/datasette/issues/1304#issuecomment-988459453 | https://api.github.com/repos/simonw/datasette/issues/1304 | IC_kwDOBm6k_c466rG9 | rayvoelker 9308268 | 2021-12-08T03:15:27Z | 2021-12-08T03:15:27Z | NONE | I was thinking if there were a way to use some sort of string function to "unpack" the values and convert them into ints... hm |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Document how to send multiple values for "Named parameters" 863884805 | |
986768401 | https://github.com/simonw/datasette/pull/1543#issuecomment-986768401 | https://api.github.com/repos/simonw/datasette/issues/1543 | IC_kwDOBm6k_c460OQR | codecov[bot] 22429695 | 2021-12-06T13:18:48Z | 2021-12-06T13:18:48Z | NONE | Codecov Report
```diff
@@           Coverage Diff           @@
##             main    #1543   +/-   ##
=======================================
  Coverage   91.84%   91.84%
=======================================
```
Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Bump black from 21.11b1 to 21.12b0 1072135269 | |
985982668 | https://github.com/simonw/datasette/issues/1426#issuecomment-985982668 | https://api.github.com/repos/simonw/datasette/issues/1426 | IC_kwDOBm6k_c46xObM | knowledgecamp12 95520595 | 2021-12-04T07:11:29Z | 2021-12-04T07:11:29Z | NONE | You can generate an XML sitemap using online tools such as https://tools4seo.site/xml-sitemap-generator. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Manage /robots.txt in Datasette core, block robots by default 964322136 | |
984569477 | https://github.com/simonw/datasette/issues/1175#issuecomment-984569477 | https://api.github.com/repos/simonw/datasette/issues/1175 | IC_kwDOBm6k_c46r1aF | AnkitKundariya 24821294 | 2021-12-02T12:09:30Z | 2021-12-02T12:09:30Z | NONE | @hannseman I have tried the suggestion you gave above but somehow I'm getting the error below. Note: I'm running my application with Docker.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Use structlog for logging 779156520 | |
983890815 | https://github.com/simonw/datasette/issues/1519#issuecomment-983890815 | https://api.github.com/repos/simonw/datasette/issues/1519 | IC_kwDOBm6k_c46pPt_ | phubbard 157158 | 2021-12-01T17:50:09Z | 2021-12-01T17:50:09Z | NONE | thanks so very much for the prompt attention and fix! Plus, the animated GIF showing the bug is just extra and I love it. Interactions like this are why I love open source. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
base_url is omitted in JSON and CSV views 1058790545 | |
982745406 | https://github.com/simonw/datasette/issues/1532#issuecomment-982745406 | https://api.github.com/repos/simonw/datasette/issues/1532 | IC_kwDOBm6k_c46k4E- | 20after4 30934 | 2021-11-30T15:28:57Z | 2021-11-30T15:28:57Z | NONE | It's a really great API and the documentation is really great too. Honestly, in more than 20 years of professional experience, I haven't worked with any software API that was more of a joy to use. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Use datasette-table Web Component to guide the design of the JSON API for 1.0 1065429936 | |
977870699 | https://github.com/simonw/datasette/pull/1529#issuecomment-977870699 | https://api.github.com/repos/simonw/datasette/issues/1529 | IC_kwDOBm6k_c46SR9r | codecov[bot] 22429695 | 2021-11-24T13:18:52Z | 2021-11-30T02:30:46Z | NONE | Codecov Report
```diff
@@            Coverage Diff             @@
##             main    #1529      +/-   ##
==========================================
- Coverage   91.90%   91.83%   -0.07%
==========================================
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| datasette/views/index.py | | |

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update janus requirement from <0.7,>=0.6.2 to >=0.6.2,<0.8 1062414013 | |
981980048 | https://github.com/simonw/datasette/issues/1304#issuecomment-981980048 | https://api.github.com/repos/simonw/datasette/issues/1304 | IC_kwDOBm6k_c46h9OQ | 20after4 30934 | 2021-11-29T20:13:53Z | 2021-11-29T20:14:11Z | NONE | There isn't any way to do this with SQLite as far as I know. The only option is to insert the right number of ? placeholders into the SQL template and then provide an array of values. |
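A minimal illustration of that approach with plain sqlite3 (the table `t` and the values are made up for the example):

```python
import sqlite3

values = [2, 4]
# Insert the right number of ? placeholders into the SQL template...
sql = "select id from t where id in ({})".format(", ".join("?" * len(values)))
# ...then provide an array of values when executing it.
conn = sqlite3.connect(":memory:")
conn.executescript(
    "create table t (id integer); insert into t values (1),(2),(3),(4),(5);"
)
rows = conn.execute(sql, values).fetchall()
# rows == [(2,), (4,)]
```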
{ "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Document how to send multiple values for "Named parameters" 863884805 | |
981966693 | https://github.com/simonw/datasette/issues/1532#issuecomment-981966693 | https://api.github.com/repos/simonw/datasette/issues/1532 | IC_kwDOBm6k_c46h59l | 20after4 30934 | 2021-11-29T19:56:52Z | 2021-11-29T19:56:52Z | NONE | FWIW I've written some web components that consume the json api and I think it's a really nice way to work with datasette. I like the combination with datasette+sqlite as a back-end feeding data to a front-end that's entirely javascript + html. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Use datasette-table Web Component to guide the design of the JSON API for 1.0 1065429936 | |
981631026 | https://github.com/simonw/datasette/pull/1537#issuecomment-981631026 | https://api.github.com/repos/simonw/datasette/issues/1537 | IC_kwDOBm6k_c46goAy | codecov[bot] 22429695 | 2021-11-29T13:23:20Z | 2021-11-29T13:23:20Z | NONE | Codecov Report
```diff
@@            Coverage Diff             @@
##             main    #1537      +/-   ##
==========================================
- Coverage   91.90%   91.83%   -0.07%
==========================================
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| datasette/views/index.py | | |

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update aiofiles requirement from <0.8,>=0.4 to >=0.4,<0.9 1066023866 | |
979345527 | https://github.com/simonw/sqlite-utils/pull/333#issuecomment-979345527 | https://api.github.com/repos/simonw/sqlite-utils/issues/333 | IC_kwDOCGYnMM46X6B3 | Florents-Tselai 2118708 | 2021-11-25T16:31:47Z | 2021-11-25T16:31:47Z | NONE | Thanks for your reply @simonw. Tbh, my first attempt was actually the

I don't think plugins make much sense either. Probably defeats the purpose of simplicity: a simple database along with a pip-able package. |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Add functionality to read Parquet files. 1039037439 | |
976023405 | https://github.com/simonw/datasette/issues/1522#issuecomment-976023405 | https://api.github.com/repos/simonw/datasette/issues/1522 | IC_kwDOBm6k_c46LO9t | steren 360895 | 2021-11-23T00:08:07Z | 2021-11-23T00:08:07Z | NONE | If you suspect that Cloud Run throttled CPU could be the cause, you can request to have CPU always allocated with It could also be the Cloud Run sandbox that somehow gets in the way here, in which case I recommend testing with the second generation execution environment: |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Deploy a live instance of demos/apache-proxy 1058896236 | |
974711959 | https://github.com/simonw/datasette/issues/1426#issuecomment-974711959 | https://api.github.com/repos/simonw/datasette/issues/1426 | IC_kwDOBm6k_c46GOyX | tannewt 52649 | 2021-11-20T21:11:51Z | 2021-11-20T21:11:51Z | NONE | I think another thing would be to make |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Manage /robots.txt in Datasette core, block robots by default 964322136 | |
974607456 | https://github.com/simonw/datasette/issues/1522#issuecomment-974607456 | https://api.github.com/repos/simonw/datasette/issues/1522 | IC_kwDOBm6k_c46F1Rg | mrchrisadams 17906 | 2021-11-20T07:10:11Z | 2021-11-20T07:10:11Z | NONE | As a sanity check, would it be worth trying to push the multi-process container to another provider of a knative / cloud run / tekton stack? I have a somewhat similar use case for a future project, so I've been very grateful to you for sharing all the progress in this issue. As I understand it, Scaleway also has a very similar offering built from what appear to be many of the same components, which might at least show whether this is an issue affecting more than one knative-based FaaS provider: https://www.scaleway.com/en/serverless-containers/ https://developers.scaleway.com/en/products/containers/api/#main-features |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Deploy a live instance of demos/apache-proxy 1058896236 | |
972858458 | https://github.com/simonw/datasette/pull/1516#issuecomment-972858458 | https://api.github.com/repos/simonw/datasette/issues/1516 | IC_kwDOBm6k_c45_KRa | codecov[bot] 22429695 | 2021-11-18T13:19:01Z | 2021-11-18T13:19:01Z | NONE | Codecov Report
```diff
@@           Coverage Diff           @@
##             main    #1516   +/-   ##
=======================================
  Coverage   91.82%   91.82%
=======================================
```
Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Bump black from 21.9b0 to 21.11b1 1057340779 | |
971575746 | https://github.com/simonw/datasette/pull/1514#issuecomment-971575746 | https://api.github.com/repos/simonw/datasette/issues/1514 | IC_kwDOBm6k_c456RHC | codecov[bot] 22429695 | 2021-11-17T13:18:58Z | 2021-11-17T13:18:58Z | NONE | Codecov Report
```diff
@@           Coverage Diff           @@
##             main    #1514   +/-   ##
=======================================
  Coverage   91.82%   91.82%
=======================================
```
Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Bump black from 21.9b0 to 21.11b0 1056117435 | |
970718652 | https://github.com/simonw/datasette/pull/1512#issuecomment-970718652 | https://api.github.com/repos/simonw/datasette/issues/1512 | IC_kwDOBm6k_c452_28 | codecov[bot] 22429695 | 2021-11-16T22:02:59Z | 2021-11-16T23:51:48Z | NONE | Codecov Report
```diff
@@            Coverage Diff             @@
##             main    #1512      +/-   ##
==========================================
- Coverage   91.82%   89.72%   -2.11%
==========================================
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| datasette/utils/vendored_graphlib.py | | |

Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
New pattern for async view classes 1055402144 | |
970188065 | https://github.com/simonw/datasette/issues/1505#issuecomment-970188065 | https://api.github.com/repos/simonw/datasette/issues/1505 | IC_kwDOBm6k_c450-Uh | Segerberg 7094907 | 2021-11-16T11:40:52Z | 2021-11-16T11:40:52Z | NONE | A suggestion is to have the option to choose an arbitrary delimiter (and quoting characters). |
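For reference, Python's stdlib csv module already supports exactly this, which is roughly what such an option would pass through (the sample rows here are invented):

```python
import csv
import io

rows = [["id", "name"], [1, "semi;colon"]]
buf = io.StringIO()
# Both the delimiter and the quote character are configurable;
# QUOTE_MINIMAL quotes only fields that contain the delimiter.
writer = csv.writer(buf, delimiter=";", quotechar='"', quoting=csv.QUOTE_MINIMAL)
writer.writerows(rows)
output = buf.getvalue()
# output == 'id;name\r\n1;"semi;colon"\r\n'
```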
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Datasette should have an option to output CSV with semicolons 1052247023 | |
968904414 | https://github.com/simonw/datasette/pull/1508#issuecomment-968904414 | https://api.github.com/repos/simonw/datasette/issues/1508 | IC_kwDOBm6k_c45wE7e | codecov[bot] 22429695 | 2021-11-15T13:20:49Z | 2021-11-15T13:20:49Z | NONE | Codecov Report
```diff
@@           Coverage Diff           @@
##             main    #1508   +/-   ##
=======================================
  Coverage   91.82%   91.82%
=======================================
```
Continue to review full report at Codecov.
|
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Update docutils requirement from <0.18 to <0.19 1053655062 | |
967801997 | https://github.com/simonw/datasette/issues/1380#issuecomment-967801997 | https://api.github.com/repos/simonw/datasette/issues/1380 | IC_kwDOBm6k_c45r3yN | Segerberg 7094907 | 2021-11-13T08:05:37Z | 2021-11-13T08:09:11Z | NONE | @glasnt yeah I guess that could be an option. I run datasette on large databases (>75 GB) and the startup time is a bit slow for me even with the -i --inspect-file options. Here's a quick sketch of a plugin that will reload databases in a folder you configure for the plugin in metadata.json. If you request /-reload-db, new databases will be added. (You probably want to implement some authentication for this =) ) https://gist.github.com/Segerberg/b96a0e0a5389dce2396497323cda7042 |
{ "total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Serve all db files in a folder 924748955 | |
967181828 | https://github.com/simonw/datasette/issues/1380#issuecomment-967181828 | https://api.github.com/repos/simonw/datasette/issues/1380 | IC_kwDOBm6k_c45pgYE | Segerberg 7094907 | 2021-11-12T15:00:18Z | 2021-11-12T20:02:29Z | NONE | There is no such option; see https://github.com/simonw/datasette/issues/43. But you could write a plugin using datasette.add_database(db, name=None): https://docs.datasette.io/en/stable/internals.html#add-database-db-name-none |
{ "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0 } |
Serve all db files in a folder 924748955 |
CREATE TABLE [issue_comments] ( [html_url] TEXT, [issue_url] TEXT, [id] INTEGER PRIMARY KEY, [node_id] TEXT, [user] INTEGER REFERENCES [users]([id]), [created_at] TEXT, [updated_at] TEXT, [author_association] TEXT, [body] TEXT, [reactions] TEXT, [issue] INTEGER REFERENCES [issues]([id]) , [performed_via_github_app] TEXT); CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]); CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);