html_url,issue_url,id,node_id,user,user_label,created_at,updated_at,author_association,body,reactions,issue,issue_label,performed_via_github_app
https://github.com/dogsheep/twitter-to-sqlite/issues/56#issuecomment-769973212,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56,769973212,MDEyOklzc3VlQ29tbWVudDc2OTk3MzIxMg==,42315895,gsajko,2021-01-29T18:29:02Z,2021-01-29T18:31:55Z,NONE,"I think it was with `twitter-to-sqlite home-timeline home.db -a auth.json --since` and I'm using only this command to grab tweets from cron tab `2,7,12,17,22,27,32,37,42,47,52,57 * * * * run-one /home/gsajko/miniconda3/bin/twitter-to-sqlite home-timeline /home/gsajko/work/custom_twitter_feed/home.db -a /home/gsajko/work/custom_twitter_feed/auth/auth.json --since` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",796736607,Not all quoted statuses get fetched?,
https://github.com/dogsheep/twitter-to-sqlite/issues/56#issuecomment-772408273,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/56,772408273,MDEyOklzc3VlQ29tbWVudDc3MjQwODI3Mw==,42315895,gsajko,2021-02-03T10:36:36Z,2021-02-03T10:36:36Z,NONE,"I figured it out. Those tweets are in the database because somebody quote tweeted them, or retweeted them. And if you grab a quoted tweet or retweeted tweet from another tweet's json, it doesn't grab all of the details. So if someone quote tweeted a quote tweet, the second quote tweet won't have `quoted_status`. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",796736607,Not all quoted statuses get fetched?,
https://github.com/dogsheep/twitter-to-sqlite/issues/57#issuecomment-860063190,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/57,860063190,MDEyOklzc3VlQ29tbWVudDg2MDA2MzE5MA==,232237,stiles,2021-06-12T14:46:44Z,2021-06-12T14:46:44Z,NONE,I'm having the same issue (same versions of python and twitter-to-sqlite). It's the `user-timeline` command. Other commands are working. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",907645813,"Error: Use either --since or --since_id, not both",
https://github.com/dogsheep/twitter-to-sqlite/issues/60#issuecomment-1279249898,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/60,1279249898,IC_kwDODEm0Qs5MP83q,7908073,chapmanjacobd,2022-10-14T16:58:26Z,2022-10-14T16:58:26Z,NONE,"You could try using `msys2`. I've had better luck running python CLIs within that system on Windows. Here is a guide: https://github.com/chapmanjacobd/lb/blob/main/Windows.md#prep","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1063982712,Execution on Windows,
https://github.com/dogsheep/twitter-to-sqlite/issues/61#issuecomment-1297201971,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/61,1297201971,IC_kwDODEm0Qs5NUbsz,3153638,Profpatsch,2022-10-31T14:47:58Z,2022-10-31T14:47:58Z,NONE,"There’s also a limit of 3200 tweets. 
I wonder if that can be circumvented somehow.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1077560091,"Data Pull fails for ""Essential"" level access to the Twitter API (for Documentation)",
https://github.com/dogsheep/twitter-to-sqlite/issues/62#issuecomment-1001222213,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62,1001222213,IC_kwDODEm0Qs47rXBF,6764957,swyxio,2021-12-26T17:59:25Z,2021-12-26T17:59:25Z,NONE,just confirmed that this error does not occur when i use my public main account. gets more interesting!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1088816961,KeyError: 'created_at' for private accounts?,
https://github.com/dogsheep/twitter-to-sqlite/issues/62#issuecomment-1049775451,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62,1049775451,IC_kwDODEm0Qs4-kk1b,43036882,miuku,2022-02-24T11:43:31Z,2022-02-24T11:43:31Z,NONE,i seem to have fixed this issue by applying for [elevated API access](https://developer.twitter.com/en/portal/products/elevated),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1088816961,KeyError: 'created_at' for private accounts?,
https://github.com/dogsheep/twitter-to-sqlite/issues/62#issuecomment-1050123919,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/62,1050123919,IC_kwDODEm0Qs4-l56P,6764957,swyxio,2022-02-24T18:10:18Z,2022-02-24T18:10:18Z,NONE,gonna close this for now since i'm not actively working on it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1088816961,KeyError: 'created_at' for private accounts?,
https://github.com/simonw/datasette/issues/100#issuecomment-344864254,https://api.github.com/repos/simonw/datasette/issues/100,344864254,MDEyOklzc3VlQ29tbWVudDM0NDg2NDI1NA==,13304454,coisnepe,2017-11-16T09:25:10Z,2017-11-16T09:25:10Z,NONE,@simonw I see. I upgraded sanic-jinja2 and jinja2: it now works flawlessly. Thank you!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274160723,TemplateAssertionError: no filter named 'tojson',
https://github.com/simonw/datasette/issues/1003#issuecomment-706302863,https://api.github.com/repos/simonw/datasette/issues/1003,706302863,MDEyOklzc3VlQ29tbWVudDcwNjMwMjg2Mw==,649467,mhalle,2020-10-09T17:17:06Z,2020-10-09T17:17:06Z,NONE,"I agree on the descriptive and python-consistent naming. 
There is already a tojson, but frankly i find the ""to"" and ""from"" confusing in a text templating language where what's a string and what's data isn't 100% transparent.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",718238967,from_json jinja2 filter,
https://github.com/simonw/datasette/issues/101#issuecomment-344597274,https://api.github.com/repos/simonw/datasette/issues/101,344597274,MDEyOklzc3VlQ29tbWVudDM0NDU5NzI3NA==,450244,eaubin,2017-11-15T13:48:55Z,2017-11-15T13:48:55Z,NONE,This is a duplicate of https://github.com/simonw/datasette/issues/100,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274161964,TemplateAssertionError: no filter named 'tojson',
https://github.com/simonw/datasette/issues/1032#issuecomment-712397537,https://api.github.com/repos/simonw/datasette/issues/1032,712397537,MDEyOklzc3VlQ29tbWVudDcxMjM5NzUzNw==,236498,saulpw,2020-10-19T19:37:55Z,2020-10-19T19:37:55Z,NONE,"python-dateutil is awesome, but it can only guess at one date at a time. So if you have a column of dates that are (presumably) in the same format, it can't use the full set of dates to deduce the format. Also, once it has parsed a date, you can't get the format it used, whether to parse or render other dates. These limitations prevent it from being a silver bullet for date parsing, though they're not enough for me to stop using it!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724878151,Bring date parsing into Datasette core,
https://github.com/simonw/datasette/issues/1036#issuecomment-762385981,https://api.github.com/repos/simonw/datasette/issues/1036,762385981,MDEyOklzc3VlQ29tbWVudDc2MjM4NTk4MQ==,4997607,philshem,2021-01-18T17:32:13Z,2021-01-18T17:34:50Z,NONE,"Hi Simon Just finding this old issue regarding downloading blobs. Nice work! As a feature request, maybe it would be possible to assign a blob column as a certain data type (e.g. `.jpg`) and then each blob could be downloaded as that type of file (perhaps if the file types were constrained to normal blobs that people store in sqlite databases, this could avoid the execution stuff mentioned above). I guess the column blob-type definition could fit into this dropdown selection: Let me know if I should open a new issue with a feature request. (This could slowly go in the direction of displaying image blob-types in the browser.) Thanks for the great tool! --- edit: just reading the rest of the twitter thread: https://twitter.com/simonw/status/1318685933256855552 perhaps this is already possible in some form with the plugin datasette-media: https://github.com/simonw/datasette-media","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI,
https://github.com/simonw/datasette/issues/1036#issuecomment-762391426,https://api.github.com/repos/simonw/datasette/issues/1036,762391426,MDEyOklzc3VlQ29tbWVudDc2MjM5MTQyNg==,4997607,philshem,2021-01-18T17:45:00Z,2021-01-18T17:45:00Z,NONE,"It might be possible with this library: https://docs.python.org/3/library/imghdr.html quick test of the downloaded blob: ``` >>> import imghdr >>> imghdr.what('material_culture-1-image.blob') 'jpeg' ``` The output plugin would be cool. 
I'll look into making my first datasette plugin. I'm also imagining displaying the image in the browser -- but that would be a step 2. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725996507,Make it possible to download BLOB data from the Datasette UI,
https://github.com/simonw/datasette/issues/1050#issuecomment-718317997,https://api.github.com/repos/simonw/datasette/issues/1050,718317997,MDEyOklzc3VlQ29tbWVudDcxODMxNzk5Nw==,283343,thadk,2020-10-29T02:24:50Z,2020-10-29T02:29:24Z,NONE,"Unsolicited feedback for an unreleased feature of the [current](https://github.com/simonw/datasette/commit/5e0b72247ecab4ce0fcec599b77a83d73a480872) unreleased GitHub version (I casually wanted to access a blob row) – the existing #1036 route doesn't support special characters in database or table names (e.g. `@()` ). Maybe this is motivation for your new idea here. Also I got this error/crash with my blob and wasn't able to get the file: https://gist.github.com/thadk/28ac32af0e88747ce9056c90b0b19d34","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729057388,Switch to .blob render extension for BLOB downloads,
https://github.com/simonw/datasette/issues/1082#issuecomment-721547177,https://api.github.com/repos/simonw/datasette/issues/1082,721547177,MDEyOklzc3VlQ29tbWVudDcyMTU0NzE3Nw==,39538958,justmars,2020-11-04T06:52:30Z,2020-11-04T06:53:16Z,NONE,"I think I tried the same db size on the following scenarios in Digital Ocean: 1. Basic ($5/month) with 512MB RAM 2. Basic ($10/month) with 1GB RAM 3. Pro ($12/month) with 1GB RAM All such attempts conked out with ""out of memory"" errors","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",735852274,DigitalOcean buildpack memory errors for large sqlite db?,
https://github.com/simonw/datasette/issues/1091#issuecomment-726798745,https://api.github.com/repos/simonw/datasette/issues/1091,726798745,MDEyOklzc3VlQ29tbWVudDcyNjc5ODc0NQ==,6739646,tballison,2020-11-13T14:35:22Z,2020-11-13T14:35:22Z,NONE,"I'm starting this with docker like so: `docker run --name datasette -d -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/file_profiles.db --config sql_time_limit_ms:120000 --config max_returned_rows:100000 --config base_url:/datasette/ --config cache_size_kb:50000` I'm not doing any templating or anything else custom. Apropos of nothing, I swapped out a simpler db, so this query should now work: https://corpora.tika.apache.org/datasette/file_profiles?sql=select%0D%0A++*%0D%0Afrom%0D%0A++file_profiles+fp%0D%0Alimit%0D%0A++10","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url,
https://github.com/simonw/datasette/issues/1091#issuecomment-726801731,https://api.github.com/repos/simonw/datasette/issues/1091,726801731,MDEyOklzc3VlQ29tbWVudDcyNjgwMTczMQ==,6739646,tballison,2020-11-13T14:40:56Z,2020-11-13T14:40:56Z,NONE,"My headers aren't clickable/sortable with custom sql, but I think that's by design. In the default view, https://corpora.tika.apache.org/datasette/file_profiles/file_profiles, ah, y, now I see that the headers should be sortable, but you're right the base_url is not applied. 
base_url works with ""View and Edit SQL"" and with ""(advanced)"". As you point out, it does not work with the csv, json, and other export links, or with the ""Next page"" navigational button at the bottom.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url,
https://github.com/simonw/datasette/issues/1091#issuecomment-729018386,https://api.github.com/repos/simonw/datasette/issues/1091,729018386,MDEyOklzc3VlQ29tbWVudDcyOTAxODM4Ng==,6739646,tballison,2020-11-17T15:48:58Z,2020-11-17T15:48:58Z,NONE,"I don't think we are, but I'll check with Maruan. I think this is the relevant part of our config? ``` Alias ""/base/"" ""/usr/share/corpora/"" Options +Indexes -Multiviews AllowOverride None ProxyPreserveHost On ProxyPass /datasette http://0.0.0.0:8001 ProxyPassReverse /datasette http://0.0.0.0:8001 ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url,
https://github.com/simonw/datasette/issues/1091#issuecomment-729045320,https://api.github.com/repos/simonw/datasette/issues/1091,729045320,MDEyOklzc3VlQ29tbWVudDcyOTA0NTMyMA==,6739646,tballison,2020-11-17T16:31:00Z,2020-11-17T16:31:00Z,NONE,We're using mod_proxy.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url,
https://github.com/simonw/datasette/issues/1091#issuecomment-741804334,https://api.github.com/repos/simonw/datasette/issues/1091,741804334,MDEyOklzc3VlQ29tbWVudDc0MTgwNDMzNA==,6739646,tballison,2020-12-09T14:26:05Z,2020-12-09T14:26:05Z,NONE,"Anything we can do to help debug this? Thank you, again!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url,
https://github.com/simonw/datasette/issues/1091#issuecomment-742001510,https://api.github.com/repos/simonw/datasette/issues/1091,742001510,MDEyOklzc3VlQ29tbWVudDc0MjAwMTUxMA==,6739646,tballison,2020-12-09T19:36:42Z,2020-12-09T19:38:04Z,NONE,"I don't think this fixes it: ``` grep -R datasette . ./sites-available/000-default.conf: ProxyPass /datasette http://127.0.0.1:8001/ ./sites-available/000-default.conf: #ProxyPassReverse /datasette http://127.0.0.1:8001/ ./sites-available/corpora-le-ssl.conf: ProxyPass /datasette http://0.0.0.0:8001 ./sites-available/corpora-le-ssl.conf: #ProxyPassReverse /datasette http://0.0.0.0:8001 ./sites-enabled/corpora-le-ssl.conf: ProxyPass /datasette http://0.0.0.0:8001 ./sites-enabled/corpora-le-ssl.conf: #ProxyPassReverse /datasette http://0.0.0.0:8001 ``` And I confirmed that I actually restarted the server. :rofl: https://corpora.tika.apache.org/datasette/file_profiles","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url,
https://github.com/simonw/datasette/issues/1091#issuecomment-742010306,https://api.github.com/repos/simonw/datasette/issues/1091,742010306,MDEyOklzc3VlQ29tbWVudDc0MjAxMDMwNg==,6739646,tballison,2020-12-09T19:53:18Z,2020-12-09T19:59:52Z,NONE,"I can't imagine this helps (esp. 
given your point about potential rewrites), but you can see that /datasette/ was correctly added to the sql form, but not to the ""export-links"" ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url,
https://github.com/simonw/datasette/issues/1091#issuecomment-756425587,https://api.github.com/repos/simonw/datasette/issues/1091,756425587,MDEyOklzc3VlQ29tbWVudDc1NjQyNTU4Nw==,19328961,henry501,2021-01-07T22:27:19Z,2021-01-07T22:27:19Z,NONE,"I found this issue while troubleshooting the same behavior with an nginx reverse proxy. The solution was to make sure I set: `proxy_pass http://server:8001/baseurl/ ` instead of just: `proxy_pass http://server:8001 ` The custom SQL query and header links are now correct. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url,
https://github.com/simonw/datasette/issues/1091#issuecomment-758280611,https://api.github.com/repos/simonw/datasette/issues/1091,758280611,MDEyOklzc3VlQ29tbWVudDc1ODI4MDYxMQ==,6739646,tballison,2021-01-11T23:06:10Z,2021-01-11T23:06:10Z,NONE,"+1 Yep! Fixes it. If I navigate to https://corpora.tika.apache.org/datasette, I get a 404 (database not found: datasette), but if I navigate to https://corpora.tika.apache.org/datasette/file_profiles/, everything WORKS! Thank you!","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 1, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url,
https://github.com/simonw/datasette/issues/1091#issuecomment-758448525,https://api.github.com/repos/simonw/datasette/issues/1091,758448525,MDEyOklzc3VlQ29tbWVudDc1ODQ0ODUyNQ==,19328961,henry501,2021-01-12T06:55:08Z,2021-01-12T06:55:08Z,NONE,"Great, really happy I could help! Reverse proxies get tricky.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url,
https://github.com/simonw/datasette/issues/1091#issuecomment-758668359,https://api.github.com/repos/simonw/datasette/issues/1091,758668359,MDEyOklzc3VlQ29tbWVudDc1ODY2ODM1OQ==,6739646,tballison,2021-01-12T13:52:29Z,2021-01-12T13:52:29Z,NONE,"Y, thank you to both of you!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",742011049,.json and .csv exports fail to apply base_url,
https://github.com/simonw/datasette/issues/1094#issuecomment-731260091,https://api.github.com/repos/simonw/datasette/issues/1094,731260091,MDEyOklzc3VlQ29tbWVudDczMTI2MDA5MQ==,4808085,bapowell,2020-11-20T16:11:29Z,2020-11-20T16:11:29Z,NONE,"I can confirm this issue, running version 0.51.1 under Windows. 
Fixed by commenting out the following line near the top of datasette\utils\asgi.py : `#from os import EX_CANTCREAT` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743011397,import EX_CANTCREAT means datasette fails to work on Windows,
https://github.com/simonw/datasette/issues/1116#issuecomment-736005833,https://api.github.com/repos/simonw/datasette/issues/1116,736005833,MDEyOklzc3VlQ29tbWVudDczNjAwNTgzMw==,2789593,nattaylor,2020-11-30T19:54:39Z,2020-11-30T19:54:39Z,NONE,"@simonw thanks for investigating so quickly. If it is undesirable to change that hidden behavior, maybe something like this is a suitable workaround: ``` SELECT * FROM pragma_table_xinfo('deeds') where hidden in (0,2); 0|body|TEXT|0||0|0 1|id|INT GENERATED ALWAYS|0||0|2 2|consideration|INT GENERATED ALWAYS|0||0|2 ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",753668177,GENERATED column support,
https://github.com/simonw/datasette/issues/1134#issuecomment-742260116,https://api.github.com/repos/simonw/datasette/issues/1134,742260116,MDEyOklzc3VlQ29tbWVudDc0MjI2MDExNg==,2181410,clausjuhl,2020-12-10T05:57:17Z,2020-12-10T05:57:17Z,NONE,"Hi Simon Thank you for the quick fix! And glad you like our use of Datasette (launches 1 January 2021). It's a site that currently (more to come) makes all minutes and their annexes from Aarhus City Council and the major committees (1997-2019) available to the public. So we're putting Datasette to good use :)","{""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 2, ""rocket"": 0, ""eyes"": 0}",760312579,"""_searchmode=raw"" throws an index out of range error when combined with ""_search_COLUMN""",
https://github.com/simonw/datasette/issues/1142#issuecomment-743732440,https://api.github.com/repos/simonw/datasette/issues/1142,743732440,MDEyOklzc3VlQ29tbWVudDc0MzczMjQ0MA==,6622733,nitinpaultifr,2020-12-12T09:56:40Z,2020-12-12T09:56:40Z,NONE,'Include all rows' seems like a fairly obvious alternative,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763361458,"""Stream all rows"" is not at all obvious",
https://github.com/simonw/datasette/issues/1142#issuecomment-743998792,https://api.github.com/repos/simonw/datasette/issues/1142,743998792,MDEyOklzc3VlQ29tbWVudDc0Mzk5ODc5Mg==,6622733,nitinpaultifr,2020-12-13T12:14:06Z,2020-12-13T12:14:06Z,NONE,"Agreed, it would definitely provide better controls. However, I do feel it makes for a bit of inconsistent UX for the 'Advanced export' section, with links to download for JSON, checkboxes and radio buttons + button to download for CSV. Do you think this example makes the UX a bit nicer/consistent? ![Screenshot 2020-12-13 at 5 38 43 PM](https://user-images.githubusercontent.com/6622733/102011444-1dc1cd00-3d6a-11eb-9e38-5af198161e80.png) I could give it a try if you'd like but I've never contributed to an actual project! 
","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763361458,"""Stream all rows"" is not at all obvious", https://github.com/simonw/datasette/issues/1142#issuecomment-744522099,https://api.github.com/repos/simonw/datasette/issues/1142,744522099,MDEyOklzc3VlQ29tbWVudDc0NDUyMjA5OQ==,6622733,nitinpaultifr,2020-12-14T15:37:47Z,2020-12-14T15:37:47Z,NONE,"Alright I could give it a try! This might be a stupid question, can you tell me how to run the server from my fork? So that I can test the changes?","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763361458,"""Stream all rows"" is not at all obvious", https://github.com/simonw/datasette/issues/1142#issuecomment-745162571,https://api.github.com/repos/simonw/datasette/issues/1142,745162571,MDEyOklzc3VlQ29tbWVudDc0NTE2MjU3MQ==,6622733,nitinpaultifr,2020-12-15T09:22:58Z,2020-12-15T09:22:58Z,NONE,"You're right, probably more straightforward to have the links for JSON. I was imagining to toggle the `href` for the 'Export JSON' link (button) to the selected shape, but it'll probably be needlessly complex in the end.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763361458,"""Stream all rows"" is not at all obvious", https://github.com/simonw/datasette/issues/1143#issuecomment-744618787,https://api.github.com/repos/simonw/datasette/issues/1143,744618787,MDEyOklzc3VlQ29tbWVudDc0NDYxODc4Nw==,114388,yurivish,2020-12-14T18:15:00Z,2020-12-15T02:21:53Z,NONE,"From a quick look at the README, it does seem to do everything I need, thanks! I think the argument for inclusion in core is to lower the chances of unwanted data access. A local server can be accessed by anybody who can make an HTTP request to your computer regardless of CORS rules, but the default `*` rule additionally opens up access to the local instance to any website you visit while it is running. That's probably not what people typically intend, particularly when the data is of a sensitive nature. A default of requiring the user to specify the origin (allowing `*` but encouraging a narrower scope) would solve this problem entirely, I think. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",764059235,"More flexible CORS support in core, to encourage good security practices", https://github.com/simonw/datasette/issues/1144#issuecomment-744489028,https://api.github.com/repos/simonw/datasette/issues/1144,744489028,MDEyOklzc3VlQ29tbWVudDc0NDQ4OTAyOA==,475613,MarkusH,2020-12-14T14:47:11Z,2020-12-14T14:47:11Z,NONE,"Thanks for opening the issue, @simonw. Let me elaborate on my Tweets. [datasette-chartjs](https://github.com/MarkusH/datasette-chartjs) provides drop down lists to pick the chart visualization (e.g. bar, line, doughnut, pie, ...) as well as the column used for the ""x axis"" (e.g. time). A user can change the values on-demand. The chart will be redrawn w/o querying the database again. However, if a user wants to change the underlying query, they will use the SQL field provided by datasette or any of the other datasette built-in features to amend a query. 
In order to maintain a user's selections for the plugin, datasette-chartjs copies some parts of [datasette-vega](https://github.com/simonw/datasette-vega) which persist the chosen visualization and column in the hash part of a URL (the stuff behind the `#`). The plugin loads the config from the hash upon initialization on the next page and uses it accordingly. Additionally, datasette-vega and datasette-chartjs need to make sure to include the hash in all links and forms that cause a reload of the page. This is so that the config persists between clicks. This ticket is about moving these parts into datasette that provide the functionality to do so. This includes: 1. a way to load config options with a given prefix from the current URL hash 1. a way to update the current URL hash with a new config value or a bunch of config options 1. updating all necessary links and forms on the current page to include the URL hash whenever it's updated 1. to prevent leaking config options to external pages, only ""internal"" links should be updated There's another, optional, feature that we might want to think about during the design phase: the scope of the config. Links within a datasette instance have 1 of 3 scopes: 1. global, for the whole datasette project 1. database, for all tables in a database 1. table, only for a table within a database When updating the links and forms as pointed out in 3. above, it might be worth considering which links need to be updated. I could imagine a plugin that wants to persist some setting across all tables within a database but another setting only within a table.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",765637324,JavaScript to help plugins interact with the fragment part of the URL,
https://github.com/simonw/datasette/issues/1150#issuecomment-751476406,https://api.github.com/repos/simonw/datasette/issues/1150,751476406,MDEyOklzc3VlQ29tbWVudDc1MTQ3NjQwNg==,18221871,noklam,2020-12-27T14:51:39Z,2020-12-27T14:51:39Z,NONE,"I like the idea of _internal, it's a nice way to get a data catalog quickly. I wonder if this trick applies to db other than SQLite.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",770436876,Maintain an in-memory SQLite table of connected databases and their tables,
https://github.com/simonw/datasette/issues/1165#issuecomment-753033121,https://api.github.com/repos/simonw/datasette/issues/1165,753033121,MDEyOklzc3VlQ29tbWVudDc1MzAzMzEyMQ==,154364,dracos,2020-12-31T19:33:47Z,2020-12-31T19:33:47Z,NONE,"Sorry to go on about it, but it's my only example ;) And thought it might be of interest/use. 
Here is FixMyStreet's Cypress workflow https://github.com/mysociety/fixmystreet/blob/master/.github/workflows/cypress.yml with the master script that sets up server etc at https://github.com/mysociety/fixmystreet/blob/master/bin/browser-tests (that has features such as working inside/outside Vagrant, and can do JS code coverage) and then the tests are at https://github.com/mysociety/fixmystreet/tree/master/.cypress/cypress/integration","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",776635426,Mechanism for executing JavaScript unit tests,
https://github.com/simonw/datasette/issues/1166#issuecomment-783560017,https://api.github.com/repos/simonw/datasette/issues/1166,783560017,MDEyOklzc3VlQ29tbWVudDc4MzU2MDAxNw==,94334,thorn0,2021-02-22T18:00:57Z,2021-02-22T18:13:11Z,NONE,"Hi! I don't think Prettier supports this syntax for globs: `datasette/static/*[!.min].js` Are you sure that works? Prettier uses https://github.com/mrmlnc/fast-glob, which in turn uses https://github.com/micromatch/micromatch, and the docs for these packages don't mention this syntax. As per the docs, square brackets should work as in regexes (`foo-[1-5].js`). Tested it. Apparently, it works as a negated character class in regexes (like `[^.min]`). I wonder where this syntax comes from. Micromatch doesn't support that: ```js micromatch(['static/table.js', 'static/n.js'], ['static/*[!.min].js']); // result: [""static/n.js""] -- brackets are treated like [!.min] in regexes, without negation ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777140799,Adopt Prettier for JavaScript code formatting,
https://github.com/simonw/datasette/issues/1171#issuecomment-754911290,https://api.github.com/repos/simonw/datasette/issues/1171,754911290,MDEyOklzc3VlQ29tbWVudDc1NDkxMTI5MA==,59874,rcoup,2021-01-05T21:31:15Z,2021-01-05T21:31:15Z,NONE,"We did this for [Sno](https://sno.earth) under macOS — it's a PyInstaller binary/setup which uses [Packages](http://s.sudre.free.fr/Software/Packages/about.html) for packaging. * [Building & Signing](https://github.com/koordinates/sno/blob/master/platforms/Makefile#L67-L95) * [Packaging & Notarizing](https://github.com/koordinates/sno/blob/master/platforms/Makefile#L121-L215) * [Github Workflow](https://github.com/koordinates/sno/blob/master/.github/workflows/build.yml#L228-L269) has the CI side of it FYI (if you ever get to it) for Windows you need to get a code signing certificate. And if you want automated CI, you'll want to get an ""EV CodeSigning for HSM"" certificate from GlobalSign, which then lets you put the certificate into Azure Key Vault. Which you can use with [azuresigntool](https://github.com/vcsjones/AzureSignTool) to sign your code & installer. (Non-EV certificates are a waste of time, the user still gets big warnings at install time). ","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",778450486,GitHub Actions workflow to build and sign macOS binary executables,
https://github.com/simonw/datasette/issues/1175#issuecomment-1195442266,https://api.github.com/repos/simonw/datasette/issues/1175,1195442266,IC_kwDOBm6k_c5HQQBa,8523191,RamiAwar,2022-07-26T12:52:10Z,2022-07-26T12:52:10Z,NONE,"I'm using this in a separate FastAPI app, worked perfectly when I changed the AsyncBoundLogger to BoundLogger only. 
Also for some reason, I'm now getting some logs surfacing from internal packages, like Elasticsearch. But don't have time to deal with that now.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779156520,Use structlog for logging,
https://github.com/simonw/datasette/issues/1175#issuecomment-762488336,https://api.github.com/repos/simonw/datasette/issues/1175,762488336,MDEyOklzc3VlQ29tbWVudDc2MjQ4ODMzNg==,758858,hannseman,2021-01-18T21:59:28Z,2021-01-18T22:00:31Z,NONE,"I encountered your issue when trying to find a solution and came up with the following, maybe it can help. ```python import logging.config from typing import Tuple import structlog import uvicorn from example.config import settings shared_processors: Tuple[structlog.types.Processor, ...] = ( structlog.contextvars.merge_contextvars, structlog.stdlib.add_logger_name, structlog.stdlib.add_log_level, structlog.processors.TimeStamper(fmt=""iso""), ) logging_config = { ""version"": 1, ""disable_existing_loggers"": False, ""formatters"": { ""json"": { ""()"": structlog.stdlib.ProcessorFormatter, ""processor"": structlog.processors.JSONRenderer(), ""foreign_pre_chain"": shared_processors, }, ""console"": { ""()"": structlog.stdlib.ProcessorFormatter, ""processor"": structlog.dev.ConsoleRenderer(), ""foreign_pre_chain"": shared_processors, }, **uvicorn.config.LOGGING_CONFIG[""formatters""], }, ""handlers"": { ""default"": { ""level"": ""DEBUG"", ""class"": ""logging.StreamHandler"", ""formatter"": ""json"" if not settings.debug else ""console"", }, ""uvicorn.access"": { ""level"": ""INFO"", ""class"": ""logging.StreamHandler"", ""formatter"": ""access"", }, ""uvicorn.default"": { ""level"": ""INFO"", ""class"": ""logging.StreamHandler"", ""formatter"": ""default"", }, }, ""loggers"": { """": {""handlers"": [""default""], ""level"": ""INFO""}, ""uvicorn.error"": { ""handlers"": [""default"" if not settings.debug else ""uvicorn.default""], ""level"": ""INFO"", ""propagate"": False, }, ""uvicorn.access"": { ""handlers"": [""default"" if not settings.debug else ""uvicorn.access""], ""level"": ""INFO"", ""propagate"": False, }, }, } def setup_logging() -> None: structlog.configure( processors=[ structlog.stdlib.filter_by_level, *shared_processors, structlog.stdlib.PositionalArgumentsFormatter(), structlog.processors.StackInfoRenderer(), structlog.processors.format_exc_info, structlog.stdlib.ProcessorFormatter.wrap_for_formatter, ], context_class=dict, logger_factory=structlog.stdlib.LoggerFactory(), wrapper_class=structlog.stdlib.AsyncBoundLogger, cache_logger_on_first_use=True, ) logging.config.dictConfig(logging_config) ``` And then it'll be run on the startup event: ```python @app.on_event(""startup"") async def startup_event() -> None: setup_logging() ``` It depends on a setting called `debug` which controls if it should output the regular uvicorn logging or json. ","{""total_count"": 15, ""+1"": 7, ""-1"": 0, ""laugh"": 1, ""hooray"": 1, ""confused"": 0, ""heart"": 5, ""rocket"": 1, ""eyes"": 0}",779156520,Use structlog for logging,
https://github.com/simonw/datasette/issues/1175#issuecomment-984569477,https://api.github.com/repos/simonw/datasette/issues/1175,984569477,IC_kwDOBm6k_c46r1aF,24821294,AnkitKundariya,2021-12-02T12:09:30Z,2021-12-02T12:09:30Z,NONE,"@hannseman I have tried the above suggestion given by you but somehow I'm getting the below error. 
_note: I'm running my application with Docker._ `app_1 | {""event"": ""Exception in ASGI application\n"", ""exc_info"": ["""", ""RuntimeError('no running event loop')"", """"], ""logger"": ""uvicorn.error"", ""level"": ""error"", ""timestamp"": ""2021-12-02T12:06:36.011448Z""} `","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",779156520,Use structlog for logging,
https://github.com/simonw/datasette/issues/1181#issuecomment-998999230,https://api.github.com/repos/simonw/datasette/issues/1181,998999230,IC_kwDOBm6k_c47i4S-,9308268,rayvoelker,2021-12-21T18:25:15Z,2021-12-21T18:25:15Z,NONE,"I wonder if I'm encountering the same bug (or something related). I had previously been using the .csv feature to run queries and then fetch results for the pandas `read_csv()` function, but it seems to have stopped working recently. https://ilsweb.cincinnatilibrary.org/collection-analysis/collection-analysis/current_collection-3d56dbf.csv?sql=select%0D%0A++*%0D%0Afrom%0D%0A++bib%0D%0Alimit%0D%0A++100&_size=max Datasette v0.59.4 ![image](https://user-images.githubusercontent.com/9308268/146979957-66911877-2cd9-4022-bc76-fd54e4a3a6f7.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",781262510,"Certain database names results in 404: ""Database not found: None""",
https://github.com/simonw/datasette/issues/1196#issuecomment-765639968,https://api.github.com/repos/simonw/datasette/issues/1196,765639968,MDEyOklzc3VlQ29tbWVudDc2NTYzOTk2OA==,2826376,QAInsights,2021-01-22T19:37:15Z,2021-01-22T19:37:15Z,NONE,I tried deployment in WSL. It is working fine https://jmeter.vercel.app/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",791237799,Access Denied Error in Windows,
https://github.com/simonw/datasette/issues/1196#issuecomment-819775388,https://api.github.com/repos/simonw/datasette/issues/1196,819775388,MDEyOklzc3VlQ29tbWVudDgxOTc3NTM4OA==,1219001,robroc,2021-04-14T19:28:38Z,2021-04-14T19:28:38Z,NONE,@QAInsights I'm having a similar problem when publishing to Cloud Run on Windows. It's not able to access certain packages in my conda environment where Datasette is installed. Can you explain how you got it to work in WSL? Were you able to access the .db file in the Windows file system? Thank you.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",791237799,Access Denied Error in Windows,
https://github.com/simonw/datasette/issues/120#issuecomment-355487646,https://api.github.com/repos/simonw/datasette/issues/120,355487646,MDEyOklzc3VlQ29tbWVudDM1NTQ4NzY0Ng==,723567,nickdirienzo,2018-01-05T07:10:12Z,2018-01-05T07:10:12Z,NONE,"Ah, glad I found this issue. I have private data that I'd like to share to a few different people. Personally, a shared username and password would be sufficient for me, more-or-less Basic Auth. Do you have more complex requirements in mind? I'm not sure if ""plugin"" means ""build a plugin"" or ""find a plugin"" or something else entirely. FWIW, I stumbled upon [sanic-auth](https://github.com/pyx/sanic-auth) which looks like a new project to bring some interfaces around auth to sanic, similar to Flask. Alternatively, it shouldn't be too bad to add in Basic Auth. 
If we went down that route, that would probably be best built as a separate package for sanic that `datasette` brings in. What are your thoughts around this?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275087397,Plugin that adds an authentication layer of some sort,
https://github.com/simonw/datasette/issues/120#issuecomment-439421164,https://api.github.com/repos/simonw/datasette/issues/120,439421164,MDEyOklzc3VlQ29tbWVudDQzOTQyMTE2NA==,36796532,ad-si,2018-11-16T15:05:18Z,2018-11-16T15:05:18Z,NONE,This would be an awesome feature ❤️ ,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275087397,Plugin that adds an authentication layer of some sort,
https://github.com/simonw/datasette/issues/120#issuecomment-496966227,https://api.github.com/repos/simonw/datasette/issues/120,496966227,MDEyOklzc3VlQ29tbWVudDQ5Njk2NjIyNw==,26342344,duarteocarmo,2019-05-29T14:40:52Z,2019-05-29T14:40:52Z,NONE,I would really like this. If you give me some pointers @simonw I'm willing to PR!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275087397,Plugin that adds an authentication layer of some sort,
https://github.com/simonw/datasette/issues/1210#issuecomment-773977128,https://api.github.com/repos/simonw/datasette/issues/1210,773977128,MDEyOklzc3VlQ29tbWVudDc3Mzk3NzEyOA==,525780,heyarne,2021-02-05T11:30:34Z,2021-02-05T11:30:34Z,NONE,"Thanks for your quick reply! Having changed my `metadata.yml`, queries AND database I can't really reproduce it anymore, sorry. But at least I'm happy to say that it works now! :) Thanks again for the super nifty tool, very appreciated.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",796234313,Immutable Database w/ Canned Queries,
https://github.com/simonw/datasette/issues/1217#issuecomment-1303299509,https://api.github.com/repos/simonw/datasette/issues/1217,1303299509,IC_kwDOBm6k_c5NrsW1,31312775,mattmalcher,2022-11-04T11:35:13Z,2022-11-04T11:35:13Z,NONE,"The following worked for deployment to RStudio / Posit Connect An app.py along the lines of: ```python from pathlib import Path from datasette.app import Datasette example_db = Path(__file__).parent / ""data"" / ""example.db"" # use connect 'Content URL' setting here to set app to /datasette/ ds = Datasette(files=[example_db], settings={""base_url"": ""/datasette/""}) ds._startup_invoked = True ds_app = ds.app() ``` Then to deploy, from within a virtualenv with `rsconnect-python` ```sh rsconnect write-manifest fastapi -p $VIRTUAL_ENV/bin/python -e app:ds_app -o . rsconnect deploy manifest manifest.json -n -t ""Example Datasette"" ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",802513359,Possible to deploy as a python app (for Rstudio connect server)?,
https://github.com/simonw/datasette/issues/1217#issuecomment-1303301786,https://api.github.com/repos/simonw/datasette/issues/1217,1303301786,IC_kwDOBm6k_c5Nrs6a,31312775,mattmalcher,2022-11-04T11:37:52Z,2022-11-04T11:37:52Z,NONE,"All seems to work well, but there are some glitches to do with proxies, see #1883. 
Excited to use this :)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",802513359,Possible to deploy as a python app (for Rstudio connect server)?,
https://github.com/simonw/datasette/issues/1217#issuecomment-774385092,https://api.github.com/repos/simonw/datasette/issues/1217,774385092,MDEyOklzc3VlQ29tbWVudDc3NDM4NTA5Mg==,6165713,plpxsk,2021-02-06T02:49:11Z,2021-02-06T02:49:11Z,NONE,"A good reference seems to be the note to run `datasette` as a module in https://github.com/simonw/datasette/pull/556 ","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",802513359,Possible to deploy as a python app (for Rstudio connect server)?,
https://github.com/simonw/datasette/issues/1217#issuecomment-774528913,https://api.github.com/repos/simonw/datasette/issues/1217,774528913,MDEyOklzc3VlQ29tbWVudDc3NDUyODkxMw==,639730,virtadpt,2021-02-06T19:23:41Z,2021-02-06T19:23:41Z,NONE,I've had a lot of success running it as an OpenFaaS lambda.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",802513359,Possible to deploy as a python app (for Rstudio connect server)?,
https://github.com/simonw/datasette/issues/1218#issuecomment-784157345,https://api.github.com/repos/simonw/datasette/issues/1218,784157345,MDEyOklzc3VlQ29tbWVudDc4NDE1NzM0NQ==,1244799,soobrosa,2021-02-23T12:12:17Z,2021-02-23T12:12:17Z,NONE,"Topline this fixed the same problem for me. ``` brew install python@3.7 ln -s /usr/local/opt/python@3.7/bin/python3.7 /usr/local/opt/python/bin/python3.7 pip3 uninstall -y numpy pip3 uninstall -y setuptools pip3 install setuptools pip3 install numpy pip3 install datasette-publish-fly ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",803356942, /usr/local/opt/python3/bin/python3.6: bad interpreter: No such file or directory,
https://github.com/simonw/datasette/issues/1220#issuecomment-778008752,https://api.github.com/repos/simonw/datasette/issues/1220,778008752,MDEyOklzc3VlQ29tbWVudDc3ODAwODc1Mg==,30607,aborruso,2021-02-12T06:37:34Z,2021-02-12T06:37:34Z,NONE,"I have used my path, I'm running it from the folder in which I have the db. Do I need to use an absolute path? Do I need to create exactly that folder?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",806743116,Installing datasette via docker: Path 'fixtures.db' does not exist,
https://github.com/simonw/datasette/issues/1220#issuecomment-778467759,https://api.github.com/repos/simonw/datasette/issues/1220,778467759,MDEyOklzc3VlQ29tbWVudDc3ODQ2Nzc1OQ==,30607,aborruso,2021-02-12T21:35:17Z,2021-02-12T21:35:17Z,NONE,Thank you,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",806743116,Installing datasette via docker: Path 'fixtures.db' does not exist,
https://github.com/simonw/datasette/issues/1228#issuecomment-1072954795,https://api.github.com/repos/simonw/datasette/issues/1228,1072954795,IC_kwDOBm6k_c4_8_2r,7107523,Kabouik,2022-03-19T06:44:40Z,2022-03-19T06:44:40Z,NONE,"> ... unless your data had a column called `n`? Exactly, that's highly likely even though I can't double check from this computer just now. 
Thanks!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",810397025,"500 error caused by faceting if a column called `n` exists",
https://github.com/simonw/datasette/issues/123#issuecomment-698110186,https://api.github.com/repos/simonw/datasette/issues/123,698110186,MDEyOklzc3VlQ29tbWVudDY5ODExMDE4Ng==,45416,obra,2020-09-24T04:49:51Z,2020-09-24T04:49:51Z,NONE,"As a half-measure, I'd get value out of being able to upload a CSV and have datasette run csv-to-sqlite on it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats,
https://github.com/simonw/datasette/issues/123#issuecomment-698174957,https://api.github.com/repos/simonw/datasette/issues/123,698174957,MDEyOklzc3VlQ29tbWVudDY5ODE3NDk1Nw==,45416,obra,2020-09-24T07:42:05Z,2020-09-24T07:42:05Z,NONE," Oh. Awesome. On Thu, Sep 24, 2020 at 12:28:53AM -0700, Simon Willison wrote: > @obra there's a plugin for that! https://github.com/ > datasette-upload-csvs > > — > You are receiving this because you were mentioned. > Reply to this email directly, view it on GitHub, or unsubscribe.* > -- ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats,
https://github.com/simonw/datasette/issues/123#issuecomment-735440555,https://api.github.com/repos/simonw/datasette/issues/123,735440555,MDEyOklzc3VlQ29tbWVudDczNTQ0MDU1NQ==,11912854,jsancho-gpl,2020-11-29T19:12:30Z,2020-11-29T19:12:30Z,NONE,"[datasette-connectors](https://github.com/pytables/datasette-connectors) provides an API for making connectors for any file based database. For example, [datasette-pytables](https://github.com/pytables/datasette-pytables) is a connector for HDF5 files, so now is possible to use this type of files with Datasette. It'd be nice if Datasette could provide that API directly, for other file formats and for urls too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats,
https://github.com/simonw/datasette/issues/123#issuecomment-882096402,https://api.github.com/repos/simonw/datasette/issues/123,882096402,IC_kwDOBm6k_c40k7kS,921217,RayBB,2021-07-18T18:07:29Z,2021-07-18T18:07:29Z,NONE,"I also love the idea for this feature and wonder if it could work without having to download the whole database into memory at once if it's a rather large db. Obviously this could be slower but could have many use cases. 
My comment is partially inspired by this post about streaming sqlite dbs from github pages or such https://news.ycombinator.com/item?id=27016630 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats,
https://github.com/simonw/datasette/issues/1230#issuecomment-781330466,https://api.github.com/repos/simonw/datasette/issues/1230,781330466,MDEyOklzc3VlQ29tbWVudDc4MTMzMDQ2Ng==,7107523,Kabouik,2021-02-18T13:06:22Z,2021-02-18T15:22:15Z,NONE,"[Edit] Oh, I just saw the ""Load all"" button under the cluster map as well as the [setting to alter the max number of results](https://docs.datasette.io/en/stable/settings.html#max-returned-rows). So I guess this issue only is about the Vega charts.
Note that datasette-cluster-map also seems to be limited to 998 displayed points: ![ss-2021-02-18_140548](https://user-images.githubusercontent.com/7107523/108361225-15fb2a80-71ea-11eb-9a19-d885e8513f55.png)
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",811054000,"Vega charts are plotted only for rows on the visible page, cluster maps only for rows in the remaining pages", https://github.com/simonw/datasette/issues/1238#issuecomment-790857004,https://api.github.com/repos/simonw/datasette/issues/1238,790857004,MDEyOklzc3VlQ29tbWVudDc5MDg1NzAwNA==,79913,tsibley,2021-03-04T19:06:55Z,2021-03-04T19:06:55Z,NONE,"@rgieseke Ah, that's super helpful. Thank you for the workaround for now!","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813899472,Custom pages don't work with base_url setting, https://github.com/simonw/datasette/issues/124#issuecomment-346987395,https://api.github.com/repos/simonw/datasette/issues/124,346987395,MDEyOklzc3VlQ29tbWVudDM0Njk4NzM5NQ==,50138,janimo,2017-11-26T06:24:08Z,2017-11-26T06:24:08Z,NONE,"Are there performance gains when using immutable as opposed to read-only? From what I see other processes can still modify the DB when immutable, but there are no change notifications.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125805,Option to open readonly but not immutable, https://github.com/simonw/datasette/issues/124#issuecomment-347123991,https://api.github.com/repos/simonw/datasette/issues/124,347123991,MDEyOklzc3VlQ29tbWVudDM0NzEyMzk5MQ==,50138,janimo,2017-11-27T09:25:15Z,2017-11-27T09:25:15Z,NONE,"That's the only reference to immutable I saw as well, making me think that there may be no perceivable advantages over simply using mode=ro. Since the database is never or seldom updated the change notifications should not impact performance.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125805,Option to open readonly but not immutable, https://github.com/simonw/datasette/issues/1240#issuecomment-784312460,https://api.github.com/repos/simonw/datasette/issues/1240,784312460,MDEyOklzc3VlQ29tbWVudDc4NDMxMjQ2MA==,7107523,Kabouik,2021-02-23T16:07:10Z,2021-02-23T16:08:28Z,NONE,"Likewise, while answering to another issue regarding the Vega plugin, I realized that there is no such way of linking rows after a custom query, I only get this ""Link"" column with individual URLs for the default SQL view: ![ss-2021-02-23_170559](https://user-images.githubusercontent.com/7107523/108871491-1e3fd500-75f1-11eb-8f76-5d5a82cc14d7.png) Or is it there and I am just missing the option in my custom queries?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",814591962,Allow facetting on custom queries, https://github.com/simonw/datasette/issues/1241#issuecomment-784347646,https://api.github.com/repos/simonw/datasette/issues/1241,784347646,MDEyOklzc3VlQ29tbWVudDc4NDM0NzY0Ng==,7107523,Kabouik,2021-02-23T16:55:26Z,2021-02-23T16:57:39Z,NONE,"> I think it's possible that many users these days no longer assume they can paste a URL from the browser address bar (if they ever understood that at all) because to many apps are SPAs with broken URLs. 
Absolutely, that's why I thought my corner case with `iframe` preventing access to the datasette URL could actually be relevant in more general situations.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",814595021,Share button for copying current URL,
https://github.com/simonw/datasette/issues/1245#issuecomment-812711365,https://api.github.com/repos/simonw/datasette/issues/1245,812711365,MDEyOklzc3VlQ29tbWVudDgxMjcxMTM2NQ==,1111743,jungle-boogie,2021-04-02T20:53:35Z,2021-04-02T20:53:35Z,NONE,"Yes, I agree. Alternatively, maybe the header could be at the top and bottom, above the next page button. Maybe even have the header 50 records down?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",817544251,"Sticky table column headers would be useful, especially on the query page",
https://github.com/simonw/datasette/issues/1253#issuecomment-955384545,https://api.github.com/repos/simonw/datasette/issues/1253,955384545,IC_kwDOBm6k_c448gLh,1449512,dufferzafar,2021-10-30T16:00:42Z,2021-10-30T16:00:42Z,NONE,"Yeah, I was pressing Ctrl + Enter as well. Came here to open this issue and found out Shift + Enter works. @simonw Any way to configure this?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",826064552,"Capture ""Ctrl + Enter"" or ""⌘ + Enter"" to send SQL query?",
https://github.com/simonw/datasette/issues/1255#issuecomment-812680519,https://api.github.com/repos/simonw/datasette/issues/1255,812680519,MDEyOklzc3VlQ29tbWVudDgxMjY4MDUxOQ==,1111743,jungle-boogie,2021-04-02T19:37:57Z,2021-04-02T19:37:57Z,NONE,"Hello, I'm also experiencing a timeout in my environment. I don't know if it's because I need more indexes or a more powerful system. My data has 1,271,111 rows and when I try to create a facet, there's a timeout. I've tried this on two different columns that should significantly filter down data: `CITY` and `PARTY_REG`. Simon's johns_hopkins_csse_daily_reports has more rows and it's set up with two facets on load. He does have four indexes created, though. Do I need more indexes? I have one simple one so far: ``` CREATE INDEX [idx_party_reg] ON [county_active] ([PARTY_REG]); ``` I'm running Datasette 0.56 installed via pip with Python 3.7.3. `4.19.0-10-amd64 #1 SMP Debian 4.19.132-1 (2020-07-24) x86_64 GNU/Linux` ``` $ cat /etc/os-release PRETTY_NAME=""Debian GNU/Linux 10 (buster)"" NAME=""Debian GNU/Linux"" VERSION_ID=""10"" VERSION=""10 (buster)"" VERSION_CODENAME=buster ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",826700095,Facets timing out but work when filtering,
https://github.com/simonw/datasette/issues/1255#issuecomment-812710120,https://api.github.com/repos/simonw/datasette/issues/1255,812710120,MDEyOklzc3VlQ29tbWVudDgxMjcxMDEyMA==,1111743,jungle-boogie,2021-04-02T20:50:08Z,2021-04-02T20:50:08Z,NONE,"Hello again, I was able to get my facets running with this `settings.json`, which was lifted from one of Simon's datasettes and slightly modified. 
``` { ""default_page_size"": 100, ""max_returned_rows"": 1000, ""num_sql_threads"": 3, ""sql_time_limit_ms"": 9000, ""default_facet_size"": 10, ""facet_time_limit_ms"": 9000, ""facet_suggest_time_limit_ms"": 500, ""hash_urls"": false, ""allow_facet"": true, ""suggest_facets"": false, ""default_cache_ttl"": 5, ""default_cache_ttl_hashed"": 31536000, ""cache_size_kb"": 0, ""allow_csv_stream"": true, ""max_csv_mb"": 100, ""truncate_cells_html"": 2048, ""template_debug"": false, ""base_url"": ""/"" } ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",826700095,Facets timing out but work when filtering, https://github.com/simonw/datasette/issues/1258#issuecomment-807459633,https://api.github.com/repos/simonw/datasette/issues/1258,807459633,MDEyOklzc3VlQ29tbWVudDgwNzQ1OTYzMw==,1385831,wdccdw,2021-03-25T20:48:33Z,2021-03-25T20:49:34Z,NONE,"What about allowing default parameters when defining the query in metadata.yml? Something like: ``` databases: fec: queries: search_by_name: params: - q default-param-values: q: ""text to search"" sql: |- SELECT... ``` For now, I'm using a custom database-.html file that hardcodes a default param in the link, but I'd rather not customize the template just for that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",828858421,Allow canned query params to specify default values, https://github.com/simonw/datasette/issues/1261#issuecomment-803499509,https://api.github.com/repos/simonw/datasette/issues/1261,803499509,MDEyOklzc3VlQ29tbWVudDgwMzQ5OTUwOQ==,812795,brimstone,2021-03-21T02:06:43Z,2021-03-21T02:06:43Z,NONE,I can confirm 0.9.2 fixes the problem. Thanks for the fast response!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",832092321,Some links aren't properly URL encoded., https://github.com/simonw/datasette/issues/1262#issuecomment-802164134,https://api.github.com/repos/simonw/datasette/issues/1262,802164134,MDEyOklzc3VlQ29tbWVudDgwMjE2NDEzNA==,19328961,henry501,2021-03-18T17:55:00Z,2021-03-18T17:55:00Z,NONE,"Thanks for the comments. I'll take a look at the documentation to familiarize myself, as I haven't tried to write any plugins yet. With some luck I might be ready to write it when the hook is implemented.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",834602299,Plugin hook that could support 'order by random()' for table view, https://github.com/simonw/datasette/issues/1265#issuecomment-803160804,https://api.github.com/repos/simonw/datasette/issues/1265,803160804,MDEyOklzc3VlQ29tbWVudDgwMzE2MDgwNA==,468612,yunzheng,2021-03-19T22:05:12Z,2021-03-19T22:05:12Z,NONE,Wow that was fast! Thanks for this very cool project and quick update! 
👍 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",836123030,Support for HTTP Basic Authentication, https://github.com/simonw/datasette/issues/1272#issuecomment-1199115002,https://api.github.com/repos/simonw/datasette/issues/1272,1199115002,IC_kwDOBm6k_c5HeQr6,37748899,xmichele,2022-07-29T10:22:58Z,2022-07-29T10:22:58Z,NONE,"> test_dockerfile.py
> I'm skipping this for the moment because the new Dockerfile shape introduced in [#1249 (comment)](https://github.com/simonw/datasette/issues/1249#issuecomment-804404544) isn't compatible with this technique, since it installs Datasette from PyPI rather than directly from the repo.
>
> Will need to change that if I want to do this unit tests thing.

Hello, couldn't you copy it in a later step directly into the output directory, e.g. `COPY test_dockerfile.py /usr/lib/python*/..` or something like that? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",838245338,Unit tests for the Dockerfile, https://github.com/simonw/datasette/issues/1276#issuecomment-811209922,https://api.github.com/repos/simonw/datasette/issues/1276,811209922,MDEyOklzc3VlQ29tbWVudDgxMTIwOTkyMg==,1314318,justinallen,2021-03-31T16:27:26Z,2021-03-31T16:27:26Z,NONE,Fantastic. Thank you! ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",841456306,"Invalid SQL: ""no such table: pragma_database_list"" on database page", https://github.com/simonw/datasette/issues/1286#issuecomment-860047794,https://api.github.com/repos/simonw/datasette/issues/1286,860047794,MDEyOklzc3VlQ29tbWVudDg2MDA0Nzc5NA==,4068,frafra,2021-06-12T12:36:15Z,2021-06-12T12:36:15Z,NONE,"@mroswell That is a very nice solution. I wonder if custom classes, like `col-columnName-value`, could be automatically added to cells when facets on such a column are enabled, to allow custom styling without having to modify templates or add custom JavaScript code.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",849220154,Better default display of arrays of items, https://github.com/simonw/datasette/issues/1298#issuecomment-1125083348,https://api.github.com/repos/simonw/datasette/issues/1298,1125083348,IC_kwDOBm6k_c5DD2jU,7150,llimllib,2022-05-12T14:43:51Z,2022-05-12T14:43:51Z,NONE,"user report: I found this issue because the first time I tried to use datasette for real, I displayed a large table, and thought there was no horizontal scroll bar at all. I didn't even consider that I had to scroll all the way to the end of the page to find it.

Just chipping in to say that this confused me, and I didn't even find the scroll bar until after I saw this issue. I don't know what the right answer is, but IMO the UI should suggest to the user that there is a way to view the data that's hidden to the right.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",855476501,improve table horizontal scroll experience, https://github.com/simonw/datasette/issues/1298#issuecomment-823064725,https://api.github.com/repos/simonw/datasette/issues/1298,823064725,MDEyOklzc3VlQ29tbWVudDgyMzA2NDcyNQ==,154364,dracos,2021-04-20T07:57:14Z,2021-04-20T07:57:14Z,NONE,"My suggestions, originally made on twitter, but might be better here now:
1. Could have a CSS shadow (one of the comments on https://stackoverflow.com/questions/44793453/how-do-i-add-a-top-and-bottom-shadow-while-scrolling-but-only-when-needed is a codepen for horizontal instead of vertical);
2. Could give the table a max-height (either the window or work out the available space) so that it is both vertically/horizontally scrollable and you don't have to scroll to the bottom in order to see this;
3. On a desktop browser, what I think I'd want is an absolute grid to work with - left query/filters, TR chart (or map), BR table. No problem with scrolling then. Here is a mockup I made when this was about the map plugin:

![image](https://user-images.githubusercontent.com/154364/115358389-82c47e00-a1b5-11eb-8a63-0ca14fd23d32.png)

![image](https://user-images.githubusercontent.com/154364/115358454-97087b00-a1b5-11eb-9501-cf884ae72d7c.png)
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",855476501,improve table horizontal scroll experience, https://github.com/simonw/datasette/issues/1298#issuecomment-823102978,https://api.github.com/repos/simonw/datasette/issues/1298,823102978,MDEyOklzc3VlQ29tbWVudDgyMzEwMjk3OA==,154364,dracos,2021-04-20T08:51:23Z,2021-04-20T08:51:23Z,NONE,"2. Max height would still let you scroll the page to underneath the facets to the table, but would mean the table would never take up more than your window size, so the horizontal scrollbar would be visible as soon as the table took up the size of the window.

3. Yes, this wouldn't be for mobile :) It'd be desktop-only styling. On mobile you can scroll much more easily with touch, anyway. In your case, perhaps better would be the whole top half would be facets, bottom left quadrant chart, bottom right table. Depends upon the particular use case, as you say.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",855476501,improve table horizontal scroll experience, https://github.com/simonw/datasette/issues/1304#issuecomment-981980048,https://api.github.com/repos/simonw/datasette/issues/1304,981980048,IC_kwDOBm6k_c46h9OQ,30934,20after4,2021-11-29T20:13:53Z,2021-11-29T20:14:11Z,NONE,There isn't any way to do this with sqlite as far as I know. The only option is to insert the right number of ? placeholders into the sql template and then provide an array of values.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",863884805,"Document how to send multiple values for ""Named parameters"" ", https://github.com/simonw/datasette/issues/1304#issuecomment-988459453,https://api.github.com/repos/simonw/datasette/issues/1304,988459453,IC_kwDOBm6k_c466rG9,9308268,rayvoelker,2021-12-08T03:15:27Z,2021-12-08T03:15:27Z,NONE,"I was thinking if there were a way to use some sort of string function to ""unpack"" the values and convert them into ints...
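For what it's worth, here's a rough sketch of that idea using SQLite's built-in `json_each()` (assuming the JSON1 functions are available; the `facetable` table and the `:ids` parameter are made up for illustration):

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('create table facetable (id integer primary key, name text)')
conn.executemany('insert into facetable values (?, ?)',
                 [(1, 'one'), (2, 'two'), (3, 'three')])

# Wrap the comma-separated string in [] so json_each() can iterate
# over it, then cast each unpacked value to an integer.
sql = '''
    select * from facetable
    where id in (
        select cast(value as integer)
        from json_each('[' || :ids || ']')
    )
'''
print(conn.execute(sql, {'ids': '1,3'}).fetchall())
```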
hm","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",863884805,"Document how to send multiple values for ""Named parameters"" ", https://github.com/simonw/datasette/issues/1304#issuecomment-988461884,https://api.github.com/repos/simonw/datasette/issues/1304,988461884,IC_kwDOBm6k_c466rs8,30934,20after4,2021-12-08T03:20:26Z,2021-12-08T03:20:26Z,NONE,"The easiest or most straightforward thing to do is to use named parameters like:

```sql
select * from mytable -- hypothetical table name
where key in (:p1, :p2, :p3)
```

And simply construct the list of placeholders dynamically based on the number of values. Doing this is possible with datasette if you forgo ""canned queries"" and just use the raw query endpoint and pass the query sql, along with p1, p2 ... in the request.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",863884805,"Document how to send multiple values for ""Named parameters"" ", https://github.com/simonw/datasette/issues/1304#issuecomment-988463455,https://api.github.com/repos/simonw/datasette/issues/1304,988463455,IC_kwDOBm6k_c466sFf,30934,20after4,2021-12-08T03:23:14Z,2021-12-08T03:23:14Z,NONE,I actually think it would be a useful thing to add support for in datasette. It wouldn't be difficult to unwind an array of params and add the placeholders automatically.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",863884805,"Document how to send multiple values for ""Named parameters"" ", https://github.com/simonw/datasette/issues/1310#issuecomment-828670621,https://api.github.com/repos/simonw/datasette/issues/1310,828670621,MDEyOklzc3VlQ29tbWVudDgyODY3MDYyMQ==,3747136,ColinMaudry,2021-04-28T18:12:08Z,2021-04-28T18:12:08Z,NONE,"Apparently, besides a string, Response could also [work with bytes](https://github.com/simonw/datasette/blob/master/datasette/utils/asgi.py#L338).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",870125126,I'm creating a plugin to export a spreadsheet file (.ods or .xlsx), https://github.com/simonw/datasette/issues/1310#issuecomment-829885904,https://api.github.com/repos/simonw/datasette/issues/1310,829885904,MDEyOklzc3VlQ29tbWVudDgyOTg4NTkwNA==,3747136,ColinMaudry,2021-04-30T06:58:46Z,2021-04-30T07:26:11Z,NONE,"I made it work with openpyxl. I'm not sure all the code under `@hookimpl` is necessary...
but it works :)

```python
from datasette import hookimpl
from datasette.utils.asgi import Response
from openpyxl import Workbook
from openpyxl.writer.excel import save_virtual_workbook
from openpyxl.cell import WriteOnlyCell
from openpyxl.styles import Alignment, Font, PatternFill
from tempfile import NamedTemporaryFile


def render_spreadsheet(rows):
    wb = Workbook(write_only=True)
    ws = wb.create_sheet()
    ws = wb.active
    ws.title = ""decp""
    columns = rows[0].keys()
    headers = []
    for col in columns:
        c = WriteOnlyCell(ws, col)
        c.fill = PatternFill(""solid"", fgColor=""DDEFFF"")
        headers.append(c)
    ws.append(headers)
    for row in rows:
        wsRow = []
        for col in columns:
            c = WriteOnlyCell(ws, row[col])
            if col == ""objet"":
                c.alignment = Alignment(wrapText=True)
            wsRow.append(c)
        ws.append(wsRow)
    with NamedTemporaryFile() as tmp:
        wb.save(tmp.name)
        tmp.seek(0)
        return Response(
            tmp.read(),
            headers={
                'Content-Disposition': 'attachment; filename=decp.xlsx',
                'Content-type': 'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet'
            }
        )


@hookimpl
def register_output_renderer():
    return {""extension"": ""xlsx"", ""render"": render_spreadsheet, ""can_render"": lambda: False}
```

The key part was to find the right function to wrap the spreadsheet object `wb`. `NamedTemporaryFile()` did it! I'll update this issue when the plugin is packaged and ready for broader use.","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",870125126,I'm creating a plugin to export a spreadsheet file (.ods or .xlsx), https://github.com/simonw/datasette/issues/1327#issuecomment-847271122,https://api.github.com/repos/simonw/datasette/issues/1327,847271122,MDEyOklzc3VlQ29tbWVudDg0NzI3MTEyMg==,20846286,GmGniap,2021-05-24T19:10:21Z,2021-05-24T19:10:21Z,NONE,"wow, thanks a lot @simonw, the problem is solved. I converted my current json file into utf-8 format with a Python script. It's working now. I'm using Windows 10. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",892457208,Support Unicode characters in metadata.json, https://github.com/simonw/datasette/issues/1331#issuecomment-842495820,https://api.github.com/repos/simonw/datasette/issues/1331,842495820,MDEyOklzc3VlQ29tbWVudDg0MjQ5NTgyMA==,475613,MarkusH,2021-05-17T17:18:05Z,2021-05-17T17:18:05Z,NONE,"Wow, you are _fast_! I didn't notice dependabot had opened a PR already. I was about to. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",893537744,Add support for Jinja2 version 3.0, https://github.com/simonw/datasette/issues/1331#issuecomment-842499728,https://api.github.com/repos/simonw/datasette/issues/1331,842499728,MDEyOklzc3VlQ29tbWVudDg0MjQ5OTcyOA==,475613,MarkusH,2021-05-17T17:24:30Z,2021-05-17T17:24:30Z,NONE,"> I wonder if there are any new 3.0 features we should be taking advantage of here that would justify pinning to 3.0 minimum?
The changelog reads like bug fixes and removal of deprecated parts to me.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",893537744,Add support for Jinja2 version 3.0, https://github.com/simonw/datasette/issues/1331#issuecomment-844970776,https://api.github.com/repos/simonw/datasette/issues/1331,844970776,MDEyOklzc3VlQ29tbWVudDg0NDk3MDc3Ng==,475613,MarkusH,2021-05-20T10:40:25Z,2021-05-20T10:40:25Z,NONE,"Any chance you could push a new datasette release with the updated dependencies in the setup.py, @simonw?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",893537744,Add support for Jinja2 version 3.0, https://github.com/simonw/datasette/issues/1331#issuecomment-846137332,https://api.github.com/repos/simonw/datasette/issues/1331,846137332,MDEyOklzc3VlQ29tbWVudDg0NjEzNzMzMg==,4312421,stonebig,2021-05-21T17:57:53Z,2021-05-21T17:57:53Z,NONE,I'm also stuck because datasette wants itsdangerous~=1.1 instead of allowing itsdangerous 2.0.0,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",893537744,Add support for Jinja2 version 3.0, https://github.com/simonw/datasette/issues/1362#issuecomment-855428296,https://api.github.com/repos/simonw/datasette/issues/1362,855428296,MDEyOklzc3VlQ29tbWVudDg1NTQyODI5Ng==,154364,dracos,2021-06-06T16:53:20Z,2021-06-06T16:53:20Z,NONE,"> Presumably this would also require adding Content-Security-Policy to the Vary header though, which will have a nasty effect on Cloudflare and Fastly and such like.

No, because the Vary header is about *request* headers that cause the response to vary, not response headers.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",912864936,Consider using CSP to protect against future XSS, https://github.com/simonw/datasette/issues/1375#issuecomment-860548546,https://api.github.com/repos/simonw/datasette/issues/1375,860548546,MDEyOklzc3VlQ29tbWVudDg2MDU0ODU0Ng==,4068,frafra,2021-06-14T09:41:59Z,2021-06-14T09:41:59Z,NONE,"> There is a feature for this at the moment, but it's a little bit hidden: you can use `?_json=col` to tell
> Datasette that you would like a specific column to be exported as nested JSON: https://docs.datasette.io/en/stable/json_api.html#special-json-arguments

Thanks :)

> I considered trying to make this automatic - so it detects columns that appear to contain valid JSON and outputs them as nested objects - but the problem with that is that it can lead to inconsistent results - you might hit the API and find that not every column contains valid JSON (compared to the previous day) resulting in the API returning a string instead of the expected dictionary and breaking your code.

If a developer is not sure whether the JSON fields are valid, but then retrieves and parses them, their code should handle errors too. Handling inconsistent data is necessary due to the nature of SQLite. A global or dataset option to render the data as it has been defined (JSON, boolean, etc.) when requesting JSON could allow the user to download a regular JSON from the browser without having to rely on APIs.
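In the meantime, something like this already works with the documented `?_json=` argument (a rough sketch; the instance URL, `mydb` database, `mytable` table and `tags` column are all hypothetical):

```python
import json
import urllib.request

# Ask Datasette to unnest the hypothetical 'tags' JSON column and
# return plain array-shaped JSON.
url = ('https://example.com/mydb/mytable.json'
       '?_shape=array&_json=tags')
rows = json.load(urllib.request.urlopen(url))
print(rows[0]['tags'])  # a list/dict, not a TEXT blob
```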
I would guess someone could just make a custom template with an extra JSON-parsed download button otherwise :)","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",919508498,JSON export dumps JSON fields as TEXT, https://github.com/simonw/datasette/issues/1380#issuecomment-967181828,https://api.github.com/repos/simonw/datasette/issues/1380,967181828,IC_kwDOBm6k_c45pgYE,7094907,Segerberg,2021-11-12T15:00:18Z,2021-11-12T20:02:29Z,NONE,"There is no such option, see https://github.com/simonw/datasette/issues/43. But you could write a plugin using the datasette.add_database(db, name=None) https://docs.datasette.io/en/stable/internals.html#add-database-db-name-none ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",924748955,Serve all db files in a folder, https://github.com/simonw/datasette/issues/1380#issuecomment-967801997,https://api.github.com/repos/simonw/datasette/issues/1380,967801997,IC_kwDOBm6k_c45r3yN,7094907,Segerberg,2021-11-13T08:05:37Z,2021-11-13T08:09:11Z,NONE,"@glasnt yeah I guess that could be an option. I run datasette on large databases > 75gb and the startup time is a bit slow for me even with -i --inspect-file options. Here's a quick sketch for a plugin that will reload dbs in a folder that you set for the plugin in metadata.json. If you request /-reload-db, new dbs will be added. (You probably want to implement some authentication for this =) ) https://gist.github.com/Segerberg/b96a0e0a5389dce2396497323cda7042 ","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",924748955,Serve all db files in a folder, https://github.com/simonw/datasette/issues/1384#issuecomment-1062124485,https://api.github.com/repos/simonw/datasette/issues/1384,1062124485,IC_kwDOBm6k_c4_TrvF,167160,khusmann,2022-03-08T19:26:32Z,2022-03-08T19:26:32Z,NONE,"Looks like I'm late to the party here, but wanted to join the convo if there's still time before this interface is solidified in v1.0.

My plugin use case is for education / social science data, which is meta-data heavy in the documentation of measurement scales, instruments, collection procedures, etc. that I want to connect to columns, tables, and dbs (and render in static pages, but looks like I can do that with the jinja plugin hook). I'm still digging in and I think @brandonrobertz 's approach will work for me at least for now, but I want to bump this thread in the meantime -- are there still plans for an async metadata hook at some point in the future? (or are you considering other directions?)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",930807135,Plugin hook for dynamic metadata, https://github.com/simonw/datasette/issues/1384#issuecomment-1065929510,https://api.github.com/repos/simonw/datasette/issues/1384,1065929510,IC_kwDOBm6k_c4_iMsm,167160,khusmann,2022-03-12T17:49:59Z,2022-03-12T17:49:59Z,NONE,"Ok, I'm taking a slightly different approach, which I think is sort of close to the in-memory _metadata table idea.
I'm using a startup hook to load metadata / other info from the database, which I store in the datasette object for later:

```
@hookimpl
def startup(datasette):
    async def inner():
        datasette._mypluginmetadata = ...  # await db query
    return inner
```

Then, I can use this in other plugins:

```
@hookimpl
def render_cell(value, column, table, database, datasette):
    # use datasette._mypluginmetadata
    ...
```

For my app I don't need anything to update dynamically, so it's fine to pre-populate everything on startup. It's also good to have things precached, especially for a hook like render_cell, which would otherwise require a ton of redundant db queries.

Makes me wonder if we could take a sort of similar caching approach with the internal _metadata table. Like have a little watchdog that could query all of the attached dbs for their _metadata tables every 5min or so, which could then be merged into the in-memory _metadata table, which could then be accessed synchronously by the plugins, or something like that. For most of the use cases I can think of, live updates don't need to take effect immediately; refreshing a cache every 5min or on some other trigger (adjustable with a config setting) would be just fine.
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",930807135,Plugin hook for dynamic metadata, https://github.com/simonw/datasette/issues/1384#issuecomment-1065951744,https://api.github.com/repos/simonw/datasette/issues/1384,1065951744,IC_kwDOBm6k_c4_iSIA,167160,khusmann,2022-03-12T19:47:17Z,2022-03-12T19:47:17Z,NONE,"Awesome, thanks @brandonrobertz !

The plugin is close, but it looks like it only grabs remote metadata, is that right? Instead, what I'm wanting is to grab metadata embedded in the attached databases. At this point I've realized I need a lot more flexibility in metadata for my data model (esp. around formatting cell values and custom file exports), so rather than extending that plugin I'll continue working on a plugin specific to my app.

If I'm understanding your plugin code correctly, you query the db using the sync handle every time `get_metadata` is called, right? Won't this become a pretty big bottleneck if a hook into `render_cell` is trying to read metadata / plugin config?

> Making the get_metadata async won't improve the situation by itself as only some of the code paths accessing metadata use that hook. The other paths use the internal metadata dict.

I agree -- because things like `render_cell` will potentially want to read metadata/config, `get_metadata` should really remain sync and lightweight, which we can do with something like the remote-metadata plugin that could also poll metadata tables in attached databases.

That leaves your app, where it sounds like you want changes made by the user in the browser to be immediately reflected, rather than have to wait for the next metadata refresh. In this case I wonder if you could have your app make a sync write to the datasette object so the change would have the immediate effect, but then have a separate async polling mechanism to eventually write that change out to the database for long-term persistence. Then you'd have the best of both worlds, I think?
But probably not worth the trouble if your use cases are small (and/or you're not reading metadata/config from tight loops like render_cell).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",930807135,Plugin hook for dynamic metadata, https://github.com/simonw/datasette/issues/1384#issuecomment-1066143991,https://api.github.com/repos/simonw/datasette/issues/1384,1066143991,IC_kwDOBm6k_c4_jBD3,167160,khusmann,2022-03-13T17:13:09Z,2022-03-13T17:13:09Z,NONE,"Thanks for taking the time to reply @brandonrobertz , this is really helpful info.

> See ""Many small queries are efficient in sqlite"" for more information on the rationale here. Also note that in the datasette-live-config reference plugin, the DB connection is cached, so that eliminated most of the performance worries we had.

Ah, that's nifty! Yeah, then caching on the python side is likely a waste :) I'm new to working with sqlite, so it's super good to know that many-small-queries is a common pattern.

> I tested on very large Datasette deployments (hundreds of DBs, millions of rows).

For my reference, did you include a `render_cell` plugin calling `get_metadata` in those tests? I'm less concerned now that I know a little more about sqlite's caching, but that special situation will jump you to a few orders of magnitude above what the sqlite article describes (e.g. 200 vs 20,000 queries+metadata merges for a page displaying 100 rows of a 200 column table). It wouldn't scale with db size as much as the number of visible cells being rendered on the page, although they would be identical queries, I suppose, so they should cache well. (If you didn't test this specific situation, no worries -- I'm just trying to calibrate my intuition on this and can do my own benchmarks at some point.)

> Simon talked about eventually making something like this a standard feature of Datasette

Yeah, getting metadata (and static pages as well for that matter) from internal tables definitely has my vote for including as a standard feature! It's really nice to be able to distribute a single *.db with all the metadata and static pages bundled. My metadata are sufficiently complex/domain specific that it makes sense to continue on my own plugin for now, but I'll be thinking about more general parts I can spin off as possible contributions to liveconfig (if you're open to them) or other plugins in this ecosystem.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",930807135,Plugin hook for dynamic metadata, https://github.com/simonw/datasette/issues/1384#issuecomment-1066194130,https://api.github.com/repos/simonw/datasette/issues/1384,1066194130,IC_kwDOBm6k_c4_jNTS,167160,khusmann,2022-03-13T22:23:04Z,2022-03-13T22:23:04Z,NONE,"Ah, sorry, I didn't get what you were saying the first time. Using _metadata_local in that way makes total sense -- I agree, refreshing metadata for each cell seemed quite excessive. Now I'm on the same page! :)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",930807135,Plugin hook for dynamic metadata, https://github.com/simonw/datasette/issues/1387#issuecomment-873166836,https://api.github.com/repos/simonw/datasette/issues/1387,873166836,MDEyOklzc3VlQ29tbWVudDg3MzE2NjgzNg==,9308268,rayvoelker,2021-07-02T17:58:23Z,2021-07-02T17:58:23Z,NONE,"Thanks Simon for nailing that one down!
It does seem a little confusing that the ProxyPreserveHost directive is set to Off by default, but this config totally did the trick and fixed the issue:

```
ProxyPass http://127.0.0.1:8010/collection-analysis/
ProxyPreserveHost On
```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",935930820,absolute_url() behind a proxy assembles incorrect http://127.0.0.1:8001/ URLs, https://github.com/simonw/datasette/issues/1398#issuecomment-889599513,https://api.github.com/repos/simonw/datasette/issues/1398,889599513,IC_kwDOBm6k_c41BjYZ,192984,aitoehigie,2021-07-30T03:21:49Z,2021-07-30T03:21:49Z,NONE,Does the library support this now?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",947044667,Documentation on using Datasette as a library, https://github.com/simonw/datasette/issues/141#issuecomment-346974336,https://api.github.com/repos/simonw/datasette/issues/141,346974336,MDEyOklzc3VlQ29tbWVudDM0Njk3NDMzNg==,50138,janimo,2017-11-26T00:00:35Z,2017-11-26T00:00:35Z,NONE,FWIW I worked around this by setting TMPDIR to ~/tmp before running the command.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275814941,datasette publish can fail if /tmp is on a different device, https://github.com/simonw/datasette/issues/1415#issuecomment-1255603780,https://api.github.com/repos/simonw/datasette/issues/1415,1255603780,IC_kwDOBm6k_c5K1v5E,17532695,bendnorman,2022-09-22T22:06:10Z,2022-09-22T22:06:10Z,NONE,"This would be great! I just went through the process of figuring out the minimum permissions for a service account to run `datasette publish cloudrun` for [PUDL](https://github.com/catalyst-cooperative/pudl)'s [datasette deployment](https://data.catalyst.coop/). These are the roles I gave the service account (disclaimer: I'm not sure these are the minimum permissions):

- Cloud Build Service Account: The SA needs this role to publish the build on Cloud Build.
- Cloud Run Admin for the Cloud Run datasette service so the SA can deploy the build.
- I gave the SA the Storage Admin role on the bucket Cloud Build creates to store the build tar files.
- The Viewer Role is [required for storing build logs in the default bucket](https://cloud.google.com/build/docs/running-builds/submit-build-via-cli-api#permissions). More on this below!

The Viewer Role is a Basic IAM role that [Google does not recommend using](https://cloud.google.com/build/docs/running-builds/submit-build-via-cli-api#permissions):

> Caution: Basic roles include thousands of permissions across all Google Cloud services. In production environments, do not grant basic roles unless there is no alternative. Instead, grant the most limited [predefined roles](https://cloud.google.com/iam/docs/understanding-roles#predefined_roles) or [custom roles](https://cloud.google.com/iam/docs/understanding-custom-roles) that meet your needs.

If you don't grant the Viewer role, the `gcloud builds submit` command will successfully create a build but returns exit code 1, preventing the script from getting to the cloud run step:

```
ERROR: (gcloud.builds.submit) The build is running, and logs are being written to the default logs bucket. This tool can only stream logs if you are Viewer/Owner of the project and, if applicable, allowed by your VPC-SC security policy. The default logs bucket is always outside any VPC-SC security perimeter.
If you want your logs saved inside your VPC-SC perimeter, use your own bucket. See https://cloud.google.com/build/docs/securing-builds/store-manage-build-logs.
```

long stack trace...

```
CalledProcessError: Command 'gcloud builds submit --tag gcr.io/catalyst-cooperative-pudl/datasette' returned non-zero exit status 1.
```

You can store Cloud Build logs in a [user-created bucket](https://cloud.google.com/build/docs/securing-builds/store-manage-build-logs#store-custom-bucket), which only requires the Storage Admin role. However, you have to pass a config file to `gcloud builds submit`, which isn't possible with the current options for `datasette publish cloudrun`.

I propose we add an additional CLI option to `datasette publish cloudrun` called `--build-config` that allows users to pass a [config file](https://cloud.google.com/build/docs/running-builds/submit-build-via-cli-api#running_builds) specifying a user-created Cloud Build log bucket. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",959137143,feature request: document minimum permissions for service account for cloudrun, https://github.com/simonw/datasette/issues/1423#issuecomment-993876599,https://api.github.com/repos/simonw/datasette/issues/1423,993876599,IC_kwDOBm6k_c47PVp3,6165713,plpxsk,2021-12-14T18:48:09Z,2021-12-14T18:48:09Z,NONE,"Great feature. But what is the right way to enable this to show up? Currently, it seems I need to edit the URL to add, in the right place, `&_facet_size=max`. Is there another (easier) way to enable this feature?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",962391325,Show count of facet values if ?_facet_size=max, https://github.com/simonw/datasette/issues/1426#issuecomment-974711959,https://api.github.com/repos/simonw/datasette/issues/1426,974711959,IC_kwDOBm6k_c46GOyX,52649,tannewt,2021-11-20T21:11:51Z,2021-11-20T21:11:51Z,NONE,"I think another thing would be to make `/pages/robots.txt` work. That way you can use jinja to generate a desired robots.txt. I'm using it to allow the main index and what it links to to be crawled (but not the database pages directly).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",964322136,"Manage /robots.txt in Datasette core, block robots by default", https://github.com/simonw/datasette/issues/1426#issuecomment-985982668,https://api.github.com/repos/simonw/datasette/issues/1426,985982668,IC_kwDOBm6k_c46xObM,95520595,knowledgecamp12,2021-12-04T07:11:29Z,2021-12-04T07:11:29Z,NONE,You can generate xml site map from the online tools using https://tools4seo.site/xml-sitemap-generator. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",964322136,"Manage /robots.txt in Datasette core, block robots by default", https://github.com/simonw/datasette/issues/1432#issuecomment-941002127,https://api.github.com/repos/simonw/datasette/issues/1432,941002127,IC_kwDOBm6k_c44Fo2P,5802411,ashishdotme,2021-10-12T13:14:31Z,2021-10-12T13:14:39Z,NONE,"Any workaround for making it work with datasette-publish-vercel?
Currently getting the below error:

    module initialization error: __init__() got an unexpected keyword argument 'config'
    module initialization error __init__() got an unexpected keyword argument 'config'","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",969855774,Rename Datasette.__init__(config=) parameter to settings=, https://github.com/simonw/datasette/issues/1432#issuecomment-941585767,https://api.github.com/repos/simonw/datasette/issues/1432,941585767,IC_kwDOBm6k_c44H3Vn,5802411,ashishdotme,2021-10-12T21:23:19Z,2021-10-12T21:23:19Z,NONE,"Nevermind, had to remove the branch argument in the workflow to make vercel publish work","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",969855774,Rename Datasette.__init__(config=) parameter to settings=, https://github.com/simonw/datasette/issues/1432#issuecomment-945639639,https://api.github.com/repos/simonw/datasette/issues/1432,945639639,IC_kwDOBm6k_c44XVDX,5802411,ashishdotme,2021-10-18T10:44:56Z,2021-10-18T10:44:56Z,NONE,"@simonw I am getting the below issue again now, even after removing the branch argument from the vercel datasette plugin:

    module initialization error: init() got an unexpected keyword argument 'config'
    module initialization error init() got an unexpected keyword argument 'config'","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",969855774,Rename Datasette.__init__(config=) parameter to settings=, https://github.com/simonw/datasette/issues/1439#issuecomment-1059863997,https://api.github.com/repos/simonw/datasette/issues/1439,1059863997,IC_kwDOBm6k_c4_LD29,505230,karlcow,2022-03-06T00:57:57Z,2022-03-06T00:57:57Z,NONE,"Probably too late… but I have just seen this because of http://simonwillison.net/2022/Mar/5/dash-encoding/#atom-everything

And it reminded me of comma tools at W3C: http://www.w3.org/,tools

Example, the text version of the W3C homepage: https://www.w3.org/,text

> The challenge comes down to telling the difference between the following:
>
> * `/db/table` - an HTML table page `/db/table`
> * `/db/table.csv` - the CSV version of `/db/table` `/db/table,csv`
> * `/db/table.csv` - no this one is actually a database table called `table.csv` `/db/table.csv`
> * `/db/table.csv.csv` - the CSV version of `/db/table.csv` `/db/table.csv,csv`
> * `/db/table.csv.csv.csv` and so on... `/db/table.csv.csv,csv`

I haven't checked all the cases in the thread.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",973139047,Rethink how .ext formats (v.s. ?_format=) works before 1.0, https://github.com/simonw/datasette/issues/144#issuecomment-346427794,https://api.github.com/repos/simonw/datasette/issues/144,346427794,MDEyOklzc3VlQ29tbWVudDM0NjQyNzc5NA==,649467,mhalle,2017-11-22T17:55:45Z,2017-11-22T17:55:45Z,NONE,"Thanks. There is a way to use pip to grab apsw, which also lets you configure it (flags to build extensions, use an internal sqlite, etc). Don't know how that works as a dependency for another package, though.

On November 22, 2017 11:38:06 AM EST, Simon Willison wrote:
>I have a solution for FTS already, but I'm interested in apsw as a
>mechanism for allowing custom virtual tables to be written in Python
>(pysqlite only lets you write custom functions)
>
>Not having PyPI support is pretty tough though.
>I'm planning a plugin/extension system which would be ideal for things
>like an optional apsw mode, but that's a lot harder if apsw isn't in PyPI.
>
>--
>You are receiving this because you authored the thread.
>Reply to this email directly or view it on GitHub:
>https://github.com/simonw/datasette/issues/144#issuecomment-346405660
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276091279,apsw as alternative sqlite3 binding (for full text search), https://github.com/simonw/datasette/issues/1479#issuecomment-1114601882,https://api.github.com/repos/simonw/datasette/issues/1479,1114601882,IC_kwDOBm6k_c5Cb3ma,32839123,Rik-de-Kort,2022-05-02T08:10:27Z,2022-05-02T11:54:49Z,NONE,"Also ran into this issue today using `datasette package`. The stack trace takes up my whole PowerShell history, though (a RecursionError), but it also concerns the temporary directory.

Our development machines have a very zealous scanner that appears to insert itself between every call to the filesystem. I suspected that was causing some racing, but this turned out not to be the case: inserting `time.sleep(3)` on line 451 of `datasette/datasette/utils/__init__.py` does not make the problem go away. Commenting out the `tmp.cleanup()` line does.

The next error I get is docker-specific, so that probably does resolve the Datasette error here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1010112818,"Win32 ""used by another process"" error with datasette publish", https://github.com/simonw/datasette/issues/1479#issuecomment-929927144,https://api.github.com/repos/simonw/datasette/issues/1479,929927144,IC_kwDOBm6k_c43bY_o,1244799,soobrosa,2021-09-29T07:49:40Z,2021-09-29T07:49:40Z,NONE,"My search yielded these four entries: https://github.com/simonw/datasette/issues?q=PermissionError%3A+%5BWinError+32%5D+

Maybe this is the closest hit? https://github.com/simonw/datasette/issues/744 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1010112818,"Win32 ""used by another process"" error with datasette publish", https://github.com/simonw/datasette/issues/1479#issuecomment-930071625,https://api.github.com/repos/simonw/datasette/issues/1479,930071625,IC_kwDOBm6k_c43b8RJ,76450761,kirajano,2021-09-29T11:01:30Z,2021-09-29T11:01:30Z,NONE,"Thanks, but this one has a different error type. Unfortunately, still not working.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1010112818,"Win32 ""used by another process"" error with datasette publish", https://github.com/simonw/datasette/issues/1497#issuecomment-1387433455,https://api.github.com/repos/simonw/datasette/issues/1497,1387433455,IC_kwDOBm6k_c5Sso3v,270255,adamalton,2023-01-18T17:13:45Z,2023-01-18T17:13:45Z,NONE,"You may have just been talking to yourself here, but I found your documentation of the process incredibly useful when I hit the same error myself. Thanks!
🌈 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1034535001,"Publish to Docker Hub failing with ""libcrypt.so.1: cannot open shared object file""", https://github.com/simonw/datasette/issues/1505#issuecomment-970188065,https://api.github.com/repos/simonw/datasette/issues/1505,970188065,IC_kwDOBm6k_c450-Uh,7094907,Segerberg,2021-11-16T11:40:52Z,2021-11-16T11:40:52Z,NONE,A suggestion is to have the option to choose an arbitrary delimiter (and quoting characters).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1052247023,Datasette should have an option to output CSV with semicolons, https://github.com/simonw/datasette/issues/1519#issuecomment-983890815,https://api.github.com/repos/simonw/datasette/issues/1519,983890815,IC_kwDOBm6k_c46pPt_,157158,phubbard,2021-12-01T17:50:09Z,2021-12-01T17:50:09Z,NONE,"thanks so very much for the prompt attention and fix! Plus, the animated GIF showing the bug is just extra and I love it. Interactions like this are why I love open source.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058790545,base_url is omitted in JSON and CSV views, https://github.com/simonw/datasette/issues/1522#issuecomment-974607456,https://api.github.com/repos/simonw/datasette/issues/1522,974607456,IC_kwDOBm6k_c46F1Rg,17906,mrchrisadams,2021-11-20T07:10:11Z,2021-11-20T07:10:11Z,NONE,"As a sanity check, would it be worth looking at trying to push the multi-process container to another provider of a knative / cloud run / tekton offering? I have a somewhat similar use case for a future project, so I've been very grateful for you sharing all the progress in this issue.
As I understand it, Scaleway also offers a very similar service using what appear to be many similar components, so it might at least show whether this is an issue with more than one knative-based FaaS provider:

https://www.scaleway.com/en/serverless-containers/
https://developers.scaleway.com/en/products/containers/api/#main-features
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236,Deploy a live instance of demos/apache-proxy, https://github.com/simonw/datasette/issues/1522#issuecomment-976023405,https://api.github.com/repos/simonw/datasette/issues/1522,976023405,IC_kwDOBm6k_c46LO9t,360895,steren,2021-11-23T00:08:07Z,2021-11-23T00:08:07Z,NONE,"If you suspect that Cloud Run throttled CPU could be the cause, you can request to have CPU always allocated with `gcloud beta run deploy --no-cpu-throttling` ([read more](https://cloud.google.com/blog/products/serverless/cloud-run-gets-always-on-cpu-allocation))

It could also be the Cloud Run sandbox that somehow gets in the way here, in which case I recommend testing with the second generation execution environment: `gcloud beta run deploy --execution-environment gen2`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236,Deploy a live instance of demos/apache-proxy, https://github.com/simonw/datasette/issues/1528#issuecomment-988468238,https://api.github.com/repos/simonw/datasette/issues/1528,988468238,IC_kwDOBm6k_c466tQO,30934,20after4,2021-12-08T03:35:45Z,2021-12-08T03:35:45Z,NONE,"FWIW I implemented something similar with a bit of plugin code:

```python
from pathlib import Path
from typing import Mapping

from datasette import hookimpl
from datasette.app import Datasette


@hookimpl
def canned_queries(datasette: Datasette, database: str) -> Mapping[str, str]:
    # load ""canned queries"" from the filesystem under
    # www/sql/db/query_name.sql
    queries = {}
    sqldir = Path(__file__).parent.parent / ""sql""
    if database:
        sqldir = sqldir / database
    if not sqldir.is_dir():
        return queries
    for f in sqldir.glob('*.sql'):
        try:
            sql = f.read_text('utf8').strip()
            if not len(sql):
                # log() is a small logging helper defined elsewhere in the plugin
                log(f""Skipping empty canned query file: {f}"")
                continue
            queries[f.stem] = {""sql"": sql}
        except OSError as err:
            log(err)
    return queries
```","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",1060631257,"Add new `""sql_file""` key to Canned Queries in metadata?", https://github.com/simonw/datasette/issues/153#issuecomment-348252037,https://api.github.com/repos/simonw/datasette/issues/153,348252037,MDEyOklzc3VlQ29tbWVudDM0ODI1MjAzNw==,20264,ftrain,2017-11-30T16:59:00Z,2017-11-30T16:59:00Z,NONE,"WOW!

-- Paul Ford // (646) 369-7128 // @ftrain

On Thu, Nov 30, 2017 at 11:47 AM, Simon Willison wrote:

> Remaining work on this now lives in a milestone:
> https://github.com/simonw/datasette/milestone/6
>
> —
> You are receiving this because you were mentioned.
> Reply to this email directly, view it on GitHub
> ,
> or mute the thread
> >
> .
> ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/1532#issuecomment-981966693,https://api.github.com/repos/simonw/datasette/issues/1532,981966693,IC_kwDOBm6k_c46h59l,30934,20after4,2021-11-29T19:56:52Z,2021-11-29T19:56:52Z,NONE,FWIW I've written some web components that consume the json api and I think it's a really nice way to work with datasette. I like the combination with datasette+sqlite as a back-end feeding data to a front-end that's entirely javascript + html.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1065429936,Use datasette-table Web Component to guide the design of the JSON API for 1.0, https://github.com/simonw/datasette/issues/1532#issuecomment-982745406,https://api.github.com/repos/simonw/datasette/issues/1532,982745406,IC_kwDOBm6k_c46k4E-,30934,20after4,2021-11-30T15:28:57Z,2021-11-30T15:28:57Z,NONE,"It's a really great API and the documentation is really great too. Honestly, in more than 20 years of professional experience, I haven't worked with any software API that was more of a joy to use. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1065429936,Use datasette-table Web Component to guide the design of the JSON API for 1.0, https://github.com/simonw/datasette/issues/155#issuecomment-347714314,https://api.github.com/repos/simonw/datasette/issues/155,347714314,MDEyOklzc3VlQ29tbWVudDM0NzcxNDMxNA==,388154,wsxiaoys,2017-11-29T00:46:25Z,2017-11-29T00:46:25Z,NONE,"```
CREATE TABLE rhs (
    id INTEGER PRIMARY KEY,
    name TEXT
);
CREATE TABLE lhs (
    symbol INTEGER PRIMARY KEY,
    FOREIGN KEY (symbol) REFERENCES rhs(id)
);
INSERT INTO rhs VALUES (1, ""foo"");
INSERT INTO rhs VALUES (2, ""bar"");
INSERT INTO lhs VALUES (1);
INSERT INTO lhs VALUES (2);
```

It's expected that in lhs's view, foo / bar should be displayed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",277589569,A primary key column that has foreign key restriction associated won't rendering label column, https://github.com/simonw/datasette/issues/1590#issuecomment-1010556333,https://api.github.com/repos/simonw/datasette/issues/1590,1010556333,IC_kwDOBm6k_c48O92t,1001306,eelkevdbos,2022-01-12T02:03:59Z,2022-01-12T02:03:59Z,NONE,"Thank you for the quick reply! Just a quick observation, I am running this locally without a proxy, whereas your fly example seems to be running behind an apache proxy (if the name is accurate). Can it be that the apache proxy strips the prefix before it passes on the request to the daphne backend?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1099723916,Table+query JSON and CSV links broken when using `base_url` setting, https://github.com/simonw/datasette/issues/1590#issuecomment-1010559681,https://api.github.com/repos/simonw/datasette/issues/1590,1010559681,IC_kwDOBm6k_c48O-rB,1001306,eelkevdbos,2022-01-12T02:10:20Z,2022-01-12T02:10:20Z,NONE,"In my example, path matching happens at the application layer (being the Django channels URLRouter). That might be a somewhat exotic solution that would normally be solved by a proxy like Apache or Nginx.
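Roughly, the setup looks something like this (a minimal sketch rather than my exact code; the database path is made up, and it assumes a Datasette version whose constructor accepts `settings=`):

```python
from channels.routing import URLRouter
from django.urls import re_path
from datasette.app import Datasette

# base_url matches the prefix the router mounts Datasette under.
datasette = Datasette(
    ['/srv/data/mydb.db'],  # hypothetical database file
    settings={'base_url': '/datasette/'},
)

application = URLRouter([
    # Requests starting with datasette/ go to Datasette's ASGI app.
    re_path(r'^datasette/', datasette.app()),
])
```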
However, in my specific use case, this is a ""feature"" enabling me to do simple management of databases and metadata from within a Django admin app instance mapped in that same router.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1099723916,Table+query JSON and CSV links broken when using `base_url` setting, https://github.com/simonw/datasette/issues/1608#issuecomment-1017993482,https://api.github.com/repos/simonw/datasette/issues/1608,1017993482,IC_kwDOBm6k_c48rVkK,316517,astrojuanlu,2022-01-20T22:46:16Z,2022-01-20T22:46:16Z,NONE,Or you can use https://sphinx-version-warning.readthedocs.io/! 😄 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1109808154,Documentation should clarify /stable/ vs /latest/, https://github.com/simonw/datasette/issues/161#issuecomment-350108113,https://api.github.com/repos/simonw/datasette/issues/161,350108113,MDEyOklzc3VlQ29tbWVudDM1MDEwODExMw==,388154,wsxiaoys,2017-12-07T22:02:24Z,2017-12-07T22:02:24Z,NONE,"It's not throwing the validation error anymore, but I still cannot run the following query:

```
WITH RECURSIVE cnt(x) AS (
  SELECT 1
  UNION ALL
  SELECT x+1 FROM cnt LIMIT 10
)
SELECT x FROM cnt;
```

I got `near ""WITH"": syntax error`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278814220,Support WITH query , https://github.com/simonw/datasette/issues/161#issuecomment-350182904,https://api.github.com/repos/simonw/datasette/issues/161,350182904,MDEyOklzc3VlQ29tbWVudDM1MDE4MjkwNA==,388154,wsxiaoys,2017-12-08T06:18:12Z,2017-12-08T06:18:12Z,NONE,"You're right: got this resolved after upgrading the sqlite version. Thank you!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278814220,Support WITH query , https://github.com/simonw/datasette/issues/1615#issuecomment-1023997327,https://api.github.com/repos/simonw/datasette/issues/1615,1023997327,IC_kwDOBm6k_c49CPWP,369053,aidansteele,2022-01-28T08:37:36Z,2022-01-28T08:37:36Z,NONE,"Oops, it feels like this should perhaps be migrated to GitHub Discussions - sorry! I don't think I have the ability to do that 😅","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1117132741,Potential simplified publishing mechanism, https://github.com/simonw/datasette/issues/1618#issuecomment-1028294089,https://api.github.com/repos/simonw/datasette/issues/1618,1028294089,IC_kwDOBm6k_c49SoXJ,770231,strada,2022-02-02T19:42:03Z,2022-02-02T19:42:03Z,NONE,"Thanks for looking into this. It might have been nice if `explain` surfaced these function calls. Looks like `explain query plan` does, but only for basic queries.
```
sqlite-utils fixtures.db 'explain query plan select * from pragma_function_list(), pragma_database_list(), pragma_module_list()' -t
  id    parent    notused  detail
----  --------  ---------  ------------------------------------------------
   4         0          0  SCAN pragma_function_list VIRTUAL TABLE INDEX 0:
   8         0          0  SCAN pragma_database_list VIRTUAL TABLE INDEX 0:
  12         0          0  SCAN pragma_module_list VIRTUAL TABLE INDEX 0:
```

```
sqlite-utils fixtures.db 'explain query plan select * from pragma_function_list() as fl, pragma_database_list() as dl, pragma_module_list() as ml' -t
  id    parent    notused  detail
----  --------  ---------  ------------------------------
   4         0          0  SCAN fl VIRTUAL TABLE INDEX 0:
   8         0          0  SCAN dl VIRTUAL TABLE INDEX 0:
  12         0          0  SCAN ml VIRTUAL TABLE INDEX 0:
```
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1121121305,"Reconsider policy on blocking queries containing the string ""pragma""", https://github.com/simonw/datasette/issues/1619#issuecomment-1353721442,https://api.github.com/repos/simonw/datasette/issues/1619,1353721442,IC_kwDOBm6k_c5QsCZi,2090382,noslouch,2022-12-15T21:20:53Z,2022-12-15T21:20:53Z,NONE,"I'm also getting bit by this. I'm trying to set up an nginx reverse proxy in front of multiple datasette backends. When I run it locally or behind the proxy, I see the `base_url` value added a second time to the path for various action links on table pages (view as JSON, sort by column, etc).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1121583414,JSON link on row page is 404 if base_url setting is used, https://github.com/simonw/datasette/issues/1619#issuecomment-1455196849,https://api.github.com/repos/simonw/datasette/issues/1619,1455196849,IC_kwDOBm6k_c5WvIqx,969875,BryantD,2023-03-05T20:29:55Z,2023-03-05T20:30:14Z,NONE,"I have this same issue, which is happening with both json links and facets. It is not happening with column sort links in the gear popup menus, but it is happening with the sort arrow that results after you use one of those links. I'm using Apache as a proxy to Datasette; the relevant configs are:

```
ProxyPass /datasette/ http://127.0.0.1:8000/datasette/ nocanon
ProxyPreserveHost on
```

```
{
    ""base_url"": ""/datasette/""
}
```

If it would be useful to get a look at the running installation via the Web, Simon, let me know.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1121583414,JSON link on row page is 404 if base_url setting is used, https://github.com/simonw/datasette/issues/1619#issuecomment-1483009959,https://api.github.com/repos/simonw/datasette/issues/1619,1483009959,IC_kwDOBm6k_c5YZO-n,6613091,henrikek,2023-03-24T15:38:04Z,2023-03-24T15:38:04Z,NONE,"I also have the same problem when running behind an apache proxy server with base_url. However, I have researched the problem a bit and have come to the conclusion that if you change _facet_date to _facet in the following https://github.com/simonw/datasette/blob/3feed1f66e2b746f349ee56970a62246a18bb164/datasette/facets.py#LL493C57-L493C68, the facets work.
But I'm not sure if this has other consequences?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1121583414,JSON link on row page is 404 if base_url setting is used, https://github.com/simonw/datasette/issues/1624#issuecomment-1261194164,https://api.github.com/repos/simonw/datasette/issues/1624,1261194164,IC_kwDOBm6k_c5LLEu0,38532,palfrey,2022-09-28T16:54:22Z,2022-09-28T16:54:22Z,NONE,https://github.com/simonw/datasette-cors seems to work around this,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1122427321,Index page `/` has no CORS headers, https://github.com/simonw/datasette/issues/1633#issuecomment-1034222709,https://api.github.com/repos/simonw/datasette/issues/1633,1034222709,IC_kwDOBm6k_c49pPx1,6613091,henrikek,2022-02-09T21:47:02Z,2022-02-09T21:47:02Z,NONE,Is adding the base_url line to url_builder.py the correct solution?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1129052172,base_url or prefix does not work with _exact match, https://github.com/simonw/datasette/issues/1633#issuecomment-1111955628,https://api.github.com/repos/simonw/datasette/issues/1633,1111955628,IC_kwDOBm6k_c5CRxis,6613091,henrikek,2022-04-28T09:12:56Z,2022-04-28T09:12:56Z,NONE,I have verified that the problem with base_url still exists in the latest version 0.61.1. I would need some guidance on whether my code change suggestion is correct or whether base_url should be handled in some other code?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1129052172,base_url or prefix does not work with _exact match, https://github.com/simonw/datasette/issues/1634#issuecomment-1065334891,https://api.github.com/repos/simonw/datasette/issues/1634,1065334891,IC_kwDOBm6k_c4_f7hr,208018,dholth,2022-03-11T17:38:08Z,2022-03-11T17:38:08Z,NONE,I noticed the image was large when using fly. Is it possible to use a -slim base?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1131295060,Update Dockerfile generated by `datasette publish`, https://github.com/simonw/datasette/issues/1646#issuecomment-1172930092,https://api.github.com/repos/simonw/datasette/issues/1646,1172930092,IC_kwDOBm6k_c5F6X4s,1473102,mustafa0x,2022-07-02T17:12:18Z,2022-07-02T17:12:18Z,NONE,"I'm affected by this as well. Would be nice to be able to pass in an extension, e.g. `--extension=sqlite3`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1157182254,Configuration directory mode does not pick up other file extensions than .db, https://github.com/simonw/datasette/issues/1688#issuecomment-1079550754,https://api.github.com/repos/simonw/datasette/issues/1688,1079550754,IC_kwDOBm6k_c5AWKMi,9020979,hydrosquall,2022-03-26T01:27:27Z,2022-03-26T03:16:29Z,NONE,"> Is there a way to serve a static assets when using the plugins/ directory method instead of installing plugins as a new python package?

As a workaround, I found I can serve my statics from a non-plugin-specific folder using the [--static](https://docs.datasette.io/en/stable/custom_templates.html#serving-static-files) CLI flag.
```bash
datasette ~/Library/Safari/History.db \
  --plugins-dir=plugins/ \
  --static assets:dist/
```
It's not ideal, because it means I'll have to change the cache pattern path depending on how the plugin is running (via pip install or as a one-off script), but it's usable as a workaround.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1181432624,[plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins?,
https://github.com/simonw/datasette/issues/1688#issuecomment-1079806857,https://api.github.com/repos/simonw/datasette/issues/1688,1079806857,IC_kwDOBm6k_c5AXIuJ,9020979,hydrosquall,2022-03-27T01:01:14Z,2022-03-27T01:01:14Z,NONE,"Thank you! I went through the cookiecutter template, and published my first package here: https://github.com/hydrosquall/datasette-nteract-data-explorer","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1181432624,[plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins?,
https://github.com/simonw/datasette/issues/1706#issuecomment-1325164933,https://api.github.com/repos/simonw/datasette/issues/1706,1325164933,IC_kwDOBm6k_c5O_GmF,1176293,ar-jan,2022-11-23T14:34:54Z,2022-11-23T14:34:54Z,NONE,This would be helpful.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1198822563,"[feature] immutable mode for a directory, not just individual sqlite file",
https://github.com/simonw/datasette/issues/1706#issuecomment-1344669160,https://api.github.com/repos/simonw/datasette/issues/1706,1344669160,IC_kwDOBm6k_c5QJgXo,419145,rdmurphy,2022-12-09T19:11:40Z,2022-12-09T19:11:40Z,NONE,"Ah, yes! I was just trying to do this and had the same issue. +1 to this!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1198822563,"[feature] immutable mode for a directory, not just individual sqlite file",
https://github.com/simonw/datasette/issues/1713#issuecomment-1099443468,https://api.github.com/repos/simonw/datasette/issues/1713,1099443468,IC_kwDOBm6k_c5BiC0M,9308268,rayvoelker,2022-04-14T17:26:27Z,2022-04-14T17:26:27Z,NONE,"An awesome plugin feature would be the ability to save a query (and possibly even its results) to a GitHub gist. Being able to share results that way would be super fantastic. Possibly even in Jupyter Notebook format, since GitHub and GitHub gists nicely render those!
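For the export half, something like this minimal sketch might be all a plugin would need (a hypothetical helper posting to GitHub's gist API with a personal access token):
```python
import json
import urllib.request

def save_query_to_gist(sql, rows, token):
    # Bundle the query and its results into a secret gist.
    payload = {
        'description': 'Datasette query snapshot',
        'public': False,
        'files': {
            'query.sql': {'content': sql},
            'results.json': {'content': json.dumps(rows, indent=2)},
        },
    }
    request = urllib.request.Request(
        'https://api.github.com/gists',
        data=json.dumps(payload).encode(),
        headers={
            'Authorization': f'token {token}',
            'Accept': 'application/vnd.github+json',
        },
    )
    # Returns the shareable URL of the newly created gist.
    with urllib.request.urlopen(request) as response:
        return json.load(response)['html_url']
```
The import half could read those same two files back out of the gist.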
I know there's the handy datasette-saved-queries plugin, but a button that could export stuff out, and then possibly even import it back in, would be great. I'm sort of thinking of the way Google Colab allows you to save a notebook to GitHub and then pull it back in, which is a really great workflow. ![image](https://user-images.githubusercontent.com/9308268/163441612-9ad2649f-c73e-4557-aaf2-e3d0fdc48fbf.png) https://github.com/cincinnatilibrary/collection-analysis/blob/master/reports/colab_datasette_example.ipynb","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1203943272,Datasette feature for publishing snapshots of query results,
https://github.com/simonw/datasette/issues/1727#issuecomment-1111448928,https://api.github.com/repos/simonw/datasette/issues/1727,1111448928,IC_kwDOBm6k_c5CP11g,716529,glyph,2022-04-27T20:27:05Z,2022-04-27T20:27:05Z,NONE,"You don't want to re-use an SQLite connection from multiple threads anyway: https://www.sqlite.org/threadsafe.html
Multiple connections can operate on the file in parallel, but a single connection can't:

> Multi-thread. In this mode, SQLite can be safely used by multiple threads **provided that no single database connection is used simultaneously in two or more threads**.

(emphasis mine)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1217759117,Research: demonstrate if parallel SQL queries are worthwhile,
https://github.com/simonw/datasette/issues/1727#issuecomment-1111451790,https://api.github.com/repos/simonw/datasette/issues/1727,1111451790,IC_kwDOBm6k_c5CP2iO,716529,glyph,2022-04-27T20:30:33Z,2022-04-27T20:30:33Z,NONE,"> I should try seeing what happens with WAL mode enabled.

I've only skimmed the above, but it looks like you're doing mainly read-only queries? WAL mode is primarily about better interactions between writers and readers.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1217759117,Research: demonstrate if parallel SQL queries are worthwhile,
https://github.com/simonw/datasette/issues/173#issuecomment-823961091,https://api.github.com/repos/simonw/datasette/issues/173,823961091,MDEyOklzc3VlQ29tbWVudDgyMzk2MTA5MQ==,3747136,ColinMaudry,2021-04-21T10:37:05Z,2021-04-21T10:37:36Z,NONE,"I have the feeling that 95% of the text visible to users lives in the template files ([datasette/templates](https://github.com/simonw/datasette/tree/main/datasette/templates)). The Python code mainly contains error messages.

In the current situation, the best way to provide a localized frontend is to translate the templates and [configure datasette to use them](https://docs.datasette.io/en/stable/custom_templates.html). I think I'm going to do it for French.

If we want localization to be better integrated, then for the Python code I think [gettext](https://docs.python.org/3/library/gettext.html#localizing-your-application) is the way to go. The .po files can be translated with user-friendly tools such as Transifex and Crowdin. For the templates, I'm not sure how we could do it cleanly and in a way that's easy to maintain. Maybe the tools above could parse the HTML and detect the strings to be translated.
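For the Python side, here is a minimal sketch of the gettext approach (the datasette domain name, the locale/ directory layout and the example string are all hypothetical):
```python
import gettext
from jinja2 import Environment

# Load the compiled French catalog, assuming the conventional layout
# locale/fr/LC_MESSAGES/datasette.mo compiled from the .po files.
fr = gettext.translation('datasette', localedir='locale', languages=['fr'])

# Python code would look strings up through the catalog:
print(fr.gettext('Query took'))

# Templates could use the jinja2 i18n extension, which adds
# {% trans %} blocks backed by the same catalog:
env = Environment(extensions=['jinja2.ext.i18n'])
env.install_gettext_translations(fr)
print(env.from_string('{% trans %}Query took{% endtrans %}').render())
```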
In any case, implementing l10n is just the first step: a continuous process must be set up to maintain the translations and produce new ones as datasette keeps getting new features.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",281110295,I18n and L10n support,
https://github.com/simonw/datasette/issues/173#issuecomment-826784306,https://api.github.com/repos/simonw/datasette/issues/173,826784306,MDEyOklzc3VlQ29tbWVudDgyNjc4NDMwNg==,3747136,ColinMaudry,2021-04-26T12:10:01Z,2021-04-26T12:10:01Z,NONE,I found a neat tutorial to set up gettext with jinja2: http://siongui.github.io/2016/01/17/i18n-python-web-application-by-gettext-jinja2/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",281110295,I18n and L10n support,
https://github.com/simonw/datasette/issues/1732#issuecomment-1115542067,https://api.github.com/repos/simonw/datasette/issues/1732,1115542067,IC_kwDOBm6k_c5CfdIz,52649,tannewt,2022-05-03T01:50:44Z,2022-05-03T01:50:44Z,NONE,"I haven’t set one up, unfortunately. My time is very limited because we just had a baby.

On Mon, May 2, 2022, at 6:42 PM, Simon Willison wrote:

> Thanks, this definitely sounds like a bug. Do you have simple steps to reproduce this?
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1221849746,Custom page variables aren't decoded,
https://github.com/simonw/datasette/issues/1751#issuecomment-1140321380,https://api.github.com/repos/simonw/datasette/issues/1751,1140321380,IC_kwDOBm6k_c5D9-xk,408765,knutwannheden,2022-05-28T19:52:17Z,2022-05-28T19:52:17Z,NONE,Closing in favor of existing issue #1298.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1251710928,Add scrollbars to table presentation in default layout,
https://github.com/simonw/datasette/issues/176#issuecomment-356115657,https://api.github.com/repos/simonw/datasette/issues/176,356115657,MDEyOklzc3VlQ29tbWVudDM1NjExNTY1Nw==,4313116,wulfmann,2018-01-08T22:22:32Z,2018-01-08T22:22:32Z,NONE,"This project probably would not be the place for that. This is a layer for SQLite specifically. It solves a similar problem to GraphQL, so adding that here wouldn't make sense.

Here's an example I found via Google that uses micro to run a GraphQL microservice; you'd then just need to connect your DB. https://github.com/timneutkens/micro-graphql","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint,
https://github.com/simonw/datasette/issues/176#issuecomment-356161672,https://api.github.com/repos/simonw/datasette/issues/176,356161672,MDEyOklzc3VlQ29tbWVudDM1NjE2MTY3Mg==,173848,yozlet,2018-01-09T02:35:35Z,2018-01-09T02:35:35Z,NONE,"@wulfmann I think I disagree, though I'm not entirely sure what you mean by that first paragraph. The JSON API that Datasette currently exposes is quite different to GraphQL. Furthermore, there's no ""just"" about connecting micro-graphql to a DB; at least, no more ""just"" than adding any other API.
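To make that concrete, here's a rough sketch of the hand-written schema wiring that even a single table needs in a graphene-based endpoint (the database file, table and column names are made up):
```python
import sqlite3
import graphene

# One ObjectType per table, written by hand: exactly the kind of
# schema configuration Datasette would have to generate for you.
class Row(graphene.ObjectType):
    id = graphene.ID()
    name = graphene.String()

class Query(graphene.ObjectType):
    rows = graphene.List(Row)

    def resolve_rows(root, info):
        connection = sqlite3.connect('fixtures.db')  # hypothetical database
        cursor = connection.execute('select id, name from facetable')
        return [Row(id=r[0], name=r[1]) for r in cursor]

schema = graphene.Schema(query=Query)
print(schema.execute('{ rows { id name } }').data)
```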
You still need to configure the schema, which is exactly the kind of thing that Datasette does for its JSON API. This is why I think GraphQL's a good fit here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint,
https://github.com/simonw/datasette/issues/176#issuecomment-356175667,https://api.github.com/repos/simonw/datasette/issues/176,356175667,MDEyOklzc3VlQ29tbWVudDM1NjE3NTY2Nw==,4313116,wulfmann,2018-01-09T04:19:03Z,2018-01-09T04:19:03Z,NONE,"@yozlet Yes, I think I was confused when I posted my original comment. I see your main point now and am in agreement.","{""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 2, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint,
https://github.com/simonw/datasette/issues/176#issuecomment-359697938,https://api.github.com/repos/simonw/datasette/issues/176,359697938,MDEyOklzc3VlQ29tbWVudDM1OTY5NzkzOA==,7193,gijs,2018-01-23T07:17:56Z,2018-01-23T07:17:56Z,NONE,👍 I'd like this too!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint,
https://github.com/simonw/datasette/issues/176#issuecomment-368625350,https://api.github.com/repos/simonw/datasette/issues/176,368625350,MDEyOklzc3VlQ29tbWVudDM2ODYyNTM1MA==,7431774,wuhland,2018-02-26T19:44:11Z,2018-02-26T19:44:11Z,NONE,Great idea!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint,
https://github.com/simonw/datasette/issues/176#issuecomment-431867885,https://api.github.com/repos/simonw/datasette/issues/176,431867885,MDEyOklzc3VlQ29tbWVudDQzMTg2Nzg4NQ==,634572,eads,2018-10-22T15:24:57Z,2018-10-22T15:24:57Z,NONE,"I'd like this as well. It would let me access Datasette-driven projects from GatsbyJS the same way I can access Postgres DBs via Hasura. While I don't see SQLite replacing Postgres for the 50m-row datasets I sometimes have to work with, there's a whole class of smaller datasets that are great with Datasette but would currently have to find another option.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint,
https://github.com/simonw/datasette/issues/176#issuecomment-548508237,https://api.github.com/repos/simonw/datasette/issues/176,548508237,MDEyOklzc3VlQ29tbWVudDU0ODUwODIzNw==,634572,eads,2019-10-31T18:25:44Z,2019-10-31T18:25:44Z,NONE,"👋 I'd be interested in building this out in Q1 or Q2 of 2020 if nobody has tackled it by then.
I would love to integrate Datasette into @thechicagoreporter's practice, but we're also fully committed to GraphQL moving forward.","{""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 2, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint,
https://github.com/simonw/datasette/issues/176#issuecomment-617208503,https://api.github.com/repos/simonw/datasette/issues/176,617208503,MDEyOklzc3VlQ29tbWVudDYxNzIwODUwMw==,12976,nkirsch,2020-04-21T14:16:24Z,2020-04-21T14:16:24Z,NONE,"@eads I'm interested in helping, if there's still a need...","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint,
https://github.com/simonw/datasette/issues/1771#issuecomment-1356657451,https://api.github.com/repos/simonw/datasette/issues/1771,1356657451,IC_kwDOBm6k_c5Q3PMr,1473102,mustafa0x,2022-12-18T04:04:32Z,2022-12-18T04:04:32Z,NONE,"The problem is:
```
.select-wrapper select:focus {
    outline: none;
}
```
I sometimes add this JS:
```
window.addEventListener('keydown', function check_tab(e) {
    if (e.key === 'Tab') {
        document.documentElement.classList.add('user-is-tabbing')
        window.removeEventListener('keydown', check_tab)
    }
})
```
and then in the CSS I use an `html.user-is-tabbing` selector to undo any outlines I removed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1306984363,minor a11y: