html_url,issue_url,id,node_id,user,user_label,created_at,updated_at,author_association,body,reactions,issue,issue_label,performed_via_github_app https://github.com/simonw/datasette/issues/46#issuecomment-344161226,https://api.github.com/repos/simonw/datasette/issues/46,344161226,MDEyOklzc3VlQ29tbWVudDM0NDE2MTIyNg==,9599,simonw,2017-11-14T06:41:21Z,2017-11-14T06:41:21Z,OWNER,Spatial extensions would be really useful too. https://www.gaia-gis.it/spatialite-2.1/SpatiaLite-manual.html,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/175#issuecomment-353424169,https://api.github.com/repos/simonw/datasette/issues/175,353424169,MDEyOklzc3VlQ29tbWVudDM1MzQyNDE2OQ==,9599,simonw,2017-12-21T18:33:55Z,2017-12-21T18:33:55Z,OWNER,Done - thanks for curating these: https://github.com/topics/automatic-api,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",282971961,"Add project topic ""automatic-api""", https://github.com/simonw/datasette/issues/97#issuecomment-392602334,https://api.github.com/repos/simonw/datasette/issues/97,392602334,MDEyOklzc3VlQ29tbWVudDM5MjYwMjMzNA==,9599,simonw,2018-05-28T20:57:21Z,2018-05-28T20:57:21Z,OWNER,"The `/.json` endpoint is more of an implementation detail of the homepage at this point. A better, documented ( http://datasette.readthedocs.io/en/stable/introspection.html#inspect ) endpoint for finding all of the databases and tables is https://parlgov.datasettes.com/-/inspect.json","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274022950,Link to JSON for the list of tables , https://github.com/simonw/datasette/issues/339#issuecomment-404565566,https://api.github.com/repos/simonw/datasette/issues/339,404565566,MDEyOklzc3VlQ29tbWVudDQwNDU2NTU2Ng==,9599,simonw,2018-07-12T16:08:42Z,2018-07-12T16:08:42Z,OWNER,I'm going to turn this into an issue about better supporting the above option.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",340396247,Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way, https://github.com/simonw/datasette/issues/308#issuecomment-405971920,https://api.github.com/repos/simonw/datasette/issues/308,405971920,MDEyOklzc3VlQ29tbWVudDQwNTk3MTkyMA==,9599,simonw,2018-07-18T15:27:12Z,2018-07-18T15:27:12Z,OWNER,"It looks like there are a few extra options we should support: https://devcenter.heroku.com/articles/heroku-cli-commands ``` -t, --team=team team to use --region=region specify region for the app to run in --space=space the private space to create the app in ``` Since these differ from the options for Zeit Now I think this means splitting up `datasette publish now` and `datasette publish Heroku` into separate subcommands.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",330826972,"Support extra Heroku apps:create options - region, space, team", https://github.com/simonw/datasette/issues/176#issuecomment-431867885,https://api.github.com/repos/simonw/datasette/issues/176,431867885,MDEyOklzc3VlQ29tbWVudDQzMTg2Nzg4NQ==,634572,eads,2018-10-22T15:24:57Z,2018-10-22T15:24:57Z,NONE,"I'd like this as well. 
It would let me access Datasette-driven projects from GatsbyJS the same way I can access Postgres DBs via Hasura. While I don't see SQLite replacing Postgres for the 50m row datasets I sometimes have to work with, there's a whole class of smaller datasets that are great with Datasette but currently would find another option.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/370#issuecomment-435974786,https://api.github.com/repos/simonw/datasette/issues/370,435974786,MDEyOklzc3VlQ29tbWVudDQzNTk3NDc4Ng==,9599,simonw,2018-11-05T18:06:56Z,2018-11-05T18:06:56Z,OWNER,"I've been thinking a bit about ways of using Jupyter Notebook more effectively with Datasette (thinks like a `publish_dataframes(df1, df2, df3)` function which publishes some Pandas dataframes and returns you a URL to a new hosted Datasette instance) but you're right, Jupyter Lab is potentially a much more interesting fit.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377155320,Integration with JupyterLab, https://github.com/simonw/datasette/issues/120#issuecomment-439421164,https://api.github.com/repos/simonw/datasette/issues/120,439421164,MDEyOklzc3VlQ29tbWVudDQzOTQyMTE2NA==,36796532,ad-si,2018-11-16T15:05:18Z,2018-11-16T15:05:18Z,NONE,This would be an awesome feature ❤️ ,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275087397,Plugin that adds an authentication layer of some sort, https://github.com/simonw/datasette/issues/391#issuecomment-450964512,https://api.github.com/repos/simonw/datasette/issues/391,450964512,MDEyOklzc3VlQ29tbWVudDQ1MDk2NDUxMg==,9599,simonw,2019-01-02T19:45:12Z,2019-01-02T19:45:12Z,OWNER,"Thanks, I've fixed this. I had to re-alias it against now: ``` ~ $ now alias google-trends-pnwhfwvgqf.now.sh https://google-trends.datasettes.com/ > Assigning alias google-trends.datasettes.com to deployment google-trends-pnwhfwvgqf.now.sh > Certificate for google-trends.datasettes.com (cert_uXaADIuNooHS3tZ) created [18s] > Success! 
google-trends.datasettes.com now points to google-trends-pnwhfwvgqf.now.sh [20s] ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",392610803,Google Trends example doesn’t work, https://github.com/simonw/datasette/issues/397#issuecomment-453330680,https://api.github.com/repos/simonw/datasette/issues/397,453330680,MDEyOklzc3VlQ29tbWVudDQ1MzMzMDY4MA==,9599,simonw,2019-01-11T01:17:11Z,2019-01-11T01:25:33Z,OWNER,"If you pull [the latest image](https://hub.docker.com/r/datasetteproject/datasette) you should get the right SQLite version now: docker pull datasetteproject/datasette docker run -p 8001:8001 \ datasetteproject/datasette \ datasette -p 8001 -h 0.0.0.0 http://0.0.0.0:8001/-/versions now gives me: ``` ""version"": ""3.26.0"" ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",397129564,Update official datasetteproject/datasette Docker container to SQLite 3.26.0, https://github.com/simonw/datasette/issues/187#issuecomment-467264937,https://api.github.com/repos/simonw/datasette/issues/187,467264937,MDEyOklzc3VlQ29tbWVudDQ2NzI2NDkzNw==,9599,simonw,2019-02-26T02:14:28Z,2019-02-26T02:14:28Z,OWNER,I'm working on a port of Datasette to Starlette which I think would fix this issue: https://github.com/encode/starlette,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/datasette/issues/419#issuecomment-473708941,https://api.github.com/repos/simonw/datasette/issues/419,473708941,MDEyOklzc3VlQ29tbWVudDQ3MzcwODk0MQ==,9599,simonw,2019-03-17T19:58:11Z,2019-03-17T19:58:11Z,OWNER,"Some problems to solve: * Right now Datasette assumes it can always show the count of rows in a table, because this has been pre-calculated. If a database is mutable the pre-calculation trick no longer works, and for giant tables a `select count(*) from X` query can be expensive to run. Maybe we set a time limit on these? If time limit expires show ""many rows""? * Maintaining a content hash of the table no longer makes sense if it is changing (though interestingly there's a `.sha3sum` built-in SQLite CLI command which takes a hash of the content and stays the same even through vacuum runs). Without that we need a different mechanism for calculating table colours. It also means that we can't do the special dbname-hash URL trick (see #418) at all if the database is opened as mutable.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421551434,"Default to opening files in mutable mode, special option for immutable files", https://github.com/simonw/datasette/issues/431#issuecomment-488555399,https://api.github.com/repos/simonw/datasette/issues/431,488555399,MDEyOklzc3VlQ29tbWVudDQ4ODU1NTM5OQ==,9599,simonw,2019-05-02T05:13:54Z,2019-05-02T05:13:54Z,OWNER,"Datasette master now treats databases as readonly but NOT immutable. This means you can make changes to those databases from another process and those changes will be instantly reflected in the Datasette interface. As such, reloading on database change is no longer necessary. 
Closing this ticket.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432870248,Datasette doesn't reload when database file changes, https://github.com/simonw/datasette/issues/486#issuecomment-495659567,https://api.github.com/repos/simonw/datasette/issues/486,495659567,MDEyOklzc3VlQ29tbWVudDQ5NTY1OTU2Nw==,9599,simonw,2019-05-24T14:41:45Z,2019-05-24T14:41:45Z,OWNER,"I'm really keen to offer this as a plugin hook once I have Datasette working on ASGI - #272 I'll hopefully have that working in the next few weeks, but in the meantime there are a couple of tricks you can use: - you can add static HTML files (no templates though) using the static route configuration options - you can link to external hosted pages using the `about_url` metadata option - you can add information to an existing page with a custom template. I do that here for example: https://russian-ira-facebook-ads.datasettes.com/","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",448189298,Ability to add extra routes and related templates, https://github.com/simonw/sqlite-utils/issues/21#issuecomment-496786354,https://api.github.com/repos/simonw/sqlite-utils/issues/21,496786354,MDEyOklzc3VlQ29tbWVudDQ5Njc4NjM1NA==,9599,simonw,2019-05-29T05:09:01Z,2019-05-29T05:09:01Z,OWNER,Shipped this feature in sqlite-utils 1.1: https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v1-1,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",448391492,Option to ignore inserts if primary key exists already, https://github.com/simonw/datasette/issues/498#issuecomment-498839428,https://api.github.com/repos/simonw/datasette/issues/498,498839428,MDEyOklzc3VlQ29tbWVudDQ5ODgzOTQyOA==,9599,simonw,2019-06-04T20:53:21Z,2019-06-04T20:53:21Z,OWNER,"It does not, but that's a really great idea for a feature. One challenge here is that FTS ranking calculations take overall table statistics into account, which means it's usually not possible to combine rankings from different tables in a sensible way. But that doesn't mean it's not possible to return grouped results. I think this makes a lot of sense as a plugin.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",451513541,Full text search of all tables at once?, https://github.com/simonw/datasette/issues/499#issuecomment-498840129,https://api.github.com/repos/simonw/datasette/issues/499,498840129,MDEyOklzc3VlQ29tbWVudDQ5ODg0MDEyOQ==,9599,simonw,2019-06-04T20:55:30Z,2019-06-04T21:01:22Z,OWNER,"I really want this too! It's one of the goals of the Datasette Library #417 concept, which I'm hoping to turn into an actual feature in the coming months. It's also going to be a major focus of my ten month JSK fellowship at Stanford, which starts in September. https://twitter.com/simonw/status/1123624552867565569","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",451585764,Accessibility for non-techie newsies? 
, https://github.com/simonw/datasette/issues/527#issuecomment-505057520,https://api.github.com/repos/simonw/datasette/issues/527,505057520,MDEyOklzc3VlQ29tbWVudDUwNTA1NzUyMA==,9599,simonw,2019-06-24T15:21:18Z,2019-06-24T15:21:18Z,OWNER,I just released csvs-to-sqlite 0.9.1 with this bug fix: https://github.com/simonw/csvs-to-sqlite/releases/tag/0.9.1,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",459936585,Unable to use rank when fts-table generated with csvs-to-sqlite, https://github.com/simonw/datasette/pull/437#issuecomment-505087020,https://api.github.com/repos/simonw/datasette/issues/437,505087020,MDEyOklzc3VlQ29tbWVudDUwNTA4NzAyMA==,9599,simonw,2019-06-24T16:38:56Z,2019-06-24T16:38:56Z,OWNER,Closing this because it doesn't really fit the new model of inspect (though we should discuss in #465 how to further evolve this feature) and because as-of #272 we no longer use Sanic - though #520 will implement the equivalent of `prepare_sanic` against ASGI.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",438048318,Add inspect and prepare_sanic hooks, https://github.com/simonw/datasette/issues/514#issuecomment-509154312,https://api.github.com/repos/simonw/datasette/issues/514,509154312,MDEyOklzc3VlQ29tbWVudDUwOTE1NDMxMg==,4363711,JesperTreetop,2019-07-08T09:36:25Z,2019-07-08T09:40:33Z,NONE,"@chrismp: Ports 1024 and under are privileged and can usually only be bound by a root or supervisor user, so it makes sense if you're running as the user `chris` that port 8000 works but 80 doesn't. See [this generic question-and-answer](https://superuser.com/questions/710253/allow-non-root-process-to-bind-to-port-80-and-443) and [this systemd question-and-answer](https://stackoverflow.com/questions/40865775/linux-systemd-service-on-port-80) for more information about ways to skin this cat. Without knowing your specific circumstances, either extending those privileges to that service/executable/user, proxying them through something like nginx or indeed looking at what the nginx systemd job has to do to listen at port 80 all sound like good ways to start. At this point, this is more generic systemd/Linux support than a Datasette issue, which is why a complete rando like me is able to contribute anything. 
","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",459397625,Documentation with recommendations on running Datasette in production without using Docker, https://github.com/simonw/datasette/pull/556#issuecomment-510550279,https://api.github.com/repos/simonw/datasette/issues/556,510550279,MDEyOklzc3VlQ29tbWVudDUxMDU1MDI3OQ==,9599,simonw,2019-07-11T16:07:27Z,2019-07-11T16:07:27Z,OWNER,"This is a really neat trick, thanks!","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",465773546,Add support for running datasette as a module, https://github.com/simonw/datasette/pull/557#issuecomment-511625212,https://api.github.com/repos/simonw/datasette/issues/557,511625212,MDEyOklzc3VlQ29tbWVudDUxMTYyNTIxMg==,9599,simonw,2019-07-16T01:12:14Z,2019-07-16T01:12:14Z,OWNER,This looks useful for dealing with the `The process cannot access the file because it is being used by another process` error: https://stackoverflow.com/a/28032829,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",466996584,Get tests running on Windows using Travis CI, https://github.com/simonw/datasette/pull/595#issuecomment-541931047,https://api.github.com/repos/simonw/datasette/issues/595,541931047,MDEyOklzc3VlQ29tbWVudDU0MTkzMTA0Nw==,9599,simonw,2019-10-14T21:25:38Z,2019-10-14T21:25:38Z,OWNER,I like the conditional dependency for the moment - maybe until 3.5 becomes officially unsupported.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",506300941,bump uvicorn to 0.9.0 to be Python-3.8 friendly, https://github.com/dogsheep/twitter-to-sqlite/issues/20#issuecomment-544335363,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20,544335363,MDEyOklzc3VlQ29tbWVudDU0NDMzNTM2Mw==,9599,simonw,2019-10-21T03:32:04Z,2019-10-21T03:32:04Z,MEMBER,"In case anyone is interested, here's an extract from the crontab I'm running these under at the moment: ``` 1,11,21,31,41,51 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite user-timeline /home/ubuntu/twitter.db -a /home/ubuntu/auth.json --since 2,7,12,17,22,27,32,37,42,47,52,57 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite home-timeline /home/ubuntu/timeline.db -a /home/ubuntu/auth.json --since 6,16,26,36,46,56 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite favorites /home/ubuntu/twitter.db -a /home/ubuntu/auth.json --stop_after=50 ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",506268945,--since support for various commands for refresh-by-cron, https://github.com/simonw/sqlite-utils/issues/62#issuecomment-549435364,https://api.github.com/repos/simonw/sqlite-utils/issues/62,549435364,MDEyOklzc3VlQ29tbWVudDU0OTQzNTM2NA==,9599,simonw,2019-11-04T16:30:34Z,2019-11-04T16:30:34Z,OWNER,Released as 1.12.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",500783373,[enhancement] Method to delete a row in python, https://github.com/simonw/datasette/issues/567#issuecomment-549665423,https://api.github.com/repos/simonw/datasette/issues/567,549665423,MDEyOklzc3VlQ29tbWVudDU0OTY2NTQyMw==,9599,simonw,2019-11-05T05:11:14Z,2019-11-05T05:11:14Z,OWNER,"@clausjuhl I wrote a bit about that 
here: https://simonwillison.net/2019/May/19/datasette-0-28/ Short version: just point Datasette at a SQLite file and update it from another process - it should work fine! I do it all the time now - I'll have a script running that writes to a database and I'll use Datasette to monitor progress. ","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",476573875,Datasette Edit, https://github.com/simonw/datasette/pull/595#issuecomment-552275668,https://api.github.com/repos/simonw/datasette/issues/595,552275668,MDEyOklzc3VlQ29tbWVudDU1MjI3NTY2OA==,9599,simonw,2019-11-11T03:09:43Z,2019-11-11T03:09:43Z,OWNER,Glitch has been upgraded to Python 3.7. I think I'm happy to drop 3.5 support now - users who want Python 3.5 can get it by installing `datasette==0.30.2`,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",506300941,bump uvicorn to 0.9.0 to be Python-3.8 friendly, https://github.com/simonw/datasette/issues/646#issuecomment-561022224,https://api.github.com/repos/simonw/datasette/issues/646,561022224,MDEyOklzc3VlQ29tbWVudDU2MTAyMjIyNA==,9599,simonw,2019-12-03T06:30:42Z,2019-12-03T06:30:42Z,OWNER,"I don't think this is possible at the moment but you're right, it totally should be.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",531502365,Make database level information from metadata.json available in the index.html template, https://github.com/simonw/sqlite-utils/issues/73#issuecomment-570930239,https://api.github.com/repos/simonw/sqlite-utils/issues/73,570930239,MDEyOklzc3VlQ29tbWVudDU3MDkzMDIzOQ==,9599,simonw,2020-01-05T17:15:18Z,2020-01-05T17:15:18Z,OWNER,I think this is because you forgot to include a `pk=` argument. I'll change the code to throw a more useful error in this case.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",545407916,upsert_all() throws issue when upserting to empty table, https://github.com/simonw/datasette/issues/662#issuecomment-579787057,https://api.github.com/repos/simonw/datasette/issues/662,579787057,MDEyOklzc3VlQ29tbWVudDU3OTc4NzA1Nw==,9599,simonw,2020-01-29T14:43:46Z,2020-01-29T14:43:46Z,OWNER,Can you share the exact queries you're having trouble with? The SQL itself or even just the full URL to the page (it doesn't matter if it's to a Datasette instance that isn't available online - I just need to see the URL parameters).,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",556814876,Escape_fts5_query-hookimplementation does not work with queries to standard tables, https://github.com/simonw/datasette/issues/662#issuecomment-579798917,https://api.github.com/repos/simonw/datasette/issues/662,579798917,MDEyOklzc3VlQ29tbWVudDU3OTc5ODkxNw==,2181410,clausjuhl,2020-01-29T15:08:57Z,2020-01-29T15:08:57Z,NONE,"Hi Simon Thankt you for a quick reply. Here are a few examples of urls, where I search the 'cases_fts'-virtual table for tokens in the title-column. It returns the same results, wether the other query-params are present or not. 
Searching for sky http://localhost:8001/db-7596a4e/cases?_search_title=sky&year__gte=1997&year__lte=2017&_sort_desc=last_deliberation_date Returns searchresults Searching for sky* http://localhost:8001/db-7596a4e/cases?_search_title=sky*&year__gte=1997&year__lte=2017&_sort_desc=last_deliberation_date Returns searchresults Searching for sky-tog http://localhost:8001/db-7596a4e/cases?_search_title=sky-tog&year__gte=1997&year__lte=2017&_sort_desc=last_deliberation_date Throws: No such column: tog searching for sky+ http://localhost:8001/db-7596a4e/cases?_search_title=sky%2B&year__gte=1997&year__lte=2017&_sort_desc=last_deliberation_date Throws: Invalid SQL: fts5: syntax error near """" Searching for ""madpakke"" (including double quotes) http://localhost:8001/db-7596a4e/cases?_search_title=%22madpakke%22&year__gte=1997&year__lte=2017&_sort_desc=last_deliberation_date Returns searchresults even though 'madpakke' only appears in the fulltextindex without quotes As I said, my other plugins work just fine, and I just copied your sql_functions.py from the datasette-repo. Thanks!","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",556814876,Escape_fts5_query-hookimplementation does not work with queries to standard tables, https://github.com/simonw/datasette/issues/662#issuecomment-579832857,https://api.github.com/repos/simonw/datasette/issues/662,579832857,MDEyOklzc3VlQ29tbWVudDU3OTgzMjg1Nw==,9599,simonw,2020-01-29T16:12:08Z,2020-01-29T16:12:08Z,OWNER,"I think I see what's happening here. Adding the new plugin isn't quite enough: the change I made to master also alters the table view code to call the new function: https://github.com/simonw/datasette/commit/3c861f363df02a59a67c59036278338e4760d2ed#diff-5e0ffd62fced7d46339b9b2cd167c2f9 If you add the escape function as a plugin in Datasette 0.33 you will have to use a custom SQL query to run it, like this: https://latest.datasette.io/fixtures?sql=select+pk%2C+text1%2C+text2%2C+%5Bname+with+.+and+spaces%5D+from+searchable+where+rowid+in+%28select+rowid+from+searchable_fts+where+searchable_fts+match+escape_fts%28%3Asearch%29%29+order+by+pk+limit+101&search=Dog Or you can hold out for Datasette 0.34 which will have this fix and will hopefully ship within the next 24 hours.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",556814876,Escape_fts5_query-hookimplementation does not work with queries to standard tables, https://github.com/simonw/datasette/issues/662#issuecomment-579864036,https://api.github.com/repos/simonw/datasette/issues/662,579864036,MDEyOklzc3VlQ29tbWVudDU3OTg2NDAzNg==,2181410,clausjuhl,2020-01-29T17:17:01Z,2020-01-29T17:17:01Z,NONE,This is excellent news. I'll wait until version 0.34. It would be tiresome to rewrite all standard-queries into custom queries. 
Thank you!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",556814876,Escape_fts5_query-hookimplementation does not work with queries to standard tables, https://github.com/simonw/datasette/issues/661#issuecomment-580028593,https://api.github.com/repos/simonw/datasette/issues/661,580028593,MDEyOklzc3VlQ29tbWVudDU4MDAyODU5Mw==,9599,simonw,2020-01-30T00:30:04Z,2020-01-30T00:30:04Z,OWNER,This has now shipped as part of Datasette 0.34: https://datasette.readthedocs.io/en/stable/changelog.html#v0-34,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",555832585,"--port option to expose a port other than 8001 in ""datasette package""", https://github.com/simonw/datasette/issues/658#issuecomment-580029288,https://api.github.com/repos/simonw/datasette/issues/658,580029288,MDEyOklzc3VlQ29tbWVudDU4MDAyOTI4OA==,9599,simonw,2020-01-30T00:32:43Z,2020-01-30T00:32:43Z,OWNER,"Can you share how your file layout is working? You should have something like this: `static/app.css` - a CSS file Then run Datasette like this: `datasette my.db --static-dir=static:static/` Then `http://127.0.0.1:8001/static/app.css` should serve your CSS. Could you share the command you're using to deploy to Heroku?","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",550293770,How do I use the app.css as style sheet?, https://github.com/simonw/datasette/issues/577#issuecomment-581758728,https://api.github.com/repos/simonw/datasette/issues/577,581758728,MDEyOklzc3VlQ29tbWVudDU4MTc1ODcyOA==,9599,simonw,2020-02-04T06:11:53Z,2020-02-04T06:11:53Z,OWNER,"For the moment I'm going to move it to `async def render_template()` on `datasette` but otherwise keep the implementation the same. The new signature will be: async def render_template(self, template, context=None, request=None, view_name=None): `template` can be a list of strings or a single string. If a list of strings a template will be selected from them. I'll reconsider the large list of default context variables later on in a separate ticket.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",497171390,Utility mechanism for plugins to render templates, https://github.com/simonw/datasette/issues/675#issuecomment-589908912,https://api.github.com/repos/simonw/datasette/issues/675,589908912,MDEyOklzc3VlQ29tbWVudDU4OTkwODkxMg==,9599,simonw,2020-02-22T02:38:21Z,2020-02-22T02:38:21Z,OWNER,"Interesting feature suggestion. My initial instinct was that this would be better handled using the layered nature of Docker - so build a Docker image with `datasette package` and then have a separate custom script which takes that image, copies in the extra data and outputs a new image. But... `datasette package` is already meant to be more convenient than messing around with Docker by hand like this - so actually having a `--copy` option like you describe here feels like it's within scope of what `datasette package` is meant to do. So yeah - if you're happy to design this I think it would be worth us adding. 
Small design suggestion: allow `--copy` to be applied multiple times, so you can do something like this: datasette package \ --copy ~/project/templates /templates \ --copy ~/project/README.md /README.md \ data.db Also since Click arguments can take multiple options I don't think you need to have the `:` in there - although if it better matches Docker's own UI it might be more consistent to have it.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",567902704,--cp option for datasette publish and datasette package for shipping additional files and directories, https://github.com/simonw/datasette/issues/682#issuecomment-590517338,https://api.github.com/repos/simonw/datasette/issues/682,590517338,MDEyOklzc3VlQ29tbWVudDU5MDUxNzMzOA==,9599,simonw,2020-02-24T19:51:21Z,2020-02-24T19:51:21Z,OWNER,"I filed a question / feature request with Janus about supporting timeouts for `.get()` against async queues here: https://github.com/aio-libs/janus/issues/240 I'm going to move ahead without needing that ability though. I figure SQLite writes are _fast_, and plugins can be trusted to implement just fast writes. So I'm going to support either fire-and-forget writes (they get added to the queue and a task ID is returned) or have the option to block awaiting the completion of the write (using Janus) but let callers decide which version they want. I may add optional timeouts some time in the future. I am going to make both `execute_write()` and `execute_write_fn()` awaitable functions though, for consistency with `.execute()` and to give me flexibility to change how they work in the future. I'll also add a `block=True` option to both of them which causes the function to wait for the write to be successfully executed - defaults to `False` (fire-and-forget mode). ","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",569613563,Mechanism for writing to database via a queue, https://github.com/simonw/datasette/pull/683#issuecomment-590679273,https://api.github.com/repos/simonw/datasette/issues/683,590679273,MDEyOklzc3VlQ29tbWVudDU5MDY3OTI3Mw==,9599,simonw,2020-02-25T04:37:21Z,2020-02-25T04:37:21Z,OWNER,I'm happy with this now. I'm going to merge to master.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",570101428,.execute_write() and .execute_write_fn() methods on Database, https://github.com/simonw/datasette/issues/675#issuecomment-592399256,https://api.github.com/repos/simonw/datasette/issues/675,592399256,MDEyOklzc3VlQ29tbWVudDU5MjM5OTI1Ng==,9599,simonw,2020-02-28T08:09:12Z,2020-02-28T08:09:12Z,OWNER,"Sure, `--cp` looks good to me.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",567902704,--cp option for datasette publish and datasette package for shipping additional files and directories, https://github.com/simonw/datasette/issues/394#issuecomment-602907207,https://api.github.com/repos/simonw/datasette/issues/394,602907207,MDEyOklzc3VlQ29tbWVudDYwMjkwNzIwNw==,127565,wragge,2020-03-23T23:12:18Z,2020-03-23T23:12:18Z,CONTRIBUTOR,"This would also be useful for running Datasette in Jupyter notebooks on [Binder](https://mybinder.org/). While you can use [Jupyter-server-proxy](https://github.com/jupyterhub/jupyter-server-proxy) to access Datasette on Binder, the links are broken. 
Why run Datasette on Binder? I'm developing a [range of Jupyter notebooks](https://glam-workbench.github.io/) that are aimed at getting humanities researchers to explore data from libraries, archives, and museums. Many of them are aimed at researchers with limited digital skills, so being able to run examples in Binder without them installing anything is fantastic. For example, there are a [series of notebooks](https://glam-workbench.github.io/trove-harvester/) that help researchers harvest digitised historical newspaper articles from Trove. The metadata from this harvest is saved as a CSV file that users can download. I've also provided some extra notebooks that use Pandas etc to demonstrate ways of analysing and visualising the harvested data. But it would be really nice if, after completing a harvest, the user could spin up Datasette for some initial exploration of their harvested data without ever leaving their browser.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",396212021,base_url configuration setting, https://github.com/simonw/datasette/issues/394#issuecomment-603631640,https://api.github.com/repos/simonw/datasette/issues/394,603631640,MDEyOklzc3VlQ29tbWVudDYwMzYzMTY0MA==,9599,simonw,2020-03-25T04:19:08Z,2020-03-25T04:19:08Z,OWNER,Shipped in 0.39: https://datasette.readthedocs.io/en/latest/changelog.html#v0-39,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",396212021,base_url configuration setting, https://github.com/simonw/datasette/issues/717#issuecomment-610076073,https://api.github.com/repos/simonw/datasette/issues/717,610076073,MDEyOklzc3VlQ29tbWVudDYxMDA3NjA3Mw==,9599,simonw,2020-04-06T22:47:21Z,2020-04-06T22:47:21Z,OWNER,I'm confident it's possible to create a plugin that deploys to Now v2 now. I'll do the rest of the work in a separate repo: https://github.com/simonw/datasette-publish-now,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",594189527,See if I can get Datasette working on Zeit Now v2, https://github.com/simonw/datasette/issues/176#issuecomment-617208503,https://api.github.com/repos/simonw/datasette/issues/176,617208503,MDEyOklzc3VlQ29tbWVudDYxNzIwODUwMw==,12976,nkirsch,2020-04-21T14:16:24Z,2020-04-21T14:16:24Z,NONE,"@eads I'm interested in helping, if there's still a need...","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/731#issuecomment-618155472,https://api.github.com/repos/simonw/datasette/issues/731,618155472,MDEyOklzc3VlQ29tbWVudDYxODE1NTQ3Mg==,9599,simonw,2020-04-23T03:28:42Z,2020-04-23T03:28:56Z,OWNER,"As an alternative to `--static` this could work by letting you create the following: - `static/css/` - `static/js/` Which would be automatically mounted at `/js/...` and `/css/...` Or maybe just mount `static/` at `/static/` instead? 
","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",605110015,Option to automatically configure based on directory layout, https://github.com/dogsheep/github-to-sqlite/issues/33#issuecomment-622279374,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/33,622279374,MDEyOklzc3VlQ29tbWVudDYyMjI3OTM3NA==,2029,garethr,2020-05-01T07:12:47Z,2020-05-01T07:12:47Z,NONE,"I also go it working with: ```yaml run: echo ${{ secrets.github_token }} | github-to-sqlite auth ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",609950090,Fall back to authentication via ENV, https://github.com/dogsheep/dogsheep-photos/issues/16#issuecomment-623807568,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/16,623807568,MDEyOklzc3VlQ29tbWVudDYyMzgwNzU2OA==,9599,simonw,2020-05-05T02:56:06Z,2020-05-05T02:56:06Z,MEMBER,"I'm pretty sure this is what I'm after. The `groups` table has what looks like identified labels in the rows with category = 2025: Then there's a `ga` table that maps groups to assets: And an `assets` table which looks like it has one row for every one of my photos: One major challenge: these UUIDs are split into two integer numbers, `uuid_0` and `uuid_1` - but the main photos database uses regular UUIDs like this: ![image](https://user-images.githubusercontent.com/9599/81031481-39164280-8e41-11ea-983b-005ced641a18.png) I need to figure out how to match up these two different UUID representations. I asked on Twitter if anyone has any ideas: https://twitter.com/simonw/status/1257500689019703296","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",612287234,"Import machine-learning detected labels (dog, llama etc) from Apple Photos", https://github.com/simonw/datasette/issues/758#issuecomment-624797119,https://api.github.com/repos/simonw/datasette/issues/758,624797119,MDEyOklzc3VlQ29tbWVudDYyNDc5NzExOQ==,9599,simonw,2020-05-06T17:53:46Z,2020-05-06T17:53:46Z,OWNER,It's interesting to hear from someone who's using this feature - I'm considering moving it out into a plugin #647.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",612382643,Question: Access to immutable database-path, https://github.com/simonw/datasette/issues/757#issuecomment-624821090,https://api.github.com/repos/simonw/datasette/issues/757,624821090,MDEyOklzc3VlQ29tbWVudDYyNDgyMTA5MA==,9599,simonw,2020-05-06T18:41:29Z,2020-05-06T18:41:29Z,OWNER,"OK, I just released 0.41 with that and a bunch of other stuff: https://datasette.readthedocs.io/en/latest/changelog.html#v0-41","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",612378203,Question: Any fixed date for the release with the uft8-encoding fix?, https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-626395209,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21,626395209,MDEyOklzc3VlQ29tbWVudDYyNjM5NTIwOQ==,9599,simonw,2020-05-10T21:52:42Z,2020-05-10T21:52:42Z,MEMBER,"Aha! 
It looks like I accidentally installed the old bplist into the same environment: ``` $ pip freeze | grep bpylist bpylist==0.1.4 bpylist2==3.0.0 ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",615474990,bpylist.archiver.CircularReference: archive has a cycle with uid(13), https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-626395781,https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21,626395781,MDEyOklzc3VlQ29tbWVudDYyNjM5NTc4MQ==,9599,simonw,2020-05-10T21:57:09Z,2020-05-10T21:57:09Z,MEMBER,"Yes, I just recreated my virtual environment from scratch and the error went away. The problem occurred when I ran `pip install datasette-bplist` in the same virtual environment - https://github.com/simonw/datasette-bplist/blob/master/setup.py depends on `bpylist` which is incompatible with `bpylist2`.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",615474990,bpylist.archiver.CircularReference: archive has a cycle with uid(13), https://github.com/simonw/datasette/pull/868#issuecomment-650600606,https://api.github.com/repos/simonw/datasette/issues/868,650600606,MDEyOklzc3VlQ29tbWVudDY1MDYwMDYwNg==,9599,simonw,2020-06-27T18:44:28Z,2020-06-27T18:44:28Z,OWNER,"This is really exciting! Thanks so much for looking into this. I'm interested in moving CI for this repo over to GitHub Actions, so I'd be fine with you getting this to work as an Action rather than through Travis. If you can get it working in Travis though I'll happily land that and figure out how to convert that to GitHub Actions later on.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",646448486,initial windows ci setup, https://github.com/simonw/datasette/issues/877#issuecomment-652520496,https://api.github.com/repos/simonw/datasette/issues/877,652520496,MDEyOklzc3VlQ29tbWVudDY1MjUyMDQ5Ng==,9599,simonw,2020-07-01T16:26:52Z,2020-07-01T16:26:52Z,OWNER,Tokens get verified by plugins. So far there's only one: https://github.com/simonw/datasette-auth-tokens - which has you hard-coding plugins in a configuration file. I have a issue there to add support for database-backed tokens too: https://github.com/simonw/datasette-auth-tokens/issues/1,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",648421105,Consider dropping explicit CSRF protection entirely?, https://github.com/simonw/sqlite-utils/issues/121#issuecomment-655652679,https://api.github.com/repos/simonw/sqlite-utils/issues/121,655652679,MDEyOklzc3VlQ29tbWVudDY1NTY1MjY3OQ==,79913,tsibley,2020-07-08T17:24:46Z,2020-07-08T17:24:46Z,CONTRIBUTOR,"Better transaction handling would be really great. Some of my thoughts on implementing better transaction discipline are in https://github.com/simonw/sqlite-utils/pull/118#issuecomment-655239728. My preferences: - Each CLI command should operate in a single transaction so that either the whole thing succeeds or the whole thing is rolled back. This avoids partially completed operations when an error occurs part way through processing. Partially completed operations are typically much harder to recovery from gracefully and may cause inconsistent data states. - The Python API should be transaction-agnostic and rely on the caller to coordinate transactions. 
Only the caller knows how individual insert, create, update, etc operations/methods should be bundled conceptually into transactions. When the caller is the CLI, for example, that bundling would be at the CLI command-level. Other callers might want to break up operations into multiple transactions. Transactions are usually most useful when controlled at the application-level (like logging configuration) instead of the library level. The library needs to provide an API that's conducive to transaction use, though. - The Python API should provide a context manager to provide consistent transactions handling with more useful defaults than Python's `sqlite3` module. The latter issues implicit `BEGIN` statements by default for most DML (`INSERT`, `UPDATE`, `DELETE`, … but not `SELECT`, I believe), but **not** DDL (`CREATE TABLE`, `DROP TABLE`, `CREATE VIEW`, …). Notably, the `sqlite3` module doesn't issue the implicit `BEGIN` until the first DML statement. It _does not_ issue it when entering the `with conn` block, like other DBAPI2-compatible modules do. The `with conn` block for `sqlite3` only arranges to commit or rollback an existing transaction when exiting. Including DDL and `SELECT`s in transactions is important for operation consistency, though. There are several existing bugs.python.org tickets about this and future changes are in the works, but sqlite-utils can provide its own API sooner. sqlite-utils's `Database` class could itself be a context manager (built on the `sqlite3` connection context manager) which additionally issues an explicit `BEGIN` when entering. This would then let Python API callers do something like: ```python db = sqlite_utils.Database(path) with db: # ← BEGIN issued here by Database.__enter__ db.insert(…) db.create_view(…) # ← COMMIT/ROLLBACK issue here by sqlite3.connection.__exit__ ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",652961907,Improved (and better documented) support for transactions, https://github.com/simonw/sqlite-utils/issues/121#issuecomment-655673896,https://api.github.com/repos/simonw/sqlite-utils/issues/121,655673896,MDEyOklzc3VlQ29tbWVudDY1NTY3Mzg5Ng==,9599,simonw,2020-07-08T18:08:11Z,2020-07-08T18:08:11Z,OWNER,"I'm with you on most of this. Completely agreed that the CLI should do everything in a transaction. The one thing I'm not keen on is forcing calling code to explicitly start a transaction, for a couple of reasons: 1. It will break all of the existing code out there 2. It doesn't match to how I most commonly use this library - as an interactive tool in a Jupyter notebook, where I'm generally working against a brand new scratch database and any errors don't actually matter So... how about this: IF you wrap your code in a `with db:` block then the `.insert()` and suchlike methods expect you to manage transactions yourself. But if you don't use the context manager they behave like they do at the moment (or maybe a bit more sensibly). 
That way existing code works as it does today, lazy people like me can call `.insert()` without thinking about transactions, but people writing actual production code (as opposed to Jupyter hacks) have a sensible way to take control of the transactions themselves.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",652961907,Improved (and better documented) support for transactions, https://github.com/simonw/datasette/issues/942#issuecomment-675718593,https://api.github.com/repos/simonw/datasette/issues/942,675718593,MDEyOklzc3VlQ29tbWVudDY3NTcxODU5Mw==,9599,simonw,2020-08-18T21:02:11Z,2020-08-18T21:02:24Z,OWNER,"Easiest solution: if you provide column metadata it gets displayed above the table, something like on https://fivethirtyeight.datasettes.com/fivethirtyeight/antiquities-act%2Factions_under_antiquities_act HTML `title=` tooltips are also added to the table headers, which won't be visible on touch devices but that's OK because the information is visible on the page already.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",681334912,Support column descriptions in metadata.json, https://github.com/simonw/sqlite-utils/pull/142#issuecomment-683173375,https://api.github.com/repos/simonw/sqlite-utils/issues/142,683173375,MDEyOklzc3VlQ29tbWVudDY4MzE3MzM3NQ==,9599,simonw,2020-08-28T22:29:02Z,2020-08-28T22:29:02Z,OWNER,Yeah I think that failure is actually because there's a brand new release of Black out and it subtly changes some of the formatting rules. I'll merge this and then run Black against the entire codebase.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",688386219,"insert_all(..., alter=True) should work for new columns introduced after the first 100 records", https://github.com/dogsheep/twitter-to-sqlite/issues/50#issuecomment-691501132,https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/50,691501132,MDEyOklzc3VlQ29tbWVudDY5MTUwMTEzMg==,706257,bcongdon,2020-09-12T14:48:10Z,2020-09-12T14:48:10Z,NONE,"This seems to be an issue even with larger values of `--stop_after`: ``` $ twitter-to-sqlite favorites twitter.db --stop_after=2000 Importing favorites [####################################] 198 $ ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",698791218,"favorites --stop_after=N stops after min(N, 200)", https://github.com/simonw/sqlite-utils/issues/159#issuecomment-693199049,https://api.github.com/repos/simonw/sqlite-utils/issues/159,693199049,MDEyOklzc3VlQ29tbWVudDY5MzE5OTA0OQ==,9599,simonw,2020-09-16T06:20:26Z,2020-09-16T06:20:26Z,OWNER,"See #121 - I need to think harder about how this all interacts with transactions. 
You can do this: ```python with db.conn: db[""mytable""].delete_where() ``` But that should be documented and maybe rethought.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",702386948,.delete_where() does not auto-commit (unlike .insert() or .upsert()), https://github.com/simonw/datasette/issues/970#issuecomment-695896557,https://api.github.com/repos/simonw/datasette/issues/970,695896557,MDEyOklzc3VlQ29tbWVudDY5NTg5NjU1Nw==,9599,simonw,2020-09-21T04:40:12Z,2020-09-21T04:40:12Z,OWNER,The Python standard library has a module for this: https://docs.python.org/3/library/webbrowser.html,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",705108492,"request an ""-o"" option on ""datasette server"" to open the default browser at the running url", https://github.com/simonw/datasette/issues/619#issuecomment-697973420,https://api.github.com/repos/simonw/datasette/issues/619,697973420,MDEyOklzc3VlQ29tbWVudDY5Nzk3MzQyMA==,45416,obra,2020-09-23T21:07:58Z,2020-09-23T21:07:58Z,NONE,"I've just run into this after crafting a complex query and discovered that hitting back loses my query. Even showing me the whole bad query would be a huge improvement over the current status quo.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",520655983,"""Invalid SQL"" page should let you edit the SQL", https://github.com/simonw/sqlite-utils/pull/178#issuecomment-701627158,https://api.github.com/repos/simonw/sqlite-utils/issues/178,701627158,MDEyOklzc3VlQ29tbWVudDcwMTYyNzE1OA==,9599,simonw,2020-09-30T20:29:11Z,2020-09-30T20:29:11Z,OWNER,Thanks for the fix!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",709043182,Update README.md, https://github.com/simonw/datasette/pull/986#issuecomment-702265255,https://api.github.com/repos/simonw/datasette/issues/986,702265255,MDEyOklzc3VlQ29tbWVudDcwMjI2NTI1NQ==,9599,simonw,2020-10-01T16:51:45Z,2020-10-01T16:51:45Z,OWNER,Thanks for taking a look! The fix ended up being a little different from this because I still want to disable faceting on regular single primary keys (since faceting by those won't ever produce interesting results) - here's what I used: https://github.com/simonw/datasette/commit/5d6bc4c268f9f155e59561671f8617addd3e91bc,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712889459,"Allow facet by primary keys, fixes #985", https://github.com/simonw/datasette/issues/778#issuecomment-702493047,https://api.github.com/repos/simonw/datasette/issues/778,702493047,MDEyOklzc3VlQ29tbWVudDcwMjQ5MzA0Nw==,9599,simonw,2020-10-02T02:26:25Z,2020-10-02T02:26:25Z,OWNER,"I think this could work for arbitrary SQL queries too. Those would need querystring configuration that specifies which sorted column(s) should be used for the ""next"" cursor. One example: I'd like to be able to offer a paginated list of counts of values in a table - e.g. 
this query: https://fivethirtyeight.datasettes.com/fivethirtyeight?sql=select+replies%2C+count%28*%29+from+%5Btwitter-ratio%2Fsenators%5D+group+by+replies+order+by+count%28*%29+desc%3B That could even become a query that gets linked to from the column actions menu.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",626211658,Ability to configure keyset pagination for views and queries, https://github.com/simonw/datasette/issues/991#issuecomment-712317638,https://api.github.com/repos/simonw/datasette/issues/991,712317638,MDEyOklzc3VlQ29tbWVudDcxMjMxNzYzOA==,9599,simonw,2020-10-19T17:30:56Z,2020-10-19T17:30:56Z,OWNER,https://biglocal.datasettes.com/ is one of my larger Datasettes in terms of number of databases.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",714377268,Redesign application homepage, https://github.com/simonw/datasette/issues/782#issuecomment-712569695,https://api.github.com/repos/simonw/datasette/issues/782,712569695,MDEyOklzc3VlQ29tbWVudDcxMjU2OTY5NQ==,222245,carlmjohnson,2020-10-20T03:45:48Z,2020-10-20T03:46:14Z,NONE,"I vote against headers. It has a lot of strikes against it: poor discoverability, new developers often don’t know how to use them, makes CORS harder, makes it hard to use eg with JQ, needs ad hoc specification for each bit of metadata, etc. The only advantage of headers is that you don’t need to do .rows, but that’s actually good as a data validation step anyway—if .rows is missing assume there’s an error and do your error handling path instead of parsing the rest.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879,Redesign default .json format, https://github.com/simonw/datasette/pull/1044#issuecomment-715584579,https://api.github.com/repos/simonw/datasette/issues/1044,715584579,MDEyOklzc3VlQ29tbWVudDcxNTU4NDU3OQ==,9599,simonw,2020-10-23T20:53:01Z,2020-10-23T20:53:01Z,OWNER,Thanks for this!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727916744,Add minimum supported python, https://github.com/simonw/datasette/pull/1043#issuecomment-715585140,https://api.github.com/repos/simonw/datasette/issues/1043,715585140,MDEyOklzc3VlQ29tbWVudDcxNTU4NTE0MA==,9599,simonw,2020-10-23T20:54:29Z,2020-10-23T20:54:29Z,OWNER,Thanks. I'll push a source release of `asgi-csrf`.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",727915394,Include LICENSE in sdist, https://github.com/simonw/datasette/issues/1033#issuecomment-716048564,https://api.github.com/repos/simonw/datasette/issues/1033,716048564,MDEyOklzc3VlQ29tbWVudDcxNjA0ODU2NA==,9599,simonw,2020-10-24T20:08:31Z,2020-10-24T20:08:31Z,OWNER,Documentation here: https://docs.datasette.io/en/latest/internals.html#datasette-urls,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",725099777,datasette.urls.static_plugins(...) method, https://github.com/simonw/sqlite-utils/pull/189#issuecomment-717359145,https://api.github.com/repos/simonw/sqlite-utils/issues/189,717359145,MDEyOklzc3VlQ29tbWVudDcxNzM1OTE0NQ==,35681,adamwolf,2020-10-27T16:20:32Z,2020-10-27T16:20:32Z,CONTRIBUTOR,"No problem. I added a test. 
Let me know if it looks sufficient or if you want me to to tweak something! If you don't mind, would you tag this PR as ""hacktoberfest-accepted""? If you do mind, no problem and I'm sorry for asking :) My kiddos like the shirts.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729818242,Allow iterables other than Lists in m2m records, https://github.com/simonw/datasette/issues/1050#issuecomment-718342036,https://api.github.com/repos/simonw/datasette/issues/1050,718342036,MDEyOklzc3VlQ29tbWVudDcxODM0MjAzNg==,9599,simonw,2020-10-29T03:49:57Z,2020-10-29T03:49:57Z,OWNER,"@thadk from that error it looks like the problem may have been that you had a BLOB column containing a `null` value? If so that's definitely a bug, I'll fix that.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",729057388,Switch to .blob render extension for BLOB downloads, https://github.com/simonw/datasette/issues/865#issuecomment-726412057,https://api.github.com/repos/simonw/datasette/issues/865,726412057,MDEyOklzc3VlQ29tbWVudDcyNjQxMjA1Nw==,9599,simonw,2020-11-12T23:49:23Z,2020-11-12T23:49:23Z,OWNER,"@tballison thanks, I've split that out into a new issue #1091","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",644582921,"base_url doesn't seem to work when adding criteria and clicking ""apply""", https://github.com/simonw/datasette/issues/1114#issuecomment-735443626,https://api.github.com/repos/simonw/datasette/issues/1114,735443626,MDEyOklzc3VlQ29tbWVudDczNTQ0MzYyNg==,9599,simonw,2020-11-29T19:40:49Z,2020-11-29T19:40:49Z,OWNER,Fix is out in 0.52.1: https://docs.datasette.io/en/latest/changelog.html#v0-52-1,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",752966476,--load-extension=spatialite not working with datasetteproject/datasette docker image, https://github.com/simonw/datasette/issues/942#issuecomment-737463116,https://api.github.com/repos/simonw/datasette/issues/942,737463116,MDEyOklzc3VlQ29tbWVudDczNzQ2MzExNg==,9599,simonw,2020-12-02T20:02:10Z,2020-12-02T20:03:01Z,OWNER,"My idea is that if you installed my proposed plugin you wouldn't need `metadata.json` at all - your metadata would instead live in a table in the connected SQLite database files - either one table per database (so the metadata can live in the same place as the data) or maybe also in a dedicated separate database file, for if you want to add metadata to an otherwise read-only database. The plugin would then provide a UI for editing that metadata - maybe by configuring some writable canned queries or maybe something more custom than that. 
Or you could edit the metadata by manually editing the SQLite database file (or loading data into it using a tool like [yaml-to-sqlite](https://github.com/simonw/yaml-to-sqlite)).","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",681334912,Support column descriptions in metadata.json, https://github.com/simonw/datasette/issues/111#issuecomment-738904347,https://api.github.com/repos/simonw/datasette/issues/111,738904347,MDEyOklzc3VlQ29tbWVudDczODkwNDM0Nw==,9599,simonw,2020-12-04T17:16:56Z,2020-12-04T17:16:56Z,OWNER,This is STILL a good idea.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274615452,Add “updated” to metadata, https://github.com/simonw/datasette/pull/1128#issuecomment-739355855,https://api.github.com/repos/simonw/datasette/issues/1128,739355855,MDEyOklzc3VlQ29tbWVudDczOTM1NTg1NQ==,9599,simonw,2020-12-05T19:34:57Z,2020-12-05T19:34:57Z,OWNER,Thanks for this!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",756867924,Fix startup error on windows, https://github.com/simonw/datasette/issues/766#issuecomment-741665253,https://api.github.com/repos/simonw/datasette/issues/766,741665253,MDEyOklzc3VlQ29tbWVudDc0MTY2NTI1Mw==,2181410,clausjuhl,2020-12-09T09:59:05Z,2020-12-09T09:59:05Z,NONE,Hi Simon. Any news on using wildcard-searches with datasette? Thanks!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",617323873,Enable wildcard-searches by default, https://github.com/simonw/datasette/issues/1142#issuecomment-743998792,https://api.github.com/repos/simonw/datasette/issues/1142,743998792,MDEyOklzc3VlQ29tbWVudDc0Mzk5ODc5Mg==,6622733,nitinpaultifr,2020-12-13T12:14:06Z,2020-12-13T12:14:06Z,NONE,"Agreed, it would definitely provide better controls. However, I do feel it makes for a bit of inconsistent UX for the 'Advanced export' section, with links to download for JSON, checkboxes and radio buttons + button to download for CSV. Do you think this example makes the UX a bit nicer/consistent? ![Screenshot 2020-12-13 at 5 38 43 PM](https://user-images.githubusercontent.com/6622733/102011444-1dc1cd00-3d6a-11eb-9e38-5af198161e80.png) I could give it a try if you'd like but I've never contributed to an actual project! ","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763361458,"""Stream all rows"" is not at all obvious", https://github.com/simonw/datasette/issues/276#issuecomment-744461856,https://api.github.com/repos/simonw/datasette/issues/276,744461856,MDEyOklzc3VlQ29tbWVudDc0NDQ2MTg1Ng==,296686,robintw,2020-12-14T14:04:57Z,2020-12-14T14:04:57Z,NONE,"I'm looking into using datasette with a database with spatialite geometry columns, and came across this issue. Has there been any progress on this since 2018? In one of my tables I'm just storing lat/lon points in a spatialite point geometry, and I've managed to make datasette-cluster-map display the points by extracting the lat and lon in SQL - using something like `select ... ST_X(location) as longitude, ST_Y(location) as latitude from Blah`. 
Something more 'built-in' would be great though - particularly for the tables I have that store more complex geometries.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/1144#issuecomment-744489028,https://api.github.com/repos/simonw/datasette/issues/1144,744489028,MDEyOklzc3VlQ29tbWVudDc0NDQ4OTAyOA==,475613,MarkusH,2020-12-14T14:47:11Z,2020-12-14T14:47:11Z,NONE,"Thanks for opening the issue, @simonw. Let me elaborate on my Tweets. [datasette-chartjs](https://github.com/MarkusH/datasette-chartjs) provides drop down lists to pick the chart visualization (e.g. bar, line, doughnut, pie, ...) as well as the column used for the ""x axis"" (e.g. time). A user can change the values on-demand. The chart will be redrawn w/o querying the database again. However, if a user wants to change the underlying query, they will use the SQL field provided by datasette or any of the other datasette built-in features to amend a query. In order to maintain a user's selections for the plugin, datasette-chartjs copies some parts of [datasette-vega](https://github.com/simonw/datasette-vega) which persist the chosen visualization and column in the hash part of a URL (the stuff behind the `#`). The plugin load the config from the hash upon initialization on the next page and use it accordingly. Additionally, datasette-vega and datasette-chartjs need to make sure to include the hash in all links and forms that cause a reload of the page. This is, such that the config persists between clicks. This ticket is about moving thes parts into datasette that provide the functionality to do so. This includes: 1. a way to load config options with a given prefix from the current URL hash 1. a way to update the current URL hash with a new config value or a bunch of config options 1. updating all necessary links and forms on the current page to include the URL hash whenever its updated 1. to prevent leaking config options to external pages, only ""internal"" links should be updated There's another, optional, feature that we might want to think about during the design phase: the scope of the config. Links within a datasette instance have 1 of 3 scopes: 1. global, for the whole datasette project 1. database, for all tables in a database 1. table, only for a table within a database When updating the links and forms as pointed out in 3. above, it might be worth considering which links need to be updated. I could imagine a plugin that wants to persist some setting across all tables within a database but another setting only within a table.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",765637324,JavaScript to help plugins interact with the fragment part of the URL, https://github.com/simonw/datasette/issues/1142#issuecomment-744522099,https://api.github.com/repos/simonw/datasette/issues/1142,744522099,MDEyOklzc3VlQ29tbWVudDc0NDUyMjA5OQ==,6622733,nitinpaultifr,2020-12-14T15:37:47Z,2020-12-14T15:37:47Z,NONE,"Alright I could give it a try! This might be a stupid question, can you tell me how to run the server from my fork? 
So that I can test the changes?","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763361458,"""Stream all rows"" is not at all obvious", https://github.com/simonw/datasette/issues/1142#issuecomment-744563209,https://api.github.com/repos/simonw/datasette/issues/1142,744563209,MDEyOklzc3VlQ29tbWVudDc0NDU2MzIwOQ==,9599,simonw,2020-12-14T16:41:11Z,2020-12-14T16:41:11Z,OWNER,"To check out and start the server: /tmp % git clone git@github.com:nitinpaul/datasette Cloning into 'datasette'... remote: Enumerating objects: 124, done. # ... datasette % python3 -m venv venv datasette % source venv/bin/activate (venv) datasette % pip install -e '.[test]' Obtaining file:///private/tmp/datasette Collecting asgiref<3.4.0,>=3.2.10 Using cached asgiref-3.3.1-py3-none-any.whl (19 kB) # ... (venv) datasette % datasette INFO: Started server process [24002] INFO: Waiting for application startup. INFO: Application startup complete. INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit) And to run the tests: (venv) datasette % pytest ======================================================================== test session starts ======================================================================== platform darwin -- Python 3.9.1, pytest-6.1.2, py-1.10.0, pluggy-0.13.1 SQLite: 3.34.0 rootdir: /private/tmp/datasette, configfile: pytest.ini plugins: asyncio-0.14.0, timeout-1.4.2 collected 841 items tests/test_package.py .. [ 0%] ","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",763361458,"""Stream all rows"" is not at all obvious", https://github.com/simonw/datasette/issues/1148#issuecomment-747062909,https://api.github.com/repos/simonw/datasette/issues/1148,747062909,MDEyOklzc3VlQ29tbWVudDc0NzA2MjkwOQ==,9599,simonw,2020-12-16T21:51:54Z,2020-12-16T21:51:54Z,OWNER,"This is a really frustrating bug with Vercel: https://github.com/simonw/datasette-publish-vercel/issues/28 `+` characters in URLs get translated into spaces before they get to Datasette. They know about the bug and said they were working on a fix a few months ago, but looks like it's still a problem. 
A workaround is to avoid `+` and use `-` instead - I think this SQL query does the same thing as yours: https://aws-partners-singapore.vercel.app/partners?sql=select%0D%0A++A.launch_rank%2C%0D%0A++A.partner_info%0D%0Afrom%0D%0A++summary+A%0D%0A++INNER+JOIN+summary+B+ON+A.launch_rank+%3E%3D+B.launch_rank+-+3%0D%0A++AND+A.launch_rank+-4+%3C%3D+B.launch_rank%0D%0AWHERE%0D%0A++B.%22partner_info%22+LIKE+%27%25Palo+Alto%25%27 ```sql select A.launch_rank, A.partner_info from summary A INNER JOIN summary B ON A.launch_rank >= B.launch_rank - 3 AND A.launch_rank -4 <= B.launch_rank WHERE B.""partner_info"" LIKE '%Palo Alto%' ``` I've been moving projects from Vercel to Cloud Run when they run into this, but that's not a great situation to be in.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",767561886,Syntax error with + symbol when deployed to Vercel, https://github.com/simonw/datasette/issues/1149#issuecomment-747207787,https://api.github.com/repos/simonw/datasette/issues/1149,747207787,MDEyOklzc3VlQ29tbWVudDc0NzIwNzc4Nw==,9599,simonw,2020-12-17T05:06:16Z,2020-12-17T05:06:16Z,OWNER,"So, an idea: what if Datasette's default CSS applied only to elements with classes - or maybe to childen of a `body class=""datasette""` element? In such a way that you could write your own custom HTML that reused elements of Datasette's CSS - the cog menu styling for example - but only on an opt-in basis?","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",769520939,Make it easier to theme Datasette with CSS, https://github.com/simonw/datasette/pull/1158#issuecomment-750390741,https://api.github.com/repos/simonw/datasette/issues/1158,750390741,MDEyOklzc3VlQ29tbWVudDc1MDM5MDc0MQ==,9599,simonw,2020-12-23T17:05:32Z,2020-12-23T17:05:32Z,OWNER,"Thanks for this! I'm fine keeping the `os.path` stuff as is.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",773913793,Modernize code to Python 3.6+, https://github.com/simonw/datasette/issues/987#issuecomment-752714747,https://api.github.com/repos/simonw/datasette/issues/987,752714747,MDEyOklzc3VlQ29tbWVudDc1MjcxNDc0Nw==,9599,simonw,2020-12-30T18:23:08Z,2020-12-30T18:23:20Z,OWNER,"In terms of ""places to put your plugin content"", the simplest solution I can think of is something like this: ```html
```

Alternative designs:

- A documented JavaScript function that returns the CSS selector where plugins should put their content
- A documented JavaScript function that returns a DOM node where plugins should put their content. This would allow the JavaScript to create the element if it does not already exist (though it wouldn't be obvious WHERE that element should be created)
- Documented JavaScript functions for things like ""append this node/HTML to the place-where-plugins-go""

I think the original option - an empty `<div>
` with a known `id` attribute - is the right one to go with here. It's the simplest, it's very easy for custom template authors to understand and it acknowledges that plugins may have all kinds of extra crazy stuff they want to do - like checking in that div to see if another plugin has written to it already, for example.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",712984738,Documented HTML hooks for JavaScript plugin authors, https://github.com/simonw/datasette/issues/1169#issuecomment-753653260,https://api.github.com/repos/simonw/datasette/issues/1169,753653260,MDEyOklzc3VlQ29tbWVudDc1MzY1MzI2MA==,9599,simonw,2021-01-03T17:54:40Z,2021-01-03T17:54:40Z,OWNER,And @benpickles yes I would land that pull request straight away as-is. Thanks!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",777677671,Prettier package not actually being cached, https://github.com/simonw/datasette/issues/913#issuecomment-754187326,https://api.github.com/repos/simonw/datasette/issues/913,754187326,MDEyOklzc3VlQ29tbWVudDc1NDE4NzMyNg==,9599,simonw,2021-01-04T20:03:50Z,2021-01-04T20:03:50Z,OWNER,"I renamed `--config` to `--setting` and changed it to work like this: datasette --setting sql_time_limit_ms 1000 Note the lack of colons. This actually makes colons cleaner to use for plugins - I could support this: datasette --setting datasette-insert:unsafe 1","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",670209331,Mechanism for passing additional options to `datasette my.db` that affect plugins, https://github.com/simonw/datasette/issues/93#issuecomment-754215392,https://api.github.com/repos/simonw/datasette/issues/93,754215392,MDEyOklzc3VlQ29tbWVudDc1NDIxNTM5Mg==,9599,simonw,2021-01-04T20:59:20Z,2021-01-04T21:03:14Z,OWNER,"Updated `pyinstaller` recipe - lots of hidden imports needed now: ``` pip install wheel pip install datasette pyinstaller BASE=$(python -c 'import os; print(os.path.dirname(__import__(""datasette"").__file__))') \ pyinstaller -F \ --add-data ""$BASE/templates:datasette/templates"" \ --add-data ""$BASE/static:datasette/static"" \ --hidden-import datasette.publish \ --hidden-import datasette.publish.heroku \ --hidden-import datasette.publish.cloudrun \ --hidden-import datasette.facets \ --hidden-import datasette.sql_functions \ --hidden-import datasette.actor_auth_cookie \ --hidden-import datasette.default_permissions \ --hidden-import datasette.default_magic_parameters \ --hidden-import datasette.blob_renderer \ --hidden-import datasette.default_menu_links \ --hidden-import uvicorn \ --hidden-import uvicorn.logging \ --hidden-import uvicorn.loops \ --hidden-import uvicorn.loops.auto \ --hidden-import uvicorn.protocols \ --hidden-import uvicorn.protocols.http \ --hidden-import uvicorn.protocols.http.auto \ --hidden-import uvicorn.protocols.websockets \ --hidden-import uvicorn.protocols.websockets.auto \ --hidden-import uvicorn.lifespan \ --hidden-import uvicorn.lifespan.on \ $(which datasette) ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, 
https://github.com/simonw/sqlite-utils/issues/220#issuecomment-761015218,https://api.github.com/repos/simonw/sqlite-utils/issues/220,761015218,MDEyOklzc3VlQ29tbWVudDc2MTAxNTIxOA==,649467,mhalle,2021-01-15T15:40:08Z,2021-01-15T15:40:08Z,NONE,"Make sense. If you're coming from the sqlite3 side of things, rather than the datasette side, wanting the fts methods to work for views makes more sense. sqlite3 allows fts5 tables on views, so I was looking for CLI functionality to build the fts virtual tables. Ultimately, though, sharing fts virtual tables across tables and derivative views is likely more efficient. Maybe an explicit error message like, ""fts is not supported for views"" rather than just throwing an exception that the method doesn't exist"" might be helpful. Not critical though. Thanks.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",783778672,Better error message for *_fts methods against views, https://github.com/simonw/datasette/issues/657#issuecomment-761179229,https://api.github.com/repos/simonw/datasette/issues/657,761179229,MDEyOklzc3VlQ29tbWVudDc2MTE3OTIyOQ==,9599,simonw,2021-01-15T20:24:35Z,2021-01-15T20:24:35Z,OWNER,"I'm not sure how I missed this issue but it's almost a year later and I'm finally taking a look at your Parquet work. This is yet more evidence that allowing plugins to provide their own custom `Database` objects would be a good idea. I started exploring what Datasette would like on PostgreSQL in #670 - my concern was that I would need to add a large amount of database abstraction code which would dramatically increase the complexity of the core project, but my thinking now is that it might be tractable - Datasette doesn't actually construct SQL in complex ways anywhere outside of the `TableView` class so abstracting away just that bit should be feasible.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",548591089,Allow creation of virtual tables at startup, https://github.com/simonw/datasette/issues/1209#issuecomment-769455370,https://api.github.com/repos/simonw/datasette/issues/1209,769455370,MDEyOklzc3VlQ29tbWVudDc2OTQ1NTM3MA==,9599,simonw,2021-01-28T23:00:21Z,2021-01-28T23:00:21Z,OWNER,"Good catch on the workaround here. The root problem is that `datasette-template-sql` looks for the first available databsae if you don't provide it with a `database=` argument, and in Datasette 0.54 the first available database changed to being the new `_internal` database. Is this a bug? I think it is - because the documented behaviour on https://docs.datasette.io/en/stable/internals.html#get-database-name is this: > `name` - string, optional > > The name to be used for this database - this will be used in the URL path, e.g. `/dbname`. If not specified Datasette will pick one based on the filename or memory name. 
Since the new behaviour differs from what was in the documentation I'm going to treat this as a bug and fix it.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",795367402,v0.54 500 error from sql query in custom template; code worked in v0.53; found a workaround, https://github.com/dogsheep/github-to-sqlite/issues/60#issuecomment-770071568,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/60,770071568,MDEyOklzc3VlQ29tbWVudDc3MDA3MTU2OA==,9599,simonw,2021-01-29T21:56:15Z,2021-01-29T21:56:15Z,MEMBER,"I really like the way you're using pipes here - really smart. It's similar to how I build the demo database in this GitHub Actions workflow: https://github.com/dogsheep/github-to-sqlite/blob/62dfd3bc4014b108200001ef4bc746feb6f33b45/.github/workflows/deploy-demo.yml#L52-L82 `twitter-to-sqlite` actually has a mechanism for doing this kind of thing, documented at https://github.com/dogsheep/twitter-to-sqlite#providing-input-from-a-sql-query-with---sql-and---attach It lets you do things like: ``` $ twitter-to-sqlite users-lookup my.db --sql=""select follower_id from following"" --ids ``` Maybe I should add something similar to `github-to-sqlite`? Feels like it could be really useful.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",797097140,Use Data from SQLite in other commands, https://github.com/simonw/datasette/issues/1217#issuecomment-774385092,https://api.github.com/repos/simonw/datasette/issues/1217,774385092,MDEyOklzc3VlQ29tbWVudDc3NDM4NTA5Mg==,6165713,plpxsk,2021-02-06T02:49:11Z,2021-02-06T02:49:11Z,NONE,"A good reference seems to be the note to run `datasette` as a module in https://github.com/simonw/datasette/pull/556 ","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",802513359,Possible to deploy as a python app (for Rstudio connect server)?, https://github.com/simonw/datasette/issues/1200#issuecomment-777178728,https://api.github.com/repos/simonw/datasette/issues/1200,777178728,MDEyOklzc3VlQ29tbWVudDc3NzE3ODcyOA==,9599,simonw,2021-02-11T03:13:59Z,2021-02-11T03:13:59Z,OWNER,"I came up with the need for this while playing with this tool: 
https://calands.datasettes.com/calands?sql=select%0D%0A++AsGeoJSON(geometry)%2C+*%0D%0Afrom%0D%0A++CPAD_2020a_SuperUnits%0D%0Awhere%0D%0A++PARK_NAME+like+'%25mini%25'+and%0D%0A++Intersects(GeomFromGeoJSON(%3Afreedraw)%2C+geometry)+%3D+1%0D%0A++and+CPAD_2020a_SuperUnits.rowid+in+(%0D%0A++++select%0D%0A++++++rowid%0D%0A++++from%0D%0A++++++SpatialIndex%0D%0A++++where%0D%0A++++++f_table_name+%3D+'CPAD_2020a_SuperUnits'%0D%0A++++++and+search_frame+%3D+GeomFromGeoJSON(%3Afreedraw)%0D%0A++)&freedraw={""type""%3A""MultiPolygon""%2C""coordinates""%3A[[[[-122.42202758789064%2C37.82280243352759]%2C[-122.39868164062501%2C37.823887203271454]%2C[-122.38220214843751%2C37.81846319511331]%2C[-122.35061645507814%2C37.77071473849611]%2C[-122.34924316406251%2C37.74465712069939]%2C[-122.37258911132814%2C37.703380457832374]%2C[-122.39044189453125%2C37.690340943717715]%2C[-122.41241455078126%2C37.680559803205135]%2C[-122.44262695312501%2C37.67295135774715]%2C[-122.47283935546876%2C37.67295135774715]%2C[-122.52502441406251%2C37.68382032669382]%2C[-122.53463745117189%2C37.6892542140253]%2C[-122.54699707031251%2C37.690340943717715]%2C[-122.55798339843751%2C37.72945260537781]%2C[-122.54287719726564%2C37.77831314799672]%2C[-122.49893188476564%2C37.81303878836991]%2C[-122.46185302734376%2C37.82822612280363]%2C[-122.42889404296876%2C37.82822612280363]%2C[-122.42202758789064%2C37.82280243352759]]]]} - before I fixed https://github.com/simonw/datasette-leaflet-geojson/issues/16 it was loading a LOT of maps, which felt bad. I wanted to be able to link people to that page with a hard limit on the number of rows displayed on that page. It's mainly to guard against unexpected behaviour from limit-less queries though. It's not a very high priority feature!","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792890765,?_size=10 option for the arbitrary query page would be useful, https://github.com/dogsheep/evernote-to-sqlite/issues/11#issuecomment-777798330,https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/11,777798330,MDEyOklzc3VlQ29tbWVudDc3Nzc5ODMzMA==,9599,simonw,2021-02-11T21:18:58Z,2021-02-11T21:18:58Z,MEMBER,Thanks for the fix!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",792851444,XML parse error, https://github.com/simonw/datasette/issues/1220#issuecomment-778467759,https://api.github.com/repos/simonw/datasette/issues/1220,778467759,MDEyOklzc3VlQ29tbWVudDc3ODQ2Nzc1OQ==,30607,aborruso,2021-02-12T21:35:17Z,2021-02-12T21:35:17Z,NONE,Thank you,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",806743116,Installing datasette via docker: Path 'fixtures.db' does not exist, https://github.com/simonw/sqlite-utils/issues/131#issuecomment-778510528,https://api.github.com/repos/simonw/sqlite-utils/issues/131,778510528,MDEyOklzc3VlQ29tbWVudDc3ODUxMDUyOA==,9599,simonw,2021-02-12T23:25:06Z,2021-02-12T23:25:06Z,OWNER,"If `-c` isn't available, maybe `-t` or `--type` would work for specifying column types: ``` sqlite-utils insert db.db images images.tsv \ --tsv \ --type id int \ --type score float ``` or ``` sqlite-utils insert db.db images images.tsv \ --tsv \ -t id int \ -t score float ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",675753042,sqlite-utils insert: options for column 
types, https://github.com/simonw/datasette/issues/782#issuecomment-782789598,https://api.github.com/repos/simonw/datasette/issues/782,782789598,MDEyOklzc3VlQ29tbWVudDc4Mjc4OTU5OA==,9599,simonw,2021-02-21T03:30:02Z,2021-02-21T03:30:02Z,OWNER,Another benefit to default:object - I could include a key that shows a list of available extras. I could then use that to power an interactive API explorer.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",627794879,Redesign default .json format, https://github.com/simonw/datasette/issues/1241#issuecomment-784567547,https://api.github.com/repos/simonw/datasette/issues/1241,784567547,MDEyOklzc3VlQ29tbWVudDc4NDU2NzU0Nw==,9599,simonw,2021-02-23T22:45:56Z,2021-02-23T22:46:12Z,OWNER,"I really like the way the Share feature on Stack Overflow works: https://stackoverflow.com/questions/18934149/how-can-i-use-postgresqls-text-column-type-in-django ","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",814595021,Share button for copying current URL, https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-786925280,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5,786925280,MDEyOklzc3VlQ29tbWVudDc4NjkyNTI4MA==,9599,simonw,2021-02-26T22:23:10Z,2021-02-26T22:23:10Z,MEMBER,"Thanks! I requested my Gmail export from takeout - once that arrives I'll test it against this and then merge the PR.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813880401,WIP: Add Gmail takeout mbox import, https://github.com/dogsheep/google-takeout-to-sqlite/issues/4#issuecomment-790198930,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/4,790198930,MDEyOklzc3VlQ29tbWVudDc5MDE5ODkzMA==,203343,Btibert3,2021-03-04T00:58:40Z,2021-03-04T00:58:40Z,NONE,"I am just seeing this sorry, yes! I will kick the tires later on tonight. My apologies for the delay.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",778380836,Feature Request: Gmail, https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-790389335,https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5,790389335,MDEyOklzc3VlQ29tbWVudDc5MDM4OTMzNQ==,306240,UtahDave,2021-03-04T07:32:04Z,2021-03-04T07:32:04Z,NONE,"> The command takes quite a while to start running, presumably because this line causes it to have to scan the WHOLE file in order to generate a count: > > https://github.com/dogsheep/google-takeout-to-sqlite/blob/a3de045eba0fae4b309da21aa3119102b0efc576/google_takeout_to_sqlite/utils.py#L66-L67 > > I'm fine with waiting though. It's not like this is a command people run every day - and without that count we can't show a progress bar, which seems pretty important for a process that takes this long. The wait is from python loading the mbox file. This happens regardless if you're getting the length of the mbox. The mbox module is on the slow side. 
It is possible to do one's own parsing of the mbox, but I kind of wanted to avoid doing that.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813880401,WIP: Add Gmail takeout mbox import, https://github.com/simonw/datasette/issues/1238#issuecomment-790857004,https://api.github.com/repos/simonw/datasette/issues/1238,790857004,MDEyOklzc3VlQ29tbWVudDc5MDg1NzAwNA==,79913,tsibley,2021-03-04T19:06:55Z,2021-03-04T19:06:55Z,NONE,"@rgieseke Ah, that's super helpful. Thank you for the workaround for now!","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",813899472,Custom pages don't work with base_url setting, https://github.com/simonw/datasette/issues/838#issuecomment-795895436,https://api.github.com/repos/simonw/datasette/issues/838,795895436,MDEyOklzc3VlQ29tbWVudDc5NTg5NTQzNg==,9599,simonw,2021-03-10T18:44:46Z,2021-03-10T18:44:57Z,OWNER,Let's reopen this.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",637395097,Incorrect URLs when served behind a proxy with base_url set, https://github.com/simonw/sqlite-utils/issues/159#issuecomment-802032152,https://api.github.com/repos/simonw/sqlite-utils/issues/159,802032152,MDEyOklzc3VlQ29tbWVudDgwMjAzMjE1Mg==,1025224,limar,2021-03-18T15:42:52Z,2021-03-18T15:42:52Z,NONE,"I confirm the bug. Happens for me in version 3.6. I use the call to delete all the records: `table.delete_where()` This does not delete anything. I see that `delete()` method DOES use context manager `with self.db.conn:` which should help. You may want to align the code of both methods.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",702386948,.delete_where() does not auto-commit (unlike .insert() or .upsert()), https://github.com/simonw/sqlite-utils/issues/249#issuecomment-803501756,https://api.github.com/repos/simonw/sqlite-utils/issues/249,803501756,MDEyOklzc3VlQ29tbWVudDgwMzUwMTc1Ng==,9599,simonw,2021-03-21T02:33:45Z,2021-03-21T02:33:45Z,OWNER,"Did you run `enable-fts` before you inserted the data? If so you'll need to run `populate-fts` after the insert to populate the FTS index. A better solution may be to add `--create-triggers` to the `enable-fts` command to add triggers that will automatically keep the index updated as you insert new records.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",836963850,Full text search possibly broken?, https://github.com/simonw/datasette/issues/1153#issuecomment-805109341,https://api.github.com/repos/simonw/datasette/issues/1153,805109341,MDEyOklzc3VlQ29tbWVudDgwNTEwOTM0MQ==,9599,simonw,2021-03-23T17:55:48Z,2021-03-23T18:41:57Z,OWNER,"Beginnings of a UI element for switching between them: ```html
JSON YAML
``` That `<div>` has a padding of 12px, so using 12px padding on the tab links should get them to line up better.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",771202454,"Use YAML examples in documentation by default, not JSON",
https://github.com/simonw/datasette/issues/1274#issuecomment-805214307,https://api.github.com/repos/simonw/datasette/issues/1274,805214307,MDEyOklzc3VlQ29tbWVudDgwNTIxNDMwNw==,7476523,bobwhitelock,2021-03-23T20:12:29Z,2021-03-23T20:12:29Z,CONTRIBUTOR,"One issue I could see with adding first class support for metadata in hjson format is that this would require adding an additional dependency to handle this, for a feature that would be unused by many users. I wonder if this could fit in as a plugin instead; if a hook existed for loading metadata (maybe as part of https://github.com/simonw/datasette/issues/860) the metadata could then come from any source, as specified by plugins, e.g. hjson, toml, XML, a database table etc.

Until/unless this exists, a few ideas for how you could add comments:
- Using YAML as you suggest.
- A common pattern is adding a `""comment""` key for comments to any object in JSON - I don't think including an unnecessary key like this would break anything in Datasette, but not certain.
- You could use another tool as a preprocessor for your JSON metadata - e.g. hjson or Jsonnet. You'd write the metadata in that format, and then convert that into JSON to actually use as your final metadata.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",839008371,Might there be some way to comment metadata.json?,
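A rough sketch of the preprocessor idea from the comment above: keep the commented metadata in YAML and generate `metadata.json` from it before starting Datasette. The file names here are placeholders, and this is an illustration rather than anything from the original thread.

```python
# Convert a commented metadata.yml into the metadata.json Datasette expects.
# File names are assumptions; adjust to taste.
import json

import yaml  # pip install pyyaml

with open('metadata.yml') as f:
    metadata = yaml.safe_load(f)  # YAML allows # comments, JSON does not

with open('metadata.json', 'w') as f:
    json.dump(metadata, f, indent=4)
```

Run it before `datasette mydb.db --metadata metadata.json` (or wire it into a Makefile) so the generated file never needs to be edited by hand.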
https://github.com/simonw/datasette/pull/1260#issuecomment-808988697,https://api.github.com/repos/simonw/datasette/issues/1260,808988697,MDEyOklzc3VlQ29tbWVudDgwODk4ODY5Nw==,9599,simonw,2021-03-29T00:22:21Z,2021-03-29T00:22:21Z,OWNER,"This is interesting!

I've decided to apply a subset of these - the `if` and `elif` blocks are a deliberate style choice from me, because I find code clearer when it has if/else as opposed to relying on early termination. Likewise the iteration against `.keys()` on dictionaries.

I like the other fixes though, I'm about to land them in a separate commit that credits you.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",831163537,Fix: code quality issues,
https://github.com/simonw/datasette/pull/1031#issuecomment-809010713,https://api.github.com/repos/simonw/datasette/issues/1031,809010713,MDEyOklzc3VlQ29tbWVudDgwOTAxMDcxMw==,9599,simonw,2021-03-29T01:46:45Z,2021-03-29T01:46:45Z,OWNER,Sorry I didn't get to this PR sooner. I've joint-credited you in the release notes for this fix: https://docs.datasette.io/en/stable/changelog.html#v0-56,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",724369025,Fallback to databases in inspect-data.json when no -i options are passed,
https://github.com/simonw/datasette/issues/696#issuecomment-809548363,https://api.github.com/repos/simonw/datasette/issues/696,809548363,MDEyOklzc3VlQ29tbWVudDgwOTU0ODM2Mw==,9599,simonw,2021-03-29T17:04:19Z,2021-03-29T17:04:19Z,OWNER,I tried this just now against Datasette 0.56 with the new Dockerfile from #1249 (that uses SQLite and SpatiaLite installed with `apt-get install`) and the tests all passed.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",576722115,Single failing unit test when run inside the Docker image,
https://github.com/simonw/datasette/issues/1284#issuecomment-810740486,https://api.github.com/repos/simonw/datasette/issues/1284,810740486,MDEyOklzc3VlQ29tbWVudDgxMDc0MDQ4Ng==,9599,simonw,2021-03-31T03:57:55Z,2021-03-31T03:57:55Z,OWNER,"You're right, doing this is really hard at the moment - I'm not sure I know how I would tackle this either, and it's something I've wanted in the past!

I'll have a think about this one.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",845794436,Feature or Documentation Request: Individual table as home page template,
https://github.com/simonw/datasette/issues/1286#issuecomment-812664443,https://api.github.com/repos/simonw/datasette/issues/1286,812664443,MDEyOklzc3VlQ29tbWVudDgxMjY2NDQ0Mw==,9599,simonw,2021-04-02T18:52:45Z,2021-04-02T18:52:51Z,OWNER,"Idea: default to displaying single-dimension JSON arrays of strings as a comma-separated list but show the comma in a different colour - something like this:



I used this HTML for the prototype (re-using `.type-int` just to get the colour):
```html
tag1, tag2
```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",849220154,Better default display of arrays of items,
https://github.com/simonw/datasette/issues/1255#issuecomment-812710120,https://api.github.com/repos/simonw/datasette/issues/1255,812710120,MDEyOklzc3VlQ29tbWVudDgxMjcxMDEyMA==,1111743,jungle-boogie,2021-04-02T20:50:08Z,2021-04-02T20:50:08Z,NONE,"Hello again,

I was able to get my facets running with this `settings.json`, which was lifted from one of Simon's Datasettes and slightly modified.

```
{
    ""default_page_size"": 100,
    ""max_returned_rows"": 1000,
    ""num_sql_threads"": 3,
    ""sql_time_limit_ms"": 9000,
    ""default_facet_size"": 10,
    ""facet_time_limit_ms"": 9000,
    ""facet_suggest_time_limit_ms"": 500,
    ""hash_urls"": false,
    ""allow_facet"": true,
    ""suggest_facets"": false,
    ""default_cache_ttl"": 5,
    ""default_cache_ttl_hashed"": 31536000,
    ""cache_size_kb"": 0,
    ""allow_csv_stream"": true,
    ""max_csv_mb"": 100,
    ""truncate_cells_html"": 2048,
    ""template_debug"": false,
    ""base_url"": ""/""
}
```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",826700095,Facets timing out but work when filtering,
https://github.com/simonw/datasette/issues/1286#issuecomment-815978405,https://api.github.com/repos/simonw/datasette/issues/1286,815978405,MDEyOklzc3VlQ29tbWVudDgxNTk3ODQwNQ==,192568,mroswell,2021-04-08T16:47:29Z,2021-04-10T03:59:00Z,CONTRIBUTOR,"This worked for me:                      
`{{ cell.value | replace('"", ""','; ') | replace('[\""','') | replace('\""]','')}}`

I'm sure there is a prettier (and more flexible) way, but for now, this is ever-so-much more pleasant to look at. 

------ AFTER:

------ BEFORE:

(Note: I didn't figure out how to have one item have no semicolon, while multi-items close with a semicolon, but this is good enough for now. I also didn't figure out how to set up a new jinja filter. I don't want to add to /datasette/utils/__init__.py as I assume that would get overwritten when upgrading datasette. Having a starter guide on creating jinja filters in datasette would be helpful. (The jinja documentation isn't datasette-specific enough for me to quite nail it.)
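The note above asks how to add a custom Jinja filter without editing Datasette's own `utils/__init__.py`. One way is a one-off plugin dropped into a `--plugins-dir` directory; the sketch below assumes Datasette's documented `prepare_jinja2_environment` plugin hook, and the filter name `semicolons` is made up for illustration.

```python
# plugins/semicolon_filter.py - an illustrative sketch, not from the original comment.
# Load it with: datasette mydb.db --plugins-dir=plugins/
import json

from datasette import hookimpl


@hookimpl
def prepare_jinja2_environment(env):
    def semicolons(value):
        # Render a JSON array stored as text as 'a; b; c'; pass other values through.
        try:
            items = json.loads(value)
        except (TypeError, ValueError):
            return value
        if isinstance(items, list):
            return '; '.join(str(item) for item in items)
        return value

    env.filters['semicolons'] = semicolons
```

A template could then use `{{ cell.value | semicolons }}` instead of chaining `replace()` calls.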
","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",849220154,Better default display of arrays of items,
https://github.com/simonw/sqlite-utils/pull/258#issuecomment-843702392,https://api.github.com/repos/simonw/sqlite-utils/issues/258,843702392,MDEyOklzc3VlQ29tbWVudDg0MzcwMjM5Mg==,9599,simonw,2021-05-19T02:47:37Z,2021-05-19T02:47:37Z,OWNER,I'm going to merge this and add a test - thanks!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",868191959,Fixing insert from JSON containing strings with non-ascii characters …,
https://github.com/simonw/sqlite-utils/issues/253#issuecomment-843718859,https://api.github.com/repos/simonw/sqlite-utils/issues/253,843718859,MDEyOklzc3VlQ29tbWVudDg0MzcxODg1OQ==,9599,simonw,2021-05-19T03:31:47Z,2021-05-19T03:31:47Z,OWNER,Fixed: https://simonwillison.net/2020/Sep/23/sqlite-advanced-alter-table/,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",847423559,fixtures.db example error in sql-utils blog post,
https://github.com/simonw/datasette/pull/1352#issuecomment-852673695,https://api.github.com/repos/simonw/datasette/issues/1352,852673695,MDEyOklzc3VlQ29tbWVudDg1MjY3MzY5NQ==,9599,simonw,2021-06-02T02:52:26Z,2021-06-02T02:52:26Z,OWNER,@dependabot recreate,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",908276134,Bump black from 21.5b1 to 21.5b2,
https://github.com/simonw/datasette/issues/526#issuecomment-853567413,https://api.github.com/repos/simonw/datasette/issues/526,853567413,MDEyOklzc3VlQ29tbWVudDg1MzU2NzQxMw==,9599,simonw,2021-06-03T05:11:27Z,2021-06-03T05:11:27Z,OWNER,"Another potential way to implement this would be to hold the SQLite connection open and execute the full query there.

I've avoided this in the past due to concerns of resource exhaustion - if multiple requests attempt this at the same time all of the connections in the pool will become tied up and the site will be unable to respond to further requests.

But... now that Datasette has authentication there's the possibility of making this feature only available to specific authenticated users - the `--root` user for example. Which avoids the danger while unlocking a super-useful feature.

Not to mention people who are running Datasette privately on their own laptop, or the proposed `--query` CLI feature in #1356.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",459882902,Stream all results for arbitrary SQL and canned queries,
https://github.com/simonw/sqlite-utils/issues/264#issuecomment-853567861,https://api.github.com/repos/simonw/sqlite-utils/issues/264,853567861,MDEyOklzc3VlQ29tbWVudDg1MzU2Nzg2MQ==,9599,simonw,2021-06-03T05:12:21Z,2021-06-03T05:12:21Z,OWNER,I think this is more likely to happen in Datasette than in sqlite-utils - see https://github.com/simonw/datasette/issues/1356 for thoughts on this.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",907642546,"Supporting additional output formats, like GeoJSON",
https://github.com/simonw/datasette/issues/1375#issuecomment-860230385,https://api.github.com/repos/simonw/datasette/issues/1375,860230385,MDEyOklzc3VlQ29tbWVudDg2MDIzMDM4NQ==,9599,simonw,2021-06-13T15:37:49Z,2021-06-13T15:37:49Z,OWNER,"There is a feature for this at the moment, but it's a little bit hidden: you can use `?_json=col` to tell
Datasette that you would like a specific column to be exported as nested JSON: https://docs.datasette.io/en/stable/json_api.html#special-json-arguments

I considered trying to make this automatic - so it detects columns that appear to contain valid JSON and outputs them as nested objects - but the problem with that is that it can lead to inconsistent results - you might hit the API and find that not every column contains valid JSON (compared to the previous day) resulting in the API returning a string instead of the expected dictionary and breaking your code.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",919508498,JSON export dumps JSON fields as TEXT,
https://github.com/simonw/datasette/issues/1375#issuecomment-860548546,https://api.github.com/repos/simonw/datasette/issues/1375,860548546,MDEyOklzc3VlQ29tbWVudDg2MDU0ODU0Ng==,4068,frafra,2021-06-14T09:41:59Z,2021-06-14T09:41:59Z,NONE,"> There is a feature for this at the moment, but it's a little bit hidden: you can use `?_json=col` to tell
> Datasette that you would like a specific column to be exported as nested JSON: https://docs.datasette.io/en/stable/json_api.html#special-json-arguments

Thanks :)
 
> I considered trying to make this automatic - so it detects columns that appear to contain valid JSON and outputs them as nested objects - but the problem with that is that it can lead to inconsistent results - you might hit the API and find that not every column contains valid JSON (compared to the previous day) resulting in the API returning a string instead of the expected dictionary and breaking your code.

If a developer is not sure if the JSON fields are valid, but then retrieves and parses them, they should handle errors too. Handling inconsistent data is necessary due to the nature of SQLite. A global or dataset option to render the data as they have been defined (JSON, boolean, etc.) when requesting JSON could allow the user to download a regular JSON from the browser without having to rely on APIs. I would guess someone could just make a custom template with an extra JSON-parsed download button otherwise :)","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",919508498,JSON export dumps JSON fields as TEXT,
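One way to act on the error-handling point above is for client code to attempt to parse each value and fall back to the raw string when it is not valid JSON, rather than assuming every row is consistent. The URL, the `_shape=array` usage and the `tags` column below are illustrative only.

```python
# Defensive client-side parsing of a column that may or may not hold JSON text.
import json
import urllib.request

url = 'https://example.com/mydb/mytable.json?_shape=array'
with urllib.request.urlopen(url) as response:
    rows = json.load(response)

for row in rows:
    raw = row.get('tags')
    try:
        row['tags'] = json.loads(raw)  # nested value when the text is valid JSON
    except (TypeError, ValueError):
        pass  # otherwise keep the original string
```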
https://github.com/simonw/sqlite-utils/issues/272#issuecomment-861987651,https://api.github.com/repos/simonw/sqlite-utils/issues/272,861987651,MDEyOklzc3VlQ29tbWVudDg2MTk4NzY1MQ==,9599,simonw,2021-06-16T02:27:20Z,2021-06-16T02:27:20Z,OWNER,Solution: `sqlite-utils memory -` attempts to detect the input based on if it starts with a `{` or `[` (likely JSON) or if it doesn't use the `csv.Sniffer()` mechanism. Or you can use `sqlite-utils memory -:csv` to specifically indicate the type of input.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",921878733,"Idea: import CSV to memory, run SQL, export in a single command",
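A loose sketch of the detection heuristic described in that comment - peek at the first non-whitespace character, treat `{` or `[` as JSON, and otherwise let `csv.Sniffer` guess the dialect. This is an illustration, not the actual sqlite-utils implementation.

```python
# Illustrative format detection for piped input, not the real sqlite-utils code.
import csv
import io
import json


def detect_and_load(raw_bytes):
    text = raw_bytes.decode('utf-8-sig')
    stripped = text.lstrip()
    if stripped.startswith('{') or stripped.startswith('['):
        return 'json', json.loads(text)
    dialect = csv.Sniffer().sniff(text[:2048])
    reader = csv.reader(io.StringIO(text), dialect)
    return 'csv', list(reader)
```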
https://github.com/simonw/sqlite-utils/issues/278#issuecomment-864128489,https://api.github.com/repos/simonw/sqlite-utils/issues/278,864128489,MDEyOklzc3VlQ29tbWVudDg2NDEyODQ4OQ==,9599,simonw,2021-06-18T15:46:24Z,2021-06-18T15:46:24Z,OWNER,A workaround could be to define a bash or zsh alias of some sort.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",923697888,"Support db as first parameter before subcommand, or as environment variable",
https://github.com/simonw/datasette/issues/1396#issuecomment-880326049,https://api.github.com/repos/simonw/datasette/issues/1396,880326049,MDEyOklzc3VlQ29tbWVudDg4MDMyNjA0OQ==,9599,simonw,2021-07-15T01:50:05Z,2021-07-15T01:50:05Z,OWNER,"I think I made a mistake in this commit: https://github.com/simonw/datasette/commit/0486303b60ce2784fd2e2ecdbecf304b7d6e6659



It looks like I copied `$VERSION_TAG` from here - but it's not available in the `publish.yml` flow: https://github.com/simonw/datasette/blob/0486303b60ce2784fd2e2ecdbecf304b7d6e6659/.github/workflows/push_docker_tag.yml#L18-L25","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",944903881,"""invalid reference format"" publishing Docker image",
https://github.com/simonw/sqlite-utils/issues/298#issuecomment-891359751,https://api.github.com/repos/simonw/sqlite-utils/issues/298,891359751,IC_kwDOCGYnMM41IRIH,9599,simonw,2021-08-02T21:55:16Z,2021-08-02T21:55:16Z,OWNER,"This is a feature already! You can do this:

    sqlite-utils insert nl-demo.db mytable data.ndjson --nl

See https://sqlite-utils.datasette.io/en/stable/cli.html#inserting-newline-delimited-json
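For anyone scripting this rather than using the CLI, a rough Python-API sketch of the same newline-delimited import (file and table names copied from the example above, the rest is illustrative):

```python
# Python-API sketch of the CLI example above.
import json

import sqlite_utils

db = sqlite_utils.Database('nl-demo.db')
with open('data.ndjson') as f:
    db['mytable'].insert_all(json.loads(line) for line in f if line.strip())
```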
","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",951581763,Read lines with JSON object,
https://github.com/simonw/datasette/issues/942#issuecomment-897996296,https://api.github.com/repos/simonw/datasette/issues/942,897996296,IC_kwDOBm6k_c41hlYI,9599,simonw,2021-08-12T22:01:36Z,2021-08-12T22:01:36Z,OWNER,"I'm going with `""columns"": {""name-of-column"": ""description-of-column""}`.

If I decide to make `""col""` and `""nocol""` available in metadata I'll use those as the keys in the metadata, for consistency with the existing query string parameters.

I'm OK with having both `""columns"": ...` and `""col"": ...` keys in the metadata, even though they could be a tiny bit confusing without the documentation.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",681334912,Support column descriptions in metadata.json,
https://github.com/simonw/datasette/pull/1455#issuecomment-913001282,https://api.github.com/repos/simonw/datasette/issues/1455,913001282,IC_kwDOBm6k_c42a0tC,51016,ctb,2021-09-04T16:31:24Z,2021-09-04T16:31:24Z,CONTRIBUTOR,I love it! maybe 'researchers' instead? Or 'scientists and researchers'?,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",988325628,Add scientists to target groups,
https://github.com/simonw/datasette/pull/1455#issuecomment-913001416,https://api.github.com/repos/simonw/datasette/issues/1455,913001416,IC_kwDOBm6k_c42a0vI,9599,simonw,2021-09-04T16:32:21Z,2021-09-04T16:32:21Z,OWNER,I'll add researchers too.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",988325628,Add scientists to target groups,
https://github.com/simonw/sqlite-utils/issues/328#issuecomment-925296085,https://api.github.com/repos/simonw/sqlite-utils/issues/328,925296085,IC_kwDOCGYnMM43JuXV,9599,simonw,2021-09-22T20:14:53Z,2021-09-22T20:14:53Z,OWNER,The bug is in this code: https://github.com/simonw/sqlite-utils/blob/77c240df56068341561e95e4a412cbfa24dc5bc7/sqlite_utils/cli.py#L2205-L2227,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1004613267,Invalid JSON output when no rows,
https://github.com/simonw/sqlite-utils/issues/325#issuecomment-925321439,https://api.github.com/repos/simonw/sqlite-utils/issues/325,925321439,IC_kwDOCGYnMM43J0jf,9599,simonw,2021-09-22T20:52:56Z,2021-09-22T20:52:56Z,OWNER,"Updated documentation: https://sqlite-utils.datasette.io/en/latest/cli.html#running-queries-directly-against-csv-or-json

> If two files have the same name they will be assigned a numeric suffix:
> 
>     $ sqlite-utils memory foo/data.csv bar/data.csv ""select * from data_2""","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",990844088,sqlite-utils memory can't deal with multiple files with the same name,
https://github.com/simonw/datasette/pull/1487#issuecomment-942722595,https://api.github.com/repos/simonw/datasette/issues/1487,942722595,IC_kwDOBm6k_c44MM4j,9599,simonw,2021-10-13T21:08:53Z,2021-10-13T21:08:53Z,OWNER,Thanks for this!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1023245060,"Added instructions for installing plugins via pipx, #1486",
https://github.com/simonw/datasette/pull/1489#issuecomment-943594712,https://api.github.com/repos/simonw/datasette/issues/1489,943594712,IC_kwDOBm6k_c44PhzY,9599,simonw,2021-10-14T18:04:11Z,2021-10-14T18:04:11Z,OWNER,@dependabot recreate,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1026379132,"Update pyyaml requirement from ~=5.3 to >=5.3,<7.0",
https://github.com/simonw/datasette/issues/1284#issuecomment-949604763,https://api.github.com/repos/simonw/datasette/issues/1284,949604763,IC_kwDOBm6k_c44mdGb,536941,fgregg,2021-10-22T12:54:34Z,2021-10-22T12:54:34Z,CONTRIBUTOR,i'm going to take a swing at this today. we'll see.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",845794436,Feature or Documentation Request: Individual table as home page template,
https://github.com/simonw/sqlite-utils/issues/336#issuecomment-962411119,https://api.github.com/repos/simonw/sqlite-utils/issues/336,962411119,IC_kwDOCGYnMM45XTpv,9599,simonw,2021-11-06T07:21:04Z,2021-11-06T07:21:04Z,OWNER,I've never used `DEFAULT 'CURRENT_TIMESTAMP'` myself so this one should be an interesting bug to explore.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1044267332,"sqlite-util tranform --column-order mangles columns of type ""timestamp""",
https://github.com/simonw/datasette/issues/1522#issuecomment-976117989,https://api.github.com/repos/simonw/datasette/issues/1522,976117989,IC_kwDOBm6k_c46LmDl,813732,glasnt,2021-11-23T03:00:34Z,2021-11-23T03:00:34Z,CONTRIBUTOR,"I tried deploying the most recent version of the Dockerfile in this thread ([link to comment](https://github.com/simonw/datasette/issues/1522#issuecomment-974605128)), and after trying a few different combinations, I was only successful when I used `--no-cpu-throttling` (""CPU Is always allocated"" in the UI)

Using this method, I got a very similar issue to you: The first time I'd load the site I'd get a 503. But after that first load, I didn't get the issue again. It would re-occur if the service started from cold boot. 

I suspect this is a race condition in the supervisord configuration. The errors I got were the same `Connection refused: AH00957: http: attempt to connect to 127.0.0.1:8001 (127.0.0.1) failed`, and that seems to indicate that `datasette` hadn't yet started. 

Looking at the order of logs getting back, the processes reported successfully completing loading after the first 503 was returned, so that makes me think race condition. 

I can replicate this locally, if I `docker run` and request `localhost:5000/prefix` _before_ I get the `datasette entered RUNNING state` message. Cloud Run wakes up when requests are received, so this test would semi-replicate that, but local docker would be the equivalent of a persistent process, hence it doesn't normally exhibit the same issues.

Unfortunately supervisor/supervisor issue 122 (not linking as to prevent cross-project link spam) seems to say that dependency chaining is a feature that's been asked for for a long time, but hasn't been implemented. You could try some suggestions in that thread. ","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1058896236,Deploy a live instance of demos/apache-proxy,
https://github.com/simonw/datasette/issues/1304#issuecomment-981980048,https://api.github.com/repos/simonw/datasette/issues/1304,981980048,IC_kwDOBm6k_c46h9OQ,30934,20after4,2021-11-29T20:13:53Z,2021-11-29T20:14:11Z,NONE,There isn't any way to do this with sqlite as far as I know.  The only option is to insert the right number of ? placeholders into the sql template and then provide an array of values.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",863884805,"Document how to send multiple values for ""Named parameters"" ",
https://github.com/simonw/datasette/issues/1304#issuecomment-988463455,https://api.github.com/repos/simonw/datasette/issues/1304,988463455,IC_kwDOBm6k_c466sFf,30934,20after4,2021-12-08T03:23:14Z,2021-12-08T03:23:14Z,NONE,I actually think it would be a useful thing to add support for in datasette. It wouldn't be difficult to unwind an array of params and add the placeholders automatically.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",863884805,"Document how to send multiple values for ""Named parameters"" ",
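A small sketch of the placeholder unwinding both comments describe, using plain `sqlite3` (the table and values are invented for the example):

```python
# Expand a Python list into the right number of ? placeholders for an IN clause.
import sqlite3

ids = [3, 7, 42]
placeholders = ', '.join('?' for _ in ids)
sql = 'select * from mytable where id in ({})'.format(placeholders)

conn = sqlite3.connect(':memory:')
conn.execute('create table mytable (id integer, name text)')
conn.executemany('insert into mytable values (?, ?)', [(3, 'a'), (7, 'b'), (9, 'c')])
rows = conn.execute(sql, ids).fetchall()  # matches ids 3 and 7 only
```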
https://github.com/simonw/sqlite-utils/issues/353#issuecomment-991378346,https://api.github.com/repos/simonw/sqlite-utils/issues/353,991378346,IC_kwDOCGYnMM47Fzuq,9599,simonw,2021-12-10T23:48:28Z,2021-12-10T23:48:28Z,OWNER,"One option: allow `CODE` to be a special value of `-` which means ""read from standard input"". It's a tiny bit of a hack but I think it would work here.

If you wanted to replace a column entirely with hyphens you would still be able to do this:

    sqlite-utils convert my.db mytable col1 '""-""'","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1077102934,"Allow passing a file of code to ""sqlite-utils convert""",
https://github.com/simonw/datasette/issues/1549#issuecomment-991754794,https://api.github.com/repos/simonw/datasette/issues/1549,991754794,IC_kwDOBm6k_c47HPoq,9599,simonw,2021-12-11T19:16:33Z,2021-12-11T19:16:33Z,OWNER,Good call! I'm doing a refactor #1518 right now which will hopefully bring the functionality of those two much closer - I'll make a note to consider this there too.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1077620955,Redesign CSV export to improve usability,
https://github.com/simonw/datasette/issues/1552#issuecomment-995034143,https://api.github.com/repos/simonw/datasette/issues/1552,995034143,IC_kwDOBm6k_c47TwQf,9599,simonw,2021-12-15T18:02:53Z,2021-12-15T18:02:53Z,OWNER,"This is definitely a missing feature. The ""different types of facet"" stuff feels incomplete to me generally - this is one issue, but this one as well:

- #625","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1078702875,Allow to set `facets_array` in metadata (like current `facets`),
https://github.com/simonw/datasette/issues/1608#issuecomment-1017998993,https://api.github.com/repos/simonw/datasette/issues/1608,1017998993,IC_kwDOBm6k_c48rW6R,9599,simonw,2022-01-20T22:56:00Z,2022-01-20T22:56:00Z,OWNER,"> https://sphinx-version-warning.readthedocs.io/ looks like it can show a banner for ""You are looking at v0.36 but you should be looking at 0.40"" but doesn't handle the case I need here which is ""you are looking at /latest/ but you should be looking at /stable/"".

Correction! That tool DOES support that, as can be seen in their example configuration for their own documentation:

https://github.com/humitos/sphinx-version-warning/blob/a82156c2ea08e5feab406514d0ccd9d48a345f48/docs/conf.py#L32-L38

```python
versionwarning_messages = {
    'latest': 'This is a custom message only for version ""latest"" of this documentation.',
}
versionwarning_admonition_type = 'tip'
versionwarning_banner_title = 'Tip'
versionwarning_body_selector = 'div[itemprop=""articleBody""]'
```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1109808154,Documentation should clarify /stable/ vs /latest/,
https://github.com/simonw/datasette/issues/1613#issuecomment-1021860694,https://api.github.com/repos/simonw/datasette/issues/1613,1021860694,IC_kwDOBm6k_c486FtW,9599,simonw,2022-01-26T04:57:53Z,2022-01-26T04:57:53Z,OWNER,"The existing flow where you can apply filters to a table and then click ""View and edit SQL"" to see the query is a good starting point.

Group by queries are both crucially important and difficult to assemble for beginners. Providing a way to see the query that was used by a facet (since facets are really just group-by-counts) would be very useful, which could come out of this:

- #1080","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1114628238,Improvements to help make Datasette a better tool for learning SQL,
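For readers following along, this is roughly the group-by-count query that a column facet boils down to, sketched with the built-in `sqlite3` module; the table and its rows are made up for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table attractions (name text, state text)")
conn.executemany(
    "insert into attractions values (?, ?)",
    [("Mystery Spot", "CA"), ("Winchester House", "CA"), ("Wall Drug", "SD")],
)
# A facet on the "state" column is effectively a group-by-count like this:
facet_sql = """
    select state as value, count(*) as count
    from attractions
    group by state
    order by count desc
"""
print(conn.execute(facet_sql).fetchall())  # [('CA', 2), ('SD', 1)]
```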
https://github.com/simonw/sqlite-utils/pull/385#issuecomment-1029285985,https://api.github.com/repos/simonw/sqlite-utils/issues/385,1029285985,IC_kwDOCGYnMM49Wahh,9599,simonw,2022-02-03T18:37:48Z,2022-02-03T18:37:48Z,OWNER,"`from sqlite_utils.utils import find_spatialite` is part of the documented API already:

https://sqlite-utils.datasette.io/en/3.22.1/python-api.html#finding-spatialite

To avoid needing to bump the major version number to 4 to indicate a backwards incompatible change, we should keep a `from .gis import find_spatialite` line at the top of `utils.py` such that any existing code with that documented import continues to work.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1102899312,Add new spatialite helper methods,
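A sketch of the compatibility shim described above, assuming the implementation moves to a `gis` module while `utils.py` keeps re-exporting the documented name:

```python
# sqlite_utils/utils.py (sketch)
# The implementation now lives in gis.py, but the documented import path
# `from sqlite_utils.utils import find_spatialite` keeps working.
from .gis import find_spatialite  # noqa: F401
```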
https://github.com/simonw/sqlite-utils/issues/399#issuecomment-1030740653,https://api.github.com/repos/simonw/sqlite-utils/issues/399,1030740653,IC_kwDOCGYnMM49b9qt,25778,eyeseast,2022-02-06T02:57:17Z,2022-02-06T02:57:17Z,CONTRIBUTOR,"I like the idea of having stock conversions you could import. I'd actually move them to a dedicated module (call it `sqlite_utils.conversions` or something), because it's different from other utilities. Maybe they even take configuration, or they're composable.

```python
from sqlite_utils.conversions import LongitudeLatitude

db[""places""].insert(
    {
        ""name"": ""London"",
        ""lng"": -0.118092,
        ""lat"": 51.509865,
    },
    conversions={""point"": LongitudeLatitude(""lng"", ""lat"")},
)
```

I would definitely use that for every CSV I get with lat/lng columns where I actually need GeoJSON.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1124731464,"Make it easier to insert geometries, with documentation and maybe code",
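A hypothetical sketch of what a `LongitudeLatitude` helper like the one suggested above could look like. This is not an existing `sqlite_utils.conversions` API, just an illustration of a small configurable callable that turns lng/lat keys into a GeoJSON point:

```python
import json

class LongitudeLatitude:
    """Pick two keys out of a row dict and emit a GeoJSON Point (hypothetical)."""

    def __init__(self, lng_key, lat_key):
        self.lng_key = lng_key
        self.lat_key = lat_key

    def __call__(self, row):
        return json.dumps(
            {
                "type": "Point",
                "coordinates": [row[self.lng_key], row[self.lat_key]],
            }
        )

point = LongitudeLatitude("lng", "lat")
print(point({"name": "London", "lng": -0.118092, "lat": 51.509865}))
```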
https://github.com/simonw/datasette/issues/236#issuecomment-1033772902,https://api.github.com/repos/simonw/datasette/issues/236,1033772902,IC_kwDOBm6k_c49nh9m,1376648,jordaneremieff,2022-02-09T13:40:52Z,2022-02-09T13:40:52Z,NONE,"Hi @simonw, 

I've received some inquiries over the last year or so about Datasette and how it might be supported by [Mangum](https://github.com/jordaneremieff/mangum). I maintain Mangum which is, as far as I know, the only project that provides support for ASGI applications in AWS Lambda.

If there is anything that I can help with here, please let me know because I think what Datasette provides to the community (even beyond OSS) is noble and worthy of special consideration.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317001500,datasette publish lambda plugin,
https://github.com/simonw/sqlite-utils/issues/412#issuecomment-1059650190,https://api.github.com/repos/simonw/sqlite-utils/issues/412,1059650190,IC_kwDOCGYnMM4_KPqO,9599,simonw,2022-03-05T02:04:43Z,2022-03-05T02:04:54Z,OWNER,"To be honest, I'm having second thoughts about this now mainly because the idiom for turning a generator of dicts into a DataFrame is SO simple:

```python
df = pd.DataFrame(db.query(""select * from articles""))
```
Given it's that simple, I'm questioning if there's any value to adding this to `sqlite-utils` at all. This likely becomes a documentation thing instead!","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1160182768,Optional Pandas integration,
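The idiom above, expanded into a self-contained example (assumes `pandas` and `sqlite-utils` are installed; the `articles` rows are made up):

```python
import pandas as pd
import sqlite_utils

db = sqlite_utils.Database(memory=True)
db["articles"].insert_all(
    [{"title": "One", "score": 3}, {"title": "Two", "score": 5}]
)
# db.query() yields dicts, which the DataFrame constructor accepts directly
df = pd.DataFrame(db.query("select * from articles"))
print(df)
```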
https://github.com/simonw/sqlite-utils/issues/412#issuecomment-1059652834,https://api.github.com/repos/simonw/sqlite-utils/issues/412,1059652834,IC_kwDOCGYnMM4_KQTi,596279,zaneselvans,2022-03-05T02:14:40Z,2022-03-05T02:14:40Z,NONE,"We do a lot of `df.to_sql()` to write into sqlite, mostly in [this module](https://github.com/catalyst-cooperative/pudl/blob/main/src/pudl/load.py#L25)","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1160182768,Optional Pandas integration,
https://github.com/simonw/datasette/issues/1384#issuecomment-1066222323,https://api.github.com/repos/simonw/datasette/issues/1384,1066222323,IC_kwDOBm6k_c4_jULz,2670795,brandonrobertz,2022-03-14T00:36:42Z,2022-03-14T00:36:42Z,CONTRIBUTOR,"> Ah, sorry, I didn't get what you were saying the first time. Using _metadata_local in that way makes total sense -- I agree, refreshing metadata each cell was seeming quite excessive. Now I'm on the same page! :)

All good. Report back any issues you find with this stuff. Metadata/dynamic config hasn't been tested widely outside of what I've done AFAIK. If you find a strong use case for async meta, it's going to be better to know sooner rather than later!","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",930807135,Plugin hook for dynamic metadata,
https://github.com/simonw/datasette/issues/526#issuecomment-1074019047,https://api.github.com/repos/simonw/datasette/issues/526,1074019047,IC_kwDOBm6k_c5ABDrn,9599,simonw,2022-03-21T15:09:56Z,2022-03-21T15:09:56Z,OWNER,I should research how much overhead creating a new connection costs - it may be that an easy way to solve this is to create a dedicated connection for the query and then close that connection at the end.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",459882902,Stream all results for arbitrary SQL and canned queries,
https://github.com/simonw/datasette/issues/1688#issuecomment-1079582485,https://api.github.com/repos/simonw/datasette/issues/1688,1079582485,IC_kwDOBm6k_c5AWR8V,9599,simonw,2022-03-26T03:15:34Z,2022-03-26T03:15:34Z,OWNER,"Yup, you're right in what you figured out here: stand-alone plugins can't currently package static assets other than using the static folder.

The `datasette-plugin` cookiecutter template should make creating a Python package pretty easy though: https://github.com/simonw/datasette-plugin

You can run that yourself, or you can run it using this GitHub template repository: https://github.com/simonw/datasette-plugin-template-repository 

","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1181432624,[plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins?,
https://github.com/simonw/datasette/issues/1692#issuecomment-1082663746,https://api.github.com/repos/simonw/datasette/issues/1692,1082663746,IC_kwDOBm6k_c5AiCNC,9599,simonw,2022-03-30T06:14:39Z,2022-03-30T06:14:51Z,OWNER,"I like your design, though I think it should be `""nomodule"": True` for consistency with the other options.

I think `""async"": True` is worth supporting too.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1182227211,[plugins][feature request]: Support additional script tag attributes when loading custom JS,
https://github.com/simonw/sqlite-utils/issues/421#issuecomment-1098548931,https://api.github.com/repos/simonw/sqlite-utils/issues/421,1098548931,IC_kwDOCGYnMM5BeobD,9599,simonw,2022-04-13T22:41:59Z,2022-04-13T22:41:59Z,OWNER,"I'm going to close this ticket since it looks like this is a bug in the way the Dockerfile builds Python, but I'm going to ship a fix for that issue I found so the `LD_PRELOAD` workaround above should work OK with the next release of `sqlite-utils`. Thanks for the detailed bug report!","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1180427792,"""Error: near ""("": syntax error"" when using sqlite-utils indexes CLI",
https://github.com/dogsheep/github-to-sqlite/issues/72#issuecomment-1105474232,https://api.github.com/repos/dogsheep/github-to-sqlite/issues/72,1105474232,IC_kwDODFdgUs5B5DK4,9599,simonw,2022-04-21T17:02:15Z,2022-04-21T17:02:15Z,MEMBER,"That's interesting - yeah it looks like the number of pages can be derived from the `Link` header, which is enough information to show a progress bar, probably using Click just to avoid adding another dependency.

https://docs.github.com/en/rest/guides/traversing-with-pagination","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1211283427,feature: display progress bar when downloading multi-page responses,
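A rough sketch of the idea, not `github-to-sqlite`'s actual implementation: read the `rel="last"` page number out of the `Link` header, then drive a Click progress bar while paginating (assumes `requests` and `click` are available):

```python
import re
import click
import requests

def fetch_all_pages(url):
    first = requests.get(url, params={"per_page": 100})
    link_header = first.headers.get("link", "")
    # GitHub's Link header includes e.g. <...&page=34>; rel="last"
    match = re.search(r'[?&]page=(\d+)>; rel="last"', link_header)
    last_page = int(match.group(1)) if match else 1
    items = list(first.json())
    with click.progressbar(range(2, last_page + 1), label="Fetching pages") as pages:
        for page in pages:
            response = requests.get(url, params={"per_page": 100, "page": page})
            items.extend(response.json())
    return items
```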
https://github.com/simonw/datasette/issues/1720#issuecomment-1109174715,https://api.github.com/repos/simonw/datasette/issues/1720,1109174715,IC_kwDOBm6k_c5CHKm7,9599,simonw,2022-04-26T00:40:13Z,2022-04-26T00:43:33Z,OWNER,"Some of the things I'd like to use `?_extra=` for, that may or not make sense as plugins:

- Performance breakdown information, maybe including explain output for a query/table
- Information about the tables that were consulted in a query - imagine pulling in additional table metadata
- Statistical aggregates against the full set of results. This may well be a Datasette core feature at some point in the future, but being able to provide it early as a plugin would be really cool.
- For tables, what are the other tables they can join against?
- Suggested facets
- Facet results themselves
- New custom facets I haven't thought of - though the `register_facet_classes` hook covers that already
- Table schema
- Table metadata
- Analytics - how many times has this table been queried? Would be a plugin thing
- For geospatial data, how about a GeoJSON polygon that represents the bounding box for all returned results? Effectively this is an extra aggregation.

Looking at https://github-to-sqlite.dogsheep.net/github/commits.json?_labels=on&_shape=objects for inspiration.

I think there's a separate potential mechanism in the future that lets you add custom columns to a table. This would affect `.csv` and the HTML presentation too, which makes it a different concept from the `?_extra=` hook that affects the JSON export (and the context that is fed to the HTML templates).","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1215174094,Design plugin hook for extras,
https://github.com/simonw/sqlite-utils/issues/441#issuecomment-1154373361,https://api.github.com/repos/simonw/sqlite-utils/issues/441,1154373361,IC_kwDOCGYnMM5Ezlbx,9599,simonw,2022-06-13T20:01:25Z,2022-06-13T20:01:25Z,OWNER,"Yeah, at the moment the best way to do this is with `search_sql()`, but you're right it really isn't very intuitive.

Here's how I would do this, using a CTE trick to combine the queries:
```python
search_sql = db[""articles""].search_sql(columns=[""title"", ""author""])
sql = f""""""
with search_results as ({search_sql})
select * from search_results where owner = :owner
""""""
results = db.query(sql, {""query"": ""my search query"", ""owner"": ""my owner""})
```
I'm not sure if `sqlite-utils` should ever evolve to provide a better way of doing this kind of thing to be honest - if it did, it would turn into more of an ORM. Something like [PeeWee](http://docs.peewee-orm.com/en/latest/) may be a better option here.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1257724585,Combining `rows_where()` and `search()` to limit which rows are searched,
https://github.com/simonw/datasette/pull/1759#issuecomment-1160717735,https://api.github.com/repos/simonw/datasette/issues/1759,1160717735,IC_kwDOBm6k_c5FLyWn,9599,simonw,2022-06-20T18:04:41Z,2022-06-20T18:04:41Z,OWNER,I don't think this change needs any changes to the documentation: https://docs.datasette.io/en/stable/custom_templates.html#custom-templates,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1275523220,Extract facet portions of table.html out into included templates,
https://github.com/simonw/sqlite-utils/issues/297#issuecomment-1160991031,https://api.github.com/repos/simonw/sqlite-utils/issues/297,1160991031,IC_kwDOCGYnMM5FM1E3,9599,simonw,2022-06-21T00:35:20Z,2022-06-21T00:35:20Z,OWNER,Relevant TIL: https://til.simonwillison.net/sqlite/one-line-csv-operations,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",944846776,Option for importing CSV data using the SQLite .import mechanism,
https://github.com/simonw/sqlite-utils/issues/453#issuecomment-1185974145,https://api.github.com/repos/simonw/sqlite-utils/issues/453,1185974145,IC_kwDOCGYnMM5GsIeB,9599,simonw,2022-07-15T21:52:18Z,2022-07-15T21:52:18Z,OWNER,"I should warn you that this isn't a supported API - I reserve the right to change how it works between releases without a major version bump, because it's not part of the documented API surface.

You'll be fine if you pin to exact versions of the library though!

You may find this recently-documented function useful though: https://sqlite-utils.datasette.io/en/latest/python-api.html#reading-rows-from-a-file

See:
- #443

I'm going to close this issue for the moment, but if anyone wants to submit a PR that cleans this up I'll happily review it.

","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1303169663,'unclosed file' warning when using insert_upsert_implementation from Python,
https://github.com/simonw/datasette/pull/1685#issuecomment-1186657003,https://api.github.com/repos/simonw/datasette/issues/1685,1186657003,IC_kwDOBm6k_c5GuvLr,9599,simonw,2022-07-18T01:06:58Z,2022-07-18T01:06:58Z,OWNER,@dependabot rebase,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1180778860,"Update jinja2 requirement from <3.1.0,>=2.10.3 to >=2.10.3,<3.2.0",
https://github.com/simonw/datasette/issues/1779#issuecomment-1214416491,https://api.github.com/repos/simonw/datasette/issues/1779,1214416491,IC_kwDOBm6k_c5IYoZr,9599,simonw,2022-08-14T17:07:34Z,2022-08-14T17:07:34Z,OWNER,"Tested that with:

    datasette publish cloudrun fixtures.db --service issue-1779 --min-instances 2 --max-instances 4


","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1334628400,google cloudrun updated their limits on maxscale based on memory and cpu count,
https://github.com/simonw/sqlite-utils/pull/463#issuecomment-1218610320,https://api.github.com/repos/simonw/sqlite-utils/issues/463,1218610320,IC_kwDOCGYnMM5IooSQ,9599,simonw,2022-08-17T23:11:07Z,2022-08-17T23:11:07Z,OWNER,Thanks!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1334416486,Use Read the Docs action v1,
https://github.com/dogsheep/pocket-to-sqlite/issues/10#issuecomment-1221623052,https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/10,1221623052,IC_kwDODLZ_YM5I0H0M,9599,simonw,2022-08-21T21:20:33Z,2022-08-21T21:20:33Z,MEMBER,"That was clearly the intention from the description of this issue:
- #4","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1246826792,"When running `auth` command, don't overwrite an existing auth.json file",
https://github.com/simonw/datasette/issues/1775#issuecomment-1233680261,https://api.github.com/repos/simonw/datasette/issues/1775,1233680261,IC_kwDOBm6k_c5JiHeF,9599,simonw,2022-09-01T03:05:57Z,2022-09-01T03:05:57Z,OWNER,"OK, I'm convinced that it's time to start figuring this out.

I've done a little bit of this with Django in the past, but Datasette isn't built on Django.

It looks to me like the key library for implementing this is Babel: https://babel.pocoo.org/en/latest/

It's been around since 2007 and is very widely used: https://github.com/python-babel/babel/network/dependents?package_id=UGFja2FnZS01MDM0NTU3NQ%3D%3D

Also found these hints on getting it to work with Jinja: https://stackoverflow.com/questions/12046998/babel-doesnt-recognize-jinja2-extraction-method-for-language-support","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1323346408,i18n support,
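A hedged sketch of the Babel-plus-Jinja wiring being considered here, as an illustration of the approach rather than anything Datasette actually ships. It assumes compiled `.mo` files live under `./translations/<locale>/LC_MESSAGES/` and falls back to the untranslated string if they don't:

```python
from babel.support import Translations
from jinja2 import Environment

# Enable Jinja's i18n extension and install Babel-loaded translations
env = Environment(extensions=["jinja2.ext.i18n"])
translations = Translations.load("translations", locales=["fr"])
env.install_gettext_translations(translations)

template = env.from_string("{% trans %}Hello, world{% endtrans %}")
print(template.render())
```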
https://github.com/simonw/sqlite-utils/issues/297#issuecomment-1246977989,https://api.github.com/repos/simonw/sqlite-utils/issues/297,1246977989,IC_kwDOCGYnMM5KU1_F,9599,simonw,2022-09-14T15:57:09Z,2022-09-14T15:57:09Z,OWNER,"Should consider how this could best handle creating columns that are integer and float as opposed to just text.

https://discord.com/channels/823971286308356157/823971286941302908/1019630014544748584 is a relevant discussion on Discord. Even if you create the schema in advance with the correct column types, this import mechanism can put empty strings in blank float/integer columns when ideally you would want to have nulls.

Related feature idea for `sqlite-utils transform`:
- #488

Not sure how best to handle this for `sqlite3 .import` imports.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",944846776,Option for importing CSV data using the SQLite .import mechanism,
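To make the empty-string problem concrete, here is a small illustration (invented table) of what a text-based import leaves behind and the clean-up it then needs:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table imported (name text, age integer)")
# A text-based import delivers every value as a string, so a blank age
# arrives as '' rather than NULL
conn.executemany("insert into imported values (?, ?)", [("Ann", "34"), ("Bob", "")])
conn.execute("update imported set age = null where age = ''")
print(conn.execute("select name, age, typeof(age) from imported").fetchall())
# [('Ann', 34, 'integer'), ('Bob', None, 'null')]
```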
https://github.com/simonw/datasette/issues/1814#issuecomment-1251677554,https://api.github.com/repos/simonw/datasette/issues/1814,1251677554,IC_kwDOBm6k_c5KmxVy,9599,simonw,2022-09-19T23:35:06Z,2022-09-19T23:35:06Z,OWNER,It might have been useful for Datasette to show an error when started against a `settings.json` file that contains an invalid setting though.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1378495690,Static files not served,
https://github.com/simonw/datasette/pull/1838#issuecomment-1271009214,https://api.github.com/repos/simonw/datasette/issues/1838,1271009214,IC_kwDOBm6k_c5Lwg--,9599,simonw,2022-10-07T02:01:07Z,2022-10-07T02:01:07Z,OWNER,"The argument that has always convinced me NOT to use `target=""_blank""` (even for links like this one) is that it breaks browser expectations.

If you click a link with `target=""_blank""` on it you get a new browser window... with a disabled back button. You have to then know to close that browser window in order to return to the previous page - as opposed to hitting the ""back"" button like usual.

You'll note that Datasette doesn't use `target=""_blank""` even on URLs presented in database tables - like these ones: https://latest.datasette.io/fixtures/roadside_attractions

So I'm very firmly in the anti-target-blank camp!

This is the kind of change which I'd suggest implementing as a plugin. `datasette-external-links-new-windows` could run a bit of JavaScript on every page that looks for `<a>` elements that link to off-domain pages and adds `target=""_blank""` to them via the DOM.

That way people who like `target=""_blank""` can have it!
","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1400494162,Open Datasette link in new tab,
https://github.com/simonw/datasette/issues/1860#issuecomment-1292659986,https://api.github.com/repos/simonw/datasette/issues/1860,1292659986,IC_kwDOBm6k_c5NDG0S,9599,simonw,2022-10-26T21:14:26Z,2022-10-26T21:15:22Z,OWNER,"Yeah we should fix this.

https://www.sqlite.org/lang_comment.html - SQLite also supports `-- style` comments.

I like how explicit the documentation is here:

> SQL comments begin with two consecutive ""-"" characters (ASCII 0x2d) and extend up to and including the next newline character (ASCII 0x0a) or until the end of input, whichever comes first.
> 
> C-style comments begin with ""/*"" and extend up to and including the next ""*/"" character pair or until the end of input, whichever comes first. C-style comments can span multiple lines. ","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1424378012,SQL query field can't begin by a comment,
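One possible way to tolerate leading comments when checking whether a query is a plain `SELECT`, sketched as a general technique rather than the pattern actually adopted in Datasette:

```python
import re

# Strip any run of leading whitespace, "--" line comments and "/* ... */"
# block comments before inspecting the statement itself
_LEADING_COMMENTS = re.compile(r"^(\s*(--[^\n]*(\n|$)|/\*.*?\*/))*\s*", re.DOTALL)

def is_select(sql):
    stripped = _LEADING_COMMENTS.sub("", sql, count=1)
    return stripped.lower().startswith(("select", "with"))

print(is_select("-- top attractions\nselect * from roadside_attractions"))  # True
print(is_select("/* multi\nline */ select 1"))  # True
print(is_select("update t set x = 1"))  # False
```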
https://github.com/simonw/datasette/issues/1860#issuecomment-1293928738,https://api.github.com/repos/simonw/datasette/issues/1860,1293928738,IC_kwDOBm6k_c5NH8ki,9599,simonw,2022-10-27T18:46:31Z,2022-10-27T18:46:31Z,OWNER,I think mine has a better pattern for handling `/* ... anything in here that isn't */ ... */`,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1424378012,SQL query field can't begin by a comment,
https://github.com/simonw/datasette/pull/1839#issuecomment-1294034011,https://api.github.com/repos/simonw/datasette/issues/1839,1294034011,IC_kwDOBm6k_c5NIWRb,9599,simonw,2022-10-27T20:34:37Z,2022-10-27T20:34:37Z,OWNER,@dependabot rebase,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1401155623,Bump black from 22.8.0 to 22.10.0,
https://github.com/simonw/datasette/issues/1879#issuecomment-1299102108,https://api.github.com/repos/simonw/datasette/issues/1879,1299102108,IC_kwDOBm6k_c5Nbrmc,9599,simonw,2022-11-01T20:30:54Z,2022-11-01T20:33:06Z,OWNER,One idea: add a `/-/debug` page (or `/-/tips` or `/-/checks`) which shows the incoming request's headers and could even detect if there's an `x-forwarded-host` header that isn't being repeated and show a tip on how to fix that.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1432037325,Make it easier to fix URL proxy problems,
https://github.com/simonw/datasette/issues/1880#issuecomment-1311271298,https://api.github.com/repos/simonw/datasette/issues/1880,1311271298,IC_kwDOBm6k_c5OKGmC,9599,simonw,2022-11-11T06:12:29Z,2022-11-11T06:12:29Z,OWNER,"I think you may have misunderstood this feature. This is talking about the `_internal` in-memory database, which maintains a set of tables that list the databases and tables that are attached to Datasette.

They're not a copy of the data itself - just a list of table names, column names and database names.

You can see what that database looks like by signing in as root - running `datasette --root` and clicking the link. Or you can see an example here:

- Click the button on https://latest.datasette.io/login-as-root
- Now visit https://latest.datasette.io/_internal

For the example instance that looks like this:

*(screenshot of the `_internal` database index page not included in this export)*

The two most interesting tables in there are these ones:

*(screenshots of two of the `_internal` tables not included in this export)*

As you can see, it's just the table schema itself and the columns that make up the tables. Even if you have hundreds of databases connected each with hundreds of tables this should still only add up to a few MB of RAM.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1433576351,Datasette with many and large databases > Memory use,
https://github.com/simonw/datasette/issues/1880#issuecomment-1311273063,https://api.github.com/repos/simonw/datasette/issues/1880,1311273063,IC_kwDOBm6k_c5OKHBn,9599,simonw,2022-11-11T06:15:28Z,2022-11-11T06:15:28Z,OWNER,"The `_internal` database is intended to help Datasette handle much larger attached databases. Right now Datasette attempts to show every database on the https://latest.datasette.io/ index page and every table on the https://latest.datasette.io/fixtures database index page - but these are not paginated. If you had a database containing 1,000 tables the database index page would get pretty slow.

So I want to be able to paginate (and search) those. But to paginate them it's useful to have them in a database table itself, since then I can paginate using SQL.

My plan for `_internal` is to use it to implement those advanced browsing features. I've not completed this work yet though. See this issue for more details on that:

- #417","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1433576351,Datasette with many and large databases > Memory use,
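A generic illustration of the ""paginate using SQL"" point above: limit/offset over a table of table names (the table and its contents here are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("create table catalog_tables (database_name text, table_name text)")
conn.executemany(
    "insert into catalog_tables values (?, ?)",
    [("fixtures", f"table_{i:04d}") for i in range(1000)],
)
page, page_size = 3, 100
rows = conn.execute(
    "select table_name from catalog_tables order by table_name limit ? offset ?",
    (page_size, (page - 1) * page_size),
).fetchall()
print(len(rows), rows[0])  # 100 ('table_0200',)
```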
https://github.com/simonw/datasette/issues/1871#issuecomment-1312821031,https://api.github.com/repos/simonw/datasette/issues/1871,1312821031,IC_kwDOBm6k_c5OQA8n,9599,simonw,2022-11-13T21:02:06Z,2022-11-13T21:03:11Z,OWNER,"Actually no, I'm going to add a class of `details-menu` to the other details elements that SHOULD be closed. That way custom templates using `<details>
` won't close in a surprising way.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1427293909,API explorer tool, https://github.com/simonw/datasette/pull/1893#issuecomment-1316340865,https://api.github.com/repos/simonw/datasette/issues/1893,1316340865,IC_kwDOBm6k_c5OdcSB,9599,simonw,2022-11-16T04:49:30Z,2022-11-16T04:49:43Z,OWNER,"> The main issue is that we don't pass the relevant table data down to QueryView. If you can come up with a static example JSON data structure example that does the right thing, I'm happy to refactor QueryView to make that available to the template - or even have a separate `fetch()` that grabs just the data needed for the autocomplete as a separate hit when the page loads (whichever has better performance implications). I'm working a fair amount in the view classes at the moment so adding this to that work would make sense. ","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1450363982,"Upgrade to CodeMirror 6, add SQL autocomplete", https://github.com/simonw/datasette/pull/1893#issuecomment-1317681193,https://api.github.com/repos/simonw/datasette/issues/1893,1317681193,IC_kwDOBm6k_c5Oijgp,95570,bgrins,2022-11-16T21:19:13Z,2022-11-16T21:19:13Z,CONTRIBUTOR,"Alright, added Cmd+Enter to submit (Ctrl+Enter on Windows as well bc of using Meta-Enter on codemirror). We can make that MacOS only by changing the combo to Cmd+Enter specifically but I think it's probably fine to have both.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1450363982,"Upgrade to CodeMirror 6, add SQL autocomplete", https://github.com/simonw/datasette/issues/1900#issuecomment-1319574972,https://api.github.com/repos/simonw/datasette/issues/1900,1319574972,IC_kwDOBm6k_c5Opx28,9599,simonw,2022-11-18T05:41:28Z,2022-11-18T05:41:28Z,OWNER,Oh this is with `datasette package`? That should work. Will investigate.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1452572348,datasette package --spatialite throws error during build, https://github.com/simonw/sqlite-utils/issues/510#issuecomment-1320394127,https://api.github.com/repos/simonw/sqlite-utils/issues/510,1320394127,IC_kwDOCGYnMM5Os52P,1176293,ar-jan,2022-11-18T18:37:51Z,2022-11-18T18:37:51Z,NONE,"I guess it is not incorrect when it says the version is `4`, though it is confusing. Maybe it doesn't even refer to FTS4/FTS5 versions, but something else? In any case, it's not related to sqlite-utils, but SQLite itself.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1434911255,Cannot enable FTS5 despite it being available, https://github.com/simonw/datasette/issues/1958#issuecomment-1352644267,https://api.github.com/repos/simonw/datasette/issues/1958,1352644267,IC_kwDOBm6k_c5Qn7ar,9599,simonw,2022-12-13T18:33:32Z,2022-12-13T18:33:32Z,OWNER,"When you run `--root` you need to follow the special link that gets output to the console: ``` % datasette --root http://127.0.0.1:8001/-/auth-token?token=036d8055cc8000e9667f21c1dd08722a9358c066463873ad9566d23d88765c52 INFO: Started server process [53934] INFO: Waiting for application startup. INFO: Application startup complete. 
``` That `/-/auth-token?...` link is the one that sets the cookie and lets you in.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1497909798,datasette --root running in Docker doesn't reliably show the magic URL, https://github.com/simonw/datasette/issues/1886#issuecomment-1356842576,https://api.github.com/repos/simonw/datasette/issues/1886,1356842576,IC_kwDOBm6k_c5Q38ZQ,18738650,stevecrawshaw,2022-12-18T17:34:20Z,2022-12-18T17:34:20Z,NONE,"A bit late to this, but I have made an app to publish air quality data in Bristol, UK. [air quality data in Bristol, UK.](https://brisaq-wfzqhmj43q-ew.a.run.app/) Next step to see if I can make a streamlit app based on this to produce some nice charts.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1447050738,"Call for birthday presents: if you're using Datasette, let us know how you're using it here", https://github.com/simonw/datasette/issues/1101#issuecomment-1399341761,https://api.github.com/repos/simonw/datasette/issues/1101,1399341761,IC_kwDOBm6k_c5TaELB,9599,simonw,2023-01-21T22:07:19Z,2023-01-21T22:07:19Z,OWNER,"Idea for supporting streaming with the `register_output_renderer` hook: ```python @hookimpl def register_output_renderer(datasette): return { ""extension"": ""test"", ""render"": render_demo, ""can_render"": can_render_demo, ""render_stream"": render_demo_stream, # This is new } ``` So there's a new `""render_stream""` key which can be returned, which if present means that the output renderer supports streaming. I'll play around with the design of that function signature in: - #1999 - #1062 ","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",749283032,register_output_renderer() should support streaming data, https://github.com/simonw/datasette/pull/1159#issuecomment-1399589414,https://api.github.com/repos/simonw/datasette/issues/1159,1399589414,IC_kwDOBm6k_c5TbAom,193185,cldellow,2023-01-22T19:48:41Z,2023-01-22T19:48:41Z,CONTRIBUTOR,"Hey @lovasoa, I hope you don't mind - I pulled this PR into [datasette-ui-extras](https://github.com/cldellow/datasette-ui-extras), a plugin I'm making that collects UI tweaks to Datasette. You can apply it to your own Datasette instance by running `datasette install datasette-ui-extras`","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",774332247,Improve the display of facets information, https://github.com/simonw/datasette/issues/2001#issuecomment-1403084856,https://api.github.com/repos/simonw/datasette/issues/2001,1403084856,IC_kwDOBm6k_c5ToWA4,193185,cldellow,2023-01-25T04:31:02Z,2023-01-25T04:31:02Z,CONTRIBUTOR,"Aha, it's user error on my part. 
Adding ``` sqlite3_db_config.argtypes = [ctypes.c_void_p, ctypes.c_int, ctypes.c_int, ctypes.c_int] ``` makes it work reliably both on the CLI and from datasette, and now I can reproduce the errors you mentioned in the issue description.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1553615704,Datasette is not compatible with SQLite's strict quoting compilation option, https://github.com/simonw/sqlite-utils/pull/203#issuecomment-1404070841,https://api.github.com/repos/simonw/sqlite-utils/issues/203,1404070841,IC_kwDOCGYnMM5TsGu5,536941,fgregg,2023-01-25T18:47:18Z,2023-01-25T18:47:18Z,CONTRIBUTOR,i'll adopt this PR to make the changes @simonw suggested https://github.com/simonw/sqlite-utils/pull/203#issuecomment-753567932,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",743384829,changes to allow for compound foreign keys, https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1419734229,https://api.github.com/repos/simonw/sqlite-utils/issues/524,1419734229,IC_kwDOCGYnMM5Un2zV,193185,cldellow,2023-02-06T20:53:28Z,2023-02-06T21:16:29Z,NONE,"I think it's not currently possible: sqlite-utils requires that it be one of `integer`, `text`, `float`, `blob` ([see code](https://github.com/simonw/sqlite-utils/blob/fc221f9b62ed8624b1d2098e564f525c84497969/sqlite_utils/cli.py#L2266)) IMO, this is a bit of friction and it would be nice if it was more permissive. SQLite permits developers to use any data type when creating a table. For example, this is a perfectly cromulent sqlite session that creates a table with columns of type `baz` and `bar`: ``` sqlite> create table foo(column1 baz, column2 bar); sqlite> .schema foo CREATE TABLE foo(column1 baz, column2 bar); sqlite> select * from pragma_table_info('foo'); cid name type notnull dflt_value pk ---------- ---------- ---------- ---------- ---------- ---------- 0 column1 baz 0 0 1 column2 bar 0 0 ``` The idea is that the application developer will know what meaning to ascribe to those types. For example, I'm working on a plugin to Datasette. Dates are tricky to handle. If you have some existing rows, you can look at the values in them to know how a user is serializing the dates -- as an ISO 8601 string? An RFC 3339 string? With millisecond precision? With timezone offset? But if you don't yet have any rows, you have to guess. If the column is of type `TEXT`, you don't even know that it's meant to hold a date! In this case, my plugin will look to see if the column is of type `DATE` or `DATETIME`, and assume a certain representation when writing. Perhaps there is an argument that sqlite-utils is trying to conform to SQLite's strict mode, and that is why it limits the choices. In strict mode, SQLite requires that the data type be one of `INT`, `INTEGER`, `REAL`, `TEXT`, `BLOB`, `ANY`. 
But that can't be the case -- sqlite-utils supports `FLOAT`, which is not one of the valid types in strict mode, and it rejects `INT`, `REAL` and `ANY`, which _are_ valid.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1572766460,Transformation type `--type DATETIME`, https://github.com/simonw/datasette/issues/2023#issuecomment-1425974877,https://api.github.com/repos/simonw/datasette/issues/2023,1425974877,IC_kwDOBm6k_c5U_qZd,193185,cldellow,2023-02-10T15:32:41Z,2023-02-10T15:32:41Z,CONTRIBUTOR,"I think this feature was removed in Datasette 0.61 and moved to a plugin. People who want hashed URLs can use the [datasette-hashed-urls](https://docs.datasette.io/en/stable/performance.html#performance-hashed-urls) plugin to achieve the same affect. It looks like you're trying to disable hashed urls, so I think you can just remove that config setting and things will work.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1579695809,Error: Invalid setting 'hash_urls' in settings.json in 0.64.1, https://github.com/simonw/datasette/issues/1258#issuecomment-1437671409,https://api.github.com/repos/simonw/datasette/issues/1258,1437671409,IC_kwDOBm6k_c5VsR_x,2670795,brandonrobertz,2023-02-20T23:39:58Z,2023-02-20T23:39:58Z,CONTRIBUTOR,"This is pretty annoying for FTS because sqlite throws an error instead of just doing something like returning all or no results. This makes users who are unfamiliar with SQL and Datasette think the canned query page is broken and is a frequent source of confusion. To anyone dealing with this: My solution is to modify the canned query so that it returns no results which cues people to fill in the blank parameters. So instead of `emails_fts match escape_fts(:search))` My canned queries now look like this: `emails_fts match escape_fts(iif(:search=="""", ""*"", :search))` There are no asterisks in my data so the result is always blank. Ultimately it would be nice to be able to handle this in the metadata. Either making some named parameters required or setting some default values.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",828858421,Allow canned query params to specify default values, https://github.com/simonw/sqlite-utils/issues/433#issuecomment-1444474487,https://api.github.com/repos/simonw/sqlite-utils/issues/433,1444474487,IC_kwDOCGYnMM5WGO53,167893,mcarpenter,2023-02-24T20:57:43Z,2023-02-24T22:22:18Z,CONTRIBUTOR,"I think I see what is happening here, although I haven't quite work out a fix yet. Usually: * `click.progressbar.render_progress()` renders the cursor invisible on each invocation (update of the bar) * When the progress bar goes out of scope, the `__exit()__` method is invoked, which calls `render_finish()` to make the cursor re-appear. (See terminal escape sequences `BEFORE_BAR` and `AFTER_BAR` in click). However the sqlite-utils `utils.file_progress` context manager wraps `click.progressbar` and yields an instance of a helper class: ``` python @contextlib.contextmanager def file_progress(file, silent=False, **kwargs): ... with click.progressbar(length=file_length, **kwargs) as bar: yield UpdateWrapper(file, bar.update) ``` The yielded `UpdateWrapper` goes out of scope quickly and `click.progressbar.__exit__()` is called. The cursor is made un-invisible. 
Hoewever `bar` is still live and so when the caller iterates on the yielded wrapper this invokes the bar's update method, calling `render_progress()`, each time printing the ""make cursor invisible"" escape code. The `progressbar.__exit__` function is not called again, so the cursor doesn't re-appear. ","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1239034903,CLI eats my cursor, https://github.com/simonw/datasette/pull/2014#issuecomment-1487998788,https://api.github.com/repos/simonw/datasette/issues/2014,1487998788,IC_kwDOBm6k_c5YsQ9E,9599,simonw,2023-03-29T06:08:23Z,2023-03-29T06:08:23Z,OWNER,@dependabot recreate,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1566081801,Bump black from 22.12.0 to 23.1.0, https://github.com/simonw/sqlite-utils/issues/530#issuecomment-1539015064,https://api.github.com/repos/simonw/sqlite-utils/issues/530,1539015064,IC_kwDOCGYnMM5bu4GY,9599,simonw,2023-05-08T20:35:07Z,2023-05-08T20:35:07Z,OWNER,"Wow, this is a neat feature I didn't know about. Looks like there are a bunch of options: - NO ACTION (default) - RESTRICT: application is prohibited from deleting a parent key when there exists one or more child keys mapped to it - SET NULL: when a parent key is deleted the child key columns of all rows in the child table that mapped to the parent key are set to contain SQL NULL values - SET DEFAULT: set a specific default - CASCADE: propagates the delete or update operation on the parent key to each dependent child key","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1595340692,"add ability to configure ""on delete"" and ""on update"" attributes of foreign keys:", https://github.com/simonw/sqlite-utils/pull/537#issuecomment-1539055393,https://api.github.com/repos/simonw/sqlite-utils/issues/537,1539055393,IC_kwDOCGYnMM5bvB8h,9599,simonw,2023-05-08T21:10:06Z,2023-05-08T21:10:06Z,OWNER,Thanks!,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1665200812,Support self-referencing FKs in `Table.create`, https://github.com/simonw/sqlite-utils/issues/514#issuecomment-1539100300,https://api.github.com/repos/simonw/sqlite-utils/issues/514,1539100300,IC_kwDOCGYnMM5bvM6M,9599,simonw,2023-05-08T21:50:51Z,2023-05-08T21:50:51Z,OWNER,Seeing as `sqlite-utils` doesn't currently provide mechanisms for adding `check` constraints like this I'm going to leave this - I'm happy with the fix I put in for the `not null` constraints.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1465194249,upsert of new row with check constraints fails, https://github.com/simonw/sqlite-utils/issues/525#issuecomment-1539108140,https://api.github.com/repos/simonw/sqlite-utils/issues/525,1539108140,IC_kwDOCGYnMM5bvO0s,9599,simonw,2023-05-08T21:59:41Z,2023-05-08T21:59:41Z,OWNER,That original example passes against `main` now.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1575131737,Repeated calls to `Table.convert()` fail, 
https://github.com/simonw/datasette/pull/2052#issuecomment-1548617257,https://api.github.com/repos/simonw/datasette/issues/2052,1548617257,IC_kwDOBm6k_c5cTgYp,193185,cldellow,2023-05-15T21:32:20Z,2023-05-15T21:32:20Z,CONTRIBUTOR,"> Were you picturing that the whole plugin config object could be returned as a promise, or that the individual hooks (like makeColumnActions or makeAboveTablePanelConfigs supported returning a promise of arrays instead only returning plain arrays? The latter - that you could return a promise of arrays, so it parallels the [""await me maybe"" pattern in Datasette](https://simonwillison.net/2020/Sep/2/await-me-maybe/), where you can return either a value, a callable or an awaitable. > I have a hunch that what you're describing might be achievable without adding Promises to the API with something Oops, I did a poor job explaining. Yes, this would work - but it requires me to continue to communicate the column names out of band (in order to fetch the facet data per-column before registering my plugin), vs being able to re-use them from the plugin implementation. This isn't that big of a deal - it'd be a nice ergonomic improvement, but nowhere near as a big of an improvement as having an officially sanctioned way to add stuff to the column menus in the first place. This could also be layered on in a future commit without breaking v1 users, too, so it's not at all urgent. > especially if those lines are encapsulated by a function we provide (maybe something that's available on the window provided by Datasette as an inline script tag Ah, this is maybe the the key point. Since it's all hosted inside Datasette, Datasette can provide some arbitrary sugar to make it easier to work with. My experience with async scripts in JS is that people sometimes don't understand the race conditions inherent to them. If they copy/paste from a tutorial, it does just work. But then they'll delete half the code, and by chance it still works on their machine/Datasette templates, and now someone's headed for an annoying debugging session -- maybe them, maybe someone else who tries to re-use their plugin. Again, a fairly minor thing, though.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1651082214,"feat: Javascript Plugin API (Custom panels, column menu items with JS actions)", https://github.com/simonw/sqlite-utils/issues/399#issuecomment-1548913065,https://api.github.com/repos/simonw/sqlite-utils/issues/399,1548913065,IC_kwDOCGYnMM5cUomp,433780,chrislkeller,2023-05-16T03:11:03Z,2023-05-16T03:11:52Z,NONE,"Using this thread and some [other resources](https://sqlite-utils.datasette.io/en/stable/cli.html#spatialite-helpers) I managed to cobble together a couple of sqlite-utils lines to add a geometry column for a table that already has a lat/lng column. 
``` # add a geometry column sqlite-utils add-geometry-column [db name] [table name] geometry --type POINT --srid 4326 # add a point for each row to geometry column sqlite-utils --load-extension=spatialite [db name] 'update [table name] SET Geometry=MakePoint(longitude, latitude, 4326);' ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1124731464,"Make it easier to insert geometries, with documentation and maybe code", https://github.com/simonw/datasette/pull/2077#issuecomment-1613290899,https://api.github.com/repos/simonw/datasette/issues/2077,1613290899,IC_kwDOBm6k_c5gKN2T,9599,simonw,2023-06-29T14:32:16Z,2023-06-29T14:32:16Z,OWNER,@dependabot recreate,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1719759468,Bump furo from 2023.3.27 to 2023.5.20, https://github.com/simonw/datasette/issues/2093#issuecomment-1613895188,https://api.github.com/repos/simonw/datasette/issues/2093,1613895188,IC_kwDOBm6k_c5gMhYU,15178711,asg017,2023-06-29T22:51:53Z,2023-06-29T22:51:53Z,CONTRIBUTOR,"I agree with not liking `metadata.json` stuff in a `datasette.*` config file. Editing description of a table/column in a file like `datasette.*` seems odd to me. Though since plugin configuration currently lives in `metadata.json`, I think it should be removed from there and placed in `datasette.*`, at least for top-level config like `datasette-auth-github`'s config. Keeping `metadata.json` strictly for documentation/licensing/column units makes sense to me, but anything plugin related should be in some config file, like `datasette.*`. And ya, supporting both `datasette.*` and CLI flags makes a lot of sense to me. Any `--setting` flag should override anything in `datasette.*` for easier debugging, with possibly a warning message so people don't get confused. Same with `--port` and a port defined in `datasette.*`","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1781530343,"Proposal: Combine settings, metadata, static, etc. into a single `datasette.toml` File", https://github.com/simonw/datasette/pull/2052#issuecomment-1616095810,https://api.github.com/repos/simonw/datasette/issues/2052,1616095810,IC_kwDOBm6k_c5gU6pC,15178711,asg017,2023-07-01T20:31:31Z,2023-07-01T20:31:31Z,CONTRIBUTOR,"> Just curious, is there a query that can be used to compile this programmatically, or did you identify these through memory? I just did a github search for `user:simonw ""def extra_js_urls(""` ! Though I'm sure other plugins made by people other than Simon also exist out there https://github.com/search?q=user%3Asimonw+%22def+extra_js_urls%28%22&type=code","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1651082214,"feat: Javascript Plugin API (Custom panels, column menu items with JS actions)",
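For context, the shape of the hook being searched for in that query: a Datasette plugin exposing extra JavaScript via `extra_js_urls` (the URL here is just an example):

```python
from datasette import hookimpl

@hookimpl
def extra_js_urls():
    return [
        {
            "url": "https://example.com/my-plugin.js",
            "module": True,
        }
    ]
```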