html_url,issue_url,id,node_id,user,user_label,created_at,updated_at,author_association,body,reactions,issue,issue_label,performed_via_github_app https://github.com/simonw/datasette/issues/1#issuecomment-338523957,https://api.github.com/repos/simonw/datasette/issues/1,338523957,MDEyOklzc3VlQ29tbWVudDMzODUyMzk1Nw==,9599,simonw,2017-10-23T01:09:05Z,2017-10-24T02:42:12Z,OWNER,"I also need to solve for weird primary keys. If it’s a single integer or a single char field that’s easy. But what if it is a compound key with more than one chat field? What delimiter can I use that will definitely be safe? Let’s say I use hyphen. Now I need to find a durable encoding for any hyphens that might exist in the key fields themselves. How about I use URLencoding for every non-alpha-numeric character? That will turn hyphens into (I think) %2D. It should also solve for unicode characters, but it means the vast majority of keys (integers) will display neatly, including a compound key of eg 5678-345 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267513424,Addressable pages for every row in a table, https://github.com/simonw/datasette/issues/1#issuecomment-338524454,https://api.github.com/repos/simonw/datasette/issues/1,338524454,MDEyOklzc3VlQ29tbWVudDMzODUyNDQ1NA==,9599,simonw,2017-10-23T01:15:24Z,2017-10-23T01:15:24Z,OWNER,Table rendering logic needs to detect the primary key field and turn it into a hyperlink. If there is a compound primary key it should add an extra column at the start of the table which displays the compound key as a link,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267513424,Addressable pages for every row in a table, https://github.com/simonw/datasette/issues/1#issuecomment-338857568,https://api.github.com/repos/simonw/datasette/issues/1,338857568,MDEyOklzc3VlQ29tbWVudDMzODg1NzU2OA==,9599,simonw,2017-10-24T02:57:12Z,2017-10-24T02:57:12Z,OWNER,"I can find the primary keys using: PRAGMA table_info(myTable) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267513424,Addressable pages for every row in a table, https://github.com/simonw/datasette/issues/1#issuecomment-338861511,https://api.github.com/repos/simonw/datasette/issues/1,338861511,MDEyOklzc3VlQ29tbWVudDMzODg2MTUxMQ==,9599,simonw,2017-10-24T03:24:17Z,2017-10-24T03:24:17Z,OWNER,"Some tables won't have primary keys, in which case I won't generate pages for individual records.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267513424,Addressable pages for every row in a table, https://github.com/simonw/datasette/issues/1#issuecomment-338872286,https://api.github.com/repos/simonw/datasette/issues/1,338872286,MDEyOklzc3VlQ29tbWVudDMzODg3MjI4Ng==,9599,simonw,2017-10-24T04:46:06Z,2017-10-24T04:46:06Z,OWNER,"I'm going to use `,` as the separator between elements of a compound primary key. If those elements themselves include a comma I will use `%2C` in its place.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267513424,Addressable pages for every row in a table, https://github.com/simonw/datasette/issues/1#issuecomment-338882207,https://api.github.com/repos/simonw/datasette/issues/1,338882207,MDEyOklzc3VlQ29tbWVudDMzODg4MjIwNw==,9599,simonw,2017-10-24T05:56:04Z,2017-10-24T05:56:04Z,OWNER,Next step: generate links to these.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267513424,Addressable pages for every row in a table, https://github.com/simonw/datasette/issues/3#issuecomment-338526148,https://api.github.com/repos/simonw/datasette/issues/3,338526148,MDEyOklzc3VlQ29tbWVudDMzODUyNjE0OA==,9599,simonw,2017-10-23T01:35:17Z,2017-10-23T01:35:17Z,OWNER,https://github.com/ahupp/python-magic/blob/master/README.md,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515678,"Make individual column valuables addressable, with smart content types", https://github.com/simonw/datasette/issues/4#issuecomment-338530389,https://api.github.com/repos/simonw/datasette/issues/4,338530389,MDEyOklzc3VlQ29tbWVudDMzODUzMDM4OQ==,9599,simonw,2017-10-23T02:15:41Z,2017-10-23T02:15:41Z,OWNER,"This means I need a good solution for these compile time options while running in development mode ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515836,Make URLs immutable, https://github.com/simonw/datasette/issues/4#issuecomment-338530480,https://api.github.com/repos/simonw/datasette/issues/4,338530480,MDEyOklzc3VlQ29tbWVudDMzODUzMDQ4MA==,9599,simonw,2017-10-23T02:16:33Z,2017-10-23T02:16:33Z,OWNER," How about when the service starts up it checks for a compile.json file and, if it is missing, creates it using the same code we run at compile time normally ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515836,Make URLs immutable, https://github.com/simonw/datasette/issues/4#issuecomment-338531827,https://api.github.com/repos/simonw/datasette/issues/4,338531827,MDEyOklzc3VlQ29tbWVudDMzODUzMTgyNw==,9599,simonw,2017-10-23T02:28:31Z,2017-10-23T02:29:05Z,OWNER,"Many of the applications I want to implement with this would benefit from having permanent real URLs. So let’s have both. The sha1 urls will serve far future cache headers (and an etag derived from their path). The non sha1 URLs will serve 302 uncached redirects to the sha1 locations. We will have a setting that lets people opt out of this behavior.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515836,Make URLs immutable, https://github.com/simonw/datasette/issues/4#issuecomment-338789734,https://api.github.com/repos/simonw/datasette/issues/4,338789734,MDEyOklzc3VlQ29tbWVudDMzODc4OTczNA==,9599,simonw,2017-10-23T20:40:25Z,2017-10-23T21:10:19Z,OWNER,"URL design: /database/table.json - redirects to /database-6753f4a/table.json So we always redirect to the version with the truncated hash in the URL. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515836,Make URLs immutable, https://github.com/simonw/datasette/issues/4#issuecomment-338797522,https://api.github.com/repos/simonw/datasette/issues/4,338797522,MDEyOklzc3VlQ29tbWVudDMzODc5NzUyMg==,9599,simonw,2017-10-23T21:09:33Z,2017-10-23T21:09:33Z,OWNER,"https://stackoverflow.com/a/18134919/6083 is a good answer about how many characters of the hash are needed to be unique. I say we default to 7 characters, like git does - but allow extras to be configured.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515836,Make URLs immutable, https://github.com/simonw/datasette/issues/4#issuecomment-338799438,https://api.github.com/repos/simonw/datasette/issues/4,338799438,MDEyOklzc3VlQ29tbWVudDMzODc5OTQzOA==,9599,simonw,2017-10-23T21:17:25Z,2017-10-23T21:17:25Z,OWNER,Can I take advantage of HTTP/2 so even if you get redirected I start serving you the correct resource straight away?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515836,Make URLs immutable, https://github.com/simonw/datasette/issues/4#issuecomment-338804173,https://api.github.com/repos/simonw/datasette/issues/4,338804173,MDEyOklzc3VlQ29tbWVudDMzODgwNDE3Mw==,9599,simonw,2017-10-23T21:36:37Z,2017-10-23T21:36:37Z,OWNER,"Looks like the easiest way to implement HTTP/2 server push today is to run behind Cloudflare and use this: Link: ; rel=preload; as=script https://blog.cloudflare.com/announcing-support-for-http-2-server-push-2/ Here's the W3C draft: https://w3c.github.io/preload/ From https://w3c.github.io/preload/#as-attribute it looks like I should use `as=fetch` if the content is intended for consumption by fetch() or XMLHTTPRequest. Unclear if I should throw `as=fetch crossorigin` in there. Need to experiment on that. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515836,Make URLs immutable, https://github.com/simonw/datasette/issues/4#issuecomment-338806718,https://api.github.com/repos/simonw/datasette/issues/4,338806718,MDEyOklzc3VlQ29tbWVudDMzODgwNjcxOA==,9599,simonw,2017-10-23T21:47:53Z,2017-10-23T21:47:53Z,OWNER,"Here's what the homepage of cloudflare.com does (with newlines added within the link header for clarity): $ curl -i 'https://www.cloudflare.com/' HTTP/1.1 200 OK Date: Mon, 23 Oct 2017 21:45:58 GMT Content-Type: text/html; charset=utf-8 Transfer-Encoding: chunked Connection: keep-alive link: ; rel=preload; as=style, ; rel=preload; as=style, ; rel=preload, ; rel=preload, ; rel=preload; as=video, ; rel=preload; as=video, ; rel=preload; as=video, ; rel=preload; as=video, ; rel=preload; as=video, ; rel=preload; as=image The original header looked like this: link: ; rel=preload; as=style, ; rel=preload; as=style, ; rel=preload, ; rel=preload, ; rel=preload; as=video, ; rel=preload; as=video, ; rel=preload; as=video, ; rel=preload; as=video, ; rel=preload; as=video, ; rel=preload; as=image ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515836,Make URLs immutable, https://github.com/simonw/datasette/issues/5#issuecomment-338524857,https://api.github.com/repos/simonw/datasette/issues/5,338524857,MDEyOklzc3VlQ29tbWVudDMzODUyNDg1Nw==,9599,simonw,2017-10-23T01:20:30Z,2017-10-23T01:20:30Z,OWNER,"https://stackoverflow.com/a/14468878/6083 Looks like I should order by compound primary key and implement cursor-based pagination.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267516066,Implement sensible query pagination, https://github.com/simonw/datasette/issues/5#issuecomment-339027711,https://api.github.com/repos/simonw/datasette/issues/5,339027711,MDEyOklzc3VlQ29tbWVudDMzOTAyNzcxMQ==,9599,simonw,2017-10-24T15:21:30Z,2017-10-24T15:21:30Z,OWNER,I have code to detect primary keys on tables... but what should I do for tables that lack primary keys? How should I even sort them?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267516066,Implement sensible query pagination, https://github.com/simonw/datasette/issues/5#issuecomment-339028979,https://api.github.com/repos/simonw/datasette/issues/5,339028979,MDEyOklzc3VlQ29tbWVudDMzOTAyODk3OQ==,9599,simonw,2017-10-24T15:25:08Z,2017-10-24T15:25:08Z,OWNER,"Looks like I can use the SQLite specific “rowid” in that case. It isn’t guaranteed to stay consistent across a VACUUM but that’s ok because we are immutable anyway. https://www.sqlite.org/lang_createtable.html#rowid","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267516066,Implement sensible query pagination, https://github.com/simonw/datasette/issues/7#issuecomment-338853083,https://api.github.com/repos/simonw/datasette/issues/7,338853083,MDEyOklzc3VlQ29tbWVudDMzODg1MzA4Mw==,9599,simonw,2017-10-24T02:27:25Z,2017-10-24T02:27:25Z,OWNER,Fixed in 9d219140694551453bfa528e0624919eb065f9d6,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267516650,Framework where by every page is JSON plus a template, https://github.com/simonw/datasette/issues/8#issuecomment-338697223,https://api.github.com/repos/simonw/datasette/issues/8,338697223,MDEyOklzc3VlQ29tbWVudDMzODY5NzIyMw==,9599,simonw,2017-10-23T15:28:11Z,2017-10-23T15:28:11Z,OWNER,"Now returning this: { ""error"": ""attempt to write a readonly database"", ""ok"": false } ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267517314,Attempting an INSERT or UPDATE should return a sane error message, https://github.com/simonw/datasette/issues/9#issuecomment-338863155,https://api.github.com/repos/simonw/datasette/issues/9,338863155,MDEyOklzc3VlQ29tbWVudDMzODg2MzE1NQ==,9599,simonw,2017-10-24T03:36:58Z,2017-10-24T03:36:58Z,OWNER,I’m going to use py.test and start with all tests in a single tests.py module,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267517348,Initial test suite, https://github.com/simonw/datasette/issues/9#issuecomment-338882110,https://api.github.com/repos/simonw/datasette/issues/9,338882110,MDEyOklzc3VlQ29tbWVudDMzODg4MjExMA==,9599,simonw,2017-10-24T05:55:33Z,2017-10-24T05:55:33Z,OWNER,"Well, I've started it at least.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267517348,Initial test suite, https://github.com/simonw/datasette/issues/10#issuecomment-341938424,https://api.github.com/repos/simonw/datasette/issues/10,341938424,MDEyOklzc3VlQ29tbWVudDM0MTkzODQyNA==,9599,simonw,2017-11-04T23:48:57Z,2017-11-04T23:48:57Z,OWNER,Done: https://github.com/simonw/stateless-datasets/commit/edaa10587e60946e0c1935333f6b79553db33798,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267517381,Set up Travis, https://github.com/simonw/datasette/issues/11#issuecomment-338530704,https://api.github.com/repos/simonw/datasette/issues/11,338530704,MDEyOklzc3VlQ29tbWVudDMzODUzMDcwNA==,9599,simonw,2017-10-23T02:18:36Z,2017-10-23T02:18:36Z,OWNER,Needed by https://github.com/simonw/stateless-datasets/issues/4#issuecomment-338530389,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267522549,Code that generates compile-time properties about the database , https://github.com/simonw/datasette/issues/12#issuecomment-348245757,https://api.github.com/repos/simonw/datasette/issues/12,348245757,MDEyOklzc3VlQ29tbWVudDM0ODI0NTc1Nw==,9599,simonw,2017-11-30T16:39:45Z,2017-11-30T16:39:45Z,OWNER,"It is now possible to over-ride templates on a per-database / per-row or per- table basis. When you access e.g. `/mydatabase/mytable` Datasette will look for the following: - table-mydatabase-mytable.html - table.html If you provided a `--template-dir` argument to datasette serve it will look in that directory first. The lookup rules are as follows: Index page (/): index.html Database page (/mydatabase): database-mydatabase.html database.html Table page (/mydatabase/mytable): table-mydatabase-mytable.html table.html Row page (/mydatabase/mytable/id): row-mydatabase-mytable.html row.html If a table name has spaces or other unexpected characters in it, the template filename will follow the same rules as our custom `` CSS classes introduced in 8ab3a16 - for example, a table called ""Food Trucks"" will attempt to load the following templates: table-mydatabase-Food-Trucks-399138.html table.html It is possible to extend the default templates using Jinja template inheritance. If you want to customize EVERY row template with some additional content you can do so by creating a `row.html` template like this: {% extends ""default:row.html"" %} {% block content %}

EXTRA HTML AT THE TOP OF THE CONTENT BLOCK

This line renders the original block:

{{ super() }} {% endblock %} ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267523511,Make it so you can override templates, https://github.com/simonw/datasette/issues/13#issuecomment-344462608,https://api.github.com/repos/simonw/datasette/issues/13,344462608,MDEyOklzc3VlQ29tbWVudDM0NDQ2MjYwOA==,9599,simonw,2017-11-15T02:04:51Z,2017-11-15T02:04:51Z,OWNER,"Fixed in https://github.com/simonw/datasette/commit/8252daa4c14d73b4b69e3f2db4576bb39d73c070 - thanks, @tomdyson!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267542338,Add a syntax highlighting SQL editor, https://github.com/simonw/datasette/issues/14#issuecomment-343675165,https://api.github.com/repos/simonw/datasette/issues/14,343675165,MDEyOklzc3VlQ29tbWVudDM0MzY3NTE2NQ==,9599,simonw,2017-11-11T16:07:10Z,2017-11-11T16:07:10Z,OWNER,The plugin system can also allow alternative providers for the `publish` command - e.g. maybe hook up hyper.sh as an option for publishing containers.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-344438724,https://api.github.com/repos/simonw/datasette/issues/14,344438724,MDEyOklzc3VlQ29tbWVudDM0NDQzODcyNA==,9599,simonw,2017-11-14T23:47:54Z,2017-11-14T23:47:54Z,OWNER,"Plugins should be able to interact with the build step. This would give plugins an opportunity to modify the SQL databases and help prepare them for serving - for example, a full-text search plugin might create additional FTS tables, or a mapping plugin might pre-calculate a bunch of geohashes for tables that have latitude/longitude values. Plugins could really take advantage of the immutable nature of the dataset here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-345067498,https://api.github.com/repos/simonw/datasette/issues/14,345067498,MDEyOklzc3VlQ29tbWVudDM0NTA2NzQ5OA==,9599,simonw,2017-11-16T21:25:32Z,2017-11-16T21:26:22Z,OWNER,"For visualizations, Google Maps should be made available as a plugin. The default visualizations can use Leaflet and Open Street Map, but there's no reason to not make Google Maps available as a plugin, especially if the plugin can provide a mechanism for configuring the necessary API key. I'm particularly excited in the Google Maps heatmap visualization https://developers.google.com/maps/documentation/javascript/heatmaplayer as seen on http://mochimachine.org/wasteland/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-345893877,https://api.github.com/repos/simonw/datasette/issues/14,345893877,MDEyOklzc3VlQ29tbWVudDM0NTg5Mzg3Nw==,9599,simonw,2017-11-21T02:11:27Z,2017-11-21T02:11:27Z,OWNER,http://setuptools.readthedocs.io/en/latest/setuptools.html#dynamic-discovery-of-services-and-plugins Is pretty good ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-346244871,https://api.github.com/repos/simonw/datasette/issues/14,346244871,MDEyOklzc3VlQ29tbWVudDM0NjI0NDg3MQ==,21148,jacobian,2017-11-22T05:06:30Z,2017-11-22T05:06:30Z,CONTRIBUTOR,"I'd also suggest taking a look at [stevedore](https://docs.openstack.org/stevedore/latest/), which has a ton of tools for doing plugin stuff. I've had good luck with it in the past.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-346406009,https://api.github.com/repos/simonw/datasette/issues/14,346406009,MDEyOklzc3VlQ29tbWVudDM0NjQwNjAwOQ==,9599,simonw,2017-11-22T16:39:08Z,2017-11-22T16:39:08Z,OWNER,"Oh thanks, that definitely looks like an interesting option.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381442233,https://api.github.com/repos/simonw/datasette/issues/14,381442233,MDEyOklzc3VlQ29tbWVudDM4MTQ0MjIzMw==,9599,simonw,2018-04-15T22:13:06Z,2018-04-15T22:13:06Z,OWNER,"I started a thread on Twitter asking people for good examples of Python projects with a strong plugin ecosystem: https://twitter.com/simonw/status/985377670388105216 The most impressive example that came back was pytest - which now has nearly 400 plugins: https://plugincompat.herokuapp.com/ The pytest plugin infrastructure is available as an independent package called pluggy - which appears to offer everything I need for Datasette. I'm going to give that a go and see how well it works: https://pluggy.readthedocs.io/en/latest/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381442494,https://api.github.com/repos/simonw/datasette/issues/14,381442494,MDEyOklzc3VlQ29tbWVudDM4MTQ0MjQ5NA==,9599,simonw,2018-04-15T22:17:59Z,2018-04-15T22:17:59Z,OWNER,"Datasette 1.0 will be the release of Datasette that attempts to provide a stable plugin API: https://github.com/simonw/datasette/milestone/7 There's a lot of work to be done before then, but as a starting point I'm going to support two very simple extension mechanisms: * Template system plugins - where the hook gets passed the Jinja environment and can freely register new template tags and filters * SQLite connection plugins - where the hook gets passed a new SQLite connection and can register custom SQLite functions The template system hook will go near here: https://github.com/simonw/datasette/blob/efbb4e83374a2c795e436c72fa79f70da72309b8/datasette/app.py#L1225-L1228 The SQLite connection hook will go near here: https://github.com/simonw/datasette/blob/efbb4e83374a2c795e436c72fa79f70da72309b8/datasette/app.py#L1094-L1098 These two feel simple enough that I'm not worried that I might design an API that I later regret.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381443728,https://api.github.com/repos/simonw/datasette/issues/14,381443728,MDEyOklzc3VlQ29tbWVudDM4MTQ0MzcyOA==,9599,simonw,2018-04-15T22:39:00Z,2018-04-15T22:39:00Z,OWNER,Tox is a good example of a project that uses pluggy in the way I want to use it (function hooks rather than classes): https://github.com/tox-dev/tox/blob/master/tox/hookspecs.py,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381446392,https://api.github.com/repos/simonw/datasette/issues/14,381446392,MDEyOklzc3VlQ29tbWVudDM4MTQ0NjM5Mg==,9599,simonw,2018-04-15T23:22:40Z,2018-04-16T05:25:57Z,OWNER,"OK, from that prototype in f2720b0c6b7172ebe8820 it looks like pluggy provides a solid path forward. Next steps: - [x] Build a demo plugin that uses setuptools entrypoints to register with the `datasette` plugin manager via pluggy - [x] Figure out a mechanism for registering plugins without first needing to publish them to PyPI. Can I load plugins from a special `plugins/` directory similar to the `--template-dir=templates/` option already supported by Datasette? #211","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381446511,https://api.github.com/repos/simonw/datasette/issues/14,381446511,MDEyOklzc3VlQ29tbWVudDM4MTQ0NjUxMQ==,9599,simonw,2018-04-15T23:25:04Z,2018-04-15T23:25:04Z,OWNER,"Here's a demo of the `convert_units()` SQL function I prototyped in f2720b0c6b7172ebe88 ![2018-04-15 at 4 23 pm](https://user-images.githubusercontent.com/9599/38784633-8c43821e-40c9-11e8-97dd-697755a0f858.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381446906,https://api.github.com/repos/simonw/datasette/issues/14,381446906,MDEyOklzc3VlQ29tbWVudDM4MTQ0NjkwNg==,9599,simonw,2018-04-15T23:31:58Z,2018-04-15T23:34:10Z,OWNER,"Once I've got the plugins mechanism stable and people start releasing plugins it would be useful to have a dedicated Trove classifier on PyPI for Datasette plugins - `Framework :: Datasette` for example. This would help me build a Datasette equivalent of the http://plugincompat.herokuapp.com/ site, which works by scanning PyPI for items with the ``Framework :: Pytest`` classifier: https://github.com/pytest-dev/plugincompat/blob/8bdf1a6fb82807091ece0c68c196103ee8270194/update_index.py#L52-L53 It looks like the mechanism for requesting new PyPI classifiers is to file a ticket against warehouse, like these ones: https://github.com/pypa/warehouse/issues/3570 and https://github.com/pypa/warehouse/issues/2881","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381450394,https://api.github.com/repos/simonw/datasette/issues/14,381450394,MDEyOklzc3VlQ29tbWVudDM4MTQ1MDM5NA==,9599,simonw,2018-04-16T00:27:23Z,2018-04-16T00:27:23Z,OWNER,"I created https://github.com/simonw/datasette-plugin-demos which is now published to PyPI and can be installed with `pip install datasette-plugin-demos` - I've confirmed that if you DO install it my Datasette `plugins` branch picks up the plugins, and `select random_integer(1, 4)` works as it should.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381450591,https://api.github.com/repos/simonw/datasette/issues/14,381450591,MDEyOklzc3VlQ29tbWVudDM4MTQ1MDU5MQ==,9599,simonw,2018-04-16T00:30:22Z,2018-04-16T00:34:42Z,OWNER,"Slight code design problem... when I tried installing my branch in a fresh virtual environment I got this error, because `setup.py` now depends on `pluggy` (from importing `__version__`): ``` File ""/private/var/folders/jj/fngnv0810tn2lt_kd3911pdc0000gp/T/pip-req-build-dftqdezt/setup.py"", line 2, in from datasette import __version__ File ""/private/var/folders/jj/fngnv0810tn2lt_kd3911pdc0000gp/T/pip-req-build-dftqdezt/datasette/__init__.py"", line 2, in from .hookspecs import hookimpl # noqa File ""/private/var/folders/jj/fngnv0810tn2lt_kd3911pdc0000gp/T/pip-req-build-dftqdezt/datasette/hookspecs.py"", line 1, in from pluggy import HookimplMarker ModuleNotFoundError: No module named 'pluggy' ``` Looks like I've run into point 6 on https://packaging.python.org/guides/single-sourcing-package-version/ : ![2018-04-15 at 5 34 pm](https://user-images.githubusercontent.com/9599/38785314-403ce86a-40d3-11e8-8542-ba426eddf4ac.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381611738,https://api.github.com/repos/simonw/datasette/issues/14,381611738,MDEyOklzc3VlQ29tbWVudDM4MTYxMTczOA==,9599,simonw,2018-04-16T14:07:30Z,2018-04-16T14:07:30Z,OWNER,I should check if it's possible to have two template registration function plugins in a single plugin module. If it isn't maybe I should use class plugins instead of module plugins.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381621338,https://api.github.com/repos/simonw/datasette/issues/14,381621338,MDEyOklzc3VlQ29tbWVudDM4MTYyMTMzOA==,9599,simonw,2018-04-16T14:36:27Z,2018-04-16T14:36:27Z,OWNER,"Annoyingly, the following only results in the last of the two `prepare_connection` hooks being registered: ``` from datasette import hookimpl import pint import random ureg = pint.UnitRegistry() @hookimpl def prepare_connection(conn): def convert_units(amount, from_, to_): ""select convert_units(100, 'm', 'ft');"" return (amount * ureg(from_)).to(to_).to_tuple()[0] conn.create_function('convert_units', 3, convert_units) @hookimpl def prepare_connection(conn): conn.create_function('random_integer', 2, random.randint) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381622793,https://api.github.com/repos/simonw/datasette/issues/14,381622793,MDEyOklzc3VlQ29tbWVudDM4MTYyMjc5Mw==,9599,simonw,2018-04-16T14:40:39Z,2018-04-17T01:47:15Z,OWNER,"I think that's OK. The two plugins I've implemented so far (`prepare_connection` and `prepare_jinja2_environment`) both make sense if they can only be defined once-per-plugin. For the moment I'll assume I can define future hooks to work well with the same limitation. The syntactic sugar idea in #220 can help here too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381809998,https://api.github.com/repos/simonw/datasette/issues/14,381809998,MDEyOklzc3VlQ29tbWVudDM4MTgwOTk5OA==,9599,simonw,2018-04-17T02:23:39Z,2018-04-17T02:23:39Z,OWNER,I just shipped Datasette 0.19 with where I'm at so far: https://github.com/simonw/datasette/releases/tag/0.19,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-382256729,https://api.github.com/repos/simonw/datasette/issues/14,382256729,MDEyOklzc3VlQ29tbWVudDM4MjI1NjcyOQ==,9599,simonw,2018-04-18T04:29:29Z,2018-04-18T04:30:14Z,OWNER,I added a mechanism for plugins to serve static files and define custom CSS and JS URLs in #214 - see new documentation on http://datasette.readthedocs.io/en/latest/plugins.html#static-assets and http://datasette.readthedocs.io/en/latest/plugins.html#extra-css-urls,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-383139889,https://api.github.com/repos/simonw/datasette/issues/14,383139889,MDEyOklzc3VlQ29tbWVudDM4MzEzOTg4OQ==,9599,simonw,2018-04-20T15:51:47Z,2018-04-20T15:51:47Z,OWNER,"I released everything we have so far in [Datasette 0.20](https://github.com/simonw/datasette/releases/tag/0.20) and built and released an example plugin, [datasette-cluster-map](https://pypi.org/project/datasette-cluster-map/). Here's my blog entry about it: https://simonwillison.net/2018/Apr/20/datasette-plugins/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-383140111,https://api.github.com/repos/simonw/datasette/issues/14,383140111,MDEyOklzc3VlQ29tbWVudDM4MzE0MDExMQ==,9599,simonw,2018-04-20T15:52:33Z,2018-04-20T15:52:33Z,OWNER,Here's a link demonstrating my new plugin: https://datasette-cluster-map-demo.now.sh/polar-bears-455fe3a/USGS_WC_eartags_output_files_2009-2011-Status,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-491944613,https://api.github.com/repos/simonw/datasette/issues/14,491944613,MDEyOklzc3VlQ29tbWVudDQ5MTk0NDYxMw==,9599,simonw,2019-05-13T18:58:19Z,2019-05-13T18:58:19Z,OWNER,"We've grown a bunch of plugin hooks over the past two years: https://datasette.readthedocs.io/en/latest/plugins.html#plugin-hooks Since the plugin system will never be 100% ""finished"", I'm closing this in favor of the label: https://github.com/simonw/datasette/labels/plugins","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/16#issuecomment-338768860,https://api.github.com/repos/simonw/datasette/issues/16,338768860,MDEyOklzc3VlQ29tbWVudDMzODc2ODg2MA==,9599,simonw,2017-10-23T19:23:29Z,2017-10-23T19:23:29Z,OWNER,I could use the table-reflow mechanism demonstrated here: http://demos.jquerymobile.com/1.4.3/table-reflow/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267726219,Default HTML/CSS needs to look reasonable and be responsive, https://github.com/simonw/datasette/issues/16#issuecomment-339420462,https://api.github.com/repos/simonw/datasette/issues/16,339420462,MDEyOklzc3VlQ29tbWVudDMzOTQyMDQ2Mg==,9599,simonw,2017-10-25T18:10:51Z,2017-10-25T18:10:51Z,OWNER,"https://sitesforprofit.com/responsive-table-plugins-and-patterns has some useful links. I really like the pattern from https://css-tricks.com/responsive-data-tables/ /* Max width before this PARTICULAR table gets nasty This query will take effect for any screen smaller than 760px and also iPads specifically. */ @media only screen and (max-width: 760px), (min-device-width: 768px) and (max-device-width: 1024px) { /* Force table to not be like tables anymore */ table, thead, tbody, th, td, tr { display: block; } /* Hide table headers (but not display: none;, for accessibility) */ thead tr { position: absolute; top: -9999px; left: -9999px; } tr { border: 1px solid #ccc; } td { /* Behave like a ""row"" */ border: none; border-bottom: 1px solid #eee; position: relative; padding-left: 50%; } td:before { /* Now like a table header */ position: absolute; /* Top/left values mimic padding */ top: 6px; left: 6px; width: 45%; padding-right: 10px; white-space: nowrap; } /* Label the data */ td:nth-of-type(1):before { content: ""First Name""; } td:nth-of-type(2):before { content: ""Last Name""; } td:nth-of-type(3):before { content: ""Job Title""; } td:nth-of-type(4):before { content: ""Favorite Color""; } td:nth-of-type(5):before { content: ""Wars of Trek?""; } td:nth-of-type(6):before { content: ""Porn Name""; } td:nth-of-type(7):before { content: ""Date of Birth""; } td:nth-of-type(8):before { content: ""Dream Vacation City""; } td:nth-of-type(9):before { content: ""GPA""; } td:nth-of-type(10):before { content: ""Arbitrary Data""; } }","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267726219,Default HTML/CSS needs to look reasonable and be responsive, https://github.com/simonw/datasette/issues/16#issuecomment-342032943,https://api.github.com/repos/simonw/datasette/issues/16,342032943,MDEyOklzc3VlQ29tbWVudDM0MjAzMjk0Mw==,9599,simonw,2017-11-06T02:50:07Z,2017-11-06T02:50:07Z,OWNER,"Default look with Bootstrap 4 looks like this: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267726219,Default HTML/CSS needs to look reasonable and be responsive, https://github.com/simonw/datasette/issues/16#issuecomment-343643332,https://api.github.com/repos/simonw/datasette/issues/16,343643332,MDEyOklzc3VlQ29tbWVudDM0MzY0MzMzMg==,9599,simonw,2017-11-11T06:00:04Z,2017-11-11T06:00:04Z,OWNER,"Here's what a table looks like now at a smaller screen size: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267726219,Default HTML/CSS needs to look reasonable and be responsive, https://github.com/simonw/datasette/issues/16#issuecomment-343647300,https://api.github.com/repos/simonw/datasette/issues/16,343647300,MDEyOklzc3VlQ29tbWVudDM0MzY0NzMwMA==,9599,simonw,2017-11-11T07:41:19Z,2017-11-11T07:53:09Z,OWNER,"Still needed: - [ ] A link to the homepage from some kind of navigation bar in the header - [ ] link to github.com/simonw/datasette in the footer - [ ] Slightly better titles (maybe ditch the visited link colours for titles only? should keep those for primary key links) - [ ] Links to the .json and .jsono versions of every view","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267726219,Default HTML/CSS needs to look reasonable and be responsive, https://github.com/simonw/datasette/issues/16#issuecomment-343691342,https://api.github.com/repos/simonw/datasette/issues/16,343691342,MDEyOklzc3VlQ29tbWVudDM0MzY5MTM0Mg==,9599,simonw,2017-11-11T20:19:07Z,2017-11-11T20:19:07Z,OWNER,"Closing this, opening a fresh ticket for the navigation stuff.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267726219,Default HTML/CSS needs to look reasonable and be responsive, https://github.com/simonw/datasette/issues/17#issuecomment-338852971,https://api.github.com/repos/simonw/datasette/issues/17,338852971,MDEyOklzc3VlQ29tbWVudDMzODg1Mjk3MQ==,9599,simonw,2017-10-24T02:26:47Z,2017-10-24T02:26:47Z,OWNER,I'm not going to bother with this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267732005,"In development mode, should still pick up new .db files", https://github.com/simonw/datasette/issues/18#issuecomment-754188383,https://api.github.com/repos/simonw/datasette/issues/18,754188383,MDEyOklzc3VlQ29tbWVudDc1NDE4ODM4Mw==,9599,simonw,2021-01-04T20:05:48Z,2021-01-04T20:05:48Z,OWNER,"I'm not using Sanic any more, but this is still very feasible. If I ever do it I'll write a plugin.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267739593,See if I can get a websockets interface working, https://github.com/simonw/datasette/issues/19#issuecomment-339366612,https://api.github.com/repos/simonw/datasette/issues/19,339366612,MDEyOklzc3VlQ29tbWVudDMzOTM2NjYxMg==,9599,simonw,2017-10-25T15:21:16Z,2017-10-25T15:21:16Z,OWNER,"I had to manually set the content disposition header: return await response.file_stream( filepath, headers={ 'Content-Disposition': 'attachment; filename=""{}""'.format(ilepath) } ) In the next release of Sanic I can just use the filename= argument instead: https://github.com/channelcat/sanic/commit/07e95dba4f5983afc1e673df14bdd278817288aa","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267741262,Efficient url for downloading the raw database file, https://github.com/simonw/datasette/issues/20#issuecomment-338769538,https://api.github.com/repos/simonw/datasette/issues/20,338769538,MDEyOklzc3VlQ29tbWVudDMzODc2OTUzOA==,9599,simonw,2017-10-23T19:25:55Z,2017-10-23T19:25:55Z,OWNER,"Maybe this should be handled by views instead? https://stateless-datasets-wreplxalgu.now.sh/ lists some views https://stateless-datasets-wreplxalgu.now.sh/?sql=select%20*%20from%20%22Order%20Subtotals%22 is an example showing the content of a view. What would the URL to views be? I don't think a view can share a name with a table, so the same URL scheme could work for both.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/20#issuecomment-343581130,https://api.github.com/repos/simonw/datasette/issues/20,343581130,MDEyOklzc3VlQ29tbWVudDM0MzU4MTEzMA==,9599,simonw,2017-11-10T20:44:38Z,2017-11-10T20:44:38Z,OWNER,"I'm going to handle this a different way. I'm going to support a local history of your own queries stored in localStorage, but if you want to share a query you have to do it with a URL. If people really want canned query support, they can do that using custom templates - see #12 - or by adding views to their database before they publish it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/20#issuecomment-348420129,https://api.github.com/repos/simonw/datasette/issues/20,348420129,MDEyOklzc3VlQ29tbWVudDM0ODQyMDEyOQ==,9599,simonw,2017-12-01T07:16:25Z,2017-12-01T07:16:25Z,OWNER,"I've found some examples of canned queries I want to support that can't be represented as views, so I'm going to reopen this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/20#issuecomment-348420955,https://api.github.com/repos/simonw/datasette/issues/20,348420955,MDEyOklzc3VlQ29tbWVudDM0ODQyMDk1NQ==,9599,simonw,2017-12-01T07:21:08Z,2017-12-01T07:21:08Z,OWNER,"I'll use the existing metadata.json file: { ""databases"": { ""mydb"": { ""queries"": { ""custom_thingy"": {... The query definition can either be just a string of SQL, or it can be an object with a sql key and optional title and description keys. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/20#issuecomment-348860623,https://api.github.com/repos/simonw/datasette/issues/20,348860623,MDEyOklzc3VlQ29tbWVudDM0ODg2MDYyMw==,9599,simonw,2017-12-04T04:56:21Z,2017-12-04T04:56:21Z,OWNER,"While I'm doing this, I could add per-database and per-table metadata too ala #68","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/20#issuecomment-349027974,https://api.github.com/repos/simonw/datasette/issues/20,349027974,MDEyOklzc3VlQ29tbWVudDM0OTAyNzk3NA==,9599,simonw,2017-12-04T17:01:19Z,2017-12-04T17:01:19Z,OWNER, This is also a good opportunity to re-factor out a separate query.html template - right now the database.html template is doing two jobs.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/20#issuecomment-349359498,https://api.github.com/repos/simonw/datasette/issues/20,349359498,MDEyOklzc3VlQ29tbWVudDM0OTM1OTQ5OA==,9599,simonw,2017-12-05T16:30:06Z,2017-12-05T16:30:06Z,OWNER,"Named canned queries can now be defined in metadata.json like this: { ""databases"": { ""timezones"": { ""queries"": { ""timezone_for_point"": ""select tzid from timezones ..."" } } } } These will be shown in a new ""Queries"" section beneath ""Views"" on the database page. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/20#issuecomment-349383276,https://api.github.com/repos/simonw/datasette/issues/20,349383276,MDEyOklzc3VlQ29tbWVudDM0OTM4MzI3Ng==,9599,simonw,2017-12-05T17:45:20Z,2017-12-05T17:45:20Z,OWNER,http://datasette.readthedocs.io/en/latest/sql_queries.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/20#issuecomment-349406761,https://api.github.com/repos/simonw/datasette/issues/20,349406761,MDEyOklzc3VlQ29tbWVudDM0OTQwNjc2MQ==,9599,simonw,2017-12-05T19:03:06Z,2017-12-05T19:03:06Z,OWNER,Demo: https://timezones-api.now.sh/timezones-3cb9f64/by_point,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/21#issuecomment-343581332,https://api.github.com/repos/simonw/datasette/issues/21,343581332,MDEyOklzc3VlQ29tbWVudDM0MzU4MTMzMg==,9599,simonw,2017-11-10T20:45:42Z,2017-11-10T20:45:42Z,OWNER,I'm not going to use Sanic's mechanism for this. I'll use arguments passed to my cli instead.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267769034,Use Sanic configuration mechanism , https://github.com/simonw/datasette/issues/23#issuecomment-338854988,https://api.github.com/repos/simonw/datasette/issues/23,338854988,MDEyOklzc3VlQ29tbWVudDMzODg1NDk4OA==,9599,simonw,2017-10-24T02:40:12Z,2017-10-25T00:05:46Z,OWNER," /database-name/table-name?name__contains=simon&sort=id+desc Note that if there's a column called ""sort"" you can still do sort__exact=blah ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267788884,Support Django-style filters in querystring arguments, https://github.com/simonw/datasette/issues/23#issuecomment-338859620,https://api.github.com/repos/simonw/datasette/issues/23,338859620,MDEyOklzc3VlQ29tbWVudDMzODg1OTYyMA==,9599,simonw,2017-10-24T03:11:42Z,2017-10-24T03:11:42Z,OWNER,I’m going to implement everything in https://docs.djangoproject.com/en/1.11/ref/models/querysets/#field-lookups with the exception of range and the various date ones.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267788884,Support Django-style filters in querystring arguments, https://github.com/simonw/datasette/issues/23#issuecomment-338859709,https://api.github.com/repos/simonw/datasette/issues/23,338859709,MDEyOklzc3VlQ29tbWVudDMzODg1OTcwOQ==,9599,simonw,2017-10-24T03:12:18Z,2017-10-24T03:12:42Z,OWNER,"I’m going to need to write unit tests for this, is this depends on #9","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267788884,Support Django-style filters in querystring arguments, https://github.com/simonw/datasette/issues/23#issuecomment-339138809,https://api.github.com/repos/simonw/datasette/issues/23,339138809,MDEyOklzc3VlQ29tbWVudDMzOTEzODgwOQ==,9599,simonw,2017-10-24T21:32:46Z,2017-10-24T21:32:46Z,OWNER,May as well support most of https://sqlite.org/lang_expr.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267788884,Support Django-style filters in querystring arguments, https://github.com/simonw/datasette/issues/23#issuecomment-339186887,https://api.github.com/repos/simonw/datasette/issues/23,339186887,MDEyOklzc3VlQ29tbWVudDMzOTE4Njg4Nw==,9599,simonw,2017-10-25T01:39:43Z,2017-10-25T04:22:41Z,OWNER,"Still to do: - [x] `gt`, `gte`, `lt`, `lte` - [x] `like` - [x] `glob` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267788884,Support Django-style filters in querystring arguments, https://github.com/simonw/datasette/issues/23#issuecomment-339210353,https://api.github.com/repos/simonw/datasette/issues/23,339210353,MDEyOklzc3VlQ29tbWVudDMzOTIxMDM1Mw==,9599,simonw,2017-10-25T04:23:02Z,2017-10-25T04:23:02Z,OWNER,I'm going to call this one done for the moment. The date filters can go in a stretch goal.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267788884,Support Django-style filters in querystring arguments, https://github.com/simonw/datasette/issues/24#issuecomment-338834213,https://api.github.com/repos/simonw/datasette/issues/24,338834213,MDEyOklzc3VlQ29tbWVudDMzODgzNDIxMw==,9599,simonw,2017-10-24T00:23:05Z,2017-10-24T00:23:05Z,OWNER,"If I can’t setect a primary key, I won’t provide a URL for those records","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267828746,Implement full URL design, https://github.com/simonw/datasette/issues/24#issuecomment-339003850,https://api.github.com/repos/simonw/datasette/issues/24,339003850,MDEyOklzc3VlQ29tbWVudDMzOTAwMzg1MA==,9599,simonw,2017-10-24T14:12:00Z,2017-10-24T14:12:00Z,OWNER,As of b46e370ee6126aa2fa85cf789a31da38aed98496 this is done.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267828746,Implement full URL design, https://github.com/simonw/datasette/issues/25#issuecomment-343715915,https://api.github.com/repos/simonw/datasette/issues/25,343715915,MDEyOklzc3VlQ29tbWVudDM0MzcxNTkxNQ==,9599,simonw,2017-11-12T06:08:28Z,2017-11-12T06:08:28Z,OWNER," con = sqlite3.connect('existing_db.db') with open('dump.sql', 'w') as f: for line in con.iterdump(): f.write('%s\n' % line) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267857622,Endpoint that returns SQL ready to be piped into DB, https://github.com/simonw/datasette/issues/25#issuecomment-344487639,https://api.github.com/repos/simonw/datasette/issues/25,344487639,MDEyOklzc3VlQ29tbWVudDM0NDQ4NzYzOQ==,9599,simonw,2017-11-15T05:11:11Z,2017-11-15T05:11:11Z,OWNER,"Since you can already download the database directly, I'm not going to bother with this one.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267857622,Endpoint that returns SQL ready to be piped into DB, https://github.com/simonw/datasette/issues/26#issuecomment-343644976,https://api.github.com/repos/simonw/datasette/issues/26,343644976,MDEyOklzc3VlQ29tbWVudDM0MzY0NDk3Ng==,9599,simonw,2017-11-11T06:42:23Z,2017-11-11T06:42:23Z,OWNER,"Simplest version of this: 1. Create a temporary directory 2. Write a Dockerfile into it that pulls an image and pip installs datasette 3. Add symlinks to the DBs they listed (so we don't have to copy them) 4. Shell out to ""now"" 5. Done! ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267861210,Command line tool for uploading one or more DBs to Now, https://github.com/simonw/datasette/issues/26#issuecomment-343645249,https://api.github.com/repos/simonw/datasette/issues/26,343645249,MDEyOklzc3VlQ29tbWVudDM0MzY0NTI0OQ==,9599,simonw,2017-11-11T06:48:59Z,2017-11-11T06:48:59Z,OWNER,"Doing this works: import os os.link('/tmp/databases/northwind.db', '/tmp/tmp-blah/northwind.db') That creates a link in tmp-blah - and then when I delete that entire directory like so: import shutil shutil.rmtree('/tmp/tmp-blah') The original database is not deleted, just the link.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267861210,Command line tool for uploading one or more DBs to Now, https://github.com/simonw/datasette/issues/26#issuecomment-343645327,https://api.github.com/repos/simonw/datasette/issues/26,343645327,MDEyOklzc3VlQ29tbWVudDM0MzY0NTMyNw==,9599,simonw,2017-11-11T06:51:16Z,2017-11-11T06:51:16Z,OWNER,"I can create the temporary directory like so: import tempfile t = tempfile.TemporaryDirectory() t t.name '/var/folders/w9/0xm39tk94ng9h52g06z4b54c0000gp/T/tmpkym70wlp' And then to delete it all: t.cleanup() ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267861210,Command line tool for uploading one or more DBs to Now, https://github.com/simonw/datasette/issues/27#issuecomment-344179878,https://api.github.com/repos/simonw/datasette/issues/27,344179878,MDEyOklzc3VlQ29tbWVudDM0NDE3OTg3OA==,9599,simonw,2017-11-14T08:21:22Z,2017-11-14T08:21:22Z,OWNER,https://github.com/frappe/charts perhaps ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267886330,Ability to plot a simple graph, https://github.com/simonw/datasette/issues/27#issuecomment-345652450,https://api.github.com/repos/simonw/datasette/issues/27,345652450,MDEyOklzc3VlQ29tbWVudDM0NTY1MjQ1MA==,198537,rgieseke,2017-11-20T10:19:39Z,2017-11-20T10:19:39Z,CONTRIBUTOR,"If Data Package metadata gets adopted (#105) the views spec work might also be worth a look: http://frictionlessdata.io/specs/views/ http://datahub.io/docs/features/views ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267886330,Ability to plot a simple graph, https://github.com/simonw/datasette/issues/27#issuecomment-403910774,https://api.github.com/repos/simonw/datasette/issues/27,403910774,MDEyOklzc3VlQ29tbWVudDQwMzkxMDc3NA==,9599,simonw,2018-07-10T17:52:41Z,2018-07-10T17:52:41Z,OWNER,I consider this handled by https://github.com/simonw/datasette-vega,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267886330,Ability to plot a simple graph, https://github.com/simonw/datasette/issues/29#issuecomment-339019873,https://api.github.com/repos/simonw/datasette/issues/29,339019873,MDEyOklzc3VlQ29tbWVudDMzOTAxOTg3Mw==,9599,simonw,2017-10-24T14:58:33Z,2017-10-24T14:58:33Z,OWNER,"Here's what I've got now: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268050821,Handle bytestring records encoding to JSON, https://github.com/simonw/datasette/issues/30#issuecomment-344352573,https://api.github.com/repos/simonw/datasette/issues/30,344352573,MDEyOklzc3VlQ29tbWVudDM0NDM1MjU3Mw==,9599,simonw,2017-11-14T18:29:01Z,2017-11-14T18:29:01Z,OWNER,This is a dupe of #85 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268078453,Do something neat with foreign keys, https://github.com/simonw/datasette/issues/31#issuecomment-392580715,https://api.github.com/repos/simonw/datasette/issues/31,392580715,MDEyOklzc3VlQ29tbWVudDM5MjU4MDcxNQ==,9599,simonw,2018-05-28T18:10:45Z,2018-05-28T18:10:45Z,OWNER,"Oops, that commit should have referenced #121 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268087542,Idea: colour scheme based on sha256 of db, https://github.com/simonw/datasette/issues/32#issuecomment-343164111,https://api.github.com/repos/simonw/datasette/issues/32,343164111,MDEyOklzc3VlQ29tbWVudDM0MzE2NDExMQ==,9599,simonw,2017-11-09T14:05:56Z,2017-11-09T14:05:56Z,OWNER,Implemented in 31b21f5c5e15fc3acab7fabb170c1da71dc3c98c,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268106803,Try running SQLite queries in a separate thread, https://github.com/simonw/datasette/issues/34#issuecomment-392600866,https://api.github.com/repos/simonw/datasette/issues/34,392600866,MDEyOklzc3VlQ29tbWVudDM5MjYwMDg2Ng==,9599,simonw,2018-05-28T20:45:34Z,2018-05-28T20:45:42Z,OWNER,"This is an accidental duplicate, work is now taking place in #266","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268176505,Support CSV export with a .csv extension, https://github.com/simonw/datasette/issues/36#issuecomment-345262738,https://api.github.com/repos/simonw/datasette/issues/36,345262738,MDEyOklzc3VlQ29tbWVudDM0NTI2MjczOA==,9599,simonw,2017-11-17T14:45:37Z,2017-11-17T14:45:37Z,OWNER,"Consider for example https://fivethirtyeight.datasettes.com/fivethirtyeight/inconvenient-sequel%2Fratings The idea here is to be able to support querystring parameters like this: * `?timestamp___date=2017-07-17` - return every item where the timestamp falls on that date * `?timestamp___year=2017` - return every item where the timestamp falls within 2017 * `?timestamp___month=1` - return every item where the month component is January * `?timestamp___day=10` - return every item where the day-of-the-month component is 10 This is similar to #64 but a fair bit more complicated. SQLite date functions are documented here: https://sqlite.org/lang_datefunc.html ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268262480,"date, year, month and day querystring lookups", https://github.com/simonw/datasette/issues/36#issuecomment-345448756,https://api.github.com/repos/simonw/datasette/issues/36,345448756,MDEyOklzc3VlQ29tbWVudDM0NTQ0ODc1Ng==,9599,simonw,2017-11-18T15:17:43Z,2017-11-18T15:17:43Z,OWNER,"This may be useful: https://github.com/coleifer/peewee/blob/db85167d93861451a1fe7cde8c4f05748b222634/peewee.py#L162-L185","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268262480,"date, year, month and day querystring lookups", https://github.com/simonw/datasette/issues/36#issuecomment-392575160,https://api.github.com/repos/simonw/datasette/issues/36,392575160,MDEyOklzc3VlQ29tbWVudDM5MjU3NTE2MA==,9599,simonw,2018-05-28T17:30:52Z,2018-05-28T17:30:52Z,OWNER,"I've changed my mind about this. ""Select every record on the 3rd day of the month"" doesn't strike me as an actually useful feature. ""Select every record in 2018 / in May 2018 / on 1st May 2018"", if you are using the SQLite-preferred datestring format, are already supported using LIKE queries (or the startswith filter): * https://fivethirtyeight.datasettes.com/fivethirtyeight/inconvenient-sequel%2Fratings?timestamp__startswith=2017 * https://fivethirtyeight.datasettes.com/fivethirtyeight/inconvenient-sequel%2Fratings?timestamp__startswith=2017-08 * https://fivethirtyeight.datasettes.com/fivethirtyeight/inconvenient-sequel%2Fratings?timestamp__startswith=2017-08-29 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268262480,"date, year, month and day querystring lookups", https://github.com/simonw/datasette/issues/37#issuecomment-339382054,https://api.github.com/repos/simonw/datasette/issues/37,339382054,MDEyOklzc3VlQ29tbWVudDMzOTM4MjA1NA==,9599,simonw,2017-10-25T16:05:56Z,2017-10-25T16:05:56Z,OWNER,Could this be as simple as using the iterative JSON encoder and adding a yield statement in between each chunk?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268453968,Ability to serialize massive JSON without blocking event loop, https://github.com/simonw/datasette/issues/37#issuecomment-636360861,https://api.github.com/repos/simonw/datasette/issues/37,636360861,MDEyOklzc3VlQ29tbWVudDYzNjM2MDg2MQ==,9599,simonw,2020-05-30T17:29:20Z,2020-05-30T17:29:20Z,OWNER,I'm not going to do this: 2.5 years later I have yet to run into anything that makes me think that JSON serialization performance is worth any extra work.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268453968,Ability to serialize massive JSON without blocking event loop, https://github.com/simonw/datasette/issues/38#issuecomment-339388215,https://api.github.com/repos/simonw/datasette/issues/38,339388215,MDEyOklzc3VlQ29tbWVudDMzOTM4ODIxNQ==,9599,simonw,2017-10-25T16:25:45Z,2017-10-25T16:25:45Z,OWNER,"First experiment: hook up an iterative CSV dump (just because that’s a tiny bit easier to get started with than iterative a JSON). Have it execute a big select statement and then iterate through the result set 100 rows at a time using sqite fetchmany() - also have it async sleep for a second in between each batch of 100. Can this work without needing python threads? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268462768,Experiment with patterns for concurrent long running queries, https://github.com/simonw/datasette/issues/38#issuecomment-339388771,https://api.github.com/repos/simonw/datasette/issues/38,339388771,MDEyOklzc3VlQ29tbWVudDMzOTM4ODc3MQ==,9599,simonw,2017-10-25T16:27:29Z,2017-10-25T16:27:29Z,OWNER,"If this does work, I need to figure it what to do about the HTML view. ASsuming I can iteratively produce JSON and CSV, what to do about HTML? One option: render the first 500 rows as HTML, then hand off to an infinite scroll experience that iteratively loads more rows as JSON.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268462768,Experiment with patterns for concurrent long running queries, https://github.com/simonw/datasette/issues/38#issuecomment-339389105,https://api.github.com/repos/simonw/datasette/issues/38,339389105,MDEyOklzc3VlQ29tbWVudDMzOTM4OTEwNQ==,9599,simonw,2017-10-25T16:28:39Z,2017-10-25T16:28:39Z,OWNER,The gold standard here is to be able to serve up increasingly large datasets without blocking the event loop and while using a sustainable amount of RAM,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268462768,Experiment with patterns for concurrent long running queries, https://github.com/simonw/datasette/issues/38#issuecomment-339389328,https://api.github.com/repos/simonw/datasette/issues/38,339389328,MDEyOklzc3VlQ29tbWVudDMzOTM4OTMyOA==,9599,simonw,2017-10-25T16:29:23Z,2017-10-25T16:29:23Z,OWNER,Ideally we can get some serious gains from the fact that our database file is opened with the immutable option.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268462768,Experiment with patterns for concurrent long running queries, https://github.com/simonw/datasette/issues/38#issuecomment-392601114,https://api.github.com/repos/simonw/datasette/issues/38,392601114,MDEyOklzc3VlQ29tbWVudDM5MjYwMTExNA==,9599,simonw,2018-05-28T20:47:31Z,2018-05-28T20:47:31Z,OWNER,I think the way Datasette executes SQL queries in a thread pool introduced in #45 is a good solution for this ticket.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268462768,Experiment with patterns for concurrent long running queries, https://github.com/simonw/datasette/issues/39#issuecomment-339406634,https://api.github.com/repos/simonw/datasette/issues/39,339406634,MDEyOklzc3VlQ29tbWVudDMzOTQwNjYzNA==,9599,simonw,2017-10-25T17:27:10Z,2017-10-25T17:27:10Z,OWNER,It certainly looks like some of the stuff in https://sqlite.org/pragma.html could be used to screw around with things. Example: `PRAGMA case_sensitive_like = 1` - would that affect future queries?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268469569,Protect against malicious SQL that causes damage even though our DB is immutable, https://github.com/simonw/datasette/issues/39#issuecomment-339413825,https://api.github.com/repos/simonw/datasette/issues/39,339413825,MDEyOklzc3VlQ29tbWVudDMzOTQxMzgyNQ==,9599,simonw,2017-10-25T17:48:48Z,2017-10-25T17:48:48Z,OWNER,Could I use https://sqlparse.readthedocs.io/en/latest/ to parse incoming statements and ensure they are pure SELECTs? Would that prevent people from using a compound SELECT statement to trigger an evil PRAGMA of some sort?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268469569,Protect against malicious SQL that causes damage even though our DB is immutable, https://github.com/simonw/datasette/issues/39#issuecomment-339510770,https://api.github.com/repos/simonw/datasette/issues/39,339510770,MDEyOklzc3VlQ29tbWVudDMzOTUxMDc3MA==,9599,simonw,2017-10-26T00:07:40Z,2017-10-26T00:07:40Z,OWNER,It looks like I should double quote my columns and ensure they are correctly escaped https://blog.christosoft.de/2012/10/sqlite-escaping-table-acolumn-names/ - hopefully using ? placeholders for column names will work. I should use ? for tables too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268469569,Protect against malicious SQL that causes damage even though our DB is immutable, https://github.com/simonw/datasette/issues/39#issuecomment-340787868,https://api.github.com/repos/simonw/datasette/issues/39,340787868,MDEyOklzc3VlQ29tbWVudDM0MDc4Nzg2OA==,9599,simonw,2017-10-31T14:54:14Z,2017-10-31T14:54:14Z,OWNER,"Here’s how I can (I think) provide safe execution of arbitrary SQL while blocking PRAGMA calls: let people use names parameters in their SQL and apply strict filtering to the SQL query but not to the parameter values. cur.execute( ""select * from people where name_last=:who and age=:age"", { ""who"": who, ""age"": age }) In URL form: ?sql=select...&who=Terry&age=34 Now we can apply strict, dumb validation rules to the SQL part while allowing anything in the named queries - so people can execute a search for PRAGMA without being able to execute a PRAGMA statement.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268469569,Protect against malicious SQL that causes damage even though our DB is immutable, https://github.com/simonw/datasette/issues/40#issuecomment-339395551,https://api.github.com/repos/simonw/datasette/issues/40,339395551,MDEyOklzc3VlQ29tbWVudDMzOTM5NTU1MQ==,9599,simonw,2017-10-25T16:49:32Z,2017-10-25T16:49:32Z,OWNER,"Simplest implementation will be to create a temporary directory somewhere, copy in a Dockerfile and the databases and run “now” in it. Ideally I can use symlinks rather than copying potentially large database files around.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/40#issuecomment-339514819,https://api.github.com/repos/simonw/datasette/issues/40,339514819,MDEyOklzc3VlQ29tbWVudDMzOTUxNDgxOQ==,9599,simonw,2017-10-26T00:35:46Z,2017-10-26T00:35:46Z,OWNER,"I’m going to have a single command-line app that does everything. Name to be decided - options include dataset, stateless, datasite (I quite like that - it reflects SQLite and the fact that you create a website)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/40#issuecomment-339515822,https://api.github.com/repos/simonw/datasette/issues/40,339515822,MDEyOklzc3VlQ29tbWVudDMzOTUxNTgyMg==,9599,simonw,2017-10-26T00:43:34Z,2017-10-26T00:43:34Z,OWNER,"datasite . - starts web app in current directory, serving all DB files datasite . -p 8001 - serves on custom port datasite blah.db blah2.db - serves specified files You can’t specify more than one directory. You can specify as many files as you like. If you specify two files with different oaths but the same name then they must be accessed by hash. datasite publish . - publishes current directory to the internet! Uses now by default, if it detects it on your path. Other publishers will be eventually added as plugins. datasite publish http://path-to-db.db - publishes a DB available at a URL. Works by constructing the Dockerfile with wget calls in it. datasite blah.db -m metadata.json If you specify a directory it looks for metadata.json in that directory. Otherwise you can pass an explicit metadata file oath with -m or —metadata","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/40#issuecomment-339516032,https://api.github.com/repos/simonw/datasette/issues/40,339516032,MDEyOklzc3VlQ29tbWVudDMzOTUxNjAzMg==,9599,simonw,2017-10-26T00:44:52Z,2017-10-26T00:44:52Z,OWNER,Another potential name: datapi ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/40#issuecomment-339517846,https://api.github.com/repos/simonw/datasette/issues/40,339517846,MDEyOklzc3VlQ29tbWVudDMzOTUxNzg0Ng==,9599,simonw,2017-10-26T00:58:39Z,2017-10-26T00:58:39Z,OWNER,"I’m going to use Click for this http://nvie.com/posts/writing-a-cli-in-python-in-under-60-seconds/ https://kushaldas.in/posts/building-command-line-tools-in-python-with-click.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/40#issuecomment-339724700,https://api.github.com/repos/simonw/datasette/issues/40,339724700,MDEyOklzc3VlQ29tbWVudDMzOTcyNDcwMA==,9599,simonw,2017-10-26T16:35:20Z,2017-10-26T16:35:20Z,OWNER,"Here’s how to make the “serve” subcommand the default if it is called with no arguments: @click.group(invoke_without_command=True) def serve(): # ...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/40#issuecomment-339891755,https://api.github.com/repos/simonw/datasette/issues/40,339891755,MDEyOklzc3VlQ29tbWVudDMzOTg5MTc1NQ==,9599,simonw,2017-10-27T07:10:53Z,2017-10-27T07:10:53Z,OWNER,"Deploys to Now aren't working at the moment - they aren't showing the uploaded databases, because I've broken the path handling somehow. I need to do a bit more work here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/40#issuecomment-340561577,https://api.github.com/repos/simonw/datasette/issues/40,340561577,MDEyOklzc3VlQ29tbWVudDM0MDU2MTU3Nw==,9599,simonw,2017-10-30T19:43:40Z,2017-10-30T19:43:40Z,OWNER,http://the-hitchhikers-guide-to-packaging.readthedocs.io/en/latest/quickstart.html describes how to package this for PyPI,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/40#issuecomment-341945420,https://api.github.com/repos/simonw/datasette/issues/40,341945420,MDEyOklzc3VlQ29tbWVudDM0MTk0NTQyMA==,9599,simonw,2017-11-05T02:55:07Z,2017-11-05T02:55:07Z,OWNER,"To simplify things a bit, I'm going to require that every database is explicitly listed in the command line. I won't support ""serve everything in this directory"" for the moment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/40#issuecomment-342030075,https://api.github.com/repos/simonw/datasette/issues/40,342030075,MDEyOklzc3VlQ29tbWVudDM0MjAzMDA3NQ==,9599,simonw,2017-11-06T02:25:48Z,2017-11-06T02:25:48Z,OWNER,"... I tried that, I don't like it. I'm going to bring back ""directory serving"" by allowing you to pass a directory as an argument to `datasite` (including `datasite .`). I may even make `.` the default if you don't provide anything at all.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/40#issuecomment-343646740,https://api.github.com/repos/simonw/datasette/issues/40,343646740,MDEyOklzc3VlQ29tbWVudDM0MzY0Njc0MA==,9599,simonw,2017-11-11T07:27:33Z,2017-11-11T07:27:33Z,OWNER,I'm happy with this now that I've implemented the publish command in #26 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/41#issuecomment-339866724,https://api.github.com/repos/simonw/datasette/issues/41,339866724,MDEyOklzc3VlQ29tbWVudDMzOTg2NjcyNA==,9599,simonw,2017-10-27T04:04:52Z,2017-10-27T04:04:52Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268590777,Homepage should show summary of databases, https://github.com/simonw/datasette/issues/42#issuecomment-343708447,https://api.github.com/repos/simonw/datasette/issues/42,343708447,MDEyOklzc3VlQ29tbWVudDM0MzcwODQ0Nw==,9599,simonw,2017-11-12T02:12:15Z,2017-11-12T02:12:15Z,OWNER,I ditched the metadata file concept.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268591332,Homepage UI for editing metadata file, https://github.com/simonw/datasette/issues/42#issuecomment-343752404,https://api.github.com/repos/simonw/datasette/issues/42,343752404,MDEyOklzc3VlQ29tbWVudDM0Mzc1MjQwNA==,9599,simonw,2017-11-12T17:20:10Z,2017-11-12T17:20:10Z,OWNER,"Re-opening this - I've decided to bring back this concept, see #68 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268591332,Homepage UI for editing metadata file, https://github.com/simonw/datasette/issues/42#issuecomment-345810031,https://api.github.com/repos/simonw/datasette/issues/42,345810031,MDEyOklzc3VlQ29tbWVudDM0NTgxMDAzMQ==,9599,simonw,2017-11-20T19:51:29Z,2017-11-20T19:51:29Z,OWNER,See also #138,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268591332,Homepage UI for editing metadata file, https://github.com/simonw/datasette/issues/42#issuecomment-350521619,https://api.github.com/repos/simonw/datasette/issues/42,350521619,MDEyOklzc3VlQ29tbWVudDM1MDUyMTYxOQ==,9599,simonw,2017-12-10T03:02:14Z,2017-12-10T03:02:14Z,OWNER,I think the `datasette skeleton` command from #164 makes this obsolete.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268591332,Homepage UI for editing metadata file, https://github.com/simonw/datasette/issues/43#issuecomment-344180866,https://api.github.com/repos/simonw/datasette/issues/43,344180866,MDEyOklzc3VlQ29tbWVudDM0NDE4MDg2Ng==,9599,simonw,2017-11-14T08:25:37Z,2017-11-14T08:25:37Z,OWNER,"This isn’t necessary - restarting the server is fast and easy, and I’ve not found myself needing this at all during development.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268592894,"While running, server should spot new db files added to its directory ", https://github.com/simonw/datasette/issues/44#issuecomment-342484889,https://api.github.com/repos/simonw/datasette/issues/44,342484889,MDEyOklzc3VlQ29tbWVudDM0MjQ4NDg4OQ==,9599,simonw,2017-11-07T13:39:49Z,2017-11-07T13:39:49Z,OWNER,I’m going to call this feature “count values”,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/44#issuecomment-345342512,https://api.github.com/repos/simonw/datasette/issues/44,345342512,MDEyOklzc3VlQ29tbWVudDM0NTM0MjUxMg==,9599,simonw,2017-11-17T19:27:53Z,2017-11-20T04:37:35Z,OWNER,"This should support multiple columns, e.g. `?_group_count=precinct&_group_count=candidate`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/44#issuecomment-345343079,https://api.github.com/repos/simonw/datasette/issues/44,345343079,MDEyOklzc3VlQ29tbWVudDM0NTM0MzA3OQ==,9599,simonw,2017-11-17T19:29:43Z,2017-11-17T19:29:43Z,OWNER,Should this support sum/avg/etc as well?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/44#issuecomment-345494971,https://api.github.com/repos/simonw/datasette/issues/44,345494971,MDEyOklzc3VlQ29tbWVudDM0NTQ5NDk3MQ==,9599,simonw,2017-11-19T06:15:39Z,2017-11-19T06:15:39Z,OWNER,It would be great if this could support foreign key references and automatically resolve and hyperlink them if they are detected.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/44#issuecomment-345537315,https://api.github.com/repos/simonw/datasette/issues/44,345537315,MDEyOklzc3VlQ29tbWVudDM0NTUzNzMxNQ==,9599,simonw,2017-11-19T18:11:27Z,2017-11-19T18:11:27Z,OWNER,This would enable faceted search - moving it to the search milestone.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/44#issuecomment-345601103,https://api.github.com/repos/simonw/datasette/issues/44,345601103,MDEyOklzc3VlQ29tbWVudDM0NTYwMTEwMw==,9599,simonw,2017-11-20T06:13:35Z,2017-11-20T06:13:35Z,OWNER,"Some demos: Single column: https://sf-trees-flat.now.sh/sf-trees-flat-ba738ce/Street_Tree_List?_group_count=qSpecies Multi column: https://sf-trees-flat.now.sh/sf-trees-flat-ba738ce/Street_Tree_List?_group_count=qLegalStatus&_group_count=qSpecies ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/44#issuecomment-384676488,https://api.github.com/repos/simonw/datasette/issues/44,384676488,MDEyOklzc3VlQ29tbWVudDM4NDY3NjQ4OA==,9599,simonw,2018-04-26T15:09:57Z,2018-04-26T15:09:57Z,OWNER,Remaining work for this is tracked in #150,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/46#issuecomment-344161226,https://api.github.com/repos/simonw/datasette/issues/46,344161226,MDEyOklzc3VlQ29tbWVudDM0NDE2MTIyNg==,9599,simonw,2017-11-14T06:41:21Z,2017-11-14T06:41:21Z,OWNER,Spatial extensions would be really useful too. https://www.gaia-gis.it/spatialite-2.1/SpatiaLite-manual.html,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-344161371,https://api.github.com/repos/simonw/datasette/issues/46,344161371,MDEyOklzc3VlQ29tbWVudDM0NDE2MTM3MQ==,9599,simonw,2017-11-14T06:42:15Z,2017-11-14T06:42:15Z,OWNER,http://charlesleifer.com/blog/going-fast-with-sqlite-and-python/ is useful here too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-344161430,https://api.github.com/repos/simonw/datasette/issues/46,344161430,MDEyOklzc3VlQ29tbWVudDM0NDE2MTQzMA==,9599,simonw,2017-11-14T06:42:44Z,2017-11-14T06:42:44Z,OWNER,Also requested on Twitter: https://twitter.com/DenubisX/status/930322813864439808,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-344810525,https://api.github.com/repos/simonw/datasette/issues/46,344810525,MDEyOklzc3VlQ29tbWVudDM0NDgxMDUyNQ==,54999,ingenieroariel,2017-11-16T04:11:25Z,2017-11-16T04:11:25Z,CONTRIBUTOR,"@simonw On the spatialite support, here is some info to make it work and a screenshot: I used the following Dockerfile: ``` FROM prolocutor/python3-sqlite-ext:3.5.1-spatialite as build RUN mkdir /code ADD . /code/ RUN pip install /code/ EXPOSE 8001 CMD [""datasette"", ""serve"", ""/code/ne.sqlite"", ""--host"", ""0.0.0.0""] ``` and added this to `prepare_connection`: ``` conn.enable_load_extension(True) conn.execute(""SELECT load_extension('/usr/local/lib/mod_spatialite.so')"") ```","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-344975156,https://api.github.com/repos/simonw/datasette/issues/46,344975156,MDEyOklzc3VlQ29tbWVudDM0NDk3NTE1Ng==,9599,simonw,2017-11-16T16:19:44Z,2017-11-16T16:19:44Z,OWNER,"That's fantastic! Thank you very much for that. Do you know if it's possible to view the Dockerfile used by https://hub.docker.com/r/prolocutor/python3-sqlite-ext/ ?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-344976104,https://api.github.com/repos/simonw/datasette/issues/46,344976104,MDEyOklzc3VlQ29tbWVudDM0NDk3NjEwNA==,9599,simonw,2017-11-16T16:22:45Z,2017-11-16T16:22:45Z,OWNER,Found a relevant Dockerfile on Reddit: https://www.reddit.com/r/Python/comments/5unkb3/install_sqlite3_on_python_3/ddzdz2b/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-344976882,https://api.github.com/repos/simonw/datasette/issues/46,344976882,MDEyOklzc3VlQ29tbWVudDM0NDk3Njg4Mg==,9599,simonw,2017-11-16T16:25:07Z,2017-11-16T16:25:07Z,OWNER,Maybe part of the solution here is to add a `--load-extension` argument to `datasette` - so when you run the command you can specify SQLite extensions that should be loaded. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-344988591,https://api.github.com/repos/simonw/datasette/issues/46,344988591,MDEyOklzc3VlQ29tbWVudDM0NDk4ODU5MQ==,9599,simonw,2017-11-16T16:59:51Z,2017-11-16T16:59:51Z,OWNER,"OK, `--load-extension` is now a supported command line option - see #110 which includes my notes on how I manually tested it using the `prolocutor/python3-sqlite-ext` Docker image.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-344989340,https://api.github.com/repos/simonw/datasette/issues/46,344989340,MDEyOklzc3VlQ29tbWVudDM0NDk4OTM0MA==,9599,simonw,2017-11-16T17:02:07Z,2017-11-16T17:02:07Z,OWNER,The fact that `prolocutor/python3-sqlite-ext` doesn't provide a visible Dockerfile and hasn't been updated in two years makes me hesitant to bake it into datasette itself. I'd rather put together a Dockerfile that enables the necessary extensions and can live in the datasette repository itself.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-344995571,https://api.github.com/repos/simonw/datasette/issues/46,344995571,MDEyOklzc3VlQ29tbWVudDM0NDk5NTU3MQ==,9599,simonw,2017-11-16T17:22:32Z,2017-11-16T17:22:32Z,OWNER,The JSON extension would be very worthwhile too: https://www.sqlite.org/json1.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-345002908,https://api.github.com/repos/simonw/datasette/issues/46,345002908,MDEyOklzc3VlQ29tbWVudDM0NTAwMjkwOA==,54999,ingenieroariel,2017-11-16T17:47:49Z,2017-11-16T17:47:49Z,CONTRIBUTOR,I'll try to find alternatives to the Dockerfile option - I also think we should not use that old one without sources or license.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-345138347,https://api.github.com/repos/simonw/datasette/issues/46,345138347,MDEyOklzc3VlQ29tbWVudDM0NTEzODM0Nw==,9599,simonw,2017-11-17T03:52:25Z,2017-11-17T03:52:25Z,OWNER,We now have a Dockerfile that compiles spatialite! https://github.com/simonw/datasette/pull/114/commits/6c6b63d890529eeefcefb7ab126ea3bd7b2315c1,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-345259115,https://api.github.com/repos/simonw/datasette/issues/46,345259115,MDEyOklzc3VlQ29tbWVudDM0NTI1OTExNQ==,9599,simonw,2017-11-17T14:32:12Z,2017-11-17T14:32:12Z,OWNER,"OK, I can confirm that the version in the new docker container supports FTS5, JSON *and* spatialite! Notes on how I built the container and tested the spatialite extension are here: https://github.com/simonw/datasette/issues/112#issuecomment-345255655 To confirm that JSON and FTS5 are working, I ran the following: $ docker run -it -p 8001:8001 6c9ca7e29181 python Python 3.6.3 (default, Nov 4 2017, 14:24:48) [GCC 6.3.0 20170516] on linux Type ""help"", ""copyright"", ""credits"" or ""license"" for more information. >>> import sqlite3 >>> sqlite3.connect(':memory:').execute('CREATE VIRTUAL TABLE email USING fts5(sender, title, body);') >>> list(sqlite3.connect(':memory:').execute('''SELECT json(' { ""this"" : ""is"", ""a"": [ ""test"" ] } ') ''')) [('{""this"":""is"",""a"":[""test""]}',)] If I do the same thing in python3 on my OS X laptop directly, I get this: $ python3 Python 3.5.1 (default, Apr 18 2016, 11:46:32) [GCC 4.2.1 Compatible Apple LLVM 7.3.0 (clang-703.0.29)] on darwin Type ""help"", ""copyright"", ""credits"" or ""license"" for more information. >>> import sqlite3 >>> sqlite3.connect(':memory:').execute('CREATE VIRTUAL TABLE email USING fts5(sender, title, body);') Traceback (most recent call last): File """", line 1, in sqlite3.OperationalError: no such module: fts5 >>> list(sqlite3.connect(':memory:').execute('''SELECT json(' { ""this"" : ""is"", ""a"": [ ""test"" ] } ') ''')) Traceback (most recent call last): File """", line 1, in sqlite3.OperationalError: no such function: json ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/47#issuecomment-342521344,https://api.github.com/repos/simonw/datasette/issues/47,342521344,MDEyOklzc3VlQ29tbWVudDM0MjUyMTM0NA==,9599,simonw,2017-11-07T15:37:45Z,2017-11-07T15:37:45Z,OWNER,GDS Registries could be fun too: https://registers.cloudapps.digital/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271831408,Create neat example database, https://github.com/simonw/datasette/issues/47#issuecomment-343647102,https://api.github.com/repos/simonw/datasette/issues/47,343647102,MDEyOklzc3VlQ29tbWVudDM0MzY0NzEwMg==,9599,simonw,2017-11-11T07:36:00Z,2017-11-11T07:36:00Z,OWNER,"http://2016.padjo.org/tutorials/data-primer-census-acs1-demographics/ has a sqlite database: http://2016.padjo.org/files/data/starterpack/census-acs-1year/acs-1-year-2015.sqlite I tested this by deploying it here: https://datasette-fewuggrvwr.now.sh/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271831408,Create neat example database, https://github.com/simonw/datasette/issues/47#issuecomment-343690060,https://api.github.com/repos/simonw/datasette/issues/47,343690060,MDEyOklzc3VlQ29tbWVudDM0MzY5MDA2MA==,9599,simonw,2017-11-11T19:56:08Z,2017-11-11T19:56:08Z,OWNER," ""parlgov-development.db"": { ""url"": ""http://www.parlgov.org/"" }, ""nhsadmin.sqlite"": { ""url"": ""https://github.com/psychemedia/openHealthDataDoodles"" }","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271831408,Create neat example database, https://github.com/simonw/datasette/issues/47#issuecomment-343705966,https://api.github.com/repos/simonw/datasette/issues/47,343705966,MDEyOklzc3VlQ29tbWVudDM0MzcwNTk2Ng==,9599,simonw,2017-11-12T01:00:20Z,2017-11-12T01:00:20Z,OWNER,https://github.com/fivethirtyeight/data has a ton of CSVs,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271831408,Create neat example database, https://github.com/simonw/datasette/issues/47#issuecomment-344132481,https://api.github.com/repos/simonw/datasette/issues/47,344132481,MDEyOklzc3VlQ29tbWVudDM0NDEzMjQ4MQ==,9599,simonw,2017-11-14T03:08:13Z,2017-11-14T03:08:13Z,OWNER,I ended up shipping with https://fivethirtyeight.datasettes.com/ and https://parlgov.datasettes.com/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271831408,Create neat example database, https://github.com/simonw/datasette/issues/48#issuecomment-343168796,https://api.github.com/repos/simonw/datasette/issues/48,343168796,MDEyOklzc3VlQ29tbWVudDM0MzE2ODc5Ng==,9599,simonw,2017-11-09T14:22:21Z,2017-11-09T14:22:21Z,OWNER,Won't fix: ujson is not compatible with the custom JSON encoder I'm using here: https://github.com/simonw/immutabase/blob/b2dee11fcd989d9e2a7bf4de1e23dbc320c05013/immutabase/app.py#L401-L416,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272391665,Switch to ujson, https://github.com/simonw/datasette/issues/48#issuecomment-343239062,https://api.github.com/repos/simonw/datasette/issues/48,343239062,MDEyOklzc3VlQ29tbWVudDM0MzIzOTA2Mg==,9599,simonw,2017-11-09T18:01:46Z,2017-11-09T18:01:46Z,OWNER,This looks promising: https://github.com/esnme/ultrajson/issues/124#issuecomment-323882878,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272391665,Switch to ujson, https://github.com/simonw/datasette/issues/48#issuecomment-379556637,https://api.github.com/repos/simonw/datasette/issues/48,379556637,MDEyOklzc3VlQ29tbWVudDM3OTU1NjYzNw==,9599,simonw,2018-04-08T14:56:52Z,2018-04-08T14:56:52Z,OWNER,It would be useful to have a microbenchmark in place to help understand how much of a performance benefit this would actually provide.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272391665,Switch to ujson, https://github.com/simonw/datasette/issues/48#issuecomment-504883688,https://api.github.com/repos/simonw/datasette/issues/48,504883688,MDEyOklzc3VlQ29tbWVudDUwNDg4MzY4OA==,9599,simonw,2019-06-24T06:57:43Z,2019-06-24T06:57:43Z,OWNER,"I've seen no evidence that JSON handling is even close to being a performance bottleneck, so wontfix.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272391665,Switch to ujson, https://github.com/simonw/datasette/issues/49#issuecomment-343237982,https://api.github.com/repos/simonw/datasette/issues/49,343237982,MDEyOklzc3VlQ29tbWVudDM0MzIzNzk4Mg==,9599,simonw,2017-11-09T17:58:01Z,2017-11-09T17:58:01Z,OWNER,"More terms: * publish * share * docker * host * stateless I want to capture the idea of publishing an immutable database in a stateless container.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272661336,Pick a name, https://github.com/simonw/datasette/issues/49#issuecomment-343238262,https://api.github.com/repos/simonw/datasette/issues/49,343238262,MDEyOklzc3VlQ29tbWVudDM0MzIzODI2Mg==,9599,simonw,2017-11-09T17:58:59Z,2017-11-09T17:58:59Z,OWNER,The name should ideally be available on PyPI and should make sense as both a command line application and a library.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272661336,Pick a name, https://github.com/simonw/datasette/issues/49#issuecomment-343281876,https://api.github.com/repos/simonw/datasette/issues/49,343281876,MDEyOklzc3VlQ29tbWVudDM0MzI4MTg3Ng==,9599,simonw,2017-11-09T20:30:42Z,2017-11-09T20:30:42Z,OWNER,How about datasette?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272661336,Pick a name, https://github.com/simonw/datasette/issues/49#issuecomment-343551356,https://api.github.com/repos/simonw/datasette/issues/49,343551356,MDEyOklzc3VlQ29tbWVudDM0MzU1MTM1Ng==,9599,simonw,2017-11-10T18:33:22Z,2017-11-10T18:33:22Z,OWNER,I'm going with datasette.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272661336,Pick a name, https://github.com/simonw/datasette/issues/50#issuecomment-343266326,https://api.github.com/repos/simonw/datasette/issues/50,343266326,MDEyOklzc3VlQ29tbWVudDM0MzI2NjMyNg==,9599,simonw,2017-11-09T19:33:18Z,2017-11-09T19:33:18Z,OWNER,http://sanic.readthedocs.io/en/latest/sanic/testing.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272694136,Unit tests against application itself, https://github.com/simonw/datasette/issues/50#issuecomment-343698214,https://api.github.com/repos/simonw/datasette/issues/50,343698214,MDEyOklzc3VlQ29tbWVudDM0MzY5ODIxNA==,9599,simonw,2017-11-11T22:23:21Z,2017-11-11T22:23:21Z,OWNER,"I'm closing #50 - more tests will be added in the future, but the framework is neatly in place for them now. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272694136,Unit tests against application itself, https://github.com/simonw/datasette/issues/51#issuecomment-344017088,https://api.github.com/repos/simonw/datasette/issues/51,344017088,MDEyOklzc3VlQ29tbWVudDM0NDAxNzA4OA==,9599,simonw,2017-11-13T18:44:23Z,2017-11-13T18:44:23Z,OWNER,Implemented in https://github.com/simonw/datasette/commit/e838bd743d31358b362875854a0ac5e78047727f,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272735257,Make a proper README, https://github.com/simonw/datasette/issues/52#issuecomment-343557070,https://api.github.com/repos/simonw/datasette/issues/52,343557070,MDEyOklzc3VlQ29tbWVudDM0MzU1NzA3MA==,9599,simonw,2017-11-10T18:57:47Z,2017-11-10T18:57:47Z,OWNER,"https://file.io/ looks like it could be good for this. It's been around since 2015, and lets you upload a temporary file which can be downloaded once. $ curl -s -F ""file=@database.db"" ""https://file.io/?expires=1d"" {""success"":true,""key"":""ySrl1j"",""link"":""https://file.io/ySrl1j"",""expiry"":""1 day""} Downloading from that URL serves up the data with a `Content-disposition` header containing the filename: simonw$ curl -vv https://file.io/ySrl1j | more % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0* Trying 34.232.1.167... * Connected to file.io (34.232.1.167) port 443 (#0) * TLS 1.2 connection using TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256 * Server certificate: file.io * Server certificate: Amazon * Server certificate: Amazon Root CA 1 * Server certificate: Starfield Services Root Certificate Authority - G2 > GET /ySrl1j HTTP/1.1 > Host: file.io > User-Agent: curl/7.43.0 > Accept: */* > < HTTP/1.1 200 OK < Date: Fri, 10 Nov 2017 18:14:38 GMT < Content-Type: undefined < Transfer-Encoding: chunked < Connection: keep-alive < X-Powered-By: Express < X-RateLimit-Limit: 5 < X-RateLimit-Remaining: 4 < Access-Control-Allow-Origin: * < Access-Control-Allow-Headers: Cache-Control,X-reqed-With,x-requested-with < Content-disposition: attachment; filename=database.db ... ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273026602,Solution for temporarily uploading DB so it can be built by docker, https://github.com/simonw/datasette/issues/52#issuecomment-350521635,https://api.github.com/repos/simonw/datasette/issues/52,350521635,MDEyOklzc3VlQ29tbWVudDM1MDUyMTYzNQ==,9599,simonw,2017-12-10T03:02:56Z,2017-12-10T03:02:56Z,OWNER,I don't think this is necessary.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273026602,Solution for temporarily uploading DB so it can be built by docker, https://github.com/simonw/datasette/issues/53#issuecomment-343699115,https://api.github.com/repos/simonw/datasette/issues/53,343699115,MDEyOklzc3VlQ29tbWVudDM0MzY5OTExNQ==,9599,simonw,2017-11-11T22:41:38Z,2017-11-11T22:41:38Z,OWNER,This needs to incorporate a sensible way of presenting custom SQL query results too. And let's get a textarea in there for executing SQL while we're at it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273054652,Implement a better database index page, https://github.com/simonw/datasette/issues/53#issuecomment-343707624,https://api.github.com/repos/simonw/datasette/issues/53,343707624,MDEyOklzc3VlQ29tbWVudDM0MzcwNzYyNA==,9599,simonw,2017-11-12T01:47:45Z,2017-11-12T01:47:45Z,OWNER,Split the SQL thing out into #65 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273054652,Implement a better database index page, https://github.com/simonw/datasette/issues/53#issuecomment-343707676,https://api.github.com/repos/simonw/datasette/issues/53,343707676,MDEyOklzc3VlQ29tbWVudDM0MzcwNzY3Ng==,9599,simonw,2017-11-12T01:49:07Z,2017-11-12T01:49:07Z,OWNER,"Here's the new design: Also lists views at the bottom (refs #54): ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273054652,Implement a better database index page, https://github.com/simonw/datasette/issues/54#issuecomment-343644891,https://api.github.com/repos/simonw/datasette/issues/54,343644891,MDEyOklzc3VlQ29tbWVudDM0MzY0NDg5MQ==,9599,simonw,2017-11-11T06:39:54Z,2017-11-11T06:39:54Z,OWNER,"I can detect something is a view like this: SELECT name from sqlite_master WHERE type ='view'; ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273121803,Views should not attempt to link to records / use rowids, https://github.com/simonw/datasette/issues/55#issuecomment-344060070,https://api.github.com/repos/simonw/datasette/issues/55,344060070,MDEyOklzc3VlQ29tbWVudDM0NDA2MDA3MA==,9599,simonw,2017-11-13T21:14:13Z,2017-11-13T21:14:13Z,OWNER,"I'm going to add some extra metadata to setup.py and then tag this as version 0.8: git tag 0.8 git push --tags Then to ship to PyPI: python setup.py bdist_wheel twine register dist/datasette-0.8-py3-none-any.whl twine upload dist/datasette-0.8-py3-none-any.whl ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127117,Ship first version to PyPI, https://github.com/simonw/datasette/issues/55#issuecomment-344061762,https://api.github.com/repos/simonw/datasette/issues/55,344061762,MDEyOklzc3VlQ29tbWVudDM0NDA2MTc2Mg==,9599,simonw,2017-11-13T21:19:43Z,2017-11-13T21:19:43Z,OWNER,And we're live! https://pypi.python.org/pypi/datasette,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127117,Ship first version to PyPI, https://github.com/simonw/datasette/issues/56#issuecomment-392601478,https://api.github.com/repos/simonw/datasette/issues/56,392601478,MDEyOklzc3VlQ29tbWVudDM5MjYwMTQ3OA==,9599,simonw,2018-05-28T20:50:24Z,2018-05-28T20:50:24Z,OWNER,I'm going to close this as WONTFIX for the moment. Once Plugins #14 grows the ability to add extra URL paths and views someone who needs this could build it as a plugin instead.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127443,Easy way to block search engine crawling in robots.txt, https://github.com/simonw/datasette/issues/57#issuecomment-343769692,https://api.github.com/repos/simonw/datasette/issues/57,343769692,MDEyOklzc3VlQ29tbWVudDM0Mzc2OTY5Mg==,9599,simonw,2017-11-12T21:32:36Z,2017-11-12T21:32:36Z,OWNER,I have created a Docker Hub public repository for this: https://hub.docker.com/r/simonwillison/datasette/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694,Ship a Docker image of the whole thing, https://github.com/simonw/datasette/issues/57#issuecomment-344145265,https://api.github.com/repos/simonw/datasette/issues/57,344145265,MDEyOklzc3VlQ29tbWVudDM0NDE0NTI2NQ==,247192,macropin,2017-11-14T04:45:38Z,2017-11-14T04:45:38Z,CONTRIBUTOR,"I'm happy to contribute this. Just let me know if you want a Dockerfile for development or production purposes, or both. If it's prod then we can just pip install the source from pypi, otherwise for dev we'll need a `requirements.txt` to speed up rebuilds.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694,Ship a Docker image of the whole thing, https://github.com/simonw/datasette/issues/57#issuecomment-344147583,https://api.github.com/repos/simonw/datasette/issues/57,344147583,MDEyOklzc3VlQ29tbWVudDM0NDE0NzU4Mw==,247192,macropin,2017-11-14T05:03:47Z,2017-11-14T05:03:47Z,CONTRIBUTOR,"Let me know if you'd like a PR. The image is usable as `docker run --rm -t -i -p 9000:8001 -v $(pwd)/db:/db datasette datasette serve /db/chinook.db`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694,Ship a Docker image of the whole thing, https://github.com/simonw/datasette/issues/57#issuecomment-344149165,https://api.github.com/repos/simonw/datasette/issues/57,344149165,MDEyOklzc3VlQ29tbWVudDM0NDE0OTE2NQ==,9599,simonw,2017-11-14T05:16:34Z,2017-11-14T05:17:14Z,OWNER,"I’m intrigued by this pattern: https://github.com/macropin/datasette/blob/147195c2fdfa2b984d8f9fc1c6cab6634970a056/Dockerfile#L8 What’s the benefit of doing that? Does it result in a smaller image size?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694,Ship a Docker image of the whole thing, https://github.com/simonw/datasette/issues/57#issuecomment-344151223,https://api.github.com/repos/simonw/datasette/issues/57,344151223,MDEyOklzc3VlQ29tbWVudDM0NDE1MTIyMw==,247192,macropin,2017-11-14T05:32:28Z,2017-11-14T05:33:03Z,CONTRIBUTOR,"The pattern is called ""multi-stage builds"". And the result is a svelte 226MB image (201MB for 3.6-slim) vs 700MB+ for the full image. It's possible to get it even smaller, but that takes a lot more work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694,Ship a Docker image of the whole thing, https://github.com/simonw/datasette/issues/57#issuecomment-344185817,https://api.github.com/repos/simonw/datasette/issues/57,344185817,MDEyOklzc3VlQ29tbWVudDM0NDE4NTgxNw==,9599,simonw,2017-11-14T08:46:24Z,2017-11-14T08:46:24Z,OWNER,Thanks for the explanation! Please do start a pull request. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694,Ship a Docker image of the whole thing, https://github.com/simonw/datasette/issues/57#issuecomment-400903871,https://api.github.com/repos/simonw/datasette/issues/57,400903871,MDEyOklzc3VlQ29tbWVudDQwMDkwMzg3MQ==,9599,simonw,2018-06-28T04:01:38Z,2018-06-28T04:01:38Z,OWNER,"Shipped to Docker Hub: https://hub.docker.com/r/datasetteproject/datasette/ I did this manually the first time. I'll set Travis up to do this automatically in #329","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694,Ship a Docker image of the whole thing, https://github.com/simonw/datasette/issues/59#issuecomment-343676574,https://api.github.com/repos/simonw/datasette/issues/59,343676574,MDEyOklzc3VlQ29tbWVudDM0MzY3NjU3NA==,9599,simonw,2017-11-11T16:29:48Z,2017-11-11T16:29:48Z,OWNER,See also #14,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273157085,datasette publish hyper, https://github.com/simonw/datasette/issues/59#issuecomment-344081876,https://api.github.com/repos/simonw/datasette/issues/59,344081876,MDEyOklzc3VlQ29tbWVudDM0NDA4MTg3Ng==,9599,simonw,2017-11-13T22:33:43Z,2017-11-13T22:33:43Z,OWNER,The `datasette package` command introduced in 4143e3b45c16cbae5e3e3419ef479a71810e7df3 is relevant here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273157085,datasette publish hyper, https://github.com/simonw/datasette/issues/59#issuecomment-344141199,https://api.github.com/repos/simonw/datasette/issues/59,344141199,MDEyOklzc3VlQ29tbWVudDM0NDE0MTE5OQ==,9599,simonw,2017-11-14T04:13:11Z,2017-11-14T04:13:11Z,OWNER,"I managed to do this manually: datasette package ~/parlgov-db/parlgov.db --metadata=parlgov.json # Output 8758ec31dda3 as the new image ID docker save 8758ec31dda3 > /tmp/my-image # I could have just piped this straight to hyper cat /tmp/my-image | hyper load # Now start the container running in hyper hyper run -d -p 80:8001 --name parlgov 8758ec31dda3 # We need to assign an IP address so we can see it hyper fip allocate 1 # Outputs 199.245.58.78 hyper fip attach 199.245.58.78 parlgov At this point, visiting the IP address in a browser showed the parlgov UI. To clean up... hyper hyper fip detach parlgov hyper fip release 199.245.58.78 hyper stop parlgov hyper rm parlgov ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273157085,datasette publish hyper, https://github.com/simonw/datasette/issues/59#issuecomment-491945391,https://api.github.com/repos/simonw/datasette/issues/59,491945391,MDEyOklzc3VlQ29tbWVudDQ5MTk0NTM5MQ==,9599,simonw,2019-05-13T19:00:44Z,2019-05-13T19:01:00Z,OWNER,Hyper shut down at the start of this year: https://news.ycombinator.com/item?id=18734658,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273157085,datasette publish hyper, https://github.com/simonw/datasette/issues/60#issuecomment-343683566,https://api.github.com/repos/simonw/datasette/issues/60,343683566,MDEyOklzc3VlQ29tbWVudDM0MzY4MzU2Ng==,9599,simonw,2017-11-11T18:12:24Z,2017-11-11T18:12:24Z,OWNER,"I’m going to solve this by making it an optional argument you can pass to the serve command. Then the Dockerfile can still build and use it but it won’t interfere with tests or dev. If argument is not passed, we will calculate hashes on startup and calculate table row counts on demand. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273163905,Rethink how metadata is generated and stored, https://github.com/simonw/datasette/issues/63#issuecomment-343697291,https://api.github.com/repos/simonw/datasette/issues/63,343697291,MDEyOklzc3VlQ29tbWVudDM0MzY5NzI5MQ==,9599,simonw,2017-11-11T22:05:06Z,2017-11-11T22:11:49Z,OWNER,"I'm going to bundle sql and sql_params together into a query nested object like this: { ""query"": { ""sql"": ""select ..."", ""params"": { ""p0"": ""blah"" } } }","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273174447,Review design of JSON output, https://github.com/simonw/datasette/issues/64#issuecomment-345260784,https://api.github.com/repos/simonw/datasette/issues/64,345260784,MDEyOklzc3VlQ29tbWVudDM0NTI2MDc4NA==,9599,simonw,2017-11-17T14:38:21Z,2017-11-17T14:38:21Z,OWNER,This was fixed by ed2b3f25beac720f14869350baacc5f62b065194 in #107 - thanks @raynae!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273181020,Support for ?field__isnull=1 or similar, https://github.com/simonw/datasette/issues/65#issuecomment-343709217,https://api.github.com/repos/simonw/datasette/issues/65,343709217,MDEyOklzc3VlQ29tbWVudDM0MzcwOTIxNw==,9599,simonw,2017-11-12T02:36:37Z,2017-11-12T02:36:37Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273191608,Re-implement ?sql= mode, https://github.com/simonw/datasette/issues/66#issuecomment-343752683,https://api.github.com/repos/simonw/datasette/issues/66,343752683,MDEyOklzc3VlQ29tbWVudDM0Mzc1MjY4Mw==,9599,simonw,2017-11-12T17:24:05Z,2017-11-12T17:24:21Z,OWNER,"Maybe SQL views should have their own Sanic view class (`ViewView` is kinda funny), subclassed from `TableView`?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273191806,Show table SQL on table page, https://github.com/simonw/datasette/issues/67#issuecomment-343961784,https://api.github.com/repos/simonw/datasette/issues/67,343961784,MDEyOklzc3VlQ29tbWVudDM0Mzk2MTc4NA==,9599,simonw,2017-11-13T15:50:50Z,2017-11-13T15:50:50Z,OWNER,"`datasette package ...` - same arguments as `datasette publish`. Creates Docker container in your local repo, optionally tagged with `--tag`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273192789,Command that builds a local docker container, https://github.com/simonw/datasette/issues/67#issuecomment-343967020,https://api.github.com/repos/simonw/datasette/issues/67,343967020,MDEyOklzc3VlQ29tbWVudDM0Mzk2NzAyMA==,9599,simonw,2017-11-13T16:06:10Z,2017-11-13T16:06:10Z,OWNER,http://odewahn.github.io/docker-jumpstart/example.html is helpful,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273192789,Command that builds a local docker container, https://github.com/simonw/datasette/issues/68#issuecomment-343753999,https://api.github.com/repos/simonw/datasette/issues/68,343753999,MDEyOklzc3VlQ29tbWVudDM0Mzc1Mzk5OQ==,9599,simonw,2017-11-12T17:45:21Z,2017-11-12T19:38:33Z,OWNER,"For initial launch, I could just support this as some optional command line arguments you pass to the publish command: datasette publish data.db --title=""Title"" --source=""url""","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273247186,Support for title/source/license metadata, https://github.com/simonw/datasette/issues/68#issuecomment-343754058,https://api.github.com/repos/simonw/datasette/issues/68,343754058,MDEyOklzc3VlQ29tbWVudDM0Mzc1NDA1OA==,9599,simonw,2017-11-12T17:46:13Z,2017-11-12T17:46:13Z,OWNER,I’m going to store this stuff in a file called metadata.json and move the existing automatically generated metadata to a file called build.json,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273247186,Support for title/source/license metadata, https://github.com/simonw/datasette/issues/68#issuecomment-343791348,https://api.github.com/repos/simonw/datasette/issues/68,343791348,MDEyOklzc3VlQ29tbWVudDM0Mzc5MTM0OA==,9599,simonw,2017-11-13T02:12:58Z,2017-11-13T02:12:58Z,OWNER,I should use this on https://fivethirtyeight.datasettes.com/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273247186,Support for title/source/license metadata, https://github.com/simonw/datasette/issues/68#issuecomment-343951751,https://api.github.com/repos/simonw/datasette/issues/68,343951751,MDEyOklzc3VlQ29tbWVudDM0Mzk1MTc1MQ==,9599,simonw,2017-11-13T15:21:04Z,2017-11-13T15:21:04Z,OWNER,"For first version, I'm just supporting title, source and license information at the database level.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273247186,Support for title/source/license metadata, https://github.com/simonw/datasette/issues/69#issuecomment-343752579,https://api.github.com/repos/simonw/datasette/issues/69,343752579,MDEyOklzc3VlQ29tbWVudDM0Mzc1MjU3OQ==,9599,simonw,2017-11-12T17:22:39Z,2017-11-12T17:22:39Z,OWNER,"By default I'll allow LIMIT and OFFSET up to a maximum of X (where X is let's say 50,000 to start with, but can be custom configured to a larger number or set to None for no limit).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273248366,Enforce pagination (or at least limits) for arbitrary custom SQL, https://github.com/simonw/datasette/issues/69#issuecomment-343780039,https://api.github.com/repos/simonw/datasette/issues/69,343780039,MDEyOklzc3VlQ29tbWVudDM0Mzc4MDAzOQ==,9599,simonw,2017-11-13T00:05:27Z,2017-11-13T00:05:27Z,OWNER,"I think the only safe way to do this is using SQLite `.fetchmany(1000)` - I can't guarantee that the user has not entered SQL that will outfox a limit in some way. So instead of attempting to edit their SQL, I'll always return 1001 records and let them know if they went over 1000 or not.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273248366,Enforce pagination (or at least limits) for arbitrary custom SQL, https://github.com/simonw/datasette/issues/69#issuecomment-344019631,https://api.github.com/repos/simonw/datasette/issues/69,344019631,MDEyOklzc3VlQ29tbWVudDM0NDAxOTYzMQ==,9599,simonw,2017-11-13T18:53:13Z,2017-11-13T18:53:13Z,OWNER,I'm going with a page size of 100 and a max limit of 1000,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273248366,Enforce pagination (or at least limits) for arbitrary custom SQL, https://github.com/simonw/datasette/issues/69#issuecomment-344048656,https://api.github.com/repos/simonw/datasette/issues/69,344048656,MDEyOklzc3VlQ29tbWVudDM0NDA0ODY1Ng==,9599,simonw,2017-11-13T20:32:47Z,2017-11-13T20:32:47Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273248366,Enforce pagination (or at least limits) for arbitrary custom SQL, https://github.com/simonw/datasette/issues/71#issuecomment-343780141,https://api.github.com/repos/simonw/datasette/issues/71,343780141,MDEyOklzc3VlQ29tbWVudDM0Mzc4MDE0MQ==,9599,simonw,2017-11-13T00:06:52Z,2017-11-13T00:06:52Z,OWNER,I've registered datasettes.com as a domain name for doing this. Now setting it up so Cloudflare and Now can serve content from it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343780539,https://api.github.com/repos/simonw/datasette/issues/71,343780539,MDEyOklzc3VlQ29tbWVudDM0Mzc4MDUzOQ==,9599,simonw,2017-11-13T00:13:29Z,2017-11-13T00:19:46Z,OWNER,"https://zeit.co/docs/features/dns is docs now domain add -e datasettes.com I had to set up a custom TXT record on `_now.datasettes.com` to get this to work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343780671,https://api.github.com/repos/simonw/datasette/issues/71,343780671,MDEyOklzc3VlQ29tbWVudDM0Mzc4MDY3MQ==,9599,simonw,2017-11-13T00:15:21Z,2017-11-13T00:17:37Z,OWNER,- [x] Redirect https://datasettes.com/ and https://www.datasettes.com/ to https://github.com/simonw/datasette,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343780814,https://api.github.com/repos/simonw/datasette/issues/71,343780814,MDEyOklzc3VlQ29tbWVudDM0Mzc4MDgxNA==,9599,simonw,2017-11-13T00:17:50Z,2017-11-13T00:18:19Z,OWNER,"Achieved those redirects using Cloudflare ""page rules"": https://www.cloudflare.com/a/page-rules/datasettes.com","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343781030,https://api.github.com/repos/simonw/datasette/issues/71,343781030,MDEyOklzc3VlQ29tbWVudDM0Mzc4MTAzMA==,9599,simonw,2017-11-13T00:21:05Z,2017-11-13T02:09:32Z,OWNER,"- [x] Have `now domain add -e datasettes.com` run without errors (hopefully just a matter of waiting for the DNS to update) - [x] Alias an example dataset hosted on Now on a datasettes.com subdomain - [x] Confirm that HTTP caching and HTTP/2 redirect pushing works as expected - this may require another page rule","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343788581,https://api.github.com/repos/simonw/datasette/issues/71,343788581,MDEyOklzc3VlQ29tbWVudDM0Mzc4ODU4MQ==,9599,simonw,2017-11-13T01:48:17Z,2017-11-13T01:48:17Z,OWNER,"I had to add a rule like this to get letsencrypt certificates on now.sh working: https://github.com/zeit/now-cli/issues/188#issuecomment-270105052 I also have to flip this switch off every time I want to add a new alias: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343788780,https://api.github.com/repos/simonw/datasette/issues/71,343788780,MDEyOklzc3VlQ29tbWVudDM0Mzc4ODc4MA==,9599,simonw,2017-11-13T01:50:01Z,2017-11-13T01:50:01Z,OWNER,"Added another page rule in order to get Cloudflare to always obey cache headers sent by the server: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343788817,https://api.github.com/repos/simonw/datasette/issues/71,343788817,MDEyOklzc3VlQ29tbWVudDM0Mzc4ODgxNw==,9599,simonw,2017-11-13T01:50:27Z,2017-11-13T01:50:27Z,OWNER,https://fivethirtyeight.datasettes.com/ is now up and running.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343789162,https://api.github.com/repos/simonw/datasette/issues/71,343789162,MDEyOklzc3VlQ29tbWVudDM0Mzc4OTE2Mg==,9599,simonw,2017-11-13T01:53:29Z,2017-11-13T01:53:29Z,OWNER,"``` $ curl -i 'https://fivethirtyeight.datasettes.com/fivethirtyeight-75d605c/obama-commutations%2Fobama_commutations.csv.jsono' HTTP/1.1 200 OK Date: Mon, 13 Nov 2017 01:50:57 GMT Content-Type: application/json Transfer-Encoding: chunked Connection: keep-alive Set-Cookie: __cfduid=de836090f3e12a60579cc7a1696cf0d9e1510537857; expires=Tue, 13-Nov-18 01:50:57 GMT; path=/; domain=.datasettes.com; HttpOnly; Secure Access-Control-Allow-Origin: * Cache-Control: public, max-age=31536000 X-Now-Region: now-sfo CF-Cache-Status: HIT Expires: Tue, 13 Nov 2018 01:50:57 GMT Server: cloudflare-nginx CF-RAY: 3bce154a6d9293b4-SJC {""database"": ""fivethirtyeight"", ""table"": ""obama-commutations/obama_commutations.csv""...```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343790984,https://api.github.com/repos/simonw/datasette/issues/71,343790984,MDEyOklzc3VlQ29tbWVudDM0Mzc5MDk4NA==,9599,simonw,2017-11-13T02:09:34Z,2017-11-13T02:09:34Z,OWNER,"HTTP/2 push totally worked on the redirect! fetch('https://fivethirtyeight.datasettes.com/fivethirtyeight/riddler-pick-lowest%2Flow_numbers.csv.jsono').then(r => r.json()).then(console.log) Meanwhile, in the network pane... ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/73#issuecomment-343801392,https://api.github.com/repos/simonw/datasette/issues/73,343801392,MDEyOklzc3VlQ29tbWVudDM0MzgwMTM5Mg==,9599,simonw,2017-11-13T03:36:47Z,2017-11-13T03:36:47Z,OWNER,"While I’m at it, let’s allow people to opt out of HTTP/2 push with a ?_nopush=1 argument too - in case they decide they don’t want to receive large 302 responses.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273296178,_nocache=1 query string option for use with sort-by-random, https://github.com/simonw/datasette/issues/73#issuecomment-392574415,https://api.github.com/repos/simonw/datasette/issues/73,392574415,MDEyOklzc3VlQ29tbWVudDM5MjU3NDQxNQ==,9599,simonw,2018-05-28T17:25:14Z,2018-05-28T17:25:14Z,OWNER,I implemented this as `?_ttl=0` in #289 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273296178,_nocache=1 query string option for use with sort-by-random, https://github.com/simonw/datasette/issues/74#issuecomment-344018680,https://api.github.com/repos/simonw/datasette/issues/74,344018680,MDEyOklzc3VlQ29tbWVudDM0NDAxODY4MA==,9599,simonw,2017-11-13T18:49:58Z,2017-11-13T18:49:58Z,OWNER,Turns out it does this already: https://github.com/simonw/datasette/blob/6b3b05b6db0d2a7b7cec8b8dbb4ddc5e12a376b2/datasette/app.py#L96-L107,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273296684,Send a 302 redirect to the new hash for hits to old hashes, https://github.com/simonw/datasette/issues/75#issuecomment-344000982,https://api.github.com/repos/simonw/datasette/issues/75,344000982,MDEyOklzc3VlQ29tbWVudDM0NDAwMDk4Mg==,9599,simonw,2017-11-13T17:50:27Z,2017-11-13T17:50:27Z,OWNER,"This is necessary because one of the fun things to do with this tool is run it locally, e.g.: datasette ~/Library/Application\ Support/Google/Chrome/Default/History -p 8003 BUT... if we enable CORS by default, an evil site could try sniffing for localhost:8003 and attempt to steal data. So we'll enable the CORS headers only if `--cors` is provided to the command, and then use that command in the default Dockerfile.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273509159,Add --cors argument to serve, https://github.com/simonw/datasette/issues/79#issuecomment-344141515,https://api.github.com/repos/simonw/datasette/issues/79,344141515,MDEyOklzc3VlQ29tbWVudDM0NDE0MTUxNQ==,9599,simonw,2017-11-14T04:16:01Z,2017-11-14T04:16:01Z,OWNER,This is probably a bit too much for the README - I should get readthedocs working.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273569068,Add more detailed API documentation to the README, https://github.com/simonw/datasette/issues/79#issuecomment-384675792,https://api.github.com/repos/simonw/datasette/issues/79,384675792,MDEyOklzc3VlQ29tbWVudDM4NDY3NTc5Mg==,9599,simonw,2018-04-26T15:08:13Z,2018-04-26T15:08:13Z,OWNER,"Docs now live at http://datasette.readthedocs.io/ I still need to document a few more parts of the API before closing this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273569068,Add more detailed API documentation to the README, https://github.com/simonw/datasette/issues/79#issuecomment-392574358,https://api.github.com/repos/simonw/datasette/issues/79,392574358,MDEyOklzc3VlQ29tbWVudDM5MjU3NDM1OA==,9599,simonw,2018-05-28T17:24:48Z,2018-05-28T17:24:48Z,OWNER,Closing this as obsolete in favor of other issues [tagged documentation](https://github.com/simonw/datasette/labels/documentation).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273569068,Add more detailed API documentation to the README, https://github.com/simonw/datasette/issues/80#issuecomment-344074443,https://api.github.com/repos/simonw/datasette/issues/80,344074443,MDEyOklzc3VlQ29tbWVudDM0NDA3NDQ0Mw==,9599,simonw,2017-11-13T22:04:54Z,2017-11-13T22:05:02Z,OWNER,"The fivethirtyeight dataset: datasette publish now --name fivethirtyeight --metadata metadata.json fivethirtyeight.db now alias https://fivethirtyeight-jyqfudvjli.now.sh fivethirtyeight.datasettes.com And parlgov: datasette publish now parlgov.db --name=parlgov --metadata=parlgov.json now alias https://parlgov-hqvxuhmbyh.now.sh parlgov.datasettes.com ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273569477,Deploy final versions of fivethirtyeight and parlgov datasets (with view pagination), https://github.com/simonw/datasette/issues/80#issuecomment-344075696,https://api.github.com/repos/simonw/datasette/issues/80,344075696,MDEyOklzc3VlQ29tbWVudDM0NDA3NTY5Ng==,9599,simonw,2017-11-13T22:09:46Z,2017-11-13T22:09:46Z,OWNER,"Parlgov was throwing errors on one of the views, which takes longer than 1000ms to execute - so I added the ability to customize the time limit in https://github.com/simonw/datasette/commit/1e698787a4dd6df0432021a6814c446c8b69bba2 datasette publish now parlgov.db --metadata parlgov.json --name parlgov --extra-options=""--sql_time_limit_ms=3500"" now alias https://parlgov-nvkcowlixq.now.sh parlgov.datasettes.com https://parlgov.datasettes.com/parlgov-25f9855/view_cabinet now returns in just over 2.5s ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273569477,Deploy final versions of fivethirtyeight and parlgov datasets (with view pagination), https://github.com/simonw/datasette/pull/81#issuecomment-344076554,https://api.github.com/repos/simonw/datasette/issues/81,344076554,MDEyOklzc3VlQ29tbWVudDM0NDA3NjU1NA==,9599,simonw,2017-11-13T22:12:57Z,2017-11-13T22:12:57Z,OWNER,"Hah, I haven't even announced this yet :) Travis is upset because I'm using SQL in the tests which isn't compatible with their version of Python 3.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273595473,:fire: Removes DS_Store, https://github.com/simonw/datasette/pull/81#issuecomment-344125441,https://api.github.com/repos/simonw/datasette/issues/81,344125441,MDEyOklzc3VlQ29tbWVudDM0NDEyNTQ0MQ==,50527,jefftriplett,2017-11-14T02:24:54Z,2017-11-14T02:24:54Z,CONTRIBUTOR,"Oops, if I jumped the gun. I saw the project in my github activity feed and saw some low hanging fruit :) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273595473,:fire: Removes DS_Store, https://github.com/simonw/datasette/issues/82#issuecomment-344118849,https://api.github.com/repos/simonw/datasette/issues/82,344118849,MDEyOklzc3VlQ29tbWVudDM0NDExODg0OQ==,9599,simonw,2017-11-14T01:46:10Z,2017-11-14T01:46:10Z,OWNER,Did this: https://simonwillison.net/2017/Nov/13/datasette/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273596159,Post a blog entry announcing it to the world, https://github.com/simonw/datasette/issues/85#issuecomment-344452063,https://api.github.com/repos/simonw/datasette/issues/85,344452063,MDEyOklzc3VlQ29tbWVudDM0NDQ1MjA2Mw==,9599,simonw,2017-11-15T01:03:03Z,2017-11-15T01:03:03Z,OWNER,"This can work in reverse too. If you view the row page for something that has foreign keys against it, we can show you “53 items in TABLE link to this” and provide a link to view them all. That count worry could be prohibitively expensive. To counter that, we could run the count query via Ajax and set a strict time limit on it. See #95","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273678673,Detect foreign keys and use them to link HTML pages together, https://github.com/simonw/datasette/issues/85#issuecomment-344452326,https://api.github.com/repos/simonw/datasette/issues/85,344452326,MDEyOklzc3VlQ29tbWVudDM0NDQ1MjMyNg==,9599,simonw,2017-11-15T01:04:38Z,2017-11-15T01:04:38Z,OWNER,This will work well in conjunction with https://github.com/simonw/csvs-to-sqlite/issues/2,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273678673,Detect foreign keys and use them to link HTML pages together, https://github.com/simonw/datasette/issues/85#issuecomment-344657040,https://api.github.com/repos/simonw/datasette/issues/85,344657040,MDEyOklzc3VlQ29tbWVudDM0NDY1NzA0MA==,9599,simonw,2017-11-15T16:56:48Z,2017-11-15T16:56:48Z,OWNER,"Since detecting foreign keys that point to a specific table is a bit expensive (you have to call a PRAGMA on every other table) I’m going to add this to the build/inspect stage. Idea: if we detect that the foreign key table only has one other column in it (id, name) AND we know that the id is the primary key, we can add an efficient lookup on the table list view and prefetch a dictionary mapping IDs to their value. Then we can feed that dictionary in as extra tenplate context and use it to render labeled hyperlinks in the corresponding column. This means our build step should also cache which columns are indexed, and add a “label_column” property for tables with an obvious lane column.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273678673,Detect foreign keys and use them to link HTML pages together, https://github.com/simonw/datasette/issues/85#issuecomment-345150048,https://api.github.com/repos/simonw/datasette/issues/85,345150048,MDEyOklzc3VlQ29tbWVudDM0NTE1MDA0OA==,9599,simonw,2017-11-17T05:35:25Z,2017-11-17T05:35:25Z,OWNER,`csvs-to-sqlite` is now capable of generating databases with foreign key lookup tables: https://github.com/simonw/csvs-to-sqlite/releases/tag/0.3,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273678673,Detect foreign keys and use them to link HTML pages together, https://github.com/simonw/datasette/issues/85#issuecomment-345242447,https://api.github.com/repos/simonw/datasette/issues/85,345242447,MDEyOklzc3VlQ29tbWVudDM0NTI0MjQ0Nw==,9599,simonw,2017-11-17T13:22:33Z,2017-11-17T13:23:14Z,OWNER,"I could support explicit label columns using additional arguments to `datasette serve`: datasette serve mydb.py --label-column mydb:table1:name --label-column mydb:table2:title This would mean ""in mydb, set the label column for table1 to name, and the label column for table2 to title""","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273678673,Detect foreign keys and use them to link HTML pages together, https://github.com/simonw/datasette/issues/85#issuecomment-345494724,https://api.github.com/repos/simonw/datasette/issues/85,345494724,MDEyOklzc3VlQ29tbWVudDM0NTQ5NDcyNA==,9599,simonw,2017-11-19T06:08:19Z,2017-11-19T06:08:19Z,OWNER,"This is working really nicely now: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273678673,Detect foreign keys and use them to link HTML pages together, https://github.com/simonw/datasette/issues/86#issuecomment-345494775,https://api.github.com/repos/simonw/datasette/issues/86,345494775,MDEyOklzc3VlQ29tbWVudDM0NTQ5NDc3NQ==,9599,simonw,2017-11-19T06:09:43Z,2017-11-19T06:09:43Z,OWNER,"Now that we have foreign key support (#85) this is even more important, since foreign key support actively encourages linking to filtered table views.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-345494918,https://api.github.com/repos/simonw/datasette/issues/86,345494918,MDEyOklzc3VlQ29tbWVudDM0NTQ5NDkxOA==,9599,simonw,2017-11-19T06:14:17Z,2017-11-19T06:14:17Z,OWNER,"If the selected relationship is a foreign key reference, we should resolve that foreign key and display it on the page.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-345496540,https://api.github.com/repos/simonw/datasette/issues/86,345496540,MDEyOklzc3VlQ29tbWVudDM0NTQ5NjU0MA==,9599,simonw,2017-11-19T06:59:40Z,2017-11-19T06:59:40Z,OWNER,"OK,I've figured out how to do an initial version of this without JavaScript. I'll provide three form fields labell d ""add filter"": * a select box of all of the columns * a select box of the available operations * a value box Submit those and the site will redirect you to a correctly populated querystring for that filter. If you have filters applied, those will display as prepopulated form field triples. For foreign key reference filters, I will display the resolved value next to the text box containing the numeric ID. In the future this can get a select2 style treatment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-345497453,https://api.github.com/repos/simonw/datasette/issues/86,345497453,MDEyOklzc3VlQ29tbWVudDM0NTQ5NzQ1Mw==,9599,simonw,2017-11-19T07:21:22Z,2017-11-19T07:21:22Z,OWNER,I'm going to be a bit classier about this and auto generate a title for the page that describes the currently applied filters.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-345497534,https://api.github.com/repos/simonw/datasette/issues/86,345497534,MDEyOklzc3VlQ29tbWVudDM0NTQ5NzUzNA==,9599,simonw,2017-11-19T07:23:33Z,2017-11-19T07:23:33Z,OWNER,"""Tablename: 3,567 rows where status = 3 (published) and n > 55""","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-345497689,https://api.github.com/repos/simonw/datasette/issues/86,345497689,MDEyOklzc3VlQ29tbWVudDM0NTQ5NzY4OQ==,9599,simonw,2017-11-19T07:27:40Z,2017-11-19T07:27:40Z,OWNER,"I'll have to refactor the foreign key annotating code to be usable in other contexts - at the moment it only works for annotating displays of rows, but I need to use it to resolve selected filters as well. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-345559864,https://api.github.com/repos/simonw/datasette/issues/86,345559864,MDEyOklzc3VlQ29tbWVudDM0NTU1OTg2NA==,9599,simonw,2017-11-19T23:35:48Z,2017-11-19T23:35:48Z,OWNER,"I need a nicer abstraction around the concept of filters. It needs to be able to: - convert querystring parameters into filters - convert filters into a querystring - iterate through currently applied filters - convert selected filters into a human description (e.g. for a title) - expand filters that involve a foreign key - add filters - remove filters - define different types of filters It should replace my current `build_where_clauses` implementation, in particular this bit: https://github.com/simonw/datasette/blob/a5881e105a02830d26f07e98177248d5910893da/datasette/utils.py#L38-L56","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-346530498,https://api.github.com/repos/simonw/datasette/issues/86,346530498,MDEyOklzc3VlQ29tbWVudDM0NjUzMDQ5OA==,9599,simonw,2017-11-23T04:35:07Z,2017-11-23T04:35:07Z,OWNER,"Here's where I am now. Needs a bit of UI tidy up and it will be good to release: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-346691243,https://api.github.com/repos/simonw/datasette/issues/86,346691243,MDEyOklzc3VlQ29tbWVudDM0NjY5MTI0Mw==,9599,simonw,2017-11-23T20:07:15Z,2017-11-23T20:07:15Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-346694211,https://api.github.com/repos/simonw/datasette/issues/86,346694211,MDEyOklzc3VlQ29tbWVudDM0NjY5NDIxMQ==,9599,simonw,2017-11-23T20:34:32Z,2017-11-23T20:34:32Z,OWNER,And with ef3eacf622e69723d48ab1ad597645770a7361db I'm ready to call this one done.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/87#issuecomment-403909671,https://api.github.com/repos/simonw/datasette/issues/87,403909671,MDEyOklzc3VlQ29tbWVudDQwMzkwOTY3MQ==,9599,simonw,2018-07-10T17:49:12Z,2018-07-10T17:49:12Z,OWNER,This was fixed by https://github.com/simonw/datasette/commit/6a32684ebba89dfe882e1147b23aa8778479f5d8#diff-354f30a63fb0907d4ad57269548329e3,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273709194,Configure Travis to release new tags to PyPI, https://github.com/simonw/datasette/issues/88#issuecomment-344427448,https://api.github.com/repos/simonw/datasette/issues/88,344427448,MDEyOklzc3VlQ29tbWVudDM0NDQyNzQ0OA==,9599,simonw,2017-11-14T22:54:06Z,2017-11-14T22:54:06Z,OWNER,Hooray! First dataset that wasn't deployed by me :) https://github.com/simonw/datasette/wiki/Datasettes,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273775212,Add NHS England Hospitals example to wiki, https://github.com/simonw/datasette/issues/88#issuecomment-344427560,https://api.github.com/repos/simonw/datasette/issues/88,344427560,MDEyOklzc3VlQ29tbWVudDM0NDQyNzU2MA==,9599,simonw,2017-11-14T22:54:33Z,2017-11-14T22:54:33Z,OWNER,I'm getting an internal server error on http://run.plnkr.co/preview/cj9zlf1qc0003414y90ajkwpk/ at the moment,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273775212,Add NHS England Hospitals example to wiki, https://github.com/simonw/datasette/issues/88#issuecomment-344430689,https://api.github.com/repos/simonw/datasette/issues/88,344430689,MDEyOklzc3VlQ29tbWVudDM0NDQzMDY4OQ==,15543,tomdyson,2017-11-14T23:08:22Z,2017-11-14T23:08:22Z,CONTRIBUTOR,"> I'm getting an internal server error on http://run.plnkr.co/preview/cj9zlf1qc0003414y90ajkwpk/ at the moment Sorry about that - here's a working version on Netlify: https://nhs-england-map.netlify.com","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273775212,Add NHS England Hospitals example to wiki, https://github.com/simonw/datasette/issues/88#issuecomment-804471733,https://api.github.com/repos/simonw/datasette/issues/88,804471733,MDEyOklzc3VlQ29tbWVudDgwNDQ3MTczMw==,192568,mroswell,2021-03-22T23:46:36Z,2021-03-22T23:46:36Z,CONTRIBUTOR,Google Map API limits seem to prevent https://nhs-england-map.netlify.com from being a working demo.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273775212,Add NHS England Hospitals example to wiki, https://github.com/simonw/datasette/pull/89#issuecomment-344462277,https://api.github.com/repos/simonw/datasette/issues/89,344462277,MDEyOklzc3VlQ29tbWVudDM0NDQ2MjI3Nw==,9599,simonw,2017-11-15T02:02:52Z,2017-11-15T02:02:52Z,OWNER,"This is exactly what I was after, thanks!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273816720,SQL syntax highlighting with CodeMirror, https://github.com/simonw/datasette/issues/90#issuecomment-344667202,https://api.github.com/repos/simonw/datasette/issues/90,344667202,MDEyOklzc3VlQ29tbWVudDM0NDY2NzIwMg==,9599,simonw,2017-11-15T17:29:38Z,2017-11-15T17:29:38Z,OWNER,@jacobian points out that a buildpack may be a better fit than a Docker container for implementing this: https://twitter.com/jacobian/status/930849058465255424,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/90#issuecomment-344680385,https://api.github.com/repos/simonw/datasette/issues/90,344680385,MDEyOklzc3VlQ29tbWVudDM0NDY4MDM4NQ==,9599,simonw,2017-11-15T18:14:11Z,2017-11-15T18:14:11Z,OWNER,"Maybe we don’t even need a buildpack... we could create a temporary directory, set up a classic heroku app with the datasette serve command in the Procfile and then git push to deploy.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/90#issuecomment-344686483,https://api.github.com/repos/simonw/datasette/issues/90,344686483,MDEyOklzc3VlQ29tbWVudDM0NDY4NjQ4Mw==,9599,simonw,2017-11-15T18:36:23Z,2017-11-15T18:36:23Z,OWNER,The “datasette build” command would need to run in a bin/post_compile script eg https://github.com/simonw/simonwillisonblog/blob/cloudflare-ips/bin/post_compile,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/90#issuecomment-344687328,https://api.github.com/repos/simonw/datasette/issues/90,344687328,MDEyOklzc3VlQ29tbWVudDM0NDY4NzMyOA==,9599,simonw,2017-11-15T18:39:14Z,2017-11-15T18:39:49Z,OWNER,"By default the command could use a temporary directory that gets cleaned up after the deploy, but we could allow users to opt in to keeping the generated directory like so: datasette publish heroku mydb.py -d ~/dev/my-heroku-app This would create the my-heroku-app folder so you can later execute further git deploys from there.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/90#issuecomment-346161985,https://api.github.com/repos/simonw/datasette/issues/90,346161985,MDEyOklzc3VlQ29tbWVudDM0NjE2MTk4NQ==,9599,simonw,2017-11-21T21:10:22Z,2017-11-21T21:10:22Z,OWNER,"Woohoo! I've found one tiny issue: right now, the following doesn't work: datasette publish heroku ../demo-databses/google-trends.db It results in this error in the Heroku logs: 2017-11-21T21:03:29.210511+00:00 app[web.1]: Usage: datasette serve [OPTIONS] [FILES]... 2017-11-21T21:03:29.210524+00:00 app[web.1]: 2017-11-21T21:03:29.210555+00:00 app[web.1]: Error: Invalid value for ""files"": Path ""../demo-databses/google-trends.db"" does not exist. The command works fine if you run it in the same directory as the database file you are publishing.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/90#issuecomment-346163513,https://api.github.com/repos/simonw/datasette/issues/90,346163513,MDEyOklzc3VlQ29tbWVudDM0NjE2MzUxMw==,9599,simonw,2017-11-21T21:16:16Z,2017-11-21T21:16:16Z,OWNER,"The reason relative paths work for `publish now` is that the `make_dockerfile()` function is called by passing the file names, not the full file paths: https://github.com/simonw/datasette/blob/e47117ce1d15f11246a3120aa49de70205713d05/datasette/utils.py#L166 Clearly the correct thing to do here is for us to refactor the shared code between heroku/package/now.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/90#issuecomment-350521711,https://api.github.com/repos/simonw/datasette/issues/90,350521711,MDEyOklzc3VlQ29tbWVudDM1MDUyMTcxMQ==,9599,simonw,2017-12-10T03:05:48Z,2017-12-10T03:05:48Z,OWNER,I fixed that last issue in c195ee4d46f2577b1943836a8270d84c8341d138,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/90#issuecomment-350521736,https://api.github.com/repos/simonw/datasette/issues/90,350521736,MDEyOklzc3VlQ29tbWVudDM1MDUyMTczNg==,9599,simonw,2017-12-10T03:06:34Z,2017-12-10T03:06:34Z,OWNER,Heroku is now in the README as of 6bdfcf60760c27e29ff34692d06e62b36aeecc56,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/91#issuecomment-350521780,https://api.github.com/repos/simonw/datasette/issues/91,350521780,MDEyOklzc3VlQ29tbWVudDM1MDUyMTc4MA==,9599,simonw,2017-12-10T03:07:53Z,2017-12-10T03:07:53Z,OWNER,Won't fix - I think the custom templates and static stuff in https://github.com/simonw/datasette/releases/tag/0.14 renders this obsolete.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273878873,"Option to serve databases from a different prefix, serve regular content elsewhere", https://github.com/simonw/datasette/issues/93#issuecomment-344409906,https://api.github.com/repos/simonw/datasette/issues/93,344409906,MDEyOklzc3VlQ29tbWVudDM0NDQwOTkwNg==,9599,simonw,2017-11-14T21:47:02Z,2017-11-14T21:47:02Z,OWNER,"Even without bundling in the database file itself, I'd love to have a standalone binary version of the core `datasette` CLI utility. I think Sanic may have some complex dependencies, but I've never tried pyinstaller so I don't know how easy or hard it would be to get this working.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-344415756,https://api.github.com/repos/simonw/datasette/issues/93,344415756,MDEyOklzc3VlQ29tbWVudDM0NDQxNTc1Ng==,9599,simonw,2017-11-14T22:09:13Z,2017-11-14T22:09:13Z,OWNER,Looks like we'd need to use this recipe: https://github.com/pyinstaller/pyinstaller/wiki/Recipe-Setuptools-Entry-Point,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-344424382,https://api.github.com/repos/simonw/datasette/issues/93,344424382,MDEyOklzc3VlQ29tbWVudDM0NDQyNDM4Mg==,67420,atomotic,2017-11-14T22:42:16Z,2017-11-14T22:42:16Z,NONE,"tried quickly, this seems working: ``` ~ pip3 install pyinstaller ~ pyinstaller -F --add-data /usr/local/lib/python3.6/site-packages/datasette/templates:datasette/templates --add-data /usr/local/lib/python3.6/site-packages/datasette/static:datasette/static /usr/local/bin/datasette ~ du -h dist/datasette 6.8M dist/datasette ~ file dist/datasette dist/datasette: Mach-O 64-bit executable x86_64 ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-344426887,https://api.github.com/repos/simonw/datasette/issues/93,344426887,MDEyOklzc3VlQ29tbWVudDM0NDQyNjg4Nw==,9599,simonw,2017-11-14T22:51:46Z,2017-11-14T22:51:46Z,OWNER,"That didn't quite work for me. It built me a `dist/datasette` executable but when I try to run it I get an error: $ pwd /Users/simonw/Dropbox/Development/datasette $ source venv/bin/activate $ pyinstaller -F --add-data datasette/templates:datasette/templates --add-data datasette/static:datasette/static /Users/simonw/Dropbox/Development/datasette/venv/bin/datasette $ dist/datasette --help Traceback (most recent call last): File ""datasette"", line 11, in File ""site-packages/pkg_resources/__init__.py"", line 572, in load_entry_point File ""site-packages/pkg_resources/__init__.py"", line 564, in get_distribution File ""site-packages/pkg_resources/__init__.py"", line 436, in get_provider File ""site-packages/pkg_resources/__init__.py"", line 984, in require File ""site-packages/pkg_resources/__init__.py"", line 870, in resolve pkg_resources.DistributionNotFound: The 'datasette' distribution was not found and is required by the application [99117] Failed to execute script datasette ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-344430299,https://api.github.com/repos/simonw/datasette/issues/93,344430299,MDEyOklzc3VlQ29tbWVudDM0NDQzMDI5OQ==,67420,atomotic,2017-11-14T23:06:33Z,2017-11-14T23:06:33Z,NONE,"i will look better tomorrow, it's late i surely made some mistake https://asciinema.org/a/ZyAWbetrlriDadwWyVPUWB94H","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-344440377,https://api.github.com/repos/simonw/datasette/issues/93,344440377,MDEyOklzc3VlQ29tbWVudDM0NDQ0MDM3Nw==,9599,simonw,2017-11-14T23:56:35Z,2017-11-14T23:56:35Z,OWNER,"It worked! $ pyinstaller -F \ --add-data /usr/local/lib/python3.5/site-packages/datasette/templates:datasette/templates \ --add-data /usr/local/lib/python3.5/site-packages/datasette/static:datasette/static \ /usr/local/bin/datasette $ file dist/datasette dist/datasette: Mach-O 64-bit executable x86_64 $ dist/datasette --help Usage: datasette [OPTIONS] COMMAND [ARGS]... Datasette! Options: --help Show this message and exit. Commands: serve* Serve up specified SQLite database files with... build package Package specified SQLite files into a new... publish Publish specified SQLite database files to... ","{""total_count"": 3, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 3, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-344440658,https://api.github.com/repos/simonw/datasette/issues/93,344440658,MDEyOklzc3VlQ29tbWVudDM0NDQ0MDY1OA==,9599,simonw,2017-11-14T23:58:07Z,2017-11-14T23:58:07Z,OWNER,It's a shame pyinstaller can't act as a cross-compiler - so I don't think I can get Travis CI to build packages. But it's fantastic that it's possible to turn the tool into a standalone executable!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-344516406,https://api.github.com/repos/simonw/datasette/issues/93,344516406,MDEyOklzc3VlQ29tbWVudDM0NDUxNjQwNg==,67420,atomotic,2017-11-15T08:09:41Z,2017-11-15T08:09:41Z,NONE,actually you can use travis to build for linux/macos and [appveyor](https://www.appveyor.com/) to build for windows.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-538760447,https://api.github.com/repos/simonw/datasette/issues/93,538760447,MDEyOklzc3VlQ29tbWVudDUzODc2MDQ0Nw==,9599,simonw,2019-10-06T15:56:01Z,2019-10-06T15:56:01Z,OWNER,Relevant conversation on Twitter: https://twitter.com/simonw/status/1180866651962560512?s=21,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754215392,https://api.github.com/repos/simonw/datasette/issues/93,754215392,MDEyOklzc3VlQ29tbWVudDc1NDIxNTM5Mg==,9599,simonw,2021-01-04T20:59:20Z,2021-01-04T21:03:14Z,OWNER,"Updated `pyinstaller` recipe - lots of hidden imports needed now: ``` pip install wheel pip install datasette pyinstaller BASE=$(python -c 'import os; print(os.path.dirname(__import__(""datasette"").__file__))') \ pyinstaller -F \ --add-data ""$BASE/templates:datasette/templates"" \ --add-data ""$BASE/static:datasette/static"" \ --hidden-import datasette.publish \ --hidden-import datasette.publish.heroku \ --hidden-import datasette.publish.cloudrun \ --hidden-import datasette.facets \ --hidden-import datasette.sql_functions \ --hidden-import datasette.actor_auth_cookie \ --hidden-import datasette.default_permissions \ --hidden-import datasette.default_magic_parameters \ --hidden-import datasette.blob_renderer \ --hidden-import datasette.default_menu_links \ --hidden-import uvicorn \ --hidden-import uvicorn.logging \ --hidden-import uvicorn.loops \ --hidden-import uvicorn.loops.auto \ --hidden-import uvicorn.protocols \ --hidden-import uvicorn.protocols.http \ --hidden-import uvicorn.protocols.http.auto \ --hidden-import uvicorn.protocols.websockets \ --hidden-import uvicorn.protocols.websockets.auto \ --hidden-import uvicorn.lifespan \ --hidden-import uvicorn.lifespan.on \ $(which datasette) ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754215793,https://api.github.com/repos/simonw/datasette/issues/93,754215793,MDEyOklzc3VlQ29tbWVudDc1NDIxNTc5Mw==,9599,simonw,2021-01-04T21:00:14Z,2021-01-04T21:00:14Z,OWNER,"``` (pyinstaller-datasette) pyinstaller-datasette % file dist/datasette dist/datasette: Mach-O 64-bit executable x86_64 (pyinstaller-datasette) pyinstaller-datasette % ls -lah dist/datasette -rwxr-xr-x 1 simon wheel 8.0M Jan 4 12:58 dist/datasette ``` I'm surprised it's only 8MB!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754218545,https://api.github.com/repos/simonw/datasette/issues/93,754218545,MDEyOklzc3VlQ29tbWVudDc1NDIxODU0NQ==,9599,simonw,2021-01-04T21:05:57Z,2021-01-04T21:05:57Z,OWNER,That BASE= trick seems to work with `zsh` but not with `bash`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754219002,https://api.github.com/repos/simonw/datasette/issues/93,754219002,MDEyOklzc3VlQ29tbWVudDc1NDIxOTAwMg==,9599,simonw,2021-01-04T21:06:49Z,2021-01-04T21:22:27Z,OWNER,"Works on Linux/Ubuntu too, except I had to do `export BASE=` on a separate line. I also did this: ``` apt-get install python3 python3-venv python3 -m venv pyinstaller-venv source pyinstaller-venv/bin/activate pip install wheel pip install datasette pyinstaller export DATASETTE_BASE=$(python -c 'import os; print(os.path.dirname(__import__(""datasette"").__file__))') pyinstaller -F \ --add-data ""$DATASETTE_BASE/templates:datasette/templates"" \ --add-data ""$DATASETTE_BASE/static:datasette/static"" \ --hidden-import datasette.publish \ --hidden-import datasette.publish.heroku \ --hidden-import datasette.publish.cloudrun \ --hidden-import datasette.facets \ --hidden-import datasette.sql_functions \ --hidden-import datasette.actor_auth_cookie \ --hidden-import datasette.default_permissions \ --hidden-import datasette.default_magic_parameters \ --hidden-import datasette.blob_renderer \ --hidden-import datasette.default_menu_links \ --hidden-import uvicorn \ --hidden-import uvicorn.logging \ --hidden-import uvicorn.loops \ --hidden-import uvicorn.loops.auto \ --hidden-import uvicorn.protocols \ --hidden-import uvicorn.protocols.http \ --hidden-import uvicorn.protocols.http.auto \ --hidden-import uvicorn.protocols.websockets \ --hidden-import uvicorn.protocols.websockets.auto \ --hidden-import uvicorn.lifespan \ --hidden-import uvicorn.lifespan.on \ $(which datasette) ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754227543,https://api.github.com/repos/simonw/datasette/issues/93,754227543,MDEyOklzc3VlQ29tbWVudDc1NDIyNzU0Mw==,9599,simonw,2021-01-04T21:23:13Z,2021-01-04T21:23:13Z,OWNER,"``` (pyinstaller-venv) root@dogsheep:/tmp/pyinstaller-venv# dist/datasette --get /-/databases.json [{""name"": "":memory:"", ""path"": null, ""size"": 0, ""is_mutable"": true, ""is_memory"": true, ""hash"": null}] (pyinstaller-venv) root@dogsheep:/tmp/pyinstaller-venv# ls -lah dist/datasette -rwxr-xr-x 1 root root 8.9M Jan 4 21:05 dist/datasette ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754229977,https://api.github.com/repos/simonw/datasette/issues/93,754229977,MDEyOklzc3VlQ29tbWVudDc1NDIyOTk3Nw==,9599,simonw,2021-01-04T21:28:01Z,2021-01-04T21:28:01Z,OWNER,"As an experiment, I put the macOS one in a zip file and attached it to the latest release: ``` mkdir datasette-0.53-macos-binary cp dist/datasette datasette-0.53-macos-binary zip -r datasette-0.53-macos-binary.zip datasette-0.53-macos-binary ``` It's available here: https://github.com/simonw/datasette/releases/tag/0.53 - download URL is https://github.com/simonw/datasette/releases/download/0.53/datasette-0.53-macos-binary.zip","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754233960,https://api.github.com/repos/simonw/datasette/issues/93,754233960,MDEyOklzc3VlQ29tbWVudDc1NDIzMzk2MA==,9599,simonw,2021-01-04T21:35:37Z,2021-01-04T21:35:37Z,OWNER,"I tested it by running a `tmate` session on the GitHub macOS machines, and it worked! ``` Mac-1609795972770:tmp runner$ wget 'https://github.com/simonw/datasette/releases/download/0.53/datasette-0.53-macos-binary.zip' --2021-01-04 21:34:10-- https://github.com/simonw/datasette/releases/download/0.53/datasette-0.53-macos-binary.zip Resolving github.com (github.com)... 140.82.114.4 Connecting to github.com (github.com)|140.82.114.4|:443... connected. HTTP request sent, awaiting response... 302 Found Location: https://github-production-release-asset-2e65be.s3.amazonaws.com/107914493/74658700-4e90-11eb-8f3b-ee77e6dfad90?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20210104%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20210104T213414Z&X-Amz-Expires=300&X-Amz-Signature=6f3c54211077092553590b33a7c36cd052895c9d4619607ad1df094782f64acf&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=107914493&response-content-disposition=attachment%3B%20filename%3Ddatasette-0.53-macos-binary.zip&response-content-type=application%2Foctet-stream [following] --2021-01-04 21:34:14-- https://github-production-release-asset-2e65be.s3.amazonaws.com/107914493/74658700-4e90-11eb-8f3b-ee77e6dfad90?X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAIWNJYAX4CSVEH53A%2F20210104%2Fus-east-1%2Fs3%2Faws4_request&X-Amz-Date=20210104T213414Z&X-Amz-Expires=300&X-Amz-Signature=6f3c54211077092553590b33a7c36cd052895c9d4619607ad1df094782f64acf&X-Amz-SignedHeaders=host&actor_id=0&key_id=0&repo_id=107914493&response-content-disposition=attachment%3B%20filename%3Ddatasette-0.53-macos-binary.zip&response-content-type=application%2Foctet-stream Resolving github-production-release-asset-2e65be.s3.amazonaws.com (github-production-release-asset-2e65be.s3.amazonaws.com)... 52.217.43.164 Connecting to github-production-release-asset-2e65be.s3.amazonaws.com (github-production-release-asset-2e65be.s3.amazonaws.com)|52.217.43.164|:443... connected. HTTP request sent, awaiting response... 200 OK Length: 8297283 (7.9M) [application/octet-stream] Saving to: ‘datasette-0.53-macos-binary.zip’ datasette-0.53-maco 100%[===================>] 7.91M --.-KB/s in 0.1s 2021-01-04 21:34:14 (73.4 MB/s) - ‘datasette-0.53-macos-binary.zip’ saved [8297283/8297283] Mac-1609795972770:tmp runner$ unzip datasette-0.53-macos-binary.zip Archive: datasette-0.53-macos-binary.zip creating: datasette-0.53-macos-binary/ inflating: datasette-0.53-macos-binary/datasette Mac-1609795972770:tmp runner$ datasette-0.53-macos-binary/datasette --help Usage: datasette [OPTIONS] COMMAND [ARGS]... Datasette! Options: --version Show the version and exit. --help Show this message and exit. Commands: serve* Serve up specified SQLite database files with a web UI inspect install Install Python packages - e.g. package Package specified SQLite files into a new datasette Docker... plugins List currently available plugins publish Publish specified SQLite database files to the internet along... uninstall Uninstall Python packages (e.g. Mac-1609795972770:tmp runner$ datasette-0.53-macos-binary/datasette --get /-/versions.json {""python"": {""version"": ""3.9.1"", ""full"": ""3.9.1 (default, Dec 10 2020, 10:36:35) \n[Clang 12.0.0 (clang-1200.0.32.27)]""}, ""datasette"": {""version"": ""0.53""}, ""asgi"": ""3.0"", ""uvicorn"": ""0.13.3"", ""sqlite"": {""version"": ""3.34.0"", ""fts_versions"": [""FTS5"", ""FTS4"", ""FTS3""], ""extensions"": {""json1"": null}, ""compile_options"": [""COMPILER=clang-12.0.0"", ""ENABLE_COLUMN_METADATA"", ""ENABLE_FTS3"", ""ENABLE_FTS3_PARENTHESIS"", ""ENABLE_FTS4"", ""ENABLE_FTS5"", ""ENABLE_GEOPOLY"", ""ENABLE_JSON1"", ""ENABLE_PREUPDATE_HOOK"", ""ENABLE_RTREE"", ""ENABLE_SESSION"", ""MAX_VARIABLE_NUMBER=250000"", ""THREADSAFE=1""]}} Mac-1609795972770:tmp runner$ ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-754285795,https://api.github.com/repos/simonw/datasette/issues/93,754285795,MDEyOklzc3VlQ29tbWVudDc1NDI4NTc5NQ==,9599,simonw,2021-01-04T23:35:13Z,2021-01-04T23:35:13Z,OWNER,Next step is to automate this all!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-974765825,https://api.github.com/repos/simonw/datasette/issues/93,974765825,IC_kwDOBm6k_c46Gb8B,9599,simonw,2021-11-21T07:00:21Z,2021-11-21T07:00:21Z,OWNER,Closing this in favour of Datasette Desktop: https://datasette.io/desktop,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/pull/94#issuecomment-344472313,https://api.github.com/repos/simonw/datasette/issues/94,344472313,MDEyOklzc3VlQ29tbWVudDM0NDQ3MjMxMw==,9599,simonw,2017-11-15T03:08:00Z,2017-11-15T03:08:00Z,OWNER,"Works for me. I'm going to land this. Just one thing: simonw$ docker run --rm -t -i -p 9001:8001 c408e8cfbe40 datasette publish now The publish command requires ""now"" to be installed and configured Follow the instructions at https://zeit.co/now#whats-now Maybe we should have the Docker container install the ""now"" client? Not sure how much size that would add though. I think it's OK without for the moment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273961179,Initial add simple prod ready Dockerfile refs #57, https://github.com/simonw/datasette/issues/95#issuecomment-344463436,https://api.github.com/repos/simonw/datasette/issues/95,344463436,MDEyOklzc3VlQ29tbWVudDM0NDQ2MzQzNg==,9599,simonw,2017-11-15T02:10:10Z,2017-11-15T02:10:10Z,OWNER,"This means clients can ask questions but say ""don't bother if it takes longer than X"" - which is really handy when you're working against unknown databases that might be small or might be enormous.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273998513,Allow shorter time limits to be set using a ?_sql_time_limit_ms =20 query string limit, https://github.com/simonw/datasette/issues/96#issuecomment-344786528,https://api.github.com/repos/simonw/datasette/issues/96,344786528,MDEyOklzc3VlQ29tbWVudDM0NDc4NjUyOA==,9599,simonw,2017-11-16T01:32:41Z,2017-11-16T01:32:41Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274001453,UI for editing named parameters, https://github.com/simonw/datasette/issues/96#issuecomment-344788435,https://api.github.com/repos/simonw/datasette/issues/96,344788435,MDEyOklzc3VlQ29tbWVudDM0NDc4ODQzNQ==,9599,simonw,2017-11-16T01:43:52Z,2017-11-16T01:43:52Z,OWNER,Demo: https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+name%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Animal+name%22%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalName%22%29+as+name+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+AnimalBreed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5BMitcham-dog-registrations-2015%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_NAME%22%29+as+name+from+%5Bburnside-dog-registrations-2015%5D+where+DOG_BREED+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Animal_Name%22%29+as+name+from+%5Bcity-of-playford-2015-dog-registration%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where%22Breed+Description%22+like+%3Abreed%0D%0A%0D%0A%29+group+by+name+order+by+n+desc%3B&breed=chihuahua,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274001453,UI for editing named parameters, https://github.com/simonw/datasette/issues/96#issuecomment-344788763,https://api.github.com/repos/simonw/datasette/issues/96,344788763,MDEyOklzc3VlQ29tbWVudDM0NDc4ODc2Mw==,9599,simonw,2017-11-16T01:45:51Z,2017-11-16T01:45:51Z,OWNER,Another demo - this time it lets you search by name and see the most popular breeds with that name: https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+breed%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Breed%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+%22Animal+name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalBreed%22%29+as+breed+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+%22AnimalName%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed%22%29+as+breed+from+%5BMitcham-dog-registrations-2015%5D+where+%22Animal+Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_BREED%22%29+as+breed+from+%5Bburnside-dog-registrations-2015%5D+where+%22DOG_NAME%22+like+%3Aname%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5Bcity-of-playford-2015-dog-registration%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed+Description%22%29+as+breed+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where+%22Animal+Name%22+like+%3Aname%0D%0A%0D%0A%29+group+by+breed+order+by+n+desc%3B&name=rex,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274001453,UI for editing named parameters, https://github.com/simonw/datasette/issues/97#issuecomment-345509500,https://api.github.com/repos/simonw/datasette/issues/97,345509500,MDEyOklzc3VlQ29tbWVudDM0NTUwOTUwMA==,231923,yschimke,2017-11-19T11:26:58Z,2017-11-19T11:26:58Z,NONE,"Specifically docs should make it clearer this file exists https://parlgov.datasettes.com/.json And from that you can build https://parlgov.datasettes.com/parlgov-25f9855.json Then https://parlgov.datasettes.com/parlgov-25f9855/cabinet.json","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274022950,Link to JSON for the list of tables , https://github.com/simonw/datasette/issues/97#issuecomment-392602334,https://api.github.com/repos/simonw/datasette/issues/97,392602334,MDEyOklzc3VlQ29tbWVudDM5MjYwMjMzNA==,9599,simonw,2018-05-28T20:57:21Z,2018-05-28T20:57:21Z,OWNER,"The `/.json` endpoint is more of an implementation detail of the homepage at this point. A better, documented ( http://datasette.readthedocs.io/en/stable/introspection.html#inspect ) endpoint for finding all of the databases and tables is https://parlgov.datasettes.com/-/inspect.json","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274022950,Link to JSON for the list of tables , https://github.com/simonw/datasette/issues/97#issuecomment-392895733,https://api.github.com/repos/simonw/datasette/issues/97,392895733,MDEyOklzc3VlQ29tbWVudDM5Mjg5NTczMw==,231923,yschimke,2018-05-29T18:51:35Z,2018-05-29T18:51:35Z,NONE,Do you have an existing example with views?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274022950,Link to JSON for the list of tables , https://github.com/simonw/datasette/issues/100#issuecomment-344771130,https://api.github.com/repos/simonw/datasette/issues/100,344771130,MDEyOklzc3VlQ29tbWVudDM0NDc3MTEzMA==,9599,simonw,2017-11-16T00:06:00Z,2017-11-16T00:06:00Z,OWNER,"Aha... it looks like this is a Jinja version problem: https://github.com/ansible/ansible/issues/25381#issuecomment-306492389 Datasette depends on sanic-jinja2 - and that doesn't depend on a particular jinja2 version: https://github.com/lixxu/sanic-jinja2/blob/7e9520850d8c6bb66faf43b7f252593d7efe3452/setup.py#L22 So if you have an older version of Jinja installed, stuff breaks.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274160723,TemplateAssertionError: no filter named 'tojson', https://github.com/simonw/datasette/issues/100#issuecomment-344864254,https://api.github.com/repos/simonw/datasette/issues/100,344864254,MDEyOklzc3VlQ29tbWVudDM0NDg2NDI1NA==,13304454,coisnepe,2017-11-16T09:25:10Z,2017-11-16T09:25:10Z,NONE,@simonw I see. I upgraded sanic-jinja2 and jinja2: it now works flawlessly. Thank you!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274160723,TemplateAssertionError: no filter named 'tojson', https://github.com/simonw/datasette/issues/101#issuecomment-344597274,https://api.github.com/repos/simonw/datasette/issues/101,344597274,MDEyOklzc3VlQ29tbWVudDM0NDU5NzI3NA==,450244,eaubin,2017-11-15T13:48:55Z,2017-11-15T13:48:55Z,NONE,This is a duplicate of https://github.com/simonw/datasette/issues/100,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274161964,TemplateAssertionError: no filter named 'tojson', https://github.com/simonw/datasette/issues/102#issuecomment-754192267,https://api.github.com/repos/simonw/datasette/issues/102,754192267,MDEyOklzc3VlQ29tbWVudDc1NDE5MjI2Nw==,9599,simonw,2021-01-04T20:13:19Z,2021-01-04T20:13:19Z,OWNER,"I'm more likely to do Lambda than Elastic Beanstalk, especially now the size limit for Lambdas has been increased as part of their support for Docker.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274264175,datasette publish elasticbeanstalk, https://github.com/simonw/datasette/issues/103#issuecomment-754188099,https://api.github.com/repos/simonw/datasette/issues/103,754188099,MDEyOklzc3VlQ29tbWVudDc1NDE4ODA5OQ==,9599,simonw,2021-01-04T20:05:14Z,2021-01-04T20:05:14Z,OWNER,"Wontfix, Cloud Run is already implemented and is a better fit for Datasette.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274265878,datasette publish appengine, https://github.com/simonw/datasette/pull/104#issuecomment-344710204,https://api.github.com/repos/simonw/datasette/issues/104,344710204,MDEyOklzc3VlQ29tbWVudDM0NDcxMDIwNA==,21148,jacobian,2017-11-15T19:57:50Z,2017-11-15T19:57:50Z,CONTRIBUTOR,"A first basic stab at making this work, just to prove the approach. Right now this requires [a Heroku CLI plugin](https://github.com/heroku/heroku-builds), which seems pretty unreasonable. I think this can be replaced with direct API calls, which could clean up a lot of things. But I wanted to prove it worked first, and it does.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274284246,[WIP] Add publish to heroku support, https://github.com/simonw/datasette/pull/104#issuecomment-345447161,https://api.github.com/repos/simonw/datasette/issues/104,345447161,MDEyOklzc3VlQ29tbWVudDM0NTQ0NzE2MQ==,9599,simonw,2017-11-18T14:53:17Z,2017-11-18T14:53:17Z,OWNER,any reason I shouldn't land this?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274284246,[WIP] Add publish to heroku support, https://github.com/simonw/datasette/pull/104#issuecomment-345452669,https://api.github.com/repos/simonw/datasette/issues/104,345452669,MDEyOklzc3VlQ29tbWVudDM0NTQ1MjY2OQ==,21148,jacobian,2017-11-18T16:18:45Z,2017-11-18T16:18:45Z,CONTRIBUTOR,"I'd like to do a bit of cleanup, and some error checking in case heroku/heroku-builds isn't installed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274284246,[WIP] Add publish to heroku support, https://github.com/simonw/datasette/pull/104#issuecomment-346116745,https://api.github.com/repos/simonw/datasette/issues/104,346116745,MDEyOklzc3VlQ29tbWVudDM0NjExNjc0NQ==,21148,jacobian,2017-11-21T18:23:25Z,2017-11-21T18:23:25Z,CONTRIBUTOR,"@simonw ready for a review and merge if you want. There's still some nasty duplicated code in cli.py and utils.py, which is just going to get worse if/when we start adding any other deploy targets (and I want to do one for cloud.gov, at least). I think there's an opportunity for some refactoring here. I'm happy to do that now as part of this PR, or if you merge this first I'll do it in a different one.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274284246,[WIP] Add publish to heroku support, https://github.com/simonw/datasette/pull/104#issuecomment-346124073,https://api.github.com/repos/simonw/datasette/issues/104,346124073,MDEyOklzc3VlQ29tbWVudDM0NjEyNDA3Mw==,21148,jacobian,2017-11-21T18:49:55Z,2017-11-21T18:49:55Z,CONTRIBUTOR,"Actually hang on, don't merge - there are some bugs that #141 masked when I tested this out elsewhere.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274284246,[WIP] Add publish to heroku support, https://github.com/simonw/datasette/pull/104#issuecomment-346124764,https://api.github.com/repos/simonw/datasette/issues/104,346124764,MDEyOklzc3VlQ29tbWVudDM0NjEyNDc2NA==,21148,jacobian,2017-11-21T18:52:14Z,2017-11-21T18:52:14Z,CONTRIBUTOR,"OK, now this should work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274284246,[WIP] Add publish to heroku support, https://github.com/simonw/datasette/issues/105#issuecomment-345493344,https://api.github.com/repos/simonw/datasette/issues/105,345493344,MDEyOklzc3VlQ29tbWVudDM0NTQ5MzM0NA==,9599,simonw,2017-11-19T05:28:49Z,2017-11-19T05:28:49Z,OWNER,Looks like there are a ton of interesting datasets packaged in this way at http://datahub.io/docs/core-data - see also https://github.com/datasets,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274314940,Consider data-package as a format for metadata, https://github.com/simonw/datasette/issues/105#issuecomment-345494052,https://api.github.com/repos/simonw/datasette/issues/105,345494052,MDEyOklzc3VlQ29tbWVudDM0NTQ5NDA1Mg==,9599,simonw,2017-11-19T05:49:53Z,2017-11-19T05:49:53Z,OWNER,https://github.com/rgieseke/pandas-datapackage-reader,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274314940,Consider data-package as a format for metadata, https://github.com/simonw/datasette/issues/105#issuecomment-345503897,https://api.github.com/repos/simonw/datasette/issues/105,345503897,MDEyOklzc3VlQ29tbWVudDM0NTUwMzg5Nw==,198537,rgieseke,2017-11-19T09:38:08Z,2017-11-19T09:38:08Z,CONTRIBUTOR,"Thanks, I wrote this very simple reader because the default approach as described on the Datahub pages seemed to complicated. I had metadata from the `datapackage.json` attached to the returned DataFrames but removed this due to some attribute handling change in the latest Pandas version. This could also be useful for getting from Data Package to SQL db: https://github.com/frictionlessdata/tableschema-sql-py I maintain a few climate science related dataset at https://github.com/openclimatedata/ The Data Retriever (mainly ecological data) by @ethanwhite et al. is also using the Data Package format for metadata and has some tooling for different dbs: https://frictionlessdata.io/articles/the-data-retriever/ https://github.com/weecology/retriever The Open Power System Data project also has a couple of datasets that show nicely how CSV is great for assembling and then already make SQLite files available. It's one of the first data sets I tried with Datasette, perfect for the use case of getting an API for putting power stations on a map ... https://data.open-power-system-data.org/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274314940,Consider data-package as a format for metadata, https://github.com/simonw/datasette/issues/105#issuecomment-345809808,https://api.github.com/repos/simonw/datasette/issues/105,345809808,MDEyOklzc3VlQ29tbWVudDM0NTgwOTgwOA==,9599,simonw,2017-11-20T19:50:53Z,2017-11-20T19:50:53Z,OWNER,"OK, https://github.com/openclimatedata/global-carbon-budget/blob/master/datapackage.json really does look like it covers all of the bases I need for #138. Closing this ticket in favour of that new one.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274314940,Consider data-package as a format for metadata, https://github.com/simonw/datasette/issues/106#issuecomment-504879510,https://api.github.com/repos/simonw/datasette/issues/106,504879510,MDEyOklzc3VlQ29tbWVudDUwNDg3OTUxMA==,9599,simonw,2019-06-24T06:42:33Z,2019-06-24T06:42:33Z,OWNER,https://datasette.readthedocs.io/en/stable/sql_queries.html?highlight=Pagination#pagination,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274315193,Document how pagination works, https://github.com/simonw/datasette/pull/107#issuecomment-344770170,https://api.github.com/repos/simonw/datasette/issues/107,344770170,MDEyOklzc3VlQ29tbWVudDM0NDc3MDE3MA==,9599,simonw,2017-11-16T00:01:00Z,2017-11-16T00:01:22Z,OWNER,"It is - but I think this will break on this line since it expects two format string parameters: https://github.com/simonw/datasette/blob/f45ca30f91b92ac68adaba893bf034f13ec61ced/datasette/utils.py#L61 Needs unit tests too, which live here: https://github.com/simonw/datasette/blob/f45ca30f91b92ac68adaba893bf034f13ec61ced/tests/test_utils.py#L49","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274343647,add support for ?field__isnull=1, https://github.com/simonw/datasette/pull/107#issuecomment-344811268,https://api.github.com/repos/simonw/datasette/issues/107,344811268,MDEyOklzc3VlQ29tbWVudDM0NDgxMTI2OA==,3433657,raynae,2017-11-16T04:17:45Z,2017-11-16T04:17:45Z,CONTRIBUTOR,"Thanks for the guidance. I added a unit test and made a slight change to utils.py. I didn't realize this, but evidently string.format only complains if you supply less arguments than there are format placeholders, so the original commit worked, but was adding a superfluous named param. I added a conditional that prevents the named param from being created and ensures the correct number of args are passed to sting.format. It has the side effect of hiding the SQL query in /templates/table.html when there are no other where clauses--not sure if that's the desired outcome here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274343647,add support for ?field__isnull=1, https://github.com/simonw/datasette/pull/107#issuecomment-345108644,https://api.github.com/repos/simonw/datasette/issues/107,345108644,MDEyOklzc3VlQ29tbWVudDM0NTEwODY0NA==,9599,simonw,2017-11-17T00:34:46Z,2017-11-17T00:34:46Z,OWNER,Looks like your tests are failing because of a bug which I fixed in https://github.com/simonw/datasette/commit/9199945a1bcec4852e1cb866eb3642614dd32a48 - if you rebase to master the tests should pass.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274343647,add support for ?field__isnull=1, https://github.com/simonw/datasette/pull/107#issuecomment-345117690,https://api.github.com/repos/simonw/datasette/issues/107,345117690,MDEyOklzc3VlQ29tbWVudDM0NTExNzY5MA==,3433657,raynae,2017-11-17T01:29:41Z,2017-11-17T01:29:41Z,CONTRIBUTOR,"Thanks for bearing with me. I was getting a message about my branch diverging when I tried to push after rebasing, so I merged master into isnull, seems like that did the trick. Let me know if I should make any corrections.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274343647,add support for ?field__isnull=1, https://github.com/simonw/datasette/issues/109#issuecomment-344986423,https://api.github.com/repos/simonw/datasette/issues/109,344986423,MDEyOklzc3VlQ29tbWVudDM0NDk4NjQyMw==,9599,simonw,2017-11-16T16:53:26Z,2017-11-16T16:53:26Z,OWNER,http://datasette.readthedocs.io/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274378301,Set up readthedocs, https://github.com/simonw/datasette/issues/110#issuecomment-344988263,https://api.github.com/repos/simonw/datasette/issues/110,344988263,MDEyOklzc3VlQ29tbWVudDM0NDk4ODI2Mw==,9599,simonw,2017-11-16T16:58:48Z,2017-11-16T16:58:48Z,OWNER,"Here's how I tested this. First I downloaded and started a docker container using https://hub.docker.com/r/prolocutor/python3-sqlite-ext - which includes the compiled spatialite extension. This downloads it, then starts a shell in that container. docker run -it -p 8018:8018 prolocutor/python3-sqlite-ext:3.5.1-spatialite /bin/sh Installed a pre-release build of datasette which includes the new `--load-extension` option. pip install https://static.simonwillison.net/static/2017/datasette-0.13-py3-none-any.whl Now grab a sample database from https://www.gaia-gis.it/spatialite-2.3.1/resources.html - and unzip and rename it (datasette doesn't yet like databases with dots in their filename): wget http://www.gaia-gis.it/spatialite-2.3.1/test-2.3.sqlite.gz gunzip test-2.3.sqlite.gz mv test-2.3.sqlite test23.sqlite Now start datasette on port 8018 (the port I exposed earlier) with the extension loaded: datasette test23.sqlite -p 8018 -h 0.0.0.0 --load-extension /usr/local/lib/mod_spatialite.so Now I can confirm that it worked: http://localhost:8018/test23-c88bc35?sql=select+ST_AsText%28Geometry%29+from+HighWays+limit+1 If I run datasette without `--load-extension` I get this: datasette test23.sqlite -p 8018 -h 0.0.0.0 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274578142,Add --load-extension option to datasette for loading extra SQLite extensions, https://github.com/simonw/datasette/issues/110#issuecomment-345017256,https://api.github.com/repos/simonw/datasette/issues/110,345017256,MDEyOklzc3VlQ29tbWVudDM0NTAxNzI1Ng==,9599,simonw,2017-11-16T18:38:30Z,2017-11-16T18:38:30Z,OWNER,"To finish up, I committed the image I created in the above so I can run it again in the future: docker commit $(docker ps -lq) datasette-sqlite Now I can run it like this: docker run -it -p 8018:8018 datasette-sqlite datasette /tmp/test23.sqlite -p 8018 -h 0.0.0.0 --load-extension /usr/local/lib/mod_spatialite.so ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274578142,Add --load-extension option to datasette for loading extra SQLite extensions, https://github.com/simonw/datasette/issues/111#issuecomment-345013127,https://api.github.com/repos/simonw/datasette/issues/111,345013127,MDEyOklzc3VlQ29tbWVudDM0NTAxMzEyNw==,9599,simonw,2017-11-16T18:23:56Z,2017-11-16T18:23:56Z,OWNER,Having this as a global option may not make sense when publishing multiple databases. We can revisit that when we implement per-database and per-table metadata.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274615452,Add “updated” to metadata, https://github.com/simonw/datasette/issues/111#issuecomment-502134167,https://api.github.com/repos/simonw/datasette/issues/111,502134167,MDEyOklzc3VlQ29tbWVudDUwMjEzNDE2Nw==,9599,simonw,2019-06-14T14:37:35Z,2019-06-14T14:37:35Z,OWNER,We have per-database and per-table metadata now. I think it's time to make this actually happen.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274615452,Add “updated” to metadata, https://github.com/simonw/datasette/issues/111#issuecomment-502134699,https://api.github.com/repos/simonw/datasette/issues/111,502134699,MDEyOklzc3VlQ29tbWVudDUwMjEzNDY5OQ==,9599,simonw,2019-06-14T14:38:58Z,2019-06-14T14:38:58Z,OWNER,"I think I'll just call it ""updated"" to avoid the ugly underscore.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274615452,Add “updated” to metadata, https://github.com/simonw/datasette/issues/111#issuecomment-502135123,https://api.github.com/repos/simonw/datasette/issues/111,502135123,MDEyOklzc3VlQ29tbWVudDUwMjEzNTEyMw==,9599,simonw,2019-06-14T14:39:59Z,2019-06-14T14:39:59Z,OWNER,This may be the feature that causes me to add dateutilas a dependency (so I can use dateutil.parser.parse),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274615452,Add “updated” to metadata, https://github.com/simonw/datasette/issues/111#issuecomment-697545290,https://api.github.com/repos/simonw/datasette/issues/111,697545290,MDEyOklzc3VlQ29tbWVudDY5NzU0NTI5MA==,9599,simonw,2020-09-23T15:29:11Z,2020-09-23T15:29:11Z,OWNER,This is still a good idea.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274615452,Add “updated” to metadata, https://github.com/simonw/datasette/issues/111#issuecomment-738904347,https://api.github.com/repos/simonw/datasette/issues/111,738904347,MDEyOklzc3VlQ29tbWVudDczODkwNDM0Nw==,9599,simonw,2020-12-04T17:16:56Z,2020-12-04T17:16:56Z,OWNER,This is STILL a good idea.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274615452,Add “updated” to metadata, https://github.com/simonw/datasette/issues/111#issuecomment-923106887,https://api.github.com/repos/simonw/datasette/issues/111,923106887,IC_kwDOBm6k_c43BX5H,9599,simonw,2021-09-20T16:58:39Z,2021-09-20T16:58:39Z,OWNER,Still a good idea today too! Would be great for https://cdc-vaccination-history.datasette.io/ for example.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274615452,Add “updated” to metadata, https://github.com/simonw/datasette/issues/111#issuecomment-924432643,https://api.github.com/repos/simonw/datasette/issues/111,924432643,IC_kwDOBm6k_c43GbkD,9599,simonw,2021-09-21T22:23:23Z,2021-09-21T22:23:23Z,OWNER,I'm going to use https://github.com/dateutil/dateutil for this - it's been maintained constantly (by an evolving team of contributors) [since 2003](https://github.com/dateutil/dateutil/commit/68ae2757ae15c84bf947d47a82a314b3b975bc9b) and is a very trustworthy dependency.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274615452,Add “updated” to metadata, https://github.com/simonw/datasette/issues/111#issuecomment-924435971,https://api.github.com/repos/simonw/datasette/issues/111,924435971,IC_kwDOBm6k_c43GcYD,9599,simonw,2021-09-21T22:29:15Z,2021-09-21T22:29:49Z,OWNER,"So this is a metadata key called `updated` which can be applied at the table, database or instance level. It is represented as a `.isoformat()` timestamp. Question: should I support just the date - `yyyy-mm-dd` - in addition to the datetime? I think so. I can easily imagine situations where the exact time of day that a change was made hasn't been recorded, but the overall date is known. But in that case, should the `updated` key sometimes be `yyyy-mm-dd` and sometimes be the full isoformat datetime? Or should there be an `updated_date` key that's used for just the date?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274615452,Add “updated” to metadata, https://github.com/simonw/datasette/issues/111#issuecomment-924437942,https://api.github.com/repos/simonw/datasette/issues/111,924437942,IC_kwDOBm6k_c43Gc22,9599,simonw,2021-09-21T22:32:59Z,2021-09-21T22:47:07Z,OWNER,"Side-note: Django 4.0 [will switch](https://docs.djangoproject.com/en/dev/releases/4.0/#zoneinfo-default-timezone-implementation) from using `pytz` to using the standard library `zoneinfo` module introduced in Python 3.9, which has a backport that works as far back as 3.6: https://github.com/pganssle/zoneinfo (https://pypi.org/project/backports.zoneinfo/) If I need to handle timezones I'll use that, but I think I can get away without it? Django does this: https://github.com/django/django/blob/b0ed619303d2fb723330ca9efa3acf23d49f1d19/setup.cfg#L39-L43 ``` install_requires = asgiref >= 3.3.2 backports.zoneinfo; python_version<""3.9"" sqlparse >= 0.2.2 tzdata; sys_platform == 'win32' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274615452,Add “updated” to metadata, https://github.com/simonw/datasette/issues/111#issuecomment-924438481,https://api.github.com/repos/simonw/datasette/issues/111,924438481,IC_kwDOBm6k_c43Gc_R,9599,simonw,2021-09-21T22:34:03Z,2021-09-21T22:34:21Z,OWNER,"The simplest possible version of this: it's always represented as a UTC ISO datetime, like this: ""updated"": ""2020-10-31T12:00:00+00:00"" Later versions of Datasette could extend this to handle other timezones or support just the date (though that's a backwards incompatible change so probably better to decide on the date thing right now).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274615452,Add “updated” to metadata, https://github.com/simonw/datasette/issues/111#issuecomment-924443089,https://api.github.com/repos/simonw/datasette/issues/111,924443089,IC_kwDOBm6k_c43GeHR,9599,simonw,2021-09-21T22:45:14Z,2021-09-21T22:45:26Z,OWNER,"The audiences I care about here are: - Producers of this timestamp - generally that will be users who are using `datasette publish` to share their data - Human consumers of this timestamp - end users who look at a Datasette site and want to know how recent the data is - Machine consumers of this timestamp - API integrations that might want to check if a Datasette instance has been updated before downloading new data For producers I think there are going to be two categories. The first is users who run ""publish"" and want the site to reflect when they did so (probably using `--updated=now` when they publish). The second are users who are willing to spend more time thinking about this - for example my various git scraping projects where I want to use a date derived from the git history. For humans... I'd like to be able to calculate a relative time (3 hours ago) in addition to showing the display time, because that helps avoid confusion over timezones. For machine consumers, it might be nice to offer the option of a calculated Unix timestamp-since-1970, since those can be easier to work with in some languages than running a full ISO date parser.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274615452,Add “updated” to metadata, https://github.com/simonw/datasette/issues/112#issuecomment-345255655,https://api.github.com/repos/simonw/datasette/issues/112,345255655,MDEyOklzc3VlQ29tbWVudDM0NTI1NTY1NQ==,9599,simonw,2017-11-17T14:19:23Z,2017-11-17T14:19:23Z,OWNER,"I tesed this by first building and running a container using the new Dockerfile from #114: docker build . docker run -it -p 8001:8001 6c9ca7e29181 /bin/sh Then I ran this inside the container itself: apt update && apt-get install wget -y \ && wget http://www.gaia-gis.it/spatialite-2.3.1/test-2.3.sqlite.gz \ && gunzip test-2.3.sqlite.gz \ && mv test-2.3.sqlite test23.sqlite \ && datasette -h 0.0.0.0 test23.sqlite I visited this URL to confirm I got an error due to spatialite not being loaded: http://localhost:8001/test23-c88bc35?sql=select+ST_AsText%28Geometry%29+from+HighWays+limit+1 Then I checked that loading it with `--load-extension` worked correctly: datasette -h 0.0.0.0 test23.sqlite \ --load-extension=/usr/lib/x86_64-linux-gnu/mod_spatialite.so Then, finally, I tested it with the new environment variable option: SQLITE_EXTENSIONS=/usr/lib/x86_64-linux-gnu/mod_spatialite.so \ datasette -h 0.0.0.0 test23.sqlite Running it with an invalid environment variable option shows an error: $ SQLITE_EXTENSIONS=/usr/lib/x86_64-linux-gnu/blah.so datasette \ -h 0.0.0.0 test23.sqlite Usage: datasette -h [OPTIONS] [FILES]... Error: Invalid value for ""--load-extension"": Path ""/usr/lib/x86_64-linux-gnu/blah.so"" does not exist. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274617240,Allow --load-extension to be set via environment variables, https://github.com/simonw/datasette/pull/114#issuecomment-345138134,https://api.github.com/repos/simonw/datasette/issues/114,345138134,MDEyOklzc3VlQ29tbWVudDM0NTEzODEzNA==,9599,simonw,2017-11-17T03:50:38Z,2017-11-17T03:50:38Z,OWNER,Fantastic! Thank you very much.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274733145,"Add spatialite, switch to debian and local build", https://github.com/simonw/datasette/pull/115#issuecomment-345256576,https://api.github.com/repos/simonw/datasette/issues/115,345256576,MDEyOklzc3VlQ29tbWVudDM0NTI1NjU3Ng==,9599,simonw,2017-11-17T14:22:51Z,2017-11-17T14:22:51Z,OWNER,"This is great - I've been frustrated by how CodeMirror prevents me from hitting tab-enter to activate the ""Run SQL"" button. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274877366,Add keyboard shortcut to execute SQL query, https://github.com/simonw/datasette/issues/116#issuecomment-392574208,https://api.github.com/repos/simonw/datasette/issues/116,392574208,MDEyOklzc3VlQ29tbWVudDM5MjU3NDIwOA==,9599,simonw,2018-05-28T17:23:41Z,2018-05-28T17:23:41Z,OWNER,"I'm handling this as separate documentation sections instead, e.g. http://datasette.readthedocs.io/en/latest/spatialite.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274884209,Add documentation section about SQLite extensions, https://github.com/simonw/datasette/pull/117#issuecomment-345404257,https://api.github.com/repos/simonw/datasette/issues/117,345404257,MDEyOklzc3VlQ29tbWVudDM0NTQwNDI1Nw==,9599,simonw,2017-11-18T00:53:58Z,2017-11-18T00:53:58Z,OWNER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274900388,Don't prevent tabbing to `Run SQL` button, https://github.com/simonw/datasette/issues/119#issuecomment-639047315,https://api.github.com/repos/simonw/datasette/issues/119,639047315,MDEyOklzc3VlQ29tbWVudDYzOTA0NzMxNQ==,9599,simonw,2020-06-04T18:46:39Z,2020-06-04T18:46:39Z,OWNER,"The OAuth dance needed for this is a pretty nasty barrier to plugin installation and configuration. I'm going to focus on making it easy to copy and paste data into sheets instead.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275082158,"Build an ""export this data to google sheets"" plugin", https://github.com/simonw/datasette/issues/120#issuecomment-355487646,https://api.github.com/repos/simonw/datasette/issues/120,355487646,MDEyOklzc3VlQ29tbWVudDM1NTQ4NzY0Ng==,723567,nickdirienzo,2018-01-05T07:10:12Z,2018-01-05T07:10:12Z,NONE,"Ah, glad I found this issue. I have private data that I'd like to share to a few different people. Personally, a shared username and password would be sufficient for me, more-or-less Basic Auth. Do you have more complex requirements in mind? I'm not sure if ""plugin"" means ""build a plugin"" or ""find a plugin"" or something else entirely. FWIW, I stumbled upon [sanic-auth](https://github.com/pyx/sanic-auth) which looks like a new project to bring some interfaces around auth to sanic, similar to Flask. Alternatively, it shouldn't be too bad to add in Basic Auth. If we went down that route, that would probably be best built as a separate package for sanic that `datasette` brings in. What are your thoughts around this?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275087397,Plugin that adds an authentication layer of some sort, https://github.com/simonw/datasette/issues/120#issuecomment-439421164,https://api.github.com/repos/simonw/datasette/issues/120,439421164,MDEyOklzc3VlQ29tbWVudDQzOTQyMTE2NA==,36796532,ad-si,2018-11-16T15:05:18Z,2018-11-16T15:05:18Z,NONE,This would be an awesome feature ❤️ ,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275087397,Plugin that adds an authentication layer of some sort, https://github.com/simonw/datasette/issues/120#issuecomment-496966227,https://api.github.com/repos/simonw/datasette/issues/120,496966227,MDEyOklzc3VlQ29tbWVudDQ5Njk2NjIyNw==,26342344,duarteocarmo,2019-05-29T14:40:52Z,2019-05-29T14:40:52Z,NONE,I would really like this. If you give me some pointers @simonw I'm willing to PR!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275087397,Plugin that adds an authentication layer of some sort, https://github.com/simonw/datasette/issues/120#issuecomment-599702870,https://api.github.com/repos/simonw/datasette/issues/120,599702870,MDEyOklzc3VlQ29tbWVudDU5OTcwMjg3MA==,9599,simonw,2020-03-16T18:48:05Z,2020-03-16T18:48:05Z,OWNER,"I've built two of these so far: https://github.com/simonw/datasette-auth-github and https://github.com/simonw/datasette-auth-existing-cookies Closing this ticket in favour of #699","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275087397,Plugin that adds an authentication layer of some sort, https://github.com/simonw/datasette/issues/121#issuecomment-345452215,https://api.github.com/repos/simonw/datasette/issues/121,345452215,MDEyOklzc3VlQ29tbWVudDM0NTQ1MjIxNQ==,9599,simonw,2017-11-18T16:11:23Z,2017-11-18T16:11:23Z,OWNER,"If a column value is invalid JSON, let's return the invalid JSON as a regular string.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275089535,?_json=foo&_json=bar query string argument , https://github.com/simonw/datasette/issues/121#issuecomment-350527283,https://api.github.com/repos/simonw/datasette/issues/121,350527283,MDEyOklzc3VlQ29tbWVudDM1MDUyNzI4Mw==,9599,simonw,2017-12-10T06:00:47Z,2017-12-10T06:00:47Z,OWNER,This is also really interesting when combined with the spatialite AsGeoJSON function: http://www.gaia-gis.it/gaia-sins/spatialite-sql-4.2.0.html#p3misc,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275089535,?_json=foo&_json=bar query string argument , https://github.com/simonw/datasette/issues/121#issuecomment-392575448,https://api.github.com/repos/simonw/datasette/issues/121,392575448,MDEyOklzc3VlQ29tbWVudDM5MjU3NTQ0OA==,9599,simonw,2018-05-28T17:33:07Z,2018-05-28T17:33:07Z,OWNER,"This shouldn't be a comma-separated list, it should be an argument you can pass multiple times to better match #255 and #292 Maybe `?_json=foo&_json=bar` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275089535,?_json=foo&_json=bar query string argument , https://github.com/simonw/datasette/issues/121#issuecomment-392580902,https://api.github.com/repos/simonw/datasette/issues/121,392580902,MDEyOklzc3VlQ29tbWVudDM5MjU4MDkwMg==,9599,simonw,2018-05-28T18:11:51Z,2018-05-28T18:11:51Z,OWNER,"Implemented in 76d11eb768e2f05f593c4d37a25280c0fcdf8fd6 Documented here: http://datasette.readthedocs.io/en/latest/json_api.html#special-json-arguments","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275089535,?_json=foo&_json=bar query string argument , https://github.com/simonw/datasette/issues/122#issuecomment-345552358,https://api.github.com/repos/simonw/datasette/issues/122,345552358,MDEyOklzc3VlQ29tbWVudDM0NTU1MjM1OA==,9599,simonw,2017-11-19T21:45:38Z,2017-12-05T19:09:52Z,OWNER,"For the overall shape of the rows: `?_shape=lists` (default), `?_shape=objects`, `?_shape=object` (primary key as object keys) For getting back extra keys: `?_extras=schema,query,timing` For expanding columns: `?_expand_all=1` Or `?_expand=qSpecies&_expand=qCaretaker` The template view will only be allowed to work with data it can request using extra options. That leaves one sighted nasty edge-case: the default view will expand all columns, but the `.json` view of it won't? I think that's OK. The default view won't include the extras used by the template to render the page either.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275092453,"Redesign JSON output, ditch jsono, offer variants controlled by parameter instead", https://github.com/simonw/datasette/issues/122#issuecomment-345552440,https://api.github.com/repos/simonw/datasette/issues/122,345552440,MDEyOklzc3VlQ29tbWVudDM0NTU1MjQ0MA==,9599,simonw,2017-11-19T21:46:43Z,2017-11-19T21:46:43Z,OWNER,"This calls for refactoring the code so the table view, the row view and the custom SQL view share as much logic as possible.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275092453,"Redesign JSON output, ditch jsono, offer variants controlled by parameter instead", https://github.com/simonw/datasette/issues/122#issuecomment-345552500,https://api.github.com/repos/simonw/datasette/issues/122,345552500,MDEyOklzc3VlQ29tbWVudDM0NTU1MjUwMA==,9599,simonw,2017-11-19T21:47:27Z,2017-11-19T21:47:27Z,OWNER,"To start with, I could just ditch the .jsono in favour of the new _shape argument.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275092453,"Redesign JSON output, ditch jsono, offer variants controlled by parameter instead", https://github.com/simonw/datasette/issues/122#issuecomment-349408214,https://api.github.com/repos/simonw/datasette/issues/122,349408214,MDEyOklzc3VlQ29tbWVudDM0OTQwODIxNA==,9599,simonw,2017-12-05T19:08:04Z,2017-12-05T19:08:04Z,OWNER,I think `.json` should continue to return rows as list-of-lists - it's a nice default because it produces a smaller overall JSON file. Encouraging people to specify an alternative shape to get the current `.jsono` format feels appropriate.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275092453,"Redesign JSON output, ditch jsono, offer variants controlled by parameter instead", https://github.com/simonw/datasette/issues/122#issuecomment-378279612,https://api.github.com/repos/simonw/datasette/issues/122,378279612,MDEyOklzc3VlQ29tbWVudDM3ODI3OTYxMg==,9599,simonw,2018-04-03T14:55:54Z,2018-04-03T14:55:54Z,OWNER,The new documentation for the `_shape=` parameter is now live at http://datasette.readthedocs.io/en/latest/json_api.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275092453,"Redesign JSON output, ditch jsono, offer variants controlled by parameter instead", https://github.com/simonw/datasette/issues/123#issuecomment-350521853,https://api.github.com/repos/simonw/datasette/issues/123,350521853,MDEyOklzc3VlQ29tbWVudDM1MDUyMTg1Mw==,9599,simonw,2017-12-10T03:09:53Z,2017-12-10T03:09:53Z,OWNER,I'm going to keep this separate in csvs-to-sqlite.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats, https://github.com/simonw/datasette/issues/123#issuecomment-473313975,https://api.github.com/repos/simonw/datasette/issues/123,473313975,MDEyOklzc3VlQ29tbWVudDQ3MzMxMzk3NQ==,9599,simonw,2019-03-15T14:45:46Z,2019-03-15T14:45:46Z,OWNER,"I'm reopening this one as part of #417. Further experience with Python's CSV standard library module has convinced me that pandas is not a required dependency for this. My [sqlite-utils](https://github.com/simonw/sqlite-utils) package can do most of the work here with very few dependencies.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats, https://github.com/simonw/datasette/issues/123#issuecomment-473323329,https://api.github.com/repos/simonw/datasette/issues/123,473323329,MDEyOklzc3VlQ29tbWVudDQ3MzMyMzMyOQ==,9599,simonw,2019-03-15T15:09:15Z,2019-05-14T15:53:05Z,OWNER,"How would Datasette accepting URLs work? I want to support not just SQLite files and CSVs but other extensible formats (geojson, Atom, shapefiles etc) as well. So `datasette serve` needs to be able to take filepaths or URLs to a variety of different content types. If it's a URL, we can use the first 200 downloaded bytes to decide which type of file it is. This is likely more reliable than hoping the web server provided the correct content-type. Also: let's have a threshold for downloading to disk. We will start downloading to a temp file (location controlled by an environment variable) if either the content length header is above that threshold OR we hit that much data cached in memory already and don't know how much more is still to come. There needs to be a command line option for saying ""grab from this URL but force treat it as CSV"" - same thing for files on disk. datasette mydb.db --type=db http://blah/blah --type=csv If you provide less `--type` options thatn you did URLs then the default behavior is used for all of the subsequent URLs. Auto detection could be tricky. Probably do this with a plugin hook. https://github.com/h2non/filetype.py is interesting but deals with images video etc so not right for this purpose. I think we need our own simple content sniffing code via a plugin hook. What if two plugin type hooks can both potentially handle a sniffed file? The CLI can quit and return an error saying content is ambiguous and you need to specify a `--type`, picking from the following list. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats, https://github.com/simonw/datasette/issues/123#issuecomment-698110186,https://api.github.com/repos/simonw/datasette/issues/123,698110186,MDEyOklzc3VlQ29tbWVudDY5ODExMDE4Ng==,45416,obra,2020-09-24T04:49:51Z,2020-09-24T04:49:51Z,NONE,"As a half-measure, I'd get value out of being able to upload a CSV and have datasette run csv-to-sqlite on it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats, https://github.com/simonw/datasette/issues/123#issuecomment-698168648,https://api.github.com/repos/simonw/datasette/issues/123,698168648,MDEyOklzc3VlQ29tbWVudDY5ODE2ODY0OA==,9599,simonw,2020-09-24T07:28:38Z,2020-09-24T07:28:38Z,OWNER,@obra there's a plugin for that! https://github.com/simonw/datasette-upload-csvs,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats, https://github.com/simonw/datasette/issues/123#issuecomment-698174957,https://api.github.com/repos/simonw/datasette/issues/123,698174957,MDEyOklzc3VlQ29tbWVudDY5ODE3NDk1Nw==,45416,obra,2020-09-24T07:42:05Z,2020-09-24T07:42:05Z,NONE," Oh. Awesome. On Thu, Sep 24, 2020 at 12:28:53AM -0700, Simon Willison wrote: > @obra there's a plugin for that! https://github.com/simonw/ > datasette-upload-csvs > > — > You are receiving this because you were mentioned. > Reply to this email directly, view it on GitHub, or unsubscribe.* > -- ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats, https://github.com/simonw/datasette/issues/123#issuecomment-735440555,https://api.github.com/repos/simonw/datasette/issues/123,735440555,MDEyOklzc3VlQ29tbWVudDczNTQ0MDU1NQ==,11912854,jsancho-gpl,2020-11-29T19:12:30Z,2020-11-29T19:12:30Z,NONE,"[datasette-connectors](https://github.com/pytables/datasette-connectors) provides an API for making connectors for any file based database. For example, [datasette-pytables](https://github.com/pytables/datasette-pytables) is a connector for HDF5 files, so now is possible to use this type of files with Datasette. It'd be nice if Datasette coud provide that API directly, for other file formats and for urls too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats, https://github.com/simonw/datasette/issues/123#issuecomment-882096402,https://api.github.com/repos/simonw/datasette/issues/123,882096402,IC_kwDOBm6k_c40k7kS,921217,RayBB,2021-07-18T18:07:29Z,2021-07-18T18:07:29Z,NONE,"I also love the idea for this feature and wonder if it could work without having to download the whole database into memory at once if it's a rather large db. Obviously this could be slower but could have many use cases. My comment is partially inspired by this post about streaming sqlite dbs from github pages or such https://news.ycombinator.com/item?id=27016630 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats, https://github.com/simonw/datasette/issues/123#issuecomment-882138084,https://api.github.com/repos/simonw/datasette/issues/123,882138084,IC_kwDOBm6k_c40lFvk,9599,simonw,2021-07-19T00:04:31Z,2021-07-19T00:04:31Z,OWNER,"I've been thinking more about this one today too. An extension of this (touched on in #417, Datasette Library) would be to support pointing Datasette at a directory and having it automatically load any CSV files it finds anywhere in that folder or its descendants - either loading them fully, or providing a UI that allows users to select a file to open it in Datasette. For larger files I think the right thing to do is import them into an on-disk SQLite database, which is limited only by available disk space. For smaller files loading them into an in-memory database should work fine.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats, https://github.com/simonw/datasette/issues/124#issuecomment-346987395,https://api.github.com/repos/simonw/datasette/issues/124,346987395,MDEyOklzc3VlQ29tbWVudDM0Njk4NzM5NQ==,50138,janimo,2017-11-26T06:24:08Z,2017-11-26T06:24:08Z,NONE,"Are there performance gains when using immutable as opposed to read-only? From what I see other processes can still modify the DB when immutable, but there are no change notifications.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125805,Option to open readonly but not immutable, https://github.com/simonw/datasette/issues/124#issuecomment-347049888,https://api.github.com/repos/simonw/datasette/issues/124,347049888,MDEyOklzc3VlQ29tbWVudDM0NzA0OTg4OA==,9599,simonw,2017-11-27T00:01:08Z,2017-11-27T00:01:08Z,OWNER,"https://sqlite.org/c3ref/open.html Is the only documentation I've been able to find of the immutable option: > **immutable**: The immutable parameter is a boolean query parameter that indicates that the database file is stored on read-only media. When immutable is set, SQLite assumes that the database file cannot be changed, even by a process with higher privilege, and so the database is opened read-only and all locking and change detection is disabled. Caution: Setting the immutable property on a database file that does in fact change can result in incorrect query results and/or SQLITE_CORRUPT errors. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125805,Option to open readonly but not immutable, https://github.com/simonw/datasette/issues/124#issuecomment-347123991,https://api.github.com/repos/simonw/datasette/issues/124,347123991,MDEyOklzc3VlQ29tbWVudDM0NzEyMzk5MQ==,50138,janimo,2017-11-27T09:25:15Z,2017-11-27T09:25:15Z,NONE,"That's the only reference to immutable I saw as well, making me think that there may be no perceivable advantages over simply using mode=ro. Since the database is never or seldom updated the change notifications should not impact performance.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125805,Option to open readonly but not immutable, https://github.com/simonw/datasette/issues/124#issuecomment-347236102,https://api.github.com/repos/simonw/datasette/issues/124,347236102,MDEyOklzc3VlQ29tbWVudDM0NzIzNjEwMg==,9599,simonw,2017-11-27T16:24:15Z,2017-11-27T16:24:15Z,OWNER,I'd really like to get some benchmarks working so I can see the actual impact of this kind of thing.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125805,Option to open readonly but not immutable, https://github.com/simonw/datasette/issues/124#issuecomment-504879834,https://api.github.com/repos/simonw/datasette/issues/124,504879834,MDEyOklzc3VlQ29tbWVudDUwNDg3OTgzNA==,9599,simonw,2019-06-24T06:43:46Z,2019-06-24T06:43:46Z,OWNER,https://simonwillison.net/2019/May/19/datasette-0-28/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125805,Option to open readonly but not immutable, https://github.com/simonw/datasette/issues/125#issuecomment-381361734,https://api.github.com/repos/simonw/datasette/issues/125,381361734,MDEyOklzc3VlQ29tbWVudDM4MTM2MTczNA==,45057,russss,2018-04-14T21:26:30Z,2018-04-14T21:26:30Z,CONTRIBUTOR,"FWIW I am now doing this on my WTR app (instead of silently limiting maps to 1000). [Telefonica](https://wtr-api.herokuapp.com/wtr-663ea99/licensee/18325) now has about 4000 markers and good old [BT](https://wtr-api.herokuapp.com/wtr-663ea99/licensee/8412) has 22,000 or so.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275135393,Plot rows on a map with Leaflet and Leaflet.markercluster, https://github.com/simonw/datasette/issues/125#issuecomment-384678319,https://api.github.com/repos/simonw/datasette/issues/125,384678319,MDEyOklzc3VlQ29tbWVudDM4NDY3ODMxOQ==,9599,simonw,2018-04-26T15:14:31Z,2018-04-26T15:14:31Z,OWNER,"I shipped this last week as the first plugin: https://simonwillison.net/2018/Apr/20/datasette-plugins/ Demo: https://datasette-cluster-map-demo.datasettes.com/polar-bears-455fe3a/USGS_WC_eartags_output_files_2009-2011-Status Plugin: https://github.com/simonw/datasette-cluster-map","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275135393,Plot rows on a map with Leaflet and Leaflet.markercluster, https://github.com/simonw/datasette/issues/126#issuecomment-348248957,https://api.github.com/repos/simonw/datasette/issues/126,348248957,MDEyOklzc3VlQ29tbWVudDM0ODI0ODk1Nw==,9599,simonw,2017-11-30T16:49:24Z,2017-11-30T16:49:24Z,OWNER,https://simonwillison.net/2017/Nov/25/new-in-datasette/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275135535,Blog entry announcing foreign key support, https://github.com/simonw/datasette/issues/127#issuecomment-345495046,https://api.github.com/repos/simonw/datasette/issues/127,345495046,MDEyOklzc3VlQ29tbWVudDM0NTQ5NTA0Ng==,9599,simonw,2017-11-19T06:17:42Z,2017-11-19T06:17:42Z,OWNER,Maybe I should support `&_count=1` to handle this - that would be easy to Ajax-in in conjenction with the other filters.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275135719,"Filtered tables should show count of all matching rows, if fast enough", https://github.com/simonw/datasette/issues/127#issuecomment-345538016,https://api.github.com/repos/simonw/datasette/issues/127,345538016,MDEyOklzc3VlQ29tbWVudDM0NTUzODAxNg==,9599,simonw,2017-11-19T18:22:45Z,2017-11-19T18:22:45Z,OWNER,I implemented a basic version of this in f59c840e7db8870afcdeba7a53bdea07bb674334 for custom SQL.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275135719,"Filtered tables should show count of all matching rows, if fast enough", https://github.com/simonw/datasette/issues/129#issuecomment-345793887,https://api.github.com/repos/simonw/datasette/issues/129,345793887,MDEyOklzc3VlQ29tbWVudDM0NTc5Mzg4Nw==,9599,simonw,2017-11-20T19:00:30Z,2017-11-20T19:00:30Z,OWNER,"Need to hide these from the index summary page as well: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275164558,Hide FTS-created tables by default on the database index page, https://github.com/simonw/datasette/issues/129#issuecomment-346463342,https://api.github.com/repos/simonw/datasette/issues/129,346463342,MDEyOklzc3VlQ29tbWVudDM0NjQ2MzM0Mg==,9599,simonw,2017-11-22T20:22:02Z,2017-11-22T20:22:02Z,OWNER,"On the index page: On the database index page: After clicking that link: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275164558,Hide FTS-created tables by default on the database index page, https://github.com/simonw/datasette/issues/131#issuecomment-345526171,https://api.github.com/repos/simonw/datasette/issues/131,345526171,MDEyOklzc3VlQ29tbWVudDM0NTUyNjE3MQ==,9599,simonw,2017-11-19T15:44:30Z,2017-11-19T15:44:30Z,OWNER,"Relevant SQLite docs: * https://sqlite.org/fts5.html * https://www.sqlite.org/fts3.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275166669,UI support for running FTS searches, https://github.com/simonw/datasette/issues/131#issuecomment-345526517,https://api.github.com/repos/simonw/datasette/issues/131,345526517,MDEyOklzc3VlQ29tbWVudDM0NTUyNjUxNw==,9599,simonw,2017-11-19T15:48:28Z,2017-11-19T15:48:28Z,OWNER,"Since SQLite supports column specifications in the MATCH body itself, there's no need to provide a separate mechanism for specifying columns in the query string: https://sqlite.org/fts5.html#fts5_column_filters","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275166669,UI support for running FTS searches, https://github.com/simonw/datasette/issues/131#issuecomment-345533274,https://api.github.com/repos/simonw/datasette/issues/131,345533274,MDEyOklzc3VlQ29tbWVudDM0NTUzMzI3NA==,9599,simonw,2017-11-19T17:17:37Z,2017-11-19T17:18:05Z,OWNER,"Demo: https://sf-trees.now.sh/sf-trees-ebc2ad9/Street_Tree_List?_search=grove+st ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275166669,UI support for running FTS searches, https://github.com/simonw/datasette/issues/132#issuecomment-346701751,https://api.github.com/repos/simonw/datasette/issues/132,346701751,MDEyOklzc3VlQ29tbWVudDM0NjcwMTc1MQ==,9599,simonw,2017-11-23T21:51:51Z,2017-11-23T21:51:51Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275175929,Row view is not currently expanding foreign keys, https://github.com/simonw/datasette/issues/133#issuecomment-345601870,https://api.github.com/repos/simonw/datasette/issues/133,345601870,MDEyOklzc3VlQ29tbWVudDM0NTYwMTg3MA==,9599,simonw,2017-11-20T06:18:53Z,2017-11-20T06:18:53Z,OWNER,This may be tackled by the filters work happening in #86,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275176006,"If view is filtered, search should apply within those filtered rows", https://github.com/simonw/datasette/issues/133#issuecomment-346705879,https://api.github.com/repos/simonw/datasette/issues/133,346705879,MDEyOklzc3VlQ29tbWVudDM0NjcwNTg3OQ==,9599,simonw,2017-11-23T22:43:42Z,2017-11-24T22:07:46Z,OWNER,"Easiest way to do this will be to move it into the same `
` as the filters. Would be nice to detect `?_search=` and redirect to URL without the `_search` parameter, just for aesthetics.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275176006,"If view is filtered, search should apply within those filtered rows", https://github.com/simonw/datasette/issues/133#issuecomment-346902583,https://api.github.com/repos/simonw/datasette/issues/133,346902583,MDEyOklzc3VlQ29tbWVudDM0NjkwMjU4Mw==,9599,simonw,2017-11-24T22:30:32Z,2017-11-24T22:30:32Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275176006,"If view is filtered, search should apply within those filtered rows", https://github.com/simonw/datasette/issues/134#issuecomment-345537268,https://api.github.com/repos/simonw/datasette/issues/134,345537268,MDEyOklzc3VlQ29tbWVudDM0NTUzNzI2OA==,9599,simonw,2017-11-19T18:10:48Z,2017-11-19T18:10:48Z,OWNER,Dupe of #127 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275176094,Filtered table view should show a count, https://github.com/simonw/datasette/issues/135#issuecomment-349047335,https://api.github.com/repos/simonw/datasette/issues/135,349047335,MDEyOklzc3VlQ29tbWVudDM0OTA0NzMzNQ==,9599,simonw,2017-12-04T17:57:08Z,2017-12-04T17:57:08Z,OWNER,Turns out there's a bug in this: https://timezones-now-hrjgkinozh.now.sh/timezones-0d61a90/ElementaryGeometries should not be showing the search box.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275179724,?_search=x should work if used directly against a FTS virtual table, https://github.com/simonw/datasette/issues/135#issuecomment-349860851,https://api.github.com/repos/simonw/datasette/issues/135,349860851,MDEyOklzc3VlQ29tbWVudDM0OTg2MDg1MQ==,9599,simonw,2017-12-07T04:37:59Z,2017-12-07T04:37:59Z,OWNER,"I'm testing this like so: datasette ~/Dropbox/Development/timezones-api/timezones.db --reload --load-extension /usr/local/lib/mod_spatialite.dylib ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275179724,?_search=x should work if used directly against a FTS virtual table, https://github.com/simonw/datasette/issues/135#issuecomment-349861461,https://api.github.com/repos/simonw/datasette/issues/135,349861461,MDEyOklzc3VlQ29tbWVudDM0OTg2MTQ2MQ==,9599,simonw,2017-12-07T04:43:12Z,2017-12-07T04:43:12Z,OWNER,"This query looks like it does the right thing: select * from sqlite_master where rootpage = 0 and ( sql like '%VIRTUAL TABLE%USING FTS%content=""ElementaryGeometries""%' or ( tbl_name = ""ElementaryGeometries"" and sql like '%VIRTUAL TABLE%USING FTS%' ) ) Against a table that should not be shown as FTS: https://timezones-now-hrjgkinozh.now.sh/timezones-0d61a90?sql=++++++++select+*+from+sqlite_master%0D%0A++++++++++++where+rootpage+%3D+0%0D%0A++++++++++++and+%28%0D%0A++++++++++++++++sql+like+%27%25VIRTUAL+TABLE%25USING+FTS%25content%3D%22ElementaryGeometries%22%25%27%0D%0A++++++++++++++++or+%28%0D%0A++++++++++++++++++tbl_name+%3D+%22ElementaryGeometries%22%0D%0A++++++++++++++++++and+sql+like+%27%25VIRTUAL+TABLE%25USING+FTS%25%27%0D%0A++++++++++++++++%29%0D%0A++++++++++++%29+ Against a table that SHOULD match: https://sf-trees.now.sh/sf-trees-ebc2ad9?sql=++++++++select+*+from+sqlite_master%0D%0A++++++++++++where+rootpage+%3D+0%0D%0A++++++++++++and+%28%0D%0A++++++++++++++++sql+like+%27%25VIRTUAL+TABLE%25USING+FTS%25content%3D%22Street_Tree_List_fts%22%25%27%0D%0A++++++++++++++++or+%28%0D%0A++++++++++++++++++tbl_name+%3D+%22Street_Tree_List_fts%22%0D%0A++++++++++++++++++and+sql+like+%27%25VIRTUAL+TABLE%25USING+FTS%25%27%0D%0A++++++++++++++++%29%0D%0A++++++++++++%29+","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275179724,?_search=x should work if used directly against a FTS virtual table, https://github.com/simonw/datasette/issues/137#issuecomment-345750135,https://api.github.com/repos/simonw/datasette/issues/137,345750135,MDEyOklzc3VlQ29tbWVudDM0NTc1MDEzNQ==,9599,simonw,2017-11-20T16:30:56Z,2018-07-10T17:53:13Z,OWNER,"One possible route: introduce prefixes eg `?a.Trees.age__gt=5&a.Trees._group_count=qSpecies&b.Trees.age__gt=10&b.Trees._group_count=qSpecies` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275415799,Ability to combine multiple SQL queries on a single graph, https://github.com/simonw/datasette/issues/138#issuecomment-350521806,https://api.github.com/repos/simonw/datasette/issues/138,350521806,MDEyOklzc3VlQ29tbWVudDM1MDUyMTgwNg==,9599,simonw,2017-12-10T03:08:26Z,2017-12-10T03:08:36Z,OWNER,Implemented this in 80bf3afa43e3cb396c7a7c9b168eedbc6fe0fa15 and #165. Didn't use data package though.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275476839,"Per-database and per-table metadata, probably using data-package", https://github.com/simonw/datasette/issues/139#issuecomment-381455054,https://api.github.com/repos/simonw/datasette/issues/139,381455054,MDEyOklzc3VlQ29tbWVudDM4MTQ1NTA1NA==,9599,simonw,2018-04-16T01:24:13Z,2018-04-16T01:24:13Z,OWNER,"I think Vega-Lite is the way to go here: https://vega.github.io/vega-lite/ I've been playing around with it and Datasette with some really positive initial results: https://vega.github.io/editor/#/gist/vega-lite/simonw/89100ce80573d062d70f780d10e5e609/decada131575825875c0a076e418c661c2adb014/vice-shootings-gender-race-by-department.vl.json https://vega.github.io/editor/#/gist/vega-lite/simonw/5f69fbe29380b0d5d95f31a385f49ee4/7087b64df03cf9dba44a5258a606f29182cb8619/trees-san-francisco.vl.json","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275493851,Build a visualization plugin for Vega, https://github.com/simonw/datasette/issues/139#issuecomment-403909389,https://api.github.com/repos/simonw/datasette/issues/139,403909389,MDEyOklzc3VlQ29tbWVudDQwMzkwOTM4OQ==,9599,simonw,2018-07-10T17:48:18Z,2018-07-10T17:48:18Z,OWNER,This is done! https://github.com/simonw/datasette-vega,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275493851,Build a visualization plugin for Vega, https://github.com/simonw/datasette/issues/140#issuecomment-403910318,https://api.github.com/repos/simonw/datasette/issues/140,403910318,MDEyOklzc3VlQ29tbWVudDQwMzkxMDMxOA==,9599,simonw,2018-07-10T17:51:11Z,2018-07-10T17:51:11Z,OWNER,This would be a nice example plugin to demonstrate plugin configuration options in #231,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275755475,Heatmap visualization plugin, https://github.com/simonw/datasette/issues/140#issuecomment-403939399,https://api.github.com/repos/simonw/datasette/issues/140,403939399,MDEyOklzc3VlQ29tbWVudDQwMzkzOTM5OQ==,9599,simonw,2018-07-10T19:30:17Z,2018-07-10T19:30:41Z,OWNER,Building this using Svelte would also produce a neat example of a plugin that uses Svelte: https://svelte.technology/guide - and if I like it I might part datasette-vega to it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275755475,Heatmap visualization plugin, https://github.com/simonw/datasette/issues/141#issuecomment-346157542,https://api.github.com/repos/simonw/datasette/issues/141,346157542,MDEyOklzc3VlQ29tbWVudDM0NjE1NzU0Mg==,9599,simonw,2017-11-21T20:53:47Z,2017-11-21T20:53:47Z,OWNER,"I think a copy is the right thing to do here - it will be cleaned up when the temp directory is removed. The hard link thing was always intended to save space, but if we can't do a hard link I don't see any harm in a temporary file copy.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275814941,datasette publish can fail if /tmp is on a different device, https://github.com/simonw/datasette/issues/141#issuecomment-346974336,https://api.github.com/repos/simonw/datasette/issues/141,346974336,MDEyOklzc3VlQ29tbWVudDM0Njk3NDMzNg==,50138,janimo,2017-11-26T00:00:35Z,2017-11-26T00:00:35Z,NONE,FWIW I worked around this by setting TMPDIR to ~/tmp before running the command.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275814941,datasette publish can fail if /tmp is on a different device, https://github.com/simonw/datasette/issues/141#issuecomment-350292364,https://api.github.com/repos/simonw/datasette/issues/141,350292364,MDEyOklzc3VlQ29tbWVudDM1MDI5MjM2NA==,9599,simonw,2017-12-08T15:33:18Z,2017-12-08T15:33:18Z,OWNER,"I can emulate this on OS X using a disk image (Disk Utility -> File -> New Image -> Blank Image...) - once mounted, I get the following: >>> os.link('/tmp/hello', '/Volumes/Untitled/hello') Traceback (most recent call last): File """", line 1, in OSError: [Errno 18] Cross-device link: '/tmp/hello' -> '/Volumes/Untitled/hello' I can simulate that in a mock like this: >>> from unittest.mock import patch >>> @patch('os.link') ... def test_link(mock_link): ... mock_link.side_effect = OSError ... mock_link() ... ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275814941,datasette publish can fail if /tmp is on a different device, https://github.com/simonw/datasette/issues/141#issuecomment-350301248,https://api.github.com/repos/simonw/datasette/issues/141,350301248,MDEyOklzc3VlQ29tbWVudDM1MDMwMTI0OA==,9599,simonw,2017-12-08T16:07:04Z,2017-12-08T16:07:04Z,OWNER,"This fix should work, please have a go with latest master and let me know if you run into any problems.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275814941,datasette publish can fail if /tmp is on a different device, https://github.com/simonw/datasette/issues/141#issuecomment-620970547,https://api.github.com/repos/simonw/datasette/issues/141,620970547,MDEyOklzc3VlQ29tbWVudDYyMDk3MDU0Nw==,9599,simonw,2020-04-29T03:27:37Z,2020-04-29T03:27:54Z,OWNER,This bug raised its head again in #744,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275814941,datasette publish can fail if /tmp is on a different device, https://github.com/simonw/datasette/issues/142#issuecomment-346217739,https://api.github.com/repos/simonw/datasette/issues/142,346217739,MDEyOklzc3VlQ29tbWVudDM0NjIxNzczOQ==,9599,simonw,2017-11-22T01:45:30Z,2017-11-22T01:45:30Z,OWNER,Might be nice to have a --no-limits option that disables time and maximum row count limits.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275917760,Show extra instructions with the interrupted, https://github.com/simonw/datasette/issues/142#issuecomment-392602558,https://api.github.com/repos/simonw/datasette/issues/142,392602558,MDEyOklzc3VlQ29tbWVudDM5MjYwMjU1OA==,9599,simonw,2018-05-28T20:58:59Z,2018-05-28T20:58:59Z,OWNER,I'll have the error message display a link to the documentation.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275917760,Show extra instructions with the interrupted, https://github.com/simonw/datasette/issues/142#issuecomment-392605574,https://api.github.com/repos/simonw/datasette/issues/142,392605574,MDEyOklzc3VlQ29tbWVudDM5MjYwNTU3NA==,9599,simonw,2018-05-28T21:25:05Z,2018-05-28T21:25:05Z,OWNER,"![2018-05-28 at 2 24 pm](https://user-images.githubusercontent.com/9599/40629887-e991c61c-6282-11e8-9d66-6387f90e87ca.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275917760,Show extra instructions with the interrupted, https://github.com/simonw/datasette/issues/143#issuecomment-403909469,https://api.github.com/repos/simonw/datasette/issues/143,403909469,MDEyOklzc3VlQ29tbWVudDQwMzkwOTQ2OQ==,9599,simonw,2018-07-10T17:48:34Z,2018-07-10T17:48:34Z,OWNER,This is now a dupe of https://github.com/simonw/datasette-vega/issues/4,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275939188,"Mechanism for ""suggested visualizations""", https://github.com/simonw/datasette/issues/144#issuecomment-346405660,https://api.github.com/repos/simonw/datasette/issues/144,346405660,MDEyOklzc3VlQ29tbWVudDM0NjQwNTY2MA==,9599,simonw,2017-11-22T16:38:05Z,2017-11-22T16:38:05Z,OWNER,"I have a solution for FTS already, but I'm interested in apsw as a mechanism for allowing custom virtual tables to be written in Python (pysqlite only lets you write custom functions) Not having PyPI support is pretty tough though. I'm planning a plugin/extension system which would be ideal for things like an optional apsw mode, but that's a lot harder if apsw isn't in PyPI.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276091279,apsw as alternative sqlite3 binding (for full text search), https://github.com/simonw/datasette/issues/144#issuecomment-346427794,https://api.github.com/repos/simonw/datasette/issues/144,346427794,MDEyOklzc3VlQ29tbWVudDM0NjQyNzc5NA==,649467,mhalle,2017-11-22T17:55:45Z,2017-11-22T17:55:45Z,NONE,"Thanks. There is a way to use pip to grab apsw, which also let's you configure it (flags to build extensions, use an internal sqlite, etc). Don't know how that works as a dependency for another package, though. On November 22, 2017 11:38:06 AM EST, Simon Willison wrote: >I have a solution for FTS already, but I'm interested in apsw as a >mechanism for allowing custom virtual tables to be written in Python >(pysqlite only lets you write custom functions) > >Not having PyPI support is pretty tough though. I'm planning a >plugin/extension system which would be ideal for things like an >optional apsw mode, but that's a lot harder if apsw isn't in PyPI. > >-- >You are receiving this because you authored the thread. >Reply to this email directly or view it on GitHub: >https://github.com/simonw/datasette/issues/144#issuecomment-346405660 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276091279,apsw as alternative sqlite3 binding (for full text search), https://github.com/simonw/datasette/issues/144#issuecomment-392606044,https://api.github.com/repos/simonw/datasette/issues/144,392606044,MDEyOklzc3VlQ29tbWVudDM5MjYwNjA0NA==,9599,simonw,2018-05-28T21:29:42Z,2018-05-28T21:29:42Z,OWNER,"The other major limitation of APSW is its treatment of unicode: https://rogerbinns.github.io/apsw/types.html - it tells you that it is your responsibility to ensure that TEXT columns in your SQLite database are correctly encoded. Since Datasette is designed to work against ANY SQLite database that someone may have already created, I see that as a show-stopping limitation. Thanks to https://github.com/coleifer/sqlite-vtfunc I now have a working mechanism for virtual tables (I've even built a demo plugin with them - https://github.com/simonw/datasette-sql-scraper ) which was the main thing that interested me about APSW. I'm going to close this as WONTFIX - I think Python's built-in `sqlite3` is good enough, and is now so firmly embedded in the project that making it pluggable would be more trouble than it's worth.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276091279,apsw as alternative sqlite3 binding (for full text search), https://github.com/simonw/datasette/issues/146#issuecomment-346682905,https://api.github.com/repos/simonw/datasette/issues/146,346682905,MDEyOklzc3VlQ29tbWVudDM0NjY4MjkwNQ==,9599,simonw,2017-11-23T18:55:08Z,2017-11-23T18:55:08Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276455748,datasette publish gcloud, https://github.com/simonw/datasette/issues/146#issuecomment-504881030,https://api.github.com/repos/simonw/datasette/issues/146,504881030,MDEyOklzc3VlQ29tbWVudDUwNDg4MTAzMA==,9599,simonw,2019-06-24T06:48:20Z,2019-06-24T06:48:20Z,OWNER,"I'm going to call this ""done"" thanks to cloudrun: #400 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276455748,datasette publish gcloud, https://github.com/simonw/datasette/issues/147#issuecomment-346900554,https://api.github.com/repos/simonw/datasette/issues/147,346900554,MDEyOklzc3VlQ29tbWVudDM0NjkwMDU1NA==,9599,simonw,2017-11-24T22:02:22Z,2017-11-24T22:02:22Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276476670,Tidy up design of the header of the table page, https://github.com/simonw/datasette/issues/149#issuecomment-346903317,https://api.github.com/repos/simonw/datasette/issues/149,346903317,MDEyOklzc3VlQ29tbWVudDM0NjkwMzMxNw==,9599,simonw,2017-11-24T22:41:58Z,2017-11-24T22:41:58Z,OWNER,"Custom SQL results now look like this: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276704127,Update custom SQL results to match new table view header, https://github.com/simonw/datasette/issues/150#issuecomment-379559214,https://api.github.com/repos/simonw/datasette/issues/150,379559214,MDEyOklzc3VlQ29tbWVudDM3OTU1OTIxNA==,9599,simonw,2018-04-08T15:33:58Z,2018-04-08T15:33:58Z,OWNER,The single biggest challenge here is expanding foreign key references. This is the blocker that prevents `_group_count` from being useful at the moment.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276704327,_group_count= feature improvements, https://github.com/simonw/datasette/issues/150#issuecomment-379559319,https://api.github.com/repos/simonw/datasette/issues/150,379559319,MDEyOklzc3VlQ29tbWVudDM3OTU1OTMxOQ==,9599,simonw,2018-04-08T15:35:43Z,2018-04-08T15:35:43Z,OWNER,"From a code point of view, the current mechanism for `_group_count` makes the `TableView` even **more** complicated: https://github.com/simonw/datasette/blob/446d47fdb005b3776bc06ad8d1f44b01fc2e938b/datasette/app.py#L644-L653 Instead, I think if `_group_count` is detected we should generate the SQL and then defer to `self.custom_sql`, like we do for canned queries: https://github.com/simonw/datasette/blob/446d47fdb005b3776bc06ad8d1f44b01fc2e938b/datasette/app.py#L539-L541","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276704327,_group_count= feature improvements, https://github.com/simonw/datasette/issues/150#issuecomment-392568047,https://api.github.com/repos/simonw/datasette/issues/150,392568047,MDEyOklzc3VlQ29tbWVudDM5MjU2ODA0Nw==,9599,simonw,2018-05-28T16:41:28Z,2018-05-28T16:41:28Z,OWNER,Closing this as obsolete since we have facets now.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276704327,_group_count= feature improvements, https://github.com/simonw/datasette/issues/151#issuecomment-623044858,https://api.github.com/repos/simonw/datasette/issues/151,623044858,MDEyOklzc3VlQ29tbWVudDYyMzA0NDg1OA==,9599,simonw,2020-05-03T02:37:03Z,2020-05-03T02:37:03Z,OWNER,"I'm going to put this at `/-/patterns`, which will render a template called `patterns.html`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276718605,Set up a pattern portfolio, https://github.com/simonw/datasette/issues/151#issuecomment-623047233,https://api.github.com/repos/simonw/datasette/issues/151,623047233,MDEyOklzc3VlQ29tbWVudDYyMzA0NzIzMw==,9599,simonw,2020-05-03T03:11:16Z,2020-05-03T03:11:16Z,OWNER,Now live at https://latest.datasette.io/-/patterns,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276718605,Set up a pattern portfolio, https://github.com/simonw/datasette/issues/153#issuecomment-347050235,https://api.github.com/repos/simonw/datasette/issues/153,347050235,MDEyOklzc3VlQ29tbWVudDM0NzA1MDIzNQ==,9599,simonw,2017-11-27T00:06:24Z,2017-11-27T00:06:24Z,OWNER,"I've been thinking about 1. a bit - I actually think it would be fine to have a rule that says ""if the contents of the cell starts with `http://` or `https://` and doesn't contain any whitespace, turn that into a link"". If you need the non-linked version that will always be available in the JSON. For the other two... I think #12 may be the way to go here: if you can easily over-ride the `row.html` and `table.html` templates for specific databases you can easily set pre-formatted text or similar for certain values - maybe even with CSS that targets a specific table column.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-347051331,https://api.github.com/repos/simonw/datasette/issues/153,347051331,MDEyOklzc3VlQ29tbWVudDM0NzA1MTMzMQ==,9599,simonw,2017-11-27T00:23:40Z,2017-11-27T03:58:49Z,OWNER,"One quick fix could be to add a `extra_css_url` key to the `metadata.json` format (which currently hosts `title`, `license_url` etc) - if populated, we can inject a link to that stylesheet on every page. We could add a few classes in strategic places that include the database and table names to give people styling hooks. While we're at it, an `extra_js_url` key would let people go really nuts!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-347735334,https://api.github.com/repos/simonw/datasette/issues/153,347735334,MDEyOklzc3VlQ29tbWVudDM0NzczNTMzNA==,9599,simonw,2017-11-29T02:45:03Z,2017-11-29T02:45:03Z,OWNER,"@ftrain OK I've shipped the first version of this. Here's the initial documentation: Create a `metadata.json` file that looks like this: { ""extra_css_urls"": [ ""https://simonwillison.net/static/css/all.bf8cd891642c.css"" ], ""extra_js_urls"": [ ""https://code.jquery.com/jquery-3.2.1.slim.min.js"" ] } Then start datasette like this: datasette mydb.db --metadata=metadata.json The CSS and JavaScript files will be linked in the `` of every page. You can also specify a SRI (subresource integrity hash) for these assets: { ""extra_css_urls"": [ { ""url"": ""https://simonwillison.net/static/css/all.bf8cd891642c.css"", ""sri"": ""sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI"" } ], ""extra_js_urls"": [ { ""url"": ""https://code.jquery.com/jquery-3.2.1.slim.min.js"", ""sri"": ""sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g="" } ] } Modern browsers will only execute the stylsheet or JavaScript if the SRI hash matches the content served. You can generate hashes using www.srihash.org This isn't shipped in a release yet, but you can still access these features in `datasette publish` like so: datasette publish now mydb.db --metadata=metadata.json --branch=master The `--branch=master` option will pull the latest master build of Datasette from GitHub.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-347735598,https://api.github.com/repos/simonw/datasette/issues/153,347735598,MDEyOklzc3VlQ29tbWVudDM0NzczNTU5OA==,9599,simonw,2017-11-29T02:46:31Z,2017-11-29T02:47:27Z,OWNER,"To style individual columns you'll currently need to use the `nth-of-type` selector, e.g.: td:nth-of-type(5):before { white-space: pre }","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-347735724,https://api.github.com/repos/simonw/datasette/issues/153,347735724,MDEyOklzc3VlQ29tbWVudDM0NzczNTcyNA==,9599,simonw,2017-11-29T02:47:14Z,2017-11-29T02:47:14Z,OWNER,(This only addresses point 2 in your issue description - points 1 and point 3 are still to come),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-347928926,https://api.github.com/repos/simonw/datasette/issues/153,347928926,MDEyOklzc3VlQ29tbWVudDM0NzkyODkyNg==,9599,simonw,2017-11-29T17:09:40Z,2017-11-29T17:09:40Z,OWNER,"OK, that's point 1 covered.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-348103270,https://api.github.com/repos/simonw/datasette/issues/153,348103270,MDEyOklzc3VlQ29tbWVudDM0ODEwMzI3MA==,9599,simonw,2017-11-30T07:16:40Z,2017-11-30T07:16:40Z,OWNER,"Every template now gets CSS classes in the body designed to support custom styling. The index template (the top level page at /) gets this: The database template (/dbname/) gets this: The table template (/dbname/tablename) gets: The row template (/dbname/tablename/rowid) gets: The db-x and table-x classes use the database or table names themselves IF they are valid CSS identifiers. If they aren't, we strip any invalid characters out and append a 6 character md5 digest of the original name, in order to ensure that multiple tables which resolve to the same stripped character version still have different CSS classes. Some examples (extracted from the unit tests): ""simple"" => ""simple"" ""MixedCase"" => ""MixedCase"" ""-no-leading-hyphens"" => ""no-leading-hyphens-65bea6"" ""_no-leading-underscores"" => ""no-leading-underscores-b921bc"" ""no spaces"" => ""no-spaces-7088d7"" ""-"" => ""336d5e"" ""no $ characters"" => ""no--characters-59e024"" ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-348245843,https://api.github.com/repos/simonw/datasette/issues/153,348245843,MDEyOklzc3VlQ29tbWVudDM0ODI0NTg0Mw==,9599,simonw,2017-11-30T16:40:02Z,2017-11-30T16:40:02Z,OWNER,"It is now possible to over-ride templates on a per-database / per-row or per- table basis. When you access e.g. `/mydatabase/mytable` Datasette will look for the following: - table-mydatabase-mytable.html - table.html If you provided a `--template-dir` argument to datasette serve it will look in that directory first. The lookup rules are as follows: Index page (/): index.html Database page (/mydatabase): database-mydatabase.html database.html Table page (/mydatabase/mytable): table-mydatabase-mytable.html table.html Row page (/mydatabase/mytable/id): row-mydatabase-mytable.html row.html If a table name has spaces or other unexpected characters in it, the template filename will follow the same rules as our custom `` CSS classes introduced in 8ab3a16 - for example, a table called ""Food Trucks"" will attempt to load the following templates: table-mydatabase-Food-Trucks-399138.html table.html It is possible to extend the default templates using Jinja template inheritance. If you want to customize EVERY row template with some additional content you can do so by creating a `row.html` template like this: {% extends ""default:row.html"" %} {% block content %}

EXTRA HTML AT THE TOP OF THE CONTENT BLOCK

This line renders the original block:

{{ super() }} {% endblock %} ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-348248406,https://api.github.com/repos/simonw/datasette/issues/153,348248406,MDEyOklzc3VlQ29tbWVudDM0ODI0ODQwNg==,9599,simonw,2017-11-30T16:47:45Z,2017-11-30T16:47:45Z,OWNER,Remaining work on this now lives in a milestone: https://github.com/simonw/datasette/milestone/6,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-348252037,https://api.github.com/repos/simonw/datasette/issues/153,348252037,MDEyOklzc3VlQ29tbWVudDM0ODI1MjAzNw==,20264,ftrain,2017-11-30T16:59:00Z,2017-11-30T16:59:00Z,NONE,"WOW! -- Paul Ford // (646) 369-7128 // @ftrain On Thu, Nov 30, 2017 at 11:47 AM, Simon Willison wrote: > Remaining work on this now lives in a milestone: > https://github.com/simonw/datasette/milestone/6 > > — > You are receiving this because you were mentioned. > Reply to this email directly, view it on GitHub > , > or mute the thread > > . > ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-348255925,https://api.github.com/repos/simonw/datasette/issues/153,348255925,MDEyOklzc3VlQ29tbWVudDM0ODI1NTkyNQ==,9599,simonw,2017-11-30T17:12:03Z,2017-11-30T17:12:03Z,OWNER,Documentation is now live for this: http://datasette.readthedocs.io/en/latest/custom_templates.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-349874052,https://api.github.com/repos/simonw/datasette/issues/153,349874052,MDEyOklzc3VlQ29tbWVudDM0OTg3NDA1Mg==,9599,simonw,2017-12-07T06:17:33Z,2017-12-07T06:17:33Z,OWNER,"In #159 I added a mechanism for easily customizing per-column displays, and I've added documentation showing an example of using this mechanism to set certain columns to display as unescaped HTML: http://datasette.readthedocs.io/en/latest/custom_templates.html#custom-templates This fixes item 3, so I'm closing this ticket!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-350519736,https://api.github.com/repos/simonw/datasette/issues/153,350519736,MDEyOklzc3VlQ29tbWVudDM1MDUxOTczNg==,9599,simonw,2017-12-10T02:06:01Z,2017-12-10T02:06:01Z,OWNER,@ftrain Datasette 0.14 is now released with all of the above: https://github.com/simonw/datasette/releases/tag/0.14,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-350519821,https://api.github.com/repos/simonw/datasette/issues/153,350519821,MDEyOklzc3VlQ29tbWVudDM1MDUxOTgyMQ==,9599,simonw,2017-12-10T02:08:45Z,2017-12-10T02:08:45Z,OWNER,"Also worth mentioning: as of #160 and #157 the `datasette publish now`, `datasette publish heroku` and `datasette package` commands all know how to bundle up any `--static` or `--template-dir` content and include it in the Docker image / Heroku/Now deployment that gets generated.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/154#issuecomment-348404988,https://api.github.com/repos/simonw/datasette/issues/154,348404988,MDEyOklzc3VlQ29tbWVudDM0ODQwNDk4OA==,9599,simonw,2017-12-01T05:27:40Z,2017-12-01T05:27:40Z,OWNER,If I do add additional static file bundling should that automatically get content hashes as well? #160 - problem with that is then I might have to parse the CSS files and rewrite their internal background-url references etc.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276873891,Datasette CSS should include content hash in the URL, https://github.com/simonw/datasette/issues/154#issuecomment-350302417,https://api.github.com/repos/simonw/datasette/issues/154,350302417,MDEyOklzc3VlQ29tbWVudDM1MDMwMjQxNw==,9599,simonw,2017-12-08T16:11:24Z,2017-12-08T16:11:24Z,OWNER,I think I'll do this as a custom Jinja template filter. That way template authors can re-use it for their own static files if they want.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276873891,Datasette CSS should include content hash in the URL, https://github.com/simonw/datasette/issues/154#issuecomment-350323722,https://api.github.com/repos/simonw/datasette/issues/154,350323722,MDEyOklzc3VlQ29tbWVudDM1MDMyMzcyMg==,9599,simonw,2017-12-08T17:35:25Z,2017-12-08T17:35:25Z,OWNER,If I do this as a querystring parameter I won't need to worry about URL routing.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276873891,Datasette CSS should include content hash in the URL, https://github.com/simonw/datasette/issues/155#issuecomment-347713453,https://api.github.com/repos/simonw/datasette/issues/155,347713453,MDEyOklzc3VlQ29tbWVudDM0NzcxMzQ1Mw==,9599,simonw,2017-11-29T00:41:30Z,2017-11-29T00:41:30Z,OWNER,Could you provide the SQL to create a reproducible test case (both CREATE TABLE and INSERT statements)?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",277589569,A primary key column that has foreign key restriction associated won't rendering label column, https://github.com/simonw/datasette/issues/155#issuecomment-347714314,https://api.github.com/repos/simonw/datasette/issues/155,347714314,MDEyOklzc3VlQ29tbWVudDM0NzcxNDMxNA==,388154,wsxiaoys,2017-11-29T00:46:25Z,2017-11-29T00:46:25Z,NONE,"``` CREATE TABLE rhs ( id INTEGER PRIMARY KEY, name TEXT ); CREATE TABLE lhs ( symbol INTEGER PRIMARY KEY, FOREIGN KEY (symbol) REFERENCES rhs(id) ); INSERT INTO rhs VALUES (1, ""foo""); INSERT INTO rhs VALUES (2, ""bar""); INSERT INTO lhs VALUES (1); INSERT INTO lhs VALUES (2); ``` It's expected that in lhs's view, foo / bar should be displayed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",277589569,A primary key column that has foreign key restriction associated won't rendering label column, https://github.com/simonw/datasette/issues/155#issuecomment-347714471,https://api.github.com/repos/simonw/datasette/issues/155,347714471,MDEyOklzc3VlQ29tbWVudDM0NzcxNDQ3MQ==,9599,simonw,2017-11-29T00:47:21Z,2017-11-29T00:47:21Z,OWNER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",277589569,A primary key column that has foreign key restriction associated won't rendering label column, https://github.com/simonw/datasette/issues/155#issuecomment-347715452,https://api.github.com/repos/simonw/datasette/issues/155,347715452,MDEyOklzc3VlQ29tbWVudDM0NzcxNTQ1Mg==,9599,simonw,2017-11-29T00:52:30Z,2017-11-29T00:52:30Z,OWNER,"Interestingly, it almost does the right thing on the individual row page: https://bug-155-dkcqckhgki.now.sh/bug-155-9a7bb68/lhs/1 The symbol has been expanded, but there's a rogue '1' that shouldn't be there at all - I think that's bug #152 The table view itself is definitely doing the wrong thing: https://bug-155-dkcqckhgki.now.sh/bug-155-9a7bb68/lhs ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",277589569,A primary key column that has foreign key restriction associated won't rendering label column, https://github.com/simonw/datasette/issues/156#issuecomment-348255782,https://api.github.com/repos/simonw/datasette/issues/156,348255782,MDEyOklzc3VlQ29tbWVudDM0ODI1NTc4Mg==,9599,simonw,2017-11-30T17:11:34Z,2017-11-30T17:11:34Z,OWNER,http://datasette.readthedocs.io/en/latest/custom_templates.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278189708,Document CSS hooks and custom templates, https://github.com/simonw/datasette/issues/157#issuecomment-350496277,https://api.github.com/repos/simonw/datasette/issues/157,350496277,MDEyOklzc3VlQ29tbWVudDM1MDQ5NjI3Nw==,9599,simonw,2017-12-09T18:29:41Z,2017-12-09T18:29:41Z,OWNER,"Example usage: datasette package --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --tag sf-trees --branch master This creates a local Docker image that includes copies of the templates/, extra-css/ and extra-js/ directories. You can then run it like this: docker run -p 8001:8001 sf-trees For publishing to Zeit now: datasette publish now --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --name sf-trees --branch master Example: https://sf-trees-wbihszoazc.now.sh/sf-trees-02c8ef1/Street_Tree_List For publishing to Heroku: datasette publish heroku --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --branch master ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278190321,"Teach ""datasette publish"" about custom template directories", https://github.com/simonw/datasette/issues/158#issuecomment-349868849,https://api.github.com/repos/simonw/datasette/issues/158,349868849,MDEyOklzc3VlQ29tbWVudDM0OTg2ODg0OQ==,9599,simonw,2017-12-07T05:41:08Z,2017-12-07T05:41:08Z,OWNER,"I'm happy with this - we have extra_head, content, body_class and title blocks which should provide enough hooks for most reasonable customizations.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278190981,Ensure default templates are designed to be extended, https://github.com/simonw/datasette/issues/160#issuecomment-348404864,https://api.github.com/repos/simonw/datasette/issues/160,348404864,MDEyOklzc3VlQ29tbWVudDM0ODQwNDg2NA==,9599,simonw,2017-12-01T05:26:57Z,2017-12-01T05:26:57Z,OWNER,"Question is... what should happen to the default static stuff? At the moment that's just https://fivethirtyeight.datasettes.com/-/static/app.css - though I want to improve that to include a content hash, see #154 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/160#issuecomment-348719680,https://api.github.com/repos/simonw/datasette/issues/160,348719680,MDEyOklzc3VlQ29tbWVudDM0ODcxOTY4MA==,9599,simonw,2017-12-02T20:59:27Z,2017-12-02T20:59:27Z,OWNER,"This is about more than just CSS and JavaScript - there are plenty of reasons someone might want to bundle HTML as well, e.g. for building something like https://sf-tree-search.now.sh/ So, instead of thinking about this in terms of /static/, I'm going to think about this in terms of allowing people to mount one or more document roots (or docroots). datasette serve mydb.db -d my-doc-root/ This will cause the root of the server to show content from the `my-doc-root/` directory (assuming it has an index.html file in it). A more common option will be to mount specific folders to specific directories, like this: datasette serve mydb.db -d static:my-static/ Now any hits to `/static/foo.css` will serve content from `my-static/foo.css`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/160#issuecomment-348719752,https://api.github.com/repos/simonw/datasette/issues/160,348719752,MDEyOklzc3VlQ29tbWVudDM0ODcxOTc1Mg==,9599,simonw,2017-12-02T21:00:21Z,2017-12-02T21:00:21Z,OWNER,Not sure which I like better out of `-d/--docroot` or `-s/--static` or `-m/--mount` for this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/160#issuecomment-348719827,https://api.github.com/repos/simonw/datasette/issues/160,348719827,MDEyOklzc3VlQ29tbWVudDM0ODcxOTgyNw==,9599,simonw,2017-12-02T21:01:36Z,2017-12-02T21:01:36Z,OWNER,`-m` is already taken for `--metadata`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/160#issuecomment-348793054,https://api.github.com/repos/simonw/datasette/issues/160,348793054,MDEyOklzc3VlQ29tbWVudDM0ODc5MzA1NA==,9599,simonw,2017-12-03T16:35:22Z,2017-12-03T16:35:22Z,OWNER,"You can now tell Datasette to serve static files from a specific location at a specific mountpoint. For example: datasette serve mydb.db --static extra-css:/tmp/static/css Now if you visit this URL: http://localhost:8001/extra-css/blah.css The following file will be served: /tmp/static/css/blah.css ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/160#issuecomment-348793156,https://api.github.com/repos/simonw/datasette/issues/160,348793156,MDEyOklzc3VlQ29tbWVudDM0ODc5MzE1Ng==,9599,simonw,2017-12-03T16:35:53Z,2017-12-03T16:35:53Z,OWNER,Still TODO: teach `datasette publish` and friends about this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/160#issuecomment-350496258,https://api.github.com/repos/simonw/datasette/issues/160,350496258,MDEyOklzc3VlQ29tbWVudDM1MDQ5NjI1OA==,9599,simonw,2017-12-09T18:29:28Z,2017-12-09T18:29:28Z,OWNER,"Example usage: datasette package --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --tag sf-trees --branch master This creates a local Docker image that includes copies of the templates/, extra-css/ and extra-js/ directories. You can then run it like this: docker run -p 8001:8001 sf-trees For publishing to Zeit now: datasette publish now --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --name sf-trees --branch master Example: https://sf-trees-wbihszoazc.now.sh/sf-trees-02c8ef1/Street_Tree_List For publishing to Heroku: datasette publish heroku --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --branch master ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/160#issuecomment-459915995,https://api.github.com/repos/simonw/datasette/issues/160,459915995,MDEyOklzc3VlQ29tbWVudDQ1OTkxNTk5NQ==,82988,psychemedia,2019-02-02T00:43:16Z,2019-02-02T00:58:20Z,CONTRIBUTOR,"Do you have any simple working examples of how to use `--static`? Inspection of default served files suggests locations such as `http://example.com/-/static/app.css?0e06ee`. If `datasette` is being proxied to `http://example.com/foo/datasette`, what form should arguments to `--static` take so that static files are correctly referenced? Use case is here: https://github.com/psychemedia/jupyterserverproxy-datasette-demo Trying to do a really simple `datasette` demo in MyBinder using jupyter-server-proxy.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/161#issuecomment-348860191,https://api.github.com/repos/simonw/datasette/issues/161,348860191,MDEyOklzc3VlQ29tbWVudDM0ODg2MDE5MQ==,9599,simonw,2017-12-04T04:52:14Z,2017-12-04T04:52:14Z,OWNER,Seems like a reasonable thing for us to support.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278814220,Support WITH query , https://github.com/simonw/datasette/issues/161#issuecomment-350108113,https://api.github.com/repos/simonw/datasette/issues/161,350108113,MDEyOklzc3VlQ29tbWVudDM1MDEwODExMw==,388154,wsxiaoys,2017-12-07T22:02:24Z,2017-12-07T22:02:24Z,NONE,"It's not throwing the validation error anymore, but i still cannot run following with query: ``` WITH RECURSIVE cnt(x) AS (SELECT 1 UNION ALL SELECT x+1 FROM cnt LIMIT 10) SELECT x FROM cnt; ``` I got `near ""WITH"": syntax error`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278814220,Support WITH query , https://github.com/simonw/datasette/issues/161#issuecomment-350158037,https://api.github.com/repos/simonw/datasette/issues/161,350158037,MDEyOklzc3VlQ29tbWVudDM1MDE1ODAzNw==,9599,simonw,2017-12-08T02:52:34Z,2017-12-08T02:52:34Z,OWNER,That might mean your version of SQLite doesn't support that syntax. Unfortunately the version bundled with Python is a bit old - the one built by the Dockerfile in this repo should handle it though.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278814220,Support WITH query , https://github.com/simonw/datasette/issues/161#issuecomment-350182904,https://api.github.com/repos/simonw/datasette/issues/161,350182904,MDEyOklzc3VlQ29tbWVudDM1MDE4MjkwNA==,388154,wsxiaoys,2017-12-08T06:18:12Z,2017-12-08T06:18:12Z,NONE,"You're right..got this resolved after upgrading the sqlite version. Thanks you!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278814220,Support WITH query , https://github.com/simonw/datasette/issues/163#issuecomment-804539729,https://api.github.com/repos/simonw/datasette/issues/163,804539729,MDEyOklzc3VlQ29tbWVudDgwNDUzOTcyOQ==,192568,mroswell,2021-03-23T02:41:14Z,2021-03-23T02:41:14Z,CONTRIBUTOR,"I'm visiting old issues for context while learning datasette. Let me know if okay to make the occasional comment like this one. querystring argument now located at: https://docs.datasette.io/en/latest/settings.html#sql-time-limit-ms","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",279547886,Document the querystring argument for setting a different time limit, https://github.com/simonw/datasette/issues/163#issuecomment-804540869,https://api.github.com/repos/simonw/datasette/issues/163,804540869,MDEyOklzc3VlQ29tbWVudDgwNDU0MDg2OQ==,9599,simonw,2021-03-23T02:44:33Z,2021-03-23T02:44:33Z,OWNER,Comments welcome!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",279547886,Document the querystring argument for setting a different time limit, https://github.com/simonw/datasette/issues/164#issuecomment-349874709,https://api.github.com/repos/simonw/datasette/issues/164,349874709,MDEyOklzc3VlQ29tbWVudDM0OTg3NDcwOQ==,9599,simonw,2017-12-07T06:22:10Z,2017-12-07T06:22:10Z,OWNER,"Example usage: datasette skeleton parlgov.db -m parlgov.json Generates a `parlgov.json` file containing this: { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null, ""databases"": { ""parlgov"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null, ""queries"": {}, ""tables"": { ""info_data_source"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_castles_mair"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_chess"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_huber_inglehart"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""info_table"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_euprofiler"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""party_family"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""info_id"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""sqlite_stat1"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_benoit_laver"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_country_iso"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""viewcalc_party_position"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""viewcalc_election_parameter"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""viewcalc_parliament_composition"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""viewcalc_country_year_share"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""election"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""politician_president"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""party_name_change"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_commissioner_doering"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_ray"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""party_change"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""cabinet_party"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_ees"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""party"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_cmp"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""country"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""cabinet"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""info_variable"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""election_result"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null } } } } } ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280013907,datasette skeleton command for kick-starting database and table metadata, https://github.com/simonw/datasette/issues/164#issuecomment-349874844,https://api.github.com/repos/simonw/datasette/issues/164,349874844,MDEyOklzc3VlQ29tbWVudDM0OTg3NDg0NA==,9599,simonw,2017-12-07T06:22:58Z,2017-12-07T06:22:58Z,OWNER,This metadata doesn't yet do anything - need to implement #165,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280013907,datasette skeleton command for kick-starting database and table metadata, https://github.com/simonw/datasette/issues/164#issuecomment-804541064,https://api.github.com/repos/simonw/datasette/issues/164,804541064,MDEyOklzc3VlQ29tbWVudDgwNDU0MTA2NA==,192568,mroswell,2021-03-23T02:45:12Z,2021-03-23T02:45:12Z,CONTRIBUTOR,"""datasette skeleton"" feature removed #476","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280013907,datasette skeleton command for kick-starting database and table metadata, https://github.com/simonw/datasette/issues/165#issuecomment-350026183,https://api.github.com/repos/simonw/datasette/issues/165,350026183,MDEyOklzc3VlQ29tbWVudDM1MDAyNjE4Mw==,9599,simonw,2017-12-07T16:47:46Z,2017-12-07T16:47:46Z,OWNER,"Here's an example metadata.json file illustrating custom per-database and per- table metadata: { ""title"": ""Overall datasette title"", ""description_html"": ""This is a description with HTML."", ""databases"": { ""db1"": { ""title"": ""First database"", ""description"": ""This is a string description & has no HTML"", ""license_url"": ""http://example.com/"", ""license"": ""The example license"", ""queries"": { ""canned_query"": ""select * from table1 limit 3;"" }, ""tables"": { ""table1"": { ""title"": ""Custom title for table1"", ""description"": ""Tables can have descriptions too"", ""source"": ""This has a custom source"", ""source_url"": ""http://example.com/"" } } } } }","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280014287,metadata.json support for per-database and per-table information, https://github.com/simonw/datasette/issues/165#issuecomment-350026452,https://api.github.com/repos/simonw/datasette/issues/165,350026452,MDEyOklzc3VlQ29tbWVudDM1MDAyNjQ1Mg==,9599,simonw,2017-12-07T16:48:34Z,2017-12-07T16:48:34Z,OWNER,"Needs documentation, see #166 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280014287,metadata.json support for per-database and per-table information, https://github.com/simonw/datasette/issues/166#issuecomment-350035741,https://api.github.com/repos/simonw/datasette/issues/166,350035741,MDEyOklzc3VlQ29tbWVudDM1MDAzNTc0MQ==,9599,simonw,2017-12-07T17:20:35Z,2017-12-07T17:20:35Z,OWNER,"http://datasette.readthedocs.io/en/latest/metadata.html ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280023225,Documentation for metadata.json and datasette skeleton, https://github.com/simonw/datasette/issues/167#issuecomment-350125953,https://api.github.com/repos/simonw/datasette/issues/167,350125953,MDEyOklzc3VlQ29tbWVudDM1MDEyNTk1Mw==,9599,simonw,2017-12-07T23:25:28Z,2017-12-07T23:25:28Z,OWNER,"My column/row HTML display logic has got way too convoluted. This is a sign I need to add proper unit tests for it and clean it up. The complexity comes from: * Displaying a rowid for tables that do not have a primary key * Showing an additional Link column for rows with a primary key * Not displaying that Link column on the individual row pages * Trying to get foreign keys working correctly in all cases, e.g. #152 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, https://github.com/simonw/datasette/issues/167#issuecomment-350421661,https://api.github.com/repos/simonw/datasette/issues/167,350421661,MDEyOklzc3VlQ29tbWVudDM1MDQyMTY2MQ==,9599,simonw,2017-12-09T03:52:46Z,2017-12-09T03:52:46Z,OWNER,"Input: results from the database, foreign key definitions, primary key definitions, type of page Output: display_columns and display_rows","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, https://github.com/simonw/datasette/issues/167#issuecomment-350424595,https://api.github.com/repos/simonw/datasette/issues/167,350424595,MDEyOklzc3VlQ29tbWVudDM1MDQyNDU5NQ==,9599,simonw,2017-12-09T05:08:27Z,2017-12-09T05:08:27Z,OWNER,Perhaps the row.html and table.html templates should be passed the same data but should themselves decide if they will display the Link column ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, https://github.com/simonw/datasette/issues/167#issuecomment-350515616,https://api.github.com/repos/simonw/datasette/issues/167,350515616,MDEyOklzc3VlQ29tbWVudDM1MDUxNTYxNg==,9599,simonw,2017-12-10T00:21:58Z,2017-12-10T00:21:58Z,OWNER,This function signature is pretty gross: https://github.com/simonw/datasette/blob/7a7e4b2ed8c76c6d002a9d707dbc840f6a2abf7f/datasette/app.py#L418,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, https://github.com/simonw/datasette/issues/167#issuecomment-350515985,https://api.github.com/repos/simonw/datasette/issues/167,350515985,MDEyOklzc3VlQ29tbWVudDM1MDUxNTk4NQ==,9599,simonw,2017-12-10T00:28:39Z,2017-12-10T00:28:39Z,OWNER,"A better alternative: ```async def display_columns_and_rows(self, database, table, rows, link_column=False):```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, https://github.com/simonw/datasette/issues/167#issuecomment-350516782,https://api.github.com/repos/simonw/datasette/issues/167,350516782,MDEyOklzc3VlQ29tbWVudDM1MDUxNjc4Mg==,9599,simonw,2017-12-10T00:48:54Z,2017-12-10T00:48:54Z,OWNER,I can simplify this all by dropping the nicety where if a table is using a rowid the Link column is titled rowid instead.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, https://github.com/simonw/datasette/pull/168#issuecomment-350413422,https://api.github.com/repos/simonw/datasette/issues/168,350413422,MDEyOklzc3VlQ29tbWVudDM1MDQxMzQyMg==,9599,simonw,2017-12-09T01:33:40Z,2017-12-09T01:33:40Z,OWNER,https://github.com/channelcat/sanic/releases/tag/0.7.0,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280662866,Upgrade to Sanic 0.7.0, https://github.com/simonw/datasette/issues/169#issuecomment-350519711,https://api.github.com/repos/simonw/datasette/issues/169,350519711,MDEyOklzc3VlQ29tbWVudDM1MDUxOTcxMQ==,9599,simonw,2017-12-10T02:04:56Z,2017-12-10T02:04:56Z,OWNER,Done! https://github.com/simonw/datasette/releases/tag/0.14,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280744309,Release v0.14 with templates and static files features, https://github.com/simonw/datasette/issues/170#issuecomment-350506593,https://api.github.com/repos/simonw/datasette/issues/170,350506593,MDEyOklzc3VlQ29tbWVudDM1MDUwNjU5Mw==,9599,simonw,2017-12-09T21:25:50Z,2017-12-09T21:25:50Z,OWNER,Turns out this is already supported: https://github.com/simonw/datasette/blob/6bdfcf60760c27e29ff34692d06e62b36aeecc56/datasette/app.py#L307,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280745470,Custom template for named canned query, https://github.com/simonw/datasette/issues/170#issuecomment-350506751,https://api.github.com/repos/simonw/datasette/issues/170,350506751,MDEyOklzc3VlQ29tbWVudDM1MDUwNjc1MQ==,9599,simonw,2017-12-09T21:28:32Z,2017-12-09T21:28:32Z,OWNER,"My mistake, that's using the database name - there isn't a way of customizing for a specific named query yet.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280745470,Custom template for named canned query, https://github.com/simonw/datasette/issues/170#issuecomment-350507155,https://api.github.com/repos/simonw/datasette/issues/170,350507155,MDEyOklzc3VlQ29tbWVudDM1MDUwNzE1NQ==,9599,simonw,2017-12-09T21:35:30Z,2017-12-09T21:35:30Z,OWNER," Canned query page (/mydatabase/canned-query): query-mydatabase-canned-query.html query-mydatabase.html query.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280745470,Custom template for named canned query, https://github.com/simonw/datasette/issues/171#issuecomment-350508049,https://api.github.com/repos/simonw/datasette/issues/171,350508049,MDEyOklzc3VlQ29tbWVudDM1MDUwODA0OQ==,9599,simonw,2017-12-09T21:50:50Z,2017-12-09T21:50:50Z,OWNER,"Quoting the new documentation: You can find out which templates were considered for a specific page by viewing source on that page and looking for an HTML comment at the bottom. The comment will look something like this: This example is from the canned query page for a query called ""tz"" in the database called ""mydb"". The asterisk shows which template was selected - so in this case, Datasette found a template file called `query-mydb-tz.html` and used that - but if that template had not been found, it would have tried for `query-mydb.html` or the default `query.html`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280745746,HTML comments specifying custom templates for page, https://github.com/simonw/datasette/issues/172#issuecomment-460902824,https://api.github.com/repos/simonw/datasette/issues/172,460902824,MDEyOklzc3VlQ29tbWVudDQ2MDkwMjgyNA==,9599,simonw,2019-02-06T05:09:05Z,2019-02-06T05:09:05Z,OWNER,"Demo: https://latest.datasette.io/fixtures-dd88475 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280896290,Show size of .db file next to download link, https://github.com/simonw/datasette/issues/173#issuecomment-823961091,https://api.github.com/repos/simonw/datasette/issues/173,823961091,MDEyOklzc3VlQ29tbWVudDgyMzk2MTA5MQ==,3747136,ColinMaudry,2021-04-21T10:37:05Z,2021-04-21T10:37:36Z,NONE,"I have the feeling that the text visible to users is 95% present in template files ([datasette/templates](https://github.com/simonw/datasette/tree/main/datasette/templates)). The python code mainly contains error messages. In the current situation, the best way to provide a localized frontend is to translate the templates and [configure datasette to use them](https://docs.datasette.io/en/stable/custom_templates.html). I think I'm going to do it for French. If we want localization to be better integrated, for the python code, I think [gettext](https://docs.python.org/3/library/gettext.html#localizing-your-application) is the way to go. The .po can be translated in user-friendly tools such as Transifex and Crowdin. For the templates, I'm not sure how we could do it cleanly and easy to maintain. Maybe the tools above could parse HTML and detect the strings to be translated. In any case, localization implementing l10n is just the first step: a continuous process must be setup to maintain the translations and produce new ones while datasette keeps getting new features.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",281110295,I18n and L10n support, https://github.com/simonw/datasette/issues/173#issuecomment-826784306,https://api.github.com/repos/simonw/datasette/issues/173,826784306,MDEyOklzc3VlQ29tbWVudDgyNjc4NDMwNg==,3747136,ColinMaudry,2021-04-26T12:10:01Z,2021-04-26T12:10:01Z,NONE,I found a neat tutorial to set up gettext with jinja2: http://siongui.github.io/2016/01/17/i18n-python-web-application-by-gettext-jinja2/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",281110295,I18n and L10n support, https://github.com/simonw/datasette/issues/174#issuecomment-412290986,https://api.github.com/repos/simonw/datasette/issues/174,412290986,MDEyOklzc3VlQ29tbWVudDQxMjI5MDk4Ng==,9599,simonw,2018-08-11T17:46:51Z,2018-08-11T17:46:51Z,OWNER,This was fixed in https://github.com/simonw/datasette/commit/89d9fbb91bfc0dd9091b34dbf3cf540ab849cc44,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",281197863,License/Source in footer should inherit from top level, https://github.com/simonw/datasette/issues/175#issuecomment-353424169,https://api.github.com/repos/simonw/datasette/issues/175,353424169,MDEyOklzc3VlQ29tbWVudDM1MzQyNDE2OQ==,9599,simonw,2017-12-21T18:33:55Z,2017-12-21T18:33:55Z,OWNER,Done - thanks for curating these: https://github.com/topics/automatic-api,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",282971961,"Add project topic ""automatic-api""", https://github.com/simonw/datasette/issues/176#issuecomment-356115657,https://api.github.com/repos/simonw/datasette/issues/176,356115657,MDEyOklzc3VlQ29tbWVudDM1NjExNTY1Nw==,4313116,wulfmann,2018-01-08T22:22:32Z,2018-01-08T22:22:32Z,NONE,"This project probably would not be the place for that. This is a layer for sqllite specifically. It solves a similar problem as graphql, so adding that here wouldn't make sense. Here's an example i found from google that uses micro to run a graphql microservice. you'd just then need to connect your db. https://github.com/timneutkens/micro-graphql","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/176#issuecomment-356161672,https://api.github.com/repos/simonw/datasette/issues/176,356161672,MDEyOklzc3VlQ29tbWVudDM1NjE2MTY3Mg==,173848,yozlet,2018-01-09T02:35:35Z,2018-01-09T02:35:35Z,NONE,"@wulfmann I think I disagree, except I'm not entirely sure what you mean by that first paragraph. The JSON API that Datasette currently exposes is quite different to GraphQL. Furthermore, there's no ""just"" about connecting micro-graphql to a DB; at least, no more ""just"" than adding any other API. You still need to configure the schema, which is exactly the kind of thing that Datasette does for JSON API. This is why I think that GraphQL's a good fit here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/176#issuecomment-356175667,https://api.github.com/repos/simonw/datasette/issues/176,356175667,MDEyOklzc3VlQ29tbWVudDM1NjE3NTY2Nw==,4313116,wulfmann,2018-01-09T04:19:03Z,2018-01-09T04:19:03Z,NONE,"@yozlet Yes I think that I was confused when I posted my original comment. I see your main point now and am in agreement. ","{""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 2, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/176#issuecomment-359697938,https://api.github.com/repos/simonw/datasette/issues/176,359697938,MDEyOklzc3VlQ29tbWVudDM1OTY5NzkzOA==,7193,gijs,2018-01-23T07:17:56Z,2018-01-23T07:17:56Z,NONE,👍 I'd like this too! ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/176#issuecomment-368625350,https://api.github.com/repos/simonw/datasette/issues/176,368625350,MDEyOklzc3VlQ29tbWVudDM2ODYyNTM1MA==,7431774,wuhland,2018-02-26T19:44:11Z,2018-02-26T19:44:11Z,NONE,great idea!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/176#issuecomment-431867885,https://api.github.com/repos/simonw/datasette/issues/176,431867885,MDEyOklzc3VlQ29tbWVudDQzMTg2Nzg4NQ==,634572,eads,2018-10-22T15:24:57Z,2018-10-22T15:24:57Z,NONE,"I'd like this as well. It would let me access Datasette-driven projects from GatsbyJS the same way I can access Postgres DBs via Hasura. While I don't see SQLite replacing Postgres for the 50m row datasets I sometimes have to work with, there's a whole class of smaller datasets that are great with Datasette but currently would find another option.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/176#issuecomment-548508237,https://api.github.com/repos/simonw/datasette/issues/176,548508237,MDEyOklzc3VlQ29tbWVudDU0ODUwODIzNw==,634572,eads,2019-10-31T18:25:44Z,2019-10-31T18:25:44Z,NONE,"👋 I'd be interested in building this out in Q1 or Q2 of 2020 if nobody has tackled it by then. I would love to integrate Datasette into @thechicagoreporter's practice, but we're also fully committed to GraphQL moving forward.","{""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 2, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/176#issuecomment-617208503,https://api.github.com/repos/simonw/datasette/issues/176,617208503,MDEyOklzc3VlQ29tbWVudDYxNzIwODUwMw==,12976,nkirsch,2020-04-21T14:16:24Z,2020-04-21T14:16:24Z,NONE,"@eads I'm interested in helping, if there's still a need...","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/pull/178#issuecomment-357542404,https://api.github.com/repos/simonw/datasette/issues/178,357542404,MDEyOklzc3VlQ29tbWVudDM1NzU0MjQwNA==,9599,simonw,2018-01-14T21:06:07Z,2018-01-14T21:06:07Z,OWNER,"Thanks for catching this, merged!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",287240246,"If metadata exists, add it to heroku launch command", https://github.com/simonw/datasette/issues/179#issuecomment-360535979,https://api.github.com/repos/simonw/datasette/issues/179,360535979,MDEyOklzc3VlQ29tbWVudDM2MDUzNTk3OQ==,82988,psychemedia,2018-01-25T17:18:24Z,2018-01-25T17:18:24Z,CONTRIBUTOR,"To summarise that thread: - expose full `metadata.json` object to the index page template, eg to allow tables to be referred to by name; - ability to import multiple `metadata.json` files, eg to allow metadata files created for a specific SQLite db to be reused in a datasette referring to several database files; It could also be useful to allow users to import a python file containing custom functions that can that be loaded into scope and made available to custom templates. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",288438570,More metadata options for template authors , https://github.com/simonw/datasette/issues/179#issuecomment-392606418,https://api.github.com/repos/simonw/datasette/issues/179,392606418,MDEyOklzc3VlQ29tbWVudDM5MjYwNjQxOA==,9599,simonw,2018-05-28T21:32:37Z,2018-05-28T21:32:37Z,OWNER,"> It could also be useful to allow users to import a python file containing custom functions that can that be loaded into scope and made available to custom templates. That's now covered by the plugins mechanism - you can create plugins that define custom template functions: http://datasette.readthedocs.io/en/stable/plugins.html#prepare-jinja2-environment-env","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",288438570,More metadata options for template authors , https://github.com/simonw/datasette/pull/181#issuecomment-378293484,https://api.github.com/repos/simonw/datasette/issues/181,378293484,MDEyOklzc3VlQ29tbWVudDM3ODI5MzQ4NA==,9599,simonw,2018-04-03T15:34:29Z,2018-04-03T15:34:29Z,OWNER,"Here's what this looks like: ![2018-04-03 at 8 32 am](https://user-images.githubusercontent.com/9599/38259345-9e1c75ea-3719-11e8-83c9-2160c6fa079c.png) I need to figure out the right way to handle licensing of bundled software like this - it's MIT licensed which is compatible with Datasette's Apache 2 license, but I feel like bundled licensed software (including codemirror) needs to be recognized in the README or docs somehow.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/pull/181#issuecomment-378293599,https://api.github.com/repos/simonw/datasette/issues/181,378293599,MDEyOklzc3VlQ29tbWVudDM3ODI5MzU5OQ==,9599,simonw,2018-04-03T15:34:50Z,2018-04-03T15:36:58Z,OWNER,"Let's only show the ""Format SQL"" button if the user has JavaScript enabled. We can do that in this code here: https://github.com/bsmithgall/datasette/blob/4a7151a58d6ab7c8404a91beef7083e8a5807cf8/datasette/templates/_codemirror_foot.html#L14-L21","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/pull/181#issuecomment-378295376,https://api.github.com/repos/simonw/datasette/issues/181,378295376,MDEyOklzc3VlQ29tbWVudDM3ODI5NTM3Ng==,9599,simonw,2018-04-03T15:39:57Z,2018-04-03T15:39:57Z,OWNER,"On the licensing front: it looks like the way Django handles this is to keep the licensing header in the files intact, e.g. https://github.com/django/django/blob/6deaddcca367d0143c815aaa42342021baa3b41e/django/contrib/admin/static/admin/js/vendor/jquery/jquery.js So for this change, adding a comment at the top of `sql-formatter.min.js` which references the MIT license would do the trick.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/pull/181#issuecomment-378297842,https://api.github.com/repos/simonw/datasette/issues/181,378297842,MDEyOklzc3VlQ29tbWVudDM3ODI5Nzg0Mg==,1957344,bsmithgall,2018-04-03T15:47:13Z,2018-04-03T15:47:13Z,NONE,I can work on that -- would you prefer to inline a `display: hidden` and then have the javascript flip the visibility or include it as css?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/pull/181#issuecomment-379636695,https://api.github.com/repos/simonw/datasette/issues/181,379636695,MDEyOklzc3VlQ29tbWVudDM3OTYzNjY5NQ==,9599,simonw,2018-04-09T05:30:16Z,2018-04-09T05:30:16Z,OWNER,"I'd prefer to have the JavaScript actually manipulate the DOM to add the button - something like this: var button = document.createElement('button'); button.value = 'Format SQL'; button.addEventListener( 'click', format, false ); document.getElementById('run-sql').parentNode.appendChild(button);","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/pull/181#issuecomment-379759875,https://api.github.com/repos/simonw/datasette/issues/181,379759875,MDEyOklzc3VlQ29tbWVudDM3OTc1OTg3NQ==,1957344,bsmithgall,2018-04-09T13:53:14Z,2018-04-09T13:53:14Z,NONE,I've implemented that approach in 86ac746. It does cause the button to pop in only after Codemirror is finished rendering which is a bit awkward.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/pull/181#issuecomment-552275451,https://api.github.com/repos/simonw/datasette/issues/181,552275451,MDEyOklzc3VlQ29tbWVudDU1MjI3NTQ1MQ==,9599,simonw,2019-11-11T03:08:25Z,2019-11-11T03:08:25Z,OWNER,Closing this because this feature was shipped in #592 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/issues/183#issuecomment-378281740,https://api.github.com/repos/simonw/datasette/issues/183,378281740,MDEyOklzc3VlQ29tbWVudDM3ODI4MTc0MA==,9599,simonw,2018-04-03T15:01:43Z,2018-04-03T15:01:43Z,OWNER,"I'm having trouble replicating this bug. In particular, I don't understand what you mean by ""these are then rendered in the datasette query box using single quotes"" - since canned queries aren't displayed in a textarea. Do you have an example database / metadata.json I can use to investigate this further?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",291639118,Custom Queries - escaping strings, https://github.com/simonw/datasette/issues/183#issuecomment-504880173,https://api.github.com/repos/simonw/datasette/issues/183,504880173,MDEyOklzc3VlQ29tbWVudDUwNDg4MDE3Mw==,9599,simonw,2019-06-24T06:45:07Z,2019-06-24T06:45:07Z,OWNER,Closing as couldn't replicate,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",291639118,Custom Queries - escaping strings, https://github.com/simonw/datasette/issues/184#issuecomment-379636068,https://api.github.com/repos/simonw/datasette/issues/184,379636068,MDEyOklzc3VlQ29tbWVudDM3OTYzNjA2OA==,9599,simonw,2018-04-09T05:26:21Z,2018-04-09T05:26:21Z,OWNER,Do you have steps to reproduce here - ideally a small example SQLite database that exhibits the error?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",292011379,500 from missing table name, https://github.com/simonw/datasette/issues/184#issuecomment-379788103,https://api.github.com/repos/simonw/datasette/issues/184,379788103,MDEyOklzc3VlQ29tbWVudDM3OTc4ODEwMw==,222245,carlmjohnson,2018-04-09T15:15:11Z,2018-04-09T15:15:11Z,NONE,Visit https://salaries.news.baltimoresun.com/salaries/bad-table.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",292011379,500 from missing table name, https://github.com/simonw/datasette/issues/184#issuecomment-380608340,https://api.github.com/repos/simonw/datasette/issues/184,380608340,MDEyOklzc3VlQ29tbWVudDM4MDYwODM0MA==,9599,simonw,2018-04-11T21:55:41Z,2018-04-11T21:55:41Z,OWNER,"Yuck, nasty - OK I get it, this happens with ANY non-existent table name. Let's fix that - these should clearly return an HTTP 404.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",292011379,500 from missing table name, https://github.com/simonw/datasette/issues/184#issuecomment-494459264,https://api.github.com/repos/simonw/datasette/issues/184,494459264,MDEyOklzc3VlQ29tbWVudDQ5NDQ1OTI2NA==,222245,carlmjohnson,2019-05-21T16:17:29Z,2019-05-21T16:17:29Z,NONE,"Reopening this because it still raises 500 for incorrect table capitalization. Example: - https://salaries.news.baltimoresun.com/salaries/2018+Maryland+state+salaries/1 200 OK - https://salaries.news.baltimoresun.com/salaries/bad-table/1 400 - https://salaries.news.baltimoresun.com/salaries/2018+maryland+state+salaries/1 500 Internal Error (note lowercase 'm') I think because the table name exists but is not in its canonical form, it triggers a dict lookup error.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",292011379,500 from missing table name, https://github.com/simonw/datasette/issues/185#issuecomment-370273359,https://api.github.com/repos/simonw/datasette/issues/185,370273359,MDEyOklzc3VlQ29tbWVudDM3MDI3MzM1OQ==,9599,simonw,2018-03-04T23:10:56Z,2018-03-04T23:10:56Z,OWNER,"Are you talking specifically about accessing metadata from HTML templates? That makes a lot of sense, I'll think about how this could work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-370461231,https://api.github.com/repos/simonw/datasette/issues/185,370461231,MDEyOklzc3VlQ29tbWVudDM3MDQ2MTIzMQ==,222245,carlmjohnson,2018-03-05T15:43:56Z,2018-03-05T15:44:27Z,NONE,"Yes. I think the simplest implementation is to change lines like ```python metadata = self.ds.metadata.get('databases', {}).get(name, {}) ``` to ```python metadata = { **self.ds.metadata, **self.ds.metadata.get('databases', {}).get(name, {}), } ``` so that specified inner values overwrite outer values, but only if they exist.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376585911,https://api.github.com/repos/simonw/datasette/issues/185,376585911,MDEyOklzc3VlQ29tbWVudDM3NjU4NTkxMQ==,9599,simonw,2018-03-27T16:19:43Z,2018-03-27T16:19:43Z,OWNER,"OK, I have an implementation of this. I realised that not ALL metadata should be inherited: it makes sense for source/source_url/license/license_url to be inherited, but it doesn't make sense for the title and description to be inherited down to the individual databases and tables.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376587017,https://api.github.com/repos/simonw/datasette/issues/185,376587017,MDEyOklzc3VlQ29tbWVudDM3NjU4NzAxNw==,9599,simonw,2018-03-27T16:22:59Z,2018-03-27T16:22:59Z,OWNER,One thing that's missing from this: if you set source/license data at the individual database level they should be inherited by tables within that database.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376589591,https://api.github.com/repos/simonw/datasette/issues/185,376589591,MDEyOklzc3VlQ29tbWVudDM3NjU4OTU5MQ==,9599,simonw,2018-03-27T16:30:51Z,2018-03-27T16:30:51Z,OWNER,"Also needed: the ability to unset metadata. If the root metadata specifies a license_url it should be possible to set ""license_url"": null on a child database or table. The current implementation will ignore null (or empty string) values and default to the top level value. I think the templates themselves should be able to indicate if they want the inherited values or not. That way we could support arbitrary key/values and avoid the application code having special knowledge of license_url etc.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376590265,https://api.github.com/repos/simonw/datasette/issues/185,376590265,MDEyOklzc3VlQ29tbWVudDM3NjU5MDI2NQ==,222245,carlmjohnson,2018-03-27T16:32:51Z,2018-03-27T16:32:51Z,NONE,">I think the templates themselves should be able to indicate if they want the inherited values or not. That way we could support arbitrary key/values and avoid the application code having special knowledge of license_url etc. Yes, you could have `metadata` that works like `metadata` does currently and `inherited_metadata` that works with inheritance.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376592044,https://api.github.com/repos/simonw/datasette/issues/185,376592044,MDEyOklzc3VlQ29tbWVudDM3NjU5MjA0NA==,222245,carlmjohnson,2018-03-27T16:38:23Z,2018-03-27T16:38:23Z,NONE,"It would be nice to also allow arbitrary keys (maybe under a parent key called params or something to prevent conflicts). For our datasette project, we just have a bunch of dictionaries defined in the base template for things like site URL and column humanized names: https://github.com/baltimore-sun-data/salaries-datasette/blob/master/templates/base.html It would be cleaner if this were in the metadata.json.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376604558,https://api.github.com/repos/simonw/datasette/issues/185,376604558,MDEyOklzc3VlQ29tbWVudDM3NjYwNDU1OA==,9599,simonw,2018-03-27T17:16:27Z,2018-03-27T17:16:27Z,OWNER,"I am SO inspired by what you've done with https://salaries.news.baltimoresun.com/ - that's pretty much my ideal use-case for Datasette, and it's by far the most elaborate customization I've seen so far. I'd love to hear other ideas that came up while building that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376614973,https://api.github.com/repos/simonw/datasette/issues/185,376614973,MDEyOklzc3VlQ29tbWVudDM3NjYxNDk3Mw==,222245,carlmjohnson,2018-03-27T17:49:00Z,2018-03-27T17:49:00Z,NONE,"@simonw Other than metadata, the biggest item on wishlist for the salaries project was the ability to reorder by column. Of course, that could be done with a custom SQL query, but we didn't want to have to reimplement all the nav/pagination stuff from scratch. @carolinp, feel free to add your thoughts. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-379595253,https://api.github.com/repos/simonw/datasette/issues/185,379595253,MDEyOklzc3VlQ29tbWVudDM3OTU5NTI1Mw==,9599,simonw,2018-04-09T00:24:10Z,2018-04-09T00:24:10Z,OWNER,@carlmjohnson in case you aren't following along with #189 I've shipped the first working prototype of sort-by-column - you can try it out here: https://datasette-issue-189-demo-2.now.sh/salaries-7859114-7859114/2017+Maryland+state+salaries?_search=university&_sort_desc=annual_salary,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-412299013,https://api.github.com/repos/simonw/datasette/issues/185,412299013,MDEyOklzc3VlQ29tbWVudDQxMjI5OTAxMw==,9599,simonw,2018-08-11T20:14:54Z,2018-08-11T20:14:54Z,OWNER,"I've been worrying about how this one relates to #260 - I'd like to validate metadata (to help protect against people e.g. misspelling `license_url` and then being confused when their license isn't displayed properly), but this issue requests the ability to add arbitrary additional keys to the metadata structure. I think the solution is to introduce a metadata key called `extra_metadata_keys` which allows you to specifically list the extra keys that you want to enable. Something like this: ``` { ""title"": ""My title"", ""source"": ""Source"", ""source_url"": ""https://www.example.com/"", ""release_date"": ""2018-04-01"", ""extra_metadata_keys"": [""release_date""] } ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-412663658,https://api.github.com/repos/simonw/datasette/issues/185,412663658,MDEyOklzc3VlQ29tbWVudDQxMjY2MzY1OA==,222245,carlmjohnson,2018-08-13T21:04:11Z,2018-08-13T21:04:11Z,NONE,That seems good to me.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/186#issuecomment-374810115,https://api.github.com/repos/simonw/datasette/issues/186,374810115,MDEyOklzc3VlQ29tbWVudDM3NDgxMDExNQ==,9599,simonw,2018-03-21T01:21:13Z,2018-03-21T01:21:13Z,OWNER,"Hah, this is exactly the opposite of datasette's default approach to caching, which is to cache everything for as long as possible. I don't think we'll need to add `Cache-Control: no-cache` headers provided we instead set it up so you can turn off Datasette's caching.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",306811513,proposal new option to disable user agents cache, https://github.com/simonw/datasette/issues/186#issuecomment-374811114,https://api.github.com/repos/simonw/datasette/issues/186,374811114,MDEyOklzc3VlQ29tbWVudDM3NDgxMTExNA==,9599,simonw,2018-03-21T01:28:30Z,2018-03-21T01:28:30Z,OWNER,"We actually have this already: https://github.com/simonw/datasette/blob/012fc7c5cd3e9160c9a4c19cc964253e97fb054a/datasette/cli.py#L253-L255 You can disable the cache headers using the `datasette --debug` option.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",306811513,proposal new option to disable user agents cache, https://github.com/simonw/datasette/issues/186#issuecomment-374872202,https://api.github.com/repos/simonw/datasette/issues/186,374872202,MDEyOklzc3VlQ29tbWVudDM3NDg3MjIwMg==,47107,stefanocudini,2018-03-21T09:07:22Z,2018-03-21T09:07:22Z,NONE,--debug is perfect tnk,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",306811513,proposal new option to disable user agents cache, https://github.com/simonw/datasette/issues/187#issuecomment-427943710,https://api.github.com/repos/simonw/datasette/issues/187,427943710,MDEyOklzc3VlQ29tbWVudDQyNzk0MzcxMA==,1583271,progpow,2018-10-08T18:58:05Z,2018-10-08T18:58:05Z,NONE,"I have same error: ``` Collecting uvloop Using cached https://files.pythonhosted.org/packages/5c/37/6daa39aac42b2deda6ee77f408bec0419b600e27b89b374b0d440af32b10/uvloop-0.11.2.tar.gz Complete output from command python setup.py egg_info: Traceback (most recent call last): File """", line 1, in File ""C:\Users\sageev\AppData\Local\Temp\pip-install-bq64l8jy\uvloop\setup.py"", line 15, in raise RuntimeError('uvloop does not support Windows at the moment') RuntimeError: uvloop does not support Windows at the moment ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/datasette/issues/187#issuecomment-463917744,https://api.github.com/repos/simonw/datasette/issues/187,463917744,MDEyOklzc3VlQ29tbWVudDQ2MzkxNzc0NA==,4190962,phoenixjun,2019-02-15T05:58:44Z,2019-02-15T05:58:44Z,NONE,is this supported or not? you can comment if it is not supported so that people like me can stop trying.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/datasette/issues/187#issuecomment-466325528,https://api.github.com/repos/simonw/datasette/issues/187,466325528,MDEyOklzc3VlQ29tbWVudDQ2NjMyNTUyOA==,2892252,fkuhn,2019-02-22T09:03:50Z,2019-02-22T09:03:50Z,NONE,"I ran into the same issue when trying to install datasette on windows after successfully using it on linux. Unfortunately, there has not been any progress in implementing uvloop for windows - so I recommend not to use it on win. You can read about this issue here: [https://github.com/MagicStack/uvloop/issues/14](url)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/datasette/issues/187#issuecomment-467264937,https://api.github.com/repos/simonw/datasette/issues/187,467264937,MDEyOklzc3VlQ29tbWVudDQ2NzI2NDkzNw==,9599,simonw,2019-02-26T02:14:28Z,2019-02-26T02:14:28Z,OWNER,I'm working on a port of Datasette to Starlette which I think would fix this issue: https://github.com/encode/starlette,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/datasette/issues/187#issuecomment-489353316,https://api.github.com/repos/simonw/datasette/issues/187,489353316,MDEyOklzc3VlQ29tbWVudDQ4OTM1MzMxNg==,46059,carsonyl,2019-05-04T18:36:36Z,2019-05-04T18:36:36Z,NONE,"Hi @simonw - I just hit this issue when trying out Datasette after your PyCon talk today. Datasette is pinned to Sanic 0.7.0, but it looks like 0.8.0 added the option to remove the uvloop dependency for Windows by having an environment variable `SANIC_NO_UVLOOP` at install time. Maybe that'll be sufficient before a port to Starlette?","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 1, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/datasette/issues/187#issuecomment-490039343,https://api.github.com/repos/simonw/datasette/issues/187,490039343,MDEyOklzc3VlQ29tbWVudDQ5MDAzOTM0Mw==,6422964,Maltazar,2019-05-07T11:24:42Z,2019-05-07T11:24:42Z,NONE,I totally agree with carsonyl,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/datasette/issues/187#issuecomment-502401636,https://api.github.com/repos/simonw/datasette/issues/187,502401636,MDEyOklzc3VlQ29tbWVudDUwMjQwMTYzNg==,9599,simonw,2019-06-15T21:44:23Z,2019-06-15T21:44:23Z,OWNER,I'm closing his as a duplicate of the new #511 - I hope to have his working very shortly as a result of #272 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/datasette/issues/188#issuecomment-376594727,https://api.github.com/repos/simonw/datasette/issues/188,376594727,MDEyOklzc3VlQ29tbWVudDM3NjU5NDcyNw==,9599,simonw,2018-03-27T16:46:49Z,2018-05-28T21:34:34Z,OWNER,"One point of complexity: datasette can be used to bundle multiple .db files into a single ""app"". I think that's OK. We could require that the `datasette_files` table is present in the first database file passed on the command-line. Or we could even construct a search path and consult multiple versions of the table spread across multiple files. That said... any configuration that corresponds to a specific table should live in the same database file as that table. Ditto for general metadata: if we have license/source information for a specific table or database that information should be able to live in the same .db file as the data.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309047460,Ability to bundle metadata and templates inside the SQLite file, https://github.com/simonw/datasette/issues/188#issuecomment-398778485,https://api.github.com/repos/simonw/datasette/issues/188,398778485,MDEyOklzc3VlQ29tbWVudDM5ODc3ODQ4NQ==,12617395,bsilverm,2018-06-20T14:48:39Z,2018-06-20T14:48:39Z,NONE,This would be a great feature to have!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309047460,Ability to bundle metadata and templates inside the SQLite file, https://github.com/simonw/datasette/issues/188#issuecomment-412291327,https://api.github.com/repos/simonw/datasette/issues/188,412291327,MDEyOklzc3VlQ29tbWVudDQxMjI5MTMyNw==,9599,simonw,2018-08-11T17:53:17Z,2018-08-11T17:53:17Z,OWNER,"Potential problem: the existing `metadata.json` format looks like this: ``` { ""title"": ""Custom title for your index page"", ""description"": ""Some description text can go here"", ""license"": ""ODbL"", ""license_url"": ""https://opendatacommons.org/licenses/odbl/"", ""databases"": { ""database1"": { ""source"": ""Alternative source"", ""source_url"": ""http://example.com/"", ""tables"": { ""example_table"": { ""description_html"": ""Custom table description"", ""license"": ""CC BY 3.0 US"", ""license_url"": ""https://creativecommons.org/licenses/by/3.0/us/"" } } } } } ``` This doesn't make sense for metadata that is bundled with a specific database - there's no point in having the `databases` key, instead the content of `database1` in the above example should be at the top level. This also means that if you rename the `*.db` file you won't have to edit its metadata at the same time. Calling such an embedded file `metadata.json` when the shape is different could be confusing. Maybe call it `database-metadata.json` instead.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309047460,Ability to bundle metadata and templates inside the SQLite file, https://github.com/simonw/datasette/issues/188#issuecomment-738905376,https://api.github.com/repos/simonw/datasette/issues/188,738905376,MDEyOklzc3VlQ29tbWVudDczODkwNTM3Ng==,9599,simonw,2020-12-04T17:18:34Z,2020-12-04T17:18:34Z,OWNER,This is likely to be covered by plugin hooks: #860 for the metadata and after investigating in #1042 it looks like the existing `prepare_jinja2_environment` hook may already be enough to load templates from the database.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309047460,Ability to bundle metadata and templates inside the SQLite file, https://github.com/simonw/datasette/issues/189#issuecomment-376981291,https://api.github.com/repos/simonw/datasette/issues/189,376981291,MDEyOklzc3VlQ29tbWVudDM3Njk4MTI5MQ==,9599,simonw,2018-03-28T18:06:08Z,2018-03-28T18:06:08Z,OWNER,"Given how unlikely it is that this will pose a real problem I think I like option 1: enable sort-by-column by default for all tables, then allow power users to instead switch to explicit enabling of the functionality in their `metadata.json` if they know their data is too big.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-376983741,https://api.github.com/repos/simonw/datasette/issues/189,376983741,MDEyOklzc3VlQ29tbWVudDM3Njk4Mzc0MQ==,9599,simonw,2018-03-28T18:12:35Z,2018-03-28T18:12:35Z,OWNER,"I think this can work with a `?_sort=xxx` parameter - and `?_sort=-xxx` to sort in the opposite direction. I'd like to support ""sort by X descending, then by Y ascending if there are dupes for X"" as well. Two ways that could work: `?_sort=-xxx,yyy` Or... `?_sort=-xxx&_sort=yyy` The second option is probably better in that it makes it easier for columns to have a comma in their name. Is it possible for a SQLite column to start with a `-` character?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-376986668,https://api.github.com/repos/simonw/datasette/issues/189,376986668,MDEyOklzc3VlQ29tbWVudDM3Njk4NjY2OA==,9599,simonw,2018-03-28T18:21:53Z,2018-03-28T18:21:53Z,OWNER,"Might have to do something special to get sort-by-nulls-last: https://stackoverflow.com/questions/12503120/how-to-do-nulls-last-in-sqlite order by ifnull(column_name, -999999) Would need to figure out a smart way to get the default value - maybe by running a min() or max() against the column first?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377049625,https://api.github.com/repos/simonw/datasette/issues/189,377049625,MDEyOklzc3VlQ29tbWVudDM3NzA0OTYyNQ==,9599,simonw,2018-03-28T21:52:05Z,2018-03-28T21:52:05Z,OWNER,"This is a better pattern as you don't have to pick a minimum value: ORDER BY CASE WHEN SOMECOL IS NULL THEN 1 ELSE 0 END, SOMECOL","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377050461,https://api.github.com/repos/simonw/datasette/issues/189,377050461,MDEyOklzc3VlQ29tbWVudDM3NzA1MDQ2MQ==,9599,simonw,2018-03-28T21:55:14Z,2018-03-28T22:06:30Z,OWNER,"I think there are actually four kinds of sort order we need to support; * ascending * descending * ascending, nulls last * descending, nulls last It looks like [-blah] is a valid SQLite table name, so mark I descending with a hyphen prefix isn't good. Instead, maybe this: ?_sort_asc=col1&_sort_desc_nulls_last=col2 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377051018,https://api.github.com/repos/simonw/datasette/issues/189,377051018,MDEyOklzc3VlQ29tbWVudDM3NzA1MTAxOA==,9599,simonw,2018-03-28T21:57:20Z,2018-03-28T22:00:17Z,OWNER,"I'd like to continue to support _next=token pagination even for custom sort orders. To do that I should include rowid (or general primary key) as the tie breaker on all sorts so I can incorporate that it into the _next= token.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377052634,https://api.github.com/repos/simonw/datasette/issues/189,377052634,MDEyOklzc3VlQ29tbWVudDM3NzA1MjYzNA==,9599,simonw,2018-03-28T22:03:16Z,2018-03-28T22:03:16Z,OWNER,"In terms of user interface: the obvious place to put this is as a drop down menu on the column headers. This also means the UI can support combined sort orders. Assuming you are already sorted by county descending and you select the candidate column header, the options could be: * sort all by candidate * sort all by candidate, descending * sort by county descending, then by candidate * sort by county descending, then by candidate descending","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377054358,https://api.github.com/repos/simonw/datasette/issues/189,377054358,MDEyOklzc3VlQ29tbWVudDM3NzA1NDM1OA==,9599,simonw,2018-03-28T22:09:25Z,2018-03-28T22:09:25Z,OWNER,I'm tempted to put these verbose sorting options inline in the page HTML but have them in the table footer so they don't clog up the top half of the page with uninteresting links - then use JavaScript to hoik them out into a dropdown menu attached to each column header.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377055663,https://api.github.com/repos/simonw/datasette/issues/189,377055663,MDEyOklzc3VlQ29tbWVudDM3NzA1NTY2Mw==,9599,simonw,2018-03-28T22:14:53Z,2018-03-28T22:14:53Z,OWNER,"There is one other interesting option for auto-enabling/disabling sort: the inspect command could include data about column index presence and whether or not a column has any null values in it. This would allow us to dynamically include a ""nulls last"" option but only for columns that contain at least one null. It's quite a lot of additional engineering for a very minor feature though, so I think I'll punt on that for the moment. We may find that the _group_count feature can benefit from column value statistics later on though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377362466,https://api.github.com/repos/simonw/datasette/issues/189,377362466,MDEyOklzc3VlQ29tbWVudDM3NzM2MjQ2Ng==,9599,simonw,2018-03-29T20:29:14Z,2018-03-29T20:29:14Z,OWNER,"Alternative idea: by default enable all sorting in the UI. If a table has more than 100,000 rows disable sorting UI except for columns that have an index. Allow this to be overridden in metadata.json ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377459579,https://api.github.com/repos/simonw/datasette/issues/189,377459579,MDEyOklzc3VlQ29tbWVudDM3NzQ1OTU3OQ==,9599,simonw,2018-03-30T06:47:52Z,2018-03-30T06:47:52Z,OWNER,"I'm not entirely sure how to get `_next=` pagination working against sorted collections when a tie-breaker is needed. Consider this data: https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=select+rowid%2C+*+from+%5Bnfl-wide-receivers%2Fadvanced-historical%5D%0D%0Aorder+by+case+when+career_ranypa+is+null+then+1+else+0+end%2C+career_ranypa%2C+rowid+limit+11 ![2018-03-29 at 11 46 pm](https://user-images.githubusercontent.com/9599/38127549-790c8bd0-33ab-11e8-8d32-66f5d3847c8a.png) If the page size was set to 9 rather than 11, the page divide would be between those two rows with the same value in the `career_ranypa` column. What would the `?_next=` token look like such that the correct row would be returned? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377460127,https://api.github.com/repos/simonw/datasette/issues/189,377460127,MDEyOklzc3VlQ29tbWVudDM3NzQ2MDEyNw==,9599,simonw,2018-03-30T06:51:29Z,2018-03-30T06:51:52Z,OWNER,"The problem is that our `_next=` pagination currently works based on a `>` - but for this case a `>=` for the value is needed combined with a `>` on the tie-breaker (which would be the `rowid` column). So I think this is the right SQL: ``` select rowid, * from [nfl-wide-receivers/advanced-historical] where career_ranypa >= -6.331167749 and rowid > 2736 order by case when career_ranypa is null then 1 else 0 end, career_ranypa, rowid limit 11 ``` https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=select+rowid%2C+*+from+%5Bnfl-wide-receivers%2Fadvanced-historical%5D%0D%0Awhere+career_ranypa+%3E%3D+-6.331167749+and+rowid+%3E+2736%0D%0Aorder+by+case+when+career_ranypa+is+null+then+1+else+0+end%2C+career_ranypa%2C+rowid+limit+11 But how do I encode a `_next` token that means "">= X and > Y""?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377462334,https://api.github.com/repos/simonw/datasette/issues/189,377462334,MDEyOklzc3VlQ29tbWVudDM3NzQ2MjMzNA==,9599,simonw,2018-03-30T07:06:21Z,2018-03-30T07:06:21Z,OWNER,"Maybe the answer here is that anything that's encoded in the next token is treated as >= with the exception of columns known to be primary keys, which are treated as >","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377546510,https://api.github.com/repos/simonw/datasette/issues/189,377546510,MDEyOklzc3VlQ29tbWVudDM3NzU0NjUxMA==,9599,simonw,2018-03-30T15:13:11Z,2018-03-30T15:13:11Z,OWNER,"Pushed some work-in-progress with failing unit tests here: https://github.com/simonw/datasette/commit/2f8359c6f25768805431c80c74e5ec4213c2b2a6 Here's a demo: https://datasette-column-sort-wip.now.sh/sortable-4bbaa6f/sortable?_sort=sortable - note that the `_sort_desc` and `_sort_nulls_last` options aren't done yet, plus it doesn't correctly paginate (the `_next` tokens do not yet take sorting into account).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377547265,https://api.github.com/repos/simonw/datasette/issues/189,377547265,MDEyOklzc3VlQ29tbWVudDM3NzU0NzI2NQ==,9599,simonw,2018-03-30T15:16:43Z,2018-03-30T15:16:43Z,OWNER,"I think this is the right incantation for a ""next"" link: https://datasette-column-sort-wip.now.sh/sortable-4bbaa6f?sql=select+*+from+sortable%0D%0Awhere+sortable+%3C%3D+94%0D%0Aand+%28%0D%0A++%28pk1+%3E+%27d%27%29%0D%0A++or%0D%0A++%28pk1+%3D+%27d%27+and+pk2+%3E+%27w%27%29%0D%0A%29%0D%0Aorder+by+sortable+desc%2C+pk1%2C+pk2%0D%0Alimit+7","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379555484,https://api.github.com/repos/simonw/datasette/issues/189,379555484,MDEyOklzc3VlQ29tbWVudDM3OTU1NTQ4NA==,9599,simonw,2018-04-08T14:39:57Z,2018-04-08T14:39:57Z,OWNER,I'm going to combine the code for explicit sorting with the existing code for _next= pagination - so even tables without an explicit sort order will run through the same code since they are ordered and paginated by primary key.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379556774,https://api.github.com/repos/simonw/datasette/issues/189,379556774,MDEyOklzc3VlQ29tbWVudDM3OTU1Njc3NA==,9599,simonw,2018-04-08T14:59:05Z,2018-04-08T14:59:05Z,OWNER,"A common problem with keyset pagination is that it can distort the ""total number of rows"" logic - every time you navigate to a further page the total rows count can decrease due to the extra arguments in the `where` clause. The `filtered_table_rows` value (see #194) calculated using `count_sql` currently has this problem.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379557982,https://api.github.com/repos/simonw/datasette/issues/189,379557982,MDEyOklzc3VlQ29tbWVudDM3OTU1Nzk4Mg==,9599,simonw,2018-04-08T15:16:49Z,2018-04-08T15:16:49Z,OWNER,"A note about views: a view cannot be paginated using keyset pagination because records returned from a view don't have a primary key - so there's no way to reliably distinguish between _next= records when the sorted column has duplicates with the same value. Datasette already takes this into account: views are paginated using offset/limit instead. We can continue to do that even for views that have been sorted using a `_sort` parameter. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379591062,https://api.github.com/repos/simonw/datasette/issues/189,379591062,MDEyOklzc3VlQ29tbWVudDM3OTU5MTA2Mg==,9599,simonw,2018-04-08T23:23:12Z,2018-04-08T23:23:12Z,OWNER,"To break this up into smaller units, the first implementation of this will only support a single `_sort` or `_sort_desc` querystring parameter.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379592393,https://api.github.com/repos/simonw/datasette/issues/189,379592393,MDEyOklzc3VlQ29tbWVudDM3OTU5MjM5Mw==,9599,simonw,2018-04-08T23:45:42Z,2018-04-08T23:46:31Z,OWNER,"Actually next page SQL when sorting looks more like this: ``` select rowid, * from [alcohol-consumption/drinks] where ""country"" like :p0 and ( beer_servings > 111 or (beer_servings = 111 and rowid > 190) ) order by beer_servings, rowid limit 101 ``` The next page after row 190 with sortable value 111 should show either records that are greater than 111 or records that match 111 but have a greater primary key than the last one seen. https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=select+rowid%2C+*+from+%5Balcohol-consumption%2Fdrinks%5D%0D%0Awhere+%22country%22+like+%3Ap0%0D%0Aand+%28%0D%0A++++beer_servings+%3E+111%0D%0A++++or+%28beer_servings+%3D+111+and+rowid+%3E+190%29%0D%0A%29%0D%0Aorder+by+beer_servings%2C+rowid+limit+101&p0=%25a%25","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379594529,https://api.github.com/repos/simonw/datasette/issues/189,379594529,MDEyOklzc3VlQ29tbWVudDM3OTU5NDUyOQ==,9599,simonw,2018-04-09T00:15:03Z,2018-04-09T00:15:03Z,OWNER,"Demo: senator tweets ordered by number of replies: https://datasette-issue-189-demo.now.sh/fivethirtyeight-2628db9/twitter-ratio%2Fsenators?_sort_desc=replies Page 2 (note that since Senators retweet things there are tweets with the same text/number-of-replies but retweeted by different senators that span the page break): https://datasette-issue-189-demo.now.sh/fivethirtyeight-2628db9/twitter-ratio%2Fsenators?_next=8556%2C121799&_sort_desc=replies ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379595274,https://api.github.com/repos/simonw/datasette/issues/189,379595274,MDEyOklzc3VlQ29tbWVudDM3OTU5NTI3NA==,9599,simonw,2018-04-09T00:24:37Z,2018-04-09T00:29:46Z,OWNER,"Another demo: https://datasette-issue-189-demo-2.now.sh/salaries-7859114-7859114/2017+Maryland+state+salaries?_search=university&_sort_desc=annual_salary https://datasette-issue-189-demo-2.now.sh/salaries-7859114-7859114/2017+Maryland+state+salaries?_search=university&last_name__exact=JOHNSON&_sort_desc=annual_salary","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379602339,https://api.github.com/repos/simonw/datasette/issues/189,379602339,MDEyOklzc3VlQ29tbWVudDM3OTYwMjMzOQ==,9599,simonw,2018-04-09T01:33:26Z,2018-04-09T01:33:26Z,OWNER,"Small bug: ""201 rows where sorted by sortable_with_nulls"" shouldn't have the word ""where"" in it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379602690,https://api.github.com/repos/simonw/datasette/issues/189,379602690,MDEyOklzc3VlQ29tbWVudDM3OTYwMjY5MA==,9599,simonw,2018-04-09T01:37:03Z,2018-04-09T01:37:03Z,OWNER,"I'm going to split the following out into separate tickets: * Ability to sort by multiple columns e.g. `?_sort=name&sort_desc=age&_sort=height` * Ability to specify nulls last e.g. `?_sort_desc_nulls_last=age`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379603156,https://api.github.com/repos/simonw/datasette/issues/189,379603156,MDEyOklzc3VlQ29tbWVudDM3OTYwMzE1Ng==,9599,simonw,2018-04-09T01:41:22Z,2018-04-09T01:41:22Z,OWNER,"Actually I think I always want nulls last when ordering asc, nulls first when ordering desc.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379608977,https://api.github.com/repos/simonw/datasette/issues/189,379608977,MDEyOklzc3VlQ29tbWVudDM3OTYwODk3Nw==,9599,simonw,2018-04-09T02:22:59Z,2018-04-09T02:22:59Z,OWNER,"Here's a demo of the new clickable column headers: https://datasette-issue-189-demo-3.now.sh/salaries-7859114-7859114/2017+Maryland+state+salaries?_search=university&_sort_desc=last_name ![2018-04-08 at 7 22 pm](https://user-images.githubusercontent.com/9599/38476370-3e62a60e-3b62-11e8-9d30-8dc6608133dd.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379634425,https://api.github.com/repos/simonw/datasette/issues/189,379634425,MDEyOklzc3VlQ29tbWVudDM3OTYzNDQyNQ==,9599,simonw,2018-04-09T05:16:02Z,2018-04-09T05:16:02Z,OWNER,I've merged this into master.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379791047,https://api.github.com/repos/simonw/datasette/issues/189,379791047,MDEyOklzc3VlQ29tbWVudDM3OTc5MTA0Nw==,222245,carlmjohnson,2018-04-09T15:23:45Z,2018-04-09T15:23:45Z,NONE,Awesome!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379803864,https://api.github.com/repos/simonw/datasette/issues/189,379803864,MDEyOklzc3VlQ29tbWVudDM3OTgwMzg2NA==,9599,simonw,2018-04-09T16:02:09Z,2018-04-09T16:02:09Z,OWNER,This is now released in Datasette 0.15 https://github.com/simonw/datasette/releases/tag/0.15,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379830529,https://api.github.com/repos/simonw/datasette/issues/189,379830529,MDEyOklzc3VlQ29tbWVudDM3OTgzMDUyOQ==,9599,simonw,2018-04-09T17:28:47Z,2018-04-09T17:28:47Z,OWNER,Another demo: https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9/congress-age%2Fcongress-terms,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-381429213,https://api.github.com/repos/simonw/datasette/issues/189,381429213,MDEyOklzc3VlQ29tbWVudDM4MTQyOTIxMw==,222245,carlmjohnson,2018-04-15T18:54:22Z,2018-04-15T18:54:22Z,NONE,"I think I found a bug. I tried to sort by middle initial in my salaries set, and many middle initials are null. The next_url gets set by Datasette to: http://localhost:8001/salaries-d3a5631/2017+Maryland+state+salaries?_next=None%2C391&_sort=middle_initial But then `None` is interpreted literally and it tries to find a name with the middle initial ""None"" and ends up skipping ahead to O on page 2.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/190#issuecomment-377065541,https://api.github.com/repos/simonw/datasette/issues/190,377065541,MDEyOklzc3VlQ29tbWVudDM3NzA2NTU0MQ==,9599,simonw,2018-03-28T22:58:52Z,2018-03-28T22:58:52Z,OWNER,"This is because the SQL we are using here is: select * from compound_primary_key where ""pk1"" > ""d"" and ""pk2"" > ""v"" order by pk1, pk2 limit 101 This is incorrect. The correct SQL syntax (according to the example on https://www.sqlite.org/rowvalue.html#scrolling_window_queries ) is: select * from compound_primary_key where (""pk1"", ""pk2"") > (""d"", ""v"") order by pk1, pk2 limit 101 BUT... this uses ""row values"" syntax which was only added to SQLite in version 3.15.0 in October 2016: https://sqlite.org/changes.html#version_3_15_0 The version on https://datasette-issue-190-compound-pks.now.sh/compound-pks-9aafe8f?sql=select+sqlite_version%28%29%3B is 3.8.7.1 from October 2014.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/190#issuecomment-377066466,https://api.github.com/repos/simonw/datasette/issues/190,377066466,MDEyOklzc3VlQ29tbWVudDM3NzA2NjQ2Ng==,9599,simonw,2018-03-28T23:03:45Z,2018-03-28T23:03:57Z,OWNER,"Without row values syntax, the necessary SQL to retrieve the next page after `d, v` gets a bit gnarly: select * from compound_primary_key where pk1 >= ""d"" and not (pk1 = ""d"" and pk2 <= ""v"") order by pk1, pk2 See https://datasette-issue-190-compound-pks.now.sh/compound-pks-9aafe8f?sql=select+*+from+compound_primary_key+where+pk1+%3E%3D+%22d%22+and+not+%28pk1+%3D+%22d%22+and+pk2+%3C%3D+%22v%22%29+order+by+pk1%2C+pk2 This article was useful for figuring this out: https://use-the-index-luke.com/sql/partial-results/fetch-next-page","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/190#issuecomment-377067541,https://api.github.com/repos/simonw/datasette/issues/190,377067541,MDEyOklzc3VlQ29tbWVudDM3NzA2NzU0MQ==,9599,simonw,2018-03-28T23:09:18Z,2018-03-28T23:09:51Z,OWNER,"Here's how I generated the table for testing this with 3 compound primary keys: CREATE_SQL = ''' CREATE TABLE compound_three_primary_keys ( pk1 varchar(30), pk2 varchar(30), pk3 varchar(30), content text, PRIMARY KEY (pk1, pk2, pk3) );''' alphabet = 'abcdefghijklmnopqrstuvwxyz' for a in alphabet: for b in alphabet: for c in alphabet: print(''' INSERT INTO compound_three_primary_keys VALUES ('{}', '{}', '{}', '{}'); '''.strip().format(a, b, c, '{}-{}-{}-{}-{}-{}'.format(a,b,c,a,b,c))) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/190#issuecomment-377072022,https://api.github.com/repos/simonw/datasette/issues/190,377072022,MDEyOklzc3VlQ29tbWVudDM3NzA3MjAyMg==,9599,simonw,2018-03-28T23:32:24Z,2018-03-28T23:32:24Z,OWNER,"Here's the SQL for a next page with three compound primary keys: https://datasette-issue-190-compound-pks.now.sh/compound-pks-8e99805?sql=select+*+from+compound_three_primary_keys%0D%0Awhere%0D%0A++%28pk1+%3E+%3Apk1%29%0D%0A++++or%0D%0A++%28pk1+%3D+%3Apk1+and+pk2+%3E+%3Apk2%29%0D%0A++++or%0D%0A++%28pk1+%3D+%3Apk1+and+pk2+%3D+%3Apk2+and+pk3+%3E+%3Apk3%29%0D%0Aorder+by+pk1%2C+pk2%2C+pk3%3B%0D%0A%0D%0A%0D%0A&pk1=a&pk2=d&pk3=v ``` select * from compound_three_primary_keys where (pk1 > :pk1) or (pk1 = :pk1 and pk2 > :pk2) or (pk1 = :pk1 and pk2 = :pk2 and pk3 > :pk3) order by pk1, pk2, pk3; ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/190#issuecomment-377454591,https://api.github.com/repos/simonw/datasette/issues/190,377454591,MDEyOklzc3VlQ29tbWVudDM3NzQ1NDU5MQ==,9599,simonw,2018-03-30T06:11:59Z,2018-03-30T06:11:59Z,OWNER,"Re-opening this issue: my fix doesn't play nicely with extra filter arguments. Consider this page: https://datasette-issue-190-compound-pks-not-quite-fixed.now.sh/compound-pks-8e99805/compound_three_primary_keys?content__contains=d The next link is to `?_next=f%2Cz%2Ct&content__contains=z` (that's next of `f,z,t`) but that gives us https://datasette-issue-190-compound-pks-not-quite-fixed.now.sh/compound-pks-8e99805/compound_three_primary_keys?_next=b%2Cx%2Cd&content__contains=d which shows `a,a,d` at the top. Sure enough, the generated SQL looks like this: https://datasette-issue-190-compound-pks-not-quite-fixed.now.sh/compound-pks-8e99805?sql=select+%2A+from+compound_three_primary_keys+where+%22content%22+like+%3Ap0+and+%28%5Bpk1%5D+%3E+%3Ap0%29%0A++or%0A%28%5Bpk1%5D+%3D+%3Ap0+and+%5Bpk2%5D+%3E+%3Ap1%29%0A++or%0A%28%5Bpk1%5D+%3D+%3Ap0+and+%5Bpk2%5D+%3D+%3Ap1+and+%5Bpk3%5D+%3E+%3Ap2%29+order+by+pk1%2C+pk2%2C+pk3+limit+101&p0=%25d%25&p1=b&p2=x&p3=d select * from compound_three_primary_keys where ""content"" like :p0 and ([pk1] > :p0) or ([pk1] = :p0 and [pk2] > :p1) or ([pk1] = :p0 and [pk2] = :p1 and [pk3] > :p2) order by pk1, pk2, pk3 limit 101 The parameters here are confused. The :p0 should be reserved just for the like clause - the other parameters should be p1, p2 and p3 (not p0, p1 and p2).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/190#issuecomment-377457087,https://api.github.com/repos/simonw/datasette/issues/190,377457087,MDEyOklzc3VlQ29tbWVudDM3NzQ1NzA4Nw==,9599,simonw,2018-03-30T06:30:23Z,2018-03-30T06:30:23Z,OWNER,"Interestingly, in deploying a copy of the database to demonstrate this final bug fix I had to use the `--force` argument like so: datasette publish now --branch=master compound-pks.db --force This is because `now` had already deployed a Dockerfile referencing `--branch=master` once already, so it thought nothing had changed and it could re-use that last deployment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/190#issuecomment-377457214,https://api.github.com/repos/simonw/datasette/issues/190,377457214,MDEyOklzc3VlQ29tbWVudDM3NzQ1NzIxNA==,9599,simonw,2018-03-30T06:31:15Z,2018-03-30T06:31:15Z,OWNER,"Fixed! https://datasette-issue-190-compound-pks-second-fix.now.sh/compound-pks-8e99805/compound_three_primary_keys?_next=b%2Cx%2Cd&content__contains=d now correctly shows `b,y,d` as the first row on the page.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/191#issuecomment-381488049,https://api.github.com/repos/simonw/datasette/issues/191,381488049,MDEyOklzc3VlQ29tbWVudDM4MTQ4ODA0OQ==,9599,simonw,2018-04-16T05:58:15Z,2018-04-16T05:58:15Z,OWNER,"I think this is pretty hard. @coleifer has done some work in this direction, including https://github.com/coleifer/pysqlite3 which ports the standalone pysqlite module to Python 3. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/191#issuecomment-381602005,https://api.github.com/repos/simonw/datasette/issues/191,381602005,MDEyOklzc3VlQ29tbWVudDM4MTYwMjAwNQ==,119974,coleifer,2018-04-16T13:37:32Z,2018-04-16T13:37:32Z,NONE,I don't think it should be too difficult... you can look at what @ghaering did with pysqlite (and similarly what I copied for pysqlite3). You would theoretically take an amalgamation build of Sqlite (all code in a single .c and .h file). The `AmalgamationLibSqliteBuilder` class detects the presence of this amalgamated source file and builds a statically-linked pysqlite.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/191#issuecomment-392822050,https://api.github.com/repos/simonw/datasette/issues/191,392822050,MDEyOklzc3VlQ29tbWVudDM5MjgyMjA1MA==,9599,simonw,2018-05-29T15:33:25Z,2018-05-29T15:33:25Z,OWNER,"I don't know how it happened, but I've somehow got myself into a state where my local SQLite for Python 3 on OS X is `3.23.1`: ``` ~ $ python3 Python 3.6.5 (default, Mar 30 2018, 06:41:53) [GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.39.2)] on darwin Type ""help"", ""copyright"", ""credits"" or ""license"" for more information. >>> import sqlite3 >>> sqlite3.connect(':memory:').execute('select sqlite_version()').fetchall() [('3.23.1',)] >>> ``` Maybe I did something in homebrew that changed this? I'd love to understand what exactly I did to get to this state.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/191#issuecomment-392828475,https://api.github.com/repos/simonw/datasette/issues/191,392828475,MDEyOklzc3VlQ29tbWVudDM5MjgyODQ3NQ==,119974,coleifer,2018-05-29T15:50:18Z,2018-05-29T15:50:18Z,NONE,"Python standard-library SQLite dynamically links against the system sqlite3. So presumably you installed a more up-to-date sqlite3 somewhere on your `LD_LIBRARY_PATH`. To compile a statically-linked pysqlite you need to include an amalgamation in the project root when building the extension. Read the relevant setup.py.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/191#issuecomment-392831543,https://api.github.com/repos/simonw/datasette/issues/191,392831543,MDEyOklzc3VlQ29tbWVudDM5MjgzMTU0Mw==,9599,simonw,2018-05-29T15:58:33Z,2018-05-29T15:58:33Z,OWNER,"I ran an informal survey on twitter and most people were on 3.21 - https://twitter.com/simonw/status/1001487546289815553 Maybe this is from upgrading to the latest OS X release.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/191#issuecomment-403908704,https://api.github.com/repos/simonw/datasette/issues/191,403908704,MDEyOklzc3VlQ29tbWVudDQwMzkwODcwNA==,9599,simonw,2018-07-10T17:46:13Z,2018-07-10T17:46:13Z,OWNER,I consider this resolved by #46 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/193#issuecomment-379142500,https://api.github.com/repos/simonw/datasette/issues/193,379142500,MDEyOklzc3VlQ29tbWVudDM3OTE0MjUwMA==,222245,carlmjohnson,2018-04-06T04:05:58Z,2018-04-06T04:05:58Z,NONE,"You could try pulling out a validate query strings method. If it fails validation build the error object from the message. If it passes, you only need to go down a happy path. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310882100,Cleaner mechanism for handling custom errors, https://github.com/simonw/datasette/issues/193#issuecomment-379624163,https://api.github.com/repos/simonw/datasette/issues/193,379624163,MDEyOklzc3VlQ29tbWVudDM3OTYyNDE2Mw==,9599,simonw,2018-04-09T04:03:49Z,2018-04-09T04:03:49Z,OWNER,"This is harder than I thought, because the `_shape=` logic actually runs AFTER the main block of code which is set up to catch exceptions - this code here: https://github.com/simonw/datasette/blob/0abd3abacb309a2bd5913a7a2df4e9256585b1bb/datasette/app.py#L200-L216","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310882100,Cleaner mechanism for handling custom errors, https://github.com/simonw/datasette/issues/193#issuecomment-380619851,https://api.github.com/repos/simonw/datasette/issues/193,380619851,MDEyOklzc3VlQ29tbWVudDM4MDYxOTg1MQ==,9599,simonw,2018-04-11T22:48:19Z,2018-04-11T22:48:19Z,OWNER,I can clean this up further with the mechanism I'm using for #184,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310882100,Cleaner mechanism for handling custom errors, https://github.com/simonw/datasette/issues/194#issuecomment-379556881,https://api.github.com/repos/simonw/datasette/issues/194,379556881,MDEyOklzc3VlQ29tbWVudDM3OTU1Njg4MQ==,9599,simonw,2018-04-08T15:00:48Z,2018-04-08T15:02:35Z,OWNER,`table_rows_count` is always the *total* number of rows in the table. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312312125,Rename table_rows and filtered_table_rows to have _count suffix, https://github.com/simonw/datasette/issues/194#issuecomment-379556981,https://api.github.com/repos/simonw/datasette/issues/194,379556981,MDEyOklzc3VlQ29tbWVudDM3OTU1Njk4MQ==,9599,simonw,2018-04-08T15:02:23Z,2018-04-08T15:02:23Z,OWNER,Maybe `table_rows_filtered_count` would be more aesthetically pleasing than `filtered_table_rows_count`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312312125,Rename table_rows and filtered_table_rows to have _count suffix, https://github.com/simonw/datasette/issues/195#issuecomment-379557743,https://api.github.com/repos/simonw/datasette/issues/195,379557743,MDEyOklzc3VlQ29tbWVudDM3OTU1Nzc0Mw==,9599,simonw,2018-04-08T15:13:18Z,2018-04-08T15:13:18Z,OWNER,https://github.com/simonw/datasette/blob/446d47fdb005b3776bc06ad8d1f44b01fc2e938b/datasette/app.py#L93-L102,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312313496,"Run pks_for_table in inspect, executing once at build time rather than constantly", https://github.com/simonw/datasette/issues/195#issuecomment-379559074,https://api.github.com/repos/simonw/datasette/issues/195,379559074,MDEyOklzc3VlQ29tbWVudDM3OTU1OTA3NA==,9599,simonw,2018-04-08T15:31:49Z,2018-04-08T15:31:49Z,OWNER,"While I'm at it, doing the same thing for fts_table detection is worth considering: https://github.com/simonw/datasette/blob/446d47fdb005b3776bc06ad8d1f44b01fc2e938b/datasette/app.py#L598-L603","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312313496,"Run pks_for_table in inspect, executing once at build time rather than constantly", https://github.com/simonw/datasette/issues/195#issuecomment-379588602,https://api.github.com/repos/simonw/datasette/issues/195,379588602,MDEyOklzc3VlQ29tbWVudDM3OTU4ODYwMg==,9599,simonw,2018-04-08T22:40:16Z,2018-04-08T22:40:16Z,OWNER,"Could also identify all views for that database, which would save on these queries: https://github.com/simonw/datasette/blob/b2188f044265c95f7e54860e28107c17d2a6ed2e/datasette/app.py#L543-L545","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312313496,"Run pks_for_table in inspect, executing once at build time rather than constantly", https://github.com/simonw/datasette/issues/199#issuecomment-379833216,https://api.github.com/repos/simonw/datasette/issues/199,379833216,MDEyOklzc3VlQ29tbWVudDM3OTgzMzIxNg==,9599,simonw,2018-04-09T17:37:47Z,2018-04-09T17:37:47Z,OWNER,I may do this by adding select boxes for _sort and _sort_desc to the filters UI. This would allow sorting in mobile portrait mode but would also ensure that the existing sort order is persisted if the user edits the current filters (right now sort resets when filters are applied).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312620566,Ability to apply sort on mobile in portrait mode, https://github.com/simonw/datasette/issues/199#issuecomment-379833481,https://api.github.com/repos/simonw/datasette/issues/199,379833481,MDEyOklzc3VlQ29tbWVudDM3OTgzMzQ4MQ==,9599,simonw,2018-04-09T17:38:39Z,2018-04-09T17:38:39Z,OWNER,"Since you can't apply `_sort` and `_sort_desc` at the same time, maybe just one select box for picking the column to sort by and a boolean checkbox for ""sort descending"" - which then redirects to the `_sort_desc=` URL variant.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312620566,Ability to apply sort on mobile in portrait mode, https://github.com/simonw/datasette/issues/199#issuecomment-379936068,https://api.github.com/repos/simonw/datasette/issues/199,379936068,MDEyOklzc3VlQ29tbWVudDM3OTkzNjA2OA==,9599,simonw,2018-04-10T00:32:37Z,2018-04-10T00:32:37Z,OWNER,"![2018-04-09 at 5 32 pm](https://user-images.githubusercontent.com/9599/38529802-fd2a7e68-3c1b-11e8-974a-bf5438fec701.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312620566,Ability to apply sort on mobile in portrait mode, https://github.com/simonw/datasette/issues/199#issuecomment-379936832,https://api.github.com/repos/simonw/datasette/issues/199,379936832,MDEyOklzc3VlQ29tbWVudDM3OTkzNjgzMg==,9599,simonw,2018-04-10T00:37:52Z,2018-04-10T00:37:52Z,OWNER,Demo: https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9/twitter-ratio%2Fsenators?_sort_desc=replies&text__contains=bipartisan,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312620566,Ability to apply sort on mobile in portrait mode, https://github.com/simonw/datasette/pull/200#issuecomment-380606998,https://api.github.com/repos/simonw/datasette/issues/200,380606998,MDEyOklzc3VlQ29tbWVudDM4MDYwNjk5OA==,9599,simonw,2018-04-11T21:50:14Z,2018-04-11T21:50:14Z,OWNER,"We should only do this if we're certain the spatialite module has been loaded. I could imagine someone having a `sql_statements_log` table of their own without using spatialite for example. I think the most reliable way to detect spatialite is to run `SELECT AddGeometryColumn(1, 2, 3, 4, 5);` against a `:memory:` database and see if it throws an exception - similar to how we detect FTS. We could add this as a `detect_spatialite()` function in `utils.py` and call it once on startup.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313494458,Hide Spatialite system tables, https://github.com/simonw/datasette/pull/200#issuecomment-380608372,https://api.github.com/repos/simonw/datasette/issues/200,380608372,MDEyOklzc3VlQ29tbWVudDM4MDYwODM3Mg==,45057,russss,2018-04-11T21:55:46Z,2018-04-11T21:55:46Z,CONTRIBUTOR,"> I think the most reliable way to detect spatialite is to run `SELECT AddGeometryColumn(1, 2, 3, 4, 5);` against a `:memory:` database and see if it throws an exception Or just see if there's a `geometry_columns` table? I think that's quite unlikely to be added by accident (and it's an OGC standard). It also tells you if Spatialite is installed in the database rather than just loaded.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313494458,Hide Spatialite system tables, https://github.com/simonw/datasette/pull/200#issuecomment-380951474,https://api.github.com/repos/simonw/datasette/issues/200,380951474,MDEyOklzc3VlQ29tbWVudDM4MDk1MTQ3NA==,9599,simonw,2018-04-12T21:34:39Z,2018-04-12T21:34:39Z,OWNER,"Nice, thanks very much.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313494458,Hide Spatialite system tables, https://github.com/simonw/datasette/issues/201#issuecomment-381262824,https://api.github.com/repos/simonw/datasette/issues/201,381262824,MDEyOklzc3VlQ29tbWVudDM4MTI2MjgyNA==,9599,simonw,2018-04-13T21:17:14Z,2018-04-13T21:17:14Z,OWNER,"Demo: https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=explain+query+plan+select+*+from+%5Bmost-common-name%2Fsurnames%5D+order+by+rank+desc https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=explain+select+*+from+%5Bmost-common-name%2Fsurnames%5D+order+by+rank+desc","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313512748,Support explain select / explain query plan select, https://github.com/simonw/datasette/pull/202#issuecomment-381220441,https://api.github.com/repos/simonw/datasette/issues/202,381220441,MDEyOklzc3VlQ29tbWVudDM4MTIyMDQ0MQ==,9599,simonw,2018-04-13T18:19:15Z,2018-04-13T18:19:15Z,OWNER,I'm afraid I've just made this obsolete with 9f28bbe43dc277a3963a12aaae37b5ee3c277207,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313785206,Raise 404 on nonexistent table URLs, https://github.com/simonw/datasette/pull/202#issuecomment-381237440,https://api.github.com/repos/simonw/datasette/issues/202,381237440,MDEyOklzc3VlQ29tbWVudDM4MTIzNzQ0MA==,45057,russss,2018-04-13T19:22:53Z,2018-04-13T19:22:53Z,CONTRIBUTOR,I spotted you'd mentioned that in #184 but only after I'd written the patch!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313785206,Raise 404 on nonexistent table URLs, https://github.com/simonw/datasette/issues/203#issuecomment-380951815,https://api.github.com/repos/simonw/datasette/issues/203,380951815,MDEyOklzc3VlQ29tbWVudDM4MDk1MTgxNQ==,9599,simonw,2018-04-12T21:36:10Z,2018-04-12T21:36:10Z,OWNER,I like this. I'd like to be able to attach a full description to a column as well. We could support these in `metadata.json`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-380951920,https://api.github.com/repos/simonw/datasette/issues/203,380951920,MDEyOklzc3VlQ29tbWVudDM4MDk1MTkyMA==,9599,simonw,2018-04-12T21:36:38Z,2018-04-12T21:36:38Z,OWNER,This also feeds into the visualization features I want to add - we could use this kind of metadata to automatically apply meaningful labels to graphs.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-380966565,https://api.github.com/repos/simonw/datasette/issues/203,380966565,MDEyOklzc3VlQ29tbWVudDM4MDk2NjU2NQ==,45057,russss,2018-04-12T22:43:08Z,2018-04-12T22:43:08Z,CONTRIBUTOR,"Looks like [pint](https://pint.readthedocs.io/en/latest/tutorial.html) is pretty good at this. ```python In [1]: import pint In [2]: ureg = pint.UnitRegistry() In [3]: q = 3e6 * ureg('Hz') In [4]: '{:~P}'.format(q.to_compact()) Out[4]: '3.0 MHz' In [5]: q = 0.3 * ureg('m') In [5]: '{:~P}'.format(q.to_compact()) Out[5]: '300.0 mm' In [6]: q = 5 * ureg('') In [7]: '{:~P}'.format(q.to_compact()) Out[7]: '5' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-381300336,https://api.github.com/repos/simonw/datasette/issues/203,381300336,MDEyOklzc3VlQ29tbWVudDM4MTMwMDMzNg==,9599,simonw,2018-04-14T03:35:02Z,2018-04-14T03:35:02Z,OWNER,"This is really cool - I'm very impressed by pint. I'd like to figure out a sensible opt-in way to expose this in the JSON output as well. Maybe with a `&_units=true` parameter? We should definitely expose the units section from the table metadata in the output of https://wtr-api.herokuapp.com/wtr-663ea99/license_frequency.json","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-381300386,https://api.github.com/repos/simonw/datasette/issues/203,381300386,MDEyOklzc3VlQ29tbWVudDM4MTMwMDM4Ng==,9599,simonw,2018-04-14T03:35:56Z,2018-04-14T03:35:56Z,OWNER,"In #204 you said ""I'd like to add support for using units when querying but this is PR is pretty usable as-is."" - I'm fascinated to hear more about how this could work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-381315675,https://api.github.com/repos/simonw/datasette/issues/203,381315675,MDEyOklzc3VlQ29tbWVudDM4MTMxNTY3NQ==,45057,russss,2018-04-14T09:14:45Z,2018-04-14T09:27:30Z,CONTRIBUTOR,"> I'd like to figure out a sensible opt-in way to expose this in the JSON output as well. Maybe with a &_units=true parameter? From a machine-readable perspective I'm not sure why it would be useful to decorate the values with units. Edit: Should have had some coffee first. It's clearly useful for stuff like map rendering! I agree that the unit metadata should definitely be exposed in the JSON. > In #204 you said ""I'd like to add support for using units when querying but this is PR is pretty usable as-is."" - I'm fascinated to hear more about how this could work. I'm thinking about a couple of approaches here. I think the simplest one is: if the column has a unit attached, optionally accept units in query fields: ```python column_units = ureg(""Hz"") # Create a unit object for the column's unit query_variable = ureg(""4 GHz"") # Supplied query variable # Now we can convert the query units into column units before querying supplied_value.to(column_units).magnitude > 4000000000.0 # If the user doesn't supply units, pint just returns the plain # number and we can query as usual assuming it's the base unit query_variable = ureg(""50"") query_variable > 50 isinstance(query_variable, numbers.Number) > True ``` This also lets us do some nice unit conversion on querying: ```python column_units = ureg(""m"") query_variable = ureg(""50 ft"") supplied_value.to(column_units) > ``` The alternative would be to provide a dropdown of units next to the query field (so a ""Hz"" field would give you ""kHz"", ""MHz"", ""GHz""). Although this would be clearer to the user, it isn't so easy - we'd need to know more about the context of the field to give you sensible SI prefixes (I'm not so interested in nanoHertz, for example). You also lose the bonus of being able to convert - although pint will happily show you all the compatible units, it again suffers from a lack of context: ```python ureg(""m"").compatible_units() > frozenset({, , , , , , , , , , , }) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-381330075,https://api.github.com/repos/simonw/datasette/issues/203,381330075,MDEyOklzc3VlQ29tbWVudDM4MTMzMDA3NQ==,9599,simonw,2018-04-14T13:41:53Z,2018-04-14T13:41:53Z,OWNER,"Presumably units only work for numeric fields? If that's the case then automatically processing them if the incoming query string argument has a unit suffix makes total sense to me. Here's a pretty crazy idea: what if we exposed unit conversion to SQL as a custom SQLite function? That way it would be possible to optionally use units in actual custom SQL queries. I'd have to think quite carefully about performance implications here - wouldn't want a poorly considered unit calculation over a 500,000 row table to lock up the server. But I think the 1s query time limit might still prevent that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-381348849,https://api.github.com/repos/simonw/datasette/issues/203,381348849,MDEyOklzc3VlQ29tbWVudDM4MTM0ODg0OQ==,9599,simonw,2018-04-14T18:12:52Z,2018-04-14T18:12:52Z,OWNER,I think I'm going to hold on to the custom sql function idea for the moment and implement it as an example plugin.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-381446554,https://api.github.com/repos/simonw/datasette/issues/203,381446554,MDEyOklzc3VlQ29tbWVudDM4MTQ0NjU1NA==,9599,simonw,2018-04-15T23:25:54Z,2018-04-15T23:26:03Z,OWNER,I built a prototype of the `convert_units()` custom SQL function as a plugin over in https://github.com/simonw/datasette/issues/14#issuecomment-381446511,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-381763651,https://api.github.com/repos/simonw/datasette/issues/203,381763651,MDEyOklzc3VlQ29tbWVudDM4MTc2MzY1MQ==,45057,russss,2018-04-16T21:59:17Z,2018-04-16T21:59:17Z,CONTRIBUTOR,"Ah, I had no idea you could bind python functions into sqlite! I think the primary purpose of this issue has been served now - I'm going to close this and create a new issue for the only bit of this that hasn't been touched yet, which is (optionally) exposing units in the JSON API.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/pull/205#issuecomment-381330220,https://api.github.com/repos/simonw/datasette/issues/205,381330220,MDEyOklzc3VlQ29tbWVudDM4MTMzMDIyMA==,9599,simonw,2018-04-14T13:44:15Z,2018-04-14T13:44:15Z,OWNER,This looks great so far - love the new documentation. Let's throw in a unit test or two for the basic unit filters (mainly as a protection against accidental regressions in the future).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314319372,Support filtering with units and more, https://github.com/simonw/datasette/pull/205#issuecomment-381332222,https://api.github.com/repos/simonw/datasette/issues/205,381332222,MDEyOklzc3VlQ29tbWVudDM4MTMzMjIyMg==,45057,russss,2018-04-14T14:16:35Z,2018-04-14T14:16:35Z,CONTRIBUTOR,I've added some tests and that docs link.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314319372,Support filtering with units and more, https://github.com/simonw/datasette/pull/205#issuecomment-381336696,https://api.github.com/repos/simonw/datasette/issues/205,381336696,MDEyOklzc3VlQ29tbWVudDM4MTMzNjY5Ng==,9599,simonw,2018-04-14T15:24:04Z,2018-04-14T15:24:04Z,OWNER,I merged this to master in c857608738d6b6c3e4f3248304a22f8b2648dd3e - thanks @russss!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314319372,Support filtering with units and more, https://github.com/simonw/datasette/pull/207#issuecomment-381334973,https://api.github.com/repos/simonw/datasette/issues/207,381334973,MDEyOklzc3VlQ29tbWVudDM4MTMzNDk3Mw==,9599,simonw,2018-04-14T14:59:52Z,2018-04-14T14:59:52Z,OWNER,I'm going to merge this and then add a unit test.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314329002,Link foreign keys which don't have labels, https://github.com/simonw/datasette/pull/209#issuecomment-381441392,https://api.github.com/repos/simonw/datasette/issues/209,381441392,MDEyOklzc3VlQ29tbWVudDM4MTQ0MTM5Mg==,45057,russss,2018-04-15T21:59:15Z,2018-04-15T21:59:15Z,CONTRIBUTOR,"I suspected this would cause some test failures, but I'll wait for opinions before attempting to fix them.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/pull/209#issuecomment-381483301,https://api.github.com/repos/simonw/datasette/issues/209,381483301,MDEyOklzc3VlQ29tbWVudDM4MTQ4MzMwMQ==,9599,simonw,2018-04-16T05:25:08Z,2018-04-16T05:25:08Z,OWNER,I think this is a good improvement. If you fix the tests I'll merge it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/pull/209#issuecomment-381738137,https://api.github.com/repos/simonw/datasette/issues/209,381738137,MDEyOklzc3VlQ29tbWVudDM4MTczODEzNw==,45057,russss,2018-04-16T20:27:43Z,2018-04-16T20:27:43Z,CONTRIBUTOR,"Tests now fixed, honest. The failing test on Travis looks like an intermittent sqlite failure which should resolve itself on a retry...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/pull/209#issuecomment-381905593,https://api.github.com/repos/simonw/datasette/issues/209,381905593,MDEyOklzc3VlQ29tbWVudDM4MTkwNTU5Mw==,45057,russss,2018-04-17T08:50:28Z,2018-04-17T08:50:28Z,CONTRIBUTOR,"I've added another commit which puts classes a class on each `` by default with its column name, and I've also made the PK column bold. Unfortunately the tests are still failing on 3.6, which is weird. I can't reproduce locally...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/pull/209#issuecomment-382205189,https://api.github.com/repos/simonw/datasette/issues/209,382205189,MDEyOklzc3VlQ29tbWVudDM4MjIwNTE4OQ==,9599,simonw,2018-04-18T00:42:44Z,2018-04-18T00:43:02Z,OWNER,"I managed to get a better error message out of that test. The server is returning this (but only on Python 3.6, not on Python 3.5 - and only in Travis, not in my local environment): ```{'error': 'interrupted', 'ok': False, 'status': 400, 'title': 'Invalid SQL'}``` https://travis-ci.org/simonw/datasette/jobs/367929134","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/pull/209#issuecomment-382210976,https://api.github.com/repos/simonw/datasette/issues/209,382210976,MDEyOklzc3VlQ29tbWVudDM4MjIxMDk3Ng==,9599,simonw,2018-04-18T01:12:26Z,2018-04-18T01:12:26Z,OWNER,"OK, aaf59db570ab7688af72c08bb5bc1edc145e3e07 should mean that the tests pass when I merge that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/issues/211#issuecomment-381456434,https://api.github.com/repos/simonw/datasette/issues/211,381456434,MDEyOklzc3VlQ29tbWVudDM4MTQ1NjQzNA==,9599,simonw,2018-04-16T01:36:16Z,2018-04-16T01:37:44Z,OWNER,"The easiest way to implement this in Python 2 would be `execfile(...)` - but that was removed in Python 3. According to https://stackoverflow.com/a/437857/6083 `2to3` replaces that with this, which ensures the filename is associated with the code for debugging purposes: ``` with open(""somefile.py"") as f: code = compile(f.read(), ""somefile.py"", 'exec') exec(code, global_vars, local_vars) ``` Implementing it this way would force this kind of plugin to be self-contained in a single file. I think that's OK: if you want a more complex plugin you can use the standard pluggy-powered setuptools mechanism to build it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/issues/211#issuecomment-381462005,https://api.github.com/repos/simonw/datasette/issues/211,381462005,MDEyOklzc3VlQ29tbWVudDM4MTQ2MjAwNQ==,9599,simonw,2018-04-16T02:23:07Z,2018-04-16T02:23:07Z,OWNER,This needs unit tests. I also need to manually test the `datasette package` and `datesette publish` commands.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/issues/211#issuecomment-381478217,https://api.github.com/repos/simonw/datasette/issues/211,381478217,MDEyOklzc3VlQ29tbWVudDM4MTQ3ODIxNw==,9599,simonw,2018-04-16T04:41:38Z,2018-04-16T04:41:38Z,OWNER,"Here's the result of running: datasette publish now fivethirtyeight.db \ --plugins-dir=plugins/ --title=""FiveThirtyEight"" --branch=plugins-dir https://datasette-phjtvzwwzl.now.sh/fivethirtyeight-2628db9?sql=select+convert_units%28100%2C+%27m%27%2C+%27ft%27%29 Where `plugins/pint_plugin.py` contains the following: ``` from datasette import hookimpl import pint ureg = pint.UnitRegistry() @hookimpl def prepare_connection(conn): def convert_units(amount, from_, to_): ""select convert_units(100, 'm', 'ft');"" return (amount * ureg(from_)).to(to_).to_tuple()[0] conn.create_function('convert_units', 3, convert_units) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/issues/211#issuecomment-381478253,https://api.github.com/repos/simonw/datasette/issues/211,381478253,MDEyOklzc3VlQ29tbWVudDM4MTQ3ODI1Mw==,9599,simonw,2018-04-16T04:42:02Z,2018-04-16T04:42:02Z,OWNER,"This worked as well: datasette package fivethirtyeight.db \ --plugins-dir=plugins/ --title=""FiveThirtyEight"" --branch=plugins-dir ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/issues/211#issuecomment-381481990,https://api.github.com/repos/simonw/datasette/issues/211,381481990,MDEyOklzc3VlQ29tbWVudDM4MTQ4MTk5MA==,9599,simonw,2018-04-16T05:14:57Z,2018-04-16T05:14:57Z,OWNER,Added unit tests in 33c6bcadb962457be6b0c7f369826b404e2bcef5,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/issues/211#issuecomment-381482407,https://api.github.com/repos/simonw/datasette/issues/211,381482407,MDEyOklzc3VlQ29tbWVudDM4MTQ4MjQwNw==,9599,simonw,2018-04-16T05:18:29Z,2018-04-16T05:18:29Z,OWNER,"Here's the result of running this: datasette publish heroku fivethirtyeight.db \ --plugins-dir=plugins/ --title=""FiveThirtyEight"" --branch=plugins-dir https://intense-river-24599.herokuapp.com/fivethirtyeight-2628db9?sql=select+convert_units%28100%2C+%27m%27%2C+%27ft%27%29","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/issues/214#issuecomment-381490361,https://api.github.com/repos/simonw/datasette/issues/214,381490361,MDEyOklzc3VlQ29tbWVudDM4MTQ5MDM2MQ==,9599,simonw,2018-04-16T06:13:02Z,2018-04-16T06:13:02Z,OWNER,"Packaging JS and CSS in a pip installable wheel is fiddly but possible. http://peak.telecommunity.com/DevCenter/PythonEggs#accessing-package-resources from pkg_resources import resource_string foo_config = resource_string(__name__, 'foo.conf')","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/issues/214#issuecomment-381491707,https://api.github.com/repos/simonw/datasette/issues/214,381491707,MDEyOklzc3VlQ29tbWVudDM4MTQ5MTcwNw==,9599,simonw,2018-04-16T06:21:23Z,2018-04-16T06:21:23Z,OWNER,This looks like a good example: https://github.com/funkey/nyroglancer/commit/d4438ab42171360b2b8e9020f672846dd70c8d80,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/issues/214#issuecomment-381612585,https://api.github.com/repos/simonw/datasette/issues/214,381612585,MDEyOklzc3VlQ29tbWVudDM4MTYxMjU4NQ==,9599,simonw,2018-04-16T14:10:16Z,2018-04-16T14:10:16Z,OWNER,`resource_stream` returns a file-like object which may be better for serving from Sanic.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/issues/214#issuecomment-382038613,https://api.github.com/repos/simonw/datasette/issues/214,382038613,MDEyOklzc3VlQ29tbWVudDM4MjAzODYxMw==,9599,simonw,2018-04-17T15:38:23Z,2018-04-17T15:38:23Z,OWNER,"I figured out the recipe for bundling static assets in a plugin: https://github.com/simonw/datasette-plugin-demos/commit/26c5548f4ab7c6cc6d398df17767950be50d0edf (and then `python3 setup.py bdist_wheel`) Having done that, I ran `pip install ../datasette-plugin-demos/dist/datasette_plugin_demos-0.2-py3-none-any.whl` from my Datasette virtual environment and then did the following: ``` >>> import pkg_resources >>> pkg_resources.resource_stream( ... 'datasette_plugin_demos', 'static/plugin.js' ... ).read() b""alert('hello');\n"" >>> pkg_resources.resource_filename( ... 'datasette_plugin_demos', 'static/plugin.js' ... ) '..../venv/lib/python3.6/site-packages/datasette_plugin_demos/static/plugin.js' >>> pkg_resources.resource_string( ... 'datasette_plugin_demos', 'static/plugin.js' ... ) b""alert('hello');\n"" ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/issues/214#issuecomment-382048582,https://api.github.com/repos/simonw/datasette/issues/214,382048582,MDEyOklzc3VlQ29tbWVudDM4MjA0ODU4Mg==,9599,simonw,2018-04-17T16:04:42Z,2018-04-18T02:24:46Z,OWNER,"One possible option: let plugins bundle their own `static/` directory and then register themselves with Datasette, then have `/-/static-plugins/name-of-plugin/...` serve files from that directory.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/issues/214#issuecomment-382069980,https://api.github.com/repos/simonw/datasette/issues/214,382069980,MDEyOklzc3VlQ29tbWVudDM4MjA2OTk4MA==,9599,simonw,2018-04-17T17:08:28Z,2018-04-17T17:08:28Z,OWNER,"Even if we automatically serve ALL `static/` content from installed plugins, we'll still need them to register which files need to be linked to from `extra_css_urls` and `extra_js_urls`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/issues/215#issuecomment-398826108,https://api.github.com/repos/simonw/datasette/issues/215,398826108,MDEyOklzc3VlQ29tbWVudDM5ODgyNjEwOA==,9599,simonw,2018-06-20T17:09:18Z,2020-06-06T21:46:51Z,OWNER,This depends on #272 - Datasette ported to ASGI.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506669,Allow plugins to define additional URL routes and views, https://github.com/simonw/datasette/issues/215#issuecomment-504881900,https://api.github.com/repos/simonw/datasette/issues/215,504881900,MDEyOklzc3VlQ29tbWVudDUwNDg4MTkwMA==,9599,simonw,2019-06-24T06:51:29Z,2020-06-06T21:47:11Z,OWNER,See also #520 - asgi_wrapper plugin hook.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506669,Allow plugins to define additional URL routes and views, https://github.com/simonw/datasette/issues/215#issuecomment-507929913,https://api.github.com/repos/simonw/datasette/issues/215,507929913,MDEyOklzc3VlQ29tbWVudDUwNzkyOTkxMw==,9599,simonw,2019-07-03T04:08:28Z,2019-07-03T04:08:28Z,OWNER,"I just closed #520 which means this is now technically possible. But... doing it using the new `asgi_wrapper` hook https://datasette.readthedocs.io/en/latest/plugins.html#asgi-wrapper-datasette isn't particularly obvious. I'm going to leave this ticket open for the moment. I think I need at least one example plugin to show that this approach is good enough - and it's still quite possible that I'll add an extra, easier hook for this. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506669,Allow plugins to define additional URL routes and views, https://github.com/simonw/datasette/issues/215#issuecomment-540548765,https://api.github.com/repos/simonw/datasette/issues/215,540548765,MDEyOklzc3VlQ29tbWVudDU0MDU0ODc2NQ==,2181410,clausjuhl,2019-10-10T12:27:56Z,2019-10-10T12:27:56Z,NONE,"Hi Simon. Any news on the ability to add routes (with static content) to datasette? As a public institution I'm required to have at least privacy, cookie and availability policies in place, and it really would be nice to have these under the same url. Thank you for some great work!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506669,Allow plugins to define additional URL routes and views, https://github.com/simonw/datasette/issues/215#issuecomment-640118802,https://api.github.com/repos/simonw/datasette/issues/215,640118802,MDEyOklzc3VlQ29tbWVudDY0MDExODgwMg==,9599,simonw,2020-06-06T21:12:41Z,2020-06-06T21:12:41Z,OWNER,@clausjuhl your use-case there is now covered by custom pages from Datasette 0.41 https://datasette.readthedocs.io/en/stable/changelog.html#v0-41,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506669,Allow plugins to define additional URL routes and views, https://github.com/simonw/datasette/issues/215#issuecomment-640119259,https://api.github.com/repos/simonw/datasette/issues/215,640119259,MDEyOklzc3VlQ29tbWVudDY0MDExOTI1OQ==,9599,simonw,2020-06-06T21:16:46Z,2020-06-06T21:16:46Z,OWNER,"I deprioritised this a while ago because the asgi_wrapper hook allowed me to set up new URL routes: https://datasette.readthedocs.io/en/0.43/plugins.html#asgi-wrapper-datasette But... those were pretty low level, for example this code here: https://github.com/simonw/datasette-auth-github/blob/6c971064f6f4e6857bade5c6b88842f9cdeca9d9/datasette_auth_github/github_auth.py#L104-L113 Now that Datasette has a documented request object #706 and that object is used by things like the flash messages system (#790) - https://datasette.readthedocs.io/en/latest/internals.html#add-message-request-message-message-type-datasette-info - I find myself wanting to add views which get a request, as opposed to an ASGI scope. So I'm re-prioritising this, with the main need being a way for plugins to hook up their own view functions that can accept a request and return a response. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506669,Allow plugins to define additional URL routes and views, https://github.com/simonw/datasette/issues/215#issuecomment-640121036,https://api.github.com/repos/simonw/datasette/issues/215,640121036,MDEyOklzc3VlQ29tbWVudDY0MDEyMTAzNg==,9599,simonw,2020-06-06T21:34:03Z,2020-06-06T21:34:03Z,OWNER,"I'll refactor existing code to register views using the same mechanism that plugins will have access to. Maybe plugins get to register their routes first? That would allow plugins to do things like entirely take over the / page.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506669,Allow plugins to define additional URL routes and views, https://github.com/simonw/datasette/issues/215#issuecomment-640121917,https://api.github.com/repos/simonw/datasette/issues/215,640121917,MDEyOklzc3VlQ29tbWVudDY0MDEyMTkxNw==,9599,simonw,2020-06-06T21:42:58Z,2020-06-07T05:58:36Z,OWNER,"I might use some dependency injection here, with `call_with_supported_arguments()` from https://github.com/simonw/datasette/commit/41a0cd7b6afe0397efbbf27ad822679fc574811a#diff-942305c83055fdc0ff5f4e7d6ab06b29 Maybe a view function can take `request` and optionally also take `datasette`? Or `scope` or `receive` or `send`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506669,Allow plugins to define additional URL routes and views, https://github.com/simonw/datasette/issues/215#issuecomment-640122120,https://api.github.com/repos/simonw/datasette/issues/215,640122120,MDEyOklzc3VlQ29tbWVudDY0MDEyMjEyMA==,9599,simonw,2020-06-06T21:45:13Z,2020-06-06T21:45:52Z,OWNER,"Stretch goal: make it easy for plugin views to implement formats, so they can produce HTML by default and .json or .csv etc as alternative outputs.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506669,Allow plugins to define additional URL routes and views, https://github.com/simonw/datasette/issues/215#issuecomment-640960553,https://api.github.com/repos/simonw/datasette/issues/215,640960553,MDEyOklzc3VlQ29tbWVudDY0MDk2MDU1Mw==,9599,simonw,2020-06-09T00:41:09Z,2020-06-09T00:41:09Z,OWNER,"I'm going to imitate `register_output_renderer` and `register_facet_classes` - both return a list of things to register. So I'll do this: ```python @hookspec def register_routes(): ""Register URL routes. Return a list of (regex, view_function) pairs"" ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506669,Allow plugins to define additional URL routes and views, https://github.com/simonw/datasette/issues/215#issuecomment-640960667,https://api.github.com/repos/simonw/datasette/issues/215,640960667,MDEyOklzc3VlQ29tbWVudDY0MDk2MDY2Nw==,9599,simonw,2020-06-09T00:41:35Z,2020-06-09T00:41:35Z,OWNER,I'm going to implement this one documentation-first in a pull request.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506669,Allow plugins to define additional URL routes and views, https://github.com/simonw/datasette/issues/215#issuecomment-640971470,https://api.github.com/repos/simonw/datasette/issues/215,640971470,MDEyOklzc3VlQ29tbWVudDY0MDk3MTQ3MA==,9599,simonw,2020-06-09T01:19:44Z,2020-06-09T01:19:44Z,OWNER,I'll need to add documentation of the `Response` object (and `Response.html()` and `Response.text()` class methods - I should add `Response.json()` too) to the internals page https://datasette.readthedocs.io/en/stable/internals.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506669,Allow plugins to define additional URL routes and views, https://github.com/simonw/datasette/issues/215#issuecomment-640972952,https://api.github.com/repos/simonw/datasette/issues/215,640972952,MDEyOklzc3VlQ29tbWVudDY0MDk3Mjk1Mg==,9599,simonw,2020-06-09T01:24:52Z,2020-06-09T01:25:33Z,OWNER,WIP documentation: https://github.com/simonw/datasette/blob/770dedb21adfc706592e6b5cdf5e751a8720fdf9/docs/plugins.rst#register_routes,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506669,Allow plugins to define additional URL routes and views, https://github.com/simonw/datasette/issues/215#issuecomment-641002504,https://api.github.com/repos/simonw/datasette/issues/215,641002504,MDEyOklzc3VlQ29tbWVudDY0MTAwMjUwNA==,9599,simonw,2020-06-09T03:14:32Z,2020-06-09T03:14:32Z,OWNER,Documentation: https://datasette.readthedocs.io/en/latest/plugins.html#register-routes,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506669,Allow plugins to define additional URL routes and views, https://github.com/simonw/datasette/issues/216#issuecomment-381643173,https://api.github.com/repos/simonw/datasette/issues/216,381643173,MDEyOklzc3VlQ29tbWVudDM4MTY0MzE3Mw==,9599,simonw,2018-04-16T15:21:17Z,2018-04-16T15:21:17Z,OWNER,"Yikes, definitely a bug.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381644355,https://api.github.com/repos/simonw/datasette/issues/216,381644355,MDEyOklzc3VlQ29tbWVudDM4MTY0NDM1NQ==,9599,simonw,2018-04-16T15:24:38Z,2018-04-16T15:24:38Z,OWNER,"So there are two tricky problems to solve here: * I need a way of encoding `null` into that `_next=` that is unambiguous from the string `None` or `null`. This means introducing some kind of escaping mechanism in those strings. I already use URL encoding as part of the construction of those components here, maybe that can help here? * I need to figure out what the SQL should be for the ""next"" set of results if the previous value was null. Thankfully we use the primary key as a tie-breaker so this shouldn't be impossible.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381645274,https://api.github.com/repos/simonw/datasette/issues/216,381645274,MDEyOklzc3VlQ29tbWVudDM4MTY0NTI3NA==,9599,simonw,2018-04-16T15:27:16Z,2018-04-16T15:27:16Z,OWNER,"Relevant code: https://github.com/simonw/datasette/blob/904f1c75a3c17671d25c53b91e177c249d14ab3b/datasette/app.py#L828-L832","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381645973,https://api.github.com/repos/simonw/datasette/issues/216,381645973,MDEyOklzc3VlQ29tbWVudDM4MTY0NTk3Mw==,9599,simonw,2018-04-16T15:29:11Z,2018-04-16T15:29:11Z,OWNER,"I could use `$null` as a magic value that means None. Since I'm applying `quote_plus()` to actual values, any legit strings that look like this will be encoded as `%24null`: ``` >>> urllib.parse.quote_plus('$null') '%24null' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381648053,https://api.github.com/repos/simonw/datasette/issues/216,381648053,MDEyOklzc3VlQ29tbWVudDM4MTY0ODA1Mw==,9599,simonw,2018-04-16T15:35:17Z,2018-04-16T15:35:17Z,OWNER,"I think the correct SQL is this: https://datasette-issue-189-demo-3.now.sh/salaries-7859114-7859114?sql=select+rowid%2C+*+from+%5B2017+Maryland+state+salaries%5D%0D%0Awhere+%28middle_initial+is+not+null+or+%28middle_initial+is+null+and+rowid+%3E+%3Ap0%29%29%0D%0Aorder+by+middle_initial+limit+101&p0=391 ``` select rowid, * from [2017 Maryland state salaries] where (middle_initial is not null or (middle_initial is null and rowid > :p0)) order by middle_initial limit 101 ``` Though this will also need to be taken into account for #198 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381649140,https://api.github.com/repos/simonw/datasette/issues/216,381649140,MDEyOklzc3VlQ29tbWVudDM4MTY0OTE0MA==,9599,simonw,2018-04-16T15:38:29Z,2018-04-16T15:38:29Z,OWNER,But what would that SQL look like for `_sort_desc`?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381649437,https://api.github.com/repos/simonw/datasette/issues/216,381649437,MDEyOklzc3VlQ29tbWVudDM4MTY0OTQzNw==,9599,simonw,2018-04-16T15:39:21Z,2018-04-16T15:39:21Z,OWNER,"Here's where that SQL gets constructed at the moment: https://github.com/simonw/datasette/blob/10a34f995c70daa37a8a2aa02c3135a4b023a24c/datasette/app.py#L761-L771","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381786522,https://api.github.com/repos/simonw/datasette/issues/216,381786522,MDEyOklzc3VlQ29tbWVudDM4MTc4NjUyMg==,9599,simonw,2018-04-16T23:58:45Z,2018-04-16T23:59:13Z,OWNER,"Weird... tests are failing in Travis, despite passing on my local machine. https://travis-ci.org/simonw/datasette/builds/367423706","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381788051,https://api.github.com/repos/simonw/datasette/issues/216,381788051,MDEyOklzc3VlQ29tbWVudDM4MTc4ODA1MQ==,9599,simonw,2018-04-17T00:07:48Z,2018-04-17T00:07:48Z,OWNER,Still failing. This is very odd.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381794744,https://api.github.com/repos/simonw/datasette/issues/216,381794744,MDEyOklzc3VlQ29tbWVudDM4MTc5NDc0NA==,9599,simonw,2018-04-17T00:51:41Z,2018-04-17T00:51:41Z,OWNER,I'm reverting this out of master until I can figure out why the tests are failing.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381798786,https://api.github.com/repos/simonw/datasette/issues/216,381798786,MDEyOklzc3VlQ29tbWVudDM4MTc5ODc4Ng==,9599,simonw,2018-04-17T01:18:25Z,2018-04-17T01:18:25Z,OWNER,"Here's the test that's failing: https://github.com/simonw/datasette/blob/59a3aa859c0e782aeda9a515b1b52c358e8458a2/tests/test_api.py#L437-L470 I got Travis to spit out the `fetched` and `expected` variables. `expected` has 201 items in it and is identical to what I get on my local laptop. `fetched` has 250 items in it, so it's clearly different from my local environment. I've managed to replicate the bug in production! I created a test database like this: python tests/fixtures.py sortable.db Then deployed that database like so: datasette publish now sortable.db \ --extra-options=""--page_size=50"" --branch=debug-travis-issue-216 And... if you click ""next"" on this page https://datasette-issue-216-pagination.now.sh/sortable-5679797/sortable?_sort_desc=sortable_with_nulls five times you get back 250 results, when you should only get back 201.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381799267,https://api.github.com/repos/simonw/datasette/issues/216,381799267,MDEyOklzc3VlQ29tbWVudDM4MTc5OTI2Nw==,9599,simonw,2018-04-17T01:21:35Z,2018-04-17T01:21:35Z,OWNER,"The version that I deployed which exhibits the bug is running SQLite `3.8.7.1` - https://datasette-issue-216-pagination.now.sh/sortable-5679797?sql=select+sqlite_version%28%29 The version that I have running locally which does NOT exhibit the bug is running SQLite `3.23.0`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381799408,https://api.github.com/repos/simonw/datasette/issues/216,381799408,MDEyOklzc3VlQ29tbWVudDM4MTc5OTQwOA==,9599,simonw,2018-04-17T01:22:30Z,2018-04-17T01:22:30Z,OWNER,"... which is VERY surprising, because `3.23.0` only came out on 2nd April this year: https://www.sqlite.org/changes.html - I have no idea how I came to be running that version on my laptop.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381801302,https://api.github.com/repos/simonw/datasette/issues/216,381801302,MDEyOklzc3VlQ29tbWVudDM4MTgwMTMwMg==,9599,simonw,2018-04-17T01:33:43Z,2018-04-17T01:33:43Z,OWNER,"This is the SQL that returns differing results in production and on my laptop: https://datasette-issue-216-pagination.now.sh/sortable-5679797?sql=select+%2A+from+sortable+where+%28sortable_with_nulls+is+null+and+%28%28pk1+%3E+%3Ap0%29%0A++or%0A%28pk1+%3D+%3Ap0+and+pk2+%3E+%3Ap1%29%29%29+order+by+sortable_with_nulls+desc+limit+51&p0=b&p1=t ``` select * from sortable where (sortable_with_nulls is null and ((pk1 > :p0) or (pk1 = :p0 and pk2 > :p1))) order by sortable_with_nulls desc limit 51 ``` I think that `order by sortable_with_nulls desc` bit is at fault - the primary keys should be included in that order by as well. Sure enough, changing the query to this one returns the same results across both environments: ``` select * from sortable where (sortable_with_nulls is null and ((pk1 > :p0) or (pk1 = :p0 and pk2 > :p1))) order by sortable_with_nulls desc, pk1, pk2 limit 51 ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381803157,https://api.github.com/repos/simonw/datasette/issues/216,381803157,MDEyOklzc3VlQ29tbWVudDM4MTgwMzE1Nw==,9599,simonw,2018-04-17T01:45:24Z,2018-04-17T01:45:24Z,OWNER,Fixed!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/217#issuecomment-407980050,https://api.github.com/repos/simonw/datasette/issues/217,407980050,MDEyOklzc3VlQ29tbWVudDQwNzk4MDA1MA==,9599,simonw,2018-07-26T05:24:17Z,2018-07-26T05:24:17Z,OWNER,Documentation: http://datasette.readthedocs.io/en/latest/plugins.html#publish-subcommand-publish,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314725342,Plugin support for datasette publish, https://github.com/simonw/datasette/issues/220#issuecomment-381777108,https://api.github.com/repos/simonw/datasette/issues/220,381777108,MDEyOklzc3VlQ29tbWVudDM4MTc3NzEwOA==,9599,simonw,2018-04-16T23:04:04Z,2018-04-16T23:04:04Z,OWNER,This could also help workaround the current predicament that a single plugin can only define one prepare_connection hook.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314847571,Investigate syntactic sugar for plugins, https://github.com/simonw/datasette/issues/220#issuecomment-642944645,https://api.github.com/repos/simonw/datasette/issues/220,642944645,MDEyOklzc3VlQ29tbWVudDY0Mjk0NDY0NQ==,9599,simonw,2020-06-11T21:49:55Z,2020-06-11T21:49:55Z,OWNER,"I'm OK with not implementing this - I've got used to the existing mechanism, and it doesn't frustrate me enough to work on this more.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314847571,Investigate syntactic sugar for plugins, https://github.com/simonw/datasette/issues/221#issuecomment-754190814,https://api.github.com/repos/simonw/datasette/issues/221,754190814,MDEyOklzc3VlQ29tbWVudDc1NDE5MDgxNA==,9599,simonw,2021-01-04T20:10:34Z,2021-01-04T20:10:34Z,OWNER,"For the `csvs-to-sqlite` case I'm going with `datasette insert` instead, see #1160.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315142414,Allow plugins to add new cli sub commands , https://github.com/simonw/datasette/issues/221#issuecomment-754190952,https://api.github.com/repos/simonw/datasette/issues/221,754190952,MDEyOklzc3VlQ29tbWVudDc1NDE5MDk1Mg==,9599,simonw,2021-01-04T20:10:51Z,2021-01-04T20:10:51Z,OWNER,Is this still a good idea? I don't have any pressing need for it at the moment.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315142414,Allow plugins to add new cli sub commands , https://github.com/simonw/datasette/issues/221#issuecomment-754191699,https://api.github.com/repos/simonw/datasette/issues/221,754191699,MDEyOklzc3VlQ29tbWVudDc1NDE5MTY5OQ==,9599,simonw,2021-01-04T20:12:14Z,2021-01-04T20:12:14Z,OWNER,I'm going to close this. Plugins can register their own CLI tools (see https://github.com/simonw/click-app) if they need to.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315142414,Allow plugins to add new cli sub commands , https://github.com/simonw/datasette/issues/223#issuecomment-382408128,https://api.github.com/repos/simonw/datasette/issues/223,382408128,MDEyOklzc3VlQ29tbWVudDM4MjQwODEyOA==,9599,simonw,2018-04-18T14:33:09Z,2018-04-18T14:33:09Z,OWNER,"Demo: datasette publish now sortable.db --install datasette-plugin-demos --branch=master Produced this deployment, with both the `random_integer()` function and the static file from https://github.com/simonw/datasette-plugin-demos/tree/0.2 https://datasette-issue-223.now.sh/-/static-plugins/datasette_plugin_demos/plugin.js https://datasette-issue-223.now.sh/sortable-4bbaa6f?sql=select+random_integer%280%2C+10%29 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315327860,datasette publish --install=name-of-plugin, https://github.com/simonw/datasette/issues/223#issuecomment-382409989,https://api.github.com/repos/simonw/datasette/issues/223,382409989,MDEyOklzc3VlQ29tbWVudDM4MjQwOTk4OQ==,9599,simonw,2018-04-18T14:38:08Z,2018-04-18T14:38:08Z,OWNER,"Tested on Heroku as well. datasette publish heroku sortable.db --install datasette-plugin-demos --branch=master https://morning-tor-45944.herokuapp.com/-/static-plugins/datasette_plugin_demos/plugin.js https://morning-tor-45944.herokuapp.com/sortable-4bbaa6f?sql=select+random_integer%280%2C+10%29","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315327860,datasette publish --install=name-of-plugin, https://github.com/simonw/datasette/issues/223#issuecomment-382413121,https://api.github.com/repos/simonw/datasette/issues/223,382413121,MDEyOklzc3VlQ29tbWVudDM4MjQxMzEyMQ==,9599,simonw,2018-04-18T14:47:18Z,2018-04-18T14:47:18Z,OWNER,"And tested `datasette package` - this time exercising the ability to pass more than one `--install` option: ``` $ datasette package sortable.db --branch=master --install requests --install datasette-plugin-demos Sending build context to Docker daemon 125.4kB Step 1/7 : FROM python:3 ---> 79e1dc9af1c1 Step 2/7 : COPY . /app ---> 6e8e40bce378 Step 3/7 : WORKDIR /app Removing intermediate container 7cdc9ab20d09 ---> f42258c2211f Step 4/7 : RUN pip install https://github.com/simonw/datasette/archive/master.zip requests datasette-plugin-demos ---> Running in a0f17cec08a4 Collecting ... Removing intermediate container a0f17cec08a4 ---> beea84e73271 Step 5/7 : RUN datasette inspect sortable.db --inspect-file inspect-data.json ---> Running in 4daa28792348 Removing intermediate container 4daa28792348 ---> c60312d21b99 Step 6/7 : EXPOSE 8001 ---> Running in fa728468482d Removing intermediate container fa728468482d ---> 8f219a61fddc Step 7/7 : CMD [""datasette"", ""serve"", ""--host"", ""0.0.0.0"", ""sortable.db"", ""--cors"", ""--port"", ""8001"", ""--inspect-file"", ""inspect-data.json""] ---> Running in cd4eaeb2ce9e Removing intermediate container cd4eaeb2ce9e ---> 066e257c7c44 Successfully built 066e257c7c44 (venv) datasette $ docker run -p 8081:8001 066e257c7c44 Serve! files=('sortable.db',) on port 8001 [2018-04-18 14:40:18 +0000] [1] [INFO] Goin' Fast @ http://0.0.0.0:8001 [2018-04-18 14:40:18 +0000] [1] [INFO] Starting worker [1] [2018-04-18 14:46:01 +0000] - (sanic.access)[INFO][1:7]: GET http://localhost:8081/-/static-plugins/datasette_plugin_demos/plugin.js 200 16 ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315327860,datasette publish --install=name-of-plugin, https://github.com/simonw/datasette/issues/224#issuecomment-382616527,https://api.github.com/repos/simonw/datasette/issues/224,382616527,MDEyOklzc3VlQ29tbWVudDM4MjYxNjUyNw==,9599,simonw,2018-04-19T05:40:28Z,2018-04-19T05:40:28Z,OWNER,"No need to use `PackageLoader` after all, we can use the same mechanism we used for the static path: https://github.com/simonw/datasette/blob/b55809a1e20986bb2e638b698815a77902e8708d/datasette/utils.py#L694-L695","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315517578,Ability for plugins to bundle templates, https://github.com/simonw/datasette/issues/226#issuecomment-504720379,https://api.github.com/repos/simonw/datasette/issues/226,504720379,MDEyOklzc3VlQ29tbWVudDUwNDcyMDM3OQ==,9599,simonw,2019-06-23T05:05:32Z,2019-06-23T05:05:32Z,OWNER,The mechanism I described here - having a `tests/example_plugin` folder - is probably the right solution for #517 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315738696,Unit tests for installable plugins, https://github.com/simonw/datasette/issues/226#issuecomment-733198051,https://api.github.com/repos/simonw/datasette/issues/226,733198051,MDEyOklzc3VlQ29tbWVudDczMzE5ODA1MQ==,9599,simonw,2020-11-24T19:52:46Z,2020-11-24T19:52:46Z,OWNER,This is well handled now: https://github.com/simonw/datasette/tree/0.51.1/tests/plugins,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315738696,Unit tests for installable plugins, https://github.com/simonw/datasette/issues/227#issuecomment-382808266,https://api.github.com/repos/simonw/datasette/issues/227,382808266,MDEyOklzc3VlQ29tbWVudDM4MjgwODI2Ng==,9599,simonw,2018-04-19T16:59:23Z,2018-04-19T16:59:23Z,OWNER,"Maybe this should have a second argument indicating which codepath was being handled. That way plugins could say ""only inject this extra context variable on the row page"".","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/227#issuecomment-382958693,https://api.github.com/repos/simonw/datasette/issues/227,382958693,MDEyOklzc3VlQ29tbWVudDM4Mjk1ODY5Mw==,9599,simonw,2018-04-20T03:15:52Z,2018-04-20T03:15:52Z,OWNER,"A better way to do this would be with many different plugin hooks, one for each view.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/227#issuecomment-382959857,https://api.github.com/repos/simonw/datasette/issues/227,382959857,MDEyOklzc3VlQ29tbWVudDM4Mjk1OTg1Nw==,9599,simonw,2018-04-20T03:21:43Z,2018-04-20T03:21:43Z,OWNER,"Plus a generic prepare_context() hook called in the common render method. prepare_context_table(), prepare_context_row() etc Arguments are context, request, self (hence can access self.ds) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/227#issuecomment-382964794,https://api.github.com/repos/simonw/datasette/issues/227,382964794,MDEyOklzc3VlQ29tbWVudDM4Mjk2NDc5NA==,9599,simonw,2018-04-20T03:45:18Z,2018-04-20T03:45:18Z,OWNER,"What if the context needs to make await calls? One possible option: plugins can either manipulate the context in place OR they can return an awaitable. If they do that, the caller will await it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/227#issuecomment-382966604,https://api.github.com/repos/simonw/datasette/issues/227,382966604,MDEyOklzc3VlQ29tbWVudDM4Mjk2NjYwNA==,9599,simonw,2018-04-20T03:54:56Z,2018-04-20T03:54:56Z,OWNER,Should this differentiate between preparing the data to be sent back as JSON and preparing the context for the template?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/227#issuecomment-382967238,https://api.github.com/repos/simonw/datasette/issues/227,382967238,MDEyOklzc3VlQ29tbWVudDM4Mjk2NzIzOA==,9599,simonw,2018-04-20T03:58:09Z,2018-04-20T03:58:09Z,OWNER,Maybe prepare_table_data() vs prepare_table_context(),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/227#issuecomment-439194286,https://api.github.com/repos/simonw/datasette/issues/227,439194286,MDEyOklzc3VlQ29tbWVudDQzOTE5NDI4Ng==,222245,carlmjohnson,2018-11-15T21:20:37Z,2018-11-15T21:20:37Z,NONE,I'm diving back into https://salaries.news.baltimoresun.com and what I really want is the ability to inject the request into my context.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/227#issuecomment-603534725,https://api.github.com/repos/simonw/datasette/issues/227,603534725,MDEyOklzc3VlQ29tbWVudDYwMzUzNDcyNQ==,9599,simonw,2020-03-24T22:19:54Z,2020-03-24T22:19:54Z,OWNER,I think the [extra_template_vars()](https://datasette.readthedocs.io/en/stable/plugins.html#extra-template-vars-template-database-table-view-name-request-datasette) hook covers this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/228#issuecomment-382924910,https://api.github.com/repos/simonw/datasette/issues/228,382924910,MDEyOklzc3VlQ29tbWVudDM4MjkyNDkxMA==,9599,simonw,2018-04-20T00:35:48Z,2018-04-20T00:35:48Z,OWNER,"Hiding tables with the `idx_` prefix should be good enough here, since false positives aren't very harmful.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316031566,"If spatialite detected, mark idx_XXX_Geometry tables as hidden", https://github.com/simonw/datasette/issues/229#issuecomment-384512192,https://api.github.com/repos/simonw/datasette/issues/229,384512192,MDEyOklzc3VlQ29tbWVudDM4NDUxMjE5Mg==,9599,simonw,2018-04-26T04:49:46Z,2018-04-26T04:49:46Z,OWNER,Documentation: http://datasette.readthedocs.io/en/latest/json_api.html#special-table-arguments,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316123256,Table view should support ?_size=400 parameter, https://github.com/simonw/datasette/issues/230#issuecomment-383109984,https://api.github.com/repos/simonw/datasette/issues/230,383109984,MDEyOklzc3VlQ29tbWVudDM4MzEwOTk4NA==,9599,simonw,2018-04-20T14:15:39Z,2018-04-20T14:15:39Z,OWNER,Refs #229,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316128955,Setting page size AND max returned rows to 1000 doesn't seem to work, https://github.com/simonw/datasette/issues/231#issuecomment-383315348,https://api.github.com/repos/simonw/datasette/issues/231,383315348,MDEyOklzc3VlQ29tbWVudDM4MzMxNTM0OA==,9599,simonw,2018-04-21T17:37:50Z,2018-04-22T23:06:04Z,OWNER,"I could also have an `""autodetect"": false` option for that plugin to turn off autodetecting entirely. Would be useful if the plugin didn't append its JavaScript in pages that it wasn't used for - that might require making the `extra_js_urls()` hook optionally aware of the columns and table and metadata.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316323336,metadata.json support for plugin configuration options, https://github.com/simonw/datasette/issues/231#issuecomment-392305776,https://api.github.com/repos/simonw/datasette/issues/231,392305776,MDEyOklzc3VlQ29tbWVudDM5MjMwNTc3Ng==,9599,simonw,2018-05-27T05:10:46Z,2018-05-27T05:10:46Z,OWNER,These plugin config options should be exposed to JavaScript as `datasette.config.plugins`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316323336,metadata.json support for plugin configuration options, https://github.com/simonw/datasette/issues/231#issuecomment-412291395,https://api.github.com/repos/simonw/datasette/issues/231,412291395,MDEyOklzc3VlQ29tbWVudDQxMjI5MTM5NQ==,9599,simonw,2018-08-11T17:54:41Z,2018-08-11T17:54:41Z,OWNER,"I'm going to separate the issue of enabling and disabling plugins from the existence of the `plugins` key. The format will simply be: ``` { ""plugins"": { ""name-of-plugin"": { ... any structures you like go here, defined by the plugin ... } } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316323336,metadata.json support for plugin configuration options, https://github.com/simonw/datasette/issues/231#issuecomment-491943956,https://api.github.com/repos/simonw/datasette/issues/231,491943956,MDEyOklzc3VlQ29tbWVudDQ5MTk0Mzk1Ng==,9599,simonw,2019-05-13T18:56:21Z,2019-05-13T18:56:21Z,OWNER,I implemented this a while ago but forgot to close the issue: https://datasette.readthedocs.io/en/stable/plugins.html#plugin-configuration,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316323336,metadata.json support for plugin configuration options, https://github.com/simonw/datasette/pull/232#issuecomment-383252624,https://api.github.com/repos/simonw/datasette/issues/232,383252624,MDEyOklzc3VlQ29tbWVudDM4MzI1MjYyNA==,9599,simonw,2018-04-21T00:19:00Z,2018-04-21T00:19:00Z,OWNER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316365426,Fix a typo, https://github.com/simonw/datasette/issues/233#issuecomment-397637302,https://api.github.com/repos/simonw/datasette/issues/233,397637302,MDEyOklzc3VlQ29tbWVudDM5NzYzNzMwMg==,9599,simonw,2018-06-15T14:24:08Z,2018-06-15T14:55:19Z,OWNER,"I'm going with the terminology ""labels"" here. You'll be able to add ``?_labels=1`` and the JSON will look something like this: ``` { ""rowid"": 233, ""TreeID"": 121240, ""qLegalStatus"": { ""value"" 2, ""label"": ""Private"" } ""qSpecies"": { ""value"": 16, ""label"": ""Sycamore"" } ""qAddress"": ""91 Commonwealth Ave"", ... } ``` I need this to help build foreign key expansions for CSV files, see #266 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397648080,https://api.github.com/repos/simonw/datasette/issues/233,397648080,MDEyOklzc3VlQ29tbWVudDM5NzY0ODA4MA==,9599,simonw,2018-06-15T14:56:21Z,2018-06-15T14:56:21Z,OWNER,"I considered including a `""table""` key like this: ``` ""qLegalStatus"": { ""value"" 2, ""label"": ""Private"", ""table"": ""qLegalStatus"" } ``` This would help generate the HTML links using just the JSON data. But... I realized that in a list of 50 rows that value would be duplicated 50 times which is a bit nasty.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397663968,https://api.github.com/repos/simonw/datasette/issues/233,397663968,MDEyOklzc3VlQ29tbWVudDM5NzY2Mzk2OA==,9599,simonw,2018-06-15T15:51:17Z,2018-06-15T15:51:17Z,OWNER,"Nearly done, but I need the HTML view to ignore the `?_labels=1` param (it throws an error at the moment).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397668427,https://api.github.com/repos/simonw/datasette/issues/233,397668427,MDEyOklzc3VlQ29tbWVudDM5NzY2ODQyNw==,9599,simonw,2018-06-15T16:07:43Z,2018-06-15T16:07:43Z,OWNER,Demo: https://datasette-json-labels-demo.now.sh/fixtures-fda0fea/facetable.json?_labels=1&_shape=array,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397729319,https://api.github.com/repos/simonw/datasette/issues/233,397729319,MDEyOklzc3VlQ29tbWVudDM5NzcyOTMxOQ==,9599,simonw,2018-06-15T20:10:24Z,2018-06-15T20:10:24Z,OWNER,I'm also going to add the ability to specify individual columns that you want to expand using `?_label=city_id&_label=state_id`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397729500,https://api.github.com/repos/simonw/datasette/issues/233,397729500,MDEyOklzc3VlQ29tbWVudDM5NzcyOTUwMA==,9599,simonw,2018-06-15T20:11:14Z,2018-06-15T20:11:14Z,OWNER,The `.json` and `.csv` links displayed on the table page should default to using `?_labels=1` if Datasette detects that there are foreign key expansions available for the page.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397824991,https://api.github.com/repos/simonw/datasette/issues/233,397824991,MDEyOklzc3VlQ29tbWVudDM5NzgyNDk5MQ==,9599,simonw,2018-06-16T16:50:31Z,2018-06-16T16:50:42Z,OWNER,"I'm going to support `?_labels=` on HTML views, but I'll allow it to be used to turn them off (they are on by default) using `?_labels=off`. Related: 7e0caa1e62607c6579101cc0e62bec8899013715 where I added a new `value_as_boolean` helper extracted from how `--config` works in `cli.py`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397839482,https://api.github.com/repos/simonw/datasette/issues/233,397839482,MDEyOklzc3VlQ29tbWVudDM5NzgzOTQ4Mg==,9599,simonw,2018-06-16T21:21:03Z,2018-06-16T21:21:03Z,OWNER,Should facets always have their labels expanded or should they also obey the `_labels` and `_label` querystring arguments?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397839583,https://api.github.com/repos/simonw/datasette/issues/233,397839583,MDEyOklzc3VlQ29tbWVudDM5NzgzOTU4Mw==,9599,simonw,2018-06-16T21:23:14Z,2018-06-16T21:23:44Z,OWNER,"I'm a bit torn on naming - choices are: * `?_labels=on` and `?_label=col1&_label=col2` * `?_expands=on` (or `?_expand_all=on`) and `?_expand=col1&_expand=col2`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397840676,https://api.github.com/repos/simonw/datasette/issues/233,397840676,MDEyOklzc3VlQ29tbWVudDM5Nzg0MDY3Ng==,9599,simonw,2018-06-16T21:49:50Z,2018-06-16T21:49:50Z,OWNER,For the moment I'm going with `_labels=`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397842194,https://api.github.com/repos/simonw/datasette/issues/233,397842194,MDEyOklzc3VlQ29tbWVudDM5Nzg0MjE5NA==,9599,simonw,2018-06-16T22:26:21Z,2018-06-16T22:26:21Z,OWNER,"Some demos: * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List - regular HTML view * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List?_labels=off - no labels * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.json?_labels=on - JSON with all labels * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.json?_label=qSpecies&_shape=array - JSON with specific labels in array shape * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.csv?_labels=on - CSV with all labels * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.csv?_label=qSpecies - CSV with specific labels","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/234#issuecomment-383398182,https://api.github.com/repos/simonw/datasette/issues/234,383398182,MDEyOklzc3VlQ29tbWVudDM4MzM5ODE4Mg==,9599,simonw,2018-04-22T17:31:12Z,2018-04-22T17:31:12Z,OWNER,"```{ ""databases"": { ""database1"": { ""tables"": { ""example_table"": { ""label_column"": ""name"" } } } } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316526433,label_column option in metadata.json, https://github.com/simonw/datasette/issues/234#issuecomment-383399762,https://api.github.com/repos/simonw/datasette/issues/234,383399762,MDEyOklzc3VlQ29tbWVudDM4MzM5OTc2Mg==,9599,simonw,2018-04-22T17:54:39Z,2018-04-22T17:54:39Z,OWNER,Docs here: http://datasette.readthedocs.io/en/latest/metadata.html#specifying-the-label-column-for-a-table,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316526433,label_column option in metadata.json, https://github.com/simonw/datasette/issues/234#issuecomment-383410146,https://api.github.com/repos/simonw/datasette/issues/234,383410146,MDEyOklzc3VlQ29tbWVudDM4MzQxMDE0Ng==,9599,simonw,2018-04-22T20:32:30Z,2018-04-22T20:47:02Z,OWNER,"I built this wrong: my implementation is looking for the `label_column` on the table-being-displayed, but it should be looking for it on the table-the-foreign-key-links-to.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316526433,label_column option in metadata.json, https://github.com/simonw/datasette/issues/235#issuecomment-383727973,https://api.github.com/repos/simonw/datasette/issues/235,383727973,MDEyOklzc3VlQ29tbWVudDM4MzcyNzk3Mw==,9599,simonw,2018-04-23T21:23:59Z,2018-04-23T21:23:59Z,OWNER,"There might also be something clever we can do here with PRAGMA statements: https://stackoverflow.com/questions/14146881/limit-the-maximum-amount-of-memory-sqlite3-uses And https://www.sqlite.org/pragma.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316621102,Add limit on the size in KB of data returned from a single query, https://github.com/simonw/datasette/issues/235#issuecomment-383764533,https://api.github.com/repos/simonw/datasette/issues/235,383764533,MDEyOklzc3VlQ29tbWVudDM4Mzc2NDUzMw==,9599,simonw,2018-04-24T00:30:02Z,2018-04-24T00:30:02Z,OWNER,The `resource` module in he standard library has the ability to set limits on memory usage for the current process: https://pymotw.com/2/resource/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316621102,Add limit on the size in KB of data returned from a single query, https://github.com/simonw/datasette/issues/236#issuecomment-608716819,https://api.github.com/repos/simonw/datasette/issues/236,608716819,MDEyOklzc3VlQ29tbWVudDYwODcxNjgxOQ==,193185,cldellow,2020-04-03T22:19:00Z,2020-04-03T22:19:00Z,CONTRIBUTOR,"Hi Simon, I'm thinking of attempting this. Can you clarify some questions I have? 1) I assume the goal is to have a CORS-friendly HTTPS endpoint that hosts the datasette service + user's db. 2) If that's the goal, I think Lambda alone is insufficient. Lambda provides the compute fabric, but not the HTTP routing. You'd also need to add Application Load Balancer or API Gateway to provide an HTTP endpoint that routes to the lambda function. Do you have a preference between ALB or API GW? ALB has better economics at scale, but has a minimum monthly cost. API GW has worse per-request economics, but scales to zero when no requests are happening. 3) Does Datasette have any native components, or is it all pure python? If it has native bits, they'll likely need to be recompiled to work on Amazon Linux 2. 4) There are a few disparate services that need to be wired together to expose a Python service securely to the web. If I was doing this outside of the datasette publish system, I'd use an AWS CloudFormation template. Even within datasette, I think it still makes sense to use a CloudFormation template and just have the publish plugin invoke it (via the standard `aws` cli) with user-specified parameters. Does that sound reasonable to you? Thanks for your help!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317001500,datasette publish lambda plugin, https://github.com/simonw/datasette/issues/236#issuecomment-612216820,https://api.github.com/repos/simonw/datasette/issues/236,612216820,MDEyOklzc3VlQ29tbWVudDYxMjIxNjgyMA==,193185,cldellow,2020-04-10T21:03:38Z,2020-04-10T21:03:38Z,CONTRIBUTOR,"I made a repo at https://github.com/code402/datasette-lambda to demonstrate the idea, and scratch my personal itch for this. The demo relies on some central authority having already published a public, reusable Lambda layer with Datasette & its dependencies. I think that differs from the other publish plugins which seem to mainly publish Dockerfiles that the host will interpret to install deps from a requirements.txt file. I chose that approach because `uvloop` appears to be a dependency with native code that needs to be compiled for the target runtime environment. In this case, that's Amazon Linux 2. I'm not 100% clear on whether that's still required, because: - maybe `uvloop` is only needed for `uvicorn`, which the demo doesn't actually use since HTTP routing is handled by API Gateway - it seems like `uvloop` may be an optional, drop-in optimization for `asyncio` in any case (but I may be misreading this; I'm very much a Python noob) If it's the case that `uvloop` is truly optional, then I think the publish plugin could do the packaging on the user's machine, regardless of what flavour of operating system they're on. That'd be a bit slower for the user, but would provide the most long-term flexibility in terms of supporting plugins.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317001500,datasette publish lambda plugin, https://github.com/simonw/datasette/issues/236#issuecomment-645066486,https://api.github.com/repos/simonw/datasette/issues/236,645066486,MDEyOklzc3VlQ29tbWVudDY0NTA2NjQ4Ng==,9599,simonw,2020-06-16T23:45:45Z,2020-06-16T23:45:45Z,OWNER,"Hi Colin, Sorry I didn't see this sooner! I've just started digging into this myself, to try and play with the new EFS Lambda support: #850. Yes, uvloop is only needed because of uvicorn. I have a branch here that removes that dependency just for trying out Lambda: https://github.com/simonw/datasette/tree/no-uvicorn - so you can run `pip install https://github.com/simonw/datasette/archive/no-uvicorn.zip` to get that. I'm going to try out your `datasette-lambda` project next - really excited to see how far you've got with it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317001500,datasette publish lambda plugin, https://github.com/simonw/datasette/issues/236#issuecomment-645067611,https://api.github.com/repos/simonw/datasette/issues/236,645067611,MDEyOklzc3VlQ29tbWVudDY0NTA2NzYxMQ==,9599,simonw,2020-06-16T23:50:12Z,2020-06-16T23:50:59Z,OWNER,"As for your other questions: > 1. I assume the goal is to have a CORS-friendly HTTPS endpoint that hosts the datasette service + user's db. Yes, exactly. I know this will limit the size of database that can be deployed (since Lambda has a 50MB total package limit as far as I can tell) but there are plenty of interesting databases that are small enough to fit there. The new EFS support for Lambda means that theoretically the size of database is now unlimited, which is really interesting. That's what got me inspired to take a look at a proof of concept in #850. > 2. If that's the goal, I think Lambda alone is insufficient. Lambda provides the compute fabric, but not the HTTP routing. You'd also need to add Application Load Balancer or API Gateway to provide an HTTP endpoint that routes to the lambda function. > > Do you have a preference between ALB or API GW? ALB has better economics at scale, but has a minimum monthly cost. API GW has worse per-request economics, but scales to zero when no requests are happening. I personally like scale-to-zero because many of my projects are likely to receive very little traffic. So API GW first, and maybe ALB as an option later on for people operating at scale? > 3. Does Datasette have any native components, or is it all pure python? If it has native bits, they'll likely need to be recompiled to work on Amazon Linux 2. As you've found, the only native component is uvloop which is only needed if uvicorn is being used to serve requests. > 4. There are a few disparate services that need to be wired together to expose a Python service securely to the web. If I was doing this outside of the datasette publish system, I'd use an AWS CloudFormation template. Even within datasette, I think it still makes sense to use a CloudFormation template and just have the publish plugin invoke it (via the standard `aws` cli) with user-specified parameters. Does that sound reasonable to you? For the eventual ""datasette publish lambda"" command I want whatever results in the smallest amount of inconvenience for users. I've been trying out Amazon SAM in #850 and it requires users to run Docker on their machines, which is a pretty huge barrier to entry! I don't have much experience with CloudFormation but it's probably a better bet, especially if you can ""pip install"" the dependencies needed to deploy with it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317001500,datasette publish lambda plugin, https://github.com/simonw/datasette/issues/236#issuecomment-799002993,https://api.github.com/repos/simonw/datasette/issues/236,799002993,MDEyOklzc3VlQ29tbWVudDc5OTAwMjk5Mw==,21148,jacobian,2021-03-14T23:41:51Z,2021-03-14T23:41:51Z,CONTRIBUTOR,"Now that [Lambda supports Docker](https://aws.amazon.com/blogs/aws/new-for-aws-lambda-container-image-support/), this probably is a bit easier and may be able to build on top of the existing package command. There are weirdnesses in how the command actually gets invoked; the [aws-lambda-python image](https://hub.docker.com/r/amazon/aws-lambda-python) shows a bit of that. So Datasette would probably need some sort of Lambda-specific entry point to make this work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317001500,datasette publish lambda plugin, https://github.com/simonw/datasette/issues/236#issuecomment-799003172,https://api.github.com/repos/simonw/datasette/issues/236,799003172,MDEyOklzc3VlQ29tbWVudDc5OTAwMzE3Mg==,21148,jacobian,2021-03-14T23:42:57Z,2021-03-14T23:42:57Z,CONTRIBUTOR,"Oh, and the container image can be up to 10GB, so the EFS step might not be needed except for pretty big stuff.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317001500,datasette publish lambda plugin, https://github.com/simonw/datasette/issues/236#issuecomment-799066252,https://api.github.com/repos/simonw/datasette/issues/236,799066252,MDEyOklzc3VlQ29tbWVudDc5OTA2NjI1Mg==,9599,simonw,2021-03-15T03:34:52Z,2021-03-15T03:34:52Z,OWNER,"Yeah the Lambda Docker stuff is pretty odd - you still don't get to speak HTTP, you have to speak their custom event protocol instead. https://github.com/glassechidna/serverlessish looks interesting here - it adds a proxy inside the container which allows your existing HTTP Docker image to run within Docker-on-Lambda. I've not tried it out yet though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317001500,datasette publish lambda plugin, https://github.com/simonw/datasette/issues/236#issuecomment-920543967,https://api.github.com/repos/simonw/datasette/issues/236,920543967,IC_kwDOBm6k_c423mLf,164214,sethvincent,2021-09-16T03:19:08Z,2021-09-16T03:19:08Z,NONE,":wave: I just put together a small example using the lambda container image support: https://github.com/sethvincent/datasette-aws-lambda-example It uses mangum and AWS's [python runtime interface client](https://github.com/aws/aws-lambda-python-runtime-interface-client) to handle the lambda event stuff. I'd be happy to help with a publish plugin for AWS lambda as I plan to use this for upcoming projects. The example uses the [serverless](https://www.serverless.com) cli for deployment but there might be a more suitable deployment approach for the plugin. It would be cool if users didn't have to install anything additional other than the aws cli and its associated config/credentials setup.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317001500,datasette publish lambda plugin, https://github.com/simonw/datasette/issues/236#issuecomment-922075480,https://api.github.com/repos/simonw/datasette/issues/236,922075480,IC_kwDOBm6k_c429cFY,9599,simonw,2021-09-17T20:54:13Z,2021-09-17T20:54:13Z,OWNER,"That's so useful @sethvincent! Really interesting reading your code there, especially clever how you're using the `base_url` config. I'd be very interested to see what your demo looks like without using serverless - completely agree that the less additional dependencies there are for this the better. I'm also very interested in figuring out a way to run Datasette in Lambda but with the SQLite database on an EFS volume. Do you have a feel for how hard that would be?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317001500,datasette publish lambda plugin, https://github.com/simonw/datasette/issues/236#issuecomment-1033772902,https://api.github.com/repos/simonw/datasette/issues/236,1033772902,IC_kwDOBm6k_c49nh9m,1376648,jordaneremieff,2022-02-09T13:40:52Z,2022-02-09T13:40:52Z,NONE,"Hi @simonw, I've received some inquiries over the last year or so about Datasette and how it might be supported by [Mangum](https://github.com/jordaneremieff/mangum). I maintain Mangum which is, as far as I know, the only project that provides support for ASGI applications in AWS Lambda. If there is anything that I can help with here, please let me know because I think what Datasette provides to the community (even beyond OSS) is noble and worthy of special consideration.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317001500,datasette publish lambda plugin, https://github.com/simonw/datasette/issues/236#issuecomment-1465208436,https://api.github.com/repos/simonw/datasette/issues/236,1465208436,IC_kwDOBm6k_c5XVU50,545193,sopel,2023-03-12T14:04:15Z,2023-03-12T14:04:15Z,NONE,"I keep coming back to this in search for the related exploration, so I'll just link it now: @simonw has meanwhile researched _how to deploy Datasette to AWS Lambda using function URLs and Mangum_ via https://github.com/simonw/public-notes/issues/6 and concluded _that's everything I need to know in order to build a datasette-publish-lambda plugin_.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317001500,datasette publish lambda plugin, https://github.com/simonw/datasette/issues/237#issuecomment-386840307,https://api.github.com/repos/simonw/datasette/issues/237,386840307,MDEyOklzc3VlQ29tbWVudDM4Njg0MDMwNw==,9599,simonw,2018-05-05T22:45:45Z,2018-05-05T22:45:45Z,OWNER,Documented here: http://datasette.readthedocs.io/en/latest/json_api.html#special-table-arguments,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317475156,Support for ?_search_colname=blah searches, https://github.com/simonw/datasette/issues/237#issuecomment-386840806,https://api.github.com/repos/simonw/datasette/issues/237,386840806,MDEyOklzc3VlQ29tbWVudDM4Njg0MDgwNg==,9599,simonw,2018-05-05T22:56:42Z,2018-05-05T22:56:42Z,OWNER,"Demo: datasette publish now ../datasettes/san-francisco/sf-film-locations.db --branch=master --name datasette-column-search-demo https://datasette-column-search-demo.now.sh/sf-film-locations/Film_Locations_in_San_Francisco?_search_Locations=justin","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317475156,Support for ?_search_colname=blah searches, https://github.com/simonw/datasette/issues/238#issuecomment-384362028,https://api.github.com/repos/simonw/datasette/issues/238,384362028,MDEyOklzc3VlQ29tbWVudDM4NDM2MjAyOA==,9599,simonw,2018-04-25T17:07:11Z,2018-04-25T17:07:11Z,OWNER,"On further thought: this is actually only an issue for immutable deployments to platforms like Zeit Now and Heroku. As such, adding it to `datasette serve` feels clumsy. Maybe `datasette publish` should instead gain the ability to optionally install an extra mechanism that periodically pulls a fresh copy of `metadata.json` from a URL.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317714268,External metadata.json, https://github.com/simonw/datasette/issues/238#issuecomment-412291437,https://api.github.com/repos/simonw/datasette/issues/238,412291437,MDEyOklzc3VlQ29tbWVudDQxMjI5MTQzNw==,9599,simonw,2018-08-11T17:55:26Z,2018-08-11T18:02:48Z,OWNER,"On further thought, I'd much rather implement this using some kind of metadata plugin hook - see #357","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317714268,External metadata.json, https://github.com/simonw/datasette/issues/238#issuecomment-504882244,https://api.github.com/repos/simonw/datasette/issues/238,504882244,MDEyOklzc3VlQ29tbWVudDUwNDg4MjI0NA==,9599,simonw,2019-06-24T06:52:45Z,2019-06-24T06:52:45Z,OWNER,I'm not going to do this - there are plenty of smarter ways of achieving a similar goal.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317714268,External metadata.json, https://github.com/simonw/datasette/issues/239#issuecomment-384500327,https://api.github.com/repos/simonw/datasette/issues/239,384500327,MDEyOklzc3VlQ29tbWVudDM4NDUwMDMyNw==,9599,simonw,2018-04-26T03:18:12Z,2018-04-26T03:18:20Z,OWNER,"``` { ""databases"": { ""database1"": { ""tables"": { ""example_table"": { ""hidden"": true } } } } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317760361,Support for hidden tables in metadata.json, https://github.com/simonw/datasette/issues/239#issuecomment-384503873,https://api.github.com/repos/simonw/datasette/issues/239,384503873,MDEyOklzc3VlQ29tbWVudDM4NDUwMzg3Mw==,9599,simonw,2018-04-26T03:45:11Z,2018-04-26T03:45:11Z,OWNER,Documentation: http://datasette.readthedocs.io/en/latest/metadata.html#hiding-tables,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317760361,Support for hidden tables in metadata.json, https://github.com/simonw/datasette/issues/243#issuecomment-391030083,https://api.github.com/repos/simonw/datasette/issues/243,391030083,MDEyOklzc3VlQ29tbWVudDM5MTAzMDA4Mw==,9599,simonw,2018-05-22T15:17:10Z,2018-05-22T15:17:10Z,OWNER,See also #278,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",318737808,--spatialite option for datasette publish commands, https://github.com/simonw/datasette/issues/243#issuecomment-393544357,https://api.github.com/repos/simonw/datasette/issues/243,393544357,MDEyOklzc3VlQ29tbWVudDM5MzU0NDM1Nw==,9599,simonw,2018-05-31T14:14:49Z,2018-05-31T14:14:49Z,OWNER,"Demo: https://datasette-publish-spatialite-demo.now.sh/spatialite-test-c88bc35?sql=select+AsText(Geometry)+from+HighWays+limit+1%3B Published using `datasette publish now --spatialite /tmp/spatialite-test.db`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",318737808,--spatialite option for datasette publish commands, https://github.com/simonw/datasette/issues/244#issuecomment-386309928,https://api.github.com/repos/simonw/datasette/issues/244,386309928,MDEyOklzc3VlQ29tbWVudDM4NjMwOTkyOA==,9599,simonw,2018-05-03T14:13:49Z,2018-05-03T14:13:49Z,OWNER,Demo: https://datasette-versions-and-shape-demo.now.sh/-/versions,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",318738000,/-/versions page, https://github.com/simonw/datasette/issues/245#issuecomment-386310149,https://api.github.com/repos/simonw/datasette/issues/245,386310149,MDEyOklzc3VlQ29tbWVudDM4NjMxMDE0OQ==,9599,simonw,2018-05-03T14:14:33Z,2018-05-03T14:14:33Z,OWNER,"Demos: * https://datasette-versions-and-shape-demo.now.sh/sf-trees-02c8ef1/qSpecies.json?_shape=array * https://datasette-versions-and-shape-demo.now.sh/sf-trees-02c8ef1/qSpecies.json?_shape=object * https://datasette-versions-and-shape-demo.now.sh/sf-trees-02c8ef1/qSpecies.json?_shape=arrays * https://datasette-versions-and-shape-demo.now.sh/sf-trees-02c8ef1/qSpecies.json?_shape=objects","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",319358200,?_shape=array option, https://github.com/simonw/datasette/issues/247#issuecomment-390689406,https://api.github.com/repos/simonw/datasette/issues/247,390689406,MDEyOklzc3VlQ29tbWVudDM5MDY4OTQwNg==,11912854,jsancho-gpl,2018-05-21T15:29:31Z,2018-05-21T15:29:31Z,NONE,"I've changed my mind about the way to support external connectors aside of SQLite and I'm working in a more simple style that respects the original Datasette, i.e. less refactoring. I present you [a version of Datasette wich supports other database connectors](https://github.com/jsancho-gpl/datasette/tree/external-connectors) and [a Datasette connector for HDF5/PyTables files](https://github.com/jsancho-gpl/datasette-pytables).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",319449852,SQLite code decoupled from Datasette, https://github.com/simonw/datasette/issues/248#issuecomment-386357645,https://api.github.com/repos/simonw/datasette/issues/248,386357645,MDEyOklzc3VlQ29tbWVudDM4NjM1NzY0NQ==,9599,simonw,2018-05-03T16:36:59Z,2018-05-03T16:36:59Z,OWNER,"Even better: use `plugin_manager.list_plugin_distinfo()` from pluggy to get back a list of tuples, the second item in each tuple is a `pkg_resources.DistInfoDistribution` with a `.version` attribute.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",319954545,/-/plugins should show version of each installed plugin, https://github.com/simonw/datasette/issues/248#issuecomment-386692333,https://api.github.com/repos/simonw/datasette/issues/248,386692333,MDEyOklzc3VlQ29tbWVudDM4NjY5MjMzMw==,9599,simonw,2018-05-04T18:25:40Z,2018-05-04T18:25:40Z,OWNER,Demo: https://datasette-plugins-and-max-size-demo.now.sh/-/plugins,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",319954545,/-/plugins should show version of each installed plugin, https://github.com/simonw/datasette/issues/249#issuecomment-386692534,https://api.github.com/repos/simonw/datasette/issues/249,386692534,MDEyOklzc3VlQ29tbWVudDM4NjY5MjUzNA==,9599,simonw,2018-05-04T18:26:30Z,2018-05-04T18:26:30Z,OWNER,Demo: https://datasette-plugins-and-max-size-demo.now.sh/sf-trees/Street_Tree_List.json?_size=max,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",320090329,?_size=max argument , https://github.com/simonw/datasette/issues/251#issuecomment-386879509,https://api.github.com/repos/simonw/datasette/issues/251,386879509,MDEyOklzc3VlQ29tbWVudDM4Njg3OTUwOQ==,9599,simonw,2018-05-06T13:29:26Z,2018-05-06T13:29:26Z,OWNER,"We can solve this using the `sqlite_timelimit(conn, 20)` helper, which can tell SQLite to give up after 20ms. We can wrap that around the following SQL: select distinct COLUMN from TABLE limit 21; Then we look at the number of rows returned. If it's 21 or more we know that this table had more than 21 distinct values, so we'll treat it as ""unlimited"". Likewise, if the SQL times out before 20ms is up we will skip this introspection.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",320592643,"Explore ""distinct values for column"" in inspect()", https://github.com/simonw/datasette/issues/251#issuecomment-386879840,https://api.github.com/repos/simonw/datasette/issues/251,386879840,MDEyOklzc3VlQ29tbWVudDM4Njg3OTg0MA==,9599,simonw,2018-05-06T13:34:24Z,2018-05-06T13:34:24Z,OWNER,"Here's a quick demo of that exploration: https://datasette-distinct-column-values.now.sh/-/inspect Example output: ``` { ""antiquities-act/actions_under_antiquities_act"": { ""columns"": [ ""current_name"", ""states"", ""original_name"", ""current_agency"", ""action"", ""date"", ""year"", ""pres_or_congress"", ""acres_affected"" ], ""count"": 344, ""distinct_values_by_column"": { ""acres_affected"": null, ""action"": null, ""current_agency"": [ ""NPS"", ""State of Montana"", ""BLM"", ""State of Arizona"", ""USFS"", ""State of North Dakota"", ""NPS, BLM"", ""State of South Carolina"", ""State of New York"", ""FWS"", ""FWS, NOAA"", ""NPS, FWS"", ""NOAA"", ""BLM, USFS"", ""NOAA, FWS"" ], ""current_name"": null, ""date"": null, ""original_name"": null, ""pres_or_congress"": null, ""states"": null, ""year"": null }, ""foreign_keys"": { ""incoming"": [], ""outgoing"": [] }, ""fts_table"": null, ""hidden"": false, ""label_column"": null, ""name"": ""antiquities-act/actions_under_antiquities_act"", ""primary_keys"": [] } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",320592643,"Explore ""distinct values for column"" in inspect()", https://github.com/simonw/datasette/issues/251#issuecomment-386879878,https://api.github.com/repos/simonw/datasette/issues/251,386879878,MDEyOklzc3VlQ29tbWVudDM4Njg3OTg3OA==,9599,simonw,2018-05-06T13:34:57Z,2018-05-06T13:34:57Z,OWNER,If I'm going to expand column introspection in this way it would be useful to also capture column type information.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",320592643,"Explore ""distinct values for column"" in inspect()", https://github.com/simonw/datasette/issues/251#issuecomment-388987044,https://api.github.com/repos/simonw/datasette/issues/251,388987044,MDEyOklzc3VlQ29tbWVudDM4ODk4NzA0NA==,9599,simonw,2018-05-14T22:47:55Z,2018-05-14T22:47:55Z,OWNER,This work is now happening in the facets branch. Closing this in favor of #255.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",320592643,"Explore ""distinct values for column"" in inspect()", https://github.com/simonw/datasette/issues/253#issuecomment-388550742,https://api.github.com/repos/simonw/datasette/issues/253,388550742,MDEyOklzc3VlQ29tbWVudDM4ODU1MDc0Mg==,9599,simonw,2018-05-12T12:09:02Z,2018-05-12T12:09:02Z,OWNER,http://datasette.readthedocs.io/en/latest/full_text_search.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",321631020,Documentation explaining how to use SQLite FTS with Datasette, https://github.com/simonw/datasette/issues/254#issuecomment-388360255,https://api.github.com/repos/simonw/datasette/issues/254,388360255,MDEyOklzc3VlQ29tbWVudDM4ODM2MDI1NQ==,9599,simonw,2018-05-11T13:16:09Z,2018-05-11T22:45:31Z,OWNER,"Do you have an example I can look at? I think I have a possible route for fixing this, but it's pretty tricky (it involves adding a full SQL statement parser, but that's needed for some other potential improvements as well). In the meantime, is this causing actual errors for you or is it more of an inconvenience (form fields being displayed that don't actually do anything)? Another potential solution here could be to allow canned queries to optionally declare their parameters in metadata.json","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322283067,Escaping named parameters in canned queries, https://github.com/simonw/datasette/issues/254#issuecomment-388367027,https://api.github.com/repos/simonw/datasette/issues/254,388367027,MDEyOklzc3VlQ29tbWVudDM4ODM2NzAyNw==,247131,philroche,2018-05-11T13:41:46Z,2018-05-11T13:41:46Z,NONE,"An example deployment @ https://datasette-zkcvlwdrhl.now.sh/simplestreams-270f20c/cloudimage?content_id__exact=com.ubuntu.cloud%3Areleased%3Adownload It is not causing errors, more of an inconvenience. I have worked around it using a `like` query instead. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322283067,Escaping named parameters in canned queries, https://github.com/simonw/datasette/issues/254#issuecomment-388497467,https://api.github.com/repos/simonw/datasette/issues/254,388497467,MDEyOklzc3VlQ29tbWVudDM4ODQ5NzQ2Nw==,9599,simonw,2018-05-11T22:06:00Z,2018-05-11T22:06:34Z,OWNER,"Got it, this seems to trigger the problem: https://datasette-zkcvlwdrhl.now.sh/simplestreams-270f20c?sql=select+*+from+cloudimage+where+%22content_id%22+%3D+%22com.ubuntu.cloud%3Areleased%3Adownload%22+order+by+id+limit+10","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322283067,Escaping named parameters in canned queries, https://github.com/simonw/datasette/issues/254#issuecomment-626340387,https://api.github.com/repos/simonw/datasette/issues/254,626340387,MDEyOklzc3VlQ29tbWVudDYyNjM0MDM4Nw==,247131,philroche,2020-05-10T14:54:13Z,2020-05-10T14:54:13Z,NONE,"This has now been resolved and is not present in current version of datasette. Sample query @simonw mentioned now returns as expected. https://aggreg8streams.tinyviking.ie/simplestreams?sql=select+*+from+cloudimage+where+%22content_id%22+%3D+%22com.ubuntu.cloud%3Areleased%3Adownload%22+order+by+id+limit+10","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322283067,Escaping named parameters in canned queries, https://github.com/simonw/datasette/issues/255#issuecomment-388525357,https://api.github.com/repos/simonw/datasette/issues/255,388525357,MDEyOklzc3VlQ29tbWVudDM4ODUyNTM1Nw==,9599,simonw,2018-05-12T03:01:14Z,2018-05-12T03:01:14Z,OWNER,Facet counts will be generated by extra SQL queries with their own aggressive time limit.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388587855,https://api.github.com/repos/simonw/datasette/issues/255,388587855,MDEyOklzc3VlQ29tbWVudDM4ODU4Nzg1NQ==,9599,simonw,2018-05-12T22:30:23Z,2018-05-12T22:30:23Z,OWNER,Adding some TODOs to the original description (so they show up as a todo progress bar),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388588011,https://api.github.com/repos/simonw/datasette/issues/255,388588011,MDEyOklzc3VlQ29tbWVudDM4ODU4ODAxMQ==,9599,simonw,2018-05-12T22:33:39Z,2018-05-12T22:33:39Z,OWNER,Initial documentation: http://datasette.readthedocs.io/en/latest/facets.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388588998,https://api.github.com/repos/simonw/datasette/issues/255,388588998,MDEyOklzc3VlQ29tbWVudDM4ODU4ODk5OA==,9599,simonw,2018-05-12T22:57:30Z,2018-05-12T23:00:24Z,OWNER,"A few demos: * https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/college-majors%2Fall-ages?_facet=Major_category * https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/congress-age%2Fcongress-terms?_facet=chamber&_facet=state&_facet=party&_facet=incumbent * https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/bechdel%2Fmovies?_facet=binary&_facet=test","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388589072,https://api.github.com/repos/simonw/datasette/issues/255,388589072,MDEyOklzc3VlQ29tbWVudDM4ODU4OTA3Mg==,9599,simonw,2018-05-12T22:59:07Z,2018-05-12T22:59:07Z,OWNER,"I need to decide how to display these. They currently look like this: https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/congress-age%2Fcongress-terms?_facet=chamber&_facet=state&_facet=party&_facet=incumbent&state=MO ![2018-05-12 at 7 58 pm](https://user-images.githubusercontent.com/9599/39962230-e7bf9e10-561e-11e8-80a7-0941b8991318.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388645828,https://api.github.com/repos/simonw/datasette/issues/255,388645828,MDEyOklzc3VlQ29tbWVudDM4ODY0NTgyOA==,9599,simonw,2018-05-13T18:18:56Z,2018-05-13T18:20:02Z,OWNER,I may be able to run the SQL for all of the facet counts in one go using a WITH CTE query - will have to microbenchmark this to make sure it is worthwhile: https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9?sql=with+blah+as+%28select+*+from+%5Bcollege-majors%2Fall-ages%5D%29%0D%0Aselect+*+from+%28select+%22Major_category%22%2C+Major_category%2C+count%28*%29+as+n+from%0D%0Ablah+group+by+Major_category+order+by+n+desc+limit+10%29%0D%0Aunion+all%0D%0Aselect+*+from+%28select+%22Major_category2%22%2C+Major_category%2C+count%28*%29+as+n+from%0D%0Ablah+group+by+Major_category+order+by+n+desc+limit+10%29,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388686463,https://api.github.com/repos/simonw/datasette/issues/255,388686463,MDEyOklzc3VlQ29tbWVudDM4ODY4NjQ2Mw==,9599,simonw,2018-05-14T03:23:44Z,2018-05-14T03:25:22Z,OWNER,It would be neat if there was a mechanism for calculating aggregates per facet - e.g. calculating the sum() of specific columns against each facet result on https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/nba-elo%2Fnbaallelo?_facet=lg_id&_facet=fran_id&lg_id=ABA&_facet=team_id,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388784063,https://api.github.com/repos/simonw/datasette/issues/255,388784063,MDEyOklzc3VlQ29tbWVudDM4ODc4NDA2Mw==,9599,simonw,2018-05-14T11:25:00Z,2018-05-14T11:25:15Z,OWNER,"Can I get facets working across many2many relationships? This would be fiendishly useful, but the querystring and `metadata.json` syntax is non-obvious.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388784787,https://api.github.com/repos/simonw/datasette/issues/255,388784787,MDEyOklzc3VlQ29tbWVudDM4ODc4NDc4Nw==,9599,simonw,2018-05-14T11:28:05Z,2018-05-14T11:28:05Z,OWNER,"To decide which facets to suggest: for each column, is the unique value count less than the number of rows matching the current query or is it less than 20 (if we are showing more than 20 rows)? Maybe only do this if there are less than ten non-float columns. Or always try for foreign keys and booleans, then if there are none of those try indexed text and integer fields, then finally try non-indexed text and integer fields but only if there are less than ten.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-389145872,https://api.github.com/repos/simonw/datasette/issues/255,389145872,MDEyOklzc3VlQ29tbWVudDM4OTE0NTg3Mg==,9599,simonw,2018-05-15T12:17:52Z,2018-05-15T12:17:52Z,OWNER,Activity has now moved to this branch: https://github.com/simonw/datasette/commits/suggested-facets,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-389147608,https://api.github.com/repos/simonw/datasette/issues/255,389147608,MDEyOklzc3VlQ29tbWVudDM4OTE0NzYwOA==,9599,simonw,2018-05-15T12:24:46Z,2018-05-15T12:24:46Z,OWNER,"New demo (published with `datasette publish now --branch=suggested-facets fivethirtyeight.db sf-trees.db --name=datastte-suggested-facets-demo`): https://datasette-suggested-facets-demo.now.sh/fivethirtyeight-2628db9/comic-characters%2Fmarvel-wikia-data After turning on a couple of suggested facets... https://datasette-suggested-facets-demo.now.sh/fivethirtyeight-2628db9/comic-characters%2Fmarvel-wikia-data?_facet=SEX&_facet=ID ![2018-05-15 at 7 24 am](https://user-images.githubusercontent.com/9599/40056411-fa265d16-5810-11e8-89ec-e38fe29ffb2c.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-389386919,https://api.github.com/repos/simonw/datasette/issues/255,389386919,MDEyOklzc3VlQ29tbWVudDM4OTM4NjkxOQ==,9599,simonw,2018-05-16T03:57:47Z,2018-05-16T03:58:30Z,OWNER,"I updated that demo to demonstrate the new foreign key label expansions: https://datasette-suggested-facets-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List?_facet=qLegalStatus ![2018-05-15 at 8 58 pm](https://user-images.githubusercontent.com/9599/40095806-b645026a-5882-11e8-8100-76136df50212.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-389397457,https://api.github.com/repos/simonw/datasette/issues/255,389397457,MDEyOklzc3VlQ29tbWVudDM4OTM5NzQ1Nw==,9599,simonw,2018-05-16T05:20:04Z,2018-05-16T05:20:04Z,OWNER,Maybe `suggested_facets` should only be calculated for the HTML view.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-389546040,https://api.github.com/repos/simonw/datasette/issues/255,389546040,MDEyOklzc3VlQ29tbWVudDM4OTU0NjA0MA==,9599,simonw,2018-05-16T14:47:34Z,2018-05-16T14:47:34Z,OWNER,"Latest demo - now with multiple columns: https://datasette-suggested-facets-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List?_facet=qCaretaker&_facet=qCareAssistant&_facet=qLegalStatus ![2018-05-16 at 7 47 am](https://user-images.githubusercontent.com/9599/40124418-63e680ba-58dd-11e8-8063-9686826abb8e.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-389562708,https://api.github.com/repos/simonw/datasette/issues/255,389562708,MDEyOklzc3VlQ29tbWVudDM4OTU2MjcwOA==,9599,simonw,2018-05-16T15:32:12Z,2018-05-16T15:32:12Z,OWNER,"This is now landed in master, ready for the next release.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-390999055,https://api.github.com/repos/simonw/datasette/issues/255,390999055,MDEyOklzc3VlQ29tbWVudDM5MDk5OTA1NQ==,9599,simonw,2018-05-22T13:54:55Z,2018-05-22T13:54:55Z,OWNER,This shipped in Datasette 0.22. Here's my blog post about it: https://simonwillison.net/2018/May/20/datasette-facets/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/256#issuecomment-388684356,https://api.github.com/repos/simonw/datasette/issues/256,388684356,MDEyOklzc3VlQ29tbWVudDM4ODY4NDM1Ng==,9599,simonw,2018-05-14T03:05:37Z,2018-05-14T03:05:37Z,OWNER,"I just landed pull request #257 - I haven't refactored the tests, I may do that later if it looks worthwhile.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322551723,Break up app.py into separate view modules, https://github.com/simonw/datasette/pull/257#issuecomment-388625703,https://api.github.com/repos/simonw/datasette/issues/257,388625703,MDEyOklzc3VlQ29tbWVudDM4ODYyNTcwMw==,9599,simonw,2018-05-13T13:10:09Z,2018-05-13T13:10:09Z,OWNER,"I'm still seeing intermittent Python 3.5 failures due to dictionary ordering differences. https://travis-ci.org/simonw/datasette/jobs/378356802 ``` > assert expected_facet_results == facet_results E AssertionError: assert {'city': [{'c...alue': 'MI'}]} == {'city': [{'co...alue': 'MI'}]} E Omitting 1 identical items, use -vv to show E Differing items: E {'city': [{'count': 4, 'toggle_url': '_facet=state&_facet=city&state=MI&city=Detroit', 'value': 'Detroit'}]} != {'city': [{'count': 4, 'toggle_url': 'state=MI&_facet=state&_facet=city&city=Detroit', 'value': 'Detroit'}]} E Use -v to get the full diff ``` To solve these cleanly I need to be able to run Python 3.5 on my local laptop rather than relying on Travis every time.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322591993,Refactor views, https://github.com/simonw/datasette/pull/257#issuecomment-388626721,https://api.github.com/repos/simonw/datasette/issues/257,388626721,MDEyOklzc3VlQ29tbWVudDM4ODYyNjcyMQ==,9599,simonw,2018-05-13T13:27:04Z,2018-05-13T13:27:04Z,OWNER,"I managed to get Python 3.5.0 running on my laptop using [pyenv](https://github.com/pyenv/pyenv). Here's the incantation I used: ``` # Install pyenv using homebrew (turns out I already had it) brew install pyenv # Check which versions of Python I have installed pyenv versions # Install Python 3.5.0 pyenv install 3.5.0 # Figure out where pyenv has been installing things pyenv root # Check I can run my newly installed Python 3.5.0 /Users/simonw/.pyenv/versions/3.5.0/bin/python # Use it to create a new virtualenv /Users/simonw/.pyenv/versions/3.5.0/bin/python -mvenv venv35 source venv35/bin/activate # Install datasette into that virtualenv python setup.py install ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322591993,Refactor views, https://github.com/simonw/datasette/pull/257#issuecomment-388626804,https://api.github.com/repos/simonw/datasette/issues/257,388626804,MDEyOklzc3VlQ29tbWVudDM4ODYyNjgwNA==,9599,simonw,2018-05-13T13:28:20Z,2018-05-13T13:28:20Z,OWNER,"Unfortunately, running `python setup.py test` on my laptop using Python 3.5.0 in that virtualenv results in a flow of weird Sanic-related errors: ``` File ""/Users/simonw/Dropbox/Development/datasette/venv35/lib/python3.5/site-packages/sanic-0.7.0-py3.5.egg/sanic/testing.py"", line 16, in _local_request import aiohttp File ""/Users/simonw/Dropbox/Development/datasette/.eggs/aiohttp-2.3.2-py3.5-macosx-10.13-x86_64.egg/aiohttp/__init__.py"", line 6, in from .client import * # noqa File ""/Users/simonw/Dropbox/Development/datasette/.eggs/aiohttp-2.3.2-py3.5-macosx-10.13-x86_64.egg/aiohttp/client.py"", line 13, in from yarl import URL File ""/Users/simonw/Dropbox/Development/datasette/.eggs/yarl-1.2.4-py3.5-macosx-10.13-x86_64.egg/yarl/__init__.py"", line 11, in from .quoting import _Quoter, _Unquoter File ""/Users/simonw/Dropbox/Development/datasette/.eggs/yarl-1.2.4-py3.5-macosx-10.13-x86_64.egg/yarl/quoting.py"", line 3, in from typing import Optional, TYPE_CHECKING, cast ImportError: cannot import name 'TYPE_CHECKING' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322591993,Refactor views, https://github.com/simonw/datasette/pull/257#issuecomment-388627281,https://api.github.com/repos/simonw/datasette/issues/257,388627281,MDEyOklzc3VlQ29tbWVudDM4ODYyNzI4MQ==,9599,simonw,2018-05-13T13:36:21Z,2018-05-13T13:36:21Z,OWNER,"https://github.com/rtfd/readthedocs.org/issues/3812#issuecomment-373780860 suggests Python 3.5.2 may have the fix. Yup, that worked: ``` pyenv install 3.5.2 rm -rf venv35 /Users/simonw/.pyenv/versions/3.5.2/bin/python -mvenv venv35 source venv35/bin/activate # Not sure why I need this in my local environment but I do: pip install datasette_plugin_demos python setup.py test ``` This is now giving me the same test failure locally that I am seeing in Travis.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322591993,Refactor views, https://github.com/simonw/datasette/pull/257#issuecomment-388628966,https://api.github.com/repos/simonw/datasette/issues/257,388628966,MDEyOklzc3VlQ29tbWVudDM4ODYyODk2Ng==,9599,simonw,2018-05-13T14:00:47Z,2018-05-13T14:06:35Z,OWNER,"Running specific tests: ``` venv35/bin/pip install pytest beautifulsoup4 aiohttp venv35/bin/pytest tests/test_utils.py ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322591993,Refactor views, https://github.com/simonw/datasette/pull/258#issuecomment-389386142,https://api.github.com/repos/simonw/datasette/issues/258,389386142,MDEyOklzc3VlQ29tbWVudDM4OTM4NjE0Mg==,9599,simonw,2018-05-16T03:51:13Z,2018-05-16T03:51:13Z,OWNER,"The URL does persist across deployments already, in that you can use the URL without the hash and it will redirect to the current location. Here's an example of that: https://san-francisco.datasettes.com/sf-trees/Street_Tree_List.json This also works if you attempt to hit the incorrect hash, e.g. if you have deployed a new version of the database with an updated hash. The old hash will redirect, e.g. https://san-francisco.datasettes.com/sf-trees-c4b972c/Street_Tree_List.json If you serve Datasette from a HTTP/2 proxy (I've been using Cloudflare for this) you won't even have to pay the cost of the redirect - Datasette sends a `Link: ; rel=preload` header with those redirects, which causes Cloudflare to push out the redirected source as part of that HTTP/2 request. You can fire up the Chrome DevTools to watch this happen. https://github.com/simonw/datasette/blob/2b79f2bdeb1efa86e0756e741292d625f91cb93d/datasette/views/base.py#L91 All of that said... I'm not at all opposed to this feature. For consistency with other Datasette options (e.g. `--cors`) I'd prefer to do this as an optional argument to the `datasette serve` command - something like this: datasette serve mydb.db --no-url-hash","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322741659,Add new metadata key persistent_urls which removes the hash from all database urls, https://github.com/simonw/datasette/pull/258#issuecomment-389536870,https://api.github.com/repos/simonw/datasette/issues/258,389536870,MDEyOklzc3VlQ29tbWVudDM4OTUzNjg3MA==,9599,simonw,2018-05-16T14:22:31Z,2018-05-16T14:22:31Z,OWNER,"The principle benefit provided by the hash URLs is that Datasette can set a far-future cache expiry header on every response. This is particularly useful for JavaScript API work as it makes fantastic use of the browser's cache. It also means that if you are serving your API from behind a caching proxy like Cloudflare you get a fantastic cache hit rate. An option to serve without persistent hashes would also need to turn off the cache headers. Maybe the option should support both? If you hit a page with the hash in the URL you still get the cache headers, but hits to the URL without the hash serve uncashed content directly.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322741659,Add new metadata key persistent_urls which removes the hash from all database urls, https://github.com/simonw/datasette/pull/258#issuecomment-390577711,https://api.github.com/repos/simonw/datasette/issues/258,390577711,MDEyOklzc3VlQ29tbWVudDM5MDU3NzcxMQ==,247131,philroche,2018-05-21T07:38:15Z,2018-05-21T07:38:15Z,NONE,"Excellent, I was not aware of the auto redirect to the new hash. My bad This solves my use case. I do agree that your suggested --no-url-hash approach is much neater. I will investigate ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322741659,Add new metadata key persistent_urls which removes the hash from all database urls, https://github.com/simonw/datasette/issues/259#issuecomment-388797919,https://api.github.com/repos/simonw/datasette/issues/259,388797919,MDEyOklzc3VlQ29tbWVudDM4ODc5NzkxOQ==,9599,simonw,2018-05-14T12:23:11Z,2018-05-14T12:23:11Z,OWNER,"For M2M to work we will need a mechanism for applying IN queries to the table view, so you can select multiple M2M filters. Maybe this would work: ?_m2m_category=123&_m2m_category=865","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322787470,inspect() should detect many-to-many relationships, https://github.com/simonw/datasette/issues/259#issuecomment-392212119,https://api.github.com/repos/simonw/datasette/issues/259,392212119,MDEyOklzc3VlQ29tbWVudDM5MjIxMjExOQ==,9599,simonw,2018-05-25T23:22:26Z,2018-05-25T23:22:26Z,OWNER,"This should detect any table which can be linked to the current table via some other table, based on the other table having a foreign key to them both. These join tables could be arbitrarily complicated. They might have foreign keys to more than two other tables, maybe even multiple foreign keys to the same column. Ideally M2M defection would catch all of these cases. Maybe the resulting inspect data looks something like this: ``` ""artists"": { ... ""m2m"": [{ ""other_table"": ""festivals"", ""through"": ""performances"", ""our_fk"": ""artist_id"", ""other_fk"": ""performance_id"" }] ``` Let's ignore compound primary keys: we k it detect m2m relationships where the join table has foreign keys to a single primary key on the other two tables.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322787470,inspect() should detect many-to-many relationships, https://github.com/simonw/datasette/issues/259#issuecomment-392214791,https://api.github.com/repos/simonw/datasette/issues/259,392214791,MDEyOklzc3VlQ29tbWVudDM5MjIxNDc5MQ==,9599,simonw,2018-05-25T23:43:15Z,2018-07-29T00:56:03Z,OWNER,"We may need to derive a usable name for each of these relationships that can be used in eg querystring parameters. The name of the join table is a reasonable choice here. Say the join table is called `event_tags` - the querystring for returning all events that are tagged `badger` could be `/db/events?_m2m_event_tags__tag=badger` perhaps? But what if `event_tags` has more than one foreign key back to `events`? Might need to specify the column in `events` that is referred back to by `event_tags` somehow in that case.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322787470,inspect() should detect many-to-many relationships, https://github.com/simonw/datasette/issues/259#issuecomment-399157944,https://api.github.com/repos/simonw/datasette/issues/259,399157944,MDEyOklzc3VlQ29tbWVudDM5OTE1Nzk0NA==,9599,simonw,2018-06-21T16:07:49Z,2018-06-21T16:07:49Z,OWNER,Thanks to #319 the test suite now includes a m2m table: https://latest.datasette.io/fixtures-e14e080/searchable_tags,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322787470,inspect() should detect many-to-many relationships, https://github.com/simonw/datasette/issues/259#issuecomment-409087501,https://api.github.com/repos/simonw/datasette/issues/259,409087501,MDEyOklzc3VlQ29tbWVudDQwOTA4NzUwMQ==,9599,simonw,2018-07-31T04:03:29Z,2018-07-31T04:03:29Z,OWNER,Parent ticket: #354,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322787470,inspect() should detect many-to-many relationships, https://github.com/simonw/datasette/issues/259#issuecomment-495058104,https://api.github.com/repos/simonw/datasette/issues/259,495058104,MDEyOklzc3VlQ29tbWVudDQ5NTA1ODEwNA==,9599,simonw,2019-05-23T03:55:37Z,2019-05-23T03:55:37Z,OWNER,I got rid of inspect in #462 - I will still be doing many-to-many detection (initially as part of #356) but it doesn't need a separate ticket.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322787470,inspect() should detect many-to-many relationships, https://github.com/simonw/datasette/issues/260#issuecomment-544318517,https://api.github.com/repos/simonw/datasette/issues/260,544318517,MDEyOklzc3VlQ29tbWVudDU0NDMxODUxNw==,9599,simonw,2019-10-21T01:48:24Z,2019-10-21T01:48:24Z,OWNER,"This came up in #588 - it would be helpful if this would spot things like `""queries""` defined against the tables block when they should be defined against a database.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323223872,Validate metadata.json on startup, https://github.com/simonw/datasette/issues/260#issuecomment-1051473892,https://api.github.com/repos/simonw/datasette/issues/260,1051473892,IC_kwDOBm6k_c4-rDfk,596279,zaneselvans,2022-02-26T02:24:15Z,2022-02-26T02:24:15Z,NONE,"Is there already functionality that can be used to validate the `metadata.json` file? Is there a JSON Schema that defines it? Or a validation that's available via datasette with Python? We're working on [automatically building the metadata](https://github.com/catalyst-cooperative/pudl/pull/1479) in CI and when we deploy to cloud run, and it would be nice to be able to check whether the the metadata we're outputting is valid in our tests.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323223872,Validate metadata.json on startup, https://github.com/simonw/datasette/issues/260#issuecomment-1234926923,https://api.github.com/repos/simonw/datasette/issues/260,1234926923,IC_kwDOBm6k_c5Jm31L,9599,simonw,2022-09-02T00:04:26Z,2022-09-02T00:04:45Z,OWNER,"Interesting example of why this would be valuable here: - https://github.com/simonw/datasette/issues/1798 This YAML file: ```yaml title: Some title description_html: |-

This is an experiment.

databases: off: tables: products_from_owners: title: products_from_owners* ``` Was loaded as equivalent to this JSON: ```json { ""title"": ""Some title"", ""description_html"": ""

This is an experiment.

"", ""databases"": { ""false"": { ""tables"": { ""products_from_owners"": { ""title"": ""products_from_owners*"" } } } } } ``` Validation that caught this would have been useful.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323223872,Validate metadata.json on startup, https://github.com/simonw/datasette/issues/260#issuecomment-1234927627,https://api.github.com/repos/simonw/datasette/issues/260,1234927627,IC_kwDOBm6k_c5Jm4AL,9599,simonw,2022-09-02T00:05:43Z,2022-09-02T00:05:43Z,OWNER,"I'm inclined to consider [Pydantic](https://pydantic-docs.helpmanual.io/) for this, since it is widely used now and can generate really good error messages.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323223872,Validate metadata.json on startup, https://github.com/simonw/datasette/issues/260#issuecomment-1235079469,https://api.github.com/repos/simonw/datasette/issues/260,1235079469,IC_kwDOBm6k_c5JndEt,596279,zaneselvans,2022-09-02T05:24:59Z,2022-09-02T05:24:59Z,NONE,@zschira is working with Pydantic while converting between and validating JSON frictionless datapackage descriptors that annotate an SQLite DB ([extracted from FERC's XBRL data](https://github.com/catalyst-cooperative/ferc-xbrl-extractor)) and the Datasette YAML metadata [so we can publish them with Datasette](https://github.com/catalyst-cooperative/pudl/pull/1831). Maybe there's some overlap? We've been loving Pydantic.,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 1}",323223872,Validate metadata.json on startup, https://github.com/simonw/datasette/issues/260#issuecomment-1235785955,https://api.github.com/repos/simonw/datasette/issues/260,1235785955,IC_kwDOBm6k_c5JqJjj,9599,simonw,2022-09-02T18:18:06Z,2022-09-02T18:18:06Z,OWNER,"Did some related research work in this issue: - https://github.com/simonw/shot-scraper/issues/28","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323223872,Validate metadata.json on startup, https://github.com/simonw/datasette/issues/260#issuecomment-1600778057,https://api.github.com/repos/simonw/datasette/issues/260,1600778057,IC_kwDOBm6k_c5fae9J,9599,simonw,2023-06-21T12:51:22Z,2023-06-21T12:51:22Z,OWNER,"Another example of confusion from this today: https://discord.com/channels/823971286308356157/823971286941302908/1121042411238457374 See also https://gist.github.com/BinomeDeNewton/651ac8b50dd5420f8e54d1682eee5fed?permalink_comment_id=4605982#gistcomment-4605982","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323223872,Validate metadata.json on startup, https://github.com/simonw/datasette/issues/262#issuecomment-389702480,https://api.github.com/repos/simonw/datasette/issues/262,389702480,MDEyOklzc3VlQ29tbWVudDM4OTcwMjQ4MA==,9599,simonw,2018-05-17T00:00:39Z,2020-09-12T18:19:30Z,OWNER,Idea: `?_extra=sqllog` could output a lot of every individual SQL statement that was executed in order to generate the page - useful for seeing how foreign key expansion and faceting actually works.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-691526719,https://api.github.com/repos/simonw/datasette/issues/262,691526719,MDEyOklzc3VlQ29tbWVudDY5MTUyNjcxOQ==,9599,simonw,2020-09-12T18:19:50Z,2020-09-12T18:19:50Z,OWNER,"> Idea: `?_extra=sqllog` could output a lot of every individual SQL statement that was executed in order to generate the page - useful for seeing how foreign key expansion and faceting actually works. I built a version of that a while ago as the `?_trace=1` argument.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-691526975,https://api.github.com/repos/simonw/datasette/issues/262,691526975,MDEyOklzc3VlQ29tbWVudDY5MTUyNjk3NQ==,9599,simonw,2020-09-12T18:22:44Z,2020-09-12T18:22:44Z,OWNER,Are there any interesting use-cases for a plugin hook that allows plugins to define their own `?_extra=` blocks?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-712988146,https://api.github.com/repos/simonw/datasette/issues/262,712988146,MDEyOklzc3VlQ29tbWVudDcxMjk4ODE0Ng==,9599,simonw,2020-10-20T16:32:02Z,2023-01-17T01:54:13Z,OWNER,"Just realized I added an undocumented `?_extras=` option to the row view years ago and forgot about it - it's not even documented. Added in a30c5b220c15360d575e94b0e67f3255e120b916 - https://latest.datasette.io/fixtures/attraction_characteristic/2.json?_extras=foreign_key_tables That will need to be made consistent with the new mechanism. I think `?_extra=a&_extra=b` is more consistent with other Datasette features (like `?_facet=col1&_facet=col2`) but potentially quite verbose. So I could support `?_extra=a,b,c` as an alternative allowed syntax, or I could allow `?_extra=single` and `?_extras=comma,separated`. I think I prefer allowing commas in `?_extra=`. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-713170284,https://api.github.com/repos/simonw/datasette/issues/262,713170284,MDEyOklzc3VlQ29tbWVudDcxMzE3MDI4NA==,9599,simonw,2020-10-20T22:13:01Z,2020-10-20T22:13:01Z,OWNER,In the documentation for `?_extra=` I think I'll emphasize the comma-separated version of it. Also: there will be `?_extra=` values which act as aliases for collection combinations - e.g. `?_extra=full` will toggle everything.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-713170979,https://api.github.com/repos/simonw/datasette/issues/262,713170979,MDEyOklzc3VlQ29tbWVudDcxMzE3MDk3OQ==,9599,simonw,2020-10-20T22:14:37Z,2020-10-20T22:14:37Z,OWNER,"I think it's worth having a plugin hook for this - it can be same hook that is used internally. Maybe `register_extra` - it lets you return one or more `extra` implementations, each with a name and an async function that gets called. Things like suggested facets will become `register_extra` hooks. Maybe actual facets too?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-713200782,https://api.github.com/repos/simonw/datasette/issues/262,713200782,MDEyOklzc3VlQ29tbWVudDcxMzIwMDc4Mg==,9599,simonw,2020-10-20T23:41:30Z,2020-10-20T23:41:30Z,OWNER,This is now blocking https://github.com/simonw/datasette-graphql/issues/61 because that issue needs a way to turn off suggested facets when retrieving the results of a table query.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-713208667,https://api.github.com/repos/simonw/datasette/issues/262,713208667,MDEyOklzc3VlQ29tbWVudDcxMzIwODY2Nw==,9599,simonw,2020-10-21T00:03:18Z,2020-10-21T00:03:18Z,OWNER,"I think I should prioritize the facets component of this, since that could have significant performance wins while also supporting `datasette-graphql`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-995034911,https://api.github.com/repos/simonw/datasette/issues/262,995034911,IC_kwDOBm6k_c47Twcf,9599,simonw,2021-12-15T18:03:46Z,2021-12-15T18:03:56Z,OWNER,"This is relevant to the big refactor in: - #1518","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1108890170,https://api.github.com/repos/simonw/datasette/issues/262,1108890170,IC_kwDOBm6k_c5CGFI6,9599,simonw,2022-04-25T18:17:09Z,2022-04-25T18:18:39Z,OWNER,"I spotted in https://github.com/simonw/datasette/issues/1719#issuecomment-1108888494 that there's actually already an undocumented implementation of `?_extras=foreign_key_tables` - https://latest.datasette.io/fixtures/simple_primary_key/1.json?_extras=foreign_key_tables I added that feature all the way back in November 2017! https://github.com/simonw/datasette/commit/a30c5b220c15360d575e94b0e67f3255e120b916","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1384741055,https://api.github.com/repos/simonw/datasette/issues/262,1384741055,IC_kwDOBm6k_c5SiXi_,9599,simonw,2023-01-17T01:58:24Z,2023-01-17T01:58:24Z,OWNER,"As suggested in this issue: - #1721 There are three parts of the Datasette API that need to support extras: - Table, e.g. https://latest.datasette.io/fixtures/facetable.json - Row, e.g. https://latest.datasette.io/fixtures/facetable/1.json - Query, e.g. https://latest.datasette.io/fixtures/neighborhood_search.json or https://latest.datasette.io/fixtures.json?sql=%0Aselect+_neighborhood%2C+facet_cities.name%2C+state%0Afrom+facetable%0A++++join+facet_cities%0A++++++++on+facetable._city_id+%3D+facet_cities.id%0Awhere+_neighborhood+like+%27%25%27+||+%3Atext+||+%27%25%27%0Aorder+by+_neighborhood%3B%0A&text= There are two other pages I should consider though: - https://latest.datasette.io/.json - the JSON version of the https://latest.datasette.io/ homepage - https://latest.datasette.io/fixtures.json - note that this is different from the same URL with `?sql=...` appended to it. This is the index of tables in a specific database","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1384742385,https://api.github.com/repos/simonw/datasette/issues/262,1384742385,IC_kwDOBm6k_c5SiX3x,9599,simonw,2023-01-17T02:00:23Z,2023-01-17T02:00:38Z,OWNER,"I'm not actually too happy about how `/fixtures.json` currently entirely changes shape based on whether or not you pass a `?sql=` argument to it. Maybe I can fix that disparity with extras too? The list of tables you see on `/fixtures.json` without the `?sql=` could become another extra. The HTML version of that page could know to request that extra by default. This would also support running a SQL query but also returning a list of tables - which can be useful for building a SQL editor interface which hints at the tables that are available to the user - or even for generating the configuration needed by the CodeMirror editor's SQL completion, added in: - #1893","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1384743243,https://api.github.com/repos/simonw/datasette/issues/262,1384743243,IC_kwDOBm6k_c5SiYFL,9599,simonw,2023-01-17T02:01:26Z,2023-01-17T02:01:26Z,OWNER,"I'm tempted NOT to document the JSON for the `/.json` page, simply because I'm not at all convinced that the current homepage design is the best possible use of that space - and I'd like to reserve the opportunity to redesign that in e.g. Datasette 1.1 without it being a breaking change to the documented JSON API.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1384752452,https://api.github.com/repos/simonw/datasette/issues/262,1384752452,IC_kwDOBm6k_c5SiaVE,9599,simonw,2023-01-17T02:14:41Z,2023-01-17T02:15:58Z,OWNER,"Thinking about `?_extra=` values just for the table JSON. The default shape will look like this: ```json { ""ok"": true, ""rows"": [{""id"": 1, ""name"": ""Name""}], ""next"": null, } ``` The table extras could be: - `count` - adds a `""count""` field with a full `count(*)` for that filtered table - `next_url` - the full URL to the next page - `columns` - adds `""columns"": [""id"", ""name""]` - `expandable_columns` - a list of columns that can be expanded (note that `""expanded_columns"": [...]` shows up automatically if the user passes any `?_label=` options, like on https://latest.datasette.io/fixtures/facetable.json?_label=_city_id ) - I'm tempted to rename this to `label_columns` and have it add both `label_columns` and `label_columns_selected` or similar. - `primary_keys` - a list of primary keys e.g. `[""id""]` - not sure what to do about `rowid` columns here - `query` - a `{""sql"": ""select ..."", ""params"": {""p0"": ""1""}}` object - `units` - the units feature - `suggested_facets` - suggested facets - `metadata` - a `{""metadata"": {""source_url"": ""...""}}` etc block - differs from current in that it would be nested in `""metadata"": {...}`. Stuff currently in https://latest.datasette.io/fixtures/facetable.json that is not yet covered by the above: ``` ""database"": ""fixtures"", ""table"": ""facetable"", ""is_view"": false, ""human_description_en"": ""where id = 1"", ""private"": false, ""allow_execute_sql"": true, ""query_ms"": 16.749476999393664, ``` I'm tempted to bundle `database`, `table`, `is_view` and `human_description_en` into one (not sure what to call it though, perhaps `display_details`?) - and then drop `allow_execute_sql` entirely and have `private` and `query_ms` as their own named extras.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1385805702,https://api.github.com/repos/simonw/datasette/issues/262,1385805702,IC_kwDOBm6k_c5SmbeG,9599,simonw,2023-01-17T17:50:17Z,2023-01-17T17:50:17Z,OWNER,Or maybe have a `permissions` extra which includes `allow_execute_sql` and `private`? Could anything else go in there?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1385807684,https://api.github.com/repos/simonw/datasette/issues/262,1385807684,IC_kwDOBm6k_c5Smb9E,9599,simonw,2023-01-17T17:51:54Z,2023-01-19T23:20:59Z,OWNER,"In most cases, the `?_extra=xxx` name exactly corresponds to the additional key that is added to the JSON. `?_facet=...` is one example of a query string argument that causes an extra key - `""facet_results""` - to be added to the JSON even though it wasn't requested by name in a `?_extra=`. Am I OK with that? I think so. Related issue: - #1558 Actually there's an edge-case here that's worth considering: it's possible to use metadata to set default facets for a table. If you do this for a table, then `.json` for that table will always calculate and return those facets - which may be an expensive and unnecessary operation. So maybe we don't include `facet_results` in the JSON unless explicitly asked for in that case, but have a rule that `?_facet` implies `?_extra=facet_results`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1397942113,https://api.github.com/repos/simonw/datasette/issues/262,1397942113,IC_kwDOBm6k_c5TUudh,9599,simonw,2023-01-20T05:33:00Z,2023-01-20T05:33:00Z,OWNER,"I'm going to write code which parses `?_extra=` in the comma separated or multiple parameter format and then looks up functions in a dictionary. It will return an error if you ask for an extra that does not exist.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1399145981,https://api.github.com/repos/simonw/datasette/issues/262,1399145981,IC_kwDOBm6k_c5TZUX9,9599,simonw,2023-01-21T01:56:52Z,2023-01-21T01:56:52Z,OWNER,"Got first prototype working using `asyncinject` and it's pretty nice: ```diff diff --git a/datasette/views/table.py b/datasette/views/table.py index ad45ecd3..c8690b22 100644 --- a/datasette/views/table.py +++ b/datasette/views/table.py @@ -2,6 +2,7 @@ import asyncio import itertools import json +from asyncinject import Registry import markupsafe from datasette.plugins import pm @@ -538,57 +539,60 @@ class TableView(DataView): # Execute the main query! results = await db.execute(sql, params, truncate=True, **extra_args) - # Calculate the total count for this query - count = None - if ( - not db.is_mutable - and self.ds.inspect_data - and count_sql == f""select count(*) from {table_name} "" - ): - # We can use a previously cached table row count - try: - count = self.ds.inspect_data[database_name][""tables""][table_name][ - ""count"" - ] - except KeyError: - pass - - # Otherwise run a select count(*) ... - if count_sql and count is None and not nocount: - try: - count_rows = list(await db.execute(count_sql, from_sql_params)) - count = count_rows[0][0] - except QueryInterrupted: - pass - - # Faceting - if not self.ds.setting(""allow_facet"") and any( - arg.startswith(""_facet"") for arg in request.args - ): - raise BadRequest(""_facet= is not allowed"") + # Resolve extras + extras = _get_extras(request) + if request.args.getlist(""_facet""): + extras.add(""facet_results"") - # pylint: disable=no-member - facet_classes = list( - itertools.chain.from_iterable(pm.hook.register_facet_classes()) - ) - facet_results = {} - facets_timed_out = [] - facet_instances = [] - for klass in facet_classes: - facet_instances.append( - klass( - self.ds, - request, - database_name, - sql=sql_no_order_no_limit, - params=params, - table=table_name, - metadata=table_metadata, - row_count=count, - ) + async def extra_count(): + # Calculate the total count for this query + count = None + if ( + not db.is_mutable + and self.ds.inspect_data + and count_sql == f""select count(*) from {table_name} "" + ): + # We can use a previously cached table row count + try: + count = self.ds.inspect_data[database_name][""tables""][table_name][ + ""count"" + ] + except KeyError: + pass + + # Otherwise run a select count(*) ... + if count_sql and count is None and not nocount: + try: + count_rows = list(await db.execute(count_sql, from_sql_params)) + count = count_rows[0][0] + except QueryInterrupted: + pass + return count + + async def facet_instances(extra_count): + facet_instances = [] + facet_classes = list( + itertools.chain.from_iterable(pm.hook.register_facet_classes()) ) + for facet_class in facet_classes: + facet_instances.append( + facet_class( + self.ds, + request, + database_name, + sql=sql_no_order_no_limit, + params=params, + table=table_name, + metadata=table_metadata, + row_count=extra_count, + ) + ) + return facet_instances + + async def extra_facet_results(facet_instances): + facet_results = {} + facets_timed_out = [] - async def execute_facets(): if not nofacet: # Run them in parallel facet_awaitables = [facet.facet_results() for facet in facet_instances] @@ -607,9 +611,13 @@ class TableView(DataView): facet_results[key] = facet_info facets_timed_out.extend(instance_facets_timed_out) - suggested_facets = [] + return { + ""results"": facet_results, + ""timed_out"": facets_timed_out, + } - async def execute_suggested_facets(): + async def extra_suggested_facets(facet_instances): + suggested_facets = [] # Calculate suggested facets if ( self.ds.setting(""suggest_facets"") @@ -624,8 +632,15 @@ class TableView(DataView): ] for suggest_result in await gather(*facet_suggest_awaitables): suggested_facets.extend(suggest_result) + return suggested_facets + + # Faceting + if not self.ds.setting(""allow_facet"") and any( + arg.startswith(""_facet"") for arg in request.args + ): + raise BadRequest(""_facet= is not allowed"") - await gather(execute_facets(), execute_suggested_facets()) + # pylint: disable=no-member # Figure out columns and rows for the query columns = [r[0] for r in results.description] @@ -732,17 +747,56 @@ class TableView(DataView): rows = rows[:page_size] # human_description_en combines filters AND search, if provided - human_description_en = filters.human_description_en( - extra=extra_human_descriptions - ) + async def extra_human_description_en(): + human_description_en = filters.human_description_en( + extra=extra_human_descriptions + ) + if sort or sort_desc: + human_description_en = "" "".join( + [b for b in [human_description_en, sorted_by] if b] + ) + return human_description_en if sort or sort_desc: sorted_by = ""sorted by {}{}"".format( (sort or sort_desc), "" descending"" if sort_desc else """" ) - human_description_en = "" "".join( - [b for b in [human_description_en, sorted_by] if b] - ) + + async def extra_next_url(): + return next_url + + async def extra_columns(): + return columns + + async def extra_primary_keys(): + return pks + + registry = Registry( + extra_count, + extra_facet_results, + extra_suggested_facets, + facet_instances, + extra_human_description_en, + extra_next_url, + extra_columns, + extra_primary_keys, + ) + + results = await registry.resolve_multi( + [""extra_{}"".format(extra) for extra in extras] + ) + data = { + ""ok"": True, + ""rows"": rows[:page_size], + ""next"": next_value and str(next_value) or None, + } + data.update({ + key.replace(""extra_"", """"): value + for key, value in results.items() + if key.startswith(""extra_"") + and key.replace(""extra_"", """") in extras + }) + return Response.json(data, default=repr) async def extra_template(): nonlocal sort @@ -1334,3 +1388,11 @@ class TableDropView(BaseView): await db.execute_write_fn(drop_table) return Response.json({""ok"": True}, status=200) + + +def _get_extras(request): + extra_bits = request.args.getlist(""_extra"") + extras = set() + for bit in extra_bits: + extras.update(bit.split("","")) + return extras ``` With that in place, `http://127.0.0.1:8001/content/releases?author=25778&_size=1&_extra=count,primary_keys,columns&_facet=author` returns: ```json { ""ok"": true, ""rows"": [ { ""html_url"": ""https://github.com/eyeseast/geocode-sqlite/releases/tag/0.1.2"", ""id"": 30926270, ""author"": { ""value"": 25778, ""label"": ""eyeseast"" }, ""node_id"": ""MDc6UmVsZWFzZTMwOTI2Mjcw"", ""tag_name"": ""0.1.2"", ""target_commitish"": ""master"", ""name"": ""v0.1.2"", ""draft"": 0, ""prerelease"": 1, ""created_at"": ""2020-09-08T17:48:24Z"", ""published_at"": ""2020-09-08T17:50:15Z"", ""body"": ""Basic API is in place, with CLI support for Google, Bing, MapQuest and Nominatum (OSM) geocoders."", ""repo"": { ""value"": 293361514, ""label"": ""geocode-sqlite"" }, ""reactions"": null, ""mentions_count"": null } ], ""next"": ""30926270"", ""primary_keys"": [ ""id"" ], ""columns"": [ ""html_url"", ""id"", ""author"", ""node_id"", ""tag_name"", ""target_commitish"", ""name"", ""draft"", ""prerelease"", ""created_at"", ""published_at"", ""body"", ""repo"", ""reactions"", ""mentions_count"" ], ""count"": 25, ""facet_results"": { ""results"": { ""author"": { ""name"": ""author"", ""type"": ""column"", ""hideable"": true, ""toggle_url"": ""/content/releases?author=25778&_size=1&_extra=count%2Cprimary_keys%2Ccolumns"", ""results"": [ { ""value"": 25778, ""label"": ""eyeseast"", ""count"": 25, ""toggle_url"": ""http://127.0.0.1:8001/content/releases?_size=1&_extra=count%2Cprimary_keys%2Ccolumns&_facet=author"", ""selected"": true } ], ""truncated"": false } }, ""timed_out"": [] } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1399178591,https://api.github.com/repos/simonw/datasette/issues/262,1399178591,IC_kwDOBm6k_c5TZcVf,9599,simonw,2023-01-21T04:53:15Z,2023-01-21T04:53:15Z,OWNER,"Implementing this to work with the `.json` extension is going to be a lot harder. The challenge here is that we're working with the whole `BaseView()` v.s. `TableView()` abstraction, which I've been wanting to get rid of for a long time. `BaseView()` calls `.data()` and expects to get back a `(data, extra_template_data, templates)` tuple - then if a format is in play (`.json` or `.geojson` or similar from a plugin) it hands off `data` to that. If `.csv` is involved it does something special, in order to support streaming responses. And if it's regular HTML it calls `await extra_template_data()` and combines that with `data` and passes it to the template. I want this to work completely differently: I want the formats (including HTML) to have the option of adding some extra `?_extra=` extras, then I want HTML to be able to render the page entirely from the JSON if necessary.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1399178823,https://api.github.com/repos/simonw/datasette/issues/262,1399178823,IC_kwDOBm6k_c5TZcZH,9599,simonw,2023-01-21T04:54:49Z,2023-01-21T04:54:49Z,OWNER,"I pushed my prototype so far, going to start a draft PR for it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1399184540,https://api.github.com/repos/simonw/datasette/issues/262,1399184540,IC_kwDOBm6k_c5TZdyc,9599,simonw,2023-01-21T05:35:32Z,2023-01-21T05:35:32Z,OWNER,"It's annoying that the https://docs.datasette.io/en/0.64.1/plugin_hooks.html#register-output-renderer-datasette plugin hook passes `rows` as ""list of sqlite3.Row objects"" - I'd prefer it if that plugin hook worked with JSON data, not `sqlite3.Row`. https://docs.datasette.io/en/0.64.1/plugin_hooks.html#render-cell-row-value-column-table-database-datasette is documented as accepting `Row` but actually gets `CustomRow`, see: - #1973","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1399184642,https://api.github.com/repos/simonw/datasette/issues/262,1399184642,IC_kwDOBm6k_c5TZd0C,9599,simonw,2023-01-21T05:36:22Z,2023-01-21T05:41:06Z,OWNER,"Maybe `""rows""` should be a default `?_extra=`... but it should be possible to request `""arrays""` instead which would be a list of arrays, more suitable perhaps for custom renderers such as the CSV one. This could be quite neat, in that EVERY key in the JSON representation would be defined as an extra - just some would be on by default. There could even be a mechanism for turning them back off again, maybe using `?_extra=-rows`. In which case maybe `?_extra=` isn't actually the right name for this feature. It could be `?_key=` perhaps, or `?_field=`. Being able to pass `?_field=count,-rows` to get back just the count (and skip executing the count entirely) would be pretty neat. Although `?_only=count` would be tidier. So maybe the pair of `?_only=` and `?_extra=` would make sense. Would `?_only=rows` still return the `""ok""` field so you can always look at that to confirm an error didn't occur?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1404253358,https://api.github.com/repos/simonw/datasette/issues/262,1404253358,IC_kwDOBm6k_c5TszSu,9599,simonw,2023-01-25T21:35:32Z,2023-01-25T21:35:32Z,OWNER,"This issue here would benefit from some kid of mechanism for returning just the HTML of the table itself, without any of the surrounding material. I'm not sure if that would make sense as an extra or not: - https://github.com/simonw/datasette-search-all/issues/17","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1418288327,https://api.github.com/repos/simonw/datasette/issues/262,1418288327,IC_kwDOBm6k_c5UiVzH,9599,simonw,2023-02-05T22:57:58Z,2023-02-06T23:01:15Z,OWNER,"I think that does make sense: `?_extra=table` perhaps, which would add `{""table"": ""...""}`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1423067724,https://api.github.com/repos/simonw/datasette/issues/262,1423067724,IC_kwDOBm6k_c5U0kpM,9599,simonw,2023-02-08T18:33:32Z,2023-02-08T18:36:48Z,OWNER,"Just realized that it's useful to be able to tell what parameters were used to generate a page... but reflecting things like `_next` back in the JSON is confusing in the presence of `next`. So I'm going to add an extra for that information too. Not sure what to call it though: - `params` - confusing because in the code that's usually used for params passed to SQL queries - `query_string` - wouldn't that be a string, not params as a dictionary? I'm going to experiment with a `request` extra that returns some bits of information about the request.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1480355670,https://api.github.com/repos/simonw/datasette/issues/262,1480355670,IC_kwDOBm6k_c5YPG9W,9599,simonw,2023-03-22T22:50:30Z,2023-03-22T22:50:30Z,OWNER,"I just landed this PR so this feature is now in `main`: - #1999 Still needs documentation and maybe some extra tests too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/262#issuecomment-1488010837,https://api.github.com/repos/simonw/datasette/issues/262,1488010837,IC_kwDOBm6k_c5YsT5V,9599,simonw,2023-03-29T06:22:21Z,2023-03-29T06:22:21Z,OWNER,I need to get the arbitrary query page to return the same format. It likely won't have nearly as many extras.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/263#issuecomment-389563719,https://api.github.com/repos/simonw/datasette/issues/263,389563719,MDEyOklzc3VlQ29tbWVudDM4OTU2MzcxOQ==,9599,simonw,2018-05-16T15:34:46Z,2018-05-16T15:34:46Z,OWNER,The underlying mechanics for the `_extras` mechanism described in #262 may help with this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323671577,Facets should not execute for ?shape=array|object, https://github.com/simonw/datasette/issues/263#issuecomment-735960132,https://api.github.com/repos/simonw/datasette/issues/263,735960132,MDEyOklzc3VlQ29tbWVudDczNTk2MDEzMg==,9599,simonw,2020-11-30T18:25:17Z,2020-11-30T18:25:17Z,OWNER,Fixing this would unblock this issue for switching `datasette-graphql` to using `datasette.client` internally: https://github.com/simonw/datasette-graphql/issues/61,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323671577,Facets should not execute for ?shape=array|object, https://github.com/simonw/datasette/issues/263#issuecomment-852256784,https://api.github.com/repos/simonw/datasette/issues/263,852256784,MDEyOklzc3VlQ29tbWVudDg1MjI1Njc4NA==,9599,simonw,2021-06-01T16:20:09Z,2021-06-01T16:20:09Z,OWNER,This is a lot easier to implement now that we have a `?_nofacet=1` option from #1350.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323671577,Facets should not execute for ?shape=array|object, https://github.com/simonw/datasette/issues/264#issuecomment-390105943,https://api.github.com/repos/simonw/datasette/issues/264,390105943,MDEyOklzc3VlQ29tbWVudDM5MDEwNTk0Mw==,9599,simonw,2018-05-18T06:18:00Z,2018-05-18T06:18:00Z,OWNER,Docs: http://datasette.readthedocs.io/en/latest/limits.html#default-facet-size,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323673899,Make it possible to customize various facet settings, https://github.com/simonw/datasette/issues/265#issuecomment-389566147,https://api.github.com/repos/simonw/datasette/issues/265,389566147,MDEyOklzc3VlQ29tbWVudDM4OTU2NjE0Nw==,9599,simonw,2018-05-16T15:41:42Z,2018-05-16T15:41:42Z,OWNER,"An official demo instance of Datasette dedicated to this use-case would be useful, especially if it was automatically deployed by Travis for every commit to master that passes the tests. Maybe there should be a permanent version of it deployed for each released version too?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323677499,Add links to example Datasette instances to appropiate places in docs, https://github.com/simonw/datasette/issues/265#issuecomment-392890045,https://api.github.com/repos/simonw/datasette/issues/265,392890045,MDEyOklzc3VlQ29tbWVudDM5Mjg5MDA0NQ==,231923,yschimke,2018-05-29T18:37:49Z,2018-05-29T18:37:49Z,NONE,"Just about to ask for this! Move this page https://github.com/simonw/datasette/wiki/Datasettes into a datasette, with some concept of versioning as well.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323677499,Add links to example Datasette instances to appropiate places in docs, https://github.com/simonw/datasette/issues/265#issuecomment-393003340,https://api.github.com/repos/simonw/datasette/issues/265,393003340,MDEyOklzc3VlQ29tbWVudDM5MzAwMzM0MA==,9599,simonw,2018-05-30T01:44:22Z,2018-05-30T01:44:22Z,OWNER,Funny you should mention that... I'm planning on doing that as part of the official Datasette website at some point soon. A Datasette instance that lists other Datasette instances feels pleasingly appropriate.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323677499,Add links to example Datasette instances to appropiate places in docs, https://github.com/simonw/datasette/issues/265#issuecomment-393064224,https://api.github.com/repos/simonw/datasette/issues/265,393064224,MDEyOklzc3VlQ29tbWVudDM5MzA2NDIyNA==,9599,simonw,2018-05-30T07:48:37Z,2018-05-30T07:48:37Z,OWNER,"https://datasette-registry.now.sh Is now live, powered by https://github.com/simonw/datasette-registry - still needs plenty of work but it's an interesting start.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323677499,Add links to example Datasette instances to appropiate places in docs, https://github.com/simonw/datasette/issues/265#issuecomment-398102537,https://api.github.com/repos/simonw/datasette/issues/265,398102537,MDEyOklzc3VlQ29tbWVudDM5ODEwMjUzNw==,9599,simonw,2018-06-18T15:52:15Z,2018-06-18T15:52:15Z,OWNER,https://latest.datasette.io/ now always hosts the latest version of the code. I've started linking to it from our documentation.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323677499,Add links to example Datasette instances to appropiate places in docs, https://github.com/simonw/datasette/issues/266#issuecomment-389570841,https://api.github.com/repos/simonw/datasette/issues/266,389570841,MDEyOklzc3VlQ29tbWVudDM4OTU3MDg0MQ==,9599,simonw,2018-05-16T15:54:49Z,2018-06-15T07:41:09Z,OWNER,"At the most basic level, this will work based on an extension. Most places you currently put a `.json` extension should also allow a `.csv` extension. By default this will return the exact results you see on the current page (default max will remain 1000). ## Streaming all records Where things get interested is *streaming mode*. This will be an option which returns ALL matching records as a streaming CSV file, even if that ends up being millions of records. I think the best way to build this will be on top of the existing mechanism used to efficiently implement keyset pagination via `_next=` tokens. ## Expanding foreign keys For tables with foreign key references it would be useful if the CSV format could expand those references to include the labels from `label_column` - maybe via an additional `?_expand=1` option. When expanding each foreign key column will be shown twice: rowid,city_id,city_id_label,state","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389572201,https://api.github.com/repos/simonw/datasette/issues/266,389572201,MDEyOklzc3VlQ29tbWVudDM4OTU3MjIwMQ==,9599,simonw,2018-05-16T15:58:43Z,2018-05-16T16:00:47Z,OWNER,"This will likely be implemented in the `BaseView` class, which needs to know how to spot the `.csv` extension, call the underlying JSON generating function and then return the `columns` and `rows` as correctly formatted CSV. https://github.com/simonw/datasette/blob/9959a9e4deec8e3e178f919e8b494214d5faa7fd/datasette/views/base.py#L201-L207 This means it will take ALL arguments that are available to the `.json` view. It may ignore some (e.g. `_facet=` makes no sense since CSV tables don't have space to show the facet results). In streaming mode, things will behave a little bit differently - in particular, if `_stream=1` then `_next=` will be forbidden. It can't include a length header because we don't know how many bytes it will be CSV output will throw an error if the endpoint doesn't have rows and columns keys eg `/-/inspect.json` So the implementation... - looks for the `.csv` extension - internally fetches the `.json` data instead - If no `_stream` it just transposes that JSON to CSV with the correct content type header - If `_stream=1` - checks for `_next=` and throws an error if it was provided - Otherwise... fetch first page and emit CSV header and first set of rows - Then start async looping, emitting more CSV rows and following the `_next=` internal reference until done I like that this takes advantage of efficient pagination. It may not work so well for views which use offset/limit though. It won't work at all for custom SQL because custom SQL doesn't support _next= pagination. That's fine. For views... easiest fix is to cut off after first X000 records. That seems OK. View JSON would need to include a property that the mechanism can identify.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389579363,https://api.github.com/repos/simonw/datasette/issues/266,389579363,MDEyOklzc3VlQ29tbWVudDM4OTU3OTM2Mw==,9599,simonw,2018-05-16T16:20:06Z,2018-05-16T16:20:06Z,OWNER,I started a thread on Twitter discussing various CSV output dialects: https://twitter.com/simonw/status/996783395504979968 - I want to pick defaults which will work as well as possible for whatever tools people might be using to consume the data.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389579762,https://api.github.com/repos/simonw/datasette/issues/266,389579762,MDEyOklzc3VlQ29tbWVudDM4OTU3OTc2Mg==,9599,simonw,2018-05-16T16:21:12Z,2018-05-16T16:21:12Z,OWNER,"> I basically want someone to tell me which arguments I can pass to Python's csv.writer() function that will result in the least complaints from people who try to parse the results :) https://twitter.com/simonw/status/996786815938977792","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389592566,https://api.github.com/repos/simonw/datasette/issues/266,389592566,MDEyOklzc3VlQ29tbWVudDM4OTU5MjU2Ng==,9599,simonw,2018-05-16T17:01:29Z,2018-05-16T17:02:21Z,OWNER,Let's provide a CSV Dialect definition too: https://frictionlessdata.io/specs/csv-dialect/ - via https://twitter.com/drewdaraabrams/status/996794915680997382,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389608473,https://api.github.com/repos/simonw/datasette/issues/266,389608473,MDEyOklzc3VlQ29tbWVudDM4OTYwODQ3Mw==,9599,simonw,2018-05-16T17:52:35Z,2018-05-16T17:54:11Z,OWNER,"There are some code examples in this issue which should help with the streaming part: https://github.com/channelcat/sanic/issues/1067 Also https://github.com/channelcat/sanic/blob/master/docs/sanic/streaming.md#response-streaming","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389626715,https://api.github.com/repos/simonw/datasette/issues/266,389626715,MDEyOklzc3VlQ29tbWVudDM4OTYyNjcxNQ==,9599,simonw,2018-05-16T18:50:46Z,2018-05-16T18:50:46Z,OWNER,"> I’d recommend using the Windows-1252 encoding for maximum compatibility, unless you have any characters not in that set, in which case use UTF8 with a byte order mark. Bit of a pain, but some progams (eg various versions of Excel) don’t read UTF8. **frankieroberto** https://twitter.com/frankieroberto/status/996823071947460616 > There is software that consumes CSV and doesn't speak UTF8!? Huh. Well I can't just use Windows-1252 because I need to support the full UTF8 range of potential data - maybe I should support an optional ?_encoding=windows-1252 argument **simonw** https://twitter.com/simonw/status/996824677245857793","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389893810,https://api.github.com/repos/simonw/datasette/issues/266,389893810,MDEyOklzc3VlQ29tbWVudDM4OTg5MzgxMA==,9599,simonw,2018-05-17T14:49:35Z,2018-05-17T14:49:35Z,OWNER,Idea: add a `supports_csv = False` property to `BaseView` and over-ride it to `True` just on the view classes that should support CSV (Table and Row). Slight subtlety: the `DatabaseView` class only supports CSV in the `custom_sql()` path. Maybe that needs to be refactored a bit.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389894382,https://api.github.com/repos/simonw/datasette/issues/266,389894382,MDEyOklzc3VlQ29tbWVudDM4OTg5NDM4Mg==,9599,simonw,2018-05-17T14:51:13Z,2018-05-17T14:53:23Z,OWNER,"I should definitely sanity check if the `_next=` route really is the most efficient way to build this. It may turn out that iterating over a SQLite cursor with a million rows in it is super-efficient and would provide much more reliable performance (plus solve the problem for retrieving full custom SQL queries where we can't do keyset pagination). Problem here is that we run SQL queries in a thread pool. A query that returns millions of rows would presumably tie up a SQL thread until it has finished, which could block the server. This may be a reason to stick with `_next=` keyset pagination - since it ensures each SQL thread yields back again after each 1,000 rows.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-393020749,https://api.github.com/repos/simonw/datasette/issues/266,393020749,MDEyOklzc3VlQ29tbWVudDM5MzAyMDc0OQ==,9599,simonw,2018-05-30T03:42:54Z,2018-05-30T03:42:54Z,OWNER,"Challenge: how to deal with tables where the name ends in `.csv`? I actually have one of these in the test suite at the moment: https://github.com/simonw/datasette/blob/d69ebce53385b7c6fafb85fdab3b136dbf3f332c/tests/fixtures.py#L234-L237","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-394417567,https://api.github.com/repos/simonw/datasette/issues/266,394417567,MDEyOklzc3VlQ29tbWVudDM5NDQxNzU2Nw==,9599,simonw,2018-06-04T16:30:48Z,2018-06-04T16:32:55Z,OWNER,"When serving streaming responses, I need to check that a large CSV file doesn't completely max out the CPU in a way that is harmful to the rest of the instance. If it does, one option may be to insert an async sleep call in between each chunk that is streamed back. This could be controlled by a `csv_pause_ms` config setting, defaulting to maybe 5 but can be disabled entirely by setting to 0. That's only if testing proves that this is a necessary mechanism.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397534196,https://api.github.com/repos/simonw/datasette/issues/266,397534196,MDEyOklzc3VlQ29tbWVudDM5NzUzNDE5Ng==,9599,simonw,2018-06-15T07:12:16Z,2018-06-15T07:12:16Z,OWNER,"The first version of this is now shipped to master. I ended up rewriting most of the experimental branch to deal with the nasty issue described in #303 Demo is available on https://fivethirtyeight.datasettes.com/fivethirtyeight-ab24e01/most-common-name%2Fsurnames ![2018-06-15 at 12 11 am](https://user-images.githubusercontent.com/9599/41455090-bd5ece30-7030-11e8-8da4-11fbb1f2ef8b.png) Here's the CSV version of that page: https://fivethirtyeight.datasettes.com/fivethirtyeight-ab24e01/most-common-name%2Fsurnames.csv","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397534404,https://api.github.com/repos/simonw/datasette/issues/266,397534404,MDEyOklzc3VlQ29tbWVudDM5NzUzNDQwNA==,9599,simonw,2018-06-15T07:13:20Z,2018-06-15T07:13:20Z,OWNER,"Still to add: the streaming version that iterates through all of the pages, as seen in experimental commit https://github.com/simonw/datasette/commit/ced379ea325787b8c3bf0a614daba1fa4856a3bd","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397534498,https://api.github.com/repos/simonw/datasette/issues/266,397534498,MDEyOklzc3VlQ29tbWVudDM5NzUzNDQ5OA==,9599,simonw,2018-06-15T07:13:52Z,2018-06-15T07:13:52Z,OWNER,Also needs documentation.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397729945,https://api.github.com/repos/simonw/datasette/issues/266,397729945,MDEyOklzc3VlQ29tbWVudDM5NzcyOTk0NQ==,9599,simonw,2018-06-15T20:13:05Z,2018-06-15T20:13:05Z,OWNER,"The ""This data as ..."" area of the page is getting a bit untidy, especially if I'm going to add other download options in the future. I think I'll move the HTML to the page footer (less concerns about taking up lots of space there) and then have a bit of JavaScript that turns it into a show/hide menu of some sort in its current location.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397842246,https://api.github.com/repos/simonw/datasette/issues/266,397842246,MDEyOklzc3VlQ29tbWVudDM5Nzg0MjI0Ng==,9599,simonw,2018-06-16T22:27:59Z,2018-06-16T22:27:59Z,OWNER,"Two demos of the new functionality in #233 as it applies to CSV: * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.csv?_labels=on - CSV with all foreign key columns expanded * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.csv?_label=qSpecies - CSV with specific columns expanded","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397842667,https://api.github.com/repos/simonw/datasette/issues/266,397842667,MDEyOklzc3VlQ29tbWVudDM5Nzg0MjY2Nw==,9599,simonw,2018-06-16T22:38:15Z,2018-06-18T05:55:11Z,OWNER,"Still todo: - [x] Streaming version - [ ] Tidy up the ""This data as ..."" UI - [x] Default .csv (and .json) links to use `?_labels=on` (only if at least one foreign key detected) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397912840,https://api.github.com/repos/simonw/datasette/issues/266,397912840,MDEyOklzc3VlQ29tbWVudDM5NzkxMjg0MA==,9599,simonw,2018-06-17T23:13:35Z,2018-06-17T23:16:42Z,OWNER,"This worked! https://github.com/simonw/datasette/commit/5a0a82faf9cf9dd109d76181ed00eea19472087c - it spat out a 76MB CSV when I ran it against the sf-trees demo database. It was just a quick hack though - it currently ignores `_labels=` and `_dl=` which need to be supported. I'm going to add a config option for turning full CSV export off just in case any Datasette users are uncomfortable with URLs that churn out that much data in one go. ``` ConfigOption(""allow_csv_stream"", True, """""" Allow .csv?_stream=1 to download all rows (ignoring max_returned_rows) """""".strip()), ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397915258,https://api.github.com/repos/simonw/datasette/issues/266,397915258,MDEyOklzc3VlQ29tbWVudDM5NzkxNTI1OA==,9599,simonw,2018-06-18T00:01:05Z,2018-06-18T00:01:05Z,OWNER,Someone malicious could use a UNION to generate an unpleasantly large CSV response. I'll add another config setting which limits the response size to 100MB but can be turned off by setting it to 0.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397915403,https://api.github.com/repos/simonw/datasette/issues/266,397915403,MDEyOklzc3VlQ29tbWVudDM5NzkxNTQwMw==,9599,simonw,2018-06-18T00:03:17Z,2018-06-18T00:14:37Z,OWNER,"Since CSV streaming export doesn't work for custom SQL queries (since they don't support `_next=` pagination) there's no need to provide a option that disables streams just for custom SQL. Related: the UI should not show the option to download everything on custom SQL pages.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397916091,https://api.github.com/repos/simonw/datasette/issues/266,397916091,MDEyOklzc3VlQ29tbWVudDM5NzkxNjA5MQ==,9599,simonw,2018-06-18T00:13:43Z,2018-06-18T00:15:50Z,OWNER,I was also worried about the performance of pagination over custom `_sort` orders or views which use offset pagination - but Datasette's SQL time limits should prevent those from getting out of hand. This does mean that a streaming CSV file may be truncated with an error - if this happens we should ensure the error is written out as the last line of the CSV so anyone who tried to import it gets a relevant error message informing them that the export did not complete.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397916321,https://api.github.com/repos/simonw/datasette/issues/266,397916321,MDEyOklzc3VlQ29tbWVudDM5NzkxNjMyMQ==,9599,simonw,2018-06-18T00:17:44Z,2018-06-18T00:18:05Z,OWNER,The export UI could be a GET form controlling various parameters. This would discourage crawlers from hitting the export links and would also allow us to express the full range of export options.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397918264,https://api.github.com/repos/simonw/datasette/issues/266,397918264,MDEyOklzc3VlQ29tbWVudDM5NzkxODI2NA==,9599,simonw,2018-06-18T00:49:35Z,2018-06-18T00:49:35Z,OWNER,"Simpler design: the top of the page will link to basic .json and .csv and ""advanced"" - which will fragment link to an advanced export format the bottom of the page.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397923253,https://api.github.com/repos/simonw/datasette/issues/266,397923253,MDEyOklzc3VlQ29tbWVudDM5NzkyMzI1Mw==,9599,simonw,2018-06-18T01:49:52Z,2018-06-18T03:02:28Z,OWNER,Ideally the downloadable filenames of exported CSVs would differ across different querystring parameters. Maybe S`treet_Trees-56cbd54.csv` where `56cbd54` is a hash of the querystring?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397949002,https://api.github.com/repos/simonw/datasette/issues/266,397949002,MDEyOklzc3VlQ29tbWVudDM5Nzk0OTAwMg==,9599,simonw,2018-06-18T05:53:17Z,2018-06-18T05:53:17Z,OWNER,"Advanced export pane: ![2018-06-17 at 10 52 pm](https://user-images.githubusercontent.com/9599/41520166-3809a45a-7281-11e8-9dfa-2b10f4cb9672.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397952129,https://api.github.com/repos/simonw/datasette/issues/266,397952129,MDEyOklzc3VlQ29tbWVudDM5Nzk1MjEyOQ==,9599,simonw,2018-06-18T06:15:36Z,2018-06-18T06:15:51Z,OWNER,Advanced export pane demo: https://latest.datasette.io/fixtures-35b6eb6/facetable?_size=4,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-398098582,https://api.github.com/repos/simonw/datasette/issues/266,398098582,MDEyOklzc3VlQ29tbWVudDM5ODA5ODU4Mg==,9599,simonw,2018-06-18T15:40:32Z,2018-06-18T15:40:32Z,OWNER,This is now released in Datasette 0.23! http://datasette.readthedocs.io/en/latest/changelog.html#v0-23,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/267#issuecomment-392121905,https://api.github.com/repos/simonw/datasette/issues/267,392121905,MDEyOklzc3VlQ29tbWVudDM5MjEyMTkwNQ==,9599,simonw,2018-05-25T17:08:14Z,2018-05-25T17:08:14Z,OWNER,See also #286,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323716411,"Documentation for URL hashing, redirects and cache policy", https://github.com/simonw/datasette/issues/267#issuecomment-414860009,https://api.github.com/repos/simonw/datasette/issues/267,414860009,MDEyOklzc3VlQ29tbWVudDQxNDg2MDAwOQ==,78156,annapowellsmith,2018-08-21T23:57:51Z,2018-08-21T23:57:51Z,NONE,"Looks to me like hashing, redirects and caching were documented as part of https://github.com/simonw/datasette/commit/788a542d3c739da5207db7d1fb91789603cdd336#diff-3021b0e065dce289c34c3b49b3952a07 - so perhaps this can be closed? :tada:","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323716411,"Documentation for URL hashing, redirects and cache policy", https://github.com/simonw/datasette/issues/267#issuecomment-504879082,https://api.github.com/repos/simonw/datasette/issues/267,504879082,MDEyOklzc3VlQ29tbWVudDUwNDg3OTA4Mg==,9599,simonw,2019-06-24T06:41:02Z,2019-06-24T06:41:02Z,OWNER,Yes this is definitely documented now https://datasette.readthedocs.io/en/stable/performance.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323716411,"Documentation for URL hashing, redirects and cache policy", https://github.com/simonw/datasette/issues/268#issuecomment-504880796,https://api.github.com/repos/simonw/datasette/issues/268,504880796,MDEyOklzc3VlQ29tbWVudDUwNDg4MDc5Ng==,9599,simonw,2019-06-24T06:47:23Z,2019-06-24T06:47:23Z,OWNER,I did a bunch of research relevant to this a while ago: https://simonwillison.net/2019/Jan/7/exploring-search-relevance-algorithms-sqlite/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323718842,Mechanism for ranking results from SQLite full-text search, https://github.com/simonw/datasette/issues/268#issuecomment-675725464,https://api.github.com/repos/simonw/datasette/issues/268,675725464,MDEyOklzc3VlQ29tbWVudDY3NTcyNTQ2NA==,9599,simonw,2020-08-18T21:18:07Z,2020-08-18T21:18:35Z,OWNER,"I want this on the table page - but that means that the table page will need to run a slightly more complex query since it needs access to a `rank` column to sort by - which it gets from running a join. BUT... that join needs to be constructed in a way that keeps existing filters, `?_where=` clauses etc intact. Here's a prototype using SQLite CTEs: https://register-of-members-interests.datasettes.com/regmem?sql=with+original+as+%28select+rowid%2C+*+from+items%29%0D%0Aselect%0D%0A++original.*%2C%0D%0A++items_fts.rank+as+items_fts_rank%0D%0Afrom%0D%0A++original+join+items_fts+on+original.rowid+%3D+items_fts.rowid%0D%0Awhere%0D%0A++items_fts+match+escape_fts%28%3Asearch%29%0D%0Aorder+by+items_fts_rank+desc+limit+10&search=hotel ```sql with original as ( select rowid, * from items ) select original.*, items_fts.rank as items_fts_rank from original join items_fts on original.rowid = items_fts.rowid where items_fts match escape_fts(:search) order by items_fts_rank desc limit 10 ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323718842,Mechanism for ranking results from SQLite full-text search, https://github.com/simonw/datasette/issues/268#issuecomment-721896822,https://api.github.com/repos/simonw/datasette/issues/268,721896822,MDEyOklzc3VlQ29tbWVudDcyMTg5NjgyMg==,9599,simonw,2020-11-04T18:23:29Z,2020-11-04T18:23:29Z,OWNER,"Worth noting that joining to get the rank works for FTS5 but not for FTS4 - see comment here: https://github.com/simonw/sqlite-utils/issues/192#issuecomment-721420539 Easiest solution would be to only support sort-by-rank for FTS5 tables. Alternative would be to depend on https://github.com/simonw/sqlite-fts4","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323718842,Mechanism for ranking results from SQLite full-text search, https://github.com/simonw/datasette/issues/268#issuecomment-723740546,https://api.github.com/repos/simonw/datasette/issues/268,723740546,MDEyOklzc3VlQ29tbWVudDcyMzc0MDU0Ng==,9599,simonw,2020-11-09T04:01:50Z,2020-11-09T04:01:50Z,OWNER,I should depend on `sqlite-fts4` - I'm doing that in `sqlite-utils` now and it works great: https://github.com/simonw/sqlite-utils/issues/198,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323718842,Mechanism for ranking results from SQLite full-text search, https://github.com/simonw/datasette/issues/268#issuecomment-726419027,https://api.github.com/repos/simonw/datasette/issues/268,726419027,MDEyOklzc3VlQ29tbWVudDcyNjQxOTAyNw==,9599,simonw,2020-11-13T00:09:04Z,2020-11-13T00:09:04Z,OWNER,Part of the challenge here is that this is the first time the `TableView` will have had a complete rewrite of the SQL it is going to execute. That SQL is currently constructed here: https://github.com/simonw/datasette/blob/5eb8e9bf250b26e30b017d39a392c33973997656/datasette/views/table.py#L628-L636,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323718842,Mechanism for ranking results from SQLite full-text search, https://github.com/simonw/datasette/issues/268#issuecomment-789409126,https://api.github.com/repos/simonw/datasette/issues/268,789409126,MDEyOklzc3VlQ29tbWVudDc4OTQwOTEyNg==,649467,mhalle,2021-03-03T03:57:15Z,2021-03-03T03:58:40Z,NONE,"In FTS5, I think doing an FTS search is actually much easier than doing a join against the main table like datasette does now. In fact, FTS5 external content tables provide a transparent interface back to the original table or view. Here's what I'm currently doing: * build a view that joins whatever tables I want and rename the columns to non-joiny names (e.g, `chapter.name AS chapter_name` in the view where needed) * Create an FTS5 table with `content=""viewname""` * As described in the ""external content tables"" section (https://www.sqlite.org/fts5.html#external_content_tables), sql queries can be made directly to the FTS table, which behind the covers makes select calls to the content table when the content of the original columns are needed. * In addition, you get ""rank"" and ""bm25()"" available to you when you select on the _fts table. Unfortunately, datasette doesn't currently seem happy being coerced into doing a real query on an fts5 table. This works: ```select col1, col2, col3 from table_fts where coll1=""value"" and table_fts match escape_fts(""search term"") order by rank``` But this doesn't work in the datasette SQL query interface: ```select col1, col2, col3 from table_fts where coll1=""value"" and table_fts match escape_fts(:search) order by rank``` (the ""search"" input text field doesn't show up) For what datasette is doing right now, I think you could just use contentless fts5 tables (`content=""""`), since all you care about is the rowid since all you're doing a subselect to get the rowid anyway. In fts5, that's just a contentless table. I guess if you want to follow this suggestion, you'd need a somewhat different code path for fts5. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323718842,Mechanism for ranking results from SQLite full-text search, https://github.com/simonw/datasette/issues/268#issuecomment-790257263,https://api.github.com/repos/simonw/datasette/issues/268,790257263,MDEyOklzc3VlQ29tbWVudDc5MDI1NzI2Mw==,649467,mhalle,2021-03-04T03:20:23Z,2021-03-04T03:20:23Z,NONE,"It's kind of an ugly hack, but you can try out what using the fts5 table as an actual datasette-accessible table looks like without changing any datasette code by creating yet another view on top of the fts5 table: `create view proxyview as select *, rank, table_fts as fts from table_fts;` That's now visible from datasette, just like any other view, but you can use `fts match escape_fts(search_string) order by rank`. This is only good as a proof of concept because you're inefficiently going from view -> fts5 external content table -> view -> data table. However, it does show it works.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323718842,Mechanism for ranking results from SQLite full-text search, https://github.com/simonw/datasette/issues/268#issuecomment-876428348,https://api.github.com/repos/simonw/datasette/issues/268,876428348,MDEyOklzc3VlQ29tbWVudDg3NjQyODM0OA==,9308268,rayvoelker,2021-07-08T13:13:12Z,2021-07-08T13:13:12Z,NONE,"I had setup a full text search on my instance of Datasette for title data for our public library, and was noticing that some of the features of the SQLite FTS weren't working as expected ... and maybe the issue is in the `escape_fts()` function ![image](https://user-images.githubusercontent.com/9308268/124925900-f1ea8b00-dfca-11eb-895e-59cc083d6524.png) vs removing the function... ![image](https://user-images.githubusercontent.com/9308268/124925971-0464c480-dfcb-11eb-8fbf-8e9b5d6e0861.png) Also, on the issue of sorting by rank by default .. perhaps something like this could work for the baked-in default SQL query for Datasette? ![image](https://user-images.githubusercontent.com/9308268/124927191-5a863780-dfcc-11eb-9908-3f63577d5ff5.png) [link to the above search in my instance of Datasette](https://ilsweb.cincinnatilibrary.org/collection-analysis/current_collection-87a9011?sql=with+fts_search+as+%28%0D%0A++select%0D%0A++rowid%2C%0D%0A++rank%0D%0A++++from%0D%0A++++++bib_fts%0D%0A++++where%0D%0A++++++bib_fts+match+%3Asearch%0D%0A%29%0D%0A%0D%0Aselect%0D%0A++%0D%0A++bib_record_num%2C%0D%0A++creation_date%2C%0D%0A++record_last_updated%2C%0D%0A++isbn%2C%0D%0A++best_author%2C%0D%0A++best_title%2C%0D%0A++publisher%2C%0D%0A++publish_year%2C%0D%0A++bib_level_callnumber%2C%0D%0A++indexed_subjects%0D%0Afrom%0D%0A++fts_search%0D%0A++join+bib+on+bib.rowid+%3D+fts_search.rowid%0D%0A++%0D%0Aorder+by%0D%0Arank%0D%0A&search=black+death+NOT+fiction)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323718842,Mechanism for ranking results from SQLite full-text search, https://github.com/simonw/datasette/issues/268#issuecomment-876616414,https://api.github.com/repos/simonw/datasette/issues/268,876616414,MDEyOklzc3VlQ29tbWVudDg3NjYxNjQxNA==,9599,simonw,2021-07-08T17:29:04Z,2021-07-08T17:29:04Z,OWNER,"> I had setup a full text search on my instance of Datasette for title data for our public library, and was noticing that some of the features of the SQLite FTS weren't working as expected ... and maybe the issue is in the `escape_fts()` function That's a deliberate feature (albeit controversial, see #759) - part of the main problem here is that it's easy to construct a SQLite full-text search string which results in a database error. This is a bad user-experience! You can opt-in to raw SQL queries by appending `?_searchmode=raw` to the page, see https://docs.datasette.io/en/stable/full_text_search.html#advanced-sqlite-search-queries But maybe there should be an option for turning that on by default without needing the query string? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323718842,Mechanism for ranking results from SQLite full-text search, https://github.com/simonw/datasette/issues/268#issuecomment-876721585,https://api.github.com/repos/simonw/datasette/issues/268,876721585,MDEyOklzc3VlQ29tbWVudDg3NjcyMTU4NQ==,9308268,rayvoelker,2021-07-08T20:22:17Z,2021-07-08T20:22:17Z,NONE,"I do like the idea of there being a option for turning that on by default so that you could use those terms in the default ""Search"" bar presented when you browse to a table where FTS has been enabled. Maybe even a small inline pop up with a short bit explaining the FTS feature and the keywords (e.g. case matters). What are the side-effects of turning that on in the query string, or even by default as you suggested? I see that you stated in the docs... ""to ensure they do not cause any confusion for users who are not aware of them"", but I'm not sure what those could be. Isn't it the case that those keywords are only picked up by sqlite in where you're using the MATCH clause? Seems like a really powerful feature (even though there are a lot of hurdles around setting it up in the sqlite db ... sqlite-utils makes that so simple by the way!)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323718842,Mechanism for ranking results from SQLite full-text search, https://github.com/simonw/datasette/issues/268#issuecomment-880150755,https://api.github.com/repos/simonw/datasette/issues/268,880150755,MDEyOklzc3VlQ29tbWVudDg4MDE1MDc1NQ==,9599,simonw,2021-07-14T19:26:47Z,2021-07-14T19:29:08Z,OWNER,"> What are the side-effects of turning that on in the query string, or even by default as you suggested? I see that you stated in the docs... ""to ensure they do not cause any confusion for users who are not aware of them"", but I'm not sure what those could be. Mainly that it's possible to generate SQL queries that crash with an error. This was the example that convinced me to default to escaping: - https://www.niche-museums.com/browse/museums?_search=park.&_searchmode=raw (returns `fts5: syntax error near "".""`) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323718842,Mechanism for ranking results from SQLite full-text search, https://github.com/simonw/datasette/issues/268#issuecomment-880153069,https://api.github.com/repos/simonw/datasette/issues/268,880153069,MDEyOklzc3VlQ29tbWVudDg4MDE1MzA2OQ==,9599,simonw,2021-07-14T19:31:00Z,2021-07-14T19:31:00Z,OWNER,"... though interestingly I can't replicate that error on `latest.datasette.io` - https://latest.datasette.io/fixtures/searchable?_search=park.&_searchmode=raw That's running https://latest.datasette.io/-/versions SQLite 3.35.4 whereas https://www.niche-museums.com/-/versions is running 3.27.2 (the most recent version available with Vercel) - but there's nothing in the SQLite changelog between those two versions that suggests changes to how the FTS5 parser works. https://www.sqlite.org/changes.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323718842,Mechanism for ranking results from SQLite full-text search, https://github.com/simonw/datasette/issues/270#issuecomment-390105147,https://api.github.com/repos/simonw/datasette/issues/270,390105147,MDEyOklzc3VlQ29tbWVudDM5MDEwNTE0Nw==,9599,simonw,2018-05-18T06:13:07Z,2018-05-18T06:13:07Z,OWNER,I'm going to add a `/-/limits` page that shows the current limits.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323830051,--limit= CLI option for setting limits, https://github.com/simonw/datasette/issues/271#issuecomment-389989015,https://api.github.com/repos/simonw/datasette/issues/271,389989015,MDEyOklzc3VlQ29tbWVudDM4OTk4OTAxNQ==,9599,simonw,2018-05-17T19:54:10Z,2018-05-17T19:54:10Z,OWNER,"This is a departure from how Datasette has been designed so far, and it may turn out that it's not feasible or it requires too many philosophical changes to be worthwhile. If we CAN do it though it would mean Datasette could stay running pointed at a directory on disk and new SQLite databases could be dropped into that directory by another process and served directly as they become available.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324162476,Mechanism for automatically picking up changes when on-disk .db file changes, https://github.com/simonw/datasette/issues/271#issuecomment-389989615,https://api.github.com/repos/simonw/datasette/issues/271,389989615,MDEyOklzc3VlQ29tbWVudDM4OTk4OTYxNQ==,9599,simonw,2018-05-17T19:56:13Z,2018-05-17T19:56:13Z,OWNER,"From https://www.sqlite.org/c3ref/open.html > **immutable**: The immutable parameter is a boolean query parameter that indicates that the database file is stored on read-only media. When immutable is set, SQLite assumes that the database file cannot be changed, even by a process with higher privilege, and so the database is opened read-only and all locking and change detection is disabled. Caution: Setting the immutable property on a database file that does in fact change can result in incorrect query results and/or SQLITE_CORRUPT errors. See also: SQLITE_IOCAP_IMMUTABLE. So this would probably have to be a new mode, `datasette serve --detect-db-changes`, which no longer opens in immutable mode. Or maybe current behavior becomes not-the-default and you opt into it with `datasette serve --immutable`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324162476,Mechanism for automatically picking up changes when on-disk .db file changes, https://github.com/simonw/datasette/issues/271#issuecomment-398133924,https://api.github.com/repos/simonw/datasette/issues/271,398133924,MDEyOklzc3VlQ29tbWVudDM5ODEzMzkyNA==,9599,simonw,2018-06-18T17:32:22Z,2018-06-18T17:32:22Z,OWNER,"As seen in #316 inspect is already taking a VERY long time to run against large (600GB) databases. To get this working I may have to make inspect an optional optimization and run introspection for columns and primary keys in demand. The one catch here is the `count(*)` queries - Datasette may need to learn not to return full table counts in circumstances where the count has not been pre-calculates and takes more than Xms to generate.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324162476,Mechanism for automatically picking up changes when on-disk .db file changes, https://github.com/simonw/datasette/issues/271#issuecomment-453262703,https://api.github.com/repos/simonw/datasette/issues/271,453262703,MDEyOklzc3VlQ29tbWVudDQ1MzI2MjcwMw==,9599,simonw,2019-01-10T21:35:18Z,2019-01-10T21:35:18Z,OWNER,It turns out this was much easier to support than I expected: https://github.com/simonw/datasette/commit/eac08f0dfc61a99e8887442fc247656d419c76f8,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324162476,Mechanism for automatically picking up changes when on-disk .db file changes, https://github.com/simonw/datasette/issues/272#issuecomment-391011268,https://api.github.com/repos/simonw/datasette/issues/272,391011268,MDEyOklzc3VlQ29tbWVudDM5MTAxMTI2OA==,9599,simonw,2018-05-22T14:28:12Z,2018-05-22T14:28:12Z,OWNER,"I think I can do this almost entirely within my existing BaseView class structure. First, decouple the async data() methods by teaching them to take a querystring object as an argument instead of a Sanic request object. The get() method can then send that new object instead of a request. Next teach the base class how to obey the ASGI protocol. I should be able to get support for both Sanic and uvicorn/daphne working in the same codebase, which will make it easy to compare their performance. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-392118755,https://api.github.com/repos/simonw/datasette/issues/272,392118755,MDEyOklzc3VlQ29tbWVudDM5MjExODc1NQ==,9599,simonw,2018-05-25T16:56:40Z,2018-06-05T16:01:13Z,OWNER,"Thinking about this further, maybe I should embrace ASGI turtles-all-the-way-down and teach each datasette view class to take a scope to the constructor and act entirely as an ASGI component. Would be a nice way of diving deep into ASGI and I can add utility helpers for things like querystring evaluation as I need them.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-394431323,https://api.github.com/repos/simonw/datasette/issues/272,394431323,MDEyOklzc3VlQ29tbWVudDM5NDQzMTMyMw==,9599,simonw,2018-06-04T17:17:37Z,2018-06-04T17:17:37Z,OWNER,I built this ASGI debugging tool to help with this migration: https://asgi-scope.now.sh/fivethirtyeight-34d6604/most-common-name%2Fsurnames.json?foo=bar&bazoeuto=onetuh&a=.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-394503399,https://api.github.com/repos/simonw/datasette/issues/272,394503399,MDEyOklzc3VlQ29tbWVudDM5NDUwMzM5OQ==,9599,simonw,2018-06-04T21:20:14Z,2018-06-04T21:20:14Z,OWNER,Results of an extremely simple micro-benchmark comparing the two shows that uvicorn is at least as fast as Sanic (benchmarks a little faster with a very simple payload): https://gist.github.com/simonw/418950af178c01c416363cc057420851,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-394764713,https://api.github.com/repos/simonw/datasette/issues/272,394764713,MDEyOklzc3VlQ29tbWVudDM5NDc2NDcxMw==,9599,simonw,2018-06-05T15:58:54Z,2018-06-05T16:00:40Z,OWNER,"https://github.com/encode/uvicorn/blob/572b5fe6c811b63298d5350a06b664839624c860/uvicorn/run.py#L63 is how you start a Uvicorn server from code as opposed to the `uvicorn` CLI from uvicorn.run import UvicornServer UvicornServer().run(app, host=host, port=port) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-400166540,https://api.github.com/repos/simonw/datasette/issues/272,400166540,MDEyOklzc3VlQ29tbWVudDQwMDE2NjU0MA==,9599,simonw,2018-06-26T03:29:43Z,2018-06-26T03:29:43Z,OWNER,This looks VERY relevant: https://github.com/encode/starlette,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-400571521,https://api.github.com/repos/simonw/datasette/issues/272,400571521,MDEyOklzc3VlQ29tbWVudDQwMDU3MTUyMQ==,647359,tomchristie,2018-06-27T07:30:07Z,2018-06-27T07:30:07Z,NONE,"I’m up for helping with this. Looks like you’d need static files support, which I’m planning on adding a component for. Anything else obviously missing? For a quick overview it looks very doable - the test client ought to me your test cases stay roughly the same. Are you using any middleware or other components for the Sanic ecosystem? Do you use cookies or sessions at all?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-403959704,https://api.github.com/repos/simonw/datasette/issues/272,403959704,MDEyOklzc3VlQ29tbWVudDQwMzk1OTcwNA==,9599,simonw,2018-07-10T20:44:47Z,2018-07-10T20:44:47Z,OWNER,"No cookies or sessions - no POST requests in fact, Datasette just cares about GET (path and querystring) and being able to return custom HTTP headers.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-404514973,https://api.github.com/repos/simonw/datasette/issues/272,404514973,MDEyOklzc3VlQ29tbWVudDQwNDUxNDk3Mw==,647359,tomchristie,2018-07-12T13:38:24Z,2018-07-12T13:38:24Z,NONE,"Okay. I reckon the latest version should have all the kinds of components you'd need: Recently added ASGI components for Routing and Static Files support, as well as making few tweaks to make sure requests and responses are instantiated efficiently. Don't have any redirect-to-slash / redirect-to-non-slash stuff out of the box yet, which it looks like you might miss.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-408093480,https://api.github.com/repos/simonw/datasette/issues/272,408093480,MDEyOklzc3VlQ29tbWVudDQwODA5MzQ4MA==,9599,simonw,2018-07-26T13:15:55Z,2018-07-26T13:46:40Z,OWNER,"I'm now hacking around with an initial version of this in the [starlette branch](https://github.com/simonw/datasette/tree/starlette). Here's my work in progress, deployed using `datasette publish now fixtures.db -n datasette-starlette-demo --branch=starlette --extra-options=""--asgi""` https://datasette-starlette-demo.now.sh/ Lots more work to do - the CSS isn't being served correctly for example, it's showing this error when I hit `/-/static/app.css`: ``` INFO: 127.0.0.1 - ""GET /-/static/app.css HTTP/1.1"" 200 ERROR: Exception in ASGI application Traceback (most recent call last): File ""/Users/simonw/Dropbox/Development/datasette/venv/lib/python3.6/site-packages/uvicorn/protocols/http/httptools_impl.py"", line 363, in run_asgi result = await asgi(self.receive, self.send) File ""/Users/simonw/Dropbox/Development/datasette/venv/lib/python3.6/site-packages/starlette/staticfiles.py"", line 91, in __call__ await response(receive, send) File ""/Users/simonw/Dropbox/Development/datasette/venv/lib/python3.6/site-packages/starlette/response.py"", line 180, in __call__ {""type"": ""http.response.body"", ""body"": chunk, ""more_body"": False} File ""/Users/simonw/Dropbox/Development/datasette/venv/lib/python3.6/site-packages/uvicorn/protocols/http/httptools_impl.py"", line 483, in send raise RuntimeError(""Response content shorter than Content-Length"") RuntimeError: Response content shorter than Content-Length ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-408097719,https://api.github.com/repos/simonw/datasette/issues/272,408097719,MDEyOklzc3VlQ29tbWVudDQwODA5NzcxOQ==,9599,simonw,2018-07-26T13:29:38Z,2018-07-26T13:29:38Z,OWNER,It looks like that's a bug in Starlette - filed here: https://github.com/encode/starlette/issues/32,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-408105251,https://api.github.com/repos/simonw/datasette/issues/272,408105251,MDEyOklzc3VlQ29tbWVudDQwODEwNTI1MQ==,9599,simonw,2018-07-26T13:54:06Z,2018-07-26T13:54:06Z,OWNER,"Tom shipped my fix for that bug already, so https://datasette-starlette-demo.now.sh/ is now serving CSS!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-408478935,https://api.github.com/repos/simonw/datasette/issues/272,408478935,MDEyOklzc3VlQ29tbWVudDQwODQ3ODkzNQ==,9599,simonw,2018-07-27T17:00:08Z,2018-07-27T17:00:08Z,OWNER,"Refs https://github.com/encode/uvicorn/issues/168","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-418695115,https://api.github.com/repos/simonw/datasette/issues/272,418695115,MDEyOklzc3VlQ29tbWVudDQxODY5NTExNQ==,647359,tomchristie,2018-09-05T11:21:25Z,2018-09-05T11:21:25Z,NONE,"Some notes: * Starlette just got a bump to 0.3.0 - there's some renamings in there. It's got enough functionality now that you can treat it either as a framework or as a toolkit. Either way the component design is all just *here's an ASGI app* all the way through. * Uvicorn got a bump to 0.3.3 - Removed some cyclical references that were causing garbage collection to impact performance. Ought to be a decent speed bump. * Wrt. passing config - Either use a single envvar that points to a config, or use multiple envvars for the config. Uvicorn could get a flag to read a `.env` file, but I don't see ASGI itself having a specific interface there.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-494190922,https://api.github.com/repos/simonw/datasette/issues/272,494190922,MDEyOklzc3VlQ29tbWVudDQ5NDE5MDkyMg==,9599,simonw,2019-05-21T00:00:40Z,2019-05-21T00:01:09Z,OWNER,"Wow, this issue has been open for a full year now! I've been thinking about this a lot. I've decided I want Datasette to use ASGI 3.0 internally with no dependencies on anything else - then I want the option to run Datasette under both daphne and uvicorn - because uvicorn doesn't support Python 3.5 but Datasette still needs to (primarily for Glitch), and daphne works with 3.5. So I'm going to try to go the following route: - Every Datasette view becomes an ASGI app - The Datasette application itself is an ASGI app that routes to those views - When you `pip install datasette` you get Daphne as a dependency (I'd like you to be able to opt-out of installing Daphne, I'm not yet sure how that would work) - A new `asgi_serve` plugin hook allows a plugin to serve Datasette using uvicorn (or hypercorn) instead","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-494191378,https://api.github.com/repos/simonw/datasette/issues/272,494191378,MDEyOklzc3VlQ29tbWVudDQ5NDE5MTM3OA==,9599,simonw,2019-05-21T00:02:48Z,2019-05-21T00:02:48Z,OWNER,"I said earlier that I only need to support GET - I actually need to be able to support POST too, mainly to support plugins (e.g. a plugin that allows authenticated login before you can view Datasette, but potentially also plugins that let you write data directly to SQLite as well).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-494191738,https://api.github.com/repos/simonw/datasette/issues/272,494191738,MDEyOklzc3VlQ29tbWVudDQ5NDE5MTczOA==,9599,simonw,2019-05-21T00:05:02Z,2019-05-21T00:05:02Z,OWNER,While I'm not depending on Starlette any more I will need to instead depend on https://github.com/andrew-d/python-multipart for POST form parsing - as used by Starlette here https://github.com/encode/starlette/blob/ab86530eddfcf56e0f7e5ca56f6ab69c15594a7d/starlette/requests.py#L178-L193,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-494192163,https://api.github.com/repos/simonw/datasette/issues/272,494192163,MDEyOklzc3VlQ29tbWVudDQ5NDE5MjE2Mw==,9599,simonw,2019-05-21T00:07:25Z,2019-05-21T00:07:25Z,OWNER,"Bah, I'd much rather depend on Starlette for things like form parsing - but it's 3.6+ only! https://github.com/encode/starlette/blob/ab86530eddfcf56e0f7e5ca56f6ab69c15594a7d/setup.py#L39 Maybe I could require Python 3.6 or higher if you want to handle POST data? This would make my internals far too complicated though I think.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-494192779,https://api.github.com/repos/simonw/datasette/issues/272,494192779,MDEyOklzc3VlQ29tbWVudDQ5NDE5Mjc3OQ==,9599,simonw,2019-05-21T00:10:47Z,2019-05-21T00:10:47Z,OWNER,"https://github.com/simonw/datasette/commit/9fdb47ca952b93b7b60adddb965ea6642b1ff523 added `decode_path_component()` and `encode_path_component()` functions because ASGI decodes %2F encoded slashes in URLs automatically. The new encoding scheme looks like this: ""table/and/slashes"" => ""tableU+002FandU+002Fslashes"" ""~table"" => ""U+007Etable"" ""+bobcats!"" => ""U+002Bbobcats!"" ""U+007Etable"" => ""UU+002B007Etable"" For background see this comment: https://github.com/django/asgiref/issues/51#issuecomment-450603464","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-494297022,https://api.github.com/repos/simonw/datasette/issues/272,494297022,MDEyOklzc3VlQ29tbWVudDQ5NDI5NzAyMg==,647359,tomchristie,2019-05-21T08:39:17Z,2019-05-21T08:39:17Z,NONE,"Useful context stuff: > ASGI decodes %2F encoded slashes in URLs automatically `raw_path` for ASGI looks to be under consideration: https://github.com/django/asgiref/issues/87 > uvicorn doesn't support Python 3.5 That was an issue specifically against the <=3.5.2 minor point releases of Python, now resolved: https://github.com/encode/uvicorn/issues/330 👍 > Starlette for things like form parsing - but it's 3.6+ only! Yeah - the bits that require 3.6 are anywhere with the ""async for"" syntax. If it wasn't for that I'd downport it, but that one's a pain. It's the one bit of syntax to watch out for if you're looking to bring any bits of implementation across to Datasette. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-502393107,https://api.github.com/repos/simonw/datasette/issues/272,502393107,MDEyOklzc3VlQ29tbWVudDUwMjM5MzEwNw==,9599,simonw,2019-06-15T19:25:54Z,2019-06-19T01:20:14Z,OWNER,"OK, time for a solid implementation plan. As soon as https://github.com/django/asgiref/pull/92 is merged (hopefully very soon) the ASGI spec will have support for an optional `raw_path` - which means we can continue to use `table%2Fnames` with embedded `/` without being unable to tell if a path has been decoded or not. Steps to implement: ## Refactor classes, then add .asgi() method to BaseView Add a `.asgi(self, scope, receive, send)` method to my base view class. This will expose an ASGI interface to the outside world: the method itself will construct a request object and call the existing `.get()` method. My only true shared base class is actually `RenderMixin` because the `IndexView` doesn't extend `BaseView`. I'm going to refactor the class hierarchy a bit here - `AsgiView` will be my top level class with the `.asgi()` method on it. `RenderMixin` will be renamed `BaseView(AsgiView)`, while existing `BaseView` will be renamed `DataView(BaseView)` since it mainly exists to introduce the handy `.data()` abstraction. So... * `AsgiView` - has `.asgi()` method, extends Sanic `HTTPMethodView` (for the moment) * `BaseView(AsgiView)` - defines utility methods currently on `RenderMixin` * `IndexView(BaseView)` - the current `IndexView` * `DataView(BaseView)` - defines the utilities currently on `BaseView`, including `data()` * Everything else subclasses `DataView` ## Extract routing logic out into a new `DatasetteView` I considered calling this `RouteView`, but one of the goals of this project is to allow other ASGI apps to import Datasette itself and reuse it as its own ASGI function. So `DatasetteView` will subclass `BaseView` and will do all of the routing logic. That logic currently lives here: https://github.com/simonw/datasette/blob/aa911122feab13f8e65875c98edb00fd3832b7b8/datasette/app.py#L594-L640 ## For tests: Implement a version of app_client.get() that calls ASGI instead Almost all of the unit tests currently use `app_client.get(""/path..."")`. I want to be able to run tests against both ASGI and existing-Sanic, so for the moment I'm going to teach `app_client.get()` to use ASGI instead but only in the presence of a new environment variable. I can then have Travis run the tests twice - once with that environement variable and once without. ## Make datasette serve --asgi run ASGI and uvicorn Uvicorn supports Python 3.5 again as of https://github.com/encode/uvicorn/issues/330 - so it's going to be the new dependency for Datasette. ## Do some comparative testing of ASGI and non-ASGI Just some sanity checking to make sure there aren't any weird issues. ## Final step: refactor out Sanic Hopefully this will just involve changes being made to the AsgiView base class, since that subclasses Sanic `HTTPMethodView`. Bonus: It looks like dropping Sanic as a dependency in favour of Uvicorn should give us Windows support! https://github.com/encode/uvicorn/issues/82 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-502393267,https://api.github.com/repos/simonw/datasette/issues/272,502393267,MDEyOklzc3VlQ29tbWVudDUwMjM5MzI2Nw==,9599,simonw,2019-06-15T19:28:27Z,2019-06-15T19:28:27Z,OWNER,I'll probably revert 9fdb47ca952b93b7b60adddb965ea6642b1ff523 from https://github.com/simonw/datasette/issues/272#issuecomment-494192779 since I won't need it now that ASGI is getting `raw_path` support.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-502394420,https://api.github.com/repos/simonw/datasette/issues/272,502394420,MDEyOklzc3VlQ29tbWVudDUwMjM5NDQyMA==,9599,simonw,2019-06-15T19:45:46Z,2019-06-15T19:45:46Z,OWNER,"For reference, here's some WIP code I wrote last year against the old ASGI 2 spec: https://github.com/simonw/datasette/commit/4fd36ba2f3f91da7258859808616078e3464fb97","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-502395689,https://api.github.com/repos/simonw/datasette/issues/272,502395689,MDEyOklzc3VlQ29tbWVudDUwMjM5NTY4OQ==,9599,simonw,2019-06-15T20:05:26Z,2019-06-15T20:05:26Z,OWNER,"For the routing component: I'm going to base my implementation on the one from Django Channels. https://github.com/django/channels/blob/507cb54fcb36df63282dd19653ea743036e7d63c/channels/routing.py#L123-L149 Documented here: https://channels.readthedocs.io/en/latest/topics/routing.html#urlrouter Particularly relevant: my view classes need access to the components that were already parsed out of the URL by the router. I'm going to copy the Django Channels mechanism of stashing those in `scope[""url_route""][""kwargs""]`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-502401078,https://api.github.com/repos/simonw/datasette/issues/272,502401078,MDEyOklzc3VlQ29tbWVudDUwMjQwMTA3OA==,9599,simonw,2019-06-15T21:35:26Z,2019-06-15T21:35:26Z,OWNER,Started sketching out the router in the `asgi` branch: https://github.com/simonw/datasette/commit/7cdc55c6836fe246b1ca8a13a965a39991c9ffec,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-502466466,https://api.github.com/repos/simonw/datasette/issues/272,502466466,MDEyOklzc3VlQ29tbWVudDUwMjQ2NjQ2Ng==,9599,simonw,2019-06-16T16:28:10Z,2019-06-16T16:28:10Z,OWNER,I have an open pull request to Uvicorn with an implementation of `raw_path`: https://github.com/encode/uvicorn/pull/372,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-503195217,https://api.github.com/repos/simonw/datasette/issues/272,503195217,MDEyOklzc3VlQ29tbWVudDUwMzE5NTIxNw==,9599,simonw,2019-06-18T15:46:31Z,2019-06-18T15:54:18Z,OWNER,"How should file serving work? Starlette and Sanic both use `aiofiles` - https://github.com/Tinche/aiofiles - which is a small wrapper around file operations which runs them all in an executor thread. It doesn't have any C dependencies so it looks like a good option. [Quart uses it too](https://gitlab.com/pgjones/quart/blob/317562ea660edb7159efc20fa57b95223d408ea0/quart/wrappers/response.py#L122-169). `aiohttp` does things differently: it has [an implementation based on sendfile](https://github.com/aio-libs/aiohttp/blob/7a324fd46ff7dc9bb0bb1bc5afb326e04cf7cef0/aiohttp/web_fileresponse.py#L46-L122) with [an alternative fallback](https://github.com/aio-libs/aiohttp/blob/7a324fd46ff7dc9bb0bb1bc5afb326e04cf7cef0/aiohttp/web_fileresponse.py#L175-L200) which reads chunks from a file object and yields them one chunk at a time, ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-503351966,https://api.github.com/repos/simonw/datasette/issues/272,503351966,MDEyOklzc3VlQ29tbWVudDUwMzM1MTk2Ng==,9599,simonw,2019-06-18T23:45:17Z,2019-06-18T23:45:17Z,OWNER,Uvicorn 0.8.1 is our and supports `raw_path`!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-503369834,https://api.github.com/repos/simonw/datasette/issues/272,503369834,MDEyOklzc3VlQ29tbWVudDUwMzM2OTgzNA==,9599,simonw,2019-06-19T01:26:24Z,2019-06-19T01:26:24Z,OWNER,"I need to be able to define the URL routes once and have them work for both Sanic and ASGI. I'm going to extract the web application bits out of the `Datasette` class into a `DatasetteServer` class. Then I can have a `add_route()` method on that class, then have `DatasetteSanic` and `DatasetteAsgi` subclasses which redefine that method. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-504697742,https://api.github.com/repos/simonw/datasette/issues/272,504697742,MDEyOklzc3VlQ29tbWVudDUwNDY5Nzc0Mg==,9599,simonw,2019-06-22T20:55:59Z,2019-06-22T20:56:22Z,OWNER,"Getting this to work with both Sanic AND ASGI at the same time (via the classes described previously with an `--asgi` command-line option) is proving way trickier than I expected, mainly because of the complexity of [the current Datasette.app() method](https://github.com/simonw/datasette/blob/35429f90894321eda7f2db31b9ea7976f31f73ac/datasette/app.py#L545-L721). I'm going to drop the compatibility path for a bit and see if I can make progress on a pure-ASGI port.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-504710331,https://api.github.com/repos/simonw/datasette/issues/272,504710331,MDEyOklzc3VlQ29tbWVudDUwNDcxMDMzMQ==,9599,simonw,2019-06-23T01:08:45Z,2019-06-23T01:08:45Z,OWNER,"Lots still to do: * Static files are not being served * Streaming CSV files don't work * Tests all fail * Some URLs (e.g. the 'next' link on tables) are incorrect I'm going to work on getting the unit test framework to be ASGI-compatible next.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-504711468,https://api.github.com/repos/simonw/datasette/issues/272,504711468,MDEyOklzc3VlQ29tbWVudDUwNDcxMTQ2OA==,9599,simonw,2019-06-23T01:36:33Z,2019-06-23T01:36:33Z,OWNER,"Published an in-progress demo: datasette publish now fixtures.db -n datasette-asgi-early-demo --branch=asgi Here it is: https://datasette-asgi-early-demo-qahhxctqpw.now.sh/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-504716988,https://api.github.com/repos/simonw/datasette/issues/272,504716988,MDEyOklzc3VlQ29tbWVudDUwNDcxNjk4OA==,9599,simonw,2019-06-23T03:43:46Z,2019-06-23T15:15:26Z,OWNER,"OK, it's beginning to shape up now. Next steps: - [x] Static file support (including for plugins) - plus tests - [x] Streaming support so the CSV tests will pass - [x] Ability to download the database file - [x] Implement missing-slash redirects ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-504754433,https://api.github.com/repos/simonw/datasette/issues/272,504754433,MDEyOklzc3VlQ29tbWVudDUwNDc1NDQzMw==,9599,simonw,2019-06-23T13:51:53Z,2019-06-23T13:51:53Z,OWNER,"CSV tests all pass as of https://github.com/simonw/datasette/commit/ff9efa668ebc33f17ef9b30139960e29906a18fb This code could be a lot neater though. At the very least I'm going to refactor `datasette/utils.py` into a `datasette/utils` package and put all of my new ASGI utilities in `datasette/utils/asgi.py` The way I implemented streaming on top of a writer object (inspired by Sanic) is a bit of a weird hack. I think I'd rather use an abstraction where my view functions can yield chunks of body data.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-504754552,https://api.github.com/repos/simonw/datasette/issues/272,504754552,MDEyOklzc3VlQ29tbWVudDUwNDc1NDU1Mg==,9599,simonw,2019-06-23T13:53:39Z,2019-06-23T13:53:39Z,OWNER,"Next test to fix (because by new test harness doesn't actually obey the `allow_redirects=` parameter): ``` _____________ test_database_page_redirects_with_url_hash _____________ app_client_with_hash = def test_database_page_redirects_with_url_hash(app_client_with_hash): response = app_client_with_hash.get(""/fixtures"", allow_redirects=False) assert response.status == 302 response = app_client_with_hash.get(""/fixtures"") > assert ""fixtures"" in response.text E AssertionError: assert 'fixtures' in '' E + where '' = .text ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-504759683,https://api.github.com/repos/simonw/datasette/issues/272,504759683,MDEyOklzc3VlQ29tbWVudDUwNDc1OTY4Mw==,9599,simonw,2019-06-23T14:57:50Z,2019-06-23T14:57:50Z,OWNER,"All of the tests are now passing! I still need a solution for this: https://github.com/simonw/datasette/blob/5bd510b01adae3f719e4426b9bfbc346a946ba5c/datasette/app.py#L706-L714 I think the answer is ASGI lifespan, which is supported by Uvicorn. https://asgi.readthedocs.io/en/latest/specs/lifespan.html#startup","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-504759842,https://api.github.com/repos/simonw/datasette/issues/272,504759842,MDEyOklzc3VlQ29tbWVudDUwNDc1OTg0Mg==,9599,simonw,2019-06-23T15:00:06Z,2019-06-23T15:00:06Z,OWNER,I also need to actually take advantage of `raw_path` such that pages like https://fivethirtyeight.datasettes.com/fivethirtyeight/twitter-ratio%2Fsenators can be correctly served.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-504760061,https://api.github.com/repos/simonw/datasette/issues/272,504760061,MDEyOklzc3VlQ29tbWVudDUwNDc2MDA2MQ==,9599,simonw,2019-06-23T15:02:52Z,2019-06-23T15:02:52Z,OWNER,"Tests are failing on Python 3.5: https://travis-ci.org/simonw/datasette/jobs/549380098 - error is `TypeError: the JSON object must be str, not 'bytes'`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-504761039,https://api.github.com/repos/simonw/datasette/issues/272,504761039,MDEyOklzc3VlQ29tbWVudDUwNDc2MTAzOQ==,9599,simonw,2019-06-23T15:15:41Z,2019-06-23T15:18:36Z,OWNER,"And now the tests are all passing! Still to do: * Use `raw_path` so table names containing `/` can work correctly * Get ?_trace=1 working again * Replacement for `@app.listener(""before_server_start"")` * Replace Sanic request object with my own request class, so I can remove Sanic dependency","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-504761165,https://api.github.com/repos/simonw/datasette/issues/272,504761165,MDEyOklzc3VlQ29tbWVudDUwNDc2MTE2NQ==,9599,simonw,2019-06-23T15:17:07Z,2019-06-23T15:17:07Z,OWNER,I'm going to move the remaining work into a pull request.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-504844339,https://api.github.com/repos/simonw/datasette/issues/272,504844339,MDEyOklzc3VlQ29tbWVudDUwNDg0NDMzOQ==,9599,simonw,2019-06-24T03:33:06Z,2019-06-24T03:33:06Z,OWNER,"It's alive! Here's the first deployed version: https://a559123.datasette.io/ You can confirm it's running under ASGI by viewing https://a559123.datasette.io/-/versions and looking for the `""asgi""` key. Compare to the last version of master running on Sanic here: http://aa91112.datasette.io/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-504857097,https://api.github.com/repos/simonw/datasette/issues/272,504857097,MDEyOklzc3VlQ29tbWVudDUwNDg1NzA5Nw==,9599,simonw,2019-06-24T04:54:15Z,2019-06-24T04:54:15Z,OWNER,I wrote about this on my blog: https://simonwillison.net/2019/Jun/23/datasette-asgi/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/273#issuecomment-390250253,https://api.github.com/repos/simonw/datasette/issues/273,390250253,MDEyOklzc3VlQ29tbWVudDM5MDI1MDI1Mw==,198537,rgieseke,2018-05-18T15:49:52Z,2018-05-18T15:49:52Z,CONTRIBUTOR,"Shouldn't [versioneer](https://github.com/warner/python-versioneer) do that? E.g. 0.21+2.g1076c97 You'd need to install via `pip install git+https://github.com/simow/datasette.git` though, this does a temp git clone.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324451322,Figure out a way to have /-/version return current git commit hash, https://github.com/simonw/datasette/issues/273#issuecomment-391003285,https://api.github.com/repos/simonw/datasette/issues/273,391003285,MDEyOklzc3VlQ29tbWVudDM5MTAwMzI4NQ==,9599,simonw,2018-05-22T14:06:40Z,2018-05-22T14:06:40Z,OWNER,"That looks great. I don't think it's possible to derive the current commit version from the .zip downloaded directly from GitHub, so needing to pip install via git+https feels reasonable to me.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324451322,Figure out a way to have /-/version return current git commit hash, https://github.com/simonw/datasette/issues/274#issuecomment-390433040,https://api.github.com/repos/simonw/datasette/issues/274,390433040,MDEyOklzc3VlQ29tbWVudDM5MDQzMzA0MA==,9599,simonw,2018-05-19T21:12:42Z,2018-05-20T16:01:03Z,OWNER,Could also support these as optional environment variables - `DATASETTE_NAMEOFSETTING`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324652142,"Rename --limit to --config, add --help-config", https://github.com/simonw/datasette/issues/274#issuecomment-390496376,https://api.github.com/repos/simonw/datasette/issues/274,390496376,MDEyOklzc3VlQ29tbWVudDM5MDQ5NjM3Ng==,9599,simonw,2018-05-20T17:04:55Z,2018-05-20T17:04:55Z,OWNER,http://datasette.readthedocs.io/en/latest/config.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324652142,"Rename --limit to --config, add --help-config", https://github.com/simonw/datasette/issues/275#issuecomment-391771202,https://api.github.com/repos/simonw/datasette/issues/275,391771202,MDEyOklzc3VlQ29tbWVudDM5MTc3MTIwMg==,9599,simonw,2018-05-24T16:08:41Z,2018-05-24T16:08:41Z,OWNER,"So the lookup priority order should be: * table level in metadata * database level in metadata * root level in metadata * `--config` options passed to `datasette serve` * `DATASETTE_X` environment variables","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324720095,"""config"" section in metadata.json (root, database and table level)", https://github.com/simonw/datasette/issues/275#issuecomment-391771658,https://api.github.com/repos/simonw/datasette/issues/275,391771658,MDEyOklzc3VlQ29tbWVudDM5MTc3MTY1OA==,9599,simonw,2018-05-24T16:09:55Z,2018-05-24T16:09:55Z,OWNER,It feels slightly weird continuing to call it `metadata.json` as it starts to grow support for config options (which already started with the `units` and `facets` keys) but I can live with that.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324720095,"""config"" section in metadata.json (root, database and table level)", https://github.com/simonw/datasette/issues/276#issuecomment-390707760,https://api.github.com/repos/simonw/datasette/issues/276,390707760,MDEyOklzc3VlQ29tbWVudDM5MDcwNzc2MA==,9599,simonw,2018-05-21T16:30:35Z,2018-05-21T16:30:35Z,OWNER,"This probably needs to be in a plugin simply because getting Spatialite compiled and installed is a bit of a pain. It's a great opportunity to expand the plugin hooks in useful ways though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-390795067,https://api.github.com/repos/simonw/datasette/issues/276,390795067,MDEyOklzc3VlQ29tbWVudDM5MDc5NTA2Nw==,45057,russss,2018-05-21T21:55:57Z,2018-05-21T21:55:57Z,CONTRIBUTOR,"Well, we do have the capability to detect spatialite so my intention certainly wasn't to require it. I can see the advantage of having it as a plugin but it does touch a number of points in the code. I think I'm going to attack this by refactoring the necessary bits and seeing where that leads (which was my plan anyway). I think my main concern is - if I add certain plugin hooks for this, is anything else ever going to use them? I'm not sure I have an answer to that question yet, either way.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-391000659,https://api.github.com/repos/simonw/datasette/issues/276,391000659,MDEyOklzc3VlQ29tbWVudDM5MTAwMDY1OQ==,9599,simonw,2018-05-22T13:59:27Z,2018-05-22T13:59:27Z,OWNER,"Right now the plugin stuff is early enough that I'd like to get as many potential plugin hooks as possible crafted out A much easier to judge if they should be added as actual hooks if we have a working branch prototype of them. Some kind of mechanism for custom column display is already needed - eg there are columns where I want to say ""render this as markdown"" or ""URLify any links in this text"" - or even ""use this date format"" or ""add commas to this integer"". You can do it with a custom template but a lower-level mechanism would be nicer. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-391025841,https://api.github.com/repos/simonw/datasette/issues/276,391025841,MDEyOklzc3VlQ29tbWVudDM5MTAyNTg0MQ==,9599,simonw,2018-05-22T15:06:36Z,2018-05-22T15:06:36Z,OWNER,"The other reason I mention plugins is that I have an idea to outlaw JavaScript entirely from Datasette core and instead encourage ALL JavaScript functionality to move into plugins.right now that just means CodeMirror. I may set up some of those plugins (like CodeMirror) as default dependencies so you get them from ""pip install datasette"". I like the neatness of saying that core Datasette is a very simple JSON + HTML application, then encouraging people to go completely wild with JavaScript in the plugins.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-391050113,https://api.github.com/repos/simonw/datasette/issues/276,391050113,MDEyOklzc3VlQ29tbWVudDM5MTA1MDExMw==,45057,russss,2018-05-22T16:13:00Z,2018-05-22T16:13:00Z,CONTRIBUTOR,"Yup, I'll have a think about it. My current thoughts are for spatialite we'll need to hook into the following places: * Inspection, so we can detect which columns are geometry columns. (We also currently ignore spatialite tables during inspection, it may be worth moving that to the plugin as well.) * After data load, so we can convert WKB into the correct intermediate format for display. The alternative here is to alter the select SQL itself and get spatialite to do this conversion, but that strikes me as a bit more complex and possibly not as useful. * HTML rendering. * Querying? The rendering and querying hooks could also potentially be used to move the units support into a plugin.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-391504199,https://api.github.com/repos/simonw/datasette/issues/276,391504199,MDEyOklzc3VlQ29tbWVudDM5MTUwNDE5OQ==,9599,simonw,2018-05-23T21:35:17Z,2018-05-23T21:35:17Z,OWNER,"I'm not keen on anything that modifies the SQLite file itself on startup - part of the datasette contract is that it should work with any SQLite file you throw at it without having any side-effects. A neat thing about SQLite is that because everything happens in the same process there's very little additional overhead involved in executing extra SQL queries - even if we ran a query-per-row to transform data in one specific column it shouldn't add more than a few ms to the total page load time (whereas with MySQL all of the extra query overhead would kill us).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-391504757,https://api.github.com/repos/simonw/datasette/issues/276,391504757,MDEyOklzc3VlQ29tbWVudDM5MTUwNDc1Nw==,9599,simonw,2018-05-23T21:37:07Z,2018-05-23T21:37:18Z,OWNER,"That said, it looks like we may be able to use a library like https://github.com/geomet/geomet to run the conversion from WKB entirely in Python space.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-391505930,https://api.github.com/repos/simonw/datasette/issues/276,391505930,MDEyOklzc3VlQ29tbWVudDM5MTUwNTkzMA==,45057,russss,2018-05-23T21:41:37Z,2018-05-23T21:41:37Z,CONTRIBUTOR,"> I'm not keen on anything that modifies the SQLite file itself on startup Ah I didn't mean that - I meant altering the SELECT query to fetch the data so that it ran a spatialite function to transform that specific column. I think that's less useful as a general-purpose plugin hook though, and it's not that hard to parse the WKB in Python (my default approach would be to use [shapely](https://github.com/Toblerity/Shapely), which is great, but geomet looks like an interesting pure-python alternative).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-392279508,https://api.github.com/repos/simonw/datasette/issues/276,392279508,MDEyOklzc3VlQ29tbWVudDM5MjI3OTUwOA==,9599,simonw,2018-05-26T18:32:07Z,2018-05-26T18:32:07Z,OWNER,Related: I started the documentation for using SpatiaLite with Datasette here: https://datasette.readthedocs.io/en/latest/spatialite.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-392279644,https://api.github.com/repos/simonw/datasette/issues/276,392279644,MDEyOklzc3VlQ29tbWVudDM5MjI3OTY0NA==,9599,simonw,2018-05-26T18:34:21Z,2018-05-26T18:34:21Z,OWNER,"I've been thinking a bit about modifying the SQL select statement used for the table view recently. I've run into a few examples of SQLite database that slow to a crawl when viewed with datasette because the rows are too big, so there's definitely scope for supporting custom select clauses (avoiding some columns, showing length(colname) for others).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-392316250,https://api.github.com/repos/simonw/datasette/issues/276,392316250,MDEyOklzc3VlQ29tbWVudDM5MjMxNjI1MA==,9599,simonw,2018-05-27T08:59:46Z,2018-05-27T08:59:46Z,OWNER,It looks like we can use the `geometry_columns` table to introspect which columns are SpatiaLite geometries. It includes a `geometry_type` integer which is documented here: https://www.gaia-gis.it/fossil/libspatialite/wiki?name=switching-to-4.0,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-392316306,https://api.github.com/repos/simonw/datasette/issues/276,392316306,MDEyOklzc3VlQ29tbWVudDM5MjMxNjMwNg==,9599,simonw,2018-05-27T09:00:46Z,2018-05-27T09:00:46Z,OWNER,Relevant to this ticket: I've been playing with a plugin that automatically renders any GeoJSON cells as leaflet maps: https://github.com/simonw/datasette-leaflet-geojson,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-392815673,https://api.github.com/repos/simonw/datasette/issues/276,392815673,MDEyOklzc3VlQ29tbWVudDM5MjgxNTY3Mw==,9599,simonw,2018-05-29T15:17:04Z,2018-05-29T15:17:04Z,OWNER,"I'm coming round to the idea that this should be baked into Datasette core - see above referenced issues for some of the explorations I've been doing around this area. Datasette should absolutely work without SpatiaLite, but it's such a huge bonus part of the SQLite ecosystem that I'm happy to ship features that take advantage of it without being relegated to plugins. I'm also becoming aware that there aren't really that many other interesting loadable extensions for SQLite. If SpatiaLite was one of dozens I'd feel that a rule that ""anything dependent on an extension lives in a plugin"" would make sense, but as it stands I think 99% of the time the only loadable extensions people will be using will be SpatiaLite and json1 (and json1 is available in the amalgamation anyway). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-392825746,https://api.github.com/repos/simonw/datasette/issues/276,392825746,MDEyOklzc3VlQ29tbWVudDM5MjgyNTc0Ng==,45057,russss,2018-05-29T15:42:53Z,2018-05-29T15:42:53Z,CONTRIBUTOR,"I haven't had time to look further into this, but if doing this as a plugin results in useful hooks then I think we should do it that way. We could always require the plugin as a standard dependency. I think this is going to result in quite a bit of refactoring anyway so it's a good time to add hooks regardless. On the other hand, if we have to add lots of specialist hooks for it then maybe it's worth integrating into the core.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-392969173,https://api.github.com/repos/simonw/datasette/issues/276,392969173,MDEyOklzc3VlQ29tbWVudDM5Mjk2OTE3Mw==,9599,simonw,2018-05-29T22:32:08Z,2018-05-29T22:32:08Z,OWNER,The more time I spend with SpatiaLite the more convinced I am that this should be default behavior. There's nothing useful about the binary Geometry representation - it's not even valid WKB. I'm on board with WKT as the default display in HTML and GeoJSON as the default for `.json`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-393014943,https://api.github.com/repos/simonw/datasette/issues/276,393014943,MDEyOklzc3VlQ29tbWVudDM5MzAxNDk0Mw==,9599,simonw,2018-05-30T02:59:53Z,2018-05-30T02:59:53Z,OWNER,I just realised a problem with GeoJSON is that it assumes that the underlying geometry is WGS 84 latitude/longitude points - but it's very possible for a SpatiaLite geometry to contain geometric data that's nothing to do with geospatial projections.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-393106520,https://api.github.com/repos/simonw/datasette/issues/276,393106520,MDEyOklzc3VlQ29tbWVudDM5MzEwNjUyMA==,45057,russss,2018-05-30T10:09:25Z,2018-05-30T10:09:25Z,CONTRIBUTOR,"I don't think it's unreasonable to only support spatialite geometries in a coordinate reference system which is at least transformable to WGS84. It would be nice to support different CRSes in the database so conversion to spatialite from the source data is lossless. I think the working CRS for datasette should be WGS84 though (leaflet requires it, for example) - it's just a case of calling `ST_Transform(geom, 4326)` on the column while we're loading the data.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-401310732,https://api.github.com/repos/simonw/datasette/issues/276,401310732,MDEyOklzc3VlQ29tbWVudDQwMTMxMDczMg==,82988,psychemedia,2018-06-29T10:05:04Z,2018-06-29T10:07:25Z,CONTRIBUTOR,"@russs Different map projections can presumably be handled on the client side using a leaflet plugin to transform the geometry (eg [kartena/Proj4Leaflet](https://kartena.github.io/Proj4Leaflet/)) although the leaflet side would need to detect or be informed of the original projection? Another possibility would be to provide an easy way/guidance for users to create an FK'd table containing the WGS84 projection of a non-WGS84 geometry in the original/principle table? This could then as a proxy for serving GeoJSON to the leaflet map?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-401312981,https://api.github.com/repos/simonw/datasette/issues/276,401312981,MDEyOklzc3VlQ29tbWVudDQwMTMxMjk4MQ==,45057,russss,2018-06-29T10:14:54Z,2018-06-29T10:14:54Z,CONTRIBUTOR,"> @RusSs Different map projections can presumably be handled on the client side using a leaflet plugin to transform the geometry (eg kartena/Proj4Leaflet) although the leaflet side would need to detect or be informed of the original projection? Well, as @simonw mentioned, GeoJSON only supports WGS84, and GeoJSON (and/or TopoJSON) is the standard we probably want to aim for. On-the-fly reprojection in spatialite is not an issue anyway, and in general I think you want to be serving stuff to web maps in WGS84 or Web Mercator.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-744461856,https://api.github.com/repos/simonw/datasette/issues/276,744461856,MDEyOklzc3VlQ29tbWVudDc0NDQ2MTg1Ng==,296686,robintw,2020-12-14T14:04:57Z,2020-12-14T14:04:57Z,NONE,"I'm looking into using datasette with a database with spatialite geometry columns, and came across this issue. Has there been any progress on this since 2018? In one of my tables I'm just storing lat/lon points in a spatialite point geometry, and I've managed to make datasette-cluster-map display the points by extracting the lat and lon in SQL - using something like `select ... ST_X(location) as longitude, ST_Y(location) as latitude from Blah`. Something more 'built-in' would be great though - particularly for the tables I have that store more complex geometries.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-1074479768,https://api.github.com/repos/simonw/datasette/issues/276,1074479768,IC_kwDOBm6k_c5AC0KY,9599,simonw,2022-03-21T22:22:20Z,2022-03-21T22:22:20Z,OWNER,"I'm closing this issue because this is now solved by a number of neat plugins: - https://datasette.io/plugins/datasette-geojson-map shows the geometry from SpatiaLite columns on a map - https://datasette.io/plugins/datasette-leaflet-geojson can be used to display inline maps next to each column","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/pull/277#issuecomment-390707183,https://api.github.com/repos/simonw/datasette/issues/277,390707183,MDEyOklzc3VlQ29tbWVudDM5MDcwNzE4Mw==,9599,simonw,2018-05-21T16:28:39Z,2018-05-21T16:28:39Z,OWNER,"This is definitely a big improvement. I'd like to refactor the unit tests that cover .inspect() too - currently they are a huge ugly blob at the top of test_api.py","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324836533,Refactor inspect logic, https://github.com/simonw/datasette/pull/277#issuecomment-390804333,https://api.github.com/repos/simonw/datasette/issues/277,390804333,MDEyOklzc3VlQ29tbWVudDM5MDgwNDMzMw==,9599,simonw,2018-05-21T22:40:16Z,2018-05-21T22:43:50Z,OWNER,"We should merge this before refactoring the tests though, because that way we don't couple the new tests to the verification of this change.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324836533,Refactor inspect logic, https://github.com/simonw/datasette/issues/278#issuecomment-390991640,https://api.github.com/repos/simonw/datasette/issues/278,390991640,MDEyOklzc3VlQ29tbWVudDM5MDk5MTY0MA==,9599,simonw,2018-05-22T13:33:46Z,2018-05-22T13:33:46Z,OWNER,For SpatiaLite this example may be useful - though it's building 4.3.0 and not 4.4.0: https://github.com/terranodo/spatialite-docker/blob/master/Dockerfile,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325294102,Build smallest possible Docker image with Datasette plus recent SQLite (with json1) plus Spatialite 4.4.0, https://github.com/simonw/datasette/issues/278#issuecomment-390993397,https://api.github.com/repos/simonw/datasette/issues/278,390993397,MDEyOklzc3VlQ29tbWVudDM5MDk5MzM5Nw==,9599,simonw,2018-05-22T13:38:57Z,2018-05-22T13:38:57Z,OWNER,"Useful GitHub code search: https://github.com/search?utf8=✓&q=%22libspatialite-4.4.0%22+%22RC0%22&type=Code ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325294102,Build smallest possible Docker image with Datasette plus recent SQLite (with json1) plus Spatialite 4.4.0, https://github.com/simonw/datasette/issues/278#issuecomment-390993861,https://api.github.com/repos/simonw/datasette/issues/278,390993861,MDEyOklzc3VlQ29tbWVudDM5MDk5Mzg2MQ==,9599,simonw,2018-05-22T13:40:14Z,2018-05-22T14:38:05Z,OWNER,If we can't get `import sqlite3` to load the latest version but we can get `import pysqlite3` to work that's fine too - I can teach Datasette to import the best available version.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325294102,Build smallest possible Docker image with Datasette plus recent SQLite (with json1) plus Spatialite 4.4.0, https://github.com/simonw/datasette/pull/279#issuecomment-391055490,https://api.github.com/repos/simonw/datasette/issues/279,391055490,MDEyOklzc3VlQ29tbWVudDM5MTA1NTQ5MA==,9599,simonw,2018-05-22T16:29:30Z,2018-05-22T16:29:30Z,OWNER,"This is fantastic! I think I prefer the aesthetics of just ""0.22"" for the version string if it's a tagged release with no additional changes - does that work? I'd like to continue to provide a tuple that can be imported from the version.py module as well, as seen here: https://github.com/simonw/datasette/blob/558d9d7bfef3dd633eb16389281b67d42c9bdeef/datasette/version.py#L1 Presumably we can generate that from the versioneer string? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325352370,Add version number support with Versioneer, https://github.com/simonw/datasette/pull/279#issuecomment-391073009,https://api.github.com/repos/simonw/datasette/issues/279,391073009,MDEyOklzc3VlQ29tbWVudDM5MTA3MzAwOQ==,198537,rgieseke,2018-05-22T17:23:26Z,2018-05-22T17:23:26Z,CONTRIBUTOR,"> I think I prefer the aesthetics of just ""0.22"" for the version string if it's a tagged release with no additional changes - does that work? Yes! That's the default versioneer behaviour. > I'd like to continue to provide a tuple that can be imported from the version.py module as well, as seen here: Should work now, it can be a two (for a tagged version), three or four items tuple. ``` In [2]: datasette.__version__ Out[2]: '0.12+292.ga70c2a8.dirty' In [3]: datasette.__version_info__ Out[3]: ('0', '12+292', 'ga70c2a8', 'dirty') ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325352370,Add version number support with Versioneer, https://github.com/simonw/datasette/pull/279#issuecomment-391073267,https://api.github.com/repos/simonw/datasette/issues/279,391073267,MDEyOklzc3VlQ29tbWVudDM5MTA3MzI2Nw==,198537,rgieseke,2018-05-22T17:24:16Z,2018-05-22T17:24:16Z,CONTRIBUTOR,"Sorry, just realised you rely on `version` being a module ...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325352370,Add version number support with Versioneer, https://github.com/simonw/datasette/pull/279#issuecomment-391077700,https://api.github.com/repos/simonw/datasette/issues/279,391077700,MDEyOklzc3VlQ29tbWVudDM5MTA3NzcwMA==,198537,rgieseke,2018-05-22T17:38:17Z,2018-05-22T17:38:17Z,CONTRIBUTOR,"Alright, that should work now -- let me know if you would prefer any different behaviour.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325352370,Add version number support with Versioneer, https://github.com/simonw/datasette/pull/280#issuecomment-391059008,https://api.github.com/repos/simonw/datasette/issues/280,391059008,MDEyOklzc3VlQ29tbWVudDM5MTA1OTAwOA==,565628,r4vi,2018-05-22T16:40:27Z,2018-05-22T16:40:27Z,CONTRIBUTOR,"```python >>> import sqlite3 >>> sqlite3.sqlite_version '3.23.1' >>> ``` running the above in the container seems to show 3.23.1 too so maybe we don't need pysqlite3 at all?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391076239,https://api.github.com/repos/simonw/datasette/issues/280,391076239,MDEyOklzc3VlQ29tbWVudDM5MTA3NjIzOQ==,9599,simonw,2018-05-22T17:33:33Z,2018-05-22T17:33:33Z,OWNER,This looks amazing! Can't wait to try this out this evening.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391076458,https://api.github.com/repos/simonw/datasette/issues/280,391076458,MDEyOklzc3VlQ29tbWVudDM5MTA3NjQ1OA==,9599,simonw,2018-05-22T17:34:13Z,2018-05-22T17:34:13Z,OWNER,Yeah let's try this without pysqlite3 and see if we still get the correct version.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391141391,https://api.github.com/repos/simonw/datasette/issues/280,391141391,MDEyOklzc3VlQ29tbWVudDM5MTE0MTM5MQ==,565628,r4vi,2018-05-22T21:08:39Z,2018-05-22T21:08:39Z,CONTRIBUTOR,"I'm going to clean this up for consistency tomorrow morning so hold off merging until then please On Tue, May 22, 2018 at 6:34 PM, Simon Willison wrote: > Yeah let's try this without pysqlite3 and see if we still get the correct > version. > > — > You are receiving this because you authored the thread. > Reply to this email directly, view it on GitHub > , or mute > the thread > > . > ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391190497,https://api.github.com/repos/simonw/datasette/issues/280,391190497,MDEyOklzc3VlQ29tbWVudDM5MTE5MDQ5Nw==,9599,simonw,2018-05-23T01:22:53Z,2018-05-23T01:22:53Z,OWNER,"I grabbed just your Dockerfile and built it like this: docker build . -t datasette Once it had built, I ran it like this: docker run -p 8001:8001 -v `pwd`:/mnt datasette \ datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db \ --load-extension=/usr/local/lib/mod_spatialite.so (The fixtures.db file is created by running `python tests/fixtures.py fixtures.db`) Then I visited http://localhost:8001/-/versions and I got this: { ""datasette"": { ""version"": ""0+unknown"" }, ""python"": { ""full"": ""3.6.3 (default, Dec 12 2017, 06:37:05) \n[GCC 6.3.0 20170516]"", ""version"": ""3.6.3"" }, ""sqlite"": { ""extensions"": { ""json1"": null, ""spatialite"": ""4.4.0-RC0"" }, ""fts_versions"": [ ""FTS4"", ""FTS3"" ], ""version"": ""3.23.1"" } } Fantastic! I'm getting SQLite `3.23.1` and SpatiaLite `4.4.0-RC0`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391290271,https://api.github.com/repos/simonw/datasette/issues/280,391290271,MDEyOklzc3VlQ29tbWVudDM5MTI5MDI3MQ==,565628,r4vi,2018-05-23T09:53:38Z,2018-05-23T09:53:38Z,CONTRIBUTOR,"Running: ```bash docker run -p 8001:8001 -v `pwd`:/mnt datasette \ datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db \ --load-extension=/usr/local/lib/mod_spatialite.so ``` is now returning FTS5 enabled in the versions output: ```json { ""datasette"": { ""version"": ""0.22"" }, ""python"": { ""full"": ""3.6.5 (default, May 5 2018, 03:07:21) \n[GCC 6.3.0 20170516]"", ""version"": ""3.6.5"" }, ""sqlite"": { ""extensions"": { ""json1"": null, ""spatialite"": ""4.4.0-RC0"" }, ""fts_versions"": [ ""FTS5"", ""FTS4"", ""FTS3"" ], ""version"": ""3.23.1"" } } ``` The old query didn't work because specifying `(t TEXT)` caused an error","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391354237,https://api.github.com/repos/simonw/datasette/issues/280,391354237,MDEyOklzc3VlQ29tbWVudDM5MTM1NDIzNw==,9599,simonw,2018-05-23T13:51:22Z,2018-05-23T13:51:22Z,OWNER,@r4vi any objections to me merging this?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391355030,https://api.github.com/repos/simonw/datasette/issues/280,391355030,MDEyOklzc3VlQ29tbWVudDM5MTM1NTAzMA==,565628,r4vi,2018-05-23T13:53:27Z,2018-05-23T15:22:45Z,CONTRIBUTOR,"No objections; It's good to go @simonw On Wed, 23 May 2018, 14:51 Simon Willison, wrote: > @r4vi any objections to me merging this? > > — > You are receiving this because you were mentioned. > Reply to this email directly, view it on GitHub > , or mute > the thread > > . > ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391437199,https://api.github.com/repos/simonw/datasette/issues/280,391437199,MDEyOklzc3VlQ29tbWVudDM5MTQzNzE5OQ==,9599,simonw,2018-05-23T17:44:20Z,2018-05-23T17:44:20Z,OWNER,Thank you very much! This is most excellent.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-401003061,https://api.github.com/repos/simonw/datasette/issues/280,401003061,MDEyOklzc3VlQ29tbWVudDQwMTAwMzA2MQ==,9599,simonw,2018-06-28T11:26:23Z,2018-06-28T11:26:23Z,OWNER,I pushed this to Docker Hub https://hub.docker.com/r/datasetteproject/datasette/ and added notes on how to use it to the documentation: http://datasette.readthedocs.io/en/latest/installation.html#using-docker,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/281#issuecomment-391437462,https://api.github.com/repos/simonw/datasette/issues/281,391437462,MDEyOklzc3VlQ29tbWVudDM5MTQzNzQ2Mg==,9599,simonw,2018-05-23T17:45:07Z,2018-05-23T17:45:07Z,OWNER,I'm afraid I just merged #280 which means this no longer applies. You're very welcome to see if you can further optimize the new Dockerfile though.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325553991,Reduces image size using Alpine + Multistage (re: #278), https://github.com/simonw/datasette/issues/282#issuecomment-391355099,https://api.github.com/repos/simonw/datasette/issues/282,391355099,MDEyOklzc3VlQ29tbWVudDM5MTM1NTA5OQ==,9599,simonw,2018-05-23T13:53:39Z,2018-05-23T13:53:39Z,OWNER,Confirmed fixed: https://fivethirtyeight-datasette-mipwbeadvr.now.sh/fivethirtyeight-5de27e3/nba-elo%2Fnbaallelo?_facet=lg_id&_next=100 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325705981,Faceting breaks pagination, https://github.com/simonw/datasette/issues/283#issuecomment-391583528,https://api.github.com/repos/simonw/datasette/issues/283,391583528,MDEyOklzc3VlQ29tbWVudDM5MTU4MzUyOA==,9599,simonw,2018-05-24T04:21:49Z,2018-05-24T04:21:49Z,OWNER,"The challenge here is which database should be the ""default"" database. The first database attached to SQLite is treated as the default - if no database is specified in a query, that's the database that queries will be executed against. Currently, each database URL in Datasette (e.g. https://san-francisco.datasettes.com/sf-film-locations-84594a7 v.s. https://san-francisco.datasettes.com/sf-trees-ebc2ad9 ) gets its own independent connection, and all queries within that base URL run against that database. If we're going to attach multiple databases to the same connection, how do we set which database gets to be the default? The easiest thing to do here will be to have a special database (maybe which is turned off by default and can be enabled using `datasette serve --enable-cross-database-joins` or similar) which attaches to ALL the databases. Perhaps it starts as an in-memory database, maybe at `/memory`? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391584112,https://api.github.com/repos/simonw/datasette/issues/283,391584112,MDEyOklzc3VlQ29tbWVudDM5MTU4NDExMg==,9599,simonw,2018-05-24T04:26:29Z,2018-05-24T04:30:50Z,OWNER,"I built a very rough prototype of this to prove it could work. It's deployed here - and here's an example of a query that joins across two different databases: https://datasette-cross-database-joins-prototype.now.sh/memory?sql=select+fivethirtyeight.%5Blove-actually%2Flove_actually_adjacencies%5D.rowid%2C%0D%0Afivethirtyeight.%5Blove-actually%2Flove_actually_adjacencies%5D.actors%2C%0D%0A%5Bgoogle-trends%5D.%5B20150430_UKDebate%5D.city%0D%0Afrom+fivethirtyeight.%5Blove-actually%2Flove_actually_adjacencies%5D%0D%0Ajoin+%5Bgoogle-trends%5D.%5B20150430_UKDebate%5D%0D%0A++on+%5Bgoogle-trends%5D.%5B20150430_UKDebate%5D.rowid+%3D+fivethirtyeight.%5Blove-actually%2Flove_actually_adjacencies%5D.rowid ``` select fivethirtyeight.[love-actually/love_actually_adjacencies].rowid, fivethirtyeight.[love-actually/love_actually_adjacencies].actors, [google-trends].[20150430_UKDebate].city from fivethirtyeight.[love-actually/love_actually_adjacencies] join [google-trends].[20150430_UKDebate] on [google-trends].[20150430_UKDebate].rowid = fivethirtyeight.[love-actually/love_actually_adjacencies].rowid ``` I deployed it like this: datasette publish now --branch=cross-database-joins fivethirtyeight.db google-trends.db --name=datasette-cross-database-joins-prototype ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391584366,https://api.github.com/repos/simonw/datasette/issues/283,391584366,MDEyOklzc3VlQ29tbWVudDM5MTU4NDM2Ng==,9599,simonw,2018-05-24T04:28:20Z,2018-05-24T04:28:20Z,OWNER,"I used some pretty ugly hacks, like faking an entire `.inspect()` block for the `:memory:` database just to get past the errors I was seeing. To ship this as a feature it will need quite a bit of code refactoring to make those hacks unnecessary. https://github.com/simonw/datasette/blob/7a3040f5782375373b2b66e5969bc2c49b3a6f0e/datasette/views/database.py#L18-L26","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391584527,https://api.github.com/repos/simonw/datasette/issues/283,391584527,MDEyOklzc3VlQ29tbWVudDM5MTU4NDUyNw==,9599,simonw,2018-05-24T04:29:40Z,2018-05-24T04:29:40Z,OWNER,Rather than stealing the `/memory` namespace for this it would be nicer if these cross-database joins could be executed at the very top-level URL of the Datasette instance - `https://example.com/?sql=...`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391752218,https://api.github.com/repos/simonw/datasette/issues/283,391752218,MDEyOklzc3VlQ29tbWVudDM5MTc1MjIxOA==,9599,simonw,2018-05-24T15:15:19Z,2018-05-24T15:15:19Z,OWNER,Most of the time Datasette is used with just a single database file. So maybe it makes sense for this option to be turned on by default and to ALWAYS be available on the Datasette instance homepage unless the user has explicitly disabled it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391752425,https://api.github.com/repos/simonw/datasette/issues/283,391752425,MDEyOklzc3VlQ29tbWVudDM5MTc1MjQyNQ==,9599,simonw,2018-05-24T15:15:51Z,2018-05-24T15:15:51Z,OWNER,"This would make Datasett's SQL features a lot more instantly obvious to people who land on a homepage, which is probably a good thing.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391752629,https://api.github.com/repos/simonw/datasette/issues/283,391752629,MDEyOklzc3VlQ29tbWVudDM5MTc1MjYyOQ==,9599,simonw,2018-05-24T15:16:25Z,2018-05-24T15:16:25Z,OWNER,"Should this support canned queries too? I think it should, though that raises interesting questions regarding their URL structure.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391752882,https://api.github.com/repos/simonw/datasette/issues/283,391752882,MDEyOklzc3VlQ29tbWVudDM5MTc1Mjg4Mg==,9599,simonw,2018-05-24T15:17:10Z,2018-05-24T15:17:10Z,OWNER,Another option: give this the `/-/all` URL namespace.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391754506,https://api.github.com/repos/simonw/datasette/issues/283,391754506,MDEyOklzc3VlQ29tbWVudDM5MTc1NDUwNg==,9599,simonw,2018-05-24T15:21:37Z,2018-05-24T15:21:53Z,OWNER,"Giving it `/all/` would be easier since that way the existing URL routes (including canned queries) would all work... but I would have to teach it NOT to expect a database content hash on that URL. Or maybe it should still have a content hash (to enable far-future cache expiry headers on query results) but the hash should be constructed out of all of the other database hashes concatenated together. That way the URLs would be `/all-5de27e3` and `/all-5de27e3/canned-query-name` Only downside: this would make it impossible to have a database file with the name `all.db`. I think that's probably an OK trade-off. You could turn the feature off with a config flag if you really want to use that filename (for whatever reason). How about `/-all-5de27e3/` instead to avoid collisions?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391755300,https://api.github.com/repos/simonw/datasette/issues/283,391755300,MDEyOklzc3VlQ29tbWVudDM5MTc1NTMwMA==,9599,simonw,2018-05-24T15:23:37Z,2018-05-24T15:23:37Z,OWNER,On the `/-all-5de27e3` page we can show the regular https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3 interface but instead of the list of tables we can show a list of attached databases plus some help text showing how to construct a cross-database join.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391756841,https://api.github.com/repos/simonw/datasette/issues/283,391756841,MDEyOklzc3VlQ29tbWVudDM5MTc1Njg0MQ==,9599,simonw,2018-05-24T15:27:42Z,2018-05-24T15:27:42Z,OWNER,"For an example query that pre-populates that textarea... maybe a UNION that pulls the first 10 rows from the first table of each of the first two databases? ``` select * from (select rowid, actors from fivethirtyeight.[love-actually/love_actually_adjacencies] limit 10) union all select * from (select rowid, city from [google-trends].[20150430_UKDebate] limit 10) ``` https://datasette-cross-database-joins-prototype.now.sh/memory?sql=select+*+from+%28select+rowid%2C+actors+from+fivethirtyeight.%5Blove-actually%2Flove_actually_adjacencies%5D+limit+10%29%0D%0A+++union+all%0D%0Aselect+*+from+%28select+rowid%2C+city+from+%5Bgoogle-trends%5D.%5B20150430_UKDebate%5D+limit+10%29","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391768302,https://api.github.com/repos/simonw/datasette/issues/283,391768302,MDEyOklzc3VlQ29tbWVudDM5MTc2ODMwMg==,9599,simonw,2018-05-24T16:00:05Z,2018-05-24T16:00:05Z,OWNER,I like `/-/all-5de27e3` for this (with `/-/all` redirecting to the correct hash),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-537716955,https://api.github.com/repos/simonw/datasette/issues/283,537716955,MDEyOklzc3VlQ29tbWVudDUzNzcxNjk1NQ==,9599,simonw,2019-10-02T23:02:15Z,2019-10-02T23:02:15Z,OWNER,"I've been thinking pretty hard about this as part of #569. My big concerns are: * If I'm caching and reusing connections I need to worry about the different combinations - if I have four databases do I cache separate connections for the (""one"", ""two"") AND (""two"", ""three"") AND (""one"", ""three"") and so on pairs? * How does the API and interface deal with instances where you have a database connected as the primary and you want to ATTACH another database and talk to that as well? I think the best way to do this is to say that cross-database joins will only be available against the `:memory:` database. Maybe with an optional mode you can run like `datasette --crossdb` which causes every database to be `ATTACHd` to that connection with an alias so you can start running queries. If this proves to be a problem when hundreds of files are attached to a Datasette Library instance (#417) then maybe cross database joins are handled (in that case) by the authenticated user selecting which ones to ?_attach= and detaching them at the end of the request. Also perhaps limit to joining across a maximum of 3 databases at once in this case. I can probably avoid the scariest negative consequences of cross-database joins by having them turned off by default for signed-out users. The datasette-on-my-laptop or authenticated Datasette Library cases can be opt-in and can be a little less locked down.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-552140870,https://api.github.com/repos/simonw/datasette/issues/283,552140870,MDEyOklzc3VlQ29tbWVudDU1MjE0MDg3MA==,9599,simonw,2019-11-09T21:49:51Z,2019-11-09T21:49:51Z,OWNER,"Better idea: if you run Datasette in cross-database joining mode, all connections start out as memory connections and then have new databases attached to them on-demand. All table view queries will be automatically rewritten to start `SELECT db.table.one, db.table.two FROM db.table ...`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-552140975,https://api.github.com/repos/simonw/datasette/issues/283,552140975,MDEyOklzc3VlQ29tbWVudDU1MjE0MDk3NQ==,9599,simonw,2019-11-09T21:51:41Z,2019-11-09T21:51:41Z,OWNER,It may turn out that we have to recommend NOT exposing a Datasette instance to the public with dozens of database files that has multi-db queries enabled - will need to load test to understand if this recommendation is needed or not.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-780991910,https://api.github.com/repos/simonw/datasette/issues/283,780991910,MDEyOklzc3VlQ29tbWVudDc4MDk5MTkxMA==,9308268,rayvoelker,2021-02-18T02:13:56Z,2021-02-18T02:13:56Z,NONE,"I was going ask you about this issue when we talk during your office-hours schedule this Friday, but was there any support ever added for doing this cross-database joining? I have a use-case where could be pretty neat to do analysis using this tool on time-specific databases from snapshots https://ilsweb.cincinnatilibrary.org/collection-analysis/ ![image](https://user-images.githubusercontent.com/9308268/108294883-ba3a8e00-7164-11eb-9206-fcd5a8cdd883.png) and thanks again for such an amazing tool!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-781077127,https://api.github.com/repos/simonw/datasette/issues/283,781077127,MDEyOklzc3VlQ29tbWVudDc4MTA3NzEyNw==,9599,simonw,2021-02-18T05:56:30Z,2021-02-18T05:57:34Z,OWNER,I'm going to to try prototyping the `--crossdb` option that causes `/_memory` to connect to all databases as a starting point and see how well that works.,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 1, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-781573676,https://api.github.com/repos/simonw/datasette/issues/283,781573676,MDEyOklzc3VlQ29tbWVudDc4MTU3MzY3Ng==,9599,simonw,2021-02-18T19:13:30Z,2021-02-18T19:13:30Z,OWNER,"It turns out SQLite defaults to a maximum of 10 attached databases. This can be increased using a compile-time constant, but even with that it cannot be more than 62: https://stackoverflow.com/questions/9845448/attach-limit-10","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-781574786,https://api.github.com/repos/simonw/datasette/issues/283,781574786,MDEyOklzc3VlQ29tbWVudDc4MTU3NDc4Ng==,9599,simonw,2021-02-18T19:15:37Z,2021-02-18T19:15:37Z,OWNER,`select * from pragma_database_list();` is useful - shows all attached databases for the current connection.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-781591015,https://api.github.com/repos/simonw/datasette/issues/283,781591015,MDEyOklzc3VlQ29tbWVudDc4MTU5MTAxNQ==,9599,simonw,2021-02-18T19:44:02Z,2021-02-18T19:44:02Z,OWNER,For the moment I'm going to hard-code a `SQLITE_LIMIT_ATTACHED=10` constant and only attach the first 10 databases to the `_memory` connection.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-781593169,https://api.github.com/repos/simonw/datasette/issues/283,781593169,MDEyOklzc3VlQ29tbWVudDc4MTU5MzE2OQ==,9599,simonw,2021-02-18T19:47:34Z,2021-02-18T19:47:34Z,OWNER,"I have a working version now, moving development to a pull request.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-781665560,https://api.github.com/repos/simonw/datasette/issues/283,781665560,MDEyOklzc3VlQ29tbWVudDc4MTY2NTU2MA==,9599,simonw,2021-02-18T22:06:14Z,2021-02-18T22:06:14Z,OWNER,"The implementation in #1232 is ready to land. It's the simplest-thing-that-could-possibly-work: you can run `datasette one.db two.db three.db --crossdb` and then use the `/_memory` page to run joins across tables from multiple databases. It only works on the first 10 databases that were passed to the command-line. This means that if you have a Datasette instance with hundreds of attached databases (see [Datasette Library](https://github.com/simonw/datasette/issues/417)) this won't be particularly useful for you. So... a better, future version of this feature would be one that lets you join across databases on command - maybe by hitting `/_memory?attach=db1&attach=db2` to get a special connection. Also worth noting: plugins that implement the [prepare_connection()](https://docs.datasette.io/en/stable/plugin_hooks.html#prepare-connection-conn-database-datasette) hook can attach additional databases - so if you need better, customized support for this one way to handle that would be with a custom plugin.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-781670827,https://api.github.com/repos/simonw/datasette/issues/283,781670827,MDEyOklzc3VlQ29tbWVudDc4MTY3MDgyNw==,9599,simonw,2021-02-18T22:16:46Z,2021-02-18T22:16:46Z,OWNER,"Demo is now live here: https://latest.datasette.io/_memory The documentation is at https://docs.datasette.io/en/latest/sql_queries.html#cross-database-queries - it links to this example query: https://latest.datasette.io/_memory?sql=select%0D%0A++%27fixtures%27+as+database%2C+*%0D%0Afrom%0D%0A++%5Bfixtures%5D.sqlite_master%0D%0Aunion%0D%0Aselect%0D%0A++%27extra_database%27+as+database%2C+*%0D%0Afrom%0D%0A++%5Bextra_database%5D.sqlite_master","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-781764561,https://api.github.com/repos/simonw/datasette/issues/283,781764561,MDEyOklzc3VlQ29tbWVudDc4MTc2NDU2MQ==,9599,simonw,2021-02-19T02:10:21Z,2021-02-19T02:10:21Z,OWNER,This feature is now released! https://docs.datasette.io/en/stable/changelog.html#v0-55,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 1, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-789680230,https://api.github.com/repos/simonw/datasette/issues/283,789680230,MDEyOklzc3VlQ29tbWVudDc4OTY4MDIzMA==,605492,justinpinkney,2021-03-03T12:28:42Z,2021-03-03T12:28:42Z,NONE,"One note on using this pragma I got an error on starting datasette `no such table: pragma_database_list`. I diagnosed this to an older version of sqlite3 (3.14.2) and upgrading to a newer version (3.34.2) fixed the issue.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-855369819,https://api.github.com/repos/simonw/datasette/issues/283,855369819,MDEyOklzc3VlQ29tbWVudDg1NTM2OTgxOQ==,9599,simonw,2021-06-06T09:40:18Z,2021-06-06T09:40:18Z,OWNER,"> One note on using this pragma I got an error on starting datasette `no such table: pragma_database_list`. > > I diagnosed this to an older version of sqlite3 (3.14.2) and upgrading to a newer version (3.34.2) fixed the issue. That issue is fixed in #1276.","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/284#issuecomment-391765706,https://api.github.com/repos/simonw/datasette/issues/284,391765706,MDEyOklzc3VlQ29tbWVudDM5MTc2NTcwNg==,9599,simonw,2018-05-24T15:52:24Z,2018-05-24T15:52:24Z,OWNER,I'm not crazy about the `enable_` prefix on these.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326182814,Ability to enable/disable specific features via --config, https://github.com/simonw/datasette/issues/284#issuecomment-391765973,https://api.github.com/repos/simonw/datasette/issues/284,391765973,MDEyOklzc3VlQ29tbWVudDM5MTc2NTk3Mw==,9599,simonw,2018-05-24T15:53:08Z,2018-05-24T15:53:08Z,OWNER,This will also give us a mechanism for turning on and off the cross-database joins feature from #283,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326182814,Ability to enable/disable specific features via --config, https://github.com/simonw/datasette/issues/284#issuecomment-391766420,https://api.github.com/repos/simonw/datasette/issues/284,391766420,MDEyOklzc3VlQ29tbWVudDM5MTc2NjQyMA==,9599,simonw,2018-05-24T15:54:33Z,2018-05-24T15:54:33Z,OWNER,"Maybe `allow_sql`, `allow_facet` and `allow_download`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326182814,Ability to enable/disable specific features via --config, https://github.com/simonw/datasette/issues/284#issuecomment-391912392,https://api.github.com/repos/simonw/datasette/issues/284,391912392,MDEyOklzc3VlQ29tbWVudDM5MTkxMjM5Mg==,9599,simonw,2018-05-25T01:16:56Z,2018-05-25T01:17:13Z,OWNER,`allow_sql` should only affect the `?sql=` parameter and whether or not the form is displayed. You should still be able to use and execute canned queries even if this option is turned off.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326182814,Ability to enable/disable specific features via --config, https://github.com/simonw/datasette/issues/284#issuecomment-391950691,https://api.github.com/repos/simonw/datasette/issues/284,391950691,MDEyOklzc3VlQ29tbWVudDM5MTk1MDY5MQ==,9599,simonw,2018-05-25T06:01:23Z,2018-05-25T06:05:02Z,OWNER,"Demo: datasette publish now --branch=master fixtures.db \ --source=""#284 Demo"" \ --source_url=""https://github.com/simonw/datasette/issues/284"" \ --extra-options ""--config allow_sql:off --config allow_facet:off --config allow_download:off"" \ --name=datasette-demo-284 now alias https://datasette-demo-284-jogjwngegj.now.sh datasette-demo-284.now.sh https://datasette-demo-284.now.sh/ Note the following: * https://datasette-demo-284.now.sh/fixtures-fda0fea has no SQL input textarea * https://datasette-demo-284.now.sh/fixtures-fda0fea has no database download link * https://datasette-demo-284.now.sh/fixtures-fda0fea.db returns 403 forbidden * https://datasette-demo-284.now.sh/fixtures-fda0fea?sql=select%20*%20from%20sqlite_master throws error 400 * https://datasette-demo-284.now.sh/fixtures-fda0fea/facetable shows no suggested facets * https://datasette-demo-284.now.sh/fixtures-fda0fea/facetable?_facet=city_id throws error 400","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326182814,Ability to enable/disable specific features via --config, https://github.com/simonw/datasette/issues/285#issuecomment-392297392,https://api.github.com/repos/simonw/datasette/issues/285,392297392,MDEyOklzc3VlQ29tbWVudDM5MjI5NzM5Mg==,9599,simonw,2018-05-27T00:50:27Z,2018-05-27T00:50:27Z,OWNER,"I ran a very rough micro-benchmark on the new `num_sql_threads` config option. datasette --config num_sql_threads:1 fivethirtyeight.db Then ab -n 100 -c 10 'http://127.0.0.1:8011/fivethirtyeight-2628db9/twitter-ratio%2Fsenators' | Number of threads | Requests/second | |---|---| | 1 | 4.57 | | 3 | 9.77 | | 10 | 13.53 | | 20 | 15.24 | 50 | 8.21 | ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326189744,num_threads and cache_max_age should be --config options, https://github.com/simonw/datasette/issues/285#issuecomment-392297508,https://api.github.com/repos/simonw/datasette/issues/285,392297508,MDEyOklzc3VlQ29tbWVudDM5MjI5NzUwOA==,9599,simonw,2018-05-27T00:53:35Z,2018-05-27T00:53:35Z,OWNER,Documentation: http://datasette.readthedocs.io/en/latest/config.html#num-sql-threads,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326189744,num_threads and cache_max_age should be --config options, https://github.com/simonw/datasette/issues/286#issuecomment-392121500,https://api.github.com/repos/simonw/datasette/issues/286,392121500,MDEyOklzc3VlQ29tbWVudDM5MjEyMTUwMA==,9599,simonw,2018-05-25T17:06:46Z,2018-05-25T17:06:46Z,OWNER,"A few extra thoughts: * Some users may want to opt out of this. We could have `--config version_in_hash:false` * should this affect the filename for the downloadable copy of the SQLite database? Maybe that should stay as just the hash of the contents, but that's a fair bit more complex * What about users who stick with the same version of datasette but deploy changes to their custom templates - how can we help them cache bust? Maybe with `--config cache_version:2`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326599525,Database hash should include current datasette version, https://github.com/simonw/datasette/issues/286#issuecomment-392121743,https://api.github.com/repos/simonw/datasette/issues/286,392121743,MDEyOklzc3VlQ29tbWVudDM5MjEyMTc0Mw==,9599,simonw,2018-05-25T17:07:36Z,2018-05-25T17:07:36Z,OWNER,This is also a great excuse to finally write up some detailed documentation on Datasette's caching strategy,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326599525,Database hash should include current datasette version, https://github.com/simonw/datasette/issues/287#issuecomment-392296758,https://api.github.com/repos/simonw/datasette/issues/287,392296758,MDEyOklzc3VlQ29tbWVudDM5MjI5Njc1OA==,9599,simonw,2018-05-27T00:32:53Z,2018-05-27T00:32:53Z,OWNER,Docs: https://datasette.readthedocs.io/en/latest/json_api.html#different-shapes,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326617744,?_shape=arrayfirst, https://github.com/simonw/datasette/issues/288#issuecomment-392288531,https://api.github.com/repos/simonw/datasette/issues/288,392288531,MDEyOklzc3VlQ29tbWVudDM5MjI4ODUzMQ==,9599,simonw,2018-05-26T21:14:37Z,2019-04-15T23:01:17Z,OWNER,"This might also be an opportunity to support an __in= operator - though that's an odd one as it acts equivalent to an OR whereas every other parameter is combined with an AND UPDATE 15th April 2019: I implemented `?column__in=` in a different way, see #433 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326767626,Support multiple filters of the same type, https://github.com/simonw/datasette/issues/288#issuecomment-483106262,https://api.github.com/repos/simonw/datasette/issues/288,483106262,MDEyOklzc3VlQ29tbWVudDQ4MzEwNjI2Mg==,9599,simonw,2019-04-15T04:48:59Z,2019-04-15T04:48:59Z,OWNER,"This has got more urgent now that I've added the `?column__arraycontains=foo` filter as part of the effort to implement facet-by-array for #359 I added that filter in https://github.com/simonw/datasette/commit/78e45ead4d771007c57b307edf8fc920101f8733 but it can only be applied once - for proper faceting this needs to work: https://latest.datasette.io/fixtures/facetable?tags__arraycontains=tag1&tags__arraycontains=tag2","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326767626,Support multiple filters of the same type, https://github.com/simonw/datasette/issues/288#issuecomment-483458569,https://api.github.com/repos/simonw/datasette/issues/288,483458569,MDEyOklzc3VlQ29tbWVudDQ4MzQ1ODU2OQ==,9599,simonw,2019-04-15T23:45:04Z,2019-04-15T23:45:04Z,OWNER,https://latest.datasette.io/fixtures/facetable?tags__arraycontains=tag1&tags__arraycontains=tag2 now works.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326767626,Support multiple filters of the same type, https://github.com/simonw/datasette/issues/289#issuecomment-392288990,https://api.github.com/repos/simonw/datasette/issues/289,392288990,MDEyOklzc3VlQ29tbWVudDM5MjI4ODk5MA==,9599,simonw,2018-05-26T21:24:10Z,2018-05-26T21:24:10Z,OWNER,An example of a query where you might want to use this option: https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3?sql=select+rowid%2C+*+from+%5Balcohol-consumption%2Fdrinks%5D+order+by+random%28%29+limit+1,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326768188,?_ttl= parameter to control caching, https://github.com/simonw/datasette/issues/289#issuecomment-392291605,https://api.github.com/repos/simonw/datasette/issues/289,392291605,MDEyOklzc3VlQ29tbWVudDM5MjI5MTYwNQ==,9599,simonw,2018-05-26T22:20:02Z,2018-05-26T22:20:02Z,OWNER,Documented here https://datasette.readthedocs.io/en/latest/json_api.html#special-table-arguments and here: https://datasette.readthedocs.io/en/latest/config.html#default-cache-ttl,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326768188,?_ttl= parameter to control caching, https://github.com/simonw/datasette/issues/289#issuecomment-392291716,https://api.github.com/repos/simonw/datasette/issues/289,392291716,MDEyOklzc3VlQ29tbWVudDM5MjI5MTcxNg==,9599,simonw,2018-05-26T22:22:47Z,2018-05-26T22:22:47Z,OWNER,Demo: hit refresh on https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3?sql=select+rowid%2C+*+from+%5Balcohol-consumption%2Fdrinks%5D+order+by+random%28%29+limit+1&_ttl=0,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326768188,?_ttl= parameter to control caching, https://github.com/simonw/datasette/issues/291#issuecomment-392302406,https://api.github.com/repos/simonw/datasette/issues/291,392302406,MDEyOklzc3VlQ29tbWVudDM5MjMwMjQwNg==,9599,simonw,2018-05-27T03:18:06Z,2018-05-27T03:18:06Z,OWNER,"My first attempt at this was to have plugins depend on each other - so there would be a `datasette-leaflet` plugin which adds Leaflet to the page, and the `datasette-cluster-map` and `datasette-leaflet-geojson` plugins would depend on that plugin. I tried this and it didn't work, because it turns out the order in which plugins are loaded isn't predictable. `datasette-cluster-map` ended up adding it's script link before Leaflet had been loaded by `datasette-leaflet`, resulting in JavaScript errors.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326783670,Avoid plugins accidentally loading dependencies twice, https://github.com/simonw/datasette/issues/291#issuecomment-392302416,https://api.github.com/repos/simonw/datasette/issues/291,392302416,MDEyOklzc3VlQ29tbWVudDM5MjMwMjQxNg==,9599,simonw,2018-05-27T03:18:16Z,2018-05-27T03:18:16Z,OWNER,For the moment then I'm going with a really simple solution: when iterating through `extra_css_urls` and `extra_js_urls` de-dupe by URL and avoid outputting the same link twice.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326783670,Avoid plugins accidentally loading dependencies twice, https://github.com/simonw/datasette/issues/291#issuecomment-392302456,https://api.github.com/repos/simonw/datasette/issues/291,392302456,MDEyOklzc3VlQ29tbWVudDM5MjMwMjQ1Ng==,9599,simonw,2018-05-27T03:19:24Z,2018-05-27T03:19:24Z,OWNER,The big gap in this solution is conflicting versions: I don't yet have a story for what happens if two plugins attempt to load different versions of Leaflet. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326783670,Avoid plugins accidentally loading dependencies twice, https://github.com/simonw/datasette/issues/292#issuecomment-392316673,https://api.github.com/repos/simonw/datasette/issues/292,392316673,MDEyOklzc3VlQ29tbWVudDM5MjMxNjY3Mw==,9599,simonw,2018-05-27T09:08:06Z,2018-05-27T09:08:06Z,OWNER,Open question: how should this affect the row page? Just because columns were hidden on the table page doesn't necessarily mean they should be hidden on the row page as well. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392316701,https://api.github.com/repos/simonw/datasette/issues/292,392316701,MDEyOklzc3VlQ29tbWVudDM5MjMxNjcwMQ==,9599,simonw,2018-05-27T09:08:49Z,2018-05-27T09:08:49Z,OWNER,I could certainly see people wanting different custom column selects for the row page compared to the table page.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392338130,https://api.github.com/repos/simonw/datasette/issues/292,392338130,MDEyOklzc3VlQ29tbWVudDM5MjMzODEzMA==,9599,simonw,2018-05-27T15:09:18Z,2018-05-27T15:09:28Z,OWNER,"Here's my first sketch at a metadata format for this: * `columns`: optional list of columns to include - if missing, shows all * `column_selects`: dictionary mapping column names to alternative select clauses `column_selects` can also invent new keys and use them to create derived columns. These new keys will be selected at the end of the list of columns UNLESS they are mentioned in `columns`, in which case that sequence will define the order. Can you facet by things that are customized using `column_selects`? Yes, and let's try running suggested facets against those columns as well. ``` { ""databases"": { ""databasename"": { ""tables"": { ""tablename"": { ""columns"": [ ""id"", ""name"", ""size"" ], ""column_selects"": { ""name"": ""upper(name)"", ""geo_json"": ""AsGeoJSON(Geometry)"" } ""row_columns"": [...] ""row_column_selects"": {...} } ``` The `row_columns` and `row_column_selects` properties work the same as the `column*` ones, except they are applied on the row page instead. If omitted, the `column*` ones will be used on the row page as well. If you want the row page to switch back to Datasette's default behaviour you can set `""row_columns"": [], ""row_column_selects"": {}`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392342269,https://api.github.com/repos/simonw/datasette/issues/292,392342269,MDEyOklzc3VlQ29tbWVudDM5MjM0MjI2OQ==,9599,simonw,2018-05-27T15:55:40Z,2018-05-27T16:01:26Z,OWNER,"Here's the metadata I tried against that first working prototype: ``` { ""databases"": { ""timezones"": { ""tables"": { ""timezones"": { ""columns"": [""PK_UID""], ""column_selects"": { ""upper_tzid"": ""upper(tzid)"", ""Geometry"": ""AsGeoJSON(Geometry)"" } } } }, ""wtr"": { ""tables"": { ""license_frequency"": { ""columns"": [""id"", ""license"", ""tx_rx"", ""frequency""], ""column_selects"": { ""latitude"": ""Y(Geometry)"", ""longitude"": ""X(Geometry)"" } } } } } } ``` Run using this: datasette timezones.db wtr.db \ --reload --debug --load-extension=/usr/local/lib/mod_spatialite.dylib \ -m column-metadata.json --config sql_time_limit_ms:10000 Usefully, the `--reload` flag detects changes to the `metadata.json` file as well as Datasette's own Python code.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392342947,https://api.github.com/repos/simonw/datasette/issues/292,392342947,MDEyOklzc3VlQ29tbWVudDM5MjM0Mjk0Nw==,9599,simonw,2018-05-27T16:01:43Z,2018-05-27T16:01:43Z,OWNER,I'd still like to be able to over-ride this using querystring arguments.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392343690,https://api.github.com/repos/simonw/datasette/issues/292,392343690,MDEyOklzc3VlQ29tbWVudDM5MjM0MzY5MA==,9599,simonw,2018-05-27T16:08:25Z,2018-05-27T16:08:40Z,OWNER,"Turns out it's actually possible to pull data from other tables using the mechanism in the prototype: ``` { ""databases"": { ""wtr"": { ""tables"": { ""license"": { ""column_selects"": { ""count"": ""(select count(*) from license_frequency where license_frequency.license = license.id)"" } } } } } } ``` Performance using this technique is pretty terrible though: ![2018-05-27 at 9 07 am](https://user-images.githubusercontent.com/9599/40588124-8169d7fa-618d-11e8-9880-ccc1904b05d9.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392343839,https://api.github.com/repos/simonw/datasette/issues/292,392343839,MDEyOklzc3VlQ29tbWVudDM5MjM0MzgzOQ==,9599,simonw,2018-05-27T16:10:09Z,2018-06-04T17:38:04Z,OWNER,"The more efficient way of doing this kind of count would be to provide a mechanism which can also add extra fragments to a `GROUP BY` clause used for the `SELECT`. Or... how about a mechanism similar to Django's `prefetch_related` which lets you define extra queries that will be called with a list of primary keys (or values from other columns) and used to populate a new column? A little unconventional but could be extremely useful and efficient. Related to that: since the per-query overhead in SQLite is tiny, could even define an extra query to be run once-per-row before returning results.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392345062,https://api.github.com/repos/simonw/datasette/issues/292,392345062,MDEyOklzc3VlQ29tbWVudDM5MjM0NTA2Mg==,9599,simonw,2018-05-27T16:26:53Z,2018-05-27T16:26:53Z,OWNER,There needs to be a way to turn this off and return to Datasette default bahviour. Maybe a `?_raw=1` querystring parameter for the table view.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392350495,https://api.github.com/repos/simonw/datasette/issues/292,392350495,MDEyOklzc3VlQ29tbWVudDM5MjM1MDQ5NQ==,9599,simonw,2018-05-27T17:47:31Z,2018-05-27T17:47:31Z,OWNER,"Querystring design: * `?_column=a&_column=b` - equivalent of `""columns"": [""a"", ""b""]` in `metadata.json` * `?_select_nameupper=upper(name)` - equivalent of `""column_selects"": {""nameupper"": ""upper(name)""}`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392350568,https://api.github.com/repos/simonw/datasette/issues/292,392350568,MDEyOklzc3VlQ29tbWVudDM5MjM1MDU2OA==,9599,simonw,2018-05-27T17:48:45Z,2018-05-27T17:54:41Z,OWNER,"If any `?_column=` parameters are provided the metadata version is completely ignored. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392350980,https://api.github.com/repos/simonw/datasette/issues/292,392350980,MDEyOklzc3VlQ29tbWVudDM5MjM1MDk4MA==,9599,simonw,2018-05-27T17:56:30Z,2018-05-27T17:56:50Z,OWNER,"Should `?_raw=1` also turn off foreign key expansions? No, we will eventually provide a separate mechanism for that (or leave it to nerds who care to figure out using JSON or CSV export).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-423543060,https://api.github.com/repos/simonw/datasette/issues/292,423543060,MDEyOklzc3VlQ29tbWVudDQyMzU0MzA2MA==,9599,simonw,2018-09-21T14:06:31Z,2018-09-21T14:09:06Z,OWNER,"I keep on finding new reasons that I want this. The latest is that I'm playing with the more advanced features of FTS5 - in particular the highlight() function and the ability to sort by rank. The problem is... in order to do this, I need to properly join against the `_fts` table. Here's an example query: select highlight(events_fts, 0, '', ''), events_fts.rank, events.* from events join events_fts on events.rowid = events_fts.rowid where events_fts match :search order by rank Note that this is a different query from the usual FTS one (which does `where rowid in (select rowid from events_fts...)`) because I need the rank column somewhere I can sort against. I'd like to be able to use this on the table view page so I can get faceting etc for free, but this is a completely different query from the default. Maybe I need a way to customize the entire query? That feels weird though - why am I not using a view in that case? Answer: because views can't accept `:search` style parameters. I could use a canned query, but canned queries don't get faceting etc.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-549169101,https://api.github.com/repos/simonw/datasette/issues/292,549169101,MDEyOklzc3VlQ29tbWVudDU0OTE2OTEwMQ==,9599,simonw,2019-11-03T19:17:08Z,2019-11-03T19:17:16Z,OWNER,"A good basic starting point for this would be to ignore the ability to add custom SQL fragments and instead focus on being able to show and hide specific columns. This will play particularly well with #613. Proposed syntax for that: `/db/table?_col=id&_col=name` - just show the `id` and `name` columns `/db/table?_nocol=extras&_nocol=age` - show all columns except for `extras` and `age` I don't think it makes sense to allow both `?_col=` and `?_nocol=` arguments in the same request, so if you provide both I think we throw a 400 error.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-549584753,https://api.github.com/repos/simonw/datasette/issues/292,549584753,MDEyOklzc3VlQ29tbWVudDU0OTU4NDc1Mw==,9599,simonw,2019-11-04T22:54:26Z,2019-11-04T22:54:26Z,OWNER,I'm going to split off an issue just for `?_col=` and `?_nocol=`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-849308907,https://api.github.com/repos/simonw/datasette/issues/292,849308907,MDEyOklzc3VlQ29tbWVudDg0OTMwODkwNw==,9599,simonw,2021-05-27T04:25:01Z,2021-05-27T04:25:01Z,OWNER,Now that `?_col=` and `?_nocol=` are implemented I'm closing this ticket - other customizations can already be handled by defining SQL views or creating canned SQL queries.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/pull/293#issuecomment-420295524,https://api.github.com/repos/simonw/datasette/issues/293,420295524,MDEyOklzc3VlQ29tbWVudDQyMDI5NTUyNA==,11912854,jsancho-gpl,2018-09-11T14:32:45Z,2018-09-11T14:32:45Z,NONE,I close this PR because it's better to use the new one #364 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326987229,Support for external database connectors, https://github.com/simonw/datasette/issues/294#issuecomment-393547960,https://api.github.com/repos/simonw/datasette/issues/294,393547960,MDEyOklzc3VlQ29tbWVudDM5MzU0Nzk2MA==,9599,simonw,2018-05-31T14:25:43Z,2018-05-31T14:25:43Z,OWNER,"SpatialLite columns are actually quite a bit more interesting than this - they also have a `geometry_type` (point, polygon, linestring etc), a `coord_dimension` (usually 2 but can be higher) and an `srid`. For example: https://datasette-publish-spatialite-demo.now.sh/spatialite-test-c88bc35/geometry_columns ![2018-05-31 at 7 22 am](https://user-images.githubusercontent.com/9599/40787843-6f9600ee-64a3-11e8-84e5-64d7cc69603a.png) The SRID here is particularly interesting, because it helps hint at the fact that the results from these queries won't be latitude/longitude co-ordinates - which means that `AsGeoJSON()` won't return results that can be easily rendered by Leaflet: https://datasette-publish-spatialite-demo.now.sh/spatialite-test-c88bc35?sql=select+AsGeoJSON(Geometry)+from+HighWays%20limit1 Compare with https://timezones-api.now.sh/timezones-a99b2e3/geometry_columns: ![2018-05-31 at 7 25 am](https://user-images.githubusercontent.com/9599/40787991-d2650756-64a3-11e8-936e-2dcce7dd1515.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327365110,inspect should record column types, https://github.com/simonw/datasette/issues/294#issuecomment-393548602,https://api.github.com/repos/simonw/datasette/issues/294,393548602,MDEyOklzc3VlQ29tbWVudDM5MzU0ODYwMg==,9599,simonw,2018-05-31T14:27:41Z,2018-05-31T14:27:56Z,OWNER,Presumably the difference in primary key structure between those two is caused by the fact that the `spatialite-test` database (actually https://www.gaia-gis.it/spatialite-2.3.1/test-2.3.sqlite.gz downloaded from https://www.gaia-gis.it/spatialite-2.3.1/resources.html ) was created by a much older version of SpatialLite - presumably v2.3.1,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327365110,inspect should record column types, https://github.com/simonw/datasette/issues/294#issuecomment-393549215,https://api.github.com/repos/simonw/datasette/issues/294,393549215,MDEyOklzc3VlQ29tbWVudDM5MzU0OTIxNQ==,9599,simonw,2018-05-31T14:29:37Z,2018-05-31T14:29:37Z,OWNER,"Also of note: `spatialite-test` uses readable strings in the `type` column, while `timezones` has a `geometry_type` column with integers in it. Those integers are documented here: https://www.gaia-gis.it/fossil/libspatialite/wiki?name=switching-to-4.0 ![2018-05-31 at 7 29 am](https://user-images.githubusercontent.com/9599/40788210-5d0f0dd4-64a4-11e8-8141-0386b5c7b384.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327365110,inspect should record column types, https://github.com/simonw/datasette/issues/294#issuecomment-393557968,https://api.github.com/repos/simonw/datasette/issues/294,393557968,MDEyOklzc3VlQ29tbWVudDM5MzU1Nzk2OA==,9599,simonw,2018-05-31T14:55:46Z,2018-05-31T14:55:46Z,OWNER,"I'm not sure what the best JSON shape for this would be considering the potential complexity of geospatial columns. I do think it's worth exposing these in the inspect JSON though, mainly so Datasette Registry can keep track of all of the openly available geodata out there.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327365110,inspect should record column types, https://github.com/simonw/datasette/issues/294#issuecomment-405026800,https://api.github.com/repos/simonw/datasette/issues/294,405026800,MDEyOklzc3VlQ29tbWVudDQwNTAyNjgwMA==,45057,russss,2018-07-14T14:24:31Z,2018-07-14T14:24:31Z,CONTRIBUTOR,"I had a quick look at this in relation to #343 and I feel like it might be worth modelling the inspected table metadata internally as an object rather than a dict. (We'd still have to serialise it back to JSON.) There are a few places where we rely on the structure of this metadata dict for various reasons, including in templates (and potentially also in user templates). It would be nice to have a reasonably well defined API for accessing metadata internally so that it's clearer what we're breaking.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327365110,inspect should record column types, https://github.com/simonw/datasette/issues/294#issuecomment-504882686,https://api.github.com/repos/simonw/datasette/issues/294,504882686,MDEyOklzc3VlQ29tbWVudDUwNDg4MjY4Ng==,9599,simonw,2019-06-24T06:54:22Z,2019-06-24T06:54:22Z,OWNER,Consider this when solving #465 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327365110,inspect should record column types, https://github.com/simonw/datasette/issues/294#issuecomment-506801311,https://api.github.com/repos/simonw/datasette/issues/294,506801311,MDEyOklzc3VlQ29tbWVudDUwNjgwMTMxMQ==,9599,simonw,2019-06-28T16:45:28Z,2019-06-28T16:45:28Z,OWNER,This can happen as part of #531 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327365110,inspect should record column types, https://github.com/simonw/datasette/issues/295#issuecomment-393534579,https://api.github.com/repos/simonw/datasette/issues/295,393534579,MDEyOklzc3VlQ29tbWVudDM5MzUzNDU3OQ==,9599,simonw,2018-05-31T13:44:15Z,2018-05-31T13:44:15Z,OWNER,I actually started doing this in 45e502aace6cc1198cc5f9a04d61b4a1860a012b,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327383759,Extract unit tests for inspect out to test_inspect.py, https://github.com/simonw/datasette/issues/295#issuecomment-491545892,https://api.github.com/repos/simonw/datasette/issues/295,491545892,MDEyOklzc3VlQ29tbWVudDQ5MTU0NTg5Mg==,9599,simonw,2019-05-11T21:40:32Z,2019-05-11T21:40:32Z,OWNER,"I'm not going to do this, as a result of #462 and #419 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327383759,Extract unit tests for inspect out to test_inspect.py, https://github.com/simonw/datasette/issues/296#issuecomment-392840811,https://api.github.com/repos/simonw/datasette/issues/296,392840811,MDEyOklzc3VlQ29tbWVudDM5Mjg0MDgxMQ==,9599,simonw,2018-05-29T16:26:27Z,2018-05-29T19:43:23Z,OWNER,"Since #275 will allow configs to be overridden at the table and database level it also makes sense to expose a completely evaluated list of configs at: * `/dbname/-/config` * `/dbname/tablename/-/config` Similar to https://fivethirtyeight.datasettes.com/-/config","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327395270,Per-database and per-table /-/ URL namespace, https://github.com/simonw/datasette/issues/296#issuecomment-392918311,https://api.github.com/repos/simonw/datasette/issues/296,392918311,MDEyOklzc3VlQ29tbWVudDM5MjkxODMxMQ==,9599,simonw,2018-05-29T19:44:33Z,2018-05-29T19:44:33Z,OWNER,Should the `tablename` ones also work for views and canned queries? Probably not.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327395270,Per-database and per-table /-/ URL namespace, https://github.com/simonw/datasette/issues/296#issuecomment-506801644,https://api.github.com/repos/simonw/datasette/issues/296,506801644,MDEyOklzc3VlQ29tbWVudDUwNjgwMTY0NA==,9599,simonw,2019-06-28T16:46:34Z,2019-06-28T16:46:34Z,OWNER,The first of these will be built in #531 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327395270,Per-database and per-table /-/ URL namespace, https://github.com/simonw/datasette/issues/297#issuecomment-393554151,https://api.github.com/repos/simonw/datasette/issues/297,393554151,MDEyOklzc3VlQ29tbWVudDM5MzU1NDE1MQ==,9599,simonw,2018-05-31T14:44:37Z,2018-05-31T14:44:37Z,OWNER,I fixed this in https://github.com/simonw/datasette/commit/b18e4515855c3f1eeca3dfcccdbb6df05869084a,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327420945,datasette publish Dockerfile should use python:3.6-slim-stretch, https://github.com/simonw/datasette/issues/298#issuecomment-392917380,https://api.github.com/repos/simonw/datasette/issues/298,392917380,MDEyOklzc3VlQ29tbWVudDM5MjkxNzM4MA==,9599,simonw,2018-05-29T19:41:59Z,2018-05-29T19:41:59Z,OWNER,Creating URLs using concatenation as seen in `('https://twitter.com/' || user) as user_url` is likely to have all sorts of useful applications for ad-hoc analysis.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327459829,URLify URLs in results from custom SQL statements / views, https://github.com/simonw/datasette/issues/298#issuecomment-407274059,https://api.github.com/repos/simonw/datasette/issues/298,407274059,MDEyOklzc3VlQ29tbWVudDQwNzI3NDA1OQ==,9599,simonw,2018-07-24T04:03:05Z,2018-07-24T04:03:05Z,OWNER,Demo: https://latest.datasette.io/fixtures-dcc1dbf?sql=select+%28%27https%3A%2F%2Ftwitter.com%2F%27+%7C%7C+%27simonw%27%29+as+user_url%3B,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327459829,URLify URLs in results from custom SQL statements / views, https://github.com/simonw/datasette/issues/299#issuecomment-408581551,https://api.github.com/repos/simonw/datasette/issues/299,408581551,MDEyOklzc3VlQ29tbWVudDQwODU4MTU1MQ==,9599,simonw,2018-07-28T04:24:05Z,2018-07-28T04:24:05Z,OWNER,New documentation is now online here: https://datasette.readthedocs.io/en/latest/pages.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327461381,Documentation covering ALL datasette URLs, https://github.com/simonw/datasette/issues/301#issuecomment-407979065,https://api.github.com/repos/simonw/datasette/issues/301,407979065,MDEyOklzc3VlQ29tbWVudDQwNzk3OTA2NQ==,9599,simonw,2018-07-26T05:17:34Z,2018-07-26T05:17:34Z,OWNER,This code now lives in https://github.com/simonw/datasette/blob/master/datasette/publish/heroku.py,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328155946,"--spatialite option for ""datasette publish heroku""", https://github.com/simonw/datasette/issues/302#issuecomment-394412784,https://api.github.com/repos/simonw/datasette/issues/302,394412784,MDEyOklzc3VlQ29tbWVudDM5NDQxMjc4NA==,9599,simonw,2018-06-04T16:15:22Z,2018-06-04T16:15:22Z,OWNER,I think this is related to #303,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328171513,test-2.3.sqlite database filename throws a 404, https://github.com/simonw/datasette/issues/302#issuecomment-398825294,https://api.github.com/repos/simonw/datasette/issues/302,398825294,MDEyOklzc3VlQ29tbWVudDM5ODgyNTI5NA==,9599,simonw,2018-06-20T17:06:36Z,2018-06-20T17:06:36Z,OWNER,Still a bug in 0.23,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328171513,test-2.3.sqlite database filename throws a 404, https://github.com/simonw/datasette/issues/303#issuecomment-393557406,https://api.github.com/repos/simonw/datasette/issues/303,393557406,MDEyOklzc3VlQ29tbWVudDM5MzU1NzQwNg==,9599,simonw,2018-05-31T14:54:03Z,2018-05-31T14:54:03Z,OWNER,"Our test fixtures currently have a table with a name ending in `.csv`: https://github.com/simonw/datasette/blob/d69ebce53385b7c6fafb85fdab3b136dbf3f332c/tests/fixtures.py#L234-L237","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328172521,Support table names ending with .json or .csv, https://github.com/simonw/datasette/issues/303#issuecomment-393599840,https://api.github.com/repos/simonw/datasette/issues/303,393599840,MDEyOklzc3VlQ29tbWVudDM5MzU5OTg0MA==,9599,simonw,2018-05-31T16:54:22Z,2018-05-31T16:54:32Z,OWNER,The interesting thing about this is that it requires URL routing to become aware of the names of all of the available tables.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328172521,Support table names ending with .json or .csv, https://github.com/simonw/datasette/issues/303#issuecomment-393600441,https://api.github.com/repos/simonw/datasette/issues/303,393600441,MDEyOklzc3VlQ29tbWVudDM5MzYwMDQ0MQ==,9599,simonw,2018-05-31T16:56:25Z,2018-05-31T16:57:41Z,OWNER,"Here's a nasty challenge: what happens if a database has the following two tables: * `blah` * `blah.json` What would the URL be for the JSON endpoint for the `blah` table?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328172521,Support table names ending with .json or .csv, https://github.com/simonw/datasette/issues/303#issuecomment-394037368,https://api.github.com/repos/simonw/datasette/issues/303,394037368,MDEyOklzc3VlQ29tbWVudDM5NDAzNzM2OA==,9599,simonw,2018-06-01T23:50:17Z,2018-06-01T23:50:35Z,OWNER,"Solution for he above: support an optional `?_format=json/csv` parameter on the regular table view. Then if you have tables with the above colliding names you can use `/db/blah.json?_format=json` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328172521,Support table names ending with .json or .csv, https://github.com/simonw/datasette/issues/304#issuecomment-393610731,https://api.github.com/repos/simonw/datasette/issues/304,393610731,MDEyOklzc3VlQ29tbWVudDM5MzYxMDczMQ==,9599,simonw,2018-05-31T17:29:31Z,2018-05-31T17:30:05Z,OWNER,I prototyped this a while ago here https://github.com/simonw/datasette/commit/04476ead53758044a5f272ae8696b63d6703115e before we had the ``--config`` mechanism.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328229224,Ability to configure SQLite cache_size, https://github.com/simonw/datasette/issues/304#issuecomment-394400419,https://api.github.com/repos/simonw/datasette/issues/304,394400419,MDEyOklzc3VlQ29tbWVudDM5NDQwMDQxOQ==,9599,simonw,2018-06-04T15:39:03Z,2018-06-04T15:39:03Z,OWNER,"In the interest of getting this shipped, I'm going to ignore the `3.7.10` issue.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328229224,Ability to configure SQLite cache_size, https://github.com/simonw/datasette/issues/304#issuecomment-394412217,https://api.github.com/repos/simonw/datasette/issues/304,394412217,MDEyOklzc3VlQ29tbWVudDM5NDQxMjIxNw==,9599,simonw,2018-06-04T16:13:32Z,2018-06-04T16:13:32Z,OWNER,Docs: http://datasette.readthedocs.io/en/latest/config.html#cache-size-kb,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328229224,Ability to configure SQLite cache_size, https://github.com/simonw/datasette/issues/305#issuecomment-396048471,https://api.github.com/repos/simonw/datasette/issues/305,396048471,MDEyOklzc3VlQ29tbWVudDM5NjA0ODQ3MQ==,9599,simonw,2018-06-10T13:16:13Z,2018-06-10T13:16:13Z,OWNER,https://github.com/kubernetes/community/blob/master/contributors/devel/help-wanted.md Is worth stealing from too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329147284,Add contributor guidelines to docs, https://github.com/simonw/datasette/issues/305#issuecomment-504878886,https://api.github.com/repos/simonw/datasette/issues/305,504878886,MDEyOklzc3VlQ29tbWVudDUwNDg3ODg4Ng==,9599,simonw,2019-06-24T06:40:19Z,2019-06-24T06:40:19Z,OWNER,I did this a while ago https://datasette.readthedocs.io/en/stable/contributing.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329147284,Add contributor guidelines to docs, https://github.com/simonw/datasette/issues/306#issuecomment-394894500,https://api.github.com/repos/simonw/datasette/issues/306,394894500,MDEyOklzc3VlQ29tbWVudDM5NDg5NDUwMA==,9599,simonw,2018-06-05T23:40:40Z,2018-06-05T23:40:40Z,OWNER,"Input: - function that says if a name is a valid database - Function that says if a table exists - URL Output: - view class - Arguments - Redirect (if it should redirect)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329661905,Custom URL routing with independent tests, https://github.com/simonw/datasette/issues/306#issuecomment-394894910,https://api.github.com/repos/simonw/datasette/issues/306,394894910,MDEyOklzc3VlQ29tbWVudDM5NDg5NDkxMA==,9599,simonw,2018-06-05T23:43:18Z,2018-06-05T23:49:41Z,OWNER,I'm going to use a named tuple for the output. That way I can support either tuple destructing or explicit property access on the returned value.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329661905,Custom URL routing with independent tests, https://github.com/simonw/datasette/issues/306#issuecomment-394895267,https://api.github.com/repos/simonw/datasette/issues/306,394895267,MDEyOklzc3VlQ29tbWVudDM5NDg5NTI2Nw==,9599,simonw,2018-06-05T23:45:26Z,2018-06-05T23:45:26Z,OWNER,To support a future where Datasette is an ASGI app that can be attached to a URL within a larger application the routing function should have the option to accept a path prefix which will then be automatically attached to any resulting redirects.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329661905,Custom URL routing with independent tests, https://github.com/simonw/datasette/issues/306#issuecomment-394895750,https://api.github.com/repos/simonw/datasette/issues/306,394895750,MDEyOklzc3VlQ29tbWVudDM5NDg5NTc1MA==,9599,simonw,2018-06-05T23:48:06Z,2018-06-06T23:50:31Z,OWNER,"A neat trick could be that if the router returns a redirect it could then resolve that redirect to see if it will 404 (or redirect itself) before returning that response. This would need its own counter to guard against infinite redirects. I'm not going to do this though: any view that results in a chain of redirects like this is a bug that should be fixed at the source.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329661905,Custom URL routing with independent tests, https://github.com/simonw/datasette/issues/306#issuecomment-395463497,https://api.github.com/repos/simonw/datasette/issues/306,395463497,MDEyOklzc3VlQ29tbWVudDM5NTQ2MzQ5Nw==,9599,simonw,2018-06-07T15:29:28Z,2018-06-07T15:29:28Z,OWNER,"I started sketching this out in a branch, see pull request #307 - but I've decided I don't like it. I'm going to close this ticket and stick with regular expression URL routing for the moment. If I change my mind in the future the code in #307 lives in separate files (`datasette/routes.py` and `tests/test_routes.py`) so bringing it back into the project will be trivial.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329661905,Custom URL routing with independent tests,