html_url,issue_url,id,node_id,user,user_label,created_at,updated_at,author_association,body,reactions,issue,issue_label,performed_via_github_app https://github.com/simonw/datasette/issues/1#issuecomment-338524454,https://api.github.com/repos/simonw/datasette/issues/1,338524454,MDEyOklzc3VlQ29tbWVudDMzODUyNDQ1NA==,9599,simonw,2017-10-23T01:15:24Z,2017-10-23T01:15:24Z,OWNER,Table rendering logic needs to detect the primary key field and turn it into a hyperlink. If there is a compound primary key it should add an extra column at the start of the table which displays the compound key as a link,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267513424,Addressable pages for every row in a table, https://github.com/simonw/datasette/issues/5#issuecomment-338524857,https://api.github.com/repos/simonw/datasette/issues/5,338524857,MDEyOklzc3VlQ29tbWVudDMzODUyNDg1Nw==,9599,simonw,2017-10-23T01:20:30Z,2017-10-23T01:20:30Z,OWNER,"https://stackoverflow.com/a/14468878/6083 Looks like I should order by compound primary key and implement cursor-based pagination.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267516066,Implement sensible query pagination, https://github.com/simonw/datasette/issues/3#issuecomment-338526148,https://api.github.com/repos/simonw/datasette/issues/3,338526148,MDEyOklzc3VlQ29tbWVudDMzODUyNjE0OA==,9599,simonw,2017-10-23T01:35:17Z,2017-10-23T01:35:17Z,OWNER,https://github.com/ahupp/python-magic/blob/master/README.md,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515678,"Make individual column valuables addressable, with smart content types", https://github.com/simonw/datasette/issues/4#issuecomment-338530389,https://api.github.com/repos/simonw/datasette/issues/4,338530389,MDEyOklzc3VlQ29tbWVudDMzODUzMDM4OQ==,9599,simonw,2017-10-23T02:15:41Z,2017-10-23T02:15:41Z,OWNER,"This means I need a good solution for these compile time options while running in development mode ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515836,Make URLs immutable, https://github.com/simonw/datasette/issues/4#issuecomment-338530480,https://api.github.com/repos/simonw/datasette/issues/4,338530480,MDEyOklzc3VlQ29tbWVudDMzODUzMDQ4MA==,9599,simonw,2017-10-23T02:16:33Z,2017-10-23T02:16:33Z,OWNER," How about when the service starts up it checks for a compile.json file and, if it is missing, creates it using the same code we run at compile time normally ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515836,Make URLs immutable, https://github.com/simonw/datasette/issues/11#issuecomment-338530704,https://api.github.com/repos/simonw/datasette/issues/11,338530704,MDEyOklzc3VlQ29tbWVudDMzODUzMDcwNA==,9599,simonw,2017-10-23T02:18:36Z,2017-10-23T02:18:36Z,OWNER,Needed by https://github.com/simonw/stateless-datasets/issues/4#issuecomment-338530389,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267522549,Code that generates compile-time properties about the database , 
https://github.com/simonw/datasette/issues/4#issuecomment-338531827,https://api.github.com/repos/simonw/datasette/issues/4,338531827,MDEyOklzc3VlQ29tbWVudDMzODUzMTgyNw==,9599,simonw,2017-10-23T02:28:31Z,2017-10-23T02:29:05Z,OWNER,"Many of the applications I want to implement with this would benefit from having permanent real URLs. So let’s have both. The sha1 urls will serve far future cache headers (and an etag derived from their path). The non sha1 URLs will serve 302 uncached redirects to the sha1 locations. We will have a setting that lets people opt out of this behavior.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515836,Make URLs immutable, https://github.com/simonw/datasette/issues/8#issuecomment-338697223,https://api.github.com/repos/simonw/datasette/issues/8,338697223,MDEyOklzc3VlQ29tbWVudDMzODY5NzIyMw==,9599,simonw,2017-10-23T15:28:11Z,2017-10-23T15:28:11Z,OWNER,"Now returning this: { ""error"": ""attempt to write a readonly database"", ""ok"": false } ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267517314,Attempting an INSERT or UPDATE should return a sane error message, https://github.com/simonw/datasette/issues/16#issuecomment-338768860,https://api.github.com/repos/simonw/datasette/issues/16,338768860,MDEyOklzc3VlQ29tbWVudDMzODc2ODg2MA==,9599,simonw,2017-10-23T19:23:29Z,2017-10-23T19:23:29Z,OWNER,I could use the table-reflow mechanism demonstrated here: http://demos.jquerymobile.com/1.4.3/table-reflow/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267726219,Default HTML/CSS needs to look reasonable and be responsive, https://github.com/simonw/datasette/issues/20#issuecomment-338769538,https://api.github.com/repos/simonw/datasette/issues/20,338769538,MDEyOklzc3VlQ29tbWVudDMzODc2OTUzOA==,9599,simonw,2017-10-23T19:25:55Z,2017-10-23T19:25:55Z,OWNER,"Maybe this should be handled by views instead? https://stateless-datasets-wreplxalgu.now.sh/ lists some views https://stateless-datasets-wreplxalgu.now.sh/?sql=select%20*%20from%20%22Order%20Subtotals%22 is an example showing the content of a view. What would the URL to views be? I don't think a view can share a name with a table, so the same URL scheme could work for both.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/4#issuecomment-338797522,https://api.github.com/repos/simonw/datasette/issues/4,338797522,MDEyOklzc3VlQ29tbWVudDMzODc5NzUyMg==,9599,simonw,2017-10-23T21:09:33Z,2017-10-23T21:09:33Z,OWNER,"https://stackoverflow.com/a/18134919/6083 is a good answer about how many characters of the hash are needed to be unique. 
I say we default to 7 characters, like git does - but allow extras to be configured.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515836,Make URLs immutable, https://github.com/simonw/datasette/issues/4#issuecomment-338789734,https://api.github.com/repos/simonw/datasette/issues/4,338789734,MDEyOklzc3VlQ29tbWVudDMzODc4OTczNA==,9599,simonw,2017-10-23T20:40:25Z,2017-10-23T21:10:19Z,OWNER,"URL design: /database/table.json - redirects to /database-6753f4a/table.json So we always redirect to the version with the truncated hash in the URL. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515836,Make URLs immutable, https://github.com/simonw/datasette/issues/4#issuecomment-338799438,https://api.github.com/repos/simonw/datasette/issues/4,338799438,MDEyOklzc3VlQ29tbWVudDMzODc5OTQzOA==,9599,simonw,2017-10-23T21:17:25Z,2017-10-23T21:17:25Z,OWNER,Can I take advantage of HTTP/2 so even if you get redirected I start serving you the correct resource straight away?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515836,Make URLs immutable, https://github.com/simonw/datasette/issues/4#issuecomment-338804173,https://api.github.com/repos/simonw/datasette/issues/4,338804173,MDEyOklzc3VlQ29tbWVudDMzODgwNDE3Mw==,9599,simonw,2017-10-23T21:36:37Z,2017-10-23T21:36:37Z,OWNER,"Looks like the easiest way to implement HTTP/2 server push today is to run behind Cloudflare and use this: Link: ; rel=preload; as=script https://blog.cloudflare.com/announcing-support-for-http-2-server-push-2/ Here's the W3C draft: https://w3c.github.io/preload/ From https://w3c.github.io/preload/#as-attribute it looks like I should use `as=fetch` if the content is intended for consumption by fetch() or XMLHTTPRequest. Unclear if I should throw `as=fetch crossorigin` in there. Need to experiment on that. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515836,Make URLs immutable, https://github.com/simonw/datasette/issues/4#issuecomment-338806718,https://api.github.com/repos/simonw/datasette/issues/4,338806718,MDEyOklzc3VlQ29tbWVudDMzODgwNjcxOA==,9599,simonw,2017-10-23T21:47:53Z,2017-10-23T21:47:53Z,OWNER,"Here's what the homepage of cloudflare.com does (with newlines added within the link header for clarity): $ curl -i 'https://www.cloudflare.com/' HTTP/1.1 200 OK Date: Mon, 23 Oct 2017 21:45:58 GMT Content-Type: text/html; charset=utf-8 Transfer-Encoding: chunked Connection: keep-alive link: ; rel=preload; as=style, ; rel=preload; as=style, ; rel=preload, ; rel=preload, ; rel=preload; as=video, ; rel=preload; as=video, ; rel=preload; as=video, ; rel=preload; as=video, ; rel=preload; as=video, ; rel=preload; as=image The original header looked like this: link: ; rel=preload; as=style, ; rel=preload; as=style, ; rel=preload, ; rel=preload, ; rel=preload; as=video, ; rel=preload; as=video, ; rel=preload; as=video, ; rel=preload; as=video, ; rel=preload; as=video, ; rel=preload; as=image ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267515836,Make URLs immutable, https://github.com/simonw/datasette/issues/24#issuecomment-338834213,https://api.github.com/repos/simonw/datasette/issues/24,338834213,MDEyOklzc3VlQ29tbWVudDMzODgzNDIxMw==,9599,simonw,2017-10-24T00:23:05Z,2017-10-24T00:23:05Z,OWNER,"If I can’t setect a primary key, I won’t provide a URL for those records","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267828746,Implement full URL design, https://github.com/simonw/datasette/issues/17#issuecomment-338852971,https://api.github.com/repos/simonw/datasette/issues/17,338852971,MDEyOklzc3VlQ29tbWVudDMzODg1Mjk3MQ==,9599,simonw,2017-10-24T02:26:47Z,2017-10-24T02:26:47Z,OWNER,I'm not going to bother with this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267732005,"In development mode, should still pick up new .db files", https://github.com/simonw/datasette/issues/7#issuecomment-338853083,https://api.github.com/repos/simonw/datasette/issues/7,338853083,MDEyOklzc3VlQ29tbWVudDMzODg1MzA4Mw==,9599,simonw,2017-10-24T02:27:25Z,2017-10-24T02:27:25Z,OWNER,Fixed in 9d219140694551453bfa528e0624919eb065f9d6,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267516650,Framework where by every page is JSON plus a template, https://github.com/simonw/datasette/issues/1#issuecomment-338523957,https://api.github.com/repos/simonw/datasette/issues/1,338523957,MDEyOklzc3VlQ29tbWVudDMzODUyMzk1Nw==,9599,simonw,2017-10-23T01:09:05Z,2017-10-24T02:42:12Z,OWNER,"I also need to solve for weird primary keys. If it’s a single integer or a single char field that’s easy. But what if it is a compound key with more than one chat field? What delimiter can I use that will definitely be safe? Let’s say I use hyphen. Now I need to find a durable encoding for any hyphens that might exist in the key fields themselves. How about I use URLencoding for every non-alpha-numeric character? That will turn hyphens into (I think) %2D. 
It should also solve for unicode characters, but it means the vast majority of keys (integers) will display neatly, including a compound key of eg 5678-345 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267513424,Addressable pages for every row in a table, https://github.com/simonw/datasette/issues/1#issuecomment-338857568,https://api.github.com/repos/simonw/datasette/issues/1,338857568,MDEyOklzc3VlQ29tbWVudDMzODg1NzU2OA==,9599,simonw,2017-10-24T02:57:12Z,2017-10-24T02:57:12Z,OWNER,"I can find the primary keys using: PRAGMA table_info(myTable) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267513424,Addressable pages for every row in a table, https://github.com/simonw/datasette/issues/23#issuecomment-338859620,https://api.github.com/repos/simonw/datasette/issues/23,338859620,MDEyOklzc3VlQ29tbWVudDMzODg1OTYyMA==,9599,simonw,2017-10-24T03:11:42Z,2017-10-24T03:11:42Z,OWNER,I’m going to implement everything in https://docs.djangoproject.com/en/1.11/ref/models/querysets/#field-lookups with the exception of range and the various date ones.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267788884,Support Django-style filters in querystring arguments, https://github.com/simonw/datasette/issues/23#issuecomment-338859709,https://api.github.com/repos/simonw/datasette/issues/23,338859709,MDEyOklzc3VlQ29tbWVudDMzODg1OTcwOQ==,9599,simonw,2017-10-24T03:12:18Z,2017-10-24T03:12:42Z,OWNER,"I’m going to need to write unit tests for this, is this depends on #9","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267788884,Support Django-style filters in querystring arguments, https://github.com/simonw/datasette/issues/1#issuecomment-338861511,https://api.github.com/repos/simonw/datasette/issues/1,338861511,MDEyOklzc3VlQ29tbWVudDMzODg2MTUxMQ==,9599,simonw,2017-10-24T03:24:17Z,2017-10-24T03:24:17Z,OWNER,"Some tables won't have primary keys, in which case I won't generate pages for individual records.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267513424,Addressable pages for every row in a table, https://github.com/simonw/datasette/issues/9#issuecomment-338863155,https://api.github.com/repos/simonw/datasette/issues/9,338863155,MDEyOklzc3VlQ29tbWVudDMzODg2MzE1NQ==,9599,simonw,2017-10-24T03:36:58Z,2017-10-24T03:36:58Z,OWNER,I’m going to use py.test and start with all tests in a single tests.py module,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267517348,Initial test suite, https://github.com/simonw/datasette/issues/1#issuecomment-338872286,https://api.github.com/repos/simonw/datasette/issues/1,338872286,MDEyOklzc3VlQ29tbWVudDMzODg3MjI4Ng==,9599,simonw,2017-10-24T04:46:06Z,2017-10-24T04:46:06Z,OWNER,"I'm going to use `,` as the separator between elements of a compound primary key. 
If those elements themselves include a comma I will use `%2C` in its place.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267513424,Addressable pages for every row in a table, https://github.com/simonw/datasette/issues/9#issuecomment-338882110,https://api.github.com/repos/simonw/datasette/issues/9,338882110,MDEyOklzc3VlQ29tbWVudDMzODg4MjExMA==,9599,simonw,2017-10-24T05:55:33Z,2017-10-24T05:55:33Z,OWNER,"Well, I've started it at least.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267517348,Initial test suite, https://github.com/simonw/datasette/issues/1#issuecomment-338882207,https://api.github.com/repos/simonw/datasette/issues/1,338882207,MDEyOklzc3VlQ29tbWVudDMzODg4MjIwNw==,9599,simonw,2017-10-24T05:56:04Z,2017-10-24T05:56:04Z,OWNER,Next step: generate links to these.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267513424,Addressable pages for every row in a table, https://github.com/simonw/datasette/issues/24#issuecomment-339003850,https://api.github.com/repos/simonw/datasette/issues/24,339003850,MDEyOklzc3VlQ29tbWVudDMzOTAwMzg1MA==,9599,simonw,2017-10-24T14:12:00Z,2017-10-24T14:12:00Z,OWNER,As of b46e370ee6126aa2fa85cf789a31da38aed98496 this is done.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267828746,Implement full URL design, https://github.com/simonw/datasette/issues/29#issuecomment-339019873,https://api.github.com/repos/simonw/datasette/issues/29,339019873,MDEyOklzc3VlQ29tbWVudDMzOTAxOTg3Mw==,9599,simonw,2017-10-24T14:58:33Z,2017-10-24T14:58:33Z,OWNER,"Here's what I've got now: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268050821,Handle bytestring records encoding to JSON, https://github.com/simonw/datasette/issues/5#issuecomment-339027711,https://api.github.com/repos/simonw/datasette/issues/5,339027711,MDEyOklzc3VlQ29tbWVudDMzOTAyNzcxMQ==,9599,simonw,2017-10-24T15:21:30Z,2017-10-24T15:21:30Z,OWNER,I have code to detect primary keys on tables... but what should I do for tables that lack primary keys? How should I even sort them?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267516066,Implement sensible query pagination, https://github.com/simonw/datasette/issues/5#issuecomment-339028979,https://api.github.com/repos/simonw/datasette/issues/5,339028979,MDEyOklzc3VlQ29tbWVudDMzOTAyODk3OQ==,9599,simonw,2017-10-24T15:25:08Z,2017-10-24T15:25:08Z,OWNER,"Looks like I can use the SQLite specific “rowid” in that case. It isn’t guaranteed to stay consistent across a VACUUM but that’s ok because we are immutable anyway. 
https://www.sqlite.org/lang_createtable.html#rowid","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267516066,Implement sensible query pagination, https://github.com/simonw/datasette/issues/23#issuecomment-339138809,https://api.github.com/repos/simonw/datasette/issues/23,339138809,MDEyOklzc3VlQ29tbWVudDMzOTEzODgwOQ==,9599,simonw,2017-10-24T21:32:46Z,2017-10-24T21:32:46Z,OWNER,May as well support most of https://sqlite.org/lang_expr.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267788884,Support Django-style filters in querystring arguments, https://github.com/simonw/datasette/issues/23#issuecomment-338854988,https://api.github.com/repos/simonw/datasette/issues/23,338854988,MDEyOklzc3VlQ29tbWVudDMzODg1NDk4OA==,9599,simonw,2017-10-24T02:40:12Z,2017-10-25T00:05:46Z,OWNER," /database-name/table-name?name__contains=simon&sort=id+desc Note that if there's a column called ""sort"" you can still do sort__exact=blah ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267788884,Support Django-style filters in querystring arguments, https://github.com/simonw/datasette/issues/23#issuecomment-339186887,https://api.github.com/repos/simonw/datasette/issues/23,339186887,MDEyOklzc3VlQ29tbWVudDMzOTE4Njg4Nw==,9599,simonw,2017-10-25T01:39:43Z,2017-10-25T04:22:41Z,OWNER,"Still to do: - [x] `gt`, `gte`, `lt`, `lte` - [x] `like` - [x] `glob` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267788884,Support Django-style filters in querystring arguments, https://github.com/simonw/datasette/issues/23#issuecomment-339210353,https://api.github.com/repos/simonw/datasette/issues/23,339210353,MDEyOklzc3VlQ29tbWVudDMzOTIxMDM1Mw==,9599,simonw,2017-10-25T04:23:02Z,2017-10-25T04:23:02Z,OWNER,I'm going to call this one done for the moment. 
The date filters can go in a stretch goal.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267788884,Support Django-style filters in querystring arguments, https://github.com/simonw/datasette/issues/19#issuecomment-339366612,https://api.github.com/repos/simonw/datasette/issues/19,339366612,MDEyOklzc3VlQ29tbWVudDMzOTM2NjYxMg==,9599,simonw,2017-10-25T15:21:16Z,2017-10-25T15:21:16Z,OWNER,"I had to manually set the content disposition header: return await response.file_stream( filepath, headers={ 'Content-Disposition': 'attachment; filename=""{}""'.format(ilepath) } ) In the next release of Sanic I can just use the filename= argument instead: https://github.com/channelcat/sanic/commit/07e95dba4f5983afc1e673df14bdd278817288aa","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267741262,Efficient url for downloading the raw database file, https://github.com/simonw/datasette/issues/37#issuecomment-339382054,https://api.github.com/repos/simonw/datasette/issues/37,339382054,MDEyOklzc3VlQ29tbWVudDMzOTM4MjA1NA==,9599,simonw,2017-10-25T16:05:56Z,2017-10-25T16:05:56Z,OWNER,Could this be as simple as using the iterative JSON encoder and adding a yield statement in between each chunk?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268453968,Ability to serialize massive JSON without blocking event loop, https://github.com/simonw/datasette/issues/38#issuecomment-339388215,https://api.github.com/repos/simonw/datasette/issues/38,339388215,MDEyOklzc3VlQ29tbWVudDMzOTM4ODIxNQ==,9599,simonw,2017-10-25T16:25:45Z,2017-10-25T16:25:45Z,OWNER,"First experiment: hook up an iterative CSV dump (just because that’s a tiny bit easier to get started with than iterative a JSON). Have it execute a big select statement and then iterate through the result set 100 rows at a time using sqite fetchmany() - also have it async sleep for a second in between each batch of 100. Can this work without needing python threads? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268462768,Experiment with patterns for concurrent long running queries, https://github.com/simonw/datasette/issues/38#issuecomment-339388771,https://api.github.com/repos/simonw/datasette/issues/38,339388771,MDEyOklzc3VlQ29tbWVudDMzOTM4ODc3MQ==,9599,simonw,2017-10-25T16:27:29Z,2017-10-25T16:27:29Z,OWNER,"If this does work, I need to figure it what to do about the HTML view. ASsuming I can iteratively produce JSON and CSV, what to do about HTML? 
One option: render the first 500 rows as HTML, then hand off to an infinite scroll experience that iteratively loads more rows as JSON.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268462768,Experiment with patterns for concurrent long running queries, https://github.com/simonw/datasette/issues/38#issuecomment-339389105,https://api.github.com/repos/simonw/datasette/issues/38,339389105,MDEyOklzc3VlQ29tbWVudDMzOTM4OTEwNQ==,9599,simonw,2017-10-25T16:28:39Z,2017-10-25T16:28:39Z,OWNER,The gold standard here is to be able to serve up increasingly large datasets without blocking the event loop and while using a sustainable amount of RAM,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268462768,Experiment with patterns for concurrent long running queries, https://github.com/simonw/datasette/issues/38#issuecomment-339389328,https://api.github.com/repos/simonw/datasette/issues/38,339389328,MDEyOklzc3VlQ29tbWVudDMzOTM4OTMyOA==,9599,simonw,2017-10-25T16:29:23Z,2017-10-25T16:29:23Z,OWNER,Ideally we can get some serious gains from the fact that our database file is opened with the immutable option.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268462768,Experiment with patterns for concurrent long running queries, https://github.com/simonw/datasette/issues/40#issuecomment-339395551,https://api.github.com/repos/simonw/datasette/issues/40,339395551,MDEyOklzc3VlQ29tbWVudDMzOTM5NTU1MQ==,9599,simonw,2017-10-25T16:49:32Z,2017-10-25T16:49:32Z,OWNER,"Simplest implementation will be to create a temporary directory somewhere, copy in a Dockerfile and the databases and run “now” in it. Ideally I can use symlinks rather than copying potentially large database files around.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/39#issuecomment-339406634,https://api.github.com/repos/simonw/datasette/issues/39,339406634,MDEyOklzc3VlQ29tbWVudDMzOTQwNjYzNA==,9599,simonw,2017-10-25T17:27:10Z,2017-10-25T17:27:10Z,OWNER,It certainly looks like some of the stuff in https://sqlite.org/pragma.html could be used to screw around with things. Example: `PRAGMA case_sensitive_like = 1` - would that affect future queries?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268469569,Protect against malicious SQL that causes damage even though our DB is immutable, https://github.com/simonw/datasette/issues/39#issuecomment-339413825,https://api.github.com/repos/simonw/datasette/issues/39,339413825,MDEyOklzc3VlQ29tbWVudDMzOTQxMzgyNQ==,9599,simonw,2017-10-25T17:48:48Z,2017-10-25T17:48:48Z,OWNER,Could I use https://sqlparse.readthedocs.io/en/latest/ to parse incoming statements and ensure they are pure SELECTs? 
Would that prevent people from using a compound SELECT statement to trigger an evil PRAGMA of some sort?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268469569,Protect against malicious SQL that causes damage even though our DB is immutable, https://github.com/simonw/datasette/issues/16#issuecomment-339420462,https://api.github.com/repos/simonw/datasette/issues/16,339420462,MDEyOklzc3VlQ29tbWVudDMzOTQyMDQ2Mg==,9599,simonw,2017-10-25T18:10:51Z,2017-10-25T18:10:51Z,OWNER,"https://sitesforprofit.com/responsive-table-plugins-and-patterns has some useful links. I really like the pattern from https://css-tricks.com/responsive-data-tables/ /* Max width before this PARTICULAR table gets nasty This query will take effect for any screen smaller than 760px and also iPads specifically. */ @media only screen and (max-width: 760px), (min-device-width: 768px) and (max-device-width: 1024px) { /* Force table to not be like tables anymore */ table, thead, tbody, th, td, tr { display: block; } /* Hide table headers (but not display: none;, for accessibility) */ thead tr { position: absolute; top: -9999px; left: -9999px; } tr { border: 1px solid #ccc; } td { /* Behave like a ""row"" */ border: none; border-bottom: 1px solid #eee; position: relative; padding-left: 50%; } td:before { /* Now like a table header */ position: absolute; /* Top/left values mimic padding */ top: 6px; left: 6px; width: 45%; padding-right: 10px; white-space: nowrap; } /* Label the data */ td:nth-of-type(1):before { content: ""First Name""; } td:nth-of-type(2):before { content: ""Last Name""; } td:nth-of-type(3):before { content: ""Job Title""; } td:nth-of-type(4):before { content: ""Favorite Color""; } td:nth-of-type(5):before { content: ""Wars of Trek?""; } td:nth-of-type(6):before { content: ""Porn Name""; } td:nth-of-type(7):before { content: ""Date of Birth""; } td:nth-of-type(8):before { content: ""Dream Vacation City""; } td:nth-of-type(9):before { content: ""GPA""; } td:nth-of-type(10):before { content: ""Arbitrary Data""; } }","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267726219,Default HTML/CSS needs to look reasonable and be responsive, https://github.com/simonw/datasette/issues/39#issuecomment-339510770,https://api.github.com/repos/simonw/datasette/issues/39,339510770,MDEyOklzc3VlQ29tbWVudDMzOTUxMDc3MA==,9599,simonw,2017-10-26T00:07:40Z,2017-10-26T00:07:40Z,OWNER,It looks like I should double quote my columns and ensure they are correctly escaped https://blog.christosoft.de/2012/10/sqlite-escaping-table-acolumn-names/ - hopefully using ? placeholders for column names will work. I should use ? for tables too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268469569,Protect against malicious SQL that causes damage even though our DB is immutable, https://github.com/simonw/datasette/issues/40#issuecomment-339514819,https://api.github.com/repos/simonw/datasette/issues/40,339514819,MDEyOklzc3VlQ29tbWVudDMzOTUxNDgxOQ==,9599,simonw,2017-10-26T00:35:46Z,2017-10-26T00:35:46Z,OWNER,"I’m going to have a single command-line app that does everything. 
Name to be decided - options include dataset, stateless, datasite (I quite like that - it reflects SQLite and the fact that you create a website)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/40#issuecomment-339515822,https://api.github.com/repos/simonw/datasette/issues/40,339515822,MDEyOklzc3VlQ29tbWVudDMzOTUxNTgyMg==,9599,simonw,2017-10-26T00:43:34Z,2017-10-26T00:43:34Z,OWNER,"datasite . - starts web app in current directory, serving all DB files datasite . -p 8001 - serves on custom port datasite blah.db blah2.db - serves specified files You can’t specify more than one directory. You can specify as many files as you like. If you specify two files with different oaths but the same name then they must be accessed by hash. datasite publish . - publishes current directory to the internet! Uses now by default, if it detects it on your path. Other publishers will be eventually added as plugins. datasite publish http://path-to-db.db - publishes a DB available at a URL. Works by constructing the Dockerfile with wget calls in it. datasite blah.db -m metadata.json If you specify a directory it looks for metadata.json in that directory. Otherwise you can pass an explicit metadata file oath with -m or —metadata","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/40#issuecomment-339516032,https://api.github.com/repos/simonw/datasette/issues/40,339516032,MDEyOklzc3VlQ29tbWVudDMzOTUxNjAzMg==,9599,simonw,2017-10-26T00:44:52Z,2017-10-26T00:44:52Z,OWNER,Another potential name: datapi ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/40#issuecomment-339517846,https://api.github.com/repos/simonw/datasette/issues/40,339517846,MDEyOklzc3VlQ29tbWVudDMzOTUxNzg0Ng==,9599,simonw,2017-10-26T00:58:39Z,2017-10-26T00:58:39Z,OWNER,"I’m going to use Click for this http://nvie.com/posts/writing-a-cli-in-python-in-under-60-seconds/ https://kushaldas.in/posts/building-command-line-tools-in-python-with-click.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/40#issuecomment-339724700,https://api.github.com/repos/simonw/datasette/issues/40,339724700,MDEyOklzc3VlQ29tbWVudDMzOTcyNDcwMA==,9599,simonw,2017-10-26T16:35:20Z,2017-10-26T16:35:20Z,OWNER,"Here’s how to make the “serve” subcommand the default if it is called with no arguments: @click.group(invoke_without_command=True) def serve(): # ...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/41#issuecomment-339866724,https://api.github.com/repos/simonw/datasette/issues/41,339866724,MDEyOklzc3VlQ29tbWVudDMzOTg2NjcyNA==,9599,simonw,2017-10-27T04:04:52Z,2017-10-27T04:04:52Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, 
""eyes"": 0}",268590777,Homepage should show summary of databases, https://github.com/simonw/datasette/issues/40#issuecomment-339891755,https://api.github.com/repos/simonw/datasette/issues/40,339891755,MDEyOklzc3VlQ29tbWVudDMzOTg5MTc1NQ==,9599,simonw,2017-10-27T07:10:53Z,2017-10-27T07:10:53Z,OWNER,"Deploys to Now aren't working at the moment - they aren't showing the uploaded databases, because I've broken the path handling somehow. I need to do a bit more work here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/40#issuecomment-340561577,https://api.github.com/repos/simonw/datasette/issues/40,340561577,MDEyOklzc3VlQ29tbWVudDM0MDU2MTU3Nw==,9599,simonw,2017-10-30T19:43:40Z,2017-10-30T19:43:40Z,OWNER,http://the-hitchhikers-guide-to-packaging.readthedocs.io/en/latest/quickstart.html describes how to package this for PyPI,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/39#issuecomment-340787868,https://api.github.com/repos/simonw/datasette/issues/39,340787868,MDEyOklzc3VlQ29tbWVudDM0MDc4Nzg2OA==,9599,simonw,2017-10-31T14:54:14Z,2017-10-31T14:54:14Z,OWNER,"Here’s how I can (I think) provide safe execution of arbitrary SQL while blocking PRAGMA calls: let people use names parameters in their SQL and apply strict filtering to the SQL query but not to the parameter values. cur.execute( ""select * from people where name_last=:who and age=:age"", { ""who"": who, ""age"": age }) In URL form: ?sql=select...&who=Terry&age=34 Now we can apply strict, dumb validation rules to the SQL part while allowing anything in the named queries - so people can execute a search for PRAGMA without being able to execute a PRAGMA statement.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268469569,Protect against malicious SQL that causes damage even though our DB is immutable, https://github.com/simonw/datasette/issues/10#issuecomment-341938424,https://api.github.com/repos/simonw/datasette/issues/10,341938424,MDEyOklzc3VlQ29tbWVudDM0MTkzODQyNA==,9599,simonw,2017-11-04T23:48:57Z,2017-11-04T23:48:57Z,OWNER,Done: https://github.com/simonw/stateless-datasets/commit/edaa10587e60946e0c1935333f6b79553db33798,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267517381,Set up Travis, https://github.com/simonw/datasette/issues/40#issuecomment-341945420,https://api.github.com/repos/simonw/datasette/issues/40,341945420,MDEyOklzc3VlQ29tbWVudDM0MTk0NTQyMA==,9599,simonw,2017-11-05T02:55:07Z,2017-11-05T02:55:07Z,OWNER,"To simplify things a bit, I'm going to require that every database is explicitly listed in the command line. I won't support ""serve everything in this directory"" for the moment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/40#issuecomment-342030075,https://api.github.com/repos/simonw/datasette/issues/40,342030075,MDEyOklzc3VlQ29tbWVudDM0MjAzMDA3NQ==,9599,simonw,2017-11-06T02:25:48Z,2017-11-06T02:25:48Z,OWNER,"... 
I tried that, I don't like it. I'm going to bring back ""directory serving"" by allowing you to pass a directory as an argument to `datasite` (including `datasite .`). I may even make `.` the default if you don't provide anything at all.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/16#issuecomment-342032943,https://api.github.com/repos/simonw/datasette/issues/16,342032943,MDEyOklzc3VlQ29tbWVudDM0MjAzMjk0Mw==,9599,simonw,2017-11-06T02:50:07Z,2017-11-06T02:50:07Z,OWNER,"Default look with Bootstrap 4 looks like this: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267726219,Default HTML/CSS needs to look reasonable and be responsive, https://github.com/simonw/datasette/issues/44#issuecomment-342484889,https://api.github.com/repos/simonw/datasette/issues/44,342484889,MDEyOklzc3VlQ29tbWVudDM0MjQ4NDg4OQ==,9599,simonw,2017-11-07T13:39:49Z,2017-11-07T13:39:49Z,OWNER,I’m going to call this feature “count values”,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/47#issuecomment-342521344,https://api.github.com/repos/simonw/datasette/issues/47,342521344,MDEyOklzc3VlQ29tbWVudDM0MjUyMTM0NA==,9599,simonw,2017-11-07T15:37:45Z,2017-11-07T15:37:45Z,OWNER,GDS Registries could be fun too: https://registers.cloudapps.digital/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271831408,Create neat example database, https://github.com/simonw/datasette/issues/32#issuecomment-343164111,https://api.github.com/repos/simonw/datasette/issues/32,343164111,MDEyOklzc3VlQ29tbWVudDM0MzE2NDExMQ==,9599,simonw,2017-11-09T14:05:56Z,2017-11-09T14:05:56Z,OWNER,Implemented in 31b21f5c5e15fc3acab7fabb170c1da71dc3c98c,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268106803,Try running SQLite queries in a separate thread, https://github.com/simonw/datasette/issues/48#issuecomment-343168796,https://api.github.com/repos/simonw/datasette/issues/48,343168796,MDEyOklzc3VlQ29tbWVudDM0MzE2ODc5Ng==,9599,simonw,2017-11-09T14:22:21Z,2017-11-09T14:22:21Z,OWNER,Won't fix: ujson is not compatible with the custom JSON encoder I'm using here: https://github.com/simonw/immutabase/blob/b2dee11fcd989d9e2a7bf4de1e23dbc320c05013/immutabase/app.py#L401-L416,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272391665,Switch to ujson, https://github.com/simonw/datasette/issues/49#issuecomment-343237982,https://api.github.com/repos/simonw/datasette/issues/49,343237982,MDEyOklzc3VlQ29tbWVudDM0MzIzNzk4Mg==,9599,simonw,2017-11-09T17:58:01Z,2017-11-09T17:58:01Z,OWNER,"More terms: * publish * share * docker * host * stateless I want to capture the idea of publishing an immutable database in a stateless container.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272661336,Pick a name, 
https://github.com/simonw/datasette/issues/49#issuecomment-343238262,https://api.github.com/repos/simonw/datasette/issues/49,343238262,MDEyOklzc3VlQ29tbWVudDM0MzIzODI2Mg==,9599,simonw,2017-11-09T17:58:59Z,2017-11-09T17:58:59Z,OWNER,The name should ideally be available on PyPI and should make sense as both a command line application and a library.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272661336,Pick a name, https://github.com/simonw/datasette/issues/48#issuecomment-343239062,https://api.github.com/repos/simonw/datasette/issues/48,343239062,MDEyOklzc3VlQ29tbWVudDM0MzIzOTA2Mg==,9599,simonw,2017-11-09T18:01:46Z,2017-11-09T18:01:46Z,OWNER,This looks promising: https://github.com/esnme/ultrajson/issues/124#issuecomment-323882878,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272391665,Switch to ujson, https://github.com/simonw/datasette/issues/50#issuecomment-343266326,https://api.github.com/repos/simonw/datasette/issues/50,343266326,MDEyOklzc3VlQ29tbWVudDM0MzI2NjMyNg==,9599,simonw,2017-11-09T19:33:18Z,2017-11-09T19:33:18Z,OWNER,http://sanic.readthedocs.io/en/latest/sanic/testing.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272694136,Unit tests against application itself, https://github.com/simonw/datasette/issues/49#issuecomment-343281876,https://api.github.com/repos/simonw/datasette/issues/49,343281876,MDEyOklzc3VlQ29tbWVudDM0MzI4MTg3Ng==,9599,simonw,2017-11-09T20:30:42Z,2017-11-09T20:30:42Z,OWNER,How about datasette?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272661336,Pick a name, https://github.com/simonw/datasette/issues/49#issuecomment-343551356,https://api.github.com/repos/simonw/datasette/issues/49,343551356,MDEyOklzc3VlQ29tbWVudDM0MzU1MTM1Ng==,9599,simonw,2017-11-10T18:33:22Z,2017-11-10T18:33:22Z,OWNER,I'm going with datasette.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272661336,Pick a name, https://github.com/simonw/datasette/issues/52#issuecomment-343557070,https://api.github.com/repos/simonw/datasette/issues/52,343557070,MDEyOklzc3VlQ29tbWVudDM0MzU1NzA3MA==,9599,simonw,2017-11-10T18:57:47Z,2017-11-10T18:57:47Z,OWNER,"https://file.io/ looks like it could be good for this. It's been around since 2015, and lets you upload a temporary file which can be downloaded once. $ curl -s -F ""file=@database.db"" ""https://file.io/?expires=1d"" {""success"":true,""key"":""ySrl1j"",""link"":""https://file.io/ySrl1j"",""expiry"":""1 day""} Downloading from that URL serves up the data with a `Content-disposition` header containing the filename: simonw$ curl -vv https://file.io/ySrl1j | more % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0* Trying 34.232.1.167... 
* Connected to file.io (34.232.1.167) port 443 (#0) * TLS 1.2 connection using TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256 * Server certificate: file.io * Server certificate: Amazon * Server certificate: Amazon Root CA 1 * Server certificate: Starfield Services Root Certificate Authority - G2 > GET /ySrl1j HTTP/1.1 > Host: file.io > User-Agent: curl/7.43.0 > Accept: */* > < HTTP/1.1 200 OK < Date: Fri, 10 Nov 2017 18:14:38 GMT < Content-Type: undefined < Transfer-Encoding: chunked < Connection: keep-alive < X-Powered-By: Express < X-RateLimit-Limit: 5 < X-RateLimit-Remaining: 4 < Access-Control-Allow-Origin: * < Access-Control-Allow-Headers: Cache-Control,X-reqed-With,x-requested-with < Content-disposition: attachment; filename=database.db ... ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273026602,Solution for temporarily uploading DB so it can be built by docker, https://github.com/simonw/datasette/issues/20#issuecomment-343581130,https://api.github.com/repos/simonw/datasette/issues/20,343581130,MDEyOklzc3VlQ29tbWVudDM0MzU4MTEzMA==,9599,simonw,2017-11-10T20:44:38Z,2017-11-10T20:44:38Z,OWNER,"I'm going to handle this a different way. I'm going to support a local history of your own queries stored in localStorage, but if you want to share a query you have to do it with a URL. If people really want canned query support, they can do that using custom templates - see #12 - or by adding views to their database before they publish it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/21#issuecomment-343581332,https://api.github.com/repos/simonw/datasette/issues/21,343581332,MDEyOklzc3VlQ29tbWVudDM0MzU4MTMzMg==,9599,simonw,2017-11-10T20:45:42Z,2017-11-10T20:45:42Z,OWNER,I'm not going to use Sanic's mechanism for this. I'll use arguments passed to my cli instead.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267769034,Use Sanic configuration mechanism , https://github.com/simonw/datasette/issues/16#issuecomment-343643332,https://api.github.com/repos/simonw/datasette/issues/16,343643332,MDEyOklzc3VlQ29tbWVudDM0MzY0MzMzMg==,9599,simonw,2017-11-11T06:00:04Z,2017-11-11T06:00:04Z,OWNER,"Here's what a table looks like now at a smaller screen size: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267726219,Default HTML/CSS needs to look reasonable and be responsive, https://github.com/simonw/datasette/issues/54#issuecomment-343644891,https://api.github.com/repos/simonw/datasette/issues/54,343644891,MDEyOklzc3VlQ29tbWVudDM0MzY0NDg5MQ==,9599,simonw,2017-11-11T06:39:54Z,2017-11-11T06:39:54Z,OWNER,"I can detect something is a view like this: SELECT name from sqlite_master WHERE type ='view'; ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273121803,Views should not attempt to link to records / use rowids, https://github.com/simonw/datasette/issues/26#issuecomment-343644976,https://api.github.com/repos/simonw/datasette/issues/26,343644976,MDEyOklzc3VlQ29tbWVudDM0MzY0NDk3Ng==,9599,simonw,2017-11-11T06:42:23Z,2017-11-11T06:42:23Z,OWNER,"Simplest version of this: 1. 
Create a temporary directory 2. Write a Dockerfile into it that pulls an image and pip installs datasette 3. Add symlinks to the DBs they listed (so we don't have to copy them) 4. Shell out to ""now"" 5. Done! ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267861210,Command line tool for uploading one or more DBs to Now, https://github.com/simonw/datasette/issues/26#issuecomment-343645249,https://api.github.com/repos/simonw/datasette/issues/26,343645249,MDEyOklzc3VlQ29tbWVudDM0MzY0NTI0OQ==,9599,simonw,2017-11-11T06:48:59Z,2017-11-11T06:48:59Z,OWNER,"Doing this works: import os os.link('/tmp/databases/northwind.db', '/tmp/tmp-blah/northwind.db') That creates a link in tmp-blah - and then when I delete that entire directory like so: import shutil shutil.rmtree('/tmp/tmp-blah') The original database is not deleted, just the link.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267861210,Command line tool for uploading one or more DBs to Now, https://github.com/simonw/datasette/issues/26#issuecomment-343645327,https://api.github.com/repos/simonw/datasette/issues/26,343645327,MDEyOklzc3VlQ29tbWVudDM0MzY0NTMyNw==,9599,simonw,2017-11-11T06:51:16Z,2017-11-11T06:51:16Z,OWNER,"I can create the temporary directory like so: import tempfile t = tempfile.TemporaryDirectory() t t.name '/var/folders/w9/0xm39tk94ng9h52g06z4b54c0000gp/T/tmpkym70wlp' And then to delete it all: t.cleanup() ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267861210,Command line tool for uploading one or more DBs to Now, https://github.com/simonw/datasette/issues/40#issuecomment-343646740,https://api.github.com/repos/simonw/datasette/issues/40,343646740,MDEyOklzc3VlQ29tbWVudDM0MzY0Njc0MA==,9599,simonw,2017-11-11T07:27:33Z,2017-11-11T07:27:33Z,OWNER,I'm happy with this now that I've implemented the publish command in #26 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572,Implement command-line tool interface, https://github.com/simonw/datasette/issues/47#issuecomment-343647102,https://api.github.com/repos/simonw/datasette/issues/47,343647102,MDEyOklzc3VlQ29tbWVudDM0MzY0NzEwMg==,9599,simonw,2017-11-11T07:36:00Z,2017-11-11T07:36:00Z,OWNER,"http://2016.padjo.org/tutorials/data-primer-census-acs1-demographics/ has a sqlite database: http://2016.padjo.org/files/data/starterpack/census-acs-1year/acs-1-year-2015.sqlite I tested this by deploying it here: https://datasette-fewuggrvwr.now.sh/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271831408,Create neat example database, https://github.com/simonw/datasette/issues/16#issuecomment-343647300,https://api.github.com/repos/simonw/datasette/issues/16,343647300,MDEyOklzc3VlQ29tbWVudDM0MzY0NzMwMA==,9599,simonw,2017-11-11T07:41:19Z,2017-11-11T07:53:09Z,OWNER,"Still needed: - [ ] A link to the homepage from some kind of navigation bar in the header - [ ] link to github.com/simonw/datasette in the footer - [ ] Slightly better titles (maybe ditch the visited link colours for titles only? 
should keep those for primary key links) - [ ] Links to the .json and .jsono versions of every view","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267726219,Default HTML/CSS needs to look reasonable and be responsive, https://github.com/simonw/datasette/issues/14#issuecomment-343675165,https://api.github.com/repos/simonw/datasette/issues/14,343675165,MDEyOklzc3VlQ29tbWVudDM0MzY3NTE2NQ==,9599,simonw,2017-11-11T16:07:10Z,2017-11-11T16:07:10Z,OWNER,The plugin system can also allow alternative providers for the `publish` command - e.g. maybe hook up hyper.sh as an option for publishing containers.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/59#issuecomment-343676574,https://api.github.com/repos/simonw/datasette/issues/59,343676574,MDEyOklzc3VlQ29tbWVudDM0MzY3NjU3NA==,9599,simonw,2017-11-11T16:29:48Z,2017-11-11T16:29:48Z,OWNER,See also #14,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273157085,datasette publish hyper, https://github.com/simonw/datasette/issues/60#issuecomment-343683566,https://api.github.com/repos/simonw/datasette/issues/60,343683566,MDEyOklzc3VlQ29tbWVudDM0MzY4MzU2Ng==,9599,simonw,2017-11-11T18:12:24Z,2017-11-11T18:12:24Z,OWNER,"I’m going to solve this by making it an optional argument you can pass to the serve command. Then the Dockerfile can still build and use it but it won’t interfere with tests or dev. If argument is not passed, we will calculate hashes on startup and calculate table row counts on demand. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273163905,Rethink how metadata is generated and stored, https://github.com/simonw/datasette/issues/47#issuecomment-343690060,https://api.github.com/repos/simonw/datasette/issues/47,343690060,MDEyOklzc3VlQ29tbWVudDM0MzY5MDA2MA==,9599,simonw,2017-11-11T19:56:08Z,2017-11-11T19:56:08Z,OWNER," ""parlgov-development.db"": { ""url"": ""http://www.parlgov.org/"" }, ""nhsadmin.sqlite"": { ""url"": ""https://github.com/psychemedia/openHealthDataDoodles"" }","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271831408,Create neat example database, https://github.com/simonw/datasette/issues/16#issuecomment-343691342,https://api.github.com/repos/simonw/datasette/issues/16,343691342,MDEyOklzc3VlQ29tbWVudDM0MzY5MTM0Mg==,9599,simonw,2017-11-11T20:19:07Z,2017-11-11T20:19:07Z,OWNER,"Closing this, opening a fresh ticket for the navigation stuff.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267726219,Default HTML/CSS needs to look reasonable and be responsive, https://github.com/simonw/datasette/issues/63#issuecomment-343697291,https://api.github.com/repos/simonw/datasette/issues/63,343697291,MDEyOklzc3VlQ29tbWVudDM0MzY5NzI5MQ==,9599,simonw,2017-11-11T22:05:06Z,2017-11-11T22:11:49Z,OWNER,"I'm going to bundle sql and sql_params together into a query nested object like this: { ""query"": { ""sql"": ""select ..."", ""params"": { ""p0"": ""blah"" } } }","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273174447,Review design of JSON output, https://github.com/simonw/datasette/issues/50#issuecomment-343698214,https://api.github.com/repos/simonw/datasette/issues/50,343698214,MDEyOklzc3VlQ29tbWVudDM0MzY5ODIxNA==,9599,simonw,2017-11-11T22:23:21Z,2017-11-11T22:23:21Z,OWNER,"I'm closing #50 - more tests will be added in the future, but the framework is neatly in place for them now. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272694136,Unit tests against application itself, https://github.com/simonw/datasette/issues/53#issuecomment-343699115,https://api.github.com/repos/simonw/datasette/issues/53,343699115,MDEyOklzc3VlQ29tbWVudDM0MzY5OTExNQ==,9599,simonw,2017-11-11T22:41:38Z,2017-11-11T22:41:38Z,OWNER,This needs to incorporate a sensible way of presenting custom SQL query results too. 
And let's get a textarea in there for executing SQL while we're at it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273054652,Implement a better database index page, https://github.com/simonw/datasette/issues/47#issuecomment-343705966,https://api.github.com/repos/simonw/datasette/issues/47,343705966,MDEyOklzc3VlQ29tbWVudDM0MzcwNTk2Ng==,9599,simonw,2017-11-12T01:00:20Z,2017-11-12T01:00:20Z,OWNER,https://github.com/fivethirtyeight/data has a ton of CSVs,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271831408,Create neat example database, https://github.com/simonw/datasette/issues/53#issuecomment-343707624,https://api.github.com/repos/simonw/datasette/issues/53,343707624,MDEyOklzc3VlQ29tbWVudDM0MzcwNzYyNA==,9599,simonw,2017-11-12T01:47:45Z,2017-11-12T01:47:45Z,OWNER,Split the SQL thing out into #65 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273054652,Implement a better database index page, https://github.com/simonw/datasette/issues/53#issuecomment-343707676,https://api.github.com/repos/simonw/datasette/issues/53,343707676,MDEyOklzc3VlQ29tbWVudDM0MzcwNzY3Ng==,9599,simonw,2017-11-12T01:49:07Z,2017-11-12T01:49:07Z,OWNER,"Here's the new design: Also lists views at the bottom (refs #54): ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273054652,Implement a better database index page, https://github.com/simonw/datasette/issues/42#issuecomment-343708447,https://api.github.com/repos/simonw/datasette/issues/42,343708447,MDEyOklzc3VlQ29tbWVudDM0MzcwODQ0Nw==,9599,simonw,2017-11-12T02:12:15Z,2017-11-12T02:12:15Z,OWNER,I ditched the metadata file concept.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268591332,Homepage UI for editing metadata file, https://github.com/simonw/datasette/issues/65#issuecomment-343709217,https://api.github.com/repos/simonw/datasette/issues/65,343709217,MDEyOklzc3VlQ29tbWVudDM0MzcwOTIxNw==,9599,simonw,2017-11-12T02:36:37Z,2017-11-12T02:36:37Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273191608,Re-implement ?sql= mode, https://github.com/simonw/datasette/issues/25#issuecomment-343715915,https://api.github.com/repos/simonw/datasette/issues/25,343715915,MDEyOklzc3VlQ29tbWVudDM0MzcxNTkxNQ==,9599,simonw,2017-11-12T06:08:28Z,2017-11-12T06:08:28Z,OWNER," con = sqlite3.connect('existing_db.db') with open('dump.sql', 'w') as f: for line in con.iterdump(): f.write('%s\n' % line) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267857622,Endpoint that returns SQL ready to be piped into DB, https://github.com/simonw/datasette/issues/42#issuecomment-343752404,https://api.github.com/repos/simonw/datasette/issues/42,343752404,MDEyOklzc3VlQ29tbWVudDM0Mzc1MjQwNA==,9599,simonw,2017-11-12T17:20:10Z,2017-11-12T17:20:10Z,OWNER,"Re-opening this - I've decided to bring back this concept, see #68 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268591332,Homepage UI for editing metadata file, 
https://github.com/simonw/datasette/issues/69#issuecomment-343752579,https://api.github.com/repos/simonw/datasette/issues/69,343752579,MDEyOklzc3VlQ29tbWVudDM0Mzc1MjU3OQ==,9599,simonw,2017-11-12T17:22:39Z,2017-11-12T17:22:39Z,OWNER,"By default I'll allow LIMIT and OFFSET up to a maximum of X (where X is let's say 50,000 to start with, but can be custom configured to a larger number or set to None for no limit).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273248366,Enforce pagination (or at least limits) for arbitrary custom SQL, https://github.com/simonw/datasette/issues/66#issuecomment-343752683,https://api.github.com/repos/simonw/datasette/issues/66,343752683,MDEyOklzc3VlQ29tbWVudDM0Mzc1MjY4Mw==,9599,simonw,2017-11-12T17:24:05Z,2017-11-12T17:24:21Z,OWNER,"Maybe SQL views should have their own Sanic view class (`ViewView` is kinda funny), subclassed from `TableView`?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273191806,Show table SQL on table page, https://github.com/simonw/datasette/issues/68#issuecomment-343754058,https://api.github.com/repos/simonw/datasette/issues/68,343754058,MDEyOklzc3VlQ29tbWVudDM0Mzc1NDA1OA==,9599,simonw,2017-11-12T17:46:13Z,2017-11-12T17:46:13Z,OWNER,I’m going to store this stuff in a file called metadata.json and move the existing automatically generated metadata to a file called build.json,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273247186,Support for title/source/license metadata, https://github.com/simonw/datasette/issues/68#issuecomment-343753999,https://api.github.com/repos/simonw/datasette/issues/68,343753999,MDEyOklzc3VlQ29tbWVudDM0Mzc1Mzk5OQ==,9599,simonw,2017-11-12T17:45:21Z,2017-11-12T19:38:33Z,OWNER,"For initial launch, I could just support this as some optional command line arguments you pass to the publish command: datasette publish data.db --title=""Title"" --source=""url""","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273247186,Support for title/source/license metadata, https://github.com/simonw/datasette/issues/57#issuecomment-343769692,https://api.github.com/repos/simonw/datasette/issues/57,343769692,MDEyOklzc3VlQ29tbWVudDM0Mzc2OTY5Mg==,9599,simonw,2017-11-12T21:32:36Z,2017-11-12T21:32:36Z,OWNER,I have created a Docker Hub public repository for this: https://hub.docker.com/r/simonwillison/datasette/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694,Ship a Docker image of the whole thing, https://github.com/simonw/datasette/issues/69#issuecomment-343780039,https://api.github.com/repos/simonw/datasette/issues/69,343780039,MDEyOklzc3VlQ29tbWVudDM0Mzc4MDAzOQ==,9599,simonw,2017-11-13T00:05:27Z,2017-11-13T00:05:27Z,OWNER,"I think the only safe way to do this is using SQLite `.fetchmany(1000)` - I can't guarantee that the user has not entered SQL that will outfox a limit in some way. 
So instead of attempting to edit their SQL, I'll always return 1001 records and let them know if they went over 1000 or not.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273248366,Enforce pagination (or at least limits) for arbitrary custom SQL, https://github.com/simonw/datasette/issues/71#issuecomment-343780141,https://api.github.com/repos/simonw/datasette/issues/71,343780141,MDEyOklzc3VlQ29tbWVudDM0Mzc4MDE0MQ==,9599,simonw,2017-11-13T00:06:52Z,2017-11-13T00:06:52Z,OWNER,I've registered datasettes.com as a domain name for doing this. Now setting it up so Cloudflare and Now can serve content from it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343780671,https://api.github.com/repos/simonw/datasette/issues/71,343780671,MDEyOklzc3VlQ29tbWVudDM0Mzc4MDY3MQ==,9599,simonw,2017-11-13T00:15:21Z,2017-11-13T00:17:37Z,OWNER,- [x] Redirect https://datasettes.com/ and https://www.datasettes.com/ to https://github.com/simonw/datasette,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343780814,https://api.github.com/repos/simonw/datasette/issues/71,343780814,MDEyOklzc3VlQ29tbWVudDM0Mzc4MDgxNA==,9599,simonw,2017-11-13T00:17:50Z,2017-11-13T00:18:19Z,OWNER,"Achieved those redirects using Cloudflare ""page rules"": https://www.cloudflare.com/a/page-rules/datasettes.com","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343780539,https://api.github.com/repos/simonw/datasette/issues/71,343780539,MDEyOklzc3VlQ29tbWVudDM0Mzc4MDUzOQ==,9599,simonw,2017-11-13T00:13:29Z,2017-11-13T00:19:46Z,OWNER,"https://zeit.co/docs/features/dns is docs now domain add -e datasettes.com I had to set up a custom TXT record on `_now.datasettes.com` to get this to work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343788581,https://api.github.com/repos/simonw/datasette/issues/71,343788581,MDEyOklzc3VlQ29tbWVudDM0Mzc4ODU4MQ==,9599,simonw,2017-11-13T01:48:17Z,2017-11-13T01:48:17Z,OWNER,"I had to add a rule like this to get letsencrypt certificates on now.sh working: https://github.com/zeit/now-cli/issues/188#issuecomment-270105052 I also have to flip this switch off every time I want to add a new alias: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343788780,https://api.github.com/repos/simonw/datasette/issues/71,343788780,MDEyOklzc3VlQ29tbWVudDM0Mzc4ODc4MA==,9599,simonw,2017-11-13T01:50:01Z,2017-11-13T01:50:01Z,OWNER,"Added another page rule in order to get Cloudflare to always obey cache 
headers sent by the server: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343788817,https://api.github.com/repos/simonw/datasette/issues/71,343788817,MDEyOklzc3VlQ29tbWVudDM0Mzc4ODgxNw==,9599,simonw,2017-11-13T01:50:27Z,2017-11-13T01:50:27Z,OWNER,https://fivethirtyeight.datasettes.com/ is now up and running.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343789162,https://api.github.com/repos/simonw/datasette/issues/71,343789162,MDEyOklzc3VlQ29tbWVudDM0Mzc4OTE2Mg==,9599,simonw,2017-11-13T01:53:29Z,2017-11-13T01:53:29Z,OWNER,"``` $ curl -i 'https://fivethirtyeight.datasettes.com/fivethirtyeight-75d605c/obama-commutations%2Fobama_commutations.csv.jsono' HTTP/1.1 200 OK Date: Mon, 13 Nov 2017 01:50:57 GMT Content-Type: application/json Transfer-Encoding: chunked Connection: keep-alive Set-Cookie: __cfduid=de836090f3e12a60579cc7a1696cf0d9e1510537857; expires=Tue, 13-Nov-18 01:50:57 GMT; path=/; domain=.datasettes.com; HttpOnly; Secure Access-Control-Allow-Origin: * Cache-Control: public, max-age=31536000 X-Now-Region: now-sfo CF-Cache-Status: HIT Expires: Tue, 13 Nov 2018 01:50:57 GMT Server: cloudflare-nginx CF-RAY: 3bce154a6d9293b4-SJC {""database"": ""fivethirtyeight"", ""table"": ""obama-commutations/obama_commutations.csv""...```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343781030,https://api.github.com/repos/simonw/datasette/issues/71,343781030,MDEyOklzc3VlQ29tbWVudDM0Mzc4MTAzMA==,9599,simonw,2017-11-13T00:21:05Z,2017-11-13T02:09:32Z,OWNER,"- [x] Have `now domain add -e datasettes.com` run without errors (hopefully just a matter of waiting for the DNS to update) - [x] Alias an example dataset hosted on Now on a datasettes.com subdomain - [x] Confirm that HTTP caching and HTTP/2 redirect pushing works as expected - this may require another page rule","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/71#issuecomment-343790984,https://api.github.com/repos/simonw/datasette/issues/71,343790984,MDEyOklzc3VlQ29tbWVudDM0Mzc5MDk4NA==,9599,simonw,2017-11-13T02:09:34Z,2017-11-13T02:09:34Z,OWNER,"HTTP/2 push totally worked on the redirect! fetch('https://fivethirtyeight.datasettes.com/fivethirtyeight/riddler-pick-lowest%2Flow_numbers.csv.jsono').then(r => r.json()).then(console.log) Meanwhile, in the network pane... 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840,Set up some example datasets on a Cloudflare-backed domain, https://github.com/simonw/datasette/issues/68#issuecomment-343791348,https://api.github.com/repos/simonw/datasette/issues/68,343791348,MDEyOklzc3VlQ29tbWVudDM0Mzc5MTM0OA==,9599,simonw,2017-11-13T02:12:58Z,2017-11-13T02:12:58Z,OWNER,I should use this on https://fivethirtyeight.datasettes.com/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273247186,Support for title/source/license metadata, https://github.com/simonw/datasette/issues/73#issuecomment-343801392,https://api.github.com/repos/simonw/datasette/issues/73,343801392,MDEyOklzc3VlQ29tbWVudDM0MzgwMTM5Mg==,9599,simonw,2017-11-13T03:36:47Z,2017-11-13T03:36:47Z,OWNER,"While I’m at it, let’s allow people to opt out of HTTP/2 push with a ?_nopush=1 argument too - in case they decide they don’t want to receive large 302 responses.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273296178,_nocache=1 query string option for use with sort-by-random, https://github.com/simonw/datasette/issues/68#issuecomment-343951751,https://api.github.com/repos/simonw/datasette/issues/68,343951751,MDEyOklzc3VlQ29tbWVudDM0Mzk1MTc1MQ==,9599,simonw,2017-11-13T15:21:04Z,2017-11-13T15:21:04Z,OWNER,"For first version, I'm just supporting title, source and license information at the database level.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273247186,Support for title/source/license metadata, https://github.com/simonw/datasette/issues/67#issuecomment-343961784,https://api.github.com/repos/simonw/datasette/issues/67,343961784,MDEyOklzc3VlQ29tbWVudDM0Mzk2MTc4NA==,9599,simonw,2017-11-13T15:50:50Z,2017-11-13T15:50:50Z,OWNER,"`datasette package ...` - same arguments as `datasette publish`. Creates Docker container in your local repo, optionally tagged with `--tag`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273192789,Command that builds a local docker container, https://github.com/simonw/datasette/issues/67#issuecomment-343967020,https://api.github.com/repos/simonw/datasette/issues/67,343967020,MDEyOklzc3VlQ29tbWVudDM0Mzk2NzAyMA==,9599,simonw,2017-11-13T16:06:10Z,2017-11-13T16:06:10Z,OWNER,http://odewahn.github.io/docker-jumpstart/example.html is helpful,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273192789,Command that builds a local docker container, https://github.com/simonw/datasette/issues/75#issuecomment-344000982,https://api.github.com/repos/simonw/datasette/issues/75,344000982,MDEyOklzc3VlQ29tbWVudDM0NDAwMDk4Mg==,9599,simonw,2017-11-13T17:50:27Z,2017-11-13T17:50:27Z,OWNER,"This is necessary because one of the fun things to do with this tool is run it locally, e.g.: datasette ~/Library/Application\ Support/Google/Chrome/Default/History -p 8003 BUT... if we enable CORS by default, an evil site could try sniffing for localhost:8003 and attempt to steal data. 
So we'll enable the CORS headers only if `--cors` is provided to the command, and then use that command in the default Dockerfile.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273509159,Add --cors argument to serve, https://github.com/simonw/datasette/issues/51#issuecomment-344017088,https://api.github.com/repos/simonw/datasette/issues/51,344017088,MDEyOklzc3VlQ29tbWVudDM0NDAxNzA4OA==,9599,simonw,2017-11-13T18:44:23Z,2017-11-13T18:44:23Z,OWNER,Implemented in https://github.com/simonw/datasette/commit/e838bd743d31358b362875854a0ac5e78047727f,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272735257,Make a proper README, https://github.com/simonw/datasette/issues/74#issuecomment-344018680,https://api.github.com/repos/simonw/datasette/issues/74,344018680,MDEyOklzc3VlQ29tbWVudDM0NDAxODY4MA==,9599,simonw,2017-11-13T18:49:58Z,2017-11-13T18:49:58Z,OWNER,Turns out it does this already: https://github.com/simonw/datasette/blob/6b3b05b6db0d2a7b7cec8b8dbb4ddc5e12a376b2/datasette/app.py#L96-L107,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273296684,Send a 302 redirect to the new hash for hits to old hashes, https://github.com/simonw/datasette/issues/69#issuecomment-344019631,https://api.github.com/repos/simonw/datasette/issues/69,344019631,MDEyOklzc3VlQ29tbWVudDM0NDAxOTYzMQ==,9599,simonw,2017-11-13T18:53:13Z,2017-11-13T18:53:13Z,OWNER,I'm going with a page size of 100 and a max limit of 1000,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273248366,Enforce pagination (or at least limits) for arbitrary custom SQL, https://github.com/simonw/datasette/issues/69#issuecomment-344048656,https://api.github.com/repos/simonw/datasette/issues/69,344048656,MDEyOklzc3VlQ29tbWVudDM0NDA0ODY1Ng==,9599,simonw,2017-11-13T20:32:47Z,2017-11-13T20:32:47Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273248366,Enforce pagination (or at least limits) for arbitrary custom SQL, https://github.com/simonw/datasette/issues/55#issuecomment-344060070,https://api.github.com/repos/simonw/datasette/issues/55,344060070,MDEyOklzc3VlQ29tbWVudDM0NDA2MDA3MA==,9599,simonw,2017-11-13T21:14:13Z,2017-11-13T21:14:13Z,OWNER,"I'm going to add some extra metadata to setup.py and then tag this as version 0.8: git tag 0.8 git push --tags Then to ship to PyPI: python setup.py bdist_wheel twine register dist/datasette-0.8-py3-none-any.whl twine upload dist/datasette-0.8-py3-none-any.whl ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127117,Ship first version to PyPI, https://github.com/simonw/datasette/issues/55#issuecomment-344061762,https://api.github.com/repos/simonw/datasette/issues/55,344061762,MDEyOklzc3VlQ29tbWVudDM0NDA2MTc2Mg==,9599,simonw,2017-11-13T21:19:43Z,2017-11-13T21:19:43Z,OWNER,And we're live! 
https://pypi.python.org/pypi/datasette,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127117,Ship first version to PyPI, https://github.com/simonw/datasette/issues/80#issuecomment-344074443,https://api.github.com/repos/simonw/datasette/issues/80,344074443,MDEyOklzc3VlQ29tbWVudDM0NDA3NDQ0Mw==,9599,simonw,2017-11-13T22:04:54Z,2017-11-13T22:05:02Z,OWNER,"The fivethirtyeight dataset: datasette publish now --name fivethirtyeight --metadata metadata.json fivethirtyeight.db now alias https://fivethirtyeight-jyqfudvjli.now.sh fivethirtyeight.datasettes.com And parlgov: datasette publish now parlgov.db --name=parlgov --metadata=parlgov.json now alias https://parlgov-hqvxuhmbyh.now.sh parlgov.datasettes.com ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273569477,Deploy final versions of fivethirtyeight and parlgov datasets (with view pagination), https://github.com/simonw/datasette/issues/80#issuecomment-344075696,https://api.github.com/repos/simonw/datasette/issues/80,344075696,MDEyOklzc3VlQ29tbWVudDM0NDA3NTY5Ng==,9599,simonw,2017-11-13T22:09:46Z,2017-11-13T22:09:46Z,OWNER,"Parlgov was throwing errors on one of the views, which takes longer than 1000ms to execute - so I added the ability to customize the time limit in https://github.com/simonw/datasette/commit/1e698787a4dd6df0432021a6814c446c8b69bba2 datasette publish now parlgov.db --metadata parlgov.json --name parlgov --extra-options=""--sql_time_limit_ms=3500"" now alias https://parlgov-nvkcowlixq.now.sh parlgov.datasettes.com https://parlgov.datasettes.com/parlgov-25f9855/view_cabinet now returns in just over 2.5s ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273569477,Deploy final versions of fivethirtyeight and parlgov datasets (with view pagination), https://github.com/simonw/datasette/pull/81#issuecomment-344076554,https://api.github.com/repos/simonw/datasette/issues/81,344076554,MDEyOklzc3VlQ29tbWVudDM0NDA3NjU1NA==,9599,simonw,2017-11-13T22:12:57Z,2017-11-13T22:12:57Z,OWNER,"Hah, I haven't even announced this yet :) Travis is upset because I'm using SQL in the tests which isn't compatible with their version of Python 3.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273595473,:fire: Removes DS_Store, https://github.com/simonw/datasette/issues/59#issuecomment-344081876,https://api.github.com/repos/simonw/datasette/issues/59,344081876,MDEyOklzc3VlQ29tbWVudDM0NDA4MTg3Ng==,9599,simonw,2017-11-13T22:33:43Z,2017-11-13T22:33:43Z,OWNER,The `datasette package` command introduced in 4143e3b45c16cbae5e3e3419ef479a71810e7df3 is relevant here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273157085,datasette publish hyper, https://github.com/simonw/datasette/issues/82#issuecomment-344118849,https://api.github.com/repos/simonw/datasette/issues/82,344118849,MDEyOklzc3VlQ29tbWVudDM0NDExODg0OQ==,9599,simonw,2017-11-14T01:46:10Z,2017-11-14T01:46:10Z,OWNER,Did this: https://simonwillison.net/2017/Nov/13/datasette/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273596159,Post a blog entry announcing it to the world, 
https://github.com/simonw/datasette/pull/81#issuecomment-344125441,https://api.github.com/repos/simonw/datasette/issues/81,344125441,MDEyOklzc3VlQ29tbWVudDM0NDEyNTQ0MQ==,50527,jefftriplett,2017-11-14T02:24:54Z,2017-11-14T02:24:54Z,CONTRIBUTOR,"Oops, if I jumped the gun. I saw the project in my github activity feed and saw some low hanging fruit :) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273595473,:fire: Removes DS_Store, https://github.com/simonw/datasette/issues/47#issuecomment-344132481,https://api.github.com/repos/simonw/datasette/issues/47,344132481,MDEyOklzc3VlQ29tbWVudDM0NDEzMjQ4MQ==,9599,simonw,2017-11-14T03:08:13Z,2017-11-14T03:08:13Z,OWNER,I ended up shipping with https://fivethirtyeight.datasettes.com/ and https://parlgov.datasettes.com/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271831408,Create neat example database, https://github.com/simonw/datasette/issues/59#issuecomment-344141199,https://api.github.com/repos/simonw/datasette/issues/59,344141199,MDEyOklzc3VlQ29tbWVudDM0NDE0MTE5OQ==,9599,simonw,2017-11-14T04:13:11Z,2017-11-14T04:13:11Z,OWNER,"I managed to do this manually: datasette package ~/parlgov-db/parlgov.db --metadata=parlgov.json # Output 8758ec31dda3 as the new image ID docker save 8758ec31dda3 > /tmp/my-image # I could have just piped this straight to hyper cat /tmp/my-image | hyper load # Now start the container running in hyper hyper run -d -p 80:8001 --name parlgov 8758ec31dda3 # We need to assign an IP address so we can see it hyper fip allocate 1 # Outputs 199.245.58.78 hyper fip attach 199.245.58.78 parlgov At this point, visiting the IP address in a browser showed the parlgov UI. To clean up... hyper hyper fip detach parlgov hyper fip release 199.245.58.78 hyper stop parlgov hyper rm parlgov ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273157085,datasette publish hyper, https://github.com/simonw/datasette/issues/79#issuecomment-344141515,https://api.github.com/repos/simonw/datasette/issues/79,344141515,MDEyOklzc3VlQ29tbWVudDM0NDE0MTUxNQ==,9599,simonw,2017-11-14T04:16:01Z,2017-11-14T04:16:01Z,OWNER,This is probably a bit too much for the README - I should get readthedocs working.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273569068,Add more detailed API documentation to the README, https://github.com/simonw/datasette/issues/57#issuecomment-344145265,https://api.github.com/repos/simonw/datasette/issues/57,344145265,MDEyOklzc3VlQ29tbWVudDM0NDE0NTI2NQ==,247192,macropin,2017-11-14T04:45:38Z,2017-11-14T04:45:38Z,CONTRIBUTOR,"I'm happy to contribute this. Just let me know if you want a Dockerfile for development or production purposes, or both. 
If it's prod then we can just pip install the source from pypi, otherwise for dev we'll need a `requirements.txt` to speed up rebuilds.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694,Ship a Docker image of the whole thing, https://github.com/simonw/datasette/issues/57#issuecomment-344147583,https://api.github.com/repos/simonw/datasette/issues/57,344147583,MDEyOklzc3VlQ29tbWVudDM0NDE0NzU4Mw==,247192,macropin,2017-11-14T05:03:47Z,2017-11-14T05:03:47Z,CONTRIBUTOR,"Let me know if you'd like a PR. The image is usable as `docker run --rm -t -i -p 9000:8001 -v $(pwd)/db:/db datasette datasette serve /db/chinook.db`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694,Ship a Docker image of the whole thing, https://github.com/simonw/datasette/issues/57#issuecomment-344149165,https://api.github.com/repos/simonw/datasette/issues/57,344149165,MDEyOklzc3VlQ29tbWVudDM0NDE0OTE2NQ==,9599,simonw,2017-11-14T05:16:34Z,2017-11-14T05:17:14Z,OWNER,"I’m intrigued by this pattern: https://github.com/macropin/datasette/blob/147195c2fdfa2b984d8f9fc1c6cab6634970a056/Dockerfile#L8 What’s the benefit of doing that? Does it result in a smaller image size?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694,Ship a Docker image of the whole thing, https://github.com/simonw/datasette/issues/57#issuecomment-344151223,https://api.github.com/repos/simonw/datasette/issues/57,344151223,MDEyOklzc3VlQ29tbWVudDM0NDE1MTIyMw==,247192,macropin,2017-11-14T05:32:28Z,2017-11-14T05:33:03Z,CONTRIBUTOR,"The pattern is called ""multi-stage builds"". And the result is a svelte 226MB image (201MB for 3.6-slim) vs 700MB+ for the full image. It's possible to get it even smaller, but that takes a lot more work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694,Ship a Docker image of the whole thing, https://github.com/simonw/datasette/issues/46#issuecomment-344161226,https://api.github.com/repos/simonw/datasette/issues/46,344161226,MDEyOklzc3VlQ29tbWVudDM0NDE2MTIyNg==,9599,simonw,2017-11-14T06:41:21Z,2017-11-14T06:41:21Z,OWNER,Spatial extensions would be really useful too. 
https://www.gaia-gis.it/spatialite-2.1/SpatiaLite-manual.html,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-344161371,https://api.github.com/repos/simonw/datasette/issues/46,344161371,MDEyOklzc3VlQ29tbWVudDM0NDE2MTM3MQ==,9599,simonw,2017-11-14T06:42:15Z,2017-11-14T06:42:15Z,OWNER,http://charlesleifer.com/blog/going-fast-with-sqlite-and-python/ is useful here too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-344161430,https://api.github.com/repos/simonw/datasette/issues/46,344161430,MDEyOklzc3VlQ29tbWVudDM0NDE2MTQzMA==,9599,simonw,2017-11-14T06:42:44Z,2017-11-14T06:42:44Z,OWNER,Also requested on Twitter: https://twitter.com/DenubisX/status/930322813864439808,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/27#issuecomment-344179878,https://api.github.com/repos/simonw/datasette/issues/27,344179878,MDEyOklzc3VlQ29tbWVudDM0NDE3OTg3OA==,9599,simonw,2017-11-14T08:21:22Z,2017-11-14T08:21:22Z,OWNER,https://github.com/frappe/charts perhaps ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267886330,Ability to plot a simple graph, https://github.com/simonw/datasette/issues/43#issuecomment-344180866,https://api.github.com/repos/simonw/datasette/issues/43,344180866,MDEyOklzc3VlQ29tbWVudDM0NDE4MDg2Ng==,9599,simonw,2017-11-14T08:25:37Z,2017-11-14T08:25:37Z,OWNER,"This isn’t necessary - restarting the server is fast and easy, and I’ve not found myself needing this at all during development.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268592894,"While running, server should spot new db files added to its directory ", https://github.com/simonw/datasette/issues/57#issuecomment-344185817,https://api.github.com/repos/simonw/datasette/issues/57,344185817,MDEyOklzc3VlQ29tbWVudDM0NDE4NTgxNw==,9599,simonw,2017-11-14T08:46:24Z,2017-11-14T08:46:24Z,OWNER,Thanks for the explanation! Please do start a pull request. 
,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694,Ship a Docker image of the whole thing, https://github.com/simonw/datasette/issues/30#issuecomment-344352573,https://api.github.com/repos/simonw/datasette/issues/30,344352573,MDEyOklzc3VlQ29tbWVudDM0NDM1MjU3Mw==,9599,simonw,2017-11-14T18:29:01Z,2017-11-14T18:29:01Z,OWNER,This is a dupe of #85 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268078453,Do something neat with foreign keys, https://github.com/simonw/datasette/issues/93#issuecomment-344409906,https://api.github.com/repos/simonw/datasette/issues/93,344409906,MDEyOklzc3VlQ29tbWVudDM0NDQwOTkwNg==,9599,simonw,2017-11-14T21:47:02Z,2017-11-14T21:47:02Z,OWNER,"Even without bundling in the database file itself, I'd love to have a standalone binary version of the core `datasette` CLI utility. I think Sanic may have some complex dependencies, but I've never tried pyinstaller so I don't know how easy or hard it would be to get this working.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-344415756,https://api.github.com/repos/simonw/datasette/issues/93,344415756,MDEyOklzc3VlQ29tbWVudDM0NDQxNTc1Ng==,9599,simonw,2017-11-14T22:09:13Z,2017-11-14T22:09:13Z,OWNER,Looks like we'd need to use this recipe: https://github.com/pyinstaller/pyinstaller/wiki/Recipe-Setuptools-Entry-Point,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-344424382,https://api.github.com/repos/simonw/datasette/issues/93,344424382,MDEyOklzc3VlQ29tbWVudDM0NDQyNDM4Mg==,67420,atomotic,2017-11-14T22:42:16Z,2017-11-14T22:42:16Z,NONE,"tried quickly, this seems working: ``` ~ pip3 install pyinstaller ~ pyinstaller -F --add-data /usr/local/lib/python3.6/site-packages/datasette/templates:datasette/templates --add-data /usr/local/lib/python3.6/site-packages/datasette/static:datasette/static /usr/local/bin/datasette ~ du -h dist/datasette 6.8M dist/datasette ~ file dist/datasette dist/datasette: Mach-O 64-bit executable x86_64 ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-344426887,https://api.github.com/repos/simonw/datasette/issues/93,344426887,MDEyOklzc3VlQ29tbWVudDM0NDQyNjg4Nw==,9599,simonw,2017-11-14T22:51:46Z,2017-11-14T22:51:46Z,OWNER,"That didn't quite work for me. 
It built me a `dist/datasette` executable but when I try to run it I get an error: $ pwd /Users/simonw/Dropbox/Development/datasette $ source venv/bin/activate $ pyinstaller -F --add-data datasette/templates:datasette/templates --add-data datasette/static:datasette/static /Users/simonw/Dropbox/Development/datasette/venv/bin/datasette $ dist/datasette --help Traceback (most recent call last): File ""datasette"", line 11, in File ""site-packages/pkg_resources/__init__.py"", line 572, in load_entry_point File ""site-packages/pkg_resources/__init__.py"", line 564, in get_distribution File ""site-packages/pkg_resources/__init__.py"", line 436, in get_provider File ""site-packages/pkg_resources/__init__.py"", line 984, in require File ""site-packages/pkg_resources/__init__.py"", line 870, in resolve pkg_resources.DistributionNotFound: The 'datasette' distribution was not found and is required by the application [99117] Failed to execute script datasette ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/88#issuecomment-344427448,https://api.github.com/repos/simonw/datasette/issues/88,344427448,MDEyOklzc3VlQ29tbWVudDM0NDQyNzQ0OA==,9599,simonw,2017-11-14T22:54:06Z,2017-11-14T22:54:06Z,OWNER,Hooray! First dataset that wasn't deployed by me :) https://github.com/simonw/datasette/wiki/Datasettes,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273775212,Add NHS England Hospitals example to wiki, https://github.com/simonw/datasette/issues/88#issuecomment-344427560,https://api.github.com/repos/simonw/datasette/issues/88,344427560,MDEyOklzc3VlQ29tbWVudDM0NDQyNzU2MA==,9599,simonw,2017-11-14T22:54:33Z,2017-11-14T22:54:33Z,OWNER,I'm getting an internal server error on http://run.plnkr.co/preview/cj9zlf1qc0003414y90ajkwpk/ at the moment,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273775212,Add NHS England Hospitals example to wiki, https://github.com/simonw/datasette/issues/93#issuecomment-344430299,https://api.github.com/repos/simonw/datasette/issues/93,344430299,MDEyOklzc3VlQ29tbWVudDM0NDQzMDI5OQ==,67420,atomotic,2017-11-14T23:06:33Z,2017-11-14T23:06:33Z,NONE,"i will look better tomorrow, it's late i surely made some mistake https://asciinema.org/a/ZyAWbetrlriDadwWyVPUWB94H","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/88#issuecomment-344430689,https://api.github.com/repos/simonw/datasette/issues/88,344430689,MDEyOklzc3VlQ29tbWVudDM0NDQzMDY4OQ==,15543,tomdyson,2017-11-14T23:08:22Z,2017-11-14T23:08:22Z,CONTRIBUTOR,"> I'm getting an internal server error on http://run.plnkr.co/preview/cj9zlf1qc0003414y90ajkwpk/ at the moment Sorry about that - here's a working version on Netlify: https://nhs-england-map.netlify.com","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273775212,Add NHS England Hospitals example to wiki, 
https://github.com/simonw/datasette/issues/14#issuecomment-344438724,https://api.github.com/repos/simonw/datasette/issues/14,344438724,MDEyOklzc3VlQ29tbWVudDM0NDQzODcyNA==,9599,simonw,2017-11-14T23:47:54Z,2017-11-14T23:47:54Z,OWNER,"Plugins should be able to interact with the build step. This would give plugins an opportunity to modify the SQL databases and help prepare them for serving - for example, a full-text search plugin might create additional FTS tables, or a mapping plugin might pre-calculate a bunch of geohashes for tables that have latitude/longitude values. Plugins could really take advantage of the immutable nature of the dataset here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/93#issuecomment-344440377,https://api.github.com/repos/simonw/datasette/issues/93,344440377,MDEyOklzc3VlQ29tbWVudDM0NDQ0MDM3Nw==,9599,simonw,2017-11-14T23:56:35Z,2017-11-14T23:56:35Z,OWNER,"It worked! $ pyinstaller -F \ --add-data /usr/local/lib/python3.5/site-packages/datasette/templates:datasette/templates \ --add-data /usr/local/lib/python3.5/site-packages/datasette/static:datasette/static \ /usr/local/bin/datasette $ file dist/datasette dist/datasette: Mach-O 64-bit executable x86_64 $ dist/datasette --help Usage: datasette [OPTIONS] COMMAND [ARGS]... Datasette! Options: --help Show this message and exit. Commands: serve* Serve up specified SQLite database files with... build package Package specified SQLite files into a new... publish Publish specified SQLite database files to... ","{""total_count"": 3, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 3, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/93#issuecomment-344440658,https://api.github.com/repos/simonw/datasette/issues/93,344440658,MDEyOklzc3VlQ29tbWVudDM0NDQ0MDY1OA==,9599,simonw,2017-11-14T23:58:07Z,2017-11-14T23:58:07Z,OWNER,It's a shame pyinstaller can't act as a cross-compiler - so I don't think I can get Travis CI to build packages. But it's fantastic that it's possible to turn the tool into a standalone executable!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/85#issuecomment-344452063,https://api.github.com/repos/simonw/datasette/issues/85,344452063,MDEyOklzc3VlQ29tbWVudDM0NDQ1MjA2Mw==,9599,simonw,2017-11-15T01:03:03Z,2017-11-15T01:03:03Z,OWNER,"This can work in reverse too. If you view the row page for something that has foreign keys against it, we can show you “53 items in TABLE link to this” and provide a link to view them all. That count query could be prohibitively expensive. To counter that, we could run the count query via Ajax and set a strict time limit on it. 
See #95","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273678673,Detect foreign keys and use them to link HTML pages together, https://github.com/simonw/datasette/issues/85#issuecomment-344452326,https://api.github.com/repos/simonw/datasette/issues/85,344452326,MDEyOklzc3VlQ29tbWVudDM0NDQ1MjMyNg==,9599,simonw,2017-11-15T01:04:38Z,2017-11-15T01:04:38Z,OWNER,This will work well in conjunction with https://github.com/simonw/csvs-to-sqlite/issues/2,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273678673,Detect foreign keys and use them to link HTML pages together, https://github.com/simonw/datasette/pull/89#issuecomment-344462277,https://api.github.com/repos/simonw/datasette/issues/89,344462277,MDEyOklzc3VlQ29tbWVudDM0NDQ2MjI3Nw==,9599,simonw,2017-11-15T02:02:52Z,2017-11-15T02:02:52Z,OWNER,"This is exactly what I was after, thanks!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273816720,SQL syntax highlighting with CodeMirror, https://github.com/simonw/datasette/issues/13#issuecomment-344462608,https://api.github.com/repos/simonw/datasette/issues/13,344462608,MDEyOklzc3VlQ29tbWVudDM0NDQ2MjYwOA==,9599,simonw,2017-11-15T02:04:51Z,2017-11-15T02:04:51Z,OWNER,"Fixed in https://github.com/simonw/datasette/commit/8252daa4c14d73b4b69e3f2db4576bb39d73c070 - thanks, @tomdyson!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267542338,Add a syntax highlighting SQL editor, https://github.com/simonw/datasette/issues/95#issuecomment-344463436,https://api.github.com/repos/simonw/datasette/issues/95,344463436,MDEyOklzc3VlQ29tbWVudDM0NDQ2MzQzNg==,9599,simonw,2017-11-15T02:10:10Z,2017-11-15T02:10:10Z,OWNER,"This means clients can ask questions but say ""don't bother if it takes longer than X"" - which is really handy when you're working against unknown databases that might be small or might be enormous.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273998513,Allow shorter time limits to be set using a ?_sql_time_limit_ms =20 query string limit, https://github.com/simonw/datasette/pull/94#issuecomment-344472313,https://api.github.com/repos/simonw/datasette/issues/94,344472313,MDEyOklzc3VlQ29tbWVudDM0NDQ3MjMxMw==,9599,simonw,2017-11-15T03:08:00Z,2017-11-15T03:08:00Z,OWNER,"Works for me. I'm going to land this. Just one thing: simonw$ docker run --rm -t -i -p 9001:8001 c408e8cfbe40 datasette publish now The publish command requires ""now"" to be installed and configured Follow the instructions at https://zeit.co/now#whats-now Maybe we should have the Docker container install the ""now"" client? Not sure how much size that would add though. 
I think it's OK without for the moment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273961179,Initial add simple prod ready Dockerfile refs #57, https://github.com/simonw/datasette/issues/25#issuecomment-344487639,https://api.github.com/repos/simonw/datasette/issues/25,344487639,MDEyOklzc3VlQ29tbWVudDM0NDQ4NzYzOQ==,9599,simonw,2017-11-15T05:11:11Z,2017-11-15T05:11:11Z,OWNER,"Since you can already download the database directly, I'm not going to bother with this one.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267857622,Endpoint that returns SQL ready to be piped into DB, https://github.com/simonw/datasette/issues/93#issuecomment-344516406,https://api.github.com/repos/simonw/datasette/issues/93,344516406,MDEyOklzc3VlQ29tbWVudDM0NDUxNjQwNg==,67420,atomotic,2017-11-15T08:09:41Z,2017-11-15T08:09:41Z,NONE,actually you can use travis to build for linux/macos and [appveyor](https://www.appveyor.com/) to build for windows.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952,Package as standalone binary, https://github.com/simonw/datasette/issues/101#issuecomment-344597274,https://api.github.com/repos/simonw/datasette/issues/101,344597274,MDEyOklzc3VlQ29tbWVudDM0NDU5NzI3NA==,450244,eaubin,2017-11-15T13:48:55Z,2017-11-15T13:48:55Z,NONE,This is a duplicate of https://github.com/simonw/datasette/issues/100,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274161964,TemplateAssertionError: no filter named 'tojson', https://github.com/simonw/datasette/issues/85#issuecomment-344657040,https://api.github.com/repos/simonw/datasette/issues/85,344657040,MDEyOklzc3VlQ29tbWVudDM0NDY1NzA0MA==,9599,simonw,2017-11-15T16:56:48Z,2017-11-15T16:56:48Z,OWNER,"Since detecting foreign keys that point to a specific table is a bit expensive (you have to call a PRAGMA on every other table) I’m going to add this to the build/inspect stage. Idea: if we detect that the foreign key table only has one other column in it (id, name) AND we know that the id is the primary key, we can add an efficient lookup on the table list view and prefetch a dictionary mapping IDs to their value. Then we can feed that dictionary in as extra template context and use it to render labeled hyperlinks in the corresponding column. 
This means our build step should also cache which columns are indexed, and add a “label_column” property for tables with an obvious label column.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273678673,Detect foreign keys and use them to link HTML pages together, https://github.com/simonw/datasette/issues/90#issuecomment-344667202,https://api.github.com/repos/simonw/datasette/issues/90,344667202,MDEyOklzc3VlQ29tbWVudDM0NDY2NzIwMg==,9599,simonw,2017-11-15T17:29:38Z,2017-11-15T17:29:38Z,OWNER,@jacobian points out that a buildpack may be a better fit than a Docker container for implementing this: https://twitter.com/jacobian/status/930849058465255424,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/90#issuecomment-344680385,https://api.github.com/repos/simonw/datasette/issues/90,344680385,MDEyOklzc3VlQ29tbWVudDM0NDY4MDM4NQ==,9599,simonw,2017-11-15T18:14:11Z,2017-11-15T18:14:11Z,OWNER,"Maybe we don’t even need a buildpack... we could create a temporary directory, set up a classic heroku app with the datasette serve command in the Procfile and then git push to deploy.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/90#issuecomment-344686483,https://api.github.com/repos/simonw/datasette/issues/90,344686483,MDEyOklzc3VlQ29tbWVudDM0NDY4NjQ4Mw==,9599,simonw,2017-11-15T18:36:23Z,2017-11-15T18:36:23Z,OWNER,The “datasette build” command would need to run in a bin/post_compile script eg https://github.com/simonw/simonwillisonblog/blob/cloudflare-ips/bin/post_compile,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/90#issuecomment-344687328,https://api.github.com/repos/simonw/datasette/issues/90,344687328,MDEyOklzc3VlQ29tbWVudDM0NDY4NzMyOA==,9599,simonw,2017-11-15T18:39:14Z,2017-11-15T18:39:49Z,OWNER,"By default the command could use a temporary directory that gets cleaned up after the deploy, but we could allow users to opt in to keeping the generated directory like so: datasette publish heroku mydb.db -d ~/dev/my-heroku-app This would create the my-heroku-app folder so you can later execute further git deploys from there.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/pull/104#issuecomment-344710204,https://api.github.com/repos/simonw/datasette/issues/104,344710204,MDEyOklzc3VlQ29tbWVudDM0NDcxMDIwNA==,21148,jacobian,2017-11-15T19:57:50Z,2017-11-15T19:57:50Z,CONTRIBUTOR,"A first basic stab at making this work, just to prove the approach. Right now this requires [a Heroku CLI plugin](https://github.com/heroku/heroku-builds), which seems pretty unreasonable. I think this can be replaced with direct API calls, which could clean up a lot of things. 
But I wanted to prove it worked first, and it does.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274284246,[WIP] Add publish to heroku support, https://github.com/simonw/datasette/pull/107#issuecomment-344770170,https://api.github.com/repos/simonw/datasette/issues/107,344770170,MDEyOklzc3VlQ29tbWVudDM0NDc3MDE3MA==,9599,simonw,2017-11-16T00:01:00Z,2017-11-16T00:01:22Z,OWNER,"It is - but I think this will break on this line since it expects two format string parameters: https://github.com/simonw/datasette/blob/f45ca30f91b92ac68adaba893bf034f13ec61ced/datasette/utils.py#L61 Needs unit tests too, which live here: https://github.com/simonw/datasette/blob/f45ca30f91b92ac68adaba893bf034f13ec61ced/tests/test_utils.py#L49","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274343647,add support for ?field__isnull=1, https://github.com/simonw/datasette/issues/100#issuecomment-344771130,https://api.github.com/repos/simonw/datasette/issues/100,344771130,MDEyOklzc3VlQ29tbWVudDM0NDc3MTEzMA==,9599,simonw,2017-11-16T00:06:00Z,2017-11-16T00:06:00Z,OWNER,"Aha... it looks like this is a Jinja version problem: https://github.com/ansible/ansible/issues/25381#issuecomment-306492389 Datasette depends on sanic-jinja2 - and that doesn't depend on a particular jinja2 version: https://github.com/lixxu/sanic-jinja2/blob/7e9520850d8c6bb66faf43b7f252593d7efe3452/setup.py#L22 So if you have an older version of Jinja installed, stuff breaks.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274160723,TemplateAssertionError: no filter named 'tojson', https://github.com/simonw/datasette/issues/96#issuecomment-344786528,https://api.github.com/repos/simonw/datasette/issues/96,344786528,MDEyOklzc3VlQ29tbWVudDM0NDc4NjUyOA==,9599,simonw,2017-11-16T01:32:41Z,2017-11-16T01:32:41Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274001453,UI for editing named parameters, https://github.com/simonw/datasette/issues/96#issuecomment-344788435,https://api.github.com/repos/simonw/datasette/issues/96,344788435,MDEyOklzc3VlQ29tbWVudDM0NDc4ODQzNQ==,9599,simonw,2017-11-16T01:43:52Z,2017-11-16T01:43:52Z,OWNER,Demo: 
https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+name%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Animal+name%22%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalName%22%29+as+name+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+AnimalBreed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5BMitcham-dog-registrations-2015%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_NAME%22%29+as+name+from+%5Bburnside-dog-registrations-2015%5D+where+DOG_BREED+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Animal_Name%22%29+as+name+from+%5Bcity-of-playford-2015-dog-registration%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where%22Breed+Description%22+like+%3Abreed%0D%0A%0D%0A%29+group+by+name+order+by+n+desc%3B&breed=chihuahua,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274001453,UI for editing named parameters, https://github.com/simonw/datasette/issues/96#issuecomment-344788763,https://api.github.com/repos/simonw/datasette/issues/96,344788763,MDEyOklzc3VlQ29tbWVudDM0NDc4ODc2Mw==,9599,simonw,2017-11-16T01:45:51Z,2017-11-16T01:45:51Z,OWNER,Another demo - this time it lets you search by name and see the most popular breeds with that name: https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+breed%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Breed%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+%22Animal+name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalBreed%22%29+as+breed+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+%22AnimalName%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed%22%29+as+breed+from+%5BMitcham-dog-registrations-2015%5D+where+%22Animal+Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_BREED%22%29+as+breed+from+%5Bburnside-dog-registrations-2015%5D+where+%22DOG_NAME%22+like+%3Aname%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5Bcity-of-playford-2015-dog-registration%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed+Description%22%29+as+breed+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where+%22Animal+Name%22+like+%3Aname%0D%0A%0D%0A%29+group+by+breed+order+by+n+desc%3B&name=rex,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 
0}",274001453,UI for editing named parameters, https://github.com/simonw/datasette/issues/46#issuecomment-344810525,https://api.github.com/repos/simonw/datasette/issues/46,344810525,MDEyOklzc3VlQ29tbWVudDM0NDgxMDUyNQ==,54999,ingenieroariel,2017-11-16T04:11:25Z,2017-11-16T04:11:25Z,CONTRIBUTOR,"@simonw On the spatialite support, here is some info to make it work and a screenshot: I used the following Dockerfile: ``` FROM prolocutor/python3-sqlite-ext:3.5.1-spatialite as build RUN mkdir /code ADD . /code/ RUN pip install /code/ EXPOSE 8001 CMD [""datasette"", ""serve"", ""/code/ne.sqlite"", ""--host"", ""0.0.0.0""] ``` and added this to `prepare_connection`: ``` conn.enable_load_extension(True) conn.execute(""SELECT load_extension('/usr/local/lib/mod_spatialite.so')"") ```","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/pull/107#issuecomment-344811268,https://api.github.com/repos/simonw/datasette/issues/107,344811268,MDEyOklzc3VlQ29tbWVudDM0NDgxMTI2OA==,3433657,raynae,2017-11-16T04:17:45Z,2017-11-16T04:17:45Z,CONTRIBUTOR,"Thanks for the guidance. I added a unit test and made a slight change to utils.py. I didn't realize this, but evidently string.format only complains if you supply less arguments than there are format placeholders, so the original commit worked, but was adding a superfluous named param. I added a conditional that prevents the named param from being created and ensures the correct number of args are passed to sting.format. It has the side effect of hiding the SQL query in /templates/table.html when there are no other where clauses--not sure if that's the desired outcome here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274343647,add support for ?field__isnull=1, https://github.com/simonw/datasette/issues/100#issuecomment-344864254,https://api.github.com/repos/simonw/datasette/issues/100,344864254,MDEyOklzc3VlQ29tbWVudDM0NDg2NDI1NA==,13304454,coisnepe,2017-11-16T09:25:10Z,2017-11-16T09:25:10Z,NONE,@simonw I see. I upgraded sanic-jinja2 and jinja2: it now works flawlessly. Thank you!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274160723,TemplateAssertionError: no filter named 'tojson', https://github.com/simonw/datasette/issues/46#issuecomment-344975156,https://api.github.com/repos/simonw/datasette/issues/46,344975156,MDEyOklzc3VlQ29tbWVudDM0NDk3NTE1Ng==,9599,simonw,2017-11-16T16:19:44Z,2017-11-16T16:19:44Z,OWNER,"That's fantastic! Thank you very much for that. 
Do you know if it's possible to view the Dockerfile used by https://hub.docker.com/r/prolocutor/python3-sqlite-ext/ ?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-344976104,https://api.github.com/repos/simonw/datasette/issues/46,344976104,MDEyOklzc3VlQ29tbWVudDM0NDk3NjEwNA==,9599,simonw,2017-11-16T16:22:45Z,2017-11-16T16:22:45Z,OWNER,Found a relevant Dockerfile on Reddit: https://www.reddit.com/r/Python/comments/5unkb3/install_sqlite3_on_python_3/ddzdz2b/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-344976882,https://api.github.com/repos/simonw/datasette/issues/46,344976882,MDEyOklzc3VlQ29tbWVudDM0NDk3Njg4Mg==,9599,simonw,2017-11-16T16:25:07Z,2017-11-16T16:25:07Z,OWNER,Maybe part of the solution here is to add a `--load-extension` argument to `datasette` - so when you run the command you can specify SQLite extensions that should be loaded. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/109#issuecomment-344986423,https://api.github.com/repos/simonw/datasette/issues/109,344986423,MDEyOklzc3VlQ29tbWVudDM0NDk4NjQyMw==,9599,simonw,2017-11-16T16:53:26Z,2017-11-16T16:53:26Z,OWNER,http://datasette.readthedocs.io/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274378301,Set up readthedocs, https://github.com/simonw/datasette/issues/110#issuecomment-344988263,https://api.github.com/repos/simonw/datasette/issues/110,344988263,MDEyOklzc3VlQ29tbWVudDM0NDk4ODI2Mw==,9599,simonw,2017-11-16T16:58:48Z,2017-11-16T16:58:48Z,OWNER,"Here's how I tested this. First I downloaded and started a docker container using https://hub.docker.com/r/prolocutor/python3-sqlite-ext - which includes the compiled spatialite extension. This downloads it, then starts a shell in that container. docker run -it -p 8018:8018 prolocutor/python3-sqlite-ext:3.5.1-spatialite /bin/sh Installed a pre-release build of datasette which includes the new `--load-extension` option. 
pip install https://static.simonwillison.net/static/2017/datasette-0.13-py3-none-any.whl Now grab a sample database from https://www.gaia-gis.it/spatialite-2.3.1/resources.html - and unzip and rename it (datasette doesn't yet like databases with dots in their filename): wget http://www.gaia-gis.it/spatialite-2.3.1/test-2.3.sqlite.gz gunzip test-2.3.sqlite.gz mv test-2.3.sqlite test23.sqlite Now start datasette on port 8018 (the port I exposed earlier) with the extension loaded: datasette test23.sqlite -p 8018 -h 0.0.0.0 --load-extension /usr/local/lib/mod_spatialite.so Now I can confirm that it worked: http://localhost:8018/test23-c88bc35?sql=select+ST_AsText%28Geometry%29+from+HighWays+limit+1 If I run datasette without `--load-extension` I get this: datasette test23.sqlite -p 8018 -h 0.0.0.0 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274578142,Add --load-extension option to datasette for loading extra SQLite extensions, https://github.com/simonw/datasette/issues/46#issuecomment-344988591,https://api.github.com/repos/simonw/datasette/issues/46,344988591,MDEyOklzc3VlQ29tbWVudDM0NDk4ODU5MQ==,9599,simonw,2017-11-16T16:59:51Z,2017-11-16T16:59:51Z,OWNER,"OK, `--load-extension` is now a supported command line option - see #110 which includes my notes on how I manually tested it using the `prolocutor/python3-sqlite-ext` Docker image.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-344989340,https://api.github.com/repos/simonw/datasette/issues/46,344989340,MDEyOklzc3VlQ29tbWVudDM0NDk4OTM0MA==,9599,simonw,2017-11-16T17:02:07Z,2017-11-16T17:02:07Z,OWNER,The fact that `prolocutor/python3-sqlite-ext` doesn't provide a visible Dockerfile and hasn't been updated in two years makes me hesitant to bake it into datasette itself. 
I'd rather put together a Dockerfile that enables the necessary extensions and can live in the datasette repository itself.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-344995571,https://api.github.com/repos/simonw/datasette/issues/46,344995571,MDEyOklzc3VlQ29tbWVudDM0NDk5NTU3MQ==,9599,simonw,2017-11-16T17:22:32Z,2017-11-16T17:22:32Z,OWNER,The JSON extension would be very worthwhile too: https://www.sqlite.org/json1.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/46#issuecomment-345002908,https://api.github.com/repos/simonw/datasette/issues/46,345002908,MDEyOklzc3VlQ29tbWVudDM0NTAwMjkwOA==,54999,ingenieroariel,2017-11-16T17:47:49Z,2017-11-16T17:47:49Z,CONTRIBUTOR,I'll try to find alternatives to the Dockerfile option - I also think we should not use that old one without sources or license.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/111#issuecomment-345013127,https://api.github.com/repos/simonw/datasette/issues/111,345013127,MDEyOklzc3VlQ29tbWVudDM0NTAxMzEyNw==,9599,simonw,2017-11-16T18:23:56Z,2017-11-16T18:23:56Z,OWNER,Having this as a global option may not make sense when publishing multiple databases. We can revisit that when we implement per-database and per-table metadata.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274615452,Add “updated” to metadata, https://github.com/simonw/datasette/issues/110#issuecomment-345017256,https://api.github.com/repos/simonw/datasette/issues/110,345017256,MDEyOklzc3VlQ29tbWVudDM0NTAxNzI1Ng==,9599,simonw,2017-11-16T18:38:30Z,2017-11-16T18:38:30Z,OWNER,"To finish up, I committed the image I created in the above so I can run it again in the future: docker commit $(docker ps -lq) datasette-sqlite Now I can run it like this: docker run -it -p 8018:8018 datasette-sqlite datasette /tmp/test23.sqlite -p 8018 -h 0.0.0.0 --load-extension /usr/local/lib/mod_spatialite.so ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274578142,Add --load-extension option to datasette for loading extra SQLite extensions, https://github.com/simonw/datasette/issues/14#issuecomment-345067498,https://api.github.com/repos/simonw/datasette/issues/14,345067498,MDEyOklzc3VlQ29tbWVudDM0NTA2NzQ5OA==,9599,simonw,2017-11-16T21:25:32Z,2017-11-16T21:26:22Z,OWNER,"For visualizations, Google Maps should be made available as a plugin. The default visualizations can use Leaflet and Open Street Map, but there's no reason to not make Google Maps available as a plugin, especially if the plugin can provide a mechanism for configuring the necessary API key. 
I'm particularly excited in the Google Maps heatmap visualization https://developers.google.com/maps/documentation/javascript/heatmaplayer as seen on http://mochimachine.org/wasteland/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/pull/107#issuecomment-345108644,https://api.github.com/repos/simonw/datasette/issues/107,345108644,MDEyOklzc3VlQ29tbWVudDM0NTEwODY0NA==,9599,simonw,2017-11-17T00:34:46Z,2017-11-17T00:34:46Z,OWNER,Looks like your tests are failing because of a bug which I fixed in https://github.com/simonw/datasette/commit/9199945a1bcec4852e1cb866eb3642614dd32a48 - if you rebase to master the tests should pass.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274343647,add support for ?field__isnull=1, https://github.com/simonw/datasette/pull/107#issuecomment-345117690,https://api.github.com/repos/simonw/datasette/issues/107,345117690,MDEyOklzc3VlQ29tbWVudDM0NTExNzY5MA==,3433657,raynae,2017-11-17T01:29:41Z,2017-11-17T01:29:41Z,CONTRIBUTOR,"Thanks for bearing with me. I was getting a message about my branch diverging when I tried to push after rebasing, so I merged master into isnull, seems like that did the trick. Let me know if I should make any corrections.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274343647,add support for ?field__isnull=1, https://github.com/simonw/datasette/pull/114#issuecomment-345138134,https://api.github.com/repos/simonw/datasette/issues/114,345138134,MDEyOklzc3VlQ29tbWVudDM0NTEzODEzNA==,9599,simonw,2017-11-17T03:50:38Z,2017-11-17T03:50:38Z,OWNER,Fantastic! Thank you very much.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274733145,"Add spatialite, switch to debian and local build", https://github.com/simonw/datasette/issues/46#issuecomment-345138347,https://api.github.com/repos/simonw/datasette/issues/46,345138347,MDEyOklzc3VlQ29tbWVudDM0NTEzODM0Nw==,9599,simonw,2017-11-17T03:52:25Z,2017-11-17T03:52:25Z,OWNER,We now have a Dockerfile that compiles spatialite! 
https://github.com/simonw/datasette/pull/114/commits/6c6b63d890529eeefcefb7ab126ea3bd7b2315c1,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/85#issuecomment-345150048,https://api.github.com/repos/simonw/datasette/issues/85,345150048,MDEyOklzc3VlQ29tbWVudDM0NTE1MDA0OA==,9599,simonw,2017-11-17T05:35:25Z,2017-11-17T05:35:25Z,OWNER,`csvs-to-sqlite` is now capable of generating databases with foreign key lookup tables: https://github.com/simonw/csvs-to-sqlite/releases/tag/0.3,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273678673,Detect foreign keys and use them to link HTML pages together, https://github.com/simonw/datasette/issues/85#issuecomment-345242447,https://api.github.com/repos/simonw/datasette/issues/85,345242447,MDEyOklzc3VlQ29tbWVudDM0NTI0MjQ0Nw==,9599,simonw,2017-11-17T13:22:33Z,2017-11-17T13:23:14Z,OWNER,"I could support explicit label columns using additional arguments to `datasette serve`: datasette serve mydb.py --label-column mydb:table1:name --label-column mydb:table2:title This would mean ""in mydb, set the label column for table1 to name, and the label column for table2 to title""","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273678673,Detect foreign keys and use them to link HTML pages together, https://github.com/simonw/datasette/issues/112#issuecomment-345255655,https://api.github.com/repos/simonw/datasette/issues/112,345255655,MDEyOklzc3VlQ29tbWVudDM0NTI1NTY1NQ==,9599,simonw,2017-11-17T14:19:23Z,2017-11-17T14:19:23Z,OWNER,"I tesed this by first building and running a container using the new Dockerfile from #114: docker build . docker run -it -p 8001:8001 6c9ca7e29181 /bin/sh Then I ran this inside the container itself: apt update && apt-get install wget -y \ && wget http://www.gaia-gis.it/spatialite-2.3.1/test-2.3.sqlite.gz \ && gunzip test-2.3.sqlite.gz \ && mv test-2.3.sqlite test23.sqlite \ && datasette -h 0.0.0.0 test23.sqlite I visited this URL to confirm I got an error due to spatialite not being loaded: http://localhost:8001/test23-c88bc35?sql=select+ST_AsText%28Geometry%29+from+HighWays+limit+1 Then I checked that loading it with `--load-extension` worked correctly: datasette -h 0.0.0.0 test23.sqlite \ --load-extension=/usr/lib/x86_64-linux-gnu/mod_spatialite.so Then, finally, I tested it with the new environment variable option: SQLITE_EXTENSIONS=/usr/lib/x86_64-linux-gnu/mod_spatialite.so \ datasette -h 0.0.0.0 test23.sqlite Running it with an invalid environment variable option shows an error: $ SQLITE_EXTENSIONS=/usr/lib/x86_64-linux-gnu/blah.so datasette \ -h 0.0.0.0 test23.sqlite Usage: datasette -h [OPTIONS] [FILES]... Error: Invalid value for ""--load-extension"": Path ""/usr/lib/x86_64-linux-gnu/blah.so"" does not exist. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274617240,Allow --load-extension to be set via environment variables, https://github.com/simonw/datasette/pull/115#issuecomment-345256576,https://api.github.com/repos/simonw/datasette/issues/115,345256576,MDEyOklzc3VlQ29tbWVudDM0NTI1NjU3Ng==,9599,simonw,2017-11-17T14:22:51Z,2017-11-17T14:22:51Z,OWNER,"This is great - I've been frustrated by how CodeMirror prevents me from hitting tab-enter to activate the ""Run SQL"" button. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274877366,Add keyboard shortcut to execute SQL query, https://github.com/simonw/datasette/issues/46#issuecomment-345259115,https://api.github.com/repos/simonw/datasette/issues/46,345259115,MDEyOklzc3VlQ29tbWVudDM0NTI1OTExNQ==,9599,simonw,2017-11-17T14:32:12Z,2017-11-17T14:32:12Z,OWNER,"OK, I can confirm that the version in the new docker container supports FTS5, JSON *and* spatialite! Notes on how I built the container and tested the spatialite extension are here: https://github.com/simonw/datasette/issues/112#issuecomment-345255655 To confirm that JSON and FTS5 are working, I ran the following: $ docker run -it -p 8001:8001 6c9ca7e29181 python Python 3.6.3 (default, Nov 4 2017, 14:24:48) [GCC 6.3.0 20170516] on linux Type ""help"", ""copyright"", ""credits"" or ""license"" for more information. >>> import sqlite3 >>> sqlite3.connect(':memory:').execute('CREATE VIRTUAL TABLE email USING fts5(sender, title, body);') >>> list(sqlite3.connect(':memory:').execute('''SELECT json(' { ""this"" : ""is"", ""a"": [ ""test"" ] } ') ''')) [('{""this"":""is"",""a"":[""test""]}',)] If I do the same thing in python3 on my OS X laptop directly, I get this: $ python3 Python 3.5.1 (default, Apr 18 2016, 11:46:32) [GCC 4.2.1 Compatible Apple LLVM 7.3.0 (clang-703.0.29)] on darwin Type ""help"", ""copyright"", ""credits"" or ""license"" for more information. 
>>> import sqlite3 >>> sqlite3.connect(':memory:').execute('CREATE VIRTUAL TABLE email USING fts5(sender, title, body);') Traceback (most recent call last): File """", line 1, in sqlite3.OperationalError: no such module: fts5 >>> list(sqlite3.connect(':memory:').execute('''SELECT json(' { ""this"" : ""is"", ""a"": [ ""test"" ] } ') ''')) Traceback (most recent call last): File """", line 1, in sqlite3.OperationalError: no such function: json ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468,Dockerfile should build more recent SQLite with FTS5 and spatialite support, https://github.com/simonw/datasette/issues/64#issuecomment-345260784,https://api.github.com/repos/simonw/datasette/issues/64,345260784,MDEyOklzc3VlQ29tbWVudDM0NTI2MDc4NA==,9599,simonw,2017-11-17T14:38:21Z,2017-11-17T14:38:21Z,OWNER,This was fixed by ed2b3f25beac720f14869350baacc5f62b065194 in #107 - thanks @raynae!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273181020,Support for ?field__isnull=1 or similar, https://github.com/simonw/datasette/issues/36#issuecomment-345262738,https://api.github.com/repos/simonw/datasette/issues/36,345262738,MDEyOklzc3VlQ29tbWVudDM0NTI2MjczOA==,9599,simonw,2017-11-17T14:45:37Z,2017-11-17T14:45:37Z,OWNER,"Consider for example https://fivethirtyeight.datasettes.com/fivethirtyeight/inconvenient-sequel%2Fratings The idea here is to be able to support querystring parameters like this: * `?timestamp___date=2017-07-17` - return every item where the timestamp falls on that date * `?timestamp___year=2017` - return every item where the timestamp falls within 2017 * `?timestamp___month=1` - return every item where the month component is January * `?timestamp___day=10` - return every item where the day-of-the-month component is 10 This is similar to #64 but a fair bit more complicated. 
SQLite date functions are documented here: https://sqlite.org/lang_datefunc.html ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268262480,"date, year, month and day querystring lookups", https://github.com/simonw/datasette/issues/44#issuecomment-345343079,https://api.github.com/repos/simonw/datasette/issues/44,345343079,MDEyOklzc3VlQ29tbWVudDM0NTM0MzA3OQ==,9599,simonw,2017-11-17T19:29:43Z,2017-11-17T19:29:43Z,OWNER,Should this support sum/avg/etc as well?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/pull/117#issuecomment-345404257,https://api.github.com/repos/simonw/datasette/issues/117,345404257,MDEyOklzc3VlQ29tbWVudDM0NTQwNDI1Nw==,9599,simonw,2017-11-18T00:53:58Z,2017-11-18T00:53:58Z,OWNER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274900388,Don't prevent tabbing to `Run SQL` button, https://github.com/simonw/datasette/pull/104#issuecomment-345447161,https://api.github.com/repos/simonw/datasette/issues/104,345447161,MDEyOklzc3VlQ29tbWVudDM0NTQ0NzE2MQ==,9599,simonw,2017-11-18T14:53:17Z,2017-11-18T14:53:17Z,OWNER,any reason I shouldn't land this?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274284246,[WIP] Add publish to heroku support, https://github.com/simonw/datasette/issues/36#issuecomment-345448756,https://api.github.com/repos/simonw/datasette/issues/36,345448756,MDEyOklzc3VlQ29tbWVudDM0NTQ0ODc1Ng==,9599,simonw,2017-11-18T15:17:43Z,2017-11-18T15:17:43Z,OWNER,"This may be useful: https://github.com/coleifer/peewee/blob/db85167d93861451a1fe7cde8c4f05748b222634/peewee.py#L162-L185","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268262480,"date, year, month and day querystring lookups", https://github.com/simonw/datasette/issues/121#issuecomment-345452215,https://api.github.com/repos/simonw/datasette/issues/121,345452215,MDEyOklzc3VlQ29tbWVudDM0NTQ1MjIxNQ==,9599,simonw,2017-11-18T16:11:23Z,2017-11-18T16:11:23Z,OWNER,"If a column value is invalid JSON, let's return the invalid JSON as a regular string.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275089535,?_json=foo&_json=bar query string argument , https://github.com/simonw/datasette/pull/104#issuecomment-345452669,https://api.github.com/repos/simonw/datasette/issues/104,345452669,MDEyOklzc3VlQ29tbWVudDM0NTQ1MjY2OQ==,21148,jacobian,2017-11-18T16:18:45Z,2017-11-18T16:18:45Z,CONTRIBUTOR,"I'd like to do a bit of cleanup, and some error checking in case heroku/heroku-builds isn't installed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274284246,[WIP] Add publish to heroku support, https://github.com/simonw/datasette/issues/105#issuecomment-345493344,https://api.github.com/repos/simonw/datasette/issues/105,345493344,MDEyOklzc3VlQ29tbWVudDM0NTQ5MzM0NA==,9599,simonw,2017-11-19T05:28:49Z,2017-11-19T05:28:49Z,OWNER,Looks like there are a ton of interesting datasets packaged in this way at http://datahub.io/docs/core-data - see 
also https://github.com/datasets,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274314940,Consider data-package as a format for metadata, https://github.com/simonw/datasette/issues/105#issuecomment-345494052,https://api.github.com/repos/simonw/datasette/issues/105,345494052,MDEyOklzc3VlQ29tbWVudDM0NTQ5NDA1Mg==,9599,simonw,2017-11-19T05:49:53Z,2017-11-19T05:49:53Z,OWNER,https://github.com/rgieseke/pandas-datapackage-reader,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274314940,Consider data-package as a format for metadata, https://github.com/simonw/datasette/issues/85#issuecomment-345494724,https://api.github.com/repos/simonw/datasette/issues/85,345494724,MDEyOklzc3VlQ29tbWVudDM0NTQ5NDcyNA==,9599,simonw,2017-11-19T06:08:19Z,2017-11-19T06:08:19Z,OWNER,"This is working really nicely now: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273678673,Detect foreign keys and use them to link HTML pages together, https://github.com/simonw/datasette/issues/86#issuecomment-345494775,https://api.github.com/repos/simonw/datasette/issues/86,345494775,MDEyOklzc3VlQ29tbWVudDM0NTQ5NDc3NQ==,9599,simonw,2017-11-19T06:09:43Z,2017-11-19T06:09:43Z,OWNER,"Now that we have foreign key support (#85) this is even more important, since foreign key support actively encourages linking to filtered table views.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-345494918,https://api.github.com/repos/simonw/datasette/issues/86,345494918,MDEyOklzc3VlQ29tbWVudDM0NTQ5NDkxOA==,9599,simonw,2017-11-19T06:14:17Z,2017-11-19T06:14:17Z,OWNER,"If the selected relationship is a foreign key reference, we should resolve that foreign key and display it on the page.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/44#issuecomment-345494971,https://api.github.com/repos/simonw/datasette/issues/44,345494971,MDEyOklzc3VlQ29tbWVudDM0NTQ5NDk3MQ==,9599,simonw,2017-11-19T06:15:39Z,2017-11-19T06:15:39Z,OWNER,It would be great if this could support foreign key references and automatically resolve and hyperlink them if they are detected.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/127#issuecomment-345495046,https://api.github.com/repos/simonw/datasette/issues/127,345495046,MDEyOklzc3VlQ29tbWVudDM0NTQ5NTA0Ng==,9599,simonw,2017-11-19T06:17:42Z,2017-11-19T06:17:42Z,OWNER,Maybe I should support `&_count=1` to handle this - that would be easy to Ajax-in in conjenction with the other filters.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275135719,"Filtered tables should show count of all matching rows, if fast enough", 
https://github.com/simonw/datasette/issues/86#issuecomment-345496540,https://api.github.com/repos/simonw/datasette/issues/86,345496540,MDEyOklzc3VlQ29tbWVudDM0NTQ5NjU0MA==,9599,simonw,2017-11-19T06:59:40Z,2017-11-19T06:59:40Z,OWNER,"OK, I've figured out how to do an initial version of this without JavaScript. I'll provide three form fields labelled ""add filter"": * a select box of all of the columns * a select box of the available operations * a value box Submit those and the site will redirect you to a correctly populated querystring for that filter. If you have filters applied, those will display as prepopulated form field triples. For foreign key reference filters, I will display the resolved value next to the text box containing the numeric ID. In the future this can get a select2 style treatment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-345497453,https://api.github.com/repos/simonw/datasette/issues/86,345497453,MDEyOklzc3VlQ29tbWVudDM0NTQ5NzQ1Mw==,9599,simonw,2017-11-19T07:21:22Z,2017-11-19T07:21:22Z,OWNER,I'm going to be a bit classier about this and auto generate a title for the page that describes the currently applied filters.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-345497534,https://api.github.com/repos/simonw/datasette/issues/86,345497534,MDEyOklzc3VlQ29tbWVudDM0NTQ5NzUzNA==,9599,simonw,2017-11-19T07:23:33Z,2017-11-19T07:23:33Z,OWNER,"""Tablename: 3,567 rows where status = 3 (published) and n > 55""","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-345497689,https://api.github.com/repos/simonw/datasette/issues/86,345497689,MDEyOklzc3VlQ29tbWVudDM0NTQ5NzY4OQ==,9599,simonw,2017-11-19T07:27:40Z,2017-11-19T07:27:40Z,OWNER,"I'll have to refactor the foreign key annotating code to be usable in other contexts - at the moment it only works for annotating displays of rows, but I need to use it to resolve selected filters as well. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/105#issuecomment-345503897,https://api.github.com/repos/simonw/datasette/issues/105,345503897,MDEyOklzc3VlQ29tbWVudDM0NTUwMzg5Nw==,198537,rgieseke,2017-11-19T09:38:08Z,2017-11-19T09:38:08Z,CONTRIBUTOR,"Thanks, I wrote this very simple reader because the default approach as described on the Datahub pages seemed too complicated. I had metadata from the `datapackage.json` attached to the returned DataFrames but removed this due to some attribute handling change in the latest Pandas version. This could also be useful for getting from Data Package to SQL db: https://github.com/frictionlessdata/tableschema-sql-py I maintain a few climate science related datasets at https://github.com/openclimatedata/ The Data Retriever (mainly ecological data) by @ethanwhite et al. 
is also using the Data Package format for metadata and has some tooling for different dbs: https://frictionlessdata.io/articles/the-data-retriever/ https://github.com/weecology/retriever The Open Power System Data project also has a couple of datasets that show nicely how CSV is great for assembling and then already make SQLite files available. It's one of the first data sets I tried with Datasette, perfect for the use case of getting an API for putting power stations on a map ... https://data.open-power-system-data.org/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274314940,Consider data-package as a format for metadata, https://github.com/simonw/datasette/issues/97#issuecomment-345509500,https://api.github.com/repos/simonw/datasette/issues/97,345509500,MDEyOklzc3VlQ29tbWVudDM0NTUwOTUwMA==,231923,yschimke,2017-11-19T11:26:58Z,2017-11-19T11:26:58Z,NONE,"Specifically docs should make it clearer this file exists https://parlgov.datasettes.com/.json And from that you can build https://parlgov.datasettes.com/parlgov-25f9855.json Then https://parlgov.datasettes.com/parlgov-25f9855/cabinet.json","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274022950,Link to JSON for the list of tables , https://github.com/simonw/datasette/issues/131#issuecomment-345526171,https://api.github.com/repos/simonw/datasette/issues/131,345526171,MDEyOklzc3VlQ29tbWVudDM0NTUyNjE3MQ==,9599,simonw,2017-11-19T15:44:30Z,2017-11-19T15:44:30Z,OWNER,"Relevant SQLite docs: * https://sqlite.org/fts5.html * https://www.sqlite.org/fts3.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275166669,UI support for running FTS searches, https://github.com/simonw/datasette/issues/131#issuecomment-345526517,https://api.github.com/repos/simonw/datasette/issues/131,345526517,MDEyOklzc3VlQ29tbWVudDM0NTUyNjUxNw==,9599,simonw,2017-11-19T15:48:28Z,2017-11-19T15:48:28Z,OWNER,"Since SQLite supports column specifications in the MATCH body itself, there's no need to provide a separate mechanism for specifying columns in the query string: https://sqlite.org/fts5.html#fts5_column_filters","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275166669,UI support for running FTS searches, https://github.com/simonw/datasette/issues/131#issuecomment-345533274,https://api.github.com/repos/simonw/datasette/issues/131,345533274,MDEyOklzc3VlQ29tbWVudDM0NTUzMzI3NA==,9599,simonw,2017-11-19T17:17:37Z,2017-11-19T17:18:05Z,OWNER,"Demo: https://sf-trees.now.sh/sf-trees-ebc2ad9/Street_Tree_List?_search=grove+st ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275166669,UI support for running FTS searches, https://github.com/simonw/datasette/issues/134#issuecomment-345537268,https://api.github.com/repos/simonw/datasette/issues/134,345537268,MDEyOklzc3VlQ29tbWVudDM0NTUzNzI2OA==,9599,simonw,2017-11-19T18:10:48Z,2017-11-19T18:10:48Z,OWNER,Dupe of #127 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275176094,Filtered table view should show a count, 
https://github.com/simonw/datasette/issues/44#issuecomment-345537315,https://api.github.com/repos/simonw/datasette/issues/44,345537315,MDEyOklzc3VlQ29tbWVudDM0NTUzNzMxNQ==,9599,simonw,2017-11-19T18:11:27Z,2017-11-19T18:11:27Z,OWNER,This would enable faceted search - moving it to the search milestone.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/127#issuecomment-345538016,https://api.github.com/repos/simonw/datasette/issues/127,345538016,MDEyOklzc3VlQ29tbWVudDM0NTUzODAxNg==,9599,simonw,2017-11-19T18:22:45Z,2017-11-19T18:22:45Z,OWNER,I implemented a basic version of this in f59c840e7db8870afcdeba7a53bdea07bb674334 for custom SQL.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275135719,"Filtered tables should show count of all matching rows, if fast enough", https://github.com/simonw/datasette/issues/122#issuecomment-345552440,https://api.github.com/repos/simonw/datasette/issues/122,345552440,MDEyOklzc3VlQ29tbWVudDM0NTU1MjQ0MA==,9599,simonw,2017-11-19T21:46:43Z,2017-11-19T21:46:43Z,OWNER,"This calls for refactoring the code so the table view, the row view and the custom SQL view share as much logic as possible.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275092453,"Redesign JSON output, ditch jsono, offer variants controlled by parameter instead", https://github.com/simonw/datasette/issues/122#issuecomment-345552500,https://api.github.com/repos/simonw/datasette/issues/122,345552500,MDEyOklzc3VlQ29tbWVudDM0NTU1MjUwMA==,9599,simonw,2017-11-19T21:47:27Z,2017-11-19T21:47:27Z,OWNER,"To start with, I could just ditch the .jsono in favour of the new _shape argument.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275092453,"Redesign JSON output, ditch jsono, offer variants controlled by parameter instead", https://github.com/simonw/datasette/issues/86#issuecomment-345559864,https://api.github.com/repos/simonw/datasette/issues/86,345559864,MDEyOklzc3VlQ29tbWVudDM0NTU1OTg2NA==,9599,simonw,2017-11-19T23:35:48Z,2017-11-19T23:35:48Z,OWNER,"I need a nicer abstraction around the concept of filters. It needs to be able to: - convert querystring parameters into filters - convert filters into a querystring - iterate through currently applied filters - convert selected filters into a human description (e.g. for a title) - expand filters that involve a foreign key - add filters - remove filters - define different types of filters It should replace my current `build_where_clauses` implementation, in particular this bit: https://github.com/simonw/datasette/blob/a5881e105a02830d26f07e98177248d5910893da/datasette/utils.py#L38-L56","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/44#issuecomment-345342512,https://api.github.com/repos/simonw/datasette/issues/44,345342512,MDEyOklzc3VlQ29tbWVudDM0NTM0MjUxMg==,9599,simonw,2017-11-17T19:27:53Z,2017-11-20T04:37:35Z,OWNER,"This should support multiple columns, e.g. 
`?_group_count=precinct&_group_count=candidate`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/44#issuecomment-345601103,https://api.github.com/repos/simonw/datasette/issues/44,345601103,MDEyOklzc3VlQ29tbWVudDM0NTYwMTEwMw==,9599,simonw,2017-11-20T06:13:35Z,2017-11-20T06:13:35Z,OWNER,"Some demos: Single column: https://sf-trees-flat.now.sh/sf-trees-flat-ba738ce/Street_Tree_List?_group_count=qSpecies Multi column: https://sf-trees-flat.now.sh/sf-trees-flat-ba738ce/Street_Tree_List?_group_count=qLegalStatus&_group_count=qSpecies ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/133#issuecomment-345601870,https://api.github.com/repos/simonw/datasette/issues/133,345601870,MDEyOklzc3VlQ29tbWVudDM0NTYwMTg3MA==,9599,simonw,2017-11-20T06:18:53Z,2017-11-20T06:18:53Z,OWNER,This may be tackled by the filters work happening in #86,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275176006,"If view is filtered, search should apply within those filtered rows", https://github.com/simonw/datasette/issues/27#issuecomment-345652450,https://api.github.com/repos/simonw/datasette/issues/27,345652450,MDEyOklzc3VlQ29tbWVudDM0NTY1MjQ1MA==,198537,rgieseke,2017-11-20T10:19:39Z,2017-11-20T10:19:39Z,CONTRIBUTOR,"If Data Package metadata gets adopted (#105) the views spec work might also be worth a look: http://frictionlessdata.io/specs/views/ http://datahub.io/docs/features/views ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267886330,Ability to plot a simple graph, https://github.com/simonw/datasette/issues/129#issuecomment-345793887,https://api.github.com/repos/simonw/datasette/issues/129,345793887,MDEyOklzc3VlQ29tbWVudDM0NTc5Mzg4Nw==,9599,simonw,2017-11-20T19:00:30Z,2017-11-20T19:00:30Z,OWNER,"Need to hide these from the index summary page as well: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275164558,Hide FTS-created tables by default on the database index page, https://github.com/simonw/datasette/issues/105#issuecomment-345809808,https://api.github.com/repos/simonw/datasette/issues/105,345809808,MDEyOklzc3VlQ29tbWVudDM0NTgwOTgwOA==,9599,simonw,2017-11-20T19:50:53Z,2017-11-20T19:50:53Z,OWNER,"OK, https://github.com/openclimatedata/global-carbon-budget/blob/master/datapackage.json really does look like it covers all of the bases I need for #138. 
Closing this ticket in favour of that new one.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274314940,Consider data-package as a format for metadata, https://github.com/simonw/datasette/issues/42#issuecomment-345810031,https://api.github.com/repos/simonw/datasette/issues/42,345810031,MDEyOklzc3VlQ29tbWVudDM0NTgxMDAzMQ==,9599,simonw,2017-11-20T19:51:29Z,2017-11-20T19:51:29Z,OWNER,See also #138,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268591332,Homepage UI for editing metadata file, https://github.com/simonw/datasette/issues/14#issuecomment-345893877,https://api.github.com/repos/simonw/datasette/issues/14,345893877,MDEyOklzc3VlQ29tbWVudDM0NTg5Mzg3Nw==,9599,simonw,2017-11-21T02:11:27Z,2017-11-21T02:11:27Z,OWNER,http://setuptools.readthedocs.io/en/latest/setuptools.html#dynamic-discovery-of-services-and-plugins Is pretty good ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/pull/104#issuecomment-346116745,https://api.github.com/repos/simonw/datasette/issues/104,346116745,MDEyOklzc3VlQ29tbWVudDM0NjExNjc0NQ==,21148,jacobian,2017-11-21T18:23:25Z,2017-11-21T18:23:25Z,CONTRIBUTOR,"@simonw ready for a review and merge if you want. There's still some nasty duplicated code in cli.py and utils.py, which is just going to get worse if/when we start adding any other deploy targets (and I want to do one for cloud.gov, at least). I think there's an opportunity for some refactoring here. I'm happy to do that now as part of this PR, or if you merge this first I'll do it in a different one.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274284246,[WIP] Add publish to heroku support, https://github.com/simonw/datasette/pull/104#issuecomment-346124073,https://api.github.com/repos/simonw/datasette/issues/104,346124073,MDEyOklzc3VlQ29tbWVudDM0NjEyNDA3Mw==,21148,jacobian,2017-11-21T18:49:55Z,2017-11-21T18:49:55Z,CONTRIBUTOR,"Actually hang on, don't merge - there are some bugs that #141 masked when I tested this out elsewhere.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274284246,[WIP] Add publish to heroku support, https://github.com/simonw/datasette/pull/104#issuecomment-346124764,https://api.github.com/repos/simonw/datasette/issues/104,346124764,MDEyOklzc3VlQ29tbWVudDM0NjEyNDc2NA==,21148,jacobian,2017-11-21T18:52:14Z,2017-11-21T18:52:14Z,CONTRIBUTOR,"OK, now this should work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274284246,[WIP] Add publish to heroku support, https://github.com/simonw/datasette/issues/141#issuecomment-346157542,https://api.github.com/repos/simonw/datasette/issues/141,346157542,MDEyOklzc3VlQ29tbWVudDM0NjE1NzU0Mg==,9599,simonw,2017-11-21T20:53:47Z,2017-11-21T20:53:47Z,OWNER,"I think a copy is the right thing to do here - it will be cleaned up when the temp directory is removed. 
The hard link thing was always intended to save space, but if we can't do a hard link I don't see any harm in a temporary file copy.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275814941,datasette publish can fail if /tmp is on a different device, https://github.com/simonw/datasette/issues/90#issuecomment-346161985,https://api.github.com/repos/simonw/datasette/issues/90,346161985,MDEyOklzc3VlQ29tbWVudDM0NjE2MTk4NQ==,9599,simonw,2017-11-21T21:10:22Z,2017-11-21T21:10:22Z,OWNER,"Woohoo! I've found one tiny issue: right now, the following doesn't work: datasette publish heroku ../demo-databses/google-trends.db It results in this error in the Heroku logs: 2017-11-21T21:03:29.210511+00:00 app[web.1]: Usage: datasette serve [OPTIONS] [FILES]... 2017-11-21T21:03:29.210524+00:00 app[web.1]: 2017-11-21T21:03:29.210555+00:00 app[web.1]: Error: Invalid value for ""files"": Path ""../demo-databses/google-trends.db"" does not exist. The command works fine if you run it in the same directory as the database file you are publishing.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/90#issuecomment-346163513,https://api.github.com/repos/simonw/datasette/issues/90,346163513,MDEyOklzc3VlQ29tbWVudDM0NjE2MzUxMw==,9599,simonw,2017-11-21T21:16:16Z,2017-11-21T21:16:16Z,OWNER,"The reason relative paths work for `publish now` is that the `make_dockerfile()` function is called by passing the file names, not the full file paths: https://github.com/simonw/datasette/blob/e47117ce1d15f11246a3120aa49de70205713d05/datasette/utils.py#L166 Clearly the correct thing to do here is for us to refactor the shared code between heroku/package/now.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/142#issuecomment-346217739,https://api.github.com/repos/simonw/datasette/issues/142,346217739,MDEyOklzc3VlQ29tbWVudDM0NjIxNzczOQ==,9599,simonw,2017-11-22T01:45:30Z,2017-11-22T01:45:30Z,OWNER,Might be nice to have a --no-limits option that disables time and maximum row count limits.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275917760,Show extra instructions with the interrupted, https://github.com/simonw/datasette/issues/14#issuecomment-346244871,https://api.github.com/repos/simonw/datasette/issues/14,346244871,MDEyOklzc3VlQ29tbWVudDM0NjI0NDg3MQ==,21148,jacobian,2017-11-22T05:06:30Z,2017-11-22T05:06:30Z,CONTRIBUTOR,"I'd also suggest taking a look at [stevedore](https://docs.openstack.org/stevedore/latest/), which has a ton of tools for doing plugin stuff. 
I've had good luck with it in the past.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/144#issuecomment-346405660,https://api.github.com/repos/simonw/datasette/issues/144,346405660,MDEyOklzc3VlQ29tbWVudDM0NjQwNTY2MA==,9599,simonw,2017-11-22T16:38:05Z,2017-11-22T16:38:05Z,OWNER,"I have a solution for FTS already, but I'm interested in apsw as a mechanism for allowing custom virtual tables to be written in Python (pysqlite only lets you write custom functions) Not having PyPI support is pretty tough though. I'm planning a plugin/extension system which would be ideal for things like an optional apsw mode, but that's a lot harder if apsw isn't in PyPI.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276091279,apsw as alternative sqlite3 binding (for full text search), https://github.com/simonw/datasette/issues/14#issuecomment-346406009,https://api.github.com/repos/simonw/datasette/issues/14,346406009,MDEyOklzc3VlQ29tbWVudDM0NjQwNjAwOQ==,9599,simonw,2017-11-22T16:39:08Z,2017-11-22T16:39:08Z,OWNER,"Oh thanks, that definitely looks like an interesting option.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/144#issuecomment-346427794,https://api.github.com/repos/simonw/datasette/issues/144,346427794,MDEyOklzc3VlQ29tbWVudDM0NjQyNzc5NA==,649467,mhalle,2017-11-22T17:55:45Z,2017-11-22T17:55:45Z,NONE,"Thanks. There is a way to use pip to grab apsw, which also let's you configure it (flags to build extensions, use an internal sqlite, etc). Don't know how that works as a dependency for another package, though. On November 22, 2017 11:38:06 AM EST, Simon Willison wrote: >I have a solution for FTS already, but I'm interested in apsw as a >mechanism for allowing custom virtual tables to be written in Python >(pysqlite only lets you write custom functions) > >Not having PyPI support is pretty tough though. I'm planning a >plugin/extension system which would be ideal for things like an >optional apsw mode, but that's a lot harder if apsw isn't in PyPI. > >-- >You are receiving this because you authored the thread. >Reply to this email directly or view it on GitHub: >https://github.com/simonw/datasette/issues/144#issuecomment-346405660 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276091279,apsw as alternative sqlite3 binding (for full text search), https://github.com/simonw/datasette/issues/129#issuecomment-346463342,https://api.github.com/repos/simonw/datasette/issues/129,346463342,MDEyOklzc3VlQ29tbWVudDM0NjQ2MzM0Mg==,9599,simonw,2017-11-22T20:22:02Z,2017-11-22T20:22:02Z,OWNER,"On the index page: On the database index page: After clicking that link: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275164558,Hide FTS-created tables by default on the database index page, https://github.com/simonw/datasette/issues/86#issuecomment-346530498,https://api.github.com/repos/simonw/datasette/issues/86,346530498,MDEyOklzc3VlQ29tbWVudDM0NjUzMDQ5OA==,9599,simonw,2017-11-23T04:35:07Z,2017-11-23T04:35:07Z,OWNER,"Here's where I am now. 
Needs a bit of UI tidy up and it will be good to release: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/146#issuecomment-346682905,https://api.github.com/repos/simonw/datasette/issues/146,346682905,MDEyOklzc3VlQ29tbWVudDM0NjY4MjkwNQ==,9599,simonw,2017-11-23T18:55:08Z,2017-11-23T18:55:08Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276455748,datasette publish gcloud, https://github.com/simonw/datasette/issues/86#issuecomment-346691243,https://api.github.com/repos/simonw/datasette/issues/86,346691243,MDEyOklzc3VlQ29tbWVudDM0NjY5MTI0Mw==,9599,simonw,2017-11-23T20:07:15Z,2017-11-23T20:07:15Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-346694211,https://api.github.com/repos/simonw/datasette/issues/86,346694211,MDEyOklzc3VlQ29tbWVudDM0NjY5NDIxMQ==,9599,simonw,2017-11-23T20:34:32Z,2017-11-23T20:34:32Z,OWNER,And with ef3eacf622e69723d48ab1ad597645770a7361db I'm ready to call this one done.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/132#issuecomment-346701751,https://api.github.com/repos/simonw/datasette/issues/132,346701751,MDEyOklzc3VlQ29tbWVudDM0NjcwMTc1MQ==,9599,simonw,2017-11-23T21:51:51Z,2017-11-23T21:51:51Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275175929,Row view is not currently expanding foreign keys, https://github.com/simonw/datasette/issues/147#issuecomment-346900554,https://api.github.com/repos/simonw/datasette/issues/147,346900554,MDEyOklzc3VlQ29tbWVudDM0NjkwMDU1NA==,9599,simonw,2017-11-24T22:02:22Z,2017-11-24T22:02:22Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276476670,Tidy up design of the header of the table page, https://github.com/simonw/datasette/issues/133#issuecomment-346705879,https://api.github.com/repos/simonw/datasette/issues/133,346705879,MDEyOklzc3VlQ29tbWVudDM0NjcwNTg3OQ==,9599,simonw,2017-11-23T22:43:42Z,2017-11-24T22:07:46Z,OWNER,"Easiest way to do this will be to move it into the same `
` as the filters. Would be nice to detect `?_search=` and redirect to URL without the `_search` parameter, just for aesthetics.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275176006,"If view is filtered, search should apply within those filtered rows", https://github.com/simonw/datasette/issues/133#issuecomment-346902583,https://api.github.com/repos/simonw/datasette/issues/133,346902583,MDEyOklzc3VlQ29tbWVudDM0NjkwMjU4Mw==,9599,simonw,2017-11-24T22:30:32Z,2017-11-24T22:30:32Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275176006,"If view is filtered, search should apply within those filtered rows", https://github.com/simonw/datasette/issues/149#issuecomment-346903317,https://api.github.com/repos/simonw/datasette/issues/149,346903317,MDEyOklzc3VlQ29tbWVudDM0NjkwMzMxNw==,9599,simonw,2017-11-24T22:41:58Z,2017-11-24T22:41:58Z,OWNER,"Custom SQL results now look like this: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276704127,Update custom SQL results to match new table view header, https://github.com/simonw/datasette/issues/141#issuecomment-346974336,https://api.github.com/repos/simonw/datasette/issues/141,346974336,MDEyOklzc3VlQ29tbWVudDM0Njk3NDMzNg==,50138,janimo,2017-11-26T00:00:35Z,2017-11-26T00:00:35Z,NONE,FWIW I worked around this by setting TMPDIR to ~/tmp before running the command.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275814941,datasette publish can fail if /tmp is on a different device, https://github.com/simonw/datasette/issues/124#issuecomment-346987395,https://api.github.com/repos/simonw/datasette/issues/124,346987395,MDEyOklzc3VlQ29tbWVudDM0Njk4NzM5NQ==,50138,janimo,2017-11-26T06:24:08Z,2017-11-26T06:24:08Z,NONE,"Are there performance gains when using immutable as opposed to read-only? From what I see other processes can still modify the DB when immutable, but there are no change notifications.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125805,Option to open readonly but not immutable, https://github.com/simonw/datasette/issues/124#issuecomment-347049888,https://api.github.com/repos/simonw/datasette/issues/124,347049888,MDEyOklzc3VlQ29tbWVudDM0NzA0OTg4OA==,9599,simonw,2017-11-27T00:01:08Z,2017-11-27T00:01:08Z,OWNER,"https://sqlite.org/c3ref/open.html Is the only documentation I've been able to find of the immutable option: > **immutable**: The immutable parameter is a boolean query parameter that indicates that the database file is stored on read-only media. When immutable is set, SQLite assumes that the database file cannot be changed, even by a process with higher privilege, and so the database is opened read-only and all locking and change detection is disabled. Caution: Setting the immutable property on a database file that does in fact change can result in incorrect query results and/or SQLITE_CORRUPT errors. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125805,Option to open readonly but not immutable, https://github.com/simonw/datasette/issues/153#issuecomment-347050235,https://api.github.com/repos/simonw/datasette/issues/153,347050235,MDEyOklzc3VlQ29tbWVudDM0NzA1MDIzNQ==,9599,simonw,2017-11-27T00:06:24Z,2017-11-27T00:06:24Z,OWNER,"I've been thinking about 1. a bit - I actually think it would be fine to have a rule that says ""if the contents of the cell starts with `http://` or `https://` and doesn't contain any whitespace, turn that into a link"". If you need the non-linked version that will always be available in the JSON. For the other two... I think #12 may be the way to go here: if you can easily over-ride the `row.html` and `table.html` templates for specific databases you can easily set pre-formatted text or similar for certain values - maybe even with CSS that targets a specific table column.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-347051331,https://api.github.com/repos/simonw/datasette/issues/153,347051331,MDEyOklzc3VlQ29tbWVudDM0NzA1MTMzMQ==,9599,simonw,2017-11-27T00:23:40Z,2017-11-27T03:58:49Z,OWNER,"One quick fix could be to add a `extra_css_url` key to the `metadata.json` format (which currently hosts `title`, `license_url` etc) - if populated, we can inject a link to that stylesheet on every page. We could add a few classes in strategic places that include the database and table names to give people styling hooks. While we're at it, an `extra_js_url` key would let people go really nuts!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/124#issuecomment-347123991,https://api.github.com/repos/simonw/datasette/issues/124,347123991,MDEyOklzc3VlQ29tbWVudDM0NzEyMzk5MQ==,50138,janimo,2017-11-27T09:25:15Z,2017-11-27T09:25:15Z,NONE,"That's the only reference to immutable I saw as well, making me think that there may be no perceivable advantages over simply using mode=ro. 
Since the database is never or seldom updated the change notifications should not impact performance.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125805,Option to open readonly but not immutable, https://github.com/simonw/datasette/issues/124#issuecomment-347236102,https://api.github.com/repos/simonw/datasette/issues/124,347236102,MDEyOklzc3VlQ29tbWVudDM0NzIzNjEwMg==,9599,simonw,2017-11-27T16:24:15Z,2017-11-27T16:24:15Z,OWNER,I'd really like to get some benchmarks working so I can see the actual impact of this kind of thing.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125805,Option to open readonly but not immutable, https://github.com/simonw/datasette/issues/155#issuecomment-347713453,https://api.github.com/repos/simonw/datasette/issues/155,347713453,MDEyOklzc3VlQ29tbWVudDM0NzcxMzQ1Mw==,9599,simonw,2017-11-29T00:41:30Z,2017-11-29T00:41:30Z,OWNER,Could you provide the SQL to create a reproducible test case (both CREATE TABLE and INSERT statements)?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",277589569,A primary key column that has foreign key restriction associated won't rendering label column, https://github.com/simonw/datasette/issues/155#issuecomment-347714314,https://api.github.com/repos/simonw/datasette/issues/155,347714314,MDEyOklzc3VlQ29tbWVudDM0NzcxNDMxNA==,388154,wsxiaoys,2017-11-29T00:46:25Z,2017-11-29T00:46:25Z,NONE,"``` CREATE TABLE rhs ( id INTEGER PRIMARY KEY, name TEXT ); CREATE TABLE lhs ( symbol INTEGER PRIMARY KEY, FOREIGN KEY (symbol) REFERENCES rhs(id) ); INSERT INTO rhs VALUES (1, ""foo""); INSERT INTO rhs VALUES (2, ""bar""); INSERT INTO lhs VALUES (1); INSERT INTO lhs VALUES (2); ``` It's expected that in lhs's view, foo / bar should be displayed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",277589569,A primary key column that has foreign key restriction associated won't rendering label column, https://github.com/simonw/datasette/issues/155#issuecomment-347714471,https://api.github.com/repos/simonw/datasette/issues/155,347714471,MDEyOklzc3VlQ29tbWVudDM0NzcxNDQ3MQ==,9599,simonw,2017-11-29T00:47:21Z,2017-11-29T00:47:21Z,OWNER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",277589569,A primary key column that has foreign key restriction associated won't rendering label column, https://github.com/simonw/datasette/issues/155#issuecomment-347715452,https://api.github.com/repos/simonw/datasette/issues/155,347715452,MDEyOklzc3VlQ29tbWVudDM0NzcxNTQ1Mg==,9599,simonw,2017-11-29T00:52:30Z,2017-11-29T00:52:30Z,OWNER,"Interestingly, it almost does the right thing on the individual row page: https://bug-155-dkcqckhgki.now.sh/bug-155-9a7bb68/lhs/1 The symbol has been expanded, but there's a rogue '1' that shouldn't be there at all - I think that's bug #152 The table view itself is definitely doing the wrong thing: https://bug-155-dkcqckhgki.now.sh/bug-155-9a7bb68/lhs ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",277589569,A primary key column that has foreign key restriction associated won't rendering label column, 
https://github.com/simonw/datasette/issues/153#issuecomment-347735334,https://api.github.com/repos/simonw/datasette/issues/153,347735334,MDEyOklzc3VlQ29tbWVudDM0NzczNTMzNA==,9599,simonw,2017-11-29T02:45:03Z,2017-11-29T02:45:03Z,OWNER,"@ftrain OK I've shipped the first version of this. Here's the initial documentation: Create a `metadata.json` file that looks like this: { ""extra_css_urls"": [ ""https://simonwillison.net/static/css/all.bf8cd891642c.css"" ], ""extra_js_urls"": [ ""https://code.jquery.com/jquery-3.2.1.slim.min.js"" ] } Then start datasette like this: datasette mydb.db --metadata=metadata.json The CSS and JavaScript files will be linked in the `` of every page. You can also specify a SRI (subresource integrity hash) for these assets: { ""extra_css_urls"": [ { ""url"": ""https://simonwillison.net/static/css/all.bf8cd891642c.css"", ""sri"": ""sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI"" } ], ""extra_js_urls"": [ { ""url"": ""https://code.jquery.com/jquery-3.2.1.slim.min.js"", ""sri"": ""sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g="" } ] } Modern browsers will only execute the stylsheet or JavaScript if the SRI hash matches the content served. You can generate hashes using www.srihash.org This isn't shipped in a release yet, but you can still access these features in `datasette publish` like so: datasette publish now mydb.db --metadata=metadata.json --branch=master The `--branch=master` option will pull the latest master build of Datasette from GitHub.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-347735724,https://api.github.com/repos/simonw/datasette/issues/153,347735724,MDEyOklzc3VlQ29tbWVudDM0NzczNTcyNA==,9599,simonw,2017-11-29T02:47:14Z,2017-11-29T02:47:14Z,OWNER,(This only addresses point 2 in your issue description - points 1 and point 3 are still to come),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-347735598,https://api.github.com/repos/simonw/datasette/issues/153,347735598,MDEyOklzc3VlQ29tbWVudDM0NzczNTU5OA==,9599,simonw,2017-11-29T02:46:31Z,2017-11-29T02:47:27Z,OWNER,"To style individual columns you'll currently need to use the `nth-of-type` selector, e.g.: td:nth-of-type(5):before { white-space: pre }","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-347928926,https://api.github.com/repos/simonw/datasette/issues/153,347928926,MDEyOklzc3VlQ29tbWVudDM0NzkyODkyNg==,9599,simonw,2017-11-29T17:09:40Z,2017-11-29T17:09:40Z,OWNER,"OK, that's point 1 covered.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, 
https://github.com/simonw/datasette/issues/153#issuecomment-348103270,https://api.github.com/repos/simonw/datasette/issues/153,348103270,MDEyOklzc3VlQ29tbWVudDM0ODEwMzI3MA==,9599,simonw,2017-11-30T07:16:40Z,2017-11-30T07:16:40Z,OWNER,"Every template now gets CSS classes in the body designed to support custom styling. The index template (the top level page at /) gets this: The database template (/dbname/) gets this: The table template (/dbname/tablename) gets: The row template (/dbname/tablename/rowid) gets: The db-x and table-x classes use the database or table names themselves IF they are valid CSS identifiers. If they aren't, we strip any invalid characters out and append a 6 character md5 digest of the original name, in order to ensure that multiple tables which resolve to the same stripped character version still have different CSS classes. Some examples (extracted from the unit tests): ""simple"" => ""simple"" ""MixedCase"" => ""MixedCase"" ""-no-leading-hyphens"" => ""no-leading-hyphens-65bea6"" ""_no-leading-underscores"" => ""no-leading-underscores-b921bc"" ""no spaces"" => ""no-spaces-7088d7"" ""-"" => ""336d5e"" ""no $ characters"" => ""no--characters-59e024"" ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/12#issuecomment-348245757,https://api.github.com/repos/simonw/datasette/issues/12,348245757,MDEyOklzc3VlQ29tbWVudDM0ODI0NTc1Nw==,9599,simonw,2017-11-30T16:39:45Z,2017-11-30T16:39:45Z,OWNER,"It is now possible to over-ride templates on a per-database / per-row or per- table basis. When you access e.g. `/mydatabase/mytable` Datasette will look for the following: - table-mydatabase-mytable.html - table.html If you provided a `--template-dir` argument to datasette serve it will look in that directory first. The lookup rules are as follows: Index page (/): index.html Database page (/mydatabase): database-mydatabase.html database.html Table page (/mydatabase/mytable): table-mydatabase-mytable.html table.html Row page (/mydatabase/mytable/id): row-mydatabase-mytable.html row.html If a table name has spaces or other unexpected characters in it, the template filename will follow the same rules as our custom `` CSS classes introduced in 8ab3a16 - for example, a table called ""Food Trucks"" will attempt to load the following templates: table-mydatabase-Food-Trucks-399138.html table.html It is possible to extend the default templates using Jinja template inheritance. If you want to customize EVERY row template with some additional content you can do so by creating a `row.html` template like this: {% extends ""default:row.html"" %} {% block content %}

<h1>EXTRA HTML AT THE TOP OF THE CONTENT BLOCK</h1>

<p>This line renders the original block:</p>

{{ super() }} {% endblock %} ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267523511,Make it so you can override templates, https://github.com/simonw/datasette/issues/153#issuecomment-348245843,https://api.github.com/repos/simonw/datasette/issues/153,348245843,MDEyOklzc3VlQ29tbWVudDM0ODI0NTg0Mw==,9599,simonw,2017-11-30T16:40:02Z,2017-11-30T16:40:02Z,OWNER,"It is now possible to over-ride templates on a per-database / per-row or per- table basis. When you access e.g. `/mydatabase/mytable` Datasette will look for the following: - table-mydatabase-mytable.html - table.html If you provided a `--template-dir` argument to datasette serve it will look in that directory first. The lookup rules are as follows: Index page (/): index.html Database page (/mydatabase): database-mydatabase.html database.html Table page (/mydatabase/mytable): table-mydatabase-mytable.html table.html Row page (/mydatabase/mytable/id): row-mydatabase-mytable.html row.html If a table name has spaces or other unexpected characters in it, the template filename will follow the same rules as our custom `` CSS classes introduced in 8ab3a16 - for example, a table called ""Food Trucks"" will attempt to load the following templates: table-mydatabase-Food-Trucks-399138.html table.html It is possible to extend the default templates using Jinja template inheritance. If you want to customize EVERY row template with some additional content you can do so by creating a `row.html` template like this: {% extends ""default:row.html"" %} {% block content %}

<h1>EXTRA HTML AT THE TOP OF THE CONTENT BLOCK</h1>

<p>This line renders the original block:</p>

{{ super() }} {% endblock %} ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-348248406,https://api.github.com/repos/simonw/datasette/issues/153,348248406,MDEyOklzc3VlQ29tbWVudDM0ODI0ODQwNg==,9599,simonw,2017-11-30T16:47:45Z,2017-11-30T16:47:45Z,OWNER,Remaining work on this now lives in a milestone: https://github.com/simonw/datasette/milestone/6,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/126#issuecomment-348248957,https://api.github.com/repos/simonw/datasette/issues/126,348248957,MDEyOklzc3VlQ29tbWVudDM0ODI0ODk1Nw==,9599,simonw,2017-11-30T16:49:24Z,2017-11-30T16:49:24Z,OWNER,https://simonwillison.net/2017/Nov/25/new-in-datasette/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275135535,Blog entry announcing foreign key support, https://github.com/simonw/datasette/issues/153#issuecomment-348252037,https://api.github.com/repos/simonw/datasette/issues/153,348252037,MDEyOklzc3VlQ29tbWVudDM0ODI1MjAzNw==,20264,ftrain,2017-11-30T16:59:00Z,2017-11-30T16:59:00Z,NONE,"WOW! -- Paul Ford // (646) 369-7128 // @ftrain On Thu, Nov 30, 2017 at 11:47 AM, Simon Willison wrote: > Remaining work on this now lives in a milestone: > https://github.com/simonw/datasette/milestone/6 > > — > You are receiving this because you were mentioned. > Reply to this email directly, view it on GitHub > , > or mute the thread > > . > ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/156#issuecomment-348255782,https://api.github.com/repos/simonw/datasette/issues/156,348255782,MDEyOklzc3VlQ29tbWVudDM0ODI1NTc4Mg==,9599,simonw,2017-11-30T17:11:34Z,2017-11-30T17:11:34Z,OWNER,http://datasette.readthedocs.io/en/latest/custom_templates.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278189708,Document CSS hooks and custom templates, https://github.com/simonw/datasette/issues/153#issuecomment-348255925,https://api.github.com/repos/simonw/datasette/issues/153,348255925,MDEyOklzc3VlQ29tbWVudDM0ODI1NTkyNQ==,9599,simonw,2017-11-30T17:12:03Z,2017-11-30T17:12:03Z,OWNER,Documentation is now live for this: http://datasette.readthedocs.io/en/latest/custom_templates.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/160#issuecomment-348404864,https://api.github.com/repos/simonw/datasette/issues/160,348404864,MDEyOklzc3VlQ29tbWVudDM0ODQwNDg2NA==,9599,simonw,2017-12-01T05:26:57Z,2017-12-01T05:26:57Z,OWNER,"Question is... what should happen to the default static stuff? 
At the moment that's just https://fivethirtyeight.datasettes.com/-/static/app.css - though I want to improve that to include a content hash, see #154 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/154#issuecomment-348404988,https://api.github.com/repos/simonw/datasette/issues/154,348404988,MDEyOklzc3VlQ29tbWVudDM0ODQwNDk4OA==,9599,simonw,2017-12-01T05:27:40Z,2017-12-01T05:27:40Z,OWNER,If I do add additional static file bundling should that automatically get content hashes as well? #160 - problem with that is then I might have to parse the CSS files and rewrite their internal background-url references etc.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276873891,Datasette CSS should include content hash in the URL, https://github.com/simonw/datasette/issues/20#issuecomment-348420129,https://api.github.com/repos/simonw/datasette/issues/20,348420129,MDEyOklzc3VlQ29tbWVudDM0ODQyMDEyOQ==,9599,simonw,2017-12-01T07:16:25Z,2017-12-01T07:16:25Z,OWNER,"I've found some examples of canned queries I want to support that can't be represented as views, so I'm going to reopen this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/20#issuecomment-348420955,https://api.github.com/repos/simonw/datasette/issues/20,348420955,MDEyOklzc3VlQ29tbWVudDM0ODQyMDk1NQ==,9599,simonw,2017-12-01T07:21:08Z,2017-12-01T07:21:08Z,OWNER,"I'll use the existing metadata.json file: { ""databases"": { ""mydb"": { ""queries"": { ""custom_thingy"": {... The query definition can either be just a string of SQL, or it can be an object with a sql key and optional title and description keys. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/160#issuecomment-348719680,https://api.github.com/repos/simonw/datasette/issues/160,348719680,MDEyOklzc3VlQ29tbWVudDM0ODcxOTY4MA==,9599,simonw,2017-12-02T20:59:27Z,2017-12-02T20:59:27Z,OWNER,"This is about more than just CSS and JavaScript - there are plenty of reasons someone might want to bundle HTML as well, e.g. for building something like https://sf-tree-search.now.sh/ So, instead of thinking about this in terms of /static/, I'm going to think about this in terms of allowing people to mount one or more document roots (or docroots). datasette serve mydb.db -d my-doc-root/ This will cause the root of the server to show content from the `my-doc-root/` directory (assuming it has an index.html file in it). 
A more common option will be to mount specific folders to specific directories, like this: datasette serve mydb.db -d static:my-static/ Now any hits to `/static/foo.css` will serve content from `my-static/foo.css`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/160#issuecomment-348719752,https://api.github.com/repos/simonw/datasette/issues/160,348719752,MDEyOklzc3VlQ29tbWVudDM0ODcxOTc1Mg==,9599,simonw,2017-12-02T21:00:21Z,2017-12-02T21:00:21Z,OWNER,Not sure which I like better out of `-d/--docroot` or `-s/--static` or `-m/--mount` for this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/160#issuecomment-348719827,https://api.github.com/repos/simonw/datasette/issues/160,348719827,MDEyOklzc3VlQ29tbWVudDM0ODcxOTgyNw==,9599,simonw,2017-12-02T21:01:36Z,2017-12-02T21:01:36Z,OWNER,`-m` is already taken for `--metadata`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/160#issuecomment-348793054,https://api.github.com/repos/simonw/datasette/issues/160,348793054,MDEyOklzc3VlQ29tbWVudDM0ODc5MzA1NA==,9599,simonw,2017-12-03T16:35:22Z,2017-12-03T16:35:22Z,OWNER,"You can now tell Datasette to serve static files from a specific location at a specific mountpoint. For example: datasette serve mydb.db --static extra-css:/tmp/static/css Now if you visit this URL: http://localhost:8001/extra-css/blah.css The following file will be served: /tmp/static/css/blah.css ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/160#issuecomment-348793156,https://api.github.com/repos/simonw/datasette/issues/160,348793156,MDEyOklzc3VlQ29tbWVudDM0ODc5MzE1Ng==,9599,simonw,2017-12-03T16:35:53Z,2017-12-03T16:35:53Z,OWNER,Still TODO: teach `datasette publish` and friends about this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/161#issuecomment-348860191,https://api.github.com/repos/simonw/datasette/issues/161,348860191,MDEyOklzc3VlQ29tbWVudDM0ODg2MDE5MQ==,9599,simonw,2017-12-04T04:52:14Z,2017-12-04T04:52:14Z,OWNER,Seems like a reasonable thing for us to support.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278814220,Support WITH query , https://github.com/simonw/datasette/issues/20#issuecomment-348860623,https://api.github.com/repos/simonw/datasette/issues/20,348860623,MDEyOklzc3VlQ29tbWVudDM0ODg2MDYyMw==,9599,simonw,2017-12-04T04:56:21Z,2017-12-04T04:56:21Z,OWNER,"While I'm doing this, I could add per-database and per-table metadata too ala #68","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with 
support for defining canned queries, https://github.com/simonw/datasette/issues/20#issuecomment-349027974,https://api.github.com/repos/simonw/datasette/issues/20,349027974,MDEyOklzc3VlQ29tbWVudDM0OTAyNzk3NA==,9599,simonw,2017-12-04T17:01:19Z,2017-12-04T17:01:19Z,OWNER, This is also a good opportunity to re-factor out a separate query.html template - right now the database.html template is doing two jobs.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/135#issuecomment-349047335,https://api.github.com/repos/simonw/datasette/issues/135,349047335,MDEyOklzc3VlQ29tbWVudDM0OTA0NzMzNQ==,9599,simonw,2017-12-04T17:57:08Z,2017-12-04T17:57:08Z,OWNER,Turns out there's a bug in this: https://timezones-now-hrjgkinozh.now.sh/timezones-0d61a90/ElementaryGeometries should not be showing the search box.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275179724,?_search=x should work if used directly against a FTS virtual table, https://github.com/simonw/datasette/issues/20#issuecomment-349359498,https://api.github.com/repos/simonw/datasette/issues/20,349359498,MDEyOklzc3VlQ29tbWVudDM0OTM1OTQ5OA==,9599,simonw,2017-12-05T16:30:06Z,2017-12-05T16:30:06Z,OWNER,"Named canned queries can now be defined in metadata.json like this: { ""databases"": { ""timezones"": { ""queries"": { ""timezone_for_point"": ""select tzid from timezones ..."" } } } } These will be shown in a new ""Queries"" section beneath ""Views"" on the database page. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/20#issuecomment-349383276,https://api.github.com/repos/simonw/datasette/issues/20,349383276,MDEyOklzc3VlQ29tbWVudDM0OTM4MzI3Ng==,9599,simonw,2017-12-05T17:45:20Z,2017-12-05T17:45:20Z,OWNER,http://datasette.readthedocs.io/en/latest/sql_queries.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/20#issuecomment-349406761,https://api.github.com/repos/simonw/datasette/issues/20,349406761,MDEyOklzc3VlQ29tbWVudDM0OTQwNjc2MQ==,9599,simonw,2017-12-05T19:03:06Z,2017-12-05T19:03:06Z,OWNER,Demo: https://timezones-api.now.sh/timezones-3cb9f64/by_point,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/122#issuecomment-349408214,https://api.github.com/repos/simonw/datasette/issues/122,349408214,MDEyOklzc3VlQ29tbWVudDM0OTQwODIxNA==,9599,simonw,2017-12-05T19:08:04Z,2017-12-05T19:08:04Z,OWNER,I think `.json` should continue to return rows as list-of-lists - it's a nice default because it produces a smaller overall JSON file. 
Encouraging people to specify an alternative shape to get the current `.jsono` format feels appropriate.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275092453,"Redesign JSON output, ditch jsono, offer variants controlled by parameter instead", https://github.com/simonw/datasette/issues/122#issuecomment-345552358,https://api.github.com/repos/simonw/datasette/issues/122,345552358,MDEyOklzc3VlQ29tbWVudDM0NTU1MjM1OA==,9599,simonw,2017-11-19T21:45:38Z,2017-12-05T19:09:52Z,OWNER,"For the overall shape of the rows: `?_shape=lists` (default), `?_shape=objects`, `?_shape=object` (primary key as object keys) For getting back extra keys: `?_extras=schema,query,timing` For expanding columns: `?_expand_all=1` Or `?_expand=qSpecies&_expand=qCaretaker` The template view will only be allowed to work with data it can request using extra options. That leaves one sighted nasty edge-case: the default view will expand all columns, but the `.json` view of it won't? I think that's OK. The default view won't include the extras used by the template to render the page either.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275092453,"Redesign JSON output, ditch jsono, offer variants controlled by parameter instead", https://github.com/simonw/datasette/issues/135#issuecomment-349860851,https://api.github.com/repos/simonw/datasette/issues/135,349860851,MDEyOklzc3VlQ29tbWVudDM0OTg2MDg1MQ==,9599,simonw,2017-12-07T04:37:59Z,2017-12-07T04:37:59Z,OWNER,"I'm testing this like so: datasette ~/Dropbox/Development/timezones-api/timezones.db --reload --load-extension /usr/local/lib/mod_spatialite.dylib ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275179724,?_search=x should work if used directly against a FTS virtual table, https://github.com/simonw/datasette/issues/135#issuecomment-349861461,https://api.github.com/repos/simonw/datasette/issues/135,349861461,MDEyOklzc3VlQ29tbWVudDM0OTg2MTQ2MQ==,9599,simonw,2017-12-07T04:43:12Z,2017-12-07T04:43:12Z,OWNER,"This query looks like it does the right thing: select * from sqlite_master where rootpage = 0 and ( sql like '%VIRTUAL TABLE%USING FTS%content=""ElementaryGeometries""%' or ( tbl_name = ""ElementaryGeometries"" and sql like '%VIRTUAL TABLE%USING FTS%' ) ) Against a table that should not be shown as FTS: https://timezones-now-hrjgkinozh.now.sh/timezones-0d61a90?sql=++++++++select+*+from+sqlite_master%0D%0A++++++++++++where+rootpage+%3D+0%0D%0A++++++++++++and+%28%0D%0A++++++++++++++++sql+like+%27%25VIRTUAL+TABLE%25USING+FTS%25content%3D%22ElementaryGeometries%22%25%27%0D%0A++++++++++++++++or+%28%0D%0A++++++++++++++++++tbl_name+%3D+%22ElementaryGeometries%22%0D%0A++++++++++++++++++and+sql+like+%27%25VIRTUAL+TABLE%25USING+FTS%25%27%0D%0A++++++++++++++++%29%0D%0A++++++++++++%29+ Against a table that SHOULD match: https://sf-trees.now.sh/sf-trees-ebc2ad9?sql=++++++++select+*+from+sqlite_master%0D%0A++++++++++++where+rootpage+%3D+0%0D%0A++++++++++++and+%28%0D%0A++++++++++++++++sql+like+%27%25VIRTUAL+TABLE%25USING+FTS%25content%3D%22Street_Tree_List_fts%22%25%27%0D%0A++++++++++++++++or+%28%0D%0A++++++++++++++++++tbl_name+%3D+%22Street_Tree_List_fts%22%0D%0A++++++++++++++++++and+sql+like+%27%25VIRTUAL+TABLE%25USING+FTS%25%27%0D%0A++++++++++++++++%29%0D%0A++++++++++++%29+","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 
0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275179724,?_search=x should work if used directly against a FTS virtual table, https://github.com/simonw/datasette/issues/158#issuecomment-349868849,https://api.github.com/repos/simonw/datasette/issues/158,349868849,MDEyOklzc3VlQ29tbWVudDM0OTg2ODg0OQ==,9599,simonw,2017-12-07T05:41:08Z,2017-12-07T05:41:08Z,OWNER,"I'm happy with this - we have extra_head, content, body_class and title blocks which should provide enough hooks for most reasonable customizations.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278190981,Ensure default templates are designed to be extended, https://github.com/simonw/datasette/issues/153#issuecomment-349874052,https://api.github.com/repos/simonw/datasette/issues/153,349874052,MDEyOklzc3VlQ29tbWVudDM0OTg3NDA1Mg==,9599,simonw,2017-12-07T06:17:33Z,2017-12-07T06:17:33Z,OWNER,"In #159 I added a mechanism for easily customizing per-column displays, and I've added documentation showing an example of using this mechanism to set certain columns to display as unescaped HTML: http://datasette.readthedocs.io/en/latest/custom_templates.html#custom-templates This fixes item 3, so I'm closing this ticket!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/164#issuecomment-349874709,https://api.github.com/repos/simonw/datasette/issues/164,349874709,MDEyOklzc3VlQ29tbWVudDM0OTg3NDcwOQ==,9599,simonw,2017-12-07T06:22:10Z,2017-12-07T06:22:10Z,OWNER,"Example usage: datasette skeleton parlgov.db -m parlgov.json Generates a `parlgov.json` file containing this: { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null, ""databases"": { ""parlgov"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null, ""queries"": {}, ""tables"": { ""info_data_source"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_castles_mair"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_chess"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_huber_inglehart"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""info_table"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_euprofiler"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""party_family"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""info_id"": { ""title"": null, ""description"": null, 
""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""sqlite_stat1"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_benoit_laver"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_country_iso"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""viewcalc_party_position"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""viewcalc_election_parameter"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""viewcalc_parliament_composition"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""viewcalc_country_year_share"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""election"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""politician_president"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""party_name_change"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_commissioner_doering"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_ray"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""party_change"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""cabinet_party"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_ees"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""party"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_cmp"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""country"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""cabinet"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""info_variable"": { ""title"": null, ""description"": null, 
""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""election_result"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null } } } } } ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280013907,datasette skeleton command for kick-starting database and table metadata, https://github.com/simonw/datasette/issues/164#issuecomment-349874844,https://api.github.com/repos/simonw/datasette/issues/164,349874844,MDEyOklzc3VlQ29tbWVudDM0OTg3NDg0NA==,9599,simonw,2017-12-07T06:22:58Z,2017-12-07T06:22:58Z,OWNER,This metadata doesn't yet do anything - need to implement #165,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280013907,datasette skeleton command for kick-starting database and table metadata, https://github.com/simonw/datasette/issues/165#issuecomment-350026183,https://api.github.com/repos/simonw/datasette/issues/165,350026183,MDEyOklzc3VlQ29tbWVudDM1MDAyNjE4Mw==,9599,simonw,2017-12-07T16:47:46Z,2017-12-07T16:47:46Z,OWNER,"Here's an example metadata.json file illustrating custom per-database and per- table metadata: { ""title"": ""Overall datasette title"", ""description_html"": ""This is a description with HTML."", ""databases"": { ""db1"": { ""title"": ""First database"", ""description"": ""This is a string description & has no HTML"", ""license_url"": ""http://example.com/"", ""license"": ""The example license"", ""queries"": { ""canned_query"": ""select * from table1 limit 3;"" }, ""tables"": { ""table1"": { ""title"": ""Custom title for table1"", ""description"": ""Tables can have descriptions too"", ""source"": ""This has a custom source"", ""source_url"": ""http://example.com/"" } } } } }","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280014287,metadata.json support for per-database and per-table information, https://github.com/simonw/datasette/issues/165#issuecomment-350026452,https://api.github.com/repos/simonw/datasette/issues/165,350026452,MDEyOklzc3VlQ29tbWVudDM1MDAyNjQ1Mg==,9599,simonw,2017-12-07T16:48:34Z,2017-12-07T16:48:34Z,OWNER,"Needs documentation, see #166 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280014287,metadata.json support for per-database and per-table information, https://github.com/simonw/datasette/issues/166#issuecomment-350035741,https://api.github.com/repos/simonw/datasette/issues/166,350035741,MDEyOklzc3VlQ29tbWVudDM1MDAzNTc0MQ==,9599,simonw,2017-12-07T17:20:35Z,2017-12-07T17:20:35Z,OWNER,"http://datasette.readthedocs.io/en/latest/metadata.html ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280023225,Documentation for metadata.json and datasette skeleton, https://github.com/simonw/datasette/issues/161#issuecomment-350108113,https://api.github.com/repos/simonw/datasette/issues/161,350108113,MDEyOklzc3VlQ29tbWVudDM1MDEwODExMw==,388154,wsxiaoys,2017-12-07T22:02:24Z,2017-12-07T22:02:24Z,NONE,"It's not throwing the validation error anymore, but i still cannot run following with query: ``` WITH RECURSIVE cnt(x) AS (SELECT 1 UNION ALL SELECT x+1 FROM cnt LIMIT 10) SELECT x FROM cnt; 
``` I got `near ""WITH"": syntax error`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278814220,Support WITH query , https://github.com/simonw/datasette/issues/167#issuecomment-350125953,https://api.github.com/repos/simonw/datasette/issues/167,350125953,MDEyOklzc3VlQ29tbWVudDM1MDEyNTk1Mw==,9599,simonw,2017-12-07T23:25:28Z,2017-12-07T23:25:28Z,OWNER,"My column/row HTML display logic has got way too convoluted. This is a sign I need to add proper unit tests for it and clean it up. The complexity comes from: * Displaying a rowid for tables that do not have a primary key * Showing an additional Link column for rows with a primary key * Not displaying that Link column on the individual row pages * Trying to get foreign keys working correctly in all cases, e.g. #152 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, https://github.com/simonw/datasette/issues/161#issuecomment-350158037,https://api.github.com/repos/simonw/datasette/issues/161,350158037,MDEyOklzc3VlQ29tbWVudDM1MDE1ODAzNw==,9599,simonw,2017-12-08T02:52:34Z,2017-12-08T02:52:34Z,OWNER,That might mean your version of SQLite doesn't support that syntax. Unfortunately the version bundled with Python is a bit old - the one built by the Dockerfile in this repo should handle it though.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278814220,Support WITH query , https://github.com/simonw/datasette/issues/161#issuecomment-350182904,https://api.github.com/repos/simonw/datasette/issues/161,350182904,MDEyOklzc3VlQ29tbWVudDM1MDE4MjkwNA==,388154,wsxiaoys,2017-12-08T06:18:12Z,2017-12-08T06:18:12Z,NONE,"You're right..got this resolved after upgrading the sqlite version. Thanks you!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278814220,Support WITH query , https://github.com/simonw/datasette/issues/141#issuecomment-350292364,https://api.github.com/repos/simonw/datasette/issues/141,350292364,MDEyOklzc3VlQ29tbWVudDM1MDI5MjM2NA==,9599,simonw,2017-12-08T15:33:18Z,2017-12-08T15:33:18Z,OWNER,"I can emulate this on OS X using a disk image (Disk Utility -> File -> New Image -> Blank Image...) - once mounted, I get the following: >>> os.link('/tmp/hello', '/Volumes/Untitled/hello') Traceback (most recent call last): File """", line 1, in OSError: [Errno 18] Cross-device link: '/tmp/hello' -> '/Volumes/Untitled/hello' I can simulate that in a mock like this: >>> from unittest.mock import patch >>> @patch('os.link') ... def test_link(mock_link): ... mock_link.side_effect = OSError ... mock_link() ... 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275814941,datasette publish can fail if /tmp is on a different device, https://github.com/simonw/datasette/issues/141#issuecomment-350301248,https://api.github.com/repos/simonw/datasette/issues/141,350301248,MDEyOklzc3VlQ29tbWVudDM1MDMwMTI0OA==,9599,simonw,2017-12-08T16:07:04Z,2017-12-08T16:07:04Z,OWNER,"This fix should work, please have a go with latest master and let me know if you run into any problems.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275814941,datasette publish can fail if /tmp is on a different device, https://github.com/simonw/datasette/issues/154#issuecomment-350302417,https://api.github.com/repos/simonw/datasette/issues/154,350302417,MDEyOklzc3VlQ29tbWVudDM1MDMwMjQxNw==,9599,simonw,2017-12-08T16:11:24Z,2017-12-08T16:11:24Z,OWNER,I think I'll do this as a custom Jinja template filter. That way template authors can re-use it for their own static files if they want.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276873891,Datasette CSS should include content hash in the URL, https://github.com/simonw/datasette/issues/154#issuecomment-350323722,https://api.github.com/repos/simonw/datasette/issues/154,350323722,MDEyOklzc3VlQ29tbWVudDM1MDMyMzcyMg==,9599,simonw,2017-12-08T17:35:25Z,2017-12-08T17:35:25Z,OWNER,If I do this as a querystring parameter I won't need to worry about URL routing.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276873891,Datasette CSS should include content hash in the URL, https://github.com/simonw/datasette/pull/168#issuecomment-350413422,https://api.github.com/repos/simonw/datasette/issues/168,350413422,MDEyOklzc3VlQ29tbWVudDM1MDQxMzQyMg==,9599,simonw,2017-12-09T01:33:40Z,2017-12-09T01:33:40Z,OWNER,https://github.com/channelcat/sanic/releases/tag/0.7.0,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280662866,Upgrade to Sanic 0.7.0, https://github.com/simonw/datasette/issues/167#issuecomment-350421661,https://api.github.com/repos/simonw/datasette/issues/167,350421661,MDEyOklzc3VlQ29tbWVudDM1MDQyMTY2MQ==,9599,simonw,2017-12-09T03:52:46Z,2017-12-09T03:52:46Z,OWNER,"Input: results from the database, foreign key definitions, primary key definitions, type of page Output: display_columns and display_rows","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, https://github.com/simonw/datasette/issues/167#issuecomment-350424595,https://api.github.com/repos/simonw/datasette/issues/167,350424595,MDEyOklzc3VlQ29tbWVudDM1MDQyNDU5NQ==,9599,simonw,2017-12-09T05:08:27Z,2017-12-09T05:08:27Z,OWNER,Perhaps the row.html and table.html templates should be passed the same data but should themselves decide if they will display the Link column ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, 
https://github.com/simonw/datasette/issues/160#issuecomment-350496258,https://api.github.com/repos/simonw/datasette/issues/160,350496258,MDEyOklzc3VlQ29tbWVudDM1MDQ5NjI1OA==,9599,simonw,2017-12-09T18:29:28Z,2017-12-09T18:29:28Z,OWNER,"Example usage: datasette package --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --tag sf-trees --branch master This creates a local Docker image that includes copies of the templates/, extra-css/ and extra-js/ directories. You can then run it like this: docker run -p 8001:8001 sf-trees For publishing to Zeit now: datasette publish now --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --name sf-trees --branch master Example: https://sf-trees-wbihszoazc.now.sh/sf-trees-02c8ef1/Street_Tree_List For publishing to Heroku: datasette publish heroku --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --branch master ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/157#issuecomment-350496277,https://api.github.com/repos/simonw/datasette/issues/157,350496277,MDEyOklzc3VlQ29tbWVudDM1MDQ5NjI3Nw==,9599,simonw,2017-12-09T18:29:41Z,2017-12-09T18:29:41Z,OWNER,"Example usage: datasette package --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --tag sf-trees --branch master This creates a local Docker image that includes copies of the templates/, extra-css/ and extra-js/ directories. You can then run it like this: docker run -p 8001:8001 sf-trees For publishing to Zeit now: datasette publish now --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --name sf-trees --branch master Example: https://sf-trees-wbihszoazc.now.sh/sf-trees-02c8ef1/Street_Tree_List For publishing to Heroku: datasette publish heroku --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --branch master ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278190321,"Teach ""datasette publish"" about custom template directories", https://github.com/simonw/datasette/issues/170#issuecomment-350506593,https://api.github.com/repos/simonw/datasette/issues/170,350506593,MDEyOklzc3VlQ29tbWVudDM1MDUwNjU5Mw==,9599,simonw,2017-12-09T21:25:50Z,2017-12-09T21:25:50Z,OWNER,Turns out this is already supported: https://github.com/simonw/datasette/blob/6bdfcf60760c27e29ff34692d06e62b36aeecc56/datasette/app.py#L307,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280745470,Custom template for named canned query, https://github.com/simonw/datasette/issues/170#issuecomment-350506751,https://api.github.com/repos/simonw/datasette/issues/170,350506751,MDEyOklzc3VlQ29tbWVudDM1MDUwNjc1MQ==,9599,simonw,2017-12-09T21:28:32Z,2017-12-09T21:28:32Z,OWNER,"My mistake, that's using the database name - there isn't a way of customizing for a specific named query yet.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280745470,Custom template for named canned query, 
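For reference, the `--static css:extra-css/` arguments used above are of the form mountpoint:directory. A hypothetical parser, shown only to illustrate the convention (the function name and error text are assumptions, not Datasette internals):

```python
# Illustrative only: split a --static value such as "css:extra-css/" into the
# URL prefix it is served under and the local directory it maps to.
def parse_static_mount(value):
    mountpoint, sep, directory = value.partition(":")
    if not sep or not mountpoint or not directory:
        raise ValueError("--static must be of the form mountpoint:directory")
    return mountpoint, directory

print(parse_static_mount("css:extra-css/"))  # ('css', 'extra-css/')
```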
https://github.com/simonw/datasette/issues/170#issuecomment-350507155,https://api.github.com/repos/simonw/datasette/issues/170,350507155,MDEyOklzc3VlQ29tbWVudDM1MDUwNzE1NQ==,9599,simonw,2017-12-09T21:35:30Z,2017-12-09T21:35:30Z,OWNER," Canned query page (/mydatabase/canned-query): query-mydatabase-canned-query.html query-mydatabase.html query.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280745470,Custom template for named canned query, https://github.com/simonw/datasette/issues/171#issuecomment-350508049,https://api.github.com/repos/simonw/datasette/issues/171,350508049,MDEyOklzc3VlQ29tbWVudDM1MDUwODA0OQ==,9599,simonw,2017-12-09T21:50:50Z,2017-12-09T21:50:50Z,OWNER,"Quoting the new documentation: You can find out which templates were considered for a specific page by viewing source on that page and looking for an HTML comment at the bottom. The comment will look something like this: This example is from the canned query page for a query called ""tz"" in the database called ""mydb"". The asterisk shows which template was selected - so in this case, Datasette found a template file called `query-mydb-tz.html` and used that - but if that template had not been found, it would have tried for `query-mydb.html` or the default `query.html`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280745746,HTML comments specifying custom templates for page, https://github.com/simonw/datasette/issues/167#issuecomment-350515616,https://api.github.com/repos/simonw/datasette/issues/167,350515616,MDEyOklzc3VlQ29tbWVudDM1MDUxNTYxNg==,9599,simonw,2017-12-10T00:21:58Z,2017-12-10T00:21:58Z,OWNER,This function signature is pretty gross: https://github.com/simonw/datasette/blob/7a7e4b2ed8c76c6d002a9d707dbc840f6a2abf7f/datasette/app.py#L418,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, https://github.com/simonw/datasette/issues/167#issuecomment-350515985,https://api.github.com/repos/simonw/datasette/issues/167,350515985,MDEyOklzc3VlQ29tbWVudDM1MDUxNTk4NQ==,9599,simonw,2017-12-10T00:28:39Z,2017-12-10T00:28:39Z,OWNER,"A better alternative: ```async def display_columns_and_rows(self, database, table, rows, link_column=False):```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, https://github.com/simonw/datasette/issues/167#issuecomment-350516782,https://api.github.com/repos/simonw/datasette/issues/167,350516782,MDEyOklzc3VlQ29tbWVudDM1MDUxNjc4Mg==,9599,simonw,2017-12-10T00:48:54Z,2017-12-10T00:48:54Z,OWNER,I can simplify this all by dropping the nicety where if a table is using a rowid the Link column is titled rowid instead.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, https://github.com/simonw/datasette/issues/169#issuecomment-350519711,https://api.github.com/repos/simonw/datasette/issues/169,350519711,MDEyOklzc3VlQ29tbWVudDM1MDUxOTcxMQ==,9599,simonw,2017-12-10T02:04:56Z,2017-12-10T02:04:56Z,OWNER,Done! 
https://github.com/simonw/datasette/releases/tag/0.14,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280744309,Release v0.14 with templates and static files features, https://github.com/simonw/datasette/issues/153#issuecomment-350519736,https://api.github.com/repos/simonw/datasette/issues/153,350519736,MDEyOklzc3VlQ29tbWVudDM1MDUxOTczNg==,9599,simonw,2017-12-10T02:06:01Z,2017-12-10T02:06:01Z,OWNER,@ftrain Datasette 0.14 is now released with all of the above: https://github.com/simonw/datasette/releases/tag/0.14,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-350519821,https://api.github.com/repos/simonw/datasette/issues/153,350519821,MDEyOklzc3VlQ29tbWVudDM1MDUxOTgyMQ==,9599,simonw,2017-12-10T02:08:45Z,2017-12-10T02:08:45Z,OWNER,"Also worth mentioning: as of #160 and #157 the `datasette publish now`, `datasette publish heroku` and `datasette package` commands all know how to bundle up any `--static` or `--template-dir` content and include it in the Docker image / Heroku/Now deployment that gets generated.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/42#issuecomment-350521619,https://api.github.com/repos/simonw/datasette/issues/42,350521619,MDEyOklzc3VlQ29tbWVudDM1MDUyMTYxOQ==,9599,simonw,2017-12-10T03:02:14Z,2017-12-10T03:02:14Z,OWNER,I think the `datasette skeleton` command from #164 makes this obsolete.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268591332,Homepage UI for editing metadata file, https://github.com/simonw/datasette/issues/52#issuecomment-350521635,https://api.github.com/repos/simonw/datasette/issues/52,350521635,MDEyOklzc3VlQ29tbWVudDM1MDUyMTYzNQ==,9599,simonw,2017-12-10T03:02:56Z,2017-12-10T03:02:56Z,OWNER,I don't think this is necessary.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273026602,Solution for temporarily uploading DB so it can be built by docker, https://github.com/simonw/datasette/issues/90#issuecomment-350521711,https://api.github.com/repos/simonw/datasette/issues/90,350521711,MDEyOklzc3VlQ29tbWVudDM1MDUyMTcxMQ==,9599,simonw,2017-12-10T03:05:48Z,2017-12-10T03:05:48Z,OWNER,I fixed that last issue in c195ee4d46f2577b1943836a8270d84c8341d138,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/90#issuecomment-350521736,https://api.github.com/repos/simonw/datasette/issues/90,350521736,MDEyOklzc3VlQ29tbWVudDM1MDUyMTczNg==,9599,simonw,2017-12-10T03:06:34Z,2017-12-10T03:06:34Z,OWNER,Heroku is now in the README as of 6bdfcf60760c27e29ff34692d06e62b36aeecc56,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, 
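The canned query template lookup quoted earlier (query-mydb-tz.html, then query-mydb.html, then query.html) is a most-specific-first candidate list. A small sketch, assuming database and query names that need no escaping; Datasette's real lookup also applies the CSS-class style name mangling described above:

```python
# Sketch of the most-specific-first template candidates for a canned query
# page at /mydb/tz. Assumes names that need no escaping.
def query_template_candidates(database, query_name):
    return [
        f"query-{database}-{query_name}.html",
        f"query-{database}.html",
        "query.html",
    ]

print(query_template_candidates("mydb", "tz"))
# ['query-mydb-tz.html', 'query-mydb.html', 'query.html']
```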
https://github.com/simonw/datasette/issues/91#issuecomment-350521780,https://api.github.com/repos/simonw/datasette/issues/91,350521780,MDEyOklzc3VlQ29tbWVudDM1MDUyMTc4MA==,9599,simonw,2017-12-10T03:07:53Z,2017-12-10T03:07:53Z,OWNER,Won't fix - I think the custom templates and static stuff in https://github.com/simonw/datasette/releases/tag/0.14 renders this obsolete.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273878873,"Option to serve databases from a different prefix, serve regular content elsewhere", https://github.com/simonw/datasette/issues/138#issuecomment-350521806,https://api.github.com/repos/simonw/datasette/issues/138,350521806,MDEyOklzc3VlQ29tbWVudDM1MDUyMTgwNg==,9599,simonw,2017-12-10T03:08:26Z,2017-12-10T03:08:36Z,OWNER,Implemented this in 80bf3afa43e3cb396c7a7c9b168eedbc6fe0fa15 and #165. Didn't use data package though.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275476839,"Per-database and per-table metadata, probably using data-package", https://github.com/simonw/datasette/issues/123#issuecomment-350521853,https://api.github.com/repos/simonw/datasette/issues/123,350521853,MDEyOklzc3VlQ29tbWVudDM1MDUyMTg1Mw==,9599,simonw,2017-12-10T03:09:53Z,2017-12-10T03:09:53Z,OWNER,I'm going to keep this separate in csvs-to-sqlite.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats, https://github.com/simonw/datasette/issues/121#issuecomment-350527283,https://api.github.com/repos/simonw/datasette/issues/121,350527283,MDEyOklzc3VlQ29tbWVudDM1MDUyNzI4Mw==,9599,simonw,2017-12-10T06:00:47Z,2017-12-10T06:00:47Z,OWNER,This is also really interesting when combined with the spatialite AsGeoJSON function: http://www.gaia-gis.it/gaia-sins/spatialite-sql-4.2.0.html#p3misc,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275089535,?_json=foo&_json=bar query string argument , https://github.com/simonw/datasette/issues/175#issuecomment-353424169,https://api.github.com/repos/simonw/datasette/issues/175,353424169,MDEyOklzc3VlQ29tbWVudDM1MzQyNDE2OQ==,9599,simonw,2017-12-21T18:33:55Z,2017-12-21T18:33:55Z,OWNER,Done - thanks for curating these: https://github.com/topics/automatic-api,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",282971961,"Add project topic ""automatic-api""", https://github.com/simonw/datasette/issues/120#issuecomment-355487646,https://api.github.com/repos/simonw/datasette/issues/120,355487646,MDEyOklzc3VlQ29tbWVudDM1NTQ4NzY0Ng==,723567,nickdirienzo,2018-01-05T07:10:12Z,2018-01-05T07:10:12Z,NONE,"Ah, glad I found this issue. I have private data that I'd like to share to a few different people. Personally, a shared username and password would be sufficient for me, more-or-less Basic Auth. Do you have more complex requirements in mind? I'm not sure if ""plugin"" means ""build a plugin"" or ""find a plugin"" or something else entirely. FWIW, I stumbled upon [sanic-auth](https://github.com/pyx/sanic-auth) which looks like a new project to bring some interfaces around auth to sanic, similar to Flask. Alternatively, it shouldn't be too bad to add in Basic Auth. 
If we went down that route, that would probably be best built as a separate package for sanic that `datasette` brings in. What are your thoughts around this?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275087397,Plugin that adds an authentication layer of some sort, https://github.com/simonw/datasette/issues/176#issuecomment-356115657,https://api.github.com/repos/simonw/datasette/issues/176,356115657,MDEyOklzc3VlQ29tbWVudDM1NjExNTY1Nw==,4313116,wulfmann,2018-01-08T22:22:32Z,2018-01-08T22:22:32Z,NONE,"This project probably would not be the place for that. This is a layer for sqllite specifically. It solves a similar problem as graphql, so adding that here wouldn't make sense. Here's an example i found from google that uses micro to run a graphql microservice. you'd just then need to connect your db. https://github.com/timneutkens/micro-graphql","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/176#issuecomment-356161672,https://api.github.com/repos/simonw/datasette/issues/176,356161672,MDEyOklzc3VlQ29tbWVudDM1NjE2MTY3Mg==,173848,yozlet,2018-01-09T02:35:35Z,2018-01-09T02:35:35Z,NONE,"@wulfmann I think I disagree, except I'm not entirely sure what you mean by that first paragraph. The JSON API that Datasette currently exposes is quite different to GraphQL. Furthermore, there's no ""just"" about connecting micro-graphql to a DB; at least, no more ""just"" than adding any other API. You still need to configure the schema, which is exactly the kind of thing that Datasette does for JSON API. This is why I think that GraphQL's a good fit here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/176#issuecomment-356175667,https://api.github.com/repos/simonw/datasette/issues/176,356175667,MDEyOklzc3VlQ29tbWVudDM1NjE3NTY2Nw==,4313116,wulfmann,2018-01-09T04:19:03Z,2018-01-09T04:19:03Z,NONE,"@yozlet Yes I think that I was confused when I posted my original comment. I see your main point now and am in agreement. ","{""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 2, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/pull/178#issuecomment-357542404,https://api.github.com/repos/simonw/datasette/issues/178,357542404,MDEyOklzc3VlQ29tbWVudDM1NzU0MjQwNA==,9599,simonw,2018-01-14T21:06:07Z,2018-01-14T21:06:07Z,OWNER,"Thanks for catching this, merged!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",287240246,"If metadata exists, add it to heroku launch command", https://github.com/simonw/datasette/issues/176#issuecomment-359697938,https://api.github.com/repos/simonw/datasette/issues/176,359697938,MDEyOklzc3VlQ29tbWVudDM1OTY5NzkzOA==,7193,gijs,2018-01-23T07:17:56Z,2018-01-23T07:17:56Z,NONE,👍 I'd like this too! 
,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/179#issuecomment-360535979,https://api.github.com/repos/simonw/datasette/issues/179,360535979,MDEyOklzc3VlQ29tbWVudDM2MDUzNTk3OQ==,82988,psychemedia,2018-01-25T17:18:24Z,2018-01-25T17:18:24Z,CONTRIBUTOR,"To summarise that thread: - expose full `metadata.json` object to the index page template, eg to allow tables to be referred to by name; - ability to import multiple `metadata.json` files, eg to allow metadata files created for a specific SQLite db to be reused in a datasette referring to several database files; It could also be useful to allow users to import a python file containing custom functions that can that be loaded into scope and made available to custom templates. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",288438570,More metadata options for template authors , https://github.com/simonw/datasette/issues/176#issuecomment-368625350,https://api.github.com/repos/simonw/datasette/issues/176,368625350,MDEyOklzc3VlQ29tbWVudDM2ODYyNTM1MA==,7431774,wuhland,2018-02-26T19:44:11Z,2018-02-26T19:44:11Z,NONE,great idea!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/185#issuecomment-370273359,https://api.github.com/repos/simonw/datasette/issues/185,370273359,MDEyOklzc3VlQ29tbWVudDM3MDI3MzM1OQ==,9599,simonw,2018-03-04T23:10:56Z,2018-03-04T23:10:56Z,OWNER,"Are you talking specifically about accessing metadata from HTML templates? That makes a lot of sense, I'll think about how this could work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-370461231,https://api.github.com/repos/simonw/datasette/issues/185,370461231,MDEyOklzc3VlQ29tbWVudDM3MDQ2MTIzMQ==,222245,carlmjohnson,2018-03-05T15:43:56Z,2018-03-05T15:44:27Z,NONE,"Yes. I think the simplest implementation is to change lines like ```python metadata = self.ds.metadata.get('databases', {}).get(name, {}) ``` to ```python metadata = { **self.ds.metadata, **self.ds.metadata.get('databases', {}).get(name, {}), } ``` so that specified inner values overwrite outer values, but only if they exist.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/186#issuecomment-374810115,https://api.github.com/repos/simonw/datasette/issues/186,374810115,MDEyOklzc3VlQ29tbWVudDM3NDgxMDExNQ==,9599,simonw,2018-03-21T01:21:13Z,2018-03-21T01:21:13Z,OWNER,"Hah, this is exactly the opposite of datasette's default approach to caching, which is to cache everything for as long as possible. 
I don't think we'll need to add `Cache-Control: no-cache` headers provided we instead set it up so you can turn off Datasette's caching.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",306811513,proposal new option to disable user agents cache, https://github.com/simonw/datasette/issues/186#issuecomment-374811114,https://api.github.com/repos/simonw/datasette/issues/186,374811114,MDEyOklzc3VlQ29tbWVudDM3NDgxMTExNA==,9599,simonw,2018-03-21T01:28:30Z,2018-03-21T01:28:30Z,OWNER,"We actually have this already: https://github.com/simonw/datasette/blob/012fc7c5cd3e9160c9a4c19cc964253e97fb054a/datasette/cli.py#L253-L255 You can disable the cache headers using the `datasette --debug` option.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",306811513,proposal new option to disable user agents cache, https://github.com/simonw/datasette/issues/186#issuecomment-374872202,https://api.github.com/repos/simonw/datasette/issues/186,374872202,MDEyOklzc3VlQ29tbWVudDM3NDg3MjIwMg==,47107,stefanocudini,2018-03-21T09:07:22Z,2018-03-21T09:07:22Z,NONE,--debug is perfect tnk,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",306811513,proposal new option to disable user agents cache, https://github.com/simonw/datasette/issues/185#issuecomment-376585911,https://api.github.com/repos/simonw/datasette/issues/185,376585911,MDEyOklzc3VlQ29tbWVudDM3NjU4NTkxMQ==,9599,simonw,2018-03-27T16:19:43Z,2018-03-27T16:19:43Z,OWNER,"OK, I have an implementation of this. I realised that not ALL metadata should be inherited: it makes sense for source/source_url/license/license_url to be inherited, but it doesn't make sense for the title and description to be inherited down to the individual databases and tables.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376587017,https://api.github.com/repos/simonw/datasette/issues/185,376587017,MDEyOklzc3VlQ29tbWVudDM3NjU4NzAxNw==,9599,simonw,2018-03-27T16:22:59Z,2018-03-27T16:22:59Z,OWNER,One thing that's missing from this: if you set source/license data at the individual database level they should be inherited by tables within that database.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376589591,https://api.github.com/repos/simonw/datasette/issues/185,376589591,MDEyOklzc3VlQ29tbWVudDM3NjU4OTU5MQ==,9599,simonw,2018-03-27T16:30:51Z,2018-03-27T16:30:51Z,OWNER,"Also needed: the ability to unset metadata. If the root metadata specifies a license_url it should be possible to set ""license_url"": null on a child database or table. The current implementation will ignore null (or empty string) values and default to the top level value. I think the templates themselves should be able to indicate if they want the inherited values or not. 
That way we could support arbitrary key/values and avoid the application code having special knowledge of license_url etc.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376590265,https://api.github.com/repos/simonw/datasette/issues/185,376590265,MDEyOklzc3VlQ29tbWVudDM3NjU5MDI2NQ==,222245,carlmjohnson,2018-03-27T16:32:51Z,2018-03-27T16:32:51Z,NONE,">I think the templates themselves should be able to indicate if they want the inherited values or not. That way we could support arbitrary key/values and avoid the application code having special knowledge of license_url etc. Yes, you could have `metadata` that works like `metadata` does currently and `inherited_metadata` that works with inheritance.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376592044,https://api.github.com/repos/simonw/datasette/issues/185,376592044,MDEyOklzc3VlQ29tbWVudDM3NjU5MjA0NA==,222245,carlmjohnson,2018-03-27T16:38:23Z,2018-03-27T16:38:23Z,NONE,"It would be nice to also allow arbitrary keys (maybe under a parent key called params or something to prevent conflicts). For our datasette project, we just have a bunch of dictionaries defined in the base template for things like site URL and column humanized names: https://github.com/baltimore-sun-data/salaries-datasette/blob/master/templates/base.html It would be cleaner if this were in the metadata.json.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376604558,https://api.github.com/repos/simonw/datasette/issues/185,376604558,MDEyOklzc3VlQ29tbWVudDM3NjYwNDU1OA==,9599,simonw,2018-03-27T17:16:27Z,2018-03-27T17:16:27Z,OWNER,"I am SO inspired by what you've done with https://salaries.news.baltimoresun.com/ - that's pretty much my ideal use-case for Datasette, and it's by far the most elaborate customization I've seen so far. I'd love to hear other ideas that came up while building that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376614973,https://api.github.com/repos/simonw/datasette/issues/185,376614973,MDEyOklzc3VlQ29tbWVudDM3NjYxNDk3Mw==,222245,carlmjohnson,2018-03-27T17:49:00Z,2018-03-27T17:49:00Z,NONE,"@simonw Other than metadata, the biggest item on wishlist for the salaries project was the ability to reorder by column. Of course, that could be done with a custom SQL query, but we didn't want to have to reimplement all the nav/pagination stuff from scratch. @carolinp, feel free to add your thoughts. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/189#issuecomment-376981291,https://api.github.com/repos/simonw/datasette/issues/189,376981291,MDEyOklzc3VlQ29tbWVudDM3Njk4MTI5MQ==,9599,simonw,2018-03-28T18:06:08Z,2018-03-28T18:06:08Z,OWNER,"Given how unlikely it is that this will pose a real problem I think I like option 1: enable sort-by-column by default for all tables, then allow power users to instead switch to explicit enabling of the functionality in their `metadata.json` if they know their data is too big.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-376983741,https://api.github.com/repos/simonw/datasette/issues/189,376983741,MDEyOklzc3VlQ29tbWVudDM3Njk4Mzc0MQ==,9599,simonw,2018-03-28T18:12:35Z,2018-03-28T18:12:35Z,OWNER,"I think this can work with a `?_sort=xxx` parameter - and `?_sort=-xxx` to sort in the opposite direction. I'd like to support ""sort by X descending, then by Y ascending if there are dupes for X"" as well. Two ways that could work: `?_sort=-xxx,yyy` Or... `?_sort=-xxx&_sort=yyy` The second option is probably better in that it makes it easier for columns to have a comma in their name. Is it possible for a SQLite column to start with a `-` character?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-376986668,https://api.github.com/repos/simonw/datasette/issues/189,376986668,MDEyOklzc3VlQ29tbWVudDM3Njk4NjY2OA==,9599,simonw,2018-03-28T18:21:53Z,2018-03-28T18:21:53Z,OWNER,"Might have to do something special to get sort-by-nulls-last: https://stackoverflow.com/questions/12503120/how-to-do-nulls-last-in-sqlite order by ifnull(column_name, -999999) Would need to figure out a smart way to get the default value - maybe by running a min() or max() against the column first?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377049625,https://api.github.com/repos/simonw/datasette/issues/189,377049625,MDEyOklzc3VlQ29tbWVudDM3NzA0OTYyNQ==,9599,simonw,2018-03-28T21:52:05Z,2018-03-28T21:52:05Z,OWNER,"This is a better pattern as you don't have to pick a minimum value: ORDER BY CASE WHEN SOMECOL IS NULL THEN 1 ELSE 0 END, SOMECOL","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377051018,https://api.github.com/repos/simonw/datasette/issues/189,377051018,MDEyOklzc3VlQ29tbWVudDM3NzA1MTAxOA==,9599,simonw,2018-03-28T21:57:20Z,2018-03-28T22:00:17Z,OWNER,"I'd like to continue to support _next=token pagination even for custom sort orders. 
To do that I should include rowid (or general primary key) as the tie breaker on all sorts so I can incorporate that it into the _next= token.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377052634,https://api.github.com/repos/simonw/datasette/issues/189,377052634,MDEyOklzc3VlQ29tbWVudDM3NzA1MjYzNA==,9599,simonw,2018-03-28T22:03:16Z,2018-03-28T22:03:16Z,OWNER,"In terms of user interface: the obvious place to put this is as a drop down menu on the column headers. This also means the UI can support combined sort orders. Assuming you are already sorted by county descending and you select the candidate column header, the options could be: * sort all by candidate * sort all by candidate, descending * sort by county descending, then by candidate * sort by county descending, then by candidate descending","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377050461,https://api.github.com/repos/simonw/datasette/issues/189,377050461,MDEyOklzc3VlQ29tbWVudDM3NzA1MDQ2MQ==,9599,simonw,2018-03-28T21:55:14Z,2018-03-28T22:06:30Z,OWNER,"I think there are actually four kinds of sort order we need to support; * ascending * descending * ascending, nulls last * descending, nulls last It looks like [-blah] is a valid SQLite table name, so mark I descending with a hyphen prefix isn't good. Instead, maybe this: ?_sort_asc=col1&_sort_desc_nulls_last=col2 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377054358,https://api.github.com/repos/simonw/datasette/issues/189,377054358,MDEyOklzc3VlQ29tbWVudDM3NzA1NDM1OA==,9599,simonw,2018-03-28T22:09:25Z,2018-03-28T22:09:25Z,OWNER,I'm tempted to put these verbose sorting options inline in the page HTML but have them in the table footer so they don't clog up the top half of the page with uninteresting links - then use JavaScript to hoik them out into a dropdown menu attached to each column header.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377055663,https://api.github.com/repos/simonw/datasette/issues/189,377055663,MDEyOklzc3VlQ29tbWVudDM3NzA1NTY2Mw==,9599,simonw,2018-03-28T22:14:53Z,2018-03-28T22:14:53Z,OWNER,"There is one other interesting option for auto-enabling/disabling sort: the inspect command could include data about column index presence and whether or not a column has any null values in it. This would allow us to dynamically include a ""nulls last"" option but only for columns that contain at least one null. It's quite a lot of additional engineering for a very minor feature though, so I think I'll punt on that for the moment. 
We may find that the _group_count feature can benefit from column value statistics later on though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/190#issuecomment-377065541,https://api.github.com/repos/simonw/datasette/issues/190,377065541,MDEyOklzc3VlQ29tbWVudDM3NzA2NTU0MQ==,9599,simonw,2018-03-28T22:58:52Z,2018-03-28T22:58:52Z,OWNER,"This is because the SQL we are using here is: select * from compound_primary_key where ""pk1"" > ""d"" and ""pk2"" > ""v"" order by pk1, pk2 limit 101 This is incorrect. The correct SQL syntax (according to the example on https://www.sqlite.org/rowvalue.html#scrolling_window_queries ) is: select * from compound_primary_key where (""pk1"", ""pk2"") > (""d"", ""v"") order by pk1, pk2 limit 101 BUT... this uses ""row values"" syntax which was only added to SQLite in version 3.15.0 in October 2016: https://sqlite.org/changes.html#version_3_15_0 The version on https://datasette-issue-190-compound-pks.now.sh/compound-pks-9aafe8f?sql=select+sqlite_version%28%29%3B is 3.8.7.1 from October 2014.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/190#issuecomment-377066466,https://api.github.com/repos/simonw/datasette/issues/190,377066466,MDEyOklzc3VlQ29tbWVudDM3NzA2NjQ2Ng==,9599,simonw,2018-03-28T23:03:45Z,2018-03-28T23:03:57Z,OWNER,"Without row values syntax, the necessary SQL to retrieve the next page after `d, v` gets a bit gnarly: select * from compound_primary_key where pk1 >= ""d"" and not (pk1 = ""d"" and pk2 <= ""v"") order by pk1, pk2 See https://datasette-issue-190-compound-pks.now.sh/compound-pks-9aafe8f?sql=select+*+from+compound_primary_key+where+pk1+%3E%3D+%22d%22+and+not+%28pk1+%3D+%22d%22+and+pk2+%3C%3D+%22v%22%29+order+by+pk1%2C+pk2 This article was useful for figuring this out: https://use-the-index-luke.com/sql/partial-results/fetch-next-page","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/190#issuecomment-377067541,https://api.github.com/repos/simonw/datasette/issues/190,377067541,MDEyOklzc3VlQ29tbWVudDM3NzA2NzU0MQ==,9599,simonw,2018-03-28T23:09:18Z,2018-03-28T23:09:51Z,OWNER,"Here's how I generated the table for testing this with 3 compound primary keys: CREATE_SQL = ''' CREATE TABLE compound_three_primary_keys ( pk1 varchar(30), pk2 varchar(30), pk3 varchar(30), content text, PRIMARY KEY (pk1, pk2, pk3) );''' alphabet = 'abcdefghijklmnopqrstuvwxyz' for a in alphabet: for b in alphabet: for c in alphabet: print(''' INSERT INTO compound_three_primary_keys VALUES ('{}', '{}', '{}', '{}'); '''.strip().format(a, b, c, '{}-{}-{}-{}-{}-{}'.format(a,b,c,a,b,c))) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, 
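A minimal sketch (not Datasette's actual implementation) of how the expanded-OR form shown above generalises to any number of primary key columns; column names are taken from the test table above, and the `param_offset` argument is only there so the keyset parameters don't reuse names already taken by filter parameters.

```python
def compound_keyset_where(pk_columns, last_values, param_offset=0):
    """Build a WHERE fragment plus named parameters for keyset pagination
    over a compound primary key, using the expanded-OR form that works on
    SQLite versions without row-value support."""
    clauses = []
    params = {}
    for i, value in enumerate(last_values):
        # Equality on every earlier key part, strict > on this one
        parts = [
            "[{}] = :p{}".format(pk_columns[j], param_offset + j)
            for j in range(i)
        ]
        parts.append("[{}] > :p{}".format(pk_columns[i], param_offset + i))
        clauses.append("({})".format(" and ".join(parts)))
        params["p{}".format(param_offset + i)] = value
    return " or ".join(clauses), params


where, params = compound_keyset_where(["pk1", "pk2", "pk3"], ["a", "d", "v"])
# where  -> ([pk1] > :p0) or ([pk1] = :p0 and [pk2] > :p1)
#           or ([pk1] = :p0 and [pk2] = :p1 and [pk3] > :p2)
# params -> {'p0': 'a', 'p1': 'd', 'p2': 'v'}
```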
https://github.com/simonw/datasette/issues/190#issuecomment-377072022,https://api.github.com/repos/simonw/datasette/issues/190,377072022,MDEyOklzc3VlQ29tbWVudDM3NzA3MjAyMg==,9599,simonw,2018-03-28T23:32:24Z,2018-03-28T23:32:24Z,OWNER,"Here's the SQL for a next page with three compound primary keys: https://datasette-issue-190-compound-pks.now.sh/compound-pks-8e99805?sql=select+*+from+compound_three_primary_keys%0D%0Awhere%0D%0A++%28pk1+%3E+%3Apk1%29%0D%0A++++or%0D%0A++%28pk1+%3D+%3Apk1+and+pk2+%3E+%3Apk2%29%0D%0A++++or%0D%0A++%28pk1+%3D+%3Apk1+and+pk2+%3D+%3Apk2+and+pk3+%3E+%3Apk3%29%0D%0Aorder+by+pk1%2C+pk2%2C+pk3%3B%0D%0A%0D%0A%0D%0A&pk1=a&pk2=d&pk3=v ``` select * from compound_three_primary_keys where (pk1 > :pk1) or (pk1 = :pk1 and pk2 > :pk2) or (pk1 = :pk1 and pk2 = :pk2 and pk3 > :pk3) order by pk1, pk2, pk3; ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/189#issuecomment-377362466,https://api.github.com/repos/simonw/datasette/issues/189,377362466,MDEyOklzc3VlQ29tbWVudDM3NzM2MjQ2Ng==,9599,simonw,2018-03-29T20:29:14Z,2018-03-29T20:29:14Z,OWNER,"Alternative idea: by default enable all sorting in the UI. If a table has more than 100,000 rows disable sorting UI except for columns that have an index. Allow this to be overridden in metadata.json ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/190#issuecomment-377454591,https://api.github.com/repos/simonw/datasette/issues/190,377454591,MDEyOklzc3VlQ29tbWVudDM3NzQ1NDU5MQ==,9599,simonw,2018-03-30T06:11:59Z,2018-03-30T06:11:59Z,OWNER,"Re-opening this issue: my fix doesn't play nicely with extra filter arguments. Consider this page: https://datasette-issue-190-compound-pks-not-quite-fixed.now.sh/compound-pks-8e99805/compound_three_primary_keys?content__contains=d The next link is to `?_next=f%2Cz%2Ct&content__contains=z` (that's next of `f,z,t`) but that gives us https://datasette-issue-190-compound-pks-not-quite-fixed.now.sh/compound-pks-8e99805/compound_three_primary_keys?_next=b%2Cx%2Cd&content__contains=d which shows `a,a,d` at the top. Sure enough, the generated SQL looks like this: https://datasette-issue-190-compound-pks-not-quite-fixed.now.sh/compound-pks-8e99805?sql=select+%2A+from+compound_three_primary_keys+where+%22content%22+like+%3Ap0+and+%28%5Bpk1%5D+%3E+%3Ap0%29%0A++or%0A%28%5Bpk1%5D+%3D+%3Ap0+and+%5Bpk2%5D+%3E+%3Ap1%29%0A++or%0A%28%5Bpk1%5D+%3D+%3Ap0+and+%5Bpk2%5D+%3D+%3Ap1+and+%5Bpk3%5D+%3E+%3Ap2%29+order+by+pk1%2C+pk2%2C+pk3+limit+101&p0=%25d%25&p1=b&p2=x&p3=d select * from compound_three_primary_keys where ""content"" like :p0 and ([pk1] > :p0) or ([pk1] = :p0 and [pk2] > :p1) or ([pk1] = :p0 and [pk2] = :p1 and [pk3] > :p2) order by pk1, pk2, pk3 limit 101 The parameters here are confused. 
The :p0 should be reserved just for the like clause - the other parameters should be p1, p2 and p3 (not p0, p1 and p2).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/190#issuecomment-377457087,https://api.github.com/repos/simonw/datasette/issues/190,377457087,MDEyOklzc3VlQ29tbWVudDM3NzQ1NzA4Nw==,9599,simonw,2018-03-30T06:30:23Z,2018-03-30T06:30:23Z,OWNER,"Interestingly, in deploying a copy of the database to demonstrate this final bug fix I had to use the `--force` argument like so: datasette publish now --branch=master compound-pks.db --force This is because `now` had already deployed a Dockerfile referencing `--branch=master` once already, so it thought nothing had changed and it could re-use that last deployment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/190#issuecomment-377457214,https://api.github.com/repos/simonw/datasette/issues/190,377457214,MDEyOklzc3VlQ29tbWVudDM3NzQ1NzIxNA==,9599,simonw,2018-03-30T06:31:15Z,2018-03-30T06:31:15Z,OWNER,"Fixed! https://datasette-issue-190-compound-pks-second-fix.now.sh/compound-pks-8e99805/compound_three_primary_keys?_next=b%2Cx%2Cd&content__contains=d now correctly shows `b,y,d` as the first row on the page.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/189#issuecomment-377459579,https://api.github.com/repos/simonw/datasette/issues/189,377459579,MDEyOklzc3VlQ29tbWVudDM3NzQ1OTU3OQ==,9599,simonw,2018-03-30T06:47:52Z,2018-03-30T06:47:52Z,OWNER,"I'm not entirely sure how to get `_next=` pagination working against sorted collections when a tie-breaker is needed. Consider this data: https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=select+rowid%2C+*+from+%5Bnfl-wide-receivers%2Fadvanced-historical%5D%0D%0Aorder+by+case+when+career_ranypa+is+null+then+1+else+0+end%2C+career_ranypa%2C+rowid+limit+11 ![2018-03-29 at 11 46 pm](https://user-images.githubusercontent.com/9599/38127549-790c8bd0-33ab-11e8-8d32-66f5d3847c8a.png) If the page size was set to 9 rather than 11, the page divide would be between those two rows with the same value in the `career_ranypa` column. What would the `?_next=` token look like such that the correct row would be returned? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377460127,https://api.github.com/repos/simonw/datasette/issues/189,377460127,MDEyOklzc3VlQ29tbWVudDM3NzQ2MDEyNw==,9599,simonw,2018-03-30T06:51:29Z,2018-03-30T06:51:52Z,OWNER,"The problem is that our `_next=` pagination currently works based on a `>` - but for this case a `>=` for the value is needed combined with a `>` on the tie-breaker (which would be the `rowid` column). 
So I think this is the right SQL: ``` select rowid, * from [nfl-wide-receivers/advanced-historical] where career_ranypa >= -6.331167749 and rowid > 2736 order by case when career_ranypa is null then 1 else 0 end, career_ranypa, rowid limit 11 ``` https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=select+rowid%2C+*+from+%5Bnfl-wide-receivers%2Fadvanced-historical%5D%0D%0Awhere+career_ranypa+%3E%3D+-6.331167749+and+rowid+%3E+2736%0D%0Aorder+by+case+when+career_ranypa+is+null+then+1+else+0+end%2C+career_ranypa%2C+rowid+limit+11 But how do I encode a `_next` token that means "">= X and > Y""?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377462334,https://api.github.com/repos/simonw/datasette/issues/189,377462334,MDEyOklzc3VlQ29tbWVudDM3NzQ2MjMzNA==,9599,simonw,2018-03-30T07:06:21Z,2018-03-30T07:06:21Z,OWNER,"Maybe the answer here is that anything that's encoded in the next token is treated as >= with the exception of columns known to be primary keys, which are treated as >","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377546510,https://api.github.com/repos/simonw/datasette/issues/189,377546510,MDEyOklzc3VlQ29tbWVudDM3NzU0NjUxMA==,9599,simonw,2018-03-30T15:13:11Z,2018-03-30T15:13:11Z,OWNER,"Pushed some work-in-progress with failing unit tests here: https://github.com/simonw/datasette/commit/2f8359c6f25768805431c80c74e5ec4213c2b2a6 Here's a demo: https://datasette-column-sort-wip.now.sh/sortable-4bbaa6f/sortable?_sort=sortable - note that the `_sort_desc` and `_sort_nulls_last` options aren't done yet, plus it doesn't correctly paginate (the `_next` tokens do not yet take sorting into account).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377547265,https://api.github.com/repos/simonw/datasette/issues/189,377547265,MDEyOklzc3VlQ29tbWVudDM3NzU0NzI2NQ==,9599,simonw,2018-03-30T15:16:43Z,2018-03-30T15:16:43Z,OWNER,"I think this is the right incantation for a ""next"" link: https://datasette-column-sort-wip.now.sh/sortable-4bbaa6f?sql=select+*+from+sortable%0D%0Awhere+sortable+%3C%3D+94%0D%0Aand+%28%0D%0A++%28pk1+%3E+%27d%27%29%0D%0A++or%0D%0A++%28pk1+%3D+%27d%27+and+pk2+%3E+%27w%27%29%0D%0A%29%0D%0Aorder+by+sortable+desc%2C+pk1%2C+pk2%0D%0Alimit+7","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/122#issuecomment-378279612,https://api.github.com/repos/simonw/datasette/issues/122,378279612,MDEyOklzc3VlQ29tbWVudDM3ODI3OTYxMg==,9599,simonw,2018-04-03T14:55:54Z,2018-04-03T14:55:54Z,OWNER,The new documentation for the `_shape=` parameter is now live at http://datasette.readthedocs.io/en/latest/json_api.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275092453,"Redesign JSON output, ditch jsono, offer variants controlled by parameter 
instead", https://github.com/simonw/datasette/issues/183#issuecomment-378281740,https://api.github.com/repos/simonw/datasette/issues/183,378281740,MDEyOklzc3VlQ29tbWVudDM3ODI4MTc0MA==,9599,simonw,2018-04-03T15:01:43Z,2018-04-03T15:01:43Z,OWNER,"I'm having trouble replicating this bug. In particular, I don't understand what you mean by ""these are then rendered in the datasette query box using single quotes"" - since canned queries aren't displayed in a textarea. Do you have an example database / metadata.json I can use to investigate this further?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",291639118,Custom Queries - escaping strings, https://github.com/simonw/datasette/pull/181#issuecomment-378293484,https://api.github.com/repos/simonw/datasette/issues/181,378293484,MDEyOklzc3VlQ29tbWVudDM3ODI5MzQ4NA==,9599,simonw,2018-04-03T15:34:29Z,2018-04-03T15:34:29Z,OWNER,"Here's what this looks like: ![2018-04-03 at 8 32 am](https://user-images.githubusercontent.com/9599/38259345-9e1c75ea-3719-11e8-83c9-2160c6fa079c.png) I need to figure out the right way to handle licensing of bundled software like this - it's MIT licensed which is compatible with Datasette's Apache 2 license, but I feel like bundled licensed software (including codemirror) needs to be recognized in the README or docs somehow.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/pull/181#issuecomment-378293599,https://api.github.com/repos/simonw/datasette/issues/181,378293599,MDEyOklzc3VlQ29tbWVudDM3ODI5MzU5OQ==,9599,simonw,2018-04-03T15:34:50Z,2018-04-03T15:36:58Z,OWNER,"Let's only show the ""Format SQL"" button if the user has JavaScript enabled. We can do that in this code here: https://github.com/bsmithgall/datasette/blob/4a7151a58d6ab7c8404a91beef7083e8a5807cf8/datasette/templates/_codemirror_foot.html#L14-L21","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/pull/181#issuecomment-378295376,https://api.github.com/repos/simonw/datasette/issues/181,378295376,MDEyOklzc3VlQ29tbWVudDM3ODI5NTM3Ng==,9599,simonw,2018-04-03T15:39:57Z,2018-04-03T15:39:57Z,OWNER,"On the licensing front: it looks like the way Django handles this is to keep the licensing header in the files intact, e.g. 
https://github.com/django/django/blob/6deaddcca367d0143c815aaa42342021baa3b41e/django/contrib/admin/static/admin/js/vendor/jquery/jquery.js So for this change, adding a comment at the top of `sql-formatter.min.js` which references the MIT license would do the trick.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/pull/181#issuecomment-378297842,https://api.github.com/repos/simonw/datasette/issues/181,378297842,MDEyOklzc3VlQ29tbWVudDM3ODI5Nzg0Mg==,1957344,bsmithgall,2018-04-03T15:47:13Z,2018-04-03T15:47:13Z,NONE,I can work on that -- would you prefer to inline a `display: hidden` and then have the javascript flip the visibility or include it as css?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/issues/193#issuecomment-379142500,https://api.github.com/repos/simonw/datasette/issues/193,379142500,MDEyOklzc3VlQ29tbWVudDM3OTE0MjUwMA==,222245,carlmjohnson,2018-04-06T04:05:58Z,2018-04-06T04:05:58Z,NONE,"You could try pulling out a validate query strings method. If it fails validation build the error object from the message. If it passes, you only need to go down a happy path. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310882100,Cleaner mechanism for handling custom errors, https://github.com/simonw/datasette/issues/189#issuecomment-379555484,https://api.github.com/repos/simonw/datasette/issues/189,379555484,MDEyOklzc3VlQ29tbWVudDM3OTU1NTQ4NA==,9599,simonw,2018-04-08T14:39:57Z,2018-04-08T14:39:57Z,OWNER,I'm going to combine the code for explicit sorting with the existing code for _next= pagination - so even tables without an explicit sort order will run through the same code since they are ordered and paginated by primary key.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/48#issuecomment-379556637,https://api.github.com/repos/simonw/datasette/issues/48,379556637,MDEyOklzc3VlQ29tbWVudDM3OTU1NjYzNw==,9599,simonw,2018-04-08T14:56:52Z,2018-04-08T14:56:52Z,OWNER,It would be useful to have a microbenchmark in place to help understand how much of a performance benefit this would actually provide.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272391665,Switch to ujson, https://github.com/simonw/datasette/issues/189#issuecomment-379556774,https://api.github.com/repos/simonw/datasette/issues/189,379556774,MDEyOklzc3VlQ29tbWVudDM3OTU1Njc3NA==,9599,simonw,2018-04-08T14:59:05Z,2018-04-08T14:59:05Z,OWNER,"A common problem with keyset pagination is that it can distort the ""total number of rows"" logic - every time you navigate to a further page the total rows count can decrease due to the extra arguments in the `where` clause. 
The `filtered_table_rows` value (see #194) calculated using `count_sql` currently has this problem.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/194#issuecomment-379556981,https://api.github.com/repos/simonw/datasette/issues/194,379556981,MDEyOklzc3VlQ29tbWVudDM3OTU1Njk4MQ==,9599,simonw,2018-04-08T15:02:23Z,2018-04-08T15:02:23Z,OWNER,Maybe `table_rows_filtered_count` would be more aesthetically pleasing than `filtered_table_rows_count`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312312125,Rename table_rows and filtered_table_rows to have _count suffix, https://github.com/simonw/datasette/issues/194#issuecomment-379556881,https://api.github.com/repos/simonw/datasette/issues/194,379556881,MDEyOklzc3VlQ29tbWVudDM3OTU1Njg4MQ==,9599,simonw,2018-04-08T15:00:48Z,2018-04-08T15:02:35Z,OWNER,`table_rows_count` is always the *total* number of rows in the table. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312312125,Rename table_rows and filtered_table_rows to have _count suffix, https://github.com/simonw/datasette/issues/195#issuecomment-379557743,https://api.github.com/repos/simonw/datasette/issues/195,379557743,MDEyOklzc3VlQ29tbWVudDM3OTU1Nzc0Mw==,9599,simonw,2018-04-08T15:13:18Z,2018-04-08T15:13:18Z,OWNER,https://github.com/simonw/datasette/blob/446d47fdb005b3776bc06ad8d1f44b01fc2e938b/datasette/app.py#L93-L102,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312313496,"Run pks_for_table in inspect, executing once at build time rather than constantly", https://github.com/simonw/datasette/issues/189#issuecomment-379557982,https://api.github.com/repos/simonw/datasette/issues/189,379557982,MDEyOklzc3VlQ29tbWVudDM3OTU1Nzk4Mg==,9599,simonw,2018-04-08T15:16:49Z,2018-04-08T15:16:49Z,OWNER,"A note about views: a view cannot be paginated using keyset pagination because records returned from a view don't have a primary key - so there's no way to reliably distinguish between _next= records when the sorted column has duplicates with the same value. Datasette already takes this into account: views are paginated using offset/limit instead. We can continue to do that even for views that have been sorted using a `_sort` parameter. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/195#issuecomment-379559074,https://api.github.com/repos/simonw/datasette/issues/195,379559074,MDEyOklzc3VlQ29tbWVudDM3OTU1OTA3NA==,9599,simonw,2018-04-08T15:31:49Z,2018-04-08T15:31:49Z,OWNER,"While I'm at it, doing the same thing for fts_table detection is worth considering: https://github.com/simonw/datasette/blob/446d47fdb005b3776bc06ad8d1f44b01fc2e938b/datasette/app.py#L598-L603","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312313496,"Run pks_for_table in inspect, executing once at build time rather than constantly", https://github.com/simonw/datasette/issues/150#issuecomment-379559214,https://api.github.com/repos/simonw/datasette/issues/150,379559214,MDEyOklzc3VlQ29tbWVudDM3OTU1OTIxNA==,9599,simonw,2018-04-08T15:33:58Z,2018-04-08T15:33:58Z,OWNER,The single biggest challenge here is expanding foreign key references. This is the blocker that prevents `_group_count` from being useful at the moment.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276704327,_group_count= feature improvements, https://github.com/simonw/datasette/issues/150#issuecomment-379559319,https://api.github.com/repos/simonw/datasette/issues/150,379559319,MDEyOklzc3VlQ29tbWVudDM3OTU1OTMxOQ==,9599,simonw,2018-04-08T15:35:43Z,2018-04-08T15:35:43Z,OWNER,"From a code point of view, the current mechanism for `_group_count` makes the `TableView` even **more** complicated: https://github.com/simonw/datasette/blob/446d47fdb005b3776bc06ad8d1f44b01fc2e938b/datasette/app.py#L644-L653 Instead, I think if `_group_count` is detected we should generate the SQL and then defer to `self.custom_sql`, like we do for canned queries: https://github.com/simonw/datasette/blob/446d47fdb005b3776bc06ad8d1f44b01fc2e938b/datasette/app.py#L539-L541","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276704327,_group_count= feature improvements, https://github.com/simonw/datasette/issues/195#issuecomment-379588602,https://api.github.com/repos/simonw/datasette/issues/195,379588602,MDEyOklzc3VlQ29tbWVudDM3OTU4ODYwMg==,9599,simonw,2018-04-08T22:40:16Z,2018-04-08T22:40:16Z,OWNER,"Could also identify all views for that database, which would save on these queries: https://github.com/simonw/datasette/blob/b2188f044265c95f7e54860e28107c17d2a6ed2e/datasette/app.py#L543-L545","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312313496,"Run pks_for_table in inspect, executing once at build time rather than constantly", https://github.com/simonw/datasette/issues/189#issuecomment-379591062,https://api.github.com/repos/simonw/datasette/issues/189,379591062,MDEyOklzc3VlQ29tbWVudDM3OTU5MTA2Mg==,9599,simonw,2018-04-08T23:23:12Z,2018-04-08T23:23:12Z,OWNER,"To break this up into smaller units, the first implementation of this will only support a single `_sort` or `_sort_desc` querystring parameter.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, 
https://github.com/simonw/datasette/issues/189#issuecomment-379592393,https://api.github.com/repos/simonw/datasette/issues/189,379592393,MDEyOklzc3VlQ29tbWVudDM3OTU5MjM5Mw==,9599,simonw,2018-04-08T23:45:42Z,2018-04-08T23:46:31Z,OWNER,"Actually next page SQL when sorting looks more like this: ``` select rowid, * from [alcohol-consumption/drinks] where ""country"" like :p0 and ( beer_servings > 111 or (beer_servings = 111 and rowid > 190) ) order by beer_servings, rowid limit 101 ``` The next page after row 190 with sortable value 111 should show either records that are greater than 111 or records that match 111 but have a greater primary key than the last one seen. https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=select+rowid%2C+*+from+%5Balcohol-consumption%2Fdrinks%5D%0D%0Awhere+%22country%22+like+%3Ap0%0D%0Aand+%28%0D%0A++++beer_servings+%3E+111%0D%0A++++or+%28beer_servings+%3D+111+and+rowid+%3E+190%29%0D%0A%29%0D%0Aorder+by+beer_servings%2C+rowid+limit+101&p0=%25a%25","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379594529,https://api.github.com/repos/simonw/datasette/issues/189,379594529,MDEyOklzc3VlQ29tbWVudDM3OTU5NDUyOQ==,9599,simonw,2018-04-09T00:15:03Z,2018-04-09T00:15:03Z,OWNER,"Demo: senator tweets ordered by number of replies: https://datasette-issue-189-demo.now.sh/fivethirtyeight-2628db9/twitter-ratio%2Fsenators?_sort_desc=replies Page 2 (note that since Senators retweet things there are tweets with the same text/number-of-replies but retweeted by different senators that span the page break): https://datasette-issue-189-demo.now.sh/fivethirtyeight-2628db9/twitter-ratio%2Fsenators?_next=8556%2C121799&_sort_desc=replies ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/185#issuecomment-379595253,https://api.github.com/repos/simonw/datasette/issues/185,379595253,MDEyOklzc3VlQ29tbWVudDM3OTU5NTI1Mw==,9599,simonw,2018-04-09T00:24:10Z,2018-04-09T00:24:10Z,OWNER,@carlmjohnson in case you aren't following along with #189 I've shipped the first working prototype of sort-by-column - you can try it out here: https://datasette-issue-189-demo-2.now.sh/salaries-7859114-7859114/2017+Maryland+state+salaries?_search=university&_sort_desc=annual_salary,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/189#issuecomment-379595274,https://api.github.com/repos/simonw/datasette/issues/189,379595274,MDEyOklzc3VlQ29tbWVudDM3OTU5NTI3NA==,9599,simonw,2018-04-09T00:24:37Z,2018-04-09T00:29:46Z,OWNER,"Another demo: https://datasette-issue-189-demo-2.now.sh/salaries-7859114-7859114/2017+Maryland+state+salaries?_search=university&_sort_desc=annual_salary https://datasette-issue-189-demo-2.now.sh/salaries-7859114-7859114/2017+Maryland+state+salaries?_search=university&last_name__exact=JOHNSON&_sort_desc=annual_salary","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, 
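One possible way (a sketch only, not necessarily the encoding Datasette settled on) to build a `_next=` token that carries the last-seen sort value plus the primary key tie-breaker, matching the shape of the `?_next=8556%2C121799` links in the demos above: percent-encode each component, then join with commas.

```python
from urllib.parse import quote_plus, unquote_plus


def encode_next(*components):
    """Join the last-seen sort value and primary key into a single token;
    each piece is percent-encoded first so embedded commas survive."""
    return ",".join(quote_plus(str(c)) for c in components)


def decode_next(token):
    return [unquote_plus(c) for c in token.split(",")]


token = encode_next(8556, 121799)
print(token)               # 8556,121799
print(decode_next(token))  # ['8556', '121799']
```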
https://github.com/simonw/datasette/issues/189#issuecomment-379602339,https://api.github.com/repos/simonw/datasette/issues/189,379602339,MDEyOklzc3VlQ29tbWVudDM3OTYwMjMzOQ==,9599,simonw,2018-04-09T01:33:26Z,2018-04-09T01:33:26Z,OWNER,"Small bug: ""201 rows where sorted by sortable_with_nulls"" shouldn't have the word ""where"" in it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379602690,https://api.github.com/repos/simonw/datasette/issues/189,379602690,MDEyOklzc3VlQ29tbWVudDM3OTYwMjY5MA==,9599,simonw,2018-04-09T01:37:03Z,2018-04-09T01:37:03Z,OWNER,"I'm going to split the following out into separate tickets: * Ability to sort by multiple columns e.g. `?_sort=name&sort_desc=age&_sort=height` * Ability to specify nulls last e.g. `?_sort_desc_nulls_last=age`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379603156,https://api.github.com/repos/simonw/datasette/issues/189,379603156,MDEyOklzc3VlQ29tbWVudDM3OTYwMzE1Ng==,9599,simonw,2018-04-09T01:41:22Z,2018-04-09T01:41:22Z,OWNER,"Actually I think I always want nulls last when ordering asc, nulls first when ordering desc.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379608977,https://api.github.com/repos/simonw/datasette/issues/189,379608977,MDEyOklzc3VlQ29tbWVudDM3OTYwODk3Nw==,9599,simonw,2018-04-09T02:22:59Z,2018-04-09T02:22:59Z,OWNER,"Here's a demo of the new clickable column headers: https://datasette-issue-189-demo-3.now.sh/salaries-7859114-7859114/2017+Maryland+state+salaries?_search=university&_sort_desc=last_name ![2018-04-08 at 7 22 pm](https://user-images.githubusercontent.com/9599/38476370-3e62a60e-3b62-11e8-9d30-8dc6608133dd.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/193#issuecomment-379624163,https://api.github.com/repos/simonw/datasette/issues/193,379624163,MDEyOklzc3VlQ29tbWVudDM3OTYyNDE2Mw==,9599,simonw,2018-04-09T04:03:49Z,2018-04-09T04:03:49Z,OWNER,"This is harder than I thought, because the `_shape=` logic actually runs AFTER the main block of code which is set up to catch exceptions - this code here: https://github.com/simonw/datasette/blob/0abd3abacb309a2bd5913a7a2df4e9256585b1bb/datasette/app.py#L200-L216","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310882100,Cleaner mechanism for handling custom errors, https://github.com/simonw/datasette/issues/189#issuecomment-379634425,https://api.github.com/repos/simonw/datasette/issues/189,379634425,MDEyOklzc3VlQ29tbWVudDM3OTYzNDQyNQ==,9599,simonw,2018-04-09T05:16:02Z,2018-04-09T05:16:02Z,OWNER,I've merged this into master.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, 
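A sketch of an ORDER BY builder that bakes in the decision above (nulls last when ascending, nulls first when descending), reusing the CASE WHEN pattern from earlier in the thread; the function name and tie-breaker handling are illustrative, not Datasette's actual code.

```python
def order_by_clause(column, descending=False, tie_breaker="rowid"):
    """Sort by one column with a primary-key tie-breaker, pushing NULLs
    last for ascending sorts and first for descending sorts."""
    direction = "desc" if descending else "asc"
    # 0 sorts before 1, so ranking NULLs as 1 pushes them to the end of an
    # ascending sort; ranking them as 0 puts them first when descending.
    null_rank = (
        "case when [{col}] is null then 1 else 0 end"
        if not descending
        else "case when [{col}] is null then 0 else 1 end"
    ).format(col=column)
    return "order by {null_rank}, [{col}] {direction}, {tie}".format(
        null_rank=null_rank, col=column, direction=direction, tie=tie_breaker
    )


print(order_by_clause("career_ranypa", descending=True))
# order by case when [career_ranypa] is null then 0 else 1 end,
#   [career_ranypa] desc, rowid
```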
https://github.com/simonw/datasette/issues/184#issuecomment-379636068,https://api.github.com/repos/simonw/datasette/issues/184,379636068,MDEyOklzc3VlQ29tbWVudDM3OTYzNjA2OA==,9599,simonw,2018-04-09T05:26:21Z,2018-04-09T05:26:21Z,OWNER,Do you have steps to reproduce here - ideally a small example SQLite database that exhibits the error?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",292011379,500 from missing table name, https://github.com/simonw/datasette/pull/181#issuecomment-379636695,https://api.github.com/repos/simonw/datasette/issues/181,379636695,MDEyOklzc3VlQ29tbWVudDM3OTYzNjY5NQ==,9599,simonw,2018-04-09T05:30:16Z,2018-04-09T05:30:16Z,OWNER,"I'd prefer to have the JavaScript actually manipulate the DOM to add the button - something like this: var button = document.createElement('button'); button.value = 'Format SQL'; button.addEventListener( 'click', format, false ); document.getElementById('run-sql').parentNode.appendChild(button);","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/pull/181#issuecomment-379759875,https://api.github.com/repos/simonw/datasette/issues/181,379759875,MDEyOklzc3VlQ29tbWVudDM3OTc1OTg3NQ==,1957344,bsmithgall,2018-04-09T13:53:14Z,2018-04-09T13:53:14Z,NONE,I've implemented that approach in 86ac746. It does cause the button to pop in only after Codemirror is finished rendering which is a bit awkward.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/issues/184#issuecomment-379788103,https://api.github.com/repos/simonw/datasette/issues/184,379788103,MDEyOklzc3VlQ29tbWVudDM3OTc4ODEwMw==,222245,carlmjohnson,2018-04-09T15:15:11Z,2018-04-09T15:15:11Z,NONE,Visit https://salaries.news.baltimoresun.com/salaries/bad-table.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",292011379,500 from missing table name, https://github.com/simonw/datasette/issues/189#issuecomment-379791047,https://api.github.com/repos/simonw/datasette/issues/189,379791047,MDEyOklzc3VlQ29tbWVudDM3OTc5MTA0Nw==,222245,carlmjohnson,2018-04-09T15:23:45Z,2018-04-09T15:23:45Z,NONE,Awesome!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379803864,https://api.github.com/repos/simonw/datasette/issues/189,379803864,MDEyOklzc3VlQ29tbWVudDM3OTgwMzg2NA==,9599,simonw,2018-04-09T16:02:09Z,2018-04-09T16:02:09Z,OWNER,This is now released in Datasette 0.15 https://github.com/simonw/datasette/releases/tag/0.15,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379830529,https://api.github.com/repos/simonw/datasette/issues/189,379830529,MDEyOklzc3VlQ29tbWVudDM3OTgzMDUyOQ==,9599,simonw,2018-04-09T17:28:47Z,2018-04-09T17:28:47Z,OWNER,Another demo: 
https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9/congress-age%2Fcongress-terms,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/199#issuecomment-379833216,https://api.github.com/repos/simonw/datasette/issues/199,379833216,MDEyOklzc3VlQ29tbWVudDM3OTgzMzIxNg==,9599,simonw,2018-04-09T17:37:47Z,2018-04-09T17:37:47Z,OWNER,I may do this by adding select boxes for _sort and _sort_desc to the filters UI. This would allow sorting in mobile portrait mode but would also ensure that the existing sort order is persisted if the user edits the current filters (right now sort resets when filters are applied).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312620566,Ability to apply sort on mobile in portrait mode, https://github.com/simonw/datasette/issues/199#issuecomment-379833481,https://api.github.com/repos/simonw/datasette/issues/199,379833481,MDEyOklzc3VlQ29tbWVudDM3OTgzMzQ4MQ==,9599,simonw,2018-04-09T17:38:39Z,2018-04-09T17:38:39Z,OWNER,"Since you can't apply `_sort` and `_sort_desc` at the same time, maybe just one select box for picking the column to sort by and a boolean checkbox for ""sort descending"" - which then redirects to the `_sort_desc=` URL variant.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312620566,Ability to apply sort on mobile in portrait mode, https://github.com/simonw/datasette/issues/199#issuecomment-379936068,https://api.github.com/repos/simonw/datasette/issues/199,379936068,MDEyOklzc3VlQ29tbWVudDM3OTkzNjA2OA==,9599,simonw,2018-04-10T00:32:37Z,2018-04-10T00:32:37Z,OWNER,"![2018-04-09 at 5 32 pm](https://user-images.githubusercontent.com/9599/38529802-fd2a7e68-3c1b-11e8-974a-bf5438fec701.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312620566,Ability to apply sort on mobile in portrait mode, https://github.com/simonw/datasette/issues/199#issuecomment-379936832,https://api.github.com/repos/simonw/datasette/issues/199,379936832,MDEyOklzc3VlQ29tbWVudDM3OTkzNjgzMg==,9599,simonw,2018-04-10T00:37:52Z,2018-04-10T00:37:52Z,OWNER,Demo: https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9/twitter-ratio%2Fsenators?_sort_desc=replies&text__contains=bipartisan,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312620566,Ability to apply sort on mobile in portrait mode, https://github.com/simonw/datasette/pull/200#issuecomment-380606998,https://api.github.com/repos/simonw/datasette/issues/200,380606998,MDEyOklzc3VlQ29tbWVudDM4MDYwNjk5OA==,9599,simonw,2018-04-11T21:50:14Z,2018-04-11T21:50:14Z,OWNER,"We should only do this if we're certain the spatialite module has been loaded. I could imagine someone having a `sql_statements_log` table of their own without using spatialite for example. I think the most reliable way to detect spatialite is to run `SELECT AddGeometryColumn(1, 2, 3, 4, 5);` against a `:memory:` database and see if it throws an exception - similar to how we detect FTS. 
We could add this as a `detect_spatialite()` function in `utils.py` and call it once on startup.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313494458,Hide Spatialite system tables, https://github.com/simonw/datasette/issues/184#issuecomment-380608340,https://api.github.com/repos/simonw/datasette/issues/184,380608340,MDEyOklzc3VlQ29tbWVudDM4MDYwODM0MA==,9599,simonw,2018-04-11T21:55:41Z,2018-04-11T21:55:41Z,OWNER,"Yuck, nasty - OK I get it, this happens with ANY non-existent table name. Let's fix that - these should clearly return an HTTP 404.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",292011379,500 from missing table name, https://github.com/simonw/datasette/pull/200#issuecomment-380608372,https://api.github.com/repos/simonw/datasette/issues/200,380608372,MDEyOklzc3VlQ29tbWVudDM4MDYwODM3Mg==,45057,russss,2018-04-11T21:55:46Z,2018-04-11T21:55:46Z,CONTRIBUTOR,"> I think the most reliable way to detect spatialite is to run `SELECT AddGeometryColumn(1, 2, 3, 4, 5);` against a `:memory:` database and see if it throws an exception Or just see if there's a `geometry_columns` table? I think that's quite unlikely to be added by accident (and it's an OGC standard). It also tells you if Spatialite is installed in the database rather than just loaded.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313494458,Hide Spatialite system tables, https://github.com/simonw/datasette/issues/193#issuecomment-380619851,https://api.github.com/repos/simonw/datasette/issues/193,380619851,MDEyOklzc3VlQ29tbWVudDM4MDYxOTg1MQ==,9599,simonw,2018-04-11T22:48:19Z,2018-04-11T22:48:19Z,OWNER,I can clean this up further with the mechanism I'm using for #184,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310882100,Cleaner mechanism for handling custom errors, https://github.com/simonw/datasette/pull/200#issuecomment-380951474,https://api.github.com/repos/simonw/datasette/issues/200,380951474,MDEyOklzc3VlQ29tbWVudDM4MDk1MTQ3NA==,9599,simonw,2018-04-12T21:34:39Z,2018-04-12T21:34:39Z,OWNER,"Nice, thanks very much.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313494458,Hide Spatialite system tables, https://github.com/simonw/datasette/issues/203#issuecomment-380951815,https://api.github.com/repos/simonw/datasette/issues/203,380951815,MDEyOklzc3VlQ29tbWVudDM4MDk1MTgxNQ==,9599,simonw,2018-04-12T21:36:10Z,2018-04-12T21:36:10Z,OWNER,I like this. I'd like to be able to attach a full description to a column as well. 
We could support these in `metadata.json`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-380951920,https://api.github.com/repos/simonw/datasette/issues/203,380951920,MDEyOklzc3VlQ29tbWVudDM4MDk1MTkyMA==,9599,simonw,2018-04-12T21:36:38Z,2018-04-12T21:36:38Z,OWNER,This also feeds into the visualization features I want to add - we could use this kind of metadata to automatically apply meaningful labels to graphs.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-380966565,https://api.github.com/repos/simonw/datasette/issues/203,380966565,MDEyOklzc3VlQ29tbWVudDM4MDk2NjU2NQ==,45057,russss,2018-04-12T22:43:08Z,2018-04-12T22:43:08Z,CONTRIBUTOR,"Looks like [pint](https://pint.readthedocs.io/en/latest/tutorial.html) is pretty good at this. ```python In [1]: import pint In [2]: ureg = pint.UnitRegistry() In [3]: q = 3e6 * ureg('Hz') In [4]: '{:~P}'.format(q.to_compact()) Out[4]: '3.0 MHz' In [5]: q = 0.3 * ureg('m') In [5]: '{:~P}'.format(q.to_compact()) Out[5]: '300.0 mm' In [6]: q = 5 * ureg('') In [7]: '{:~P}'.format(q.to_compact()) Out[7]: '5' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/pull/202#issuecomment-381220441,https://api.github.com/repos/simonw/datasette/issues/202,381220441,MDEyOklzc3VlQ29tbWVudDM4MTIyMDQ0MQ==,9599,simonw,2018-04-13T18:19:15Z,2018-04-13T18:19:15Z,OWNER,I'm afraid I've just made this obsolete with 9f28bbe43dc277a3963a12aaae37b5ee3c277207,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313785206,Raise 404 on nonexistent table URLs, https://github.com/simonw/datasette/pull/202#issuecomment-381237440,https://api.github.com/repos/simonw/datasette/issues/202,381237440,MDEyOklzc3VlQ29tbWVudDM4MTIzNzQ0MA==,45057,russss,2018-04-13T19:22:53Z,2018-04-13T19:22:53Z,CONTRIBUTOR,I spotted you'd mentioned that in #184 but only after I'd written the patch!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313785206,Raise 404 on nonexistent table URLs, https://github.com/simonw/datasette/issues/201#issuecomment-381262824,https://api.github.com/repos/simonw/datasette/issues/201,381262824,MDEyOklzc3VlQ29tbWVudDM4MTI2MjgyNA==,9599,simonw,2018-04-13T21:17:14Z,2018-04-13T21:17:14Z,OWNER,"Demo: https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=explain+query+plan+select+*+from+%5Bmost-common-name%2Fsurnames%5D+order+by+rank+desc https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=explain+select+*+from+%5Bmost-common-name%2Fsurnames%5D+order+by+rank+desc","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313512748,Support explain select / explain query plan select, https://github.com/simonw/datasette/issues/203#issuecomment-381300336,https://api.github.com/repos/simonw/datasette/issues/203,381300336,MDEyOklzc3VlQ29tbWVudDM4MTMwMDMzNg==,9599,simonw,2018-04-14T03:35:02Z,2018-04-14T03:35:02Z,OWNER,"This is really cool - I'm 
very impressed by pint. I'd like to figure out a sensible opt-in way to expose this in the JSON output as well. Maybe with a `&_units=true` parameter? We should definitely expose the units section from the table metadata in the output of https://wtr-api.herokuapp.com/wtr-663ea99/license_frequency.json","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-381300386,https://api.github.com/repos/simonw/datasette/issues/203,381300386,MDEyOklzc3VlQ29tbWVudDM4MTMwMDM4Ng==,9599,simonw,2018-04-14T03:35:56Z,2018-04-14T03:35:56Z,OWNER,"In #204 you said ""I'd like to add support for using units when querying but this is PR is pretty usable as-is."" - I'm fascinated to hear more about how this could work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-381315675,https://api.github.com/repos/simonw/datasette/issues/203,381315675,MDEyOklzc3VlQ29tbWVudDM4MTMxNTY3NQ==,45057,russss,2018-04-14T09:14:45Z,2018-04-14T09:27:30Z,CONTRIBUTOR,"> I'd like to figure out a sensible opt-in way to expose this in the JSON output as well. Maybe with a &_units=true parameter? From a machine-readable perspective I'm not sure why it would be useful to decorate the values with units. Edit: Should have had some coffee first. It's clearly useful for stuff like map rendering! I agree that the unit metadata should definitely be exposed in the JSON. > In #204 you said ""I'd like to add support for using units when querying but this is PR is pretty usable as-is."" - I'm fascinated to hear more about how this could work. I'm thinking about a couple of approaches here. I think the simplest one is: if the column has a unit attached, optionally accept units in query fields: ```python column_units = ureg(""Hz"") # Create a unit object for the column's unit query_variable = ureg(""4 GHz"") # Supplied query variable # Now we can convert the query units into column units before querying supplied_value.to(column_units).magnitude > 4000000000.0 # If the user doesn't supply units, pint just returns the plain # number and we can query as usual assuming it's the base unit query_variable = ureg(""50"") query_variable > 50 isinstance(query_variable, numbers.Number) > True ``` This also lets us do some nice unit conversion on querying: ```python column_units = ureg(""m"") query_variable = ureg(""50 ft"") supplied_value.to(column_units) > ``` The alternative would be to provide a dropdown of units next to the query field (so a ""Hz"" field would give you ""kHz"", ""MHz"", ""GHz""). Although this would be clearer to the user, it isn't so easy - we'd need to know more about the context of the field to give you sensible SI prefixes (I'm not so interested in nanoHertz, for example). 
You also lose the bonus of being able to convert - although pint will happily show you all the compatible units, it again suffers from a lack of context: ```python ureg(""m"").compatible_units() > frozenset({, , , , , , , , , , , }) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-381330075,https://api.github.com/repos/simonw/datasette/issues/203,381330075,MDEyOklzc3VlQ29tbWVudDM4MTMzMDA3NQ==,9599,simonw,2018-04-14T13:41:53Z,2018-04-14T13:41:53Z,OWNER,"Presumably units only work for numeric fields? If that's the case then automatically processing them if the incoming query string argument has a unit suffix makes total sense to me. Here's a pretty crazy idea: what if we exposed unit conversion to SQL as a custom SQLite function? That way it would be possible to optionally use units in actual custom SQL queries. I'd have to think quite carefully about performance implications here - wouldn't want a poorly considered unit calculation over a 500,000 row table to lock up the server. But I think the 1s query time limit might still prevent that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/pull/205#issuecomment-381330220,https://api.github.com/repos/simonw/datasette/issues/205,381330220,MDEyOklzc3VlQ29tbWVudDM4MTMzMDIyMA==,9599,simonw,2018-04-14T13:44:15Z,2018-04-14T13:44:15Z,OWNER,This looks great so far - love the new documentation. Let's throw in a unit test or two for the basic unit filters (mainly as a protection against accidental regressions in the future).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314319372,Support filtering with units and more, https://github.com/simonw/datasette/pull/205#issuecomment-381332222,https://api.github.com/repos/simonw/datasette/issues/205,381332222,MDEyOklzc3VlQ29tbWVudDM4MTMzMjIyMg==,45057,russss,2018-04-14T14:16:35Z,2018-04-14T14:16:35Z,CONTRIBUTOR,I've added some tests and that docs link.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314319372,Support filtering with units and more, https://github.com/simonw/datasette/pull/207#issuecomment-381334973,https://api.github.com/repos/simonw/datasette/issues/207,381334973,MDEyOklzc3VlQ29tbWVudDM4MTMzNDk3Mw==,9599,simonw,2018-04-14T14:59:52Z,2018-04-14T14:59:52Z,OWNER,I'm going to merge this and then add a unit test.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314329002,Link foreign keys which don't have labels, https://github.com/simonw/datasette/pull/205#issuecomment-381336696,https://api.github.com/repos/simonw/datasette/issues/205,381336696,MDEyOklzc3VlQ29tbWVudDM4MTMzNjY5Ng==,9599,simonw,2018-04-14T15:24:04Z,2018-04-14T15:24:04Z,OWNER,I merged this to master in c857608738d6b6c3e4f3248304a22f8b2648dd3e - thanks @russss!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314319372,Support filtering with units and more, 
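A minimal sketch of the unit-aware filtering idea discussed in the issue 203 comments above, assuming pint is installed. The `normalize_filter_value()` helper is illustrative only and not part of Datasette: it converts a user-supplied filter value into the column's unit when a unit is given, and otherwise assumes the bare number is already in the column's unit.

```python
import numbers

import pint

ureg = pint.UnitRegistry()


def normalize_filter_value(raw, column_unit):
    """Turn a user-supplied filter value into a plain number in column_unit."""
    parsed = ureg(raw)
    if isinstance(parsed, numbers.Number):
        # No unit supplied - assume the value is already in the column's unit.
        return parsed
    return parsed.to(column_unit).magnitude


print(normalize_filter_value("4 GHz", "Hz"))  # 4000000000.0
print(normalize_filter_value("50 ft", "m"))   # ~15.24
print(normalize_filter_value("50", "Hz"))     # 50
```

This mirrors the approach described in the comment above: only accept units on columns that declare one, and treat unitless input as being in the column's own unit.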
https://github.com/simonw/datasette/issues/203#issuecomment-381348849,https://api.github.com/repos/simonw/datasette/issues/203,381348849,MDEyOklzc3VlQ29tbWVudDM4MTM0ODg0OQ==,9599,simonw,2018-04-14T18:12:52Z,2018-04-14T18:12:52Z,OWNER,I think I'm going to hold on to the custom sql function idea for the moment and implement it as an example plugin.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/125#issuecomment-381361734,https://api.github.com/repos/simonw/datasette/issues/125,381361734,MDEyOklzc3VlQ29tbWVudDM4MTM2MTczNA==,45057,russss,2018-04-14T21:26:30Z,2018-04-14T21:26:30Z,CONTRIBUTOR,"FWIW I am now doing this on my WTR app (instead of silently limiting maps to 1000). [Telefonica](https://wtr-api.herokuapp.com/wtr-663ea99/licensee/18325) now has about 4000 markers and good old [BT](https://wtr-api.herokuapp.com/wtr-663ea99/licensee/8412) has 22,000 or so.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275135393,Plot rows on a map with Leaflet and Leaflet.markercluster, https://github.com/simonw/datasette/issues/189#issuecomment-381429213,https://api.github.com/repos/simonw/datasette/issues/189,381429213,MDEyOklzc3VlQ29tbWVudDM4MTQyOTIxMw==,222245,carlmjohnson,2018-04-15T18:54:22Z,2018-04-15T18:54:22Z,NONE,"I think I found a bug. I tried to sort by middle initial in my salaries set, and many middle initials are null. The next_url gets set by Datasette to: http://localhost:8001/salaries-d3a5631/2017+Maryland+state+salaries?_next=None%2C391&_sort=middle_initial But then `None` is interpreted literally and it tries to find a name with the middle initial ""None"" and ends up skipping ahead to O on page 2.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/pull/209#issuecomment-381441392,https://api.github.com/repos/simonw/datasette/issues/209,381441392,MDEyOklzc3VlQ29tbWVudDM4MTQ0MTM5Mg==,45057,russss,2018-04-15T21:59:15Z,2018-04-15T21:59:15Z,CONTRIBUTOR,"I suspected this would cause some test failures, but I'll wait for opinions before attempting to fix them.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/issues/14#issuecomment-381442233,https://api.github.com/repos/simonw/datasette/issues/14,381442233,MDEyOklzc3VlQ29tbWVudDM4MTQ0MjIzMw==,9599,simonw,2018-04-15T22:13:06Z,2018-04-15T22:13:06Z,OWNER,"I started a thread on Twitter asking people for good examples of Python projects with a strong plugin ecosystem: https://twitter.com/simonw/status/985377670388105216 The most impressive example that came back was pytest - which now has nearly 400 plugins: https://plugincompat.herokuapp.com/ The pytest plugin infrastructure is available as an independent package called pluggy - which appears to offer everything I need for Datasette. 
I'm going to give that a go and see how well it works: https://pluggy.readthedocs.io/en/latest/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381442494,https://api.github.com/repos/simonw/datasette/issues/14,381442494,MDEyOklzc3VlQ29tbWVudDM4MTQ0MjQ5NA==,9599,simonw,2018-04-15T22:17:59Z,2018-04-15T22:17:59Z,OWNER,"Datasette 1.0 will be the release of Datasette that attempts to provide a stable plugin API: https://github.com/simonw/datasette/milestone/7 There's a lot of work to be done before then, but as a starting point I'm going to support two very simple extension mechanisms: * Template system plugins - where the hook gets passed the Jinja environment and can freely register new template tags and filters * SQLite connection plugins - where the hook gets passed a new SQLite connection and can register custom SQLite functions The template system hook will go near here: https://github.com/simonw/datasette/blob/efbb4e83374a2c795e436c72fa79f70da72309b8/datasette/app.py#L1225-L1228 The SQLite connection hook will go near here: https://github.com/simonw/datasette/blob/efbb4e83374a2c795e436c72fa79f70da72309b8/datasette/app.py#L1094-L1098 These two feel simple enough that I'm not worried that I might design an API that I later regret.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381443728,https://api.github.com/repos/simonw/datasette/issues/14,381443728,MDEyOklzc3VlQ29tbWVudDM4MTQ0MzcyOA==,9599,simonw,2018-04-15T22:39:00Z,2018-04-15T22:39:00Z,OWNER,Tox is a good example of a project that uses pluggy in the way I want to use it (function hooks rather than classes): https://github.com/tox-dev/tox/blob/master/tox/hookspecs.py,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381446511,https://api.github.com/repos/simonw/datasette/issues/14,381446511,MDEyOklzc3VlQ29tbWVudDM4MTQ0NjUxMQ==,9599,simonw,2018-04-15T23:25:04Z,2018-04-15T23:25:04Z,OWNER,"Here's a demo of the `convert_units()` SQL function I prototyped in f2720b0c6b7172ebe88 ![2018-04-15 at 4 23 pm](https://user-images.githubusercontent.com/9599/38784633-8c43821e-40c9-11e8-97dd-697755a0f858.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/203#issuecomment-381446554,https://api.github.com/repos/simonw/datasette/issues/203,381446554,MDEyOklzc3VlQ29tbWVudDM4MTQ0NjU1NA==,9599,simonw,2018-04-15T23:25:54Z,2018-04-15T23:26:03Z,OWNER,I built a prototype of the `convert_units()` custom SQL function as a plugin over in https://github.com/simonw/datasette/issues/14#issuecomment-381446511,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, 
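A minimal, self-contained sketch of the pluggy pattern referenced above (function hooks, as in tox's `hookspecs.py`), assuming only `pluggy` is installed. The spec and plugin classes here are illustrative and deliberately simplified; they are not Datasette's actual layout, which uses plain module-level `@hookimpl` functions registered via setuptools entry points.

```python
import sqlite3

import pluggy

hookspec = pluggy.HookspecMarker("datasette")
hookimpl = pluggy.HookimplMarker("datasette")


class Specs:
    @hookspec
    def prepare_connection(self, conn):
        """Called with each newly created SQLite connection."""


class ExamplePlugin:
    @hookimpl
    def prepare_connection(self, conn):
        # Register a trivial custom SQL function on the connection.
        conn.create_function("ones", 0, lambda: 1)


pm = pluggy.PluginManager("datasette")
pm.add_hookspecs(Specs)
pm.register(ExamplePlugin())

conn = sqlite3.connect(":memory:")
pm.hook.prepare_connection(conn=conn)
print(conn.execute("select ones()").fetchone())  # (1,)
```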
https://github.com/simonw/datasette/issues/14#issuecomment-381446906,https://api.github.com/repos/simonw/datasette/issues/14,381446906,MDEyOklzc3VlQ29tbWVudDM4MTQ0NjkwNg==,9599,simonw,2018-04-15T23:31:58Z,2018-04-15T23:34:10Z,OWNER,"Once I've got the plugins mechanism stable and people start releasing plugins it would be useful to have a dedicated Trove classifier on PyPI for Datasette plugins - `Framework :: Datasette` for example. This would help me build a Datasette equivalent of the http://plugincompat.herokuapp.com/ site, which works by scanning PyPI for items with the ``Framework :: Pytest`` classifier: https://github.com/pytest-dev/plugincompat/blob/8bdf1a6fb82807091ece0c68c196103ee8270194/update_index.py#L52-L53 It looks like the mechanism for requesting new PyPI classifiers is to file a ticket against warehouse, like these ones: https://github.com/pypa/warehouse/issues/3570 and https://github.com/pypa/warehouse/issues/2881","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381450394,https://api.github.com/repos/simonw/datasette/issues/14,381450394,MDEyOklzc3VlQ29tbWVudDM4MTQ1MDM5NA==,9599,simonw,2018-04-16T00:27:23Z,2018-04-16T00:27:23Z,OWNER,"I created https://github.com/simonw/datasette-plugin-demos which is now published to PyPI and can be installed with `pip install datasette-plugin-demos` - I've confirmed that if you DO install it my Datasette `plugins` branch picks up the plugins, and `select random_integer(1, 4)` works as it should.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381450591,https://api.github.com/repos/simonw/datasette/issues/14,381450591,MDEyOklzc3VlQ29tbWVudDM4MTQ1MDU5MQ==,9599,simonw,2018-04-16T00:30:22Z,2018-04-16T00:34:42Z,OWNER,"Slight code design problem... 
when I tried installing my branch in a fresh virtual environment I got this error, because `setup.py` now depends on `pluggy` (from importing `__version__`): ``` File ""/private/var/folders/jj/fngnv0810tn2lt_kd3911pdc0000gp/T/pip-req-build-dftqdezt/setup.py"", line 2, in from datasette import __version__ File ""/private/var/folders/jj/fngnv0810tn2lt_kd3911pdc0000gp/T/pip-req-build-dftqdezt/datasette/__init__.py"", line 2, in from .hookspecs import hookimpl # noqa File ""/private/var/folders/jj/fngnv0810tn2lt_kd3911pdc0000gp/T/pip-req-build-dftqdezt/datasette/hookspecs.py"", line 1, in from pluggy import HookimplMarker ModuleNotFoundError: No module named 'pluggy' ``` Looks like I've run into point 6 on https://packaging.python.org/guides/single-sourcing-package-version/ : ![2018-04-15 at 5 34 pm](https://user-images.githubusercontent.com/9599/38785314-403ce86a-40d3-11e8-8542-ba426eddf4ac.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/139#issuecomment-381455054,https://api.github.com/repos/simonw/datasette/issues/139,381455054,MDEyOklzc3VlQ29tbWVudDM4MTQ1NTA1NA==,9599,simonw,2018-04-16T01:24:13Z,2018-04-16T01:24:13Z,OWNER,"I think Vega-Lite is the way to go here: https://vega.github.io/vega-lite/ I've been playing around with it and Datasette with some really positive initial results: https://vega.github.io/editor/#/gist/vega-lite/simonw/89100ce80573d062d70f780d10e5e609/decada131575825875c0a076e418c661c2adb014/vice-shootings-gender-race-by-department.vl.json https://vega.github.io/editor/#/gist/vega-lite/simonw/5f69fbe29380b0d5d95f31a385f49ee4/7087b64df03cf9dba44a5258a606f29182cb8619/trees-san-francisco.vl.json","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275493851,Build a visualization plugin for Vega, https://github.com/simonw/datasette/issues/211#issuecomment-381456434,https://api.github.com/repos/simonw/datasette/issues/211,381456434,MDEyOklzc3VlQ29tbWVudDM4MTQ1NjQzNA==,9599,simonw,2018-04-16T01:36:16Z,2018-04-16T01:37:44Z,OWNER,"The easiest way to implement this in Python 2 would be `execfile(...)` - but that was removed in Python 3. According to https://stackoverflow.com/a/437857/6083 `2to3` replaces that with this, which ensures the filename is associated with the code for debugging purposes: ``` with open(""somefile.py"") as f: code = compile(f.read(), ""somefile.py"", 'exec') exec(code, global_vars, local_vars) ``` Implementing it this way would force this kind of plugin to be self-contained in a single file. I think that's OK: if you want a more complex plugin you can use the standard pluggy-powered setuptools mechanism to build it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/issues/211#issuecomment-381462005,https://api.github.com/repos/simonw/datasette/issues/211,381462005,MDEyOklzc3VlQ29tbWVudDM4MTQ2MjAwNQ==,9599,simonw,2018-04-16T02:23:07Z,2018-04-16T02:23:07Z,OWNER,This needs unit tests. 
I also need to manually test the `datasette package` and `datasette publish` commands.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/issues/211#issuecomment-381478217,https://api.github.com/repos/simonw/datasette/issues/211,381478217,MDEyOklzc3VlQ29tbWVudDM4MTQ3ODIxNw==,9599,simonw,2018-04-16T04:41:38Z,2018-04-16T04:41:38Z,OWNER,"Here's the result of running: datasette publish now fivethirtyeight.db \ --plugins-dir=plugins/ --title=""FiveThirtyEight"" --branch=plugins-dir https://datasette-phjtvzwwzl.now.sh/fivethirtyeight-2628db9?sql=select+convert_units%28100%2C+%27m%27%2C+%27ft%27%29 Where `plugins/pint_plugin.py` contains the following: ``` from datasette import hookimpl import pint ureg = pint.UnitRegistry() @hookimpl def prepare_connection(conn): def convert_units(amount, from_, to_): ""select convert_units(100, 'm', 'ft');"" return (amount * ureg(from_)).to(to_).to_tuple()[0] conn.create_function('convert_units', 3, convert_units) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/issues/211#issuecomment-381478253,https://api.github.com/repos/simonw/datasette/issues/211,381478253,MDEyOklzc3VlQ29tbWVudDM4MTQ3ODI1Mw==,9599,simonw,2018-04-16T04:42:02Z,2018-04-16T04:42:02Z,OWNER,"This worked as well: datasette package fivethirtyeight.db \ --plugins-dir=plugins/ --title=""FiveThirtyEight"" --branch=plugins-dir ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/issues/211#issuecomment-381481990,https://api.github.com/repos/simonw/datasette/issues/211,381481990,MDEyOklzc3VlQ29tbWVudDM4MTQ4MTk5MA==,9599,simonw,2018-04-16T05:14:57Z,2018-04-16T05:14:57Z,OWNER,Added unit tests in 33c6bcadb962457be6b0c7f369826b404e2bcef5,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/issues/211#issuecomment-381482407,https://api.github.com/repos/simonw/datasette/issues/211,381482407,MDEyOklzc3VlQ29tbWVudDM4MTQ4MjQwNw==,9599,simonw,2018-04-16T05:18:29Z,2018-04-16T05:18:29Z,OWNER,"Here's the result of running this: datasette publish heroku fivethirtyeight.db \ --plugins-dir=plugins/ --title=""FiveThirtyEight"" --branch=plugins-dir https://intense-river-24599.herokuapp.com/fivethirtyeight-2628db9?sql=select+convert_units%28100%2C+%27m%27%2C+%27ft%27%29","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/pull/209#issuecomment-381483301,https://api.github.com/repos/simonw/datasette/issues/209,381483301,MDEyOklzc3VlQ29tbWVudDM4MTQ4MzMwMQ==,9599,simonw,2018-04-16T05:25:08Z,2018-04-16T05:25:08Z,OWNER,I think this is a good improvement. 
If you fix the tests I'll merge it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/issues/14#issuecomment-381446392,https://api.github.com/repos/simonw/datasette/issues/14,381446392,MDEyOklzc3VlQ29tbWVudDM4MTQ0NjM5Mg==,9599,simonw,2018-04-15T23:22:40Z,2018-04-16T05:25:57Z,OWNER,"OK, from that prototype in f2720b0c6b7172ebe8820 it looks like pluggy provides a solid path forward. Next steps: - [x] Build a demo plugin that uses setuptools entrypoints to register with the `datasette` plugin manager via pluggy - [x] Figure out a mechanism for registering plugins without first needing to publish them to PyPI. Can I load plugins from a special `plugins/` directory similar to the `--template-dir=templates/` option already supported by Datasette? #211","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/191#issuecomment-381488049,https://api.github.com/repos/simonw/datasette/issues/191,381488049,MDEyOklzc3VlQ29tbWVudDM4MTQ4ODA0OQ==,9599,simonw,2018-04-16T05:58:15Z,2018-04-16T05:58:15Z,OWNER,"I think this is pretty hard. @coleifer has done some work in this direction, including https://github.com/coleifer/pysqlite3 which ports the standalone pysqlite module to Python 3. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/214#issuecomment-381490361,https://api.github.com/repos/simonw/datasette/issues/214,381490361,MDEyOklzc3VlQ29tbWVudDM4MTQ5MDM2MQ==,9599,simonw,2018-04-16T06:13:02Z,2018-04-16T06:13:02Z,OWNER,"Packaging JS and CSS in a pip installable wheel is fiddly but possible. http://peak.telecommunity.com/DevCenter/PythonEggs#accessing-package-resources from pkg_resources import resource_string foo_config = resource_string(__name__, 'foo.conf')","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/issues/214#issuecomment-381491707,https://api.github.com/repos/simonw/datasette/issues/214,381491707,MDEyOklzc3VlQ29tbWVudDM4MTQ5MTcwNw==,9599,simonw,2018-04-16T06:21:23Z,2018-04-16T06:21:23Z,OWNER,This looks like a good example: https://github.com/funkey/nyroglancer/commit/d4438ab42171360b2b8e9020f672846dd70c8d80,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/issues/191#issuecomment-381602005,https://api.github.com/repos/simonw/datasette/issues/191,381602005,MDEyOklzc3VlQ29tbWVudDM4MTYwMjAwNQ==,119974,coleifer,2018-04-16T13:37:32Z,2018-04-16T13:37:32Z,NONE,I don't think it should be too difficult... you can look at what @ghaering did with pysqlite (and similarly what I copied for pysqlite3). You would theoretically take an amalgamation build of Sqlite (all code in a single .c and .h file). 
The `AmalgamationLibSqliteBuilder` class detects the presence of this amalgamated source file and builds a statically-linked pysqlite.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/14#issuecomment-381611738,https://api.github.com/repos/simonw/datasette/issues/14,381611738,MDEyOklzc3VlQ29tbWVudDM4MTYxMTczOA==,9599,simonw,2018-04-16T14:07:30Z,2018-04-16T14:07:30Z,OWNER,I should check if it's possible to have two template registration function plugins in a single plugin module. If it isn't maybe I should use class plugins instead of module plugins.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/214#issuecomment-381612585,https://api.github.com/repos/simonw/datasette/issues/214,381612585,MDEyOklzc3VlQ29tbWVudDM4MTYxMjU4NQ==,9599,simonw,2018-04-16T14:10:16Z,2018-04-16T14:10:16Z,OWNER,`resource_stream` returns a file-like object which may be better for serving from Sanic.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/issues/14#issuecomment-381621338,https://api.github.com/repos/simonw/datasette/issues/14,381621338,MDEyOklzc3VlQ29tbWVudDM4MTYyMTMzOA==,9599,simonw,2018-04-16T14:36:27Z,2018-04-16T14:36:27Z,OWNER,"Annoyingly, the following only results in the last of the two `prepare_connection` hooks being registered: ``` from datasette import hookimpl import pint import random ureg = pint.UnitRegistry() @hookimpl def prepare_connection(conn): def convert_units(amount, from_, to_): ""select convert_units(100, 'm', 'ft');"" return (amount * ureg(from_)).to(to_).to_tuple()[0] conn.create_function('convert_units', 3, convert_units) @hookimpl def prepare_connection(conn): conn.create_function('random_integer', 2, random.randint) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/216#issuecomment-381643173,https://api.github.com/repos/simonw/datasette/issues/216,381643173,MDEyOklzc3VlQ29tbWVudDM4MTY0MzE3Mw==,9599,simonw,2018-04-16T15:21:17Z,2018-04-16T15:21:17Z,OWNER,"Yikes, definitely a bug.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381644355,https://api.github.com/repos/simonw/datasette/issues/216,381644355,MDEyOklzc3VlQ29tbWVudDM4MTY0NDM1NQ==,9599,simonw,2018-04-16T15:24:38Z,2018-04-16T15:24:38Z,OWNER,"So there are two tricky problems to solve here: * I need a way of encoding `null` into that `_next=` that is unambiguous from the string `None` or `null`. This means introducing some kind of escaping mechanism in those strings. I already use URL encoding as part of the construction of those components here, maybe that can help here? * I need to figure out what the SQL should be for the ""next"" set of results if the previous value was null. 
Thankfully we use the primary key as a tie-breaker so this shouldn't be impossible.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381645274,https://api.github.com/repos/simonw/datasette/issues/216,381645274,MDEyOklzc3VlQ29tbWVudDM4MTY0NTI3NA==,9599,simonw,2018-04-16T15:27:16Z,2018-04-16T15:27:16Z,OWNER,"Relevant code: https://github.com/simonw/datasette/blob/904f1c75a3c17671d25c53b91e177c249d14ab3b/datasette/app.py#L828-L832","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381645973,https://api.github.com/repos/simonw/datasette/issues/216,381645973,MDEyOklzc3VlQ29tbWVudDM4MTY0NTk3Mw==,9599,simonw,2018-04-16T15:29:11Z,2018-04-16T15:29:11Z,OWNER,"I could use `$null` as a magic value that means None. Since I'm applying `quote_plus()` to actual values, any legit strings that look like this will be encoded as `%24null`: ``` >>> urllib.parse.quote_plus('$null') '%24null' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381648053,https://api.github.com/repos/simonw/datasette/issues/216,381648053,MDEyOklzc3VlQ29tbWVudDM4MTY0ODA1Mw==,9599,simonw,2018-04-16T15:35:17Z,2018-04-16T15:35:17Z,OWNER,"I think the correct SQL is this: https://datasette-issue-189-demo-3.now.sh/salaries-7859114-7859114?sql=select+rowid%2C+*+from+%5B2017+Maryland+state+salaries%5D%0D%0Awhere+%28middle_initial+is+not+null+or+%28middle_initial+is+null+and+rowid+%3E+%3Ap0%29%29%0D%0Aorder+by+middle_initial+limit+101&p0=391 ``` select rowid, * from [2017 Maryland state salaries] where (middle_initial is not null or (middle_initial is null and rowid > :p0)) order by middle_initial limit 101 ``` Though this will also need to be taken into account for #198 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381649140,https://api.github.com/repos/simonw/datasette/issues/216,381649140,MDEyOklzc3VlQ29tbWVudDM4MTY0OTE0MA==,9599,simonw,2018-04-16T15:38:29Z,2018-04-16T15:38:29Z,OWNER,But what would that SQL look like for `_sort_desc`?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381649437,https://api.github.com/repos/simonw/datasette/issues/216,381649437,MDEyOklzc3VlQ29tbWVudDM4MTY0OTQzNw==,9599,simonw,2018-04-16T15:39:21Z,2018-04-16T15:39:21Z,OWNER,"Here's where that SQL gets constructed at the moment: https://github.com/simonw/datasette/blob/10a34f995c70daa37a8a2aa02c3135a4b023a24c/datasette/app.py#L761-L771","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, 
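A minimal sketch of the `$null` sentinel idea from the comments above, assuming the `_next=` token is built by joining URL-encoded components with commas. The helper names are illustrative rather than Datasette's actual implementation; the point is that a genuine `"$null"` string value stays unambiguous because `quote_plus()` turns it into `%24null`.

```python
import urllib.parse

NULL_SENTINEL = "$null"


def encode_next(components):
    """Encode sort-key components into a _next token, using $null for None."""
    return ",".join(
        NULL_SENTINEL if value is None else urllib.parse.quote_plus(str(value))
        for value in components
    )


def decode_next(token):
    """Decode a _next token, turning the $null sentinel back into None."""
    return [
        None if part == NULL_SENTINEL else urllib.parse.unquote_plus(part)
        for part in token.split(",")
    ]


print(encode_next([None, 391]))      # $null,391
print(decode_next("$null,391"))      # [None, '391']
print(encode_next(["$null", 391]))   # %24null,391
```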
https://github.com/simonw/datasette/pull/209#issuecomment-381738137,https://api.github.com/repos/simonw/datasette/issues/209,381738137,MDEyOklzc3VlQ29tbWVudDM4MTczODEzNw==,45057,russss,2018-04-16T20:27:43Z,2018-04-16T20:27:43Z,CONTRIBUTOR,"Tests now fixed, honest. The failing test on Travis looks like an intermittent sqlite failure which should resolve itself on a retry...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/issues/203#issuecomment-381763651,https://api.github.com/repos/simonw/datasette/issues/203,381763651,MDEyOklzc3VlQ29tbWVudDM4MTc2MzY1MQ==,45057,russss,2018-04-16T21:59:17Z,2018-04-16T21:59:17Z,CONTRIBUTOR,"Ah, I had no idea you could bind python functions into sqlite! I think the primary purpose of this issue has been served now - I'm going to close this and create a new issue for the only bit of this that hasn't been touched yet, which is (optionally) exposing units in the JSON API.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/220#issuecomment-381777108,https://api.github.com/repos/simonw/datasette/issues/220,381777108,MDEyOklzc3VlQ29tbWVudDM4MTc3NzEwOA==,9599,simonw,2018-04-16T23:04:04Z,2018-04-16T23:04:04Z,OWNER,This could also help workaround the current predicament that a single plugin can only define one prepare_connection hook.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314847571,Investigate syntactic sugar for plugins, https://github.com/simonw/datasette/issues/216#issuecomment-381786522,https://api.github.com/repos/simonw/datasette/issues/216,381786522,MDEyOklzc3VlQ29tbWVudDM4MTc4NjUyMg==,9599,simonw,2018-04-16T23:58:45Z,2018-04-16T23:59:13Z,OWNER,"Weird... tests are failing in Travis, despite passing on my local machine. https://travis-ci.org/simonw/datasette/builds/367423706","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381788051,https://api.github.com/repos/simonw/datasette/issues/216,381788051,MDEyOklzc3VlQ29tbWVudDM4MTc4ODA1MQ==,9599,simonw,2018-04-17T00:07:48Z,2018-04-17T00:07:48Z,OWNER,Still failing. 
This is very odd.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381794744,https://api.github.com/repos/simonw/datasette/issues/216,381794744,MDEyOklzc3VlQ29tbWVudDM4MTc5NDc0NA==,9599,simonw,2018-04-17T00:51:41Z,2018-04-17T00:51:41Z,OWNER,I'm reverting this out of master until I can figure out why the tests are failing.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381798786,https://api.github.com/repos/simonw/datasette/issues/216,381798786,MDEyOklzc3VlQ29tbWVudDM4MTc5ODc4Ng==,9599,simonw,2018-04-17T01:18:25Z,2018-04-17T01:18:25Z,OWNER,"Here's the test that's failing: https://github.com/simonw/datasette/blob/59a3aa859c0e782aeda9a515b1b52c358e8458a2/tests/test_api.py#L437-L470 I got Travis to spit out the `fetched` and `expected` variables. `expected` has 201 items in it and is identical to what I get on my local laptop. `fetched` has 250 items in it, so it's clearly different from my local environment. I've managed to replicate the bug in production! I created a test database like this: python tests/fixtures.py sortable.db Then deployed that database like so: datasette publish now sortable.db \ --extra-options=""--page_size=50"" --branch=debug-travis-issue-216 And... if you click ""next"" on this page https://datasette-issue-216-pagination.now.sh/sortable-5679797/sortable?_sort_desc=sortable_with_nulls five times you get back 250 results, when you should only get back 201.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381799267,https://api.github.com/repos/simonw/datasette/issues/216,381799267,MDEyOklzc3VlQ29tbWVudDM4MTc5OTI2Nw==,9599,simonw,2018-04-17T01:21:35Z,2018-04-17T01:21:35Z,OWNER,"The version that I deployed which exhibits the bug is running SQLite `3.8.7.1` - https://datasette-issue-216-pagination.now.sh/sortable-5679797?sql=select+sqlite_version%28%29 The version that I have running locally which does NOT exhibit the bug is running SQLite `3.23.0`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381799408,https://api.github.com/repos/simonw/datasette/issues/216,381799408,MDEyOklzc3VlQ29tbWVudDM4MTc5OTQwOA==,9599,simonw,2018-04-17T01:22:30Z,2018-04-17T01:22:30Z,OWNER,"... 
which is VERY surprising, because `3.23.0` only came out on 2nd April this year: https://www.sqlite.org/changes.html - I have no idea how I came to be running that version on my laptop.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381801302,https://api.github.com/repos/simonw/datasette/issues/216,381801302,MDEyOklzc3VlQ29tbWVudDM4MTgwMTMwMg==,9599,simonw,2018-04-17T01:33:43Z,2018-04-17T01:33:43Z,OWNER,"This is the SQL that returns differing results in production and on my laptop: https://datasette-issue-216-pagination.now.sh/sortable-5679797?sql=select+%2A+from+sortable+where+%28sortable_with_nulls+is+null+and+%28%28pk1+%3E+%3Ap0%29%0A++or%0A%28pk1+%3D+%3Ap0+and+pk2+%3E+%3Ap1%29%29%29+order+by+sortable_with_nulls+desc+limit+51&p0=b&p1=t ``` select * from sortable where (sortable_with_nulls is null and ((pk1 > :p0) or (pk1 = :p0 and pk2 > :p1))) order by sortable_with_nulls desc limit 51 ``` I think that `order by sortable_with_nulls desc` bit is at fault - the primary keys should be included in that order by as well. Sure enough, changing the query to this one returns the same results across both environments: ``` select * from sortable where (sortable_with_nulls is null and ((pk1 > :p0) or (pk1 = :p0 and pk2 > :p1))) order by sortable_with_nulls desc, pk1, pk2 limit 51 ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381803157,https://api.github.com/repos/simonw/datasette/issues/216,381803157,MDEyOklzc3VlQ29tbWVudDM4MTgwMzE1Nw==,9599,simonw,2018-04-17T01:45:24Z,2018-04-17T01:45:24Z,OWNER,Fixed!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/14#issuecomment-381622793,https://api.github.com/repos/simonw/datasette/issues/14,381622793,MDEyOklzc3VlQ29tbWVudDM4MTYyMjc5Mw==,9599,simonw,2018-04-16T14:40:39Z,2018-04-17T01:47:15Z,OWNER,"I think that's OK. The two plugins I've implemented so far (`prepare_connection` and `prepare_jinja2_environment`) both make sense if they can only be defined once-per-plugin. For the moment I'll assume I can define future hooks to work well with the same limitation. 
The syntactic sugar idea in #220 can help here too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381809998,https://api.github.com/repos/simonw/datasette/issues/14,381809998,MDEyOklzc3VlQ29tbWVudDM4MTgwOTk5OA==,9599,simonw,2018-04-17T02:23:39Z,2018-04-17T02:23:39Z,OWNER,I just shipped Datasette 0.19 with where I'm at so far: https://github.com/simonw/datasette/releases/tag/0.19,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/pull/209#issuecomment-381905593,https://api.github.com/repos/simonw/datasette/issues/209,381905593,MDEyOklzc3VlQ29tbWVudDM4MTkwNTU5Mw==,45057,russss,2018-04-17T08:50:28Z,2018-04-17T08:50:28Z,CONTRIBUTOR,"I've added another commit which puts classes a class on each `` by default with its column name, and I've also made the PK column bold. Unfortunately the tests are still failing on 3.6, which is weird. I can't reproduce locally...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/issues/214#issuecomment-382038613,https://api.github.com/repos/simonw/datasette/issues/214,382038613,MDEyOklzc3VlQ29tbWVudDM4MjAzODYxMw==,9599,simonw,2018-04-17T15:38:23Z,2018-04-17T15:38:23Z,OWNER,"I figured out the recipe for bundling static assets in a plugin: https://github.com/simonw/datasette-plugin-demos/commit/26c5548f4ab7c6cc6d398df17767950be50d0edf (and then `python3 setup.py bdist_wheel`) Having done that, I ran `pip install ../datasette-plugin-demos/dist/datasette_plugin_demos-0.2-py3-none-any.whl` from my Datasette virtual environment and then did the following: ``` >>> import pkg_resources >>> pkg_resources.resource_stream( ... 'datasette_plugin_demos', 'static/plugin.js' ... ).read() b""alert('hello');\n"" >>> pkg_resources.resource_filename( ... 'datasette_plugin_demos', 'static/plugin.js' ... ) '..../venv/lib/python3.6/site-packages/datasette_plugin_demos/static/plugin.js' >>> pkg_resources.resource_string( ... 'datasette_plugin_demos', 'static/plugin.js' ... 
) b""alert('hello');\n"" ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/issues/214#issuecomment-382069980,https://api.github.com/repos/simonw/datasette/issues/214,382069980,MDEyOklzc3VlQ29tbWVudDM4MjA2OTk4MA==,9599,simonw,2018-04-17T17:08:28Z,2018-04-17T17:08:28Z,OWNER,"Even if we automatically serve ALL `static/` content from installed plugins, we'll still need them to register which files need to be linked to from `extra_css_urls` and `extra_js_urls`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/pull/209#issuecomment-382205189,https://api.github.com/repos/simonw/datasette/issues/209,382205189,MDEyOklzc3VlQ29tbWVudDM4MjIwNTE4OQ==,9599,simonw,2018-04-18T00:42:44Z,2018-04-18T00:43:02Z,OWNER,"I managed to get a better error message out of that test. The server is returning this (but only on Python 3.6, not on Python 3.5 - and only in Travis, not in my local environment): ```{'error': 'interrupted', 'ok': False, 'status': 400, 'title': 'Invalid SQL'}``` https://travis-ci.org/simonw/datasette/jobs/367929134","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/pull/209#issuecomment-382210976,https://api.github.com/repos/simonw/datasette/issues/209,382210976,MDEyOklzc3VlQ29tbWVudDM4MjIxMDk3Ng==,9599,simonw,2018-04-18T01:12:26Z,2018-04-18T01:12:26Z,OWNER,"OK, aaf59db570ab7688af72c08bb5bc1edc145e3e07 should mean that the tests pass when I merge that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/issues/214#issuecomment-382048582,https://api.github.com/repos/simonw/datasette/issues/214,382048582,MDEyOklzc3VlQ29tbWVudDM4MjA0ODU4Mg==,9599,simonw,2018-04-17T16:04:42Z,2018-04-18T02:24:46Z,OWNER,"One possible option: let plugins bundle their own `static/` directory and then register themselves with Datasette, then have `/-/static-plugins/name-of-plugin/...` serve files from that directory.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/issues/14#issuecomment-382256729,https://api.github.com/repos/simonw/datasette/issues/14,382256729,MDEyOklzc3VlQ29tbWVudDM4MjI1NjcyOQ==,9599,simonw,2018-04-18T04:29:29Z,2018-04-18T04:30:14Z,OWNER,I added a mechanism for plugins to serve static files and define custom CSS and JS URLs in #214 - see new documentation on http://datasette.readthedocs.io/en/latest/plugins.html#static-assets and http://datasette.readthedocs.io/en/latest/plugins.html#extra-css-urls,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, 
https://github.com/simonw/datasette/issues/223#issuecomment-382408128,https://api.github.com/repos/simonw/datasette/issues/223,382408128,MDEyOklzc3VlQ29tbWVudDM4MjQwODEyOA==,9599,simonw,2018-04-18T14:33:09Z,2018-04-18T14:33:09Z,OWNER,"Demo: datasette publish now sortable.db --install datasette-plugin-demos --branch=master Produced this deployment, with both the `random_integer()` function and the static file from https://github.com/simonw/datasette-plugin-demos/tree/0.2 https://datasette-issue-223.now.sh/-/static-plugins/datasette_plugin_demos/plugin.js https://datasette-issue-223.now.sh/sortable-4bbaa6f?sql=select+random_integer%280%2C+10%29 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315327860,datasette publish --install=name-of-plugin, https://github.com/simonw/datasette/issues/223#issuecomment-382409989,https://api.github.com/repos/simonw/datasette/issues/223,382409989,MDEyOklzc3VlQ29tbWVudDM4MjQwOTk4OQ==,9599,simonw,2018-04-18T14:38:08Z,2018-04-18T14:38:08Z,OWNER,"Tested on Heroku as well. datasette publish heroku sortable.db --install datasette-plugin-demos --branch=master https://morning-tor-45944.herokuapp.com/-/static-plugins/datasette_plugin_demos/plugin.js https://morning-tor-45944.herokuapp.com/sortable-4bbaa6f?sql=select+random_integer%280%2C+10%29","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315327860,datasette publish --install=name-of-plugin, https://github.com/simonw/datasette/issues/223#issuecomment-382413121,https://api.github.com/repos/simonw/datasette/issues/223,382413121,MDEyOklzc3VlQ29tbWVudDM4MjQxMzEyMQ==,9599,simonw,2018-04-18T14:47:18Z,2018-04-18T14:47:18Z,OWNER,"And tested `datasette package` - this time exercising the ability to pass more than one `--install` option: ``` $ datasette package sortable.db --branch=master --install requests --install datasette-plugin-demos Sending build context to Docker daemon 125.4kB Step 1/7 : FROM python:3 ---> 79e1dc9af1c1 Step 2/7 : COPY . /app ---> 6e8e40bce378 Step 3/7 : WORKDIR /app Removing intermediate container 7cdc9ab20d09 ---> f42258c2211f Step 4/7 : RUN pip install https://github.com/simonw/datasette/archive/master.zip requests datasette-plugin-demos ---> Running in a0f17cec08a4 Collecting ... Removing intermediate container a0f17cec08a4 ---> beea84e73271 Step 5/7 : RUN datasette inspect sortable.db --inspect-file inspect-data.json ---> Running in 4daa28792348 Removing intermediate container 4daa28792348 ---> c60312d21b99 Step 6/7 : EXPOSE 8001 ---> Running in fa728468482d Removing intermediate container fa728468482d ---> 8f219a61fddc Step 7/7 : CMD [""datasette"", ""serve"", ""--host"", ""0.0.0.0"", ""sortable.db"", ""--cors"", ""--port"", ""8001"", ""--inspect-file"", ""inspect-data.json""] ---> Running in cd4eaeb2ce9e Removing intermediate container cd4eaeb2ce9e ---> 066e257c7c44 Successfully built 066e257c7c44 (venv) datasette $ docker run -p 8081:8001 066e257c7c44 Serve! 
files=('sortable.db',) on port 8001 [2018-04-18 14:40:18 +0000] [1] [INFO] Goin' Fast @ http://0.0.0.0:8001 [2018-04-18 14:40:18 +0000] [1] [INFO] Starting worker [1] [2018-04-18 14:46:01 +0000] - (sanic.access)[INFO][1:7]: GET http://localhost:8081/-/static-plugins/datasette_plugin_demos/plugin.js 200 16 ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315327860,datasette publish --install=name-of-plugin, https://github.com/simonw/datasette/issues/224#issuecomment-382616527,https://api.github.com/repos/simonw/datasette/issues/224,382616527,MDEyOklzc3VlQ29tbWVudDM4MjYxNjUyNw==,9599,simonw,2018-04-19T05:40:28Z,2018-04-19T05:40:28Z,OWNER,"No need to use `PackageLoader` after all, we can use the same mechanism we used for the static path: https://github.com/simonw/datasette/blob/b55809a1e20986bb2e638b698815a77902e8708d/datasette/utils.py#L694-L695","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315517578,Ability for plugins to bundle templates, https://github.com/simonw/datasette/issues/227#issuecomment-382808266,https://api.github.com/repos/simonw/datasette/issues/227,382808266,MDEyOklzc3VlQ29tbWVudDM4MjgwODI2Ng==,9599,simonw,2018-04-19T16:59:23Z,2018-04-19T16:59:23Z,OWNER,"Maybe this should have a second argument indicating which codepath was being handled. That way plugins could say ""only inject this extra context variable on the row page"".","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/228#issuecomment-382924910,https://api.github.com/repos/simonw/datasette/issues/228,382924910,MDEyOklzc3VlQ29tbWVudDM4MjkyNDkxMA==,9599,simonw,2018-04-20T00:35:48Z,2018-04-20T00:35:48Z,OWNER,"Hiding tables with the `idx_` prefix should be good enough here, since false positives aren't very harmful.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316031566,"If spatialite detected, mark idx_XXX_Geometry tables as hidden", https://github.com/simonw/datasette/issues/227#issuecomment-382958693,https://api.github.com/repos/simonw/datasette/issues/227,382958693,MDEyOklzc3VlQ29tbWVudDM4Mjk1ODY5Mw==,9599,simonw,2018-04-20T03:15:52Z,2018-04-20T03:15:52Z,OWNER,"A better way to do this would be with many different plugin hooks, one for each view.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/227#issuecomment-382959857,https://api.github.com/repos/simonw/datasette/issues/227,382959857,MDEyOklzc3VlQ29tbWVudDM4Mjk1OTg1Nw==,9599,simonw,2018-04-20T03:21:43Z,2018-04-20T03:21:43Z,OWNER,"Plus a generic prepare_context() hook called in the common render method. 
prepare_context_table(), prepare_context_row() etc Arguments are context, request, self (hence can access self.ds) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/227#issuecomment-382964794,https://api.github.com/repos/simonw/datasette/issues/227,382964794,MDEyOklzc3VlQ29tbWVudDM4Mjk2NDc5NA==,9599,simonw,2018-04-20T03:45:18Z,2018-04-20T03:45:18Z,OWNER,"What if the context needs to make await calls? One possible option: plugins can either manipulate the context in place OR they can return an awaitable. If they do that, the caller will await it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/227#issuecomment-382966604,https://api.github.com/repos/simonw/datasette/issues/227,382966604,MDEyOklzc3VlQ29tbWVudDM4Mjk2NjYwNA==,9599,simonw,2018-04-20T03:54:56Z,2018-04-20T03:54:56Z,OWNER,Should this differentiate between preparing the data to be sent back as JSON and preparing the context for the template?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/227#issuecomment-382967238,https://api.github.com/repos/simonw/datasette/issues/227,382967238,MDEyOklzc3VlQ29tbWVudDM4Mjk2NzIzOA==,9599,simonw,2018-04-20T03:58:09Z,2018-04-20T03:58:09Z,OWNER,Maybe prepare_table_data() vs prepare_table_context(),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/230#issuecomment-383109984,https://api.github.com/repos/simonw/datasette/issues/230,383109984,MDEyOklzc3VlQ29tbWVudDM4MzEwOTk4NA==,9599,simonw,2018-04-20T14:15:39Z,2018-04-20T14:15:39Z,OWNER,Refs #229,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316128955,Setting page size AND max returned rows to 1000 doesn't seem to work, https://github.com/simonw/datasette/issues/14#issuecomment-383139889,https://api.github.com/repos/simonw/datasette/issues/14,383139889,MDEyOklzc3VlQ29tbWVudDM4MzEzOTg4OQ==,9599,simonw,2018-04-20T15:51:47Z,2018-04-20T15:51:47Z,OWNER,"I released everything we have so far in [Datasette 0.20](https://github.com/simonw/datasette/releases/tag/0.20) and built and released an example plugin, [datasette-cluster-map](https://pypi.org/project/datasette-cluster-map/). 
Here's my blog entry about it: https://simonwillison.net/2018/Apr/20/datasette-plugins/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-383140111,https://api.github.com/repos/simonw/datasette/issues/14,383140111,MDEyOklzc3VlQ29tbWVudDM4MzE0MDExMQ==,9599,simonw,2018-04-20T15:52:33Z,2018-04-20T15:52:33Z,OWNER,Here's a link demonstrating my new plugin: https://datasette-cluster-map-demo.now.sh/polar-bears-455fe3a/USGS_WC_eartags_output_files_2009-2011-Status,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/pull/232#issuecomment-383252624,https://api.github.com/repos/simonw/datasette/issues/232,383252624,MDEyOklzc3VlQ29tbWVudDM4MzI1MjYyNA==,9599,simonw,2018-04-21T00:19:00Z,2018-04-21T00:19:00Z,OWNER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316365426,Fix a typo, https://github.com/simonw/datasette/issues/234#issuecomment-383398182,https://api.github.com/repos/simonw/datasette/issues/234,383398182,MDEyOklzc3VlQ29tbWVudDM4MzM5ODE4Mg==,9599,simonw,2018-04-22T17:31:12Z,2018-04-22T17:31:12Z,OWNER,"```{ ""databases"": { ""database1"": { ""tables"": { ""example_table"": { ""label_column"": ""name"" } } } } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316526433,label_column option in metadata.json, https://github.com/simonw/datasette/issues/234#issuecomment-383399762,https://api.github.com/repos/simonw/datasette/issues/234,383399762,MDEyOklzc3VlQ29tbWVudDM4MzM5OTc2Mg==,9599,simonw,2018-04-22T17:54:39Z,2018-04-22T17:54:39Z,OWNER,Docs here: http://datasette.readthedocs.io/en/latest/metadata.html#specifying-the-label-column-for-a-table,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316526433,label_column option in metadata.json, https://github.com/simonw/datasette/issues/234#issuecomment-383410146,https://api.github.com/repos/simonw/datasette/issues/234,383410146,MDEyOklzc3VlQ29tbWVudDM4MzQxMDE0Ng==,9599,simonw,2018-04-22T20:32:30Z,2018-04-22T20:47:02Z,OWNER,"I built this wrong: my implementation is looking for the `label_column` on the table-being-displayed, but it should be looking for it on the table-the-foreign-key-links-to.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316526433,label_column option in metadata.json, https://github.com/simonw/datasette/issues/231#issuecomment-383315348,https://api.github.com/repos/simonw/datasette/issues/231,383315348,MDEyOklzc3VlQ29tbWVudDM4MzMxNTM0OA==,9599,simonw,2018-04-21T17:37:50Z,2018-04-22T23:06:04Z,OWNER,"I could also have an `""autodetect"": false` option for that plugin to turn off autodetecting entirely. 
Would be useful if the plugin didn't append its JavaScript in pages that it wasn't used for - that might require making the `extra_js_urls()` hook optionally aware of the columns and table and metadata.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316323336,metadata.json support for plugin configuration options, https://github.com/simonw/datasette/issues/235#issuecomment-383727973,https://api.github.com/repos/simonw/datasette/issues/235,383727973,MDEyOklzc3VlQ29tbWVudDM4MzcyNzk3Mw==,9599,simonw,2018-04-23T21:23:59Z,2018-04-23T21:23:59Z,OWNER,"There might also be something clever we can do here with PRAGMA statements: https://stackoverflow.com/questions/14146881/limit-the-maximum-amount-of-memory-sqlite3-uses And https://www.sqlite.org/pragma.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316621102,Add limit on the size in KB of data returned from a single query, https://github.com/simonw/datasette/issues/235#issuecomment-383764533,https://api.github.com/repos/simonw/datasette/issues/235,383764533,MDEyOklzc3VlQ29tbWVudDM4Mzc2NDUzMw==,9599,simonw,2018-04-24T00:30:02Z,2018-04-24T00:30:02Z,OWNER,The `resource` module in he standard library has the ability to set limits on memory usage for the current process: https://pymotw.com/2/resource/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316621102,Add limit on the size in KB of data returned from a single query, https://github.com/simonw/datasette/issues/238#issuecomment-384362028,https://api.github.com/repos/simonw/datasette/issues/238,384362028,MDEyOklzc3VlQ29tbWVudDM4NDM2MjAyOA==,9599,simonw,2018-04-25T17:07:11Z,2018-04-25T17:07:11Z,OWNER,"On further thought: this is actually only an issue for immutable deployments to platforms like Zeit Now and Heroku. As such, adding it to `datasette serve` feels clumsy. 
Maybe `datasette publish` should instead gain the ability to optionally install an extra mechanism that periodically pulls a fresh copy of `metadata.json` from a URL.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317714268,External metadata.json, https://github.com/simonw/datasette/issues/239#issuecomment-384500327,https://api.github.com/repos/simonw/datasette/issues/239,384500327,MDEyOklzc3VlQ29tbWVudDM4NDUwMDMyNw==,9599,simonw,2018-04-26T03:18:12Z,2018-04-26T03:18:20Z,OWNER,"``` { ""databases"": { ""database1"": { ""tables"": { ""example_table"": { ""hidden"": true } } } } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317760361,Support for hidden tables in metadata.json, https://github.com/simonw/datasette/issues/239#issuecomment-384503873,https://api.github.com/repos/simonw/datasette/issues/239,384503873,MDEyOklzc3VlQ29tbWVudDM4NDUwMzg3Mw==,9599,simonw,2018-04-26T03:45:11Z,2018-04-26T03:45:11Z,OWNER,Documentation: http://datasette.readthedocs.io/en/latest/metadata.html#hiding-tables,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317760361,Support for hidden tables in metadata.json, https://github.com/simonw/datasette/issues/229#issuecomment-384512192,https://api.github.com/repos/simonw/datasette/issues/229,384512192,MDEyOklzc3VlQ29tbWVudDM4NDUxMjE5Mg==,9599,simonw,2018-04-26T04:49:46Z,2018-04-26T04:49:46Z,OWNER,Documentation: http://datasette.readthedocs.io/en/latest/json_api.html#special-table-arguments,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316123256,Table view should support ?_size=400 parameter, https://github.com/simonw/datasette/issues/79#issuecomment-384675792,https://api.github.com/repos/simonw/datasette/issues/79,384675792,MDEyOklzc3VlQ29tbWVudDM4NDY3NTc5Mg==,9599,simonw,2018-04-26T15:08:13Z,2018-04-26T15:08:13Z,OWNER,"Docs now live at http://datasette.readthedocs.io/ I still need to document a few more parts of the API before closing this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273569068,Add more detailed API documentation to the README, https://github.com/simonw/datasette/issues/44#issuecomment-384676488,https://api.github.com/repos/simonw/datasette/issues/44,384676488,MDEyOklzc3VlQ29tbWVudDM4NDY3NjQ4OA==,9599,simonw,2018-04-26T15:09:57Z,2018-04-26T15:09:57Z,OWNER,Remaining work for this is tracked in #150,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/125#issuecomment-384678319,https://api.github.com/repos/simonw/datasette/issues/125,384678319,MDEyOklzc3VlQ29tbWVudDM4NDY3ODMxOQ==,9599,simonw,2018-04-26T15:14:31Z,2018-04-26T15:14:31Z,OWNER,"I shipped this last week as the first plugin: https://simonwillison.net/2018/Apr/20/datasette-plugins/ Demo: https://datasette-cluster-map-demo.datasettes.com/polar-bears-455fe3a/USGS_WC_eartags_output_files_2009-2011-Status Plugin: https://github.com/simonw/datasette-cluster-map","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 
0, ""eyes"": 0}",275135393,Plot rows on a map with Leaflet and Leaflet.markercluster, https://github.com/simonw/datasette/issues/244#issuecomment-386309928,https://api.github.com/repos/simonw/datasette/issues/244,386309928,MDEyOklzc3VlQ29tbWVudDM4NjMwOTkyOA==,9599,simonw,2018-05-03T14:13:49Z,2018-05-03T14:13:49Z,OWNER,Demo: https://datasette-versions-and-shape-demo.now.sh/-/versions,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",318738000,/-/versions page, https://github.com/simonw/datasette/issues/245#issuecomment-386310149,https://api.github.com/repos/simonw/datasette/issues/245,386310149,MDEyOklzc3VlQ29tbWVudDM4NjMxMDE0OQ==,9599,simonw,2018-05-03T14:14:33Z,2018-05-03T14:14:33Z,OWNER,"Demos: * https://datasette-versions-and-shape-demo.now.sh/sf-trees-02c8ef1/qSpecies.json?_shape=array * https://datasette-versions-and-shape-demo.now.sh/sf-trees-02c8ef1/qSpecies.json?_shape=object * https://datasette-versions-and-shape-demo.now.sh/sf-trees-02c8ef1/qSpecies.json?_shape=arrays * https://datasette-versions-and-shape-demo.now.sh/sf-trees-02c8ef1/qSpecies.json?_shape=objects","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",319358200,?_shape=array option, https://github.com/simonw/datasette/issues/248#issuecomment-386357645,https://api.github.com/repos/simonw/datasette/issues/248,386357645,MDEyOklzc3VlQ29tbWVudDM4NjM1NzY0NQ==,9599,simonw,2018-05-03T16:36:59Z,2018-05-03T16:36:59Z,OWNER,"Even better: use `plugin_manager.list_plugin_distinfo()` from pluggy to get back a list of tuples, the second item in each tuple is a `pkg_resources.DistInfoDistribution` with a `.version` attribute.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",319954545,/-/plugins should show version of each installed plugin, https://github.com/simonw/datasette/issues/248#issuecomment-386692333,https://api.github.com/repos/simonw/datasette/issues/248,386692333,MDEyOklzc3VlQ29tbWVudDM4NjY5MjMzMw==,9599,simonw,2018-05-04T18:25:40Z,2018-05-04T18:25:40Z,OWNER,Demo: https://datasette-plugins-and-max-size-demo.now.sh/-/plugins,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",319954545,/-/plugins should show version of each installed plugin, https://github.com/simonw/datasette/issues/249#issuecomment-386692534,https://api.github.com/repos/simonw/datasette/issues/249,386692534,MDEyOklzc3VlQ29tbWVudDM4NjY5MjUzNA==,9599,simonw,2018-05-04T18:26:30Z,2018-05-04T18:26:30Z,OWNER,Demo: https://datasette-plugins-and-max-size-demo.now.sh/sf-trees/Street_Tree_List.json?_size=max,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",320090329,?_size=max argument , https://github.com/simonw/datasette/issues/237#issuecomment-386840307,https://api.github.com/repos/simonw/datasette/issues/237,386840307,MDEyOklzc3VlQ29tbWVudDM4Njg0MDMwNw==,9599,simonw,2018-05-05T22:45:45Z,2018-05-05T22:45:45Z,OWNER,Documented here: http://datasette.readthedocs.io/en/latest/json_api.html#special-table-arguments,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317475156,Support for ?_search_colname=blah searches, 
https://github.com/simonw/datasette/issues/237#issuecomment-386840806,https://api.github.com/repos/simonw/datasette/issues/237,386840806,MDEyOklzc3VlQ29tbWVudDM4Njg0MDgwNg==,9599,simonw,2018-05-05T22:56:42Z,2018-05-05T22:56:42Z,OWNER,"Demo: datasette publish now ../datasettes/san-francisco/sf-film-locations.db --branch=master --name datasette-column-search-demo https://datasette-column-search-demo.now.sh/sf-film-locations/Film_Locations_in_San_Francisco?_search_Locations=justin","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317475156,Support for ?_search_colname=blah searches, https://github.com/simonw/datasette/issues/251#issuecomment-386879509,https://api.github.com/repos/simonw/datasette/issues/251,386879509,MDEyOklzc3VlQ29tbWVudDM4Njg3OTUwOQ==,9599,simonw,2018-05-06T13:29:26Z,2018-05-06T13:29:26Z,OWNER,"We can solve this using the `sqlite_timelimit(conn, 20)` helper, which can tell SQLite to give up after 20ms. We can wrap that around the following SQL: select distinct COLUMN from TABLE limit 21; Then we look at the number of rows returned. If it's 21 or more we know that this table had more than 21 distinct values, so we'll treat it as ""unlimited"". Likewise, if the SQL times out before 20ms is up we will skip this introspection.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",320592643,"Explore ""distinct values for column"" in inspect()", https://github.com/simonw/datasette/issues/251#issuecomment-386879840,https://api.github.com/repos/simonw/datasette/issues/251,386879840,MDEyOklzc3VlQ29tbWVudDM4Njg3OTg0MA==,9599,simonw,2018-05-06T13:34:24Z,2018-05-06T13:34:24Z,OWNER,"Here's a quick demo of that exploration: https://datasette-distinct-column-values.now.sh/-/inspect Example output: ``` { ""antiquities-act/actions_under_antiquities_act"": { ""columns"": [ ""current_name"", ""states"", ""original_name"", ""current_agency"", ""action"", ""date"", ""year"", ""pres_or_congress"", ""acres_affected"" ], ""count"": 344, ""distinct_values_by_column"": { ""acres_affected"": null, ""action"": null, ""current_agency"": [ ""NPS"", ""State of Montana"", ""BLM"", ""State of Arizona"", ""USFS"", ""State of North Dakota"", ""NPS, BLM"", ""State of South Carolina"", ""State of New York"", ""FWS"", ""FWS, NOAA"", ""NPS, FWS"", ""NOAA"", ""BLM, USFS"", ""NOAA, FWS"" ], ""current_name"": null, ""date"": null, ""original_name"": null, ""pres_or_congress"": null, ""states"": null, ""year"": null }, ""foreign_keys"": { ""incoming"": [], ""outgoing"": [] }, ""fts_table"": null, ""hidden"": false, ""label_column"": null, ""name"": ""antiquities-act/actions_under_antiquities_act"", ""primary_keys"": [] } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",320592643,"Explore ""distinct values for column"" in inspect()", https://github.com/simonw/datasette/issues/251#issuecomment-386879878,https://api.github.com/repos/simonw/datasette/issues/251,386879878,MDEyOklzc3VlQ29tbWVudDM4Njg3OTg3OA==,9599,simonw,2018-05-06T13:34:57Z,2018-05-06T13:34:57Z,OWNER,If I'm going to expand column introspection in this way it would be useful to also capture column type information.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",320592643,"Explore ""distinct values for column"" in 
inspect()", https://github.com/simonw/datasette/issues/254#issuecomment-388367027,https://api.github.com/repos/simonw/datasette/issues/254,388367027,MDEyOklzc3VlQ29tbWVudDM4ODM2NzAyNw==,247131,philroche,2018-05-11T13:41:46Z,2018-05-11T13:41:46Z,NONE,"An example deployment @ https://datasette-zkcvlwdrhl.now.sh/simplestreams-270f20c/cloudimage?content_id__exact=com.ubuntu.cloud%3Areleased%3Adownload It is not causing errors, more of an inconvenience. I have worked around it using a `like` query instead. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322283067,Escaping named parameters in canned queries, https://github.com/simonw/datasette/issues/254#issuecomment-388497467,https://api.github.com/repos/simonw/datasette/issues/254,388497467,MDEyOklzc3VlQ29tbWVudDM4ODQ5NzQ2Nw==,9599,simonw,2018-05-11T22:06:00Z,2018-05-11T22:06:34Z,OWNER,"Got it, this seems to trigger the problem: https://datasette-zkcvlwdrhl.now.sh/simplestreams-270f20c?sql=select+*+from+cloudimage+where+%22content_id%22+%3D+%22com.ubuntu.cloud%3Areleased%3Adownload%22+order+by+id+limit+10","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322283067,Escaping named parameters in canned queries, https://github.com/simonw/datasette/issues/254#issuecomment-388360255,https://api.github.com/repos/simonw/datasette/issues/254,388360255,MDEyOklzc3VlQ29tbWVudDM4ODM2MDI1NQ==,9599,simonw,2018-05-11T13:16:09Z,2018-05-11T22:45:31Z,OWNER,"Do you have an example I can look at? I think I have a possible route for fixing this, but it's pretty tricky (it involves adding a full SQL statement parser, but that's needed for some other potential improvements as well). In the meantime, is this causing actual errors for you or is it more of an inconvenience (form fields being displayed that don't actually do anything)? 
Another potential solution here could be to allow canned queries to optionally declare their parameters in metadata.json","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322283067,Escaping named parameters in canned queries, https://github.com/simonw/datasette/issues/255#issuecomment-388525357,https://api.github.com/repos/simonw/datasette/issues/255,388525357,MDEyOklzc3VlQ29tbWVudDM4ODUyNTM1Nw==,9599,simonw,2018-05-12T03:01:14Z,2018-05-12T03:01:14Z,OWNER,Facet counts will be generated by extra SQL queries with their own aggressive time limit.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/253#issuecomment-388550742,https://api.github.com/repos/simonw/datasette/issues/253,388550742,MDEyOklzc3VlQ29tbWVudDM4ODU1MDc0Mg==,9599,simonw,2018-05-12T12:09:02Z,2018-05-12T12:09:02Z,OWNER,http://datasette.readthedocs.io/en/latest/full_text_search.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",321631020,Documentation explaining how to use SQLite FTS with Datasette, https://github.com/simonw/datasette/issues/255#issuecomment-388587855,https://api.github.com/repos/simonw/datasette/issues/255,388587855,MDEyOklzc3VlQ29tbWVudDM4ODU4Nzg1NQ==,9599,simonw,2018-05-12T22:30:23Z,2018-05-12T22:30:23Z,OWNER,Adding some TODOs to the original description (so they show up as a todo progress bar),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388588011,https://api.github.com/repos/simonw/datasette/issues/255,388588011,MDEyOklzc3VlQ29tbWVudDM4ODU4ODAxMQ==,9599,simonw,2018-05-12T22:33:39Z,2018-05-12T22:33:39Z,OWNER,Initial documentation: http://datasette.readthedocs.io/en/latest/facets.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388589072,https://api.github.com/repos/simonw/datasette/issues/255,388589072,MDEyOklzc3VlQ29tbWVudDM4ODU4OTA3Mg==,9599,simonw,2018-05-12T22:59:07Z,2018-05-12T22:59:07Z,OWNER,"I need to decide how to display these. 
They currently look like this: https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/congress-age%2Fcongress-terms?_facet=chamber&_facet=state&_facet=party&_facet=incumbent&state=MO ![2018-05-12 at 7 58 pm](https://user-images.githubusercontent.com/9599/39962230-e7bf9e10-561e-11e8-80a7-0941b8991318.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388588998,https://api.github.com/repos/simonw/datasette/issues/255,388588998,MDEyOklzc3VlQ29tbWVudDM4ODU4ODk5OA==,9599,simonw,2018-05-12T22:57:30Z,2018-05-12T23:00:24Z,OWNER,"A few demos: * https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/college-majors%2Fall-ages?_facet=Major_category * https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/congress-age%2Fcongress-terms?_facet=chamber&_facet=state&_facet=party&_facet=incumbent * https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/bechdel%2Fmovies?_facet=binary&_facet=test","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/pull/257#issuecomment-388625703,https://api.github.com/repos/simonw/datasette/issues/257,388625703,MDEyOklzc3VlQ29tbWVudDM4ODYyNTcwMw==,9599,simonw,2018-05-13T13:10:09Z,2018-05-13T13:10:09Z,OWNER,"I'm still seeing intermittent Python 3.5 failures due to dictionary ordering differences. https://travis-ci.org/simonw/datasette/jobs/378356802 ``` > assert expected_facet_results == facet_results E AssertionError: assert {'city': [{'c...alue': 'MI'}]} == {'city': [{'co...alue': 'MI'}]} E Omitting 1 identical items, use -vv to show E Differing items: E {'city': [{'count': 4, 'toggle_url': '_facet=state&_facet=city&state=MI&city=Detroit', 'value': 'Detroit'}]} != {'city': [{'count': 4, 'toggle_url': 'state=MI&_facet=state&_facet=city&city=Detroit', 'value': 'Detroit'}]} E Use -v to get the full diff ``` To solve these cleanly I need to be able to run Python 3.5 on my local laptop rather than relying on Travis every time.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322591993,Refactor views, https://github.com/simonw/datasette/pull/257#issuecomment-388626721,https://api.github.com/repos/simonw/datasette/issues/257,388626721,MDEyOklzc3VlQ29tbWVudDM4ODYyNjcyMQ==,9599,simonw,2018-05-13T13:27:04Z,2018-05-13T13:27:04Z,OWNER,"I managed to get Python 3.5.0 running on my laptop using [pyenv](https://github.com/pyenv/pyenv). 
Here's the incantation I used: ``` # Install pyenv using homebrew (turns out I already had it) brew install pyenv # Check which versions of Python I have installed pyenv versions # Install Python 3.5.0 pyenv install 3.5.0 # Figure out where pyenv has been installing things pyenv root # Check I can run my newly installed Python 3.5.0 /Users/simonw/.pyenv/versions/3.5.0/bin/python # Use it to create a new virtualenv /Users/simonw/.pyenv/versions/3.5.0/bin/python -mvenv venv35 source venv35/bin/activate # Install datasette into that virtualenv python setup.py install ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322591993,Refactor views, https://github.com/simonw/datasette/pull/257#issuecomment-388626804,https://api.github.com/repos/simonw/datasette/issues/257,388626804,MDEyOklzc3VlQ29tbWVudDM4ODYyNjgwNA==,9599,simonw,2018-05-13T13:28:20Z,2018-05-13T13:28:20Z,OWNER,"Unfortunately, running `python setup.py test` on my laptop using Python 3.5.0 in that virtualenv results in a flow of weird Sanic-related errors: ``` File ""/Users/simonw/Dropbox/Development/datasette/venv35/lib/python3.5/site-packages/sanic-0.7.0-py3.5.egg/sanic/testing.py"", line 16, in _local_request import aiohttp File ""/Users/simonw/Dropbox/Development/datasette/.eggs/aiohttp-2.3.2-py3.5-macosx-10.13-x86_64.egg/aiohttp/__init__.py"", line 6, in from .client import * # noqa File ""/Users/simonw/Dropbox/Development/datasette/.eggs/aiohttp-2.3.2-py3.5-macosx-10.13-x86_64.egg/aiohttp/client.py"", line 13, in from yarl import URL File ""/Users/simonw/Dropbox/Development/datasette/.eggs/yarl-1.2.4-py3.5-macosx-10.13-x86_64.egg/yarl/__init__.py"", line 11, in from .quoting import _Quoter, _Unquoter File ""/Users/simonw/Dropbox/Development/datasette/.eggs/yarl-1.2.4-py3.5-macosx-10.13-x86_64.egg/yarl/quoting.py"", line 3, in from typing import Optional, TYPE_CHECKING, cast ImportError: cannot import name 'TYPE_CHECKING' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322591993,Refactor views, https://github.com/simonw/datasette/pull/257#issuecomment-388627281,https://api.github.com/repos/simonw/datasette/issues/257,388627281,MDEyOklzc3VlQ29tbWVudDM4ODYyNzI4MQ==,9599,simonw,2018-05-13T13:36:21Z,2018-05-13T13:36:21Z,OWNER,"https://github.com/rtfd/readthedocs.org/issues/3812#issuecomment-373780860 suggests Python 3.5.2 may have the fix. 
Yup, that worked: ``` pyenv install 3.5.2 rm -rf venv35 /Users/simonw/.pyenv/versions/3.5.2/bin/python -mvenv venv35 source venv35/bin/activate # Not sure why I need this in my local environment but I do: pip install datasette_plugin_demos python setup.py test ``` This is now giving me the same test failure locally that I am seeing in Travis.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322591993,Refactor views, https://github.com/simonw/datasette/pull/257#issuecomment-388628966,https://api.github.com/repos/simonw/datasette/issues/257,388628966,MDEyOklzc3VlQ29tbWVudDM4ODYyODk2Ng==,9599,simonw,2018-05-13T14:00:47Z,2018-05-13T14:06:35Z,OWNER,"Running specific tests: ``` venv35/bin/pip install pytest beautifulsoup4 aiohttp venv35/bin/pytest tests/test_utils.py ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322591993,Refactor views, https://github.com/simonw/datasette/issues/255#issuecomment-388645828,https://api.github.com/repos/simonw/datasette/issues/255,388645828,MDEyOklzc3VlQ29tbWVudDM4ODY0NTgyOA==,9599,simonw,2018-05-13T18:18:56Z,2018-05-13T18:20:02Z,OWNER,I may be able to run the SQL for all of the facet counts in one go using a WITH CTE query - will have to microbenchmark this to make sure it is worthwhile: https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9?sql=with+blah+as+%28select+*+from+%5Bcollege-majors%2Fall-ages%5D%29%0D%0Aselect+*+from+%28select+%22Major_category%22%2C+Major_category%2C+count%28*%29+as+n+from%0D%0Ablah+group+by+Major_category+order+by+n+desc+limit+10%29%0D%0Aunion+all%0D%0Aselect+*+from+%28select+%22Major_category2%22%2C+Major_category%2C+count%28*%29+as+n+from%0D%0Ablah+group+by+Major_category+order+by+n+desc+limit+10%29,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/256#issuecomment-388684356,https://api.github.com/repos/simonw/datasette/issues/256,388684356,MDEyOklzc3VlQ29tbWVudDM4ODY4NDM1Ng==,9599,simonw,2018-05-14T03:05:37Z,2018-05-14T03:05:37Z,OWNER,"I just landed pull request #257 - I haven't refactored the tests, I may do that later if it looks worthwhile.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322551723,Break up app.py into separate view modules, https://github.com/simonw/datasette/issues/255#issuecomment-388686463,https://api.github.com/repos/simonw/datasette/issues/255,388686463,MDEyOklzc3VlQ29tbWVudDM4ODY4NjQ2Mw==,9599,simonw,2018-05-14T03:23:44Z,2018-05-14T03:25:22Z,OWNER,It would be neat if there was a mechanism for calculating aggregates per facet - e.g. calculating the sum() of specific columns against each facet result on https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/nba-elo%2Fnbaallelo?_facet=lg_id&_facet=fran_id&lg_id=ABA&_facet=team_id,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388784063,https://api.github.com/repos/simonw/datasette/issues/255,388784063,MDEyOklzc3VlQ29tbWVudDM4ODc4NDA2Mw==,9599,simonw,2018-05-14T11:25:00Z,2018-05-14T11:25:15Z,OWNER,"Can I get facets working across many2many relationships? 
This would be fiendishly useful, but the querystring and `metadata.json` syntax is non-obvious.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388784787,https://api.github.com/repos/simonw/datasette/issues/255,388784787,MDEyOklzc3VlQ29tbWVudDM4ODc4NDc4Nw==,9599,simonw,2018-05-14T11:28:05Z,2018-05-14T11:28:05Z,OWNER,"To decide which facets to suggest: for each column, is the unique value count less than the number of rows matching the current query or is it less than 20 (if we are showing more than 20 rows)? Maybe only do this if there are less than ten non-float columns. Or always try for foreign keys and booleans, then if there are none of those try indexed text and integer fields, then finally try non-indexed text and integer fields but only if there are less than ten.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/259#issuecomment-388797919,https://api.github.com/repos/simonw/datasette/issues/259,388797919,MDEyOklzc3VlQ29tbWVudDM4ODc5NzkxOQ==,9599,simonw,2018-05-14T12:23:11Z,2018-05-14T12:23:11Z,OWNER,"For M2M to work we will need a mechanism for applying IN queries to the table view, so you can select multiple M2M filters. Maybe this would work: ?_m2m_category=123&_m2m_category=865","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322787470,inspect() should detect many-to-many relationships, https://github.com/simonw/datasette/issues/251#issuecomment-388987044,https://api.github.com/repos/simonw/datasette/issues/251,388987044,MDEyOklzc3VlQ29tbWVudDM4ODk4NzA0NA==,9599,simonw,2018-05-14T22:47:55Z,2018-05-14T22:47:55Z,OWNER,This work is now happening in the facets branch. Closing this in favor of #255.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",320592643,"Explore ""distinct values for column"" in inspect()", https://github.com/simonw/datasette/issues/255#issuecomment-389145872,https://api.github.com/repos/simonw/datasette/issues/255,389145872,MDEyOklzc3VlQ29tbWVudDM4OTE0NTg3Mg==,9599,simonw,2018-05-15T12:17:52Z,2018-05-15T12:17:52Z,OWNER,Activity has now moved to this branch: https://github.com/simonw/datasette/commits/suggested-facets,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-389147608,https://api.github.com/repos/simonw/datasette/issues/255,389147608,MDEyOklzc3VlQ29tbWVudDM4OTE0NzYwOA==,9599,simonw,2018-05-15T12:24:46Z,2018-05-15T12:24:46Z,OWNER,"New demo (published with `datasette publish now --branch=suggested-facets fivethirtyeight.db sf-trees.db --name=datastte-suggested-facets-demo`): https://datasette-suggested-facets-demo.now.sh/fivethirtyeight-2628db9/comic-characters%2Fmarvel-wikia-data After turning on a couple of suggested facets... 
https://datasette-suggested-facets-demo.now.sh/fivethirtyeight-2628db9/comic-characters%2Fmarvel-wikia-data?_facet=SEX&_facet=ID ![2018-05-15 at 7 24 am](https://user-images.githubusercontent.com/9599/40056411-fa265d16-5810-11e8-89ec-e38fe29ffb2c.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/pull/258#issuecomment-389386142,https://api.github.com/repos/simonw/datasette/issues/258,389386142,MDEyOklzc3VlQ29tbWVudDM4OTM4NjE0Mg==,9599,simonw,2018-05-16T03:51:13Z,2018-05-16T03:51:13Z,OWNER,"The URL does persist across deployments already, in that you can use the URL without the hash and it will redirect to the current location. Here's an example of that: https://san-francisco.datasettes.com/sf-trees/Street_Tree_List.json This also works if you attempt to hit the incorrect hash, e.g. if you have deployed a new version of the database with an updated hash. The old hash will redirect, e.g. https://san-francisco.datasettes.com/sf-trees-c4b972c/Street_Tree_List.json If you serve Datasette from a HTTP/2 proxy (I've been using Cloudflare for this) you won't even have to pay the cost of the redirect - Datasette sends a `Link: ; rel=preload` header with those redirects, which causes Cloudflare to push out the redirected source as part of that HTTP/2 request. You can fire up the Chrome DevTools to watch this happen. https://github.com/simonw/datasette/blob/2b79f2bdeb1efa86e0756e741292d625f91cb93d/datasette/views/base.py#L91 All of that said... I'm not at all opposed to this feature. For consistency with other Datasette options (e.g. `--cors`) I'd prefer to do this as an optional argument to the `datasette serve` command - something like this: datasette serve mydb.db --no-url-hash","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322741659,Add new metadata key persistent_urls which removes the hash from all database urls, https://github.com/simonw/datasette/issues/255#issuecomment-389386919,https://api.github.com/repos/simonw/datasette/issues/255,389386919,MDEyOklzc3VlQ29tbWVudDM4OTM4NjkxOQ==,9599,simonw,2018-05-16T03:57:47Z,2018-05-16T03:58:30Z,OWNER,"I updated that demo to demonstrate the new foreign key label expansions: https://datasette-suggested-facets-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List?_facet=qLegalStatus ![2018-05-15 at 8 58 pm](https://user-images.githubusercontent.com/9599/40095806-b645026a-5882-11e8-8100-76136df50212.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-389397457,https://api.github.com/repos/simonw/datasette/issues/255,389397457,MDEyOklzc3VlQ29tbWVudDM4OTM5NzQ1Nw==,9599,simonw,2018-05-16T05:20:04Z,2018-05-16T05:20:04Z,OWNER,Maybe `suggested_facets` should only be calculated for the HTML view.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/pull/258#issuecomment-389536870,https://api.github.com/repos/simonw/datasette/issues/258,389536870,MDEyOklzc3VlQ29tbWVudDM4OTUzNjg3MA==,9599,simonw,2018-05-16T14:22:31Z,2018-05-16T14:22:31Z,OWNER,"The principle benefit provided by the hash URLs is that Datasette can set a far-future cache expiry header on 
every response. This is particularly useful for JavaScript API work as it makes fantastic use of the browser's cache. It also means that if you are serving your API from behind a caching proxy like Cloudflare you get a fantastic cache hit rate. An option to serve without persistent hashes would also need to turn off the cache headers. Maybe the option should support both? If you hit a page with the hash in the URL you still get the cache headers, but hits to the URL without the hash serve uncashed content directly.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322741659,Add new metadata key persistent_urls which removes the hash from all database urls, https://github.com/simonw/datasette/issues/255#issuecomment-389546040,https://api.github.com/repos/simonw/datasette/issues/255,389546040,MDEyOklzc3VlQ29tbWVudDM4OTU0NjA0MA==,9599,simonw,2018-05-16T14:47:34Z,2018-05-16T14:47:34Z,OWNER,"Latest demo - now with multiple columns: https://datasette-suggested-facets-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List?_facet=qCaretaker&_facet=qCareAssistant&_facet=qLegalStatus ![2018-05-16 at 7 47 am](https://user-images.githubusercontent.com/9599/40124418-63e680ba-58dd-11e8-8063-9686826abb8e.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-389562708,https://api.github.com/repos/simonw/datasette/issues/255,389562708,MDEyOklzc3VlQ29tbWVudDM4OTU2MjcwOA==,9599,simonw,2018-05-16T15:32:12Z,2018-05-16T15:32:12Z,OWNER,"This is now landed in master, ready for the next release.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/263#issuecomment-389563719,https://api.github.com/repos/simonw/datasette/issues/263,389563719,MDEyOklzc3VlQ29tbWVudDM4OTU2MzcxOQ==,9599,simonw,2018-05-16T15:34:46Z,2018-05-16T15:34:46Z,OWNER,The underlying mechanics for the `_extras` mechanism described in #262 may help with this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323671577,Facets should not execute for ?shape=array|object, https://github.com/simonw/datasette/issues/265#issuecomment-389566147,https://api.github.com/repos/simonw/datasette/issues/265,389566147,MDEyOklzc3VlQ29tbWVudDM4OTU2NjE0Nw==,9599,simonw,2018-05-16T15:41:42Z,2018-05-16T15:41:42Z,OWNER,"An official demo instance of Datasette dedicated to this use-case would be useful, especially if it was automatically deployed by Travis for every commit to master that passes the tests. 
Maybe there should be a permanent version of it deployed for each released version too?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323677499,Add links to example Datasette instances to appropiate places in docs, https://github.com/simonw/datasette/issues/266#issuecomment-389572201,https://api.github.com/repos/simonw/datasette/issues/266,389572201,MDEyOklzc3VlQ29tbWVudDM4OTU3MjIwMQ==,9599,simonw,2018-05-16T15:58:43Z,2018-05-16T16:00:47Z,OWNER,"This will likely be implemented in the `BaseView` class, which needs to know how to spot the `.csv` extension, call the underlying JSON generating function and then return the `columns` and `rows` as correctly formatted CSV. https://github.com/simonw/datasette/blob/9959a9e4deec8e3e178f919e8b494214d5faa7fd/datasette/views/base.py#L201-L207 This means it will take ALL arguments that are available to the `.json` view. It may ignore some (e.g. `_facet=` makes no sense since CSV tables don't have space to show the facet results). In streaming mode, things will behave a little bit differently - in particular, if `_stream=1` then `_next=` will be forbidden. It can't include a length header because we don't know how many bytes it will be CSV output will throw an error if the endpoint doesn't have rows and columns keys eg `/-/inspect.json` So the implementation... - looks for the `.csv` extension - internally fetches the `.json` data instead - If no `_stream` it just transposes that JSON to CSV with the correct content type header - If `_stream=1` - checks for `_next=` and throws an error if it was provided - Otherwise... fetch first page and emit CSV header and first set of rows - Then start async looping, emitting more CSV rows and following the `_next=` internal reference until done I like that this takes advantage of efficient pagination. It may not work so well for views which use offset/limit though. It won't work at all for custom SQL because custom SQL doesn't support _next= pagination. That's fine. For views... easiest fix is to cut off after first X000 records. That seems OK. 
View JSON would need to include a property that the mechanism can identify.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389579363,https://api.github.com/repos/simonw/datasette/issues/266,389579363,MDEyOklzc3VlQ29tbWVudDM4OTU3OTM2Mw==,9599,simonw,2018-05-16T16:20:06Z,2018-05-16T16:20:06Z,OWNER,I started a thread on Twitter discussing various CSV output dialects: https://twitter.com/simonw/status/996783395504979968 - I want to pick defaults which will work as well as possible for whatever tools people might be using to consume the data.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389579762,https://api.github.com/repos/simonw/datasette/issues/266,389579762,MDEyOklzc3VlQ29tbWVudDM4OTU3OTc2Mg==,9599,simonw,2018-05-16T16:21:12Z,2018-05-16T16:21:12Z,OWNER,"> I basically want someone to tell me which arguments I can pass to Python's csv.writer() function that will result in the least complaints from people who try to parse the results :) https://twitter.com/simonw/status/996786815938977792","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389592566,https://api.github.com/repos/simonw/datasette/issues/266,389592566,MDEyOklzc3VlQ29tbWVudDM4OTU5MjU2Ng==,9599,simonw,2018-05-16T17:01:29Z,2018-05-16T17:02:21Z,OWNER,Let's provide a CSV Dialect definition too: https://frictionlessdata.io/specs/csv-dialect/ - via https://twitter.com/drewdaraabrams/status/996794915680997382,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389608473,https://api.github.com/repos/simonw/datasette/issues/266,389608473,MDEyOklzc3VlQ29tbWVudDM4OTYwODQ3Mw==,9599,simonw,2018-05-16T17:52:35Z,2018-05-16T17:54:11Z,OWNER,"There are some code examples in this issue which should help with the streaming part: https://github.com/channelcat/sanic/issues/1067 Also https://github.com/channelcat/sanic/blob/master/docs/sanic/streaming.md#response-streaming","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389626715,https://api.github.com/repos/simonw/datasette/issues/266,389626715,MDEyOklzc3VlQ29tbWVudDM4OTYyNjcxNQ==,9599,simonw,2018-05-16T18:50:46Z,2018-05-16T18:50:46Z,OWNER,"> I’d recommend using the Windows-1252 encoding for maximum compatibility, unless you have any characters not in that set, in which case use UTF8 with a byte order mark. Bit of a pain, but some progams (eg various versions of Excel) don’t read UTF8. **frankieroberto** https://twitter.com/frankieroberto/status/996823071947460616 > There is software that consumes CSV and doesn't speak UTF8!? Huh. 
Well I can't just use Windows-1252 because I need to support the full UTF8 range of potential data - maybe I should support an optional ?_encoding=windows-1252 argument **simonw** https://twitter.com/simonw/status/996824677245857793","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389893810,https://api.github.com/repos/simonw/datasette/issues/266,389893810,MDEyOklzc3VlQ29tbWVudDM4OTg5MzgxMA==,9599,simonw,2018-05-17T14:49:35Z,2018-05-17T14:49:35Z,OWNER,Idea: add a `supports_csv = False` property to `BaseView` and over-ride it to `True` just on the view classes that should support CSV (Table and Row). Slight subtlety: the `DatabaseView` class only supports CSV in the `custom_sql()` path. Maybe that needs to be refactored a bit.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389894382,https://api.github.com/repos/simonw/datasette/issues/266,389894382,MDEyOklzc3VlQ29tbWVudDM4OTg5NDM4Mg==,9599,simonw,2018-05-17T14:51:13Z,2018-05-17T14:53:23Z,OWNER,"I should definitely sanity check if the `_next=` route really is the most efficient way to build this. It may turn out that iterating over a SQLite cursor with a million rows in it is super-efficient and would provide much more reliable performance (plus solve the problem for retrieving full custom SQL queries where we can't do keyset pagination). Problem here is that we run SQL queries in a thread pool. A query that returns millions of rows would presumably tie up a SQL thread until it has finished, which could block the server. This may be a reason to stick with `_next=` keyset pagination - since it ensures each SQL thread yields back again after each 1,000 rows.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/271#issuecomment-389989015,https://api.github.com/repos/simonw/datasette/issues/271,389989015,MDEyOklzc3VlQ29tbWVudDM4OTk4OTAxNQ==,9599,simonw,2018-05-17T19:54:10Z,2018-05-17T19:54:10Z,OWNER,"This is a departure from how Datasette has been designed so far, and it may turn out that it's not feasible or it requires too many philosophical changes to be worthwhile. If we CAN do it though it would mean Datasette could stay running pointed at a directory on disk and new SQLite databases could be dropped into that directory by another process and served directly as they become available.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324162476,Mechanism for automatically picking up changes when on-disk .db file changes, https://github.com/simonw/datasette/issues/271#issuecomment-389989615,https://api.github.com/repos/simonw/datasette/issues/271,389989615,MDEyOklzc3VlQ29tbWVudDM4OTk4OTYxNQ==,9599,simonw,2018-05-17T19:56:13Z,2018-05-17T19:56:13Z,OWNER,"From https://www.sqlite.org/c3ref/open.html > **immutable**: The immutable parameter is a boolean query parameter that indicates that the database file is stored on read-only media. 
When immutable is set, SQLite assumes that the database file cannot be changed, even by a process with higher privilege, and so the database is opened read-only and all locking and change detection is disabled. Caution: Setting the immutable property on a database file that does in fact change can result in incorrect query results and/or SQLITE_CORRUPT errors. See also: SQLITE_IOCAP_IMMUTABLE. So this would probably have to be a new mode, `datasette serve --detect-db-changes`, which no longer opens in immutable mode. Or maybe current behavior becomes not-the-default and you opt into it with `datasette serve --immutable`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324162476,Mechanism for automatically picking up changes when on-disk .db file changes, https://github.com/simonw/datasette/issues/270#issuecomment-390105147,https://api.github.com/repos/simonw/datasette/issues/270,390105147,MDEyOklzc3VlQ29tbWVudDM5MDEwNTE0Nw==,9599,simonw,2018-05-18T06:13:07Z,2018-05-18T06:13:07Z,OWNER,I'm going to add a `/-/limits` page that shows the current limits.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323830051,--limit= CLI option for setting limits, https://github.com/simonw/datasette/issues/264#issuecomment-390105943,https://api.github.com/repos/simonw/datasette/issues/264,390105943,MDEyOklzc3VlQ29tbWVudDM5MDEwNTk0Mw==,9599,simonw,2018-05-18T06:18:00Z,2018-05-18T06:18:00Z,OWNER,Docs: http://datasette.readthedocs.io/en/latest/limits.html#default-facet-size,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323673899,Make it possible to customize various facet settings, https://github.com/simonw/datasette/issues/273#issuecomment-390250253,https://api.github.com/repos/simonw/datasette/issues/273,390250253,MDEyOklzc3VlQ29tbWVudDM5MDI1MDI1Mw==,198537,rgieseke,2018-05-18T15:49:52Z,2018-05-18T15:49:52Z,CONTRIBUTOR,"Shouldn't [versioneer](https://github.com/warner/python-versioneer) do that? E.g. 
0.21+2.g1076c97 You'd need to install via `pip install git+https://github.com/simow/datasette.git` though, this does a temp git clone.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324451322,Figure out a way to have /-/version return current git commit hash, https://github.com/simonw/datasette/issues/274#issuecomment-390433040,https://api.github.com/repos/simonw/datasette/issues/274,390433040,MDEyOklzc3VlQ29tbWVudDM5MDQzMzA0MA==,9599,simonw,2018-05-19T21:12:42Z,2018-05-20T16:01:03Z,OWNER,Could also support these as optional environment variables - `DATASETTE_NAMEOFSETTING`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324652142,"Rename --limit to --config, add --help-config", https://github.com/simonw/datasette/issues/274#issuecomment-390496376,https://api.github.com/repos/simonw/datasette/issues/274,390496376,MDEyOklzc3VlQ29tbWVudDM5MDQ5NjM3Ng==,9599,simonw,2018-05-20T17:04:55Z,2018-05-20T17:04:55Z,OWNER,http://datasette.readthedocs.io/en/latest/config.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324652142,"Rename --limit to --config, add --help-config", https://github.com/simonw/datasette/pull/258#issuecomment-390577711,https://api.github.com/repos/simonw/datasette/issues/258,390577711,MDEyOklzc3VlQ29tbWVudDM5MDU3NzcxMQ==,247131,philroche,2018-05-21T07:38:15Z,2018-05-21T07:38:15Z,NONE,"Excellent, I was not aware of the auto redirect to the new hash. My bad This solves my use case. I do agree that your suggested --no-url-hash approach is much neater. I will investigate ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322741659,Add new metadata key persistent_urls which removes the hash from all database urls, https://github.com/simonw/datasette/issues/247#issuecomment-390689406,https://api.github.com/repos/simonw/datasette/issues/247,390689406,MDEyOklzc3VlQ29tbWVudDM5MDY4OTQwNg==,11912854,jsancho-gpl,2018-05-21T15:29:31Z,2018-05-21T15:29:31Z,NONE,"I've changed my mind about the way to support external connectors aside of SQLite and I'm working in a more simple style that respects the original Datasette, i.e. less refactoring. I present you [a version of Datasette wich supports other database connectors](https://github.com/jsancho-gpl/datasette/tree/external-connectors) and [a Datasette connector for HDF5/PyTables files](https://github.com/jsancho-gpl/datasette-pytables).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",319449852,SQLite code decoupled from Datasette, https://github.com/simonw/datasette/pull/277#issuecomment-390707183,https://api.github.com/repos/simonw/datasette/issues/277,390707183,MDEyOklzc3VlQ29tbWVudDM5MDcwNzE4Mw==,9599,simonw,2018-05-21T16:28:39Z,2018-05-21T16:28:39Z,OWNER,"This is definitely a big improvement. 
I'd like to refactor the unit tests that cover .inspect() too - currently they are a huge ugly blob at the top of test_api.py","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324836533,Refactor inspect logic, https://github.com/simonw/datasette/issues/276#issuecomment-390707760,https://api.github.com/repos/simonw/datasette/issues/276,390707760,MDEyOklzc3VlQ29tbWVudDM5MDcwNzc2MA==,9599,simonw,2018-05-21T16:30:35Z,2018-05-21T16:30:35Z,OWNER,"This probably needs to be in a plugin simply because getting Spatialite compiled and installed is a bit of a pain. It's a great opportunity to expand the plugin hooks in useful ways though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-390795067,https://api.github.com/repos/simonw/datasette/issues/276,390795067,MDEyOklzc3VlQ29tbWVudDM5MDc5NTA2Nw==,45057,russss,2018-05-21T21:55:57Z,2018-05-21T21:55:57Z,CONTRIBUTOR,"Well, we do have the capability to detect spatialite so my intention certainly wasn't to require it. I can see the advantage of having it as a plugin but it does touch a number of points in the code. I think I'm going to attack this by refactoring the necessary bits and seeing where that leads (which was my plan anyway). I think my main concern is - if I add certain plugin hooks for this, is anything else ever going to use them? I'm not sure I have an answer to that question yet, either way.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/pull/277#issuecomment-390804333,https://api.github.com/repos/simonw/datasette/issues/277,390804333,MDEyOklzc3VlQ29tbWVudDM5MDgwNDMzMw==,9599,simonw,2018-05-21T22:40:16Z,2018-05-21T22:43:50Z,OWNER,"We should merge this before refactoring the tests though, because that way we don't couple the new tests to the verification of this change.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324836533,Refactor inspect logic, https://github.com/simonw/datasette/issues/278#issuecomment-390991640,https://api.github.com/repos/simonw/datasette/issues/278,390991640,MDEyOklzc3VlQ29tbWVudDM5MDk5MTY0MA==,9599,simonw,2018-05-22T13:33:46Z,2018-05-22T13:33:46Z,OWNER,For SpatiaLite this example may be useful - though it's building 4.3.0 and not 4.4.0: https://github.com/terranodo/spatialite-docker/blob/master/Dockerfile,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325294102,Build smallest possible Docker image with Datasette plus recent SQLite (with json1) plus Spatialite 4.4.0, https://github.com/simonw/datasette/issues/278#issuecomment-390993397,https://api.github.com/repos/simonw/datasette/issues/278,390993397,MDEyOklzc3VlQ29tbWVudDM5MDk5MzM5Nw==,9599,simonw,2018-05-22T13:38:57Z,2018-05-22T13:38:57Z,OWNER,"Useful GitHub code search: https://github.com/search?utf8=✓&q=%22libspatialite-4.4.0%22+%22RC0%22&type=Code ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325294102,Build smallest possible Docker image with Datasette 
plus recent SQLite (with json1) plus Spatialite 4.4.0, https://github.com/simonw/datasette/issues/255#issuecomment-390999055,https://api.github.com/repos/simonw/datasette/issues/255,390999055,MDEyOklzc3VlQ29tbWVudDM5MDk5OTA1NQ==,9599,simonw,2018-05-22T13:54:55Z,2018-05-22T13:54:55Z,OWNER,This shipped in Datasette 0.22. Here's my blog post about it: https://simonwillison.net/2018/May/20/datasette-facets/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/276#issuecomment-391000659,https://api.github.com/repos/simonw/datasette/issues/276,391000659,MDEyOklzc3VlQ29tbWVudDM5MTAwMDY1OQ==,9599,simonw,2018-05-22T13:59:27Z,2018-05-22T13:59:27Z,OWNER,"Right now the plugin stuff is early enough that I'd like to get as many potential plugin hooks as possible crafted out A much easier to judge if they should be added as actual hooks if we have a working branch prototype of them. Some kind of mechanism for custom column display is already needed - eg there are columns where I want to say ""render this as markdown"" or ""URLify any links in this text"" - or even ""use this date format"" or ""add commas to this integer"". You can do it with a custom template but a lower-level mechanism would be nicer. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/273#issuecomment-391003285,https://api.github.com/repos/simonw/datasette/issues/273,391003285,MDEyOklzc3VlQ29tbWVudDM5MTAwMzI4NQ==,9599,simonw,2018-05-22T14:06:40Z,2018-05-22T14:06:40Z,OWNER,"That looks great. I don't think it's possible to derive the current commit version from the .zip downloaded directly from GitHub, so needing to pip install via git+https feels reasonable to me.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324451322,Figure out a way to have /-/version return current git commit hash, https://github.com/simonw/datasette/issues/272#issuecomment-391011268,https://api.github.com/repos/simonw/datasette/issues/272,391011268,MDEyOklzc3VlQ29tbWVudDM5MTAxMTI2OA==,9599,simonw,2018-05-22T14:28:12Z,2018-05-22T14:28:12Z,OWNER,"I think I can do this almost entirely within my existing BaseView class structure. First, decouple the async data() methods by teaching them to take a querystring object as an argument instead of a Sanic request object. The get() method can then send that new object instead of a request. Next teach the base class how to obey the ASGI protocol. I should be able to get support for both Sanic and uvicorn/daphne working in the same codebase, which will make it easy to compare their performance. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/278#issuecomment-390993861,https://api.github.com/repos/simonw/datasette/issues/278,390993861,MDEyOklzc3VlQ29tbWVudDM5MDk5Mzg2MQ==,9599,simonw,2018-05-22T13:40:14Z,2018-05-22T14:38:05Z,OWNER,If we can't get `import sqlite3` to load the latest version but we can get `import pysqlite3` to work that's fine too - I can teach Datasette to import the best available version.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325294102,Build smallest possible Docker image with Datasette plus recent SQLite (with json1) plus Spatialite 4.4.0, https://github.com/simonw/datasette/issues/276#issuecomment-391025841,https://api.github.com/repos/simonw/datasette/issues/276,391025841,MDEyOklzc3VlQ29tbWVudDM5MTAyNTg0MQ==,9599,simonw,2018-05-22T15:06:36Z,2018-05-22T15:06:36Z,OWNER,"The other reason I mention plugins is that I have an idea to outlaw JavaScript entirely from Datasette core and instead encourage ALL JavaScript functionality to move into plugins.right now that just means CodeMirror. I may set up some of those plugins (like CodeMirror) as default dependencies so you get them from ""pip install datasette"". I like the neatness of saying that core Datasette is a very simple JSON + HTML application, then encouraging people to go completely wild with JavaScript in the plugins.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/243#issuecomment-391030083,https://api.github.com/repos/simonw/datasette/issues/243,391030083,MDEyOklzc3VlQ29tbWVudDM5MTAzMDA4Mw==,9599,simonw,2018-05-22T15:17:10Z,2018-05-22T15:17:10Z,OWNER,See also #278,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",318737808,--spatialite option for datasette publish commands, https://github.com/simonw/datasette/issues/276#issuecomment-391050113,https://api.github.com/repos/simonw/datasette/issues/276,391050113,MDEyOklzc3VlQ29tbWVudDM5MTA1MDExMw==,45057,russss,2018-05-22T16:13:00Z,2018-05-22T16:13:00Z,CONTRIBUTOR,"Yup, I'll have a think about it. My current thoughts are for spatialite we'll need to hook into the following places: * Inspection, so we can detect which columns are geometry columns. (We also currently ignore spatialite tables during inspection, it may be worth moving that to the plugin as well.) * After data load, so we can convert WKB into the correct intermediate format for display. The alternative here is to alter the select SQL itself and get spatialite to do this conversion, but that strikes me as a bit more complex and possibly not as useful. * HTML rendering. * Querying? 
The rendering and querying hooks could also potentially be used to move the units support into a plugin.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/pull/279#issuecomment-391055490,https://api.github.com/repos/simonw/datasette/issues/279,391055490,MDEyOklzc3VlQ29tbWVudDM5MTA1NTQ5MA==,9599,simonw,2018-05-22T16:29:30Z,2018-05-22T16:29:30Z,OWNER,"This is fantastic! I think I prefer the aesthetics of just ""0.22"" for the version string if it's a tagged release with no additional changes - does that work? I'd like to continue to provide a tuple that can be imported from the version.py module as well, as seen here: https://github.com/simonw/datasette/blob/558d9d7bfef3dd633eb16389281b67d42c9bdeef/datasette/version.py#L1 Presumably we can generate that from the versioneer string? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325352370,Add version number support with Versioneer, https://github.com/simonw/datasette/pull/280#issuecomment-391059008,https://api.github.com/repos/simonw/datasette/issues/280,391059008,MDEyOklzc3VlQ29tbWVudDM5MTA1OTAwOA==,565628,r4vi,2018-05-22T16:40:27Z,2018-05-22T16:40:27Z,CONTRIBUTOR,"```python >>> import sqlite3 >>> sqlite3.sqlite_version '3.23.1' >>> ``` running the above in the container seems to show 3.23.1 too so maybe we don't need pysqlite3 at all?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/279#issuecomment-391073009,https://api.github.com/repos/simonw/datasette/issues/279,391073009,MDEyOklzc3VlQ29tbWVudDM5MTA3MzAwOQ==,198537,rgieseke,2018-05-22T17:23:26Z,2018-05-22T17:23:26Z,CONTRIBUTOR,"> I think I prefer the aesthetics of just ""0.22"" for the version string if it's a tagged release with no additional changes - does that work? Yes! That's the default versioneer behaviour. > I'd like to continue to provide a tuple that can be imported from the version.py module as well, as seen here: Should work now, it can be a two (for a tagged version), three or four items tuple. ``` In [2]: datasette.__version__ Out[2]: '0.12+292.ga70c2a8.dirty' In [3]: datasette.__version_info__ Out[3]: ('0', '12+292', 'ga70c2a8', 'dirty') ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325352370,Add version number support with Versioneer, https://github.com/simonw/datasette/pull/279#issuecomment-391073267,https://api.github.com/repos/simonw/datasette/issues/279,391073267,MDEyOklzc3VlQ29tbWVudDM5MTA3MzI2Nw==,198537,rgieseke,2018-05-22T17:24:16Z,2018-05-22T17:24:16Z,CONTRIBUTOR,"Sorry, just realised you rely on `version` being a module ...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325352370,Add version number support with Versioneer, https://github.com/simonw/datasette/pull/280#issuecomment-391076239,https://api.github.com/repos/simonw/datasette/issues/280,391076239,MDEyOklzc3VlQ29tbWVudDM5MTA3NjIzOQ==,9599,simonw,2018-05-22T17:33:33Z,2018-05-22T17:33:33Z,OWNER,This looks amazing! 
Can't wait to try this out this evening.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391076458,https://api.github.com/repos/simonw/datasette/issues/280,391076458,MDEyOklzc3VlQ29tbWVudDM5MTA3NjQ1OA==,9599,simonw,2018-05-22T17:34:13Z,2018-05-22T17:34:13Z,OWNER,Yeah let's try this without pysqlite3 and see if we still get the correct version.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/279#issuecomment-391077700,https://api.github.com/repos/simonw/datasette/issues/279,391077700,MDEyOklzc3VlQ29tbWVudDM5MTA3NzcwMA==,198537,rgieseke,2018-05-22T17:38:17Z,2018-05-22T17:38:17Z,CONTRIBUTOR,"Alright, that should work now -- let me know if you would prefer any different behaviour.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325352370,Add version number support with Versioneer, https://github.com/simonw/datasette/pull/280#issuecomment-391141391,https://api.github.com/repos/simonw/datasette/issues/280,391141391,MDEyOklzc3VlQ29tbWVudDM5MTE0MTM5MQ==,565628,r4vi,2018-05-22T21:08:39Z,2018-05-22T21:08:39Z,CONTRIBUTOR,"I'm going to clean this up for consistency tomorrow morning so hold off merging until then please On Tue, May 22, 2018 at 6:34 PM, Simon Willison wrote: > Yeah let's try this without pysqlite3 and see if we still get the correct > version. > > — > You are receiving this because you authored the thread. > Reply to this email directly, view it on GitHub > , or mute > the thread > > . > ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391190497,https://api.github.com/repos/simonw/datasette/issues/280,391190497,MDEyOklzc3VlQ29tbWVudDM5MTE5MDQ5Nw==,9599,simonw,2018-05-23T01:22:53Z,2018-05-23T01:22:53Z,OWNER,"I grabbed just your Dockerfile and built it like this: docker build . -t datasette Once it had built, I ran it like this: docker run -p 8001:8001 -v `pwd`:/mnt datasette \ datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db \ --load-extension=/usr/local/lib/mod_spatialite.so (The fixtures.db file is created by running `python tests/fixtures.py fixtures.db`) Then I visited http://localhost:8001/-/versions and I got this: { ""datasette"": { ""version"": ""0+unknown"" }, ""python"": { ""full"": ""3.6.3 (default, Dec 12 2017, 06:37:05) \n[GCC 6.3.0 20170516]"", ""version"": ""3.6.3"" }, ""sqlite"": { ""extensions"": { ""json1"": null, ""spatialite"": ""4.4.0-RC0"" }, ""fts_versions"": [ ""FTS4"", ""FTS3"" ], ""version"": ""3.23.1"" } } Fantastic! 
I'm getting SQLite `3.23.1` and SpatiaLite `4.4.0-RC0`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391290271,https://api.github.com/repos/simonw/datasette/issues/280,391290271,MDEyOklzc3VlQ29tbWVudDM5MTI5MDI3MQ==,565628,r4vi,2018-05-23T09:53:38Z,2018-05-23T09:53:38Z,CONTRIBUTOR,"Running: ```bash docker run -p 8001:8001 -v `pwd`:/mnt datasette \ datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db \ --load-extension=/usr/local/lib/mod_spatialite.so ``` is now returning FTS5 enabled in the versions output: ```json { ""datasette"": { ""version"": ""0.22"" }, ""python"": { ""full"": ""3.6.5 (default, May 5 2018, 03:07:21) \n[GCC 6.3.0 20170516]"", ""version"": ""3.6.5"" }, ""sqlite"": { ""extensions"": { ""json1"": null, ""spatialite"": ""4.4.0-RC0"" }, ""fts_versions"": [ ""FTS5"", ""FTS4"", ""FTS3"" ], ""version"": ""3.23.1"" } } ``` The old query didn't work because specifying `(t TEXT)` caused an error","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391354237,https://api.github.com/repos/simonw/datasette/issues/280,391354237,MDEyOklzc3VlQ29tbWVudDM5MTM1NDIzNw==,9599,simonw,2018-05-23T13:51:22Z,2018-05-23T13:51:22Z,OWNER,@r4vi any objections to me merging this?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/issues/282#issuecomment-391355099,https://api.github.com/repos/simonw/datasette/issues/282,391355099,MDEyOklzc3VlQ29tbWVudDM5MTM1NTA5OQ==,9599,simonw,2018-05-23T13:53:39Z,2018-05-23T13:53:39Z,OWNER,Confirmed fixed: https://fivethirtyeight-datasette-mipwbeadvr.now.sh/fivethirtyeight-5de27e3/nba-elo%2Fnbaallelo?_facet=lg_id&_next=100 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325705981,Faceting breaks pagination, https://github.com/simonw/datasette/pull/280#issuecomment-391355030,https://api.github.com/repos/simonw/datasette/issues/280,391355030,MDEyOklzc3VlQ29tbWVudDM5MTM1NTAzMA==,565628,r4vi,2018-05-23T13:53:27Z,2018-05-23T15:22:45Z,CONTRIBUTOR,"No objections; It's good to go @simonw On Wed, 23 May 2018, 14:51 Simon Willison, wrote: > @r4vi any objections to me merging this? > > — > You are receiving this because you were mentioned. > Reply to this email directly, view it on GitHub > , or mute > the thread > > . > ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391437199,https://api.github.com/repos/simonw/datasette/issues/280,391437199,MDEyOklzc3VlQ29tbWVudDM5MTQzNzE5OQ==,9599,simonw,2018-05-23T17:44:20Z,2018-05-23T17:44:20Z,OWNER,Thank you very much! 
This is most excellent.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/281#issuecomment-391437462,https://api.github.com/repos/simonw/datasette/issues/281,391437462,MDEyOklzc3VlQ29tbWVudDM5MTQzNzQ2Mg==,9599,simonw,2018-05-23T17:45:07Z,2018-05-23T17:45:07Z,OWNER,I'm afraid I just merged #280 which means this no longer applies. You're very welcome to see if you can further optimize the new Dockerfile though.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325553991,Reduces image size using Alpine + Multistage (re: #278), https://github.com/simonw/datasette/issues/276#issuecomment-391504199,https://api.github.com/repos/simonw/datasette/issues/276,391504199,MDEyOklzc3VlQ29tbWVudDM5MTUwNDE5OQ==,9599,simonw,2018-05-23T21:35:17Z,2018-05-23T21:35:17Z,OWNER,"I'm not keen on anything that modifies the SQLite file itself on startup - part of the datasette contract is that it should work with any SQLite file you throw at it without having any side-effects. A neat thing about SQLite is that because everything happens in the same process there's very little additional overhead involved in executing extra SQL queries - even if we ran a query-per-row to transform data in one specific column it shouldn't add more than a few ms to the total page load time (whereas with MySQL all of the extra query overhead would kill us).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-391504757,https://api.github.com/repos/simonw/datasette/issues/276,391504757,MDEyOklzc3VlQ29tbWVudDM5MTUwNDc1Nw==,9599,simonw,2018-05-23T21:37:07Z,2018-05-23T21:37:18Z,OWNER,"That said, it looks like we may be able to use a library like https://github.com/geomet/geomet to run the conversion from WKB entirely in Python space.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-391505930,https://api.github.com/repos/simonw/datasette/issues/276,391505930,MDEyOklzc3VlQ29tbWVudDM5MTUwNTkzMA==,45057,russss,2018-05-23T21:41:37Z,2018-05-23T21:41:37Z,CONTRIBUTOR,"> I'm not keen on anything that modifies the SQLite file itself on startup Ah I didn't mean that - I meant altering the SELECT query to fetch the data so that it ran a spatialite function to transform that specific column. 
I think that's less useful as a general-purpose plugin hook though, and it's not that hard to parse the WKB in Python (my default approach would be to use [shapely](https://github.com/Toblerity/Shapely), which is great, but geomet looks like an interesting pure-python alternative).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/283#issuecomment-391583528,https://api.github.com/repos/simonw/datasette/issues/283,391583528,MDEyOklzc3VlQ29tbWVudDM5MTU4MzUyOA==,9599,simonw,2018-05-24T04:21:49Z,2018-05-24T04:21:49Z,OWNER,"The challenge here is which database should be the ""default"" database. The first database attached to SQLite is treated as the default - if no database is specified in a query, that's the database that queries will be executed against. Currently, each database URL in Datasette (e.g. https://san-francisco.datasettes.com/sf-film-locations-84594a7 v.s. https://san-francisco.datasettes.com/sf-trees-ebc2ad9 ) gets its own independent connection, and all queries within that base URL run against that database. If we're going to attach multiple databases to the same connection, how do we set which database gets to be the default? The easiest thing to do here will be to have a special database (maybe which is turned off by default and can be enabled using `datasette serve --enable-cross-database-joins` or similar) which attaches to ALL the databases. Perhaps it starts as an in-memory database, maybe at `/memory`? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391584366,https://api.github.com/repos/simonw/datasette/issues/283,391584366,MDEyOklzc3VlQ29tbWVudDM5MTU4NDM2Ng==,9599,simonw,2018-05-24T04:28:20Z,2018-05-24T04:28:20Z,OWNER,"I used some pretty ugly hacks, like faking an entire `.inspect()` block for the `:memory:` database just to get past the errors I was seeing. To ship this as a feature it will need quite a bit of code refactoring to make those hacks unnecessary. https://github.com/simonw/datasette/blob/7a3040f5782375373b2b66e5969bc2c49b3a6f0e/datasette/views/database.py#L18-L26","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391584527,https://api.github.com/repos/simonw/datasette/issues/283,391584527,MDEyOklzc3VlQ29tbWVudDM5MTU4NDUyNw==,9599,simonw,2018-05-24T04:29:40Z,2018-05-24T04:29:40Z,OWNER,Rather than stealing the `/memory` namespace for this it would be nicer if these cross-database joins could be executed at the very top-level URL of the Datasette instance - `https://example.com/?sql=...`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391584112,https://api.github.com/repos/simonw/datasette/issues/283,391584112,MDEyOklzc3VlQ29tbWVudDM5MTU4NDExMg==,9599,simonw,2018-05-24T04:26:29Z,2018-05-24T04:30:50Z,OWNER,"I built a very rough prototype of this to prove it could work. 
It's deployed here - and here's an example of a query that joins across two different databases: https://datasette-cross-database-joins-prototype.now.sh/memory?sql=select+fivethirtyeight.%5Blove-actually%2Flove_actually_adjacencies%5D.rowid%2C%0D%0Afivethirtyeight.%5Blove-actually%2Flove_actually_adjacencies%5D.actors%2C%0D%0A%5Bgoogle-trends%5D.%5B20150430_UKDebate%5D.city%0D%0Afrom+fivethirtyeight.%5Blove-actually%2Flove_actually_adjacencies%5D%0D%0Ajoin+%5Bgoogle-trends%5D.%5B20150430_UKDebate%5D%0D%0A++on+%5Bgoogle-trends%5D.%5B20150430_UKDebate%5D.rowid+%3D+fivethirtyeight.%5Blove-actually%2Flove_actually_adjacencies%5D.rowid ``` select fivethirtyeight.[love-actually/love_actually_adjacencies].rowid, fivethirtyeight.[love-actually/love_actually_adjacencies].actors, [google-trends].[20150430_UKDebate].city from fivethirtyeight.[love-actually/love_actually_adjacencies] join [google-trends].[20150430_UKDebate] on [google-trends].[20150430_UKDebate].rowid = fivethirtyeight.[love-actually/love_actually_adjacencies].rowid ``` I deployed it like this: datasette publish now --branch=cross-database-joins fivethirtyeight.db google-trends.db --name=datasette-cross-database-joins-prototype ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391752218,https://api.github.com/repos/simonw/datasette/issues/283,391752218,MDEyOklzc3VlQ29tbWVudDM5MTc1MjIxOA==,9599,simonw,2018-05-24T15:15:19Z,2018-05-24T15:15:19Z,OWNER,Most of the time Datasette is used with just a single database file. So maybe it makes sense for this option to be turned on by default and to ALWAYS be available on the Datasette instance homepage unless the user has explicitly disabled it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391752425,https://api.github.com/repos/simonw/datasette/issues/283,391752425,MDEyOklzc3VlQ29tbWVudDM5MTc1MjQyNQ==,9599,simonw,2018-05-24T15:15:51Z,2018-05-24T15:15:51Z,OWNER,"This would make Datasette's SQL features a lot more instantly obvious to people who land on a homepage, which is probably a good thing.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391752629,https://api.github.com/repos/simonw/datasette/issues/283,391752629,MDEyOklzc3VlQ29tbWVudDM5MTc1MjYyOQ==,9599,simonw,2018-05-24T15:16:25Z,2018-05-24T15:16:25Z,OWNER,"Should this support canned queries too? 
I think it should, though that raises interesting questions regarding their URL structure.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391752882,https://api.github.com/repos/simonw/datasette/issues/283,391752882,MDEyOklzc3VlQ29tbWVudDM5MTc1Mjg4Mg==,9599,simonw,2018-05-24T15:17:10Z,2018-05-24T15:17:10Z,OWNER,Another option: give this the `/-/all` URL namespace.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391754506,https://api.github.com/repos/simonw/datasette/issues/283,391754506,MDEyOklzc3VlQ29tbWVudDM5MTc1NDUwNg==,9599,simonw,2018-05-24T15:21:37Z,2018-05-24T15:21:53Z,OWNER,"Giving it `/all/` would be easier since that way the existing URL routes (including canned queries) would all work... but I would have to teach it NOT to expect a database content hash on that URL. Or maybe it should still have a content hash (to enable far-future cache expiry headers on query results) but the hash should be constructed out of all of the other database hashes concatenated together. That way the URLs would be `/all-5de27e3` and `/all-5de27e3/canned-query-name` Only downside: this would make it impossible to have a database file with the name `all.db`. I think that's probably an OK trade-off. You could turn the feature off with a config flag if you really want to use that filename (for whatever reason). How about `/-all-5de27e3/` instead to avoid collisions?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391755300,https://api.github.com/repos/simonw/datasette/issues/283,391755300,MDEyOklzc3VlQ29tbWVudDM5MTc1NTMwMA==,9599,simonw,2018-05-24T15:23:37Z,2018-05-24T15:23:37Z,OWNER,On the `/-all-5de27e3` page we can show the regular https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3 interface but instead of the list of tables we can show a list of attached databases plus some help text showing how to construct a cross-database join.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391756841,https://api.github.com/repos/simonw/datasette/issues/283,391756841,MDEyOklzc3VlQ29tbWVudDM5MTc1Njg0MQ==,9599,simonw,2018-05-24T15:27:42Z,2018-05-24T15:27:42Z,OWNER,"For an example query that pre-populates that textarea... maybe a UNION that pulls the first 10 rows from the first table of each of the first two databases? 
``` select * from (select rowid, actors from fivethirtyeight.[love-actually/love_actually_adjacencies] limit 10) union all select * from (select rowid, city from [google-trends].[20150430_UKDebate] limit 10) ``` https://datasette-cross-database-joins-prototype.now.sh/memory?sql=select+*+from+%28select+rowid%2C+actors+from+fivethirtyeight.%5Blove-actually%2Flove_actually_adjacencies%5D+limit+10%29%0D%0A+++union+all%0D%0Aselect+*+from+%28select+rowid%2C+city+from+%5Bgoogle-trends%5D.%5B20150430_UKDebate%5D+limit+10%29","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/284#issuecomment-391765706,https://api.github.com/repos/simonw/datasette/issues/284,391765706,MDEyOklzc3VlQ29tbWVudDM5MTc2NTcwNg==,9599,simonw,2018-05-24T15:52:24Z,2018-05-24T15:52:24Z,OWNER,I'm not crazy about the `enable_` prefix on these.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326182814,Ability to enable/disable specific features via --config, https://github.com/simonw/datasette/issues/284#issuecomment-391765973,https://api.github.com/repos/simonw/datasette/issues/284,391765973,MDEyOklzc3VlQ29tbWVudDM5MTc2NTk3Mw==,9599,simonw,2018-05-24T15:53:08Z,2018-05-24T15:53:08Z,OWNER,This will also give us a mechanism for turning on and off the cross-database joins feature from #283,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326182814,Ability to enable/disable specific features via --config, https://github.com/simonw/datasette/issues/284#issuecomment-391766420,https://api.github.com/repos/simonw/datasette/issues/284,391766420,MDEyOklzc3VlQ29tbWVudDM5MTc2NjQyMA==,9599,simonw,2018-05-24T15:54:33Z,2018-05-24T15:54:33Z,OWNER,"Maybe `allow_sql`, `allow_facet` and `allow_download`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326182814,Ability to enable/disable specific features via --config, https://github.com/simonw/datasette/issues/283#issuecomment-391768302,https://api.github.com/repos/simonw/datasette/issues/283,391768302,MDEyOklzc3VlQ29tbWVudDM5MTc2ODMwMg==,9599,simonw,2018-05-24T16:00:05Z,2018-05-24T16:00:05Z,OWNER,I like `/-/all-5de27e3` for this (with `/-/all` redirecting to the correct hash),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/275#issuecomment-391771202,https://api.github.com/repos/simonw/datasette/issues/275,391771202,MDEyOklzc3VlQ29tbWVudDM5MTc3MTIwMg==,9599,simonw,2018-05-24T16:08:41Z,2018-05-24T16:08:41Z,OWNER,"So the lookup priority order should be: * table level in metadata * database level in metadata * root level in metadata * `--config` options passed to `datasette serve` * `DATASETTE_X` environment variables","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324720095,"""config"" section in metadata.json (root, database and table level)", 
https://github.com/simonw/datasette/issues/275#issuecomment-391771658,https://api.github.com/repos/simonw/datasette/issues/275,391771658,MDEyOklzc3VlQ29tbWVudDM5MTc3MTY1OA==,9599,simonw,2018-05-24T16:09:55Z,2018-05-24T16:09:55Z,OWNER,It feels slightly weird continuing to call it `metadata.json` as it starts to grow support for config options (which already started with the `units` and `facets` keys) but I can live with that.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324720095,"""config"" section in metadata.json (root, database and table level)", https://github.com/simonw/datasette/issues/284#issuecomment-391912392,https://api.github.com/repos/simonw/datasette/issues/284,391912392,MDEyOklzc3VlQ29tbWVudDM5MTkxMjM5Mg==,9599,simonw,2018-05-25T01:16:56Z,2018-05-25T01:17:13Z,OWNER,`allow_sql` should only affect the `?sql=` parameter and whether or not the form is displayed. You should still be able to use and execute canned queries even if this option is turned off.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326182814,Ability to enable/disable specific features via --config, https://github.com/simonw/datasette/issues/284#issuecomment-391950691,https://api.github.com/repos/simonw/datasette/issues/284,391950691,MDEyOklzc3VlQ29tbWVudDM5MTk1MDY5MQ==,9599,simonw,2018-05-25T06:01:23Z,2018-05-25T06:05:02Z,OWNER,"Demo: datasette publish now --branch=master fixtures.db \ --source=""#284 Demo"" \ --source_url=""https://github.com/simonw/datasette/issues/284"" \ --extra-options ""--config allow_sql:off --config allow_facet:off --config allow_download:off"" \ --name=datasette-demo-284 now alias https://datasette-demo-284-jogjwngegj.now.sh datasette-demo-284.now.sh https://datasette-demo-284.now.sh/ Note the following: * https://datasette-demo-284.now.sh/fixtures-fda0fea has no SQL input textarea * https://datasette-demo-284.now.sh/fixtures-fda0fea has no database download link * https://datasette-demo-284.now.sh/fixtures-fda0fea.db returns 403 forbidden * https://datasette-demo-284.now.sh/fixtures-fda0fea?sql=select%20*%20from%20sqlite_master throws error 400 * https://datasette-demo-284.now.sh/fixtures-fda0fea/facetable shows no suggested facets * https://datasette-demo-284.now.sh/fixtures-fda0fea/facetable?_facet=city_id throws error 400","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326182814,Ability to enable/disable specific features via --config, https://github.com/simonw/datasette/issues/286#issuecomment-392121500,https://api.github.com/repos/simonw/datasette/issues/286,392121500,MDEyOklzc3VlQ29tbWVudDM5MjEyMTUwMA==,9599,simonw,2018-05-25T17:06:46Z,2018-05-25T17:06:46Z,OWNER,"A few extra thoughts: * Some users may want to opt out of this. We could have `--config version_in_hash:false` * should this affect the filename for the downloadable copy of the SQLite database? Maybe that should stay as just the hash of the contents, but that's a fair bit more complex * What about users who stick with the same version of datasette but deploy changes to their custom templates - how can we help them cache bust? 
Maybe with `--config cache_version:2`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326599525,Database hash should include current datasette version, https://github.com/simonw/datasette/issues/286#issuecomment-392121743,https://api.github.com/repos/simonw/datasette/issues/286,392121743,MDEyOklzc3VlQ29tbWVudDM5MjEyMTc0Mw==,9599,simonw,2018-05-25T17:07:36Z,2018-05-25T17:07:36Z,OWNER,This is also a great excuse to finally write up some detailed documentation on Datasette's caching strategy,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326599525,Database hash should include current datasette version, https://github.com/simonw/datasette/issues/267#issuecomment-392121905,https://api.github.com/repos/simonw/datasette/issues/267,392121905,MDEyOklzc3VlQ29tbWVudDM5MjEyMTkwNQ==,9599,simonw,2018-05-25T17:08:14Z,2018-05-25T17:08:14Z,OWNER,See also #286,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323716411,"Documentation for URL hashing, redirects and cache policy", https://github.com/simonw/datasette/issues/259#issuecomment-392212119,https://api.github.com/repos/simonw/datasette/issues/259,392212119,MDEyOklzc3VlQ29tbWVudDM5MjIxMjExOQ==,9599,simonw,2018-05-25T23:22:26Z,2018-05-25T23:22:26Z,OWNER,"This should detect any table which can be linked to the current table via some other table, based on the other table having a foreign key to them both. These join tables could be arbitrarily complicated. They might have foreign keys to more than two other tables, maybe even multiple foreign keys to the same column. Ideally M2M detection would catch all of these cases. Maybe the resulting inspect data looks something like this: ``` ""artists"": { ... ""m2m"": [{ ""other_table"": ""festivals"", ""through"": ""performances"", ""our_fk"": ""artist_id"", ""other_fk"": ""performance_id"" }] ``` Let's ignore compound primary keys: we'll have it detect m2m relationships where the join table has foreign keys to a single primary key on the other two tables.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322787470,inspect() should detect many-to-many relationships, https://github.com/simonw/datasette/issues/276#issuecomment-392279508,https://api.github.com/repos/simonw/datasette/issues/276,392279508,MDEyOklzc3VlQ29tbWVudDM5MjI3OTUwOA==,9599,simonw,2018-05-26T18:32:07Z,2018-05-26T18:32:07Z,OWNER,Related: I started the documentation for using SpatiaLite with Datasette here: https://datasette.readthedocs.io/en/latest/spatialite.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-392279644,https://api.github.com/repos/simonw/datasette/issues/276,392279644,MDEyOklzc3VlQ29tbWVudDM5MjI3OTY0NA==,9599,simonw,2018-05-26T18:34:21Z,2018-05-26T18:34:21Z,OWNER,"I've been thinking a bit about modifying the SQL select statement used for the table view recently. 
I've run into a few examples of SQLite databases that slow to a crawl when viewed with Datasette because the rows are too big, so there's definitely scope for supporting custom select clauses (avoiding some columns, showing length(colname) for others).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/289#issuecomment-392288990,https://api.github.com/repos/simonw/datasette/issues/289,392288990,MDEyOklzc3VlQ29tbWVudDM5MjI4ODk5MA==,9599,simonw,2018-05-26T21:24:10Z,2018-05-26T21:24:10Z,OWNER,An example of a query where you might want to use this option: https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3?sql=select+rowid%2C+*+from+%5Balcohol-consumption%2Fdrinks%5D+order+by+random%28%29+limit+1,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326768188,?_ttl= parameter to control caching, https://github.com/simonw/datasette/issues/289#issuecomment-392291605,https://api.github.com/repos/simonw/datasette/issues/289,392291605,MDEyOklzc3VlQ29tbWVudDM5MjI5MTYwNQ==,9599,simonw,2018-05-26T22:20:02Z,2018-05-26T22:20:02Z,OWNER,Documented here https://datasette.readthedocs.io/en/latest/json_api.html#special-table-arguments and here: https://datasette.readthedocs.io/en/latest/config.html#default-cache-ttl,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326768188,?_ttl= parameter to control caching, https://github.com/simonw/datasette/issues/289#issuecomment-392291716,https://api.github.com/repos/simonw/datasette/issues/289,392291716,MDEyOklzc3VlQ29tbWVudDM5MjI5MTcxNg==,9599,simonw,2018-05-26T22:22:47Z,2018-05-26T22:22:47Z,OWNER,Demo: hit refresh on https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3?sql=select+rowid%2C+*+from+%5Balcohol-consumption%2Fdrinks%5D+order+by+random%28%29+limit+1&_ttl=0,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326768188,?_ttl= parameter to control caching, https://github.com/simonw/datasette/issues/287#issuecomment-392296758,https://api.github.com/repos/simonw/datasette/issues/287,392296758,MDEyOklzc3VlQ29tbWVudDM5MjI5Njc1OA==,9599,simonw,2018-05-27T00:32:53Z,2018-05-27T00:32:53Z,OWNER,Docs: https://datasette.readthedocs.io/en/latest/json_api.html#different-shapes,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326617744,?_shape=arrayfirst, https://github.com/simonw/datasette/issues/285#issuecomment-392297392,https://api.github.com/repos/simonw/datasette/issues/285,392297392,MDEyOklzc3VlQ29tbWVudDM5MjI5NzM5Mg==,9599,simonw,2018-05-27T00:50:27Z,2018-05-27T00:50:27Z,OWNER,"I ran a very rough micro-benchmark on the new `num_sql_threads` config option. 
datasette --config num_sql_threads:1 fivethirtyeight.db Then ab -n 100 -c 10 'http://127.0.0.1:8011/fivethirtyeight-2628db9/twitter-ratio%2Fsenators' | Number of threads | Requests/second | |---|---| | 1 | 4.57 | | 3 | 9.77 | | 10 | 13.53 | | 20 | 15.24 | | 50 | 8.21 | ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326189744,num_threads and cache_max_age should be --config options, https://github.com/simonw/datasette/issues/285#issuecomment-392297508,https://api.github.com/repos/simonw/datasette/issues/285,392297508,MDEyOklzc3VlQ29tbWVudDM5MjI5NzUwOA==,9599,simonw,2018-05-27T00:53:35Z,2018-05-27T00:53:35Z,OWNER,Documentation: http://datasette.readthedocs.io/en/latest/config.html#num-sql-threads,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326189744,num_threads and cache_max_age should be --config options, https://github.com/simonw/datasette/issues/291#issuecomment-392302406,https://api.github.com/repos/simonw/datasette/issues/291,392302406,MDEyOklzc3VlQ29tbWVudDM5MjMwMjQwNg==,9599,simonw,2018-05-27T03:18:06Z,2018-05-27T03:18:06Z,OWNER,"My first attempt at this was to have plugins depend on each other - so there would be a `datasette-leaflet` plugin which adds Leaflet to the page, and the `datasette-cluster-map` and `datasette-leaflet-geojson` plugins would depend on that plugin. I tried this and it didn't work, because it turns out the order in which plugins are loaded isn't predictable. `datasette-cluster-map` ended up adding its script link before Leaflet had been loaded by `datasette-leaflet`, resulting in JavaScript errors.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326783670,Avoid plugins accidentally loading dependencies twice, https://github.com/simonw/datasette/issues/291#issuecomment-392302416,https://api.github.com/repos/simonw/datasette/issues/291,392302416,MDEyOklzc3VlQ29tbWVudDM5MjMwMjQxNg==,9599,simonw,2018-05-27T03:18:16Z,2018-05-27T03:18:16Z,OWNER,For the moment then I'm going with a really simple solution: when iterating through `extra_css_urls` and `extra_js_urls` de-dupe by URL and avoid outputting the same link twice.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326783670,Avoid plugins accidentally loading dependencies twice, https://github.com/simonw/datasette/issues/291#issuecomment-392302456,https://api.github.com/repos/simonw/datasette/issues/291,392302456,MDEyOklzc3VlQ29tbWVudDM5MjMwMjQ1Ng==,9599,simonw,2018-05-27T03:19:24Z,2018-05-27T03:19:24Z,OWNER,The big gap in this solution is conflicting versions: I don't yet have a story for what happens if two plugins attempt to load different versions of Leaflet. 
,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326783670,Avoid plugins accidentally loading dependencies twice, https://github.com/simonw/datasette/issues/231#issuecomment-392305776,https://api.github.com/repos/simonw/datasette/issues/231,392305776,MDEyOklzc3VlQ29tbWVudDM5MjMwNTc3Ng==,9599,simonw,2018-05-27T05:10:46Z,2018-05-27T05:10:46Z,OWNER,These plugin config options should be exposed to JavaScript as `datasette.config.plugins`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316323336,metadata.json support for plugin configuration options, https://github.com/simonw/datasette/issues/276#issuecomment-392316250,https://api.github.com/repos/simonw/datasette/issues/276,392316250,MDEyOklzc3VlQ29tbWVudDM5MjMxNjI1MA==,9599,simonw,2018-05-27T08:59:46Z,2018-05-27T08:59:46Z,OWNER,It looks like we can use the `geometry_columns` table to introspect which columns are SpatiaLite geometries. It includes a `geometry_type` integer which is documented here: https://www.gaia-gis.it/fossil/libspatialite/wiki?name=switching-to-4.0,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-392316306,https://api.github.com/repos/simonw/datasette/issues/276,392316306,MDEyOklzc3VlQ29tbWVudDM5MjMxNjMwNg==,9599,simonw,2018-05-27T09:00:46Z,2018-05-27T09:00:46Z,OWNER,Relevant to this ticket: I've been playing with a plugin that automatically renders any GeoJSON cells as leaflet maps: https://github.com/simonw/datasette-leaflet-geojson,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/292#issuecomment-392316673,https://api.github.com/repos/simonw/datasette/issues/292,392316673,MDEyOklzc3VlQ29tbWVudDM5MjMxNjY3Mw==,9599,simonw,2018-05-27T09:08:06Z,2018-05-27T09:08:06Z,OWNER,Open question: how should this affect the row page? Just because columns were hidden on the table page doesn't necessarily mean they should be hidden on the row page as well. 
,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392316701,https://api.github.com/repos/simonw/datasette/issues/292,392316701,MDEyOklzc3VlQ29tbWVudDM5MjMxNjcwMQ==,9599,simonw,2018-05-27T09:08:49Z,2018-05-27T09:08:49Z,OWNER,I could certainly see people wanting different custom column selects for the row page compared to the table page.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392338130,https://api.github.com/repos/simonw/datasette/issues/292,392338130,MDEyOklzc3VlQ29tbWVudDM5MjMzODEzMA==,9599,simonw,2018-05-27T15:09:18Z,2018-05-27T15:09:28Z,OWNER,"Here's my first sketch at a metadata format for this: * `columns`: optional list of columns to include - if missing, shows all * `column_selects`: dictionary mapping column names to alternative select clauses `column_selects` can also invent new keys and use them to create derived columns. These new keys will be selected at the end of the list of columns UNLESS they are mentioned in `columns`, in which case that sequence will define the order. Can you facet by things that are customized using `column_selects`? Yes, and let's try running suggested facets against those columns as well. ``` { ""databases"": { ""databasename"": { ""tables"": { ""tablename"": { ""columns"": [ ""id"", ""name"", ""size"" ], ""column_selects"": { ""name"": ""upper(name)"", ""geo_json"": ""AsGeoJSON(Geometry)"" } ""row_columns"": [...] ""row_column_selects"": {...} } ``` The `row_columns` and `row_column_selects` properties work the same as the `column*` ones, except they are applied on the row page instead. If omitted, the `column*` ones will be used on the row page as well. 
If you want the row page to switch back to Datasette's default behaviour you can set `""row_columns"": [], ""row_column_selects"": {}`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392342269,https://api.github.com/repos/simonw/datasette/issues/292,392342269,MDEyOklzc3VlQ29tbWVudDM5MjM0MjI2OQ==,9599,simonw,2018-05-27T15:55:40Z,2018-05-27T16:01:26Z,OWNER,"Here's the metadata I tried against that first working prototype: ``` { ""databases"": { ""timezones"": { ""tables"": { ""timezones"": { ""columns"": [""PK_UID""], ""column_selects"": { ""upper_tzid"": ""upper(tzid)"", ""Geometry"": ""AsGeoJSON(Geometry)"" } } } }, ""wtr"": { ""tables"": { ""license_frequency"": { ""columns"": [""id"", ""license"", ""tx_rx"", ""frequency""], ""column_selects"": { ""latitude"": ""Y(Geometry)"", ""longitude"": ""X(Geometry)"" } } } } } } ``` Run using this: datasette timezones.db wtr.db \ --reload --debug --load-extension=/usr/local/lib/mod_spatialite.dylib \ -m column-metadata.json --config sql_time_limit_ms:10000 Usefully, the `--reload` flag detects changes to the `metadata.json` file as well as Datasette's own Python code.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392342947,https://api.github.com/repos/simonw/datasette/issues/292,392342947,MDEyOklzc3VlQ29tbWVudDM5MjM0Mjk0Nw==,9599,simonw,2018-05-27T16:01:43Z,2018-05-27T16:01:43Z,OWNER,I'd still like to be able to over-ride this using querystring arguments.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392343690,https://api.github.com/repos/simonw/datasette/issues/292,392343690,MDEyOklzc3VlQ29tbWVudDM5MjM0MzY5MA==,9599,simonw,2018-05-27T16:08:25Z,2018-05-27T16:08:40Z,OWNER,"Turns out it's actually possible to pull data from other tables using the mechanism in the prototype: ``` { ""databases"": { ""wtr"": { ""tables"": { ""license"": { ""column_selects"": { ""count"": ""(select count(*) from license_frequency where license_frequency.license = license.id)"" } } } } } } ``` Performance using this technique is pretty terrible though: ![2018-05-27 at 9 07 am](https://user-images.githubusercontent.com/9599/40588124-8169d7fa-618d-11e8-9880-ccc1904b05d9.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392345062,https://api.github.com/repos/simonw/datasette/issues/292,392345062,MDEyOklzc3VlQ29tbWVudDM5MjM0NTA2Mg==,9599,simonw,2018-05-27T16:26:53Z,2018-05-27T16:26:53Z,OWNER,There needs to be a way to turn this off and return to Datasette's default behaviour. 
Maybe a `?_raw=1` querystring parameter for the table view.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392350495,https://api.github.com/repos/simonw/datasette/issues/292,392350495,MDEyOklzc3VlQ29tbWVudDM5MjM1MDQ5NQ==,9599,simonw,2018-05-27T17:47:31Z,2018-05-27T17:47:31Z,OWNER,"Querystring design: * `?_column=a&_column=b` - equivalent of `""columns"": [""a"", ""b""]` in `metadata.json` * `?_select_nameupper=upper(name)` - equivalent of `""column_selects"": {""nameupper"": ""upper(name)""}`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392350568,https://api.github.com/repos/simonw/datasette/issues/292,392350568,MDEyOklzc3VlQ29tbWVudDM5MjM1MDU2OA==,9599,simonw,2018-05-27T17:48:45Z,2018-05-27T17:54:41Z,OWNER,"If any `?_column=` parameters are provided the metadata version is completely ignored. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392350980,https://api.github.com/repos/simonw/datasette/issues/292,392350980,MDEyOklzc3VlQ29tbWVudDM5MjM1MDk4MA==,9599,simonw,2018-05-27T17:56:30Z,2018-05-27T17:56:50Z,OWNER,"Should `?_raw=1` also turn off foreign key expansions? No, we will eventually provide a separate mechanism for that (or leave it to nerds who care to figure out using JSON or CSV export).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/150#issuecomment-392568047,https://api.github.com/repos/simonw/datasette/issues/150,392568047,MDEyOklzc3VlQ29tbWVudDM5MjU2ODA0Nw==,9599,simonw,2018-05-28T16:41:28Z,2018-05-28T16:41:28Z,OWNER,Closing this as obsolete since we have facets now.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276704327,_group_count= feature improvements, https://github.com/simonw/datasette/issues/116#issuecomment-392574208,https://api.github.com/repos/simonw/datasette/issues/116,392574208,MDEyOklzc3VlQ29tbWVudDM5MjU3NDIwOA==,9599,simonw,2018-05-28T17:23:41Z,2018-05-28T17:23:41Z,OWNER,"I'm handling this as separate documentation sections instead, e.g. 
http://datasette.readthedocs.io/en/latest/spatialite.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274884209,Add documentation section about SQLite extensions, https://github.com/simonw/datasette/issues/79#issuecomment-392574358,https://api.github.com/repos/simonw/datasette/issues/79,392574358,MDEyOklzc3VlQ29tbWVudDM5MjU3NDM1OA==,9599,simonw,2018-05-28T17:24:48Z,2018-05-28T17:24:48Z,OWNER,Closing this as obsolete in favor of other issues [tagged documentation](https://github.com/simonw/datasette/labels/documentation).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273569068,Add more detailed API documentation to the README, https://github.com/simonw/datasette/issues/73#issuecomment-392574415,https://api.github.com/repos/simonw/datasette/issues/73,392574415,MDEyOklzc3VlQ29tbWVudDM5MjU3NDQxNQ==,9599,simonw,2018-05-28T17:25:14Z,2018-05-28T17:25:14Z,OWNER,I implemented this as `?_ttl=0` in #289 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273296178,_nocache=1 query string option for use with sort-by-random, https://github.com/simonw/datasette/issues/36#issuecomment-392575160,https://api.github.com/repos/simonw/datasette/issues/36,392575160,MDEyOklzc3VlQ29tbWVudDM5MjU3NTE2MA==,9599,simonw,2018-05-28T17:30:52Z,2018-05-28T17:30:52Z,OWNER,"I've changed my mind about this. ""Select every record on the 3rd day of the month"" doesn't strike me as an actually useful feature. ""Select every record in 2018 / in May 2018 / on 1st May 2018"", if you are using the SQLite-preferred datestring format, are already supported using LIKE queries (or the startswith filter): * https://fivethirtyeight.datasettes.com/fivethirtyeight/inconvenient-sequel%2Fratings?timestamp__startswith=2017 * https://fivethirtyeight.datasettes.com/fivethirtyeight/inconvenient-sequel%2Fratings?timestamp__startswith=2017-08 * https://fivethirtyeight.datasettes.com/fivethirtyeight/inconvenient-sequel%2Fratings?timestamp__startswith=2017-08-29 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268262480,"date, year, month and day querystring lookups", https://github.com/simonw/datasette/issues/121#issuecomment-392575448,https://api.github.com/repos/simonw/datasette/issues/121,392575448,MDEyOklzc3VlQ29tbWVudDM5MjU3NTQ0OA==,9599,simonw,2018-05-28T17:33:07Z,2018-05-28T17:33:07Z,OWNER,"This shouldn't be a comma-separated list, it should be an argument you can pass multiple times to better match #255 and #292 Maybe `?_json=foo&_json=bar` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275089535,?_json=foo&_json=bar query string argument , https://github.com/simonw/datasette/issues/31#issuecomment-392580715,https://api.github.com/repos/simonw/datasette/issues/31,392580715,MDEyOklzc3VlQ29tbWVudDM5MjU4MDcxNQ==,9599,simonw,2018-05-28T18:10:45Z,2018-05-28T18:10:45Z,OWNER,"Oops, that commit should have referenced #121 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268087542,Idea: colour scheme based on sha256 of db, 
https://github.com/simonw/datasette/issues/121#issuecomment-392580902,https://api.github.com/repos/simonw/datasette/issues/121,392580902,MDEyOklzc3VlQ29tbWVudDM5MjU4MDkwMg==,9599,simonw,2018-05-28T18:11:51Z,2018-05-28T18:11:51Z,OWNER,"Implemented in 76d11eb768e2f05f593c4d37a25280c0fcdf8fd6 Documented here: http://datasette.readthedocs.io/en/latest/json_api.html#special-json-arguments","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275089535,?_json=foo&_json=bar query string argument , https://github.com/simonw/datasette/issues/34#issuecomment-392600866,https://api.github.com/repos/simonw/datasette/issues/34,392600866,MDEyOklzc3VlQ29tbWVudDM5MjYwMDg2Ng==,9599,simonw,2018-05-28T20:45:34Z,2018-05-28T20:45:42Z,OWNER,"This is an accidental duplicate, work is now taking place in #266","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268176505,Support CSV export with a .csv extension, https://github.com/simonw/datasette/issues/38#issuecomment-392601114,https://api.github.com/repos/simonw/datasette/issues/38,392601114,MDEyOklzc3VlQ29tbWVudDM5MjYwMTExNA==,9599,simonw,2018-05-28T20:47:31Z,2018-05-28T20:47:31Z,OWNER,I think the way Datasette executes SQL queries in a thread pool introduced in #45 is a good solution for this ticket.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268462768,Experiment with patterns for concurrent long running queries, https://github.com/simonw/datasette/issues/56#issuecomment-392601478,https://api.github.com/repos/simonw/datasette/issues/56,392601478,MDEyOklzc3VlQ29tbWVudDM5MjYwMTQ3OA==,9599,simonw,2018-05-28T20:50:24Z,2018-05-28T20:50:24Z,OWNER,I'm going to close this as WONTFIX for the moment. Once Plugins #14 grows the ability to add extra URL paths and views someone who needs this could build it as a plugin instead.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127443,Easy way to block search engine crawling in robots.txt, https://github.com/simonw/datasette/issues/97#issuecomment-392602334,https://api.github.com/repos/simonw/datasette/issues/97,392602334,MDEyOklzc3VlQ29tbWVudDM5MjYwMjMzNA==,9599,simonw,2018-05-28T20:57:21Z,2018-05-28T20:57:21Z,OWNER,"The `/.json` endpoint is more of an implementation detail of the homepage at this point. 
A better, documented ( http://datasette.readthedocs.io/en/stable/introspection.html#inspect ) endpoint for finding all of the databases and tables is https://parlgov.datasettes.com/-/inspect.json","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274022950,Link to JSON for the list of tables , https://github.com/simonw/datasette/issues/142#issuecomment-392602558,https://api.github.com/repos/simonw/datasette/issues/142,392602558,MDEyOklzc3VlQ29tbWVudDM5MjYwMjU1OA==,9599,simonw,2018-05-28T20:58:59Z,2018-05-28T20:58:59Z,OWNER,I'll have the error message display a link to the documentation.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275917760,Show extra instructions with the interrupted, https://github.com/simonw/datasette/issues/142#issuecomment-392605574,https://api.github.com/repos/simonw/datasette/issues/142,392605574,MDEyOklzc3VlQ29tbWVudDM5MjYwNTU3NA==,9599,simonw,2018-05-28T21:25:05Z,2018-05-28T21:25:05Z,OWNER,"![2018-05-28 at 2 24 pm](https://user-images.githubusercontent.com/9599/40629887-e991c61c-6282-11e8-9d66-6387f90e87ca.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275917760,Show extra instructions with the interrupted, https://github.com/simonw/datasette/issues/144#issuecomment-392606044,https://api.github.com/repos/simonw/datasette/issues/144,392606044,MDEyOklzc3VlQ29tbWVudDM5MjYwNjA0NA==,9599,simonw,2018-05-28T21:29:42Z,2018-05-28T21:29:42Z,OWNER,"The other major limitation of APSW is its treatment of unicode: https://rogerbinns.github.io/apsw/types.html - it tells you that it is your responsibility to ensure that TEXT columns in your SQLite database are correctly encoded. Since Datasette is designed to work against ANY SQLite database that someone may have already created, I see that as a show-stopping limitation. Thanks to https://github.com/coleifer/sqlite-vtfunc I now have a working mechanism for virtual tables (I've even built a demo plugin with them - https://github.com/simonw/datasette-sql-scraper ) which was the main thing that interested me about APSW. I'm going to close this as WONTFIX - I think Python's built-in `sqlite3` is good enough, and is now so firmly embedded in the project that making it pluggable would be more trouble than it's worth.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276091279,apsw as alternative sqlite3 binding (for full text search), https://github.com/simonw/datasette/issues/179#issuecomment-392606418,https://api.github.com/repos/simonw/datasette/issues/179,392606418,MDEyOklzc3VlQ29tbWVudDM5MjYwNjQxOA==,9599,simonw,2018-05-28T21:32:37Z,2018-05-28T21:32:37Z,OWNER,"> It could also be useful to allow users to import a python file containing custom functions that can that be loaded into scope and made available to custom templates. 
That's now covered by the plugins mechanism - you can create plugins that define custom template functions: http://datasette.readthedocs.io/en/stable/plugins.html#prepare-jinja2-environment-env","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",288438570,More metadata options for template authors , https://github.com/simonw/datasette/issues/188#issuecomment-376594727,https://api.github.com/repos/simonw/datasette/issues/188,376594727,MDEyOklzc3VlQ29tbWVudDM3NjU5NDcyNw==,9599,simonw,2018-03-27T16:46:49Z,2018-05-28T21:34:34Z,OWNER,"One point of complexity: datasette can be used to bundle multiple .db files into a single ""app"". I think that's OK. We could require that the `datasette_files` table is present in the first database file passed on the command-line. Or we could even construct a search path and consult multiple versions of the table spread across multiple files. That said... any configuration that corresponds to a specific table should live in the same database file as that table. Ditto for general metadata: if we have license/source information for a specific table or database that information should be able to live in the same .db file as the data.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309047460,Ability to bundle metadata and templates inside the SQLite file, https://github.com/simonw/datasette/issues/276#issuecomment-392815673,https://api.github.com/repos/simonw/datasette/issues/276,392815673,MDEyOklzc3VlQ29tbWVudDM5MjgxNTY3Mw==,9599,simonw,2018-05-29T15:17:04Z,2018-05-29T15:17:04Z,OWNER,"I'm coming round to the idea that this should be baked into Datasette core - see above referenced issues for some of the explorations I've been doing around this area. Datasette should absolutely work without SpatiaLite, but it's such a huge bonus part of the SQLite ecosystem that I'm happy to ship features that take advantage of it without being relegated to plugins. I'm also becoming aware that there aren't really that many other interesting loadable extensions for SQLite. If SpatiaLite was one of dozens I'd feel that a rule that ""anything dependent on an extension lives in a plugin"" would make sense, but as it stands I think 99% of the time the only loadable extensions people will be using will be SpatiaLite and json1 (and json1 is available in the amalgamation anyway). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/191#issuecomment-392822050,https://api.github.com/repos/simonw/datasette/issues/191,392822050,MDEyOklzc3VlQ29tbWVudDM5MjgyMjA1MA==,9599,simonw,2018-05-29T15:33:25Z,2018-05-29T15:33:25Z,OWNER,"I don't know how it happened, but I've somehow got myself into a state where my local SQLite for Python 3 on OS X is `3.23.1`: ``` ~ $ python3 Python 3.6.5 (default, Mar 30 2018, 06:41:53) [GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.39.2)] on darwin Type ""help"", ""copyright"", ""credits"" or ""license"" for more information. >>> import sqlite3 >>> sqlite3.connect(':memory:').execute('select sqlite_version()').fetchall() [('3.23.1',)] >>> ``` Maybe I did something in homebrew that changed this? 
I'd love to understand what exactly I did to get to this state.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/276#issuecomment-392825746,https://api.github.com/repos/simonw/datasette/issues/276,392825746,MDEyOklzc3VlQ29tbWVudDM5MjgyNTc0Ng==,45057,russss,2018-05-29T15:42:53Z,2018-05-29T15:42:53Z,CONTRIBUTOR,"I haven't had time to look further into this, but if doing this as a plugin results in useful hooks then I think we should do it that way. We could always require the plugin as a standard dependency. I think this is going to result in quite a bit of refactoring anyway so it's a good time to add hooks regardless. On the other hand, if we have to add lots of specialist hooks for it then maybe it's worth integrating into the core.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/191#issuecomment-392828475,https://api.github.com/repos/simonw/datasette/issues/191,392828475,MDEyOklzc3VlQ29tbWVudDM5MjgyODQ3NQ==,119974,coleifer,2018-05-29T15:50:18Z,2018-05-29T15:50:18Z,NONE,"Python standard-library SQLite dynamically links against the system sqlite3. So presumably you installed a more up-to-date sqlite3 somewhere on your `LD_LIBRARY_PATH`. To compile a statically-linked pysqlite you need to include an amalgamation in the project root when building the extension. Read the relevant setup.py.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/191#issuecomment-392831543,https://api.github.com/repos/simonw/datasette/issues/191,392831543,MDEyOklzc3VlQ29tbWVudDM5MjgzMTU0Mw==,9599,simonw,2018-05-29T15:58:33Z,2018-05-29T15:58:33Z,OWNER,"I ran an informal survey on twitter and most people were on 3.21 - https://twitter.com/simonw/status/1001487546289815553 Maybe this is from upgrading to the latest OS X release.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/265#issuecomment-392890045,https://api.github.com/repos/simonw/datasette/issues/265,392890045,MDEyOklzc3VlQ29tbWVudDM5Mjg5MDA0NQ==,231923,yschimke,2018-05-29T18:37:49Z,2018-05-29T18:37:49Z,NONE,"Just about to ask for this! 
Move this page https://github.com/simonw/datasette/wiki/Datasettes into a datasette, with some concept of versioning as well.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323677499,Add links to example Datasette instances to appropiate places in docs, https://github.com/simonw/datasette/issues/97#issuecomment-392895733,https://api.github.com/repos/simonw/datasette/issues/97,392895733,MDEyOklzc3VlQ29tbWVudDM5Mjg5NTczMw==,231923,yschimke,2018-05-29T18:51:35Z,2018-05-29T18:51:35Z,NONE,Do you have an existing example with views?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274022950,Link to JSON for the list of tables , https://github.com/simonw/datasette/issues/298#issuecomment-392917380,https://api.github.com/repos/simonw/datasette/issues/298,392917380,MDEyOklzc3VlQ29tbWVudDM5MjkxNzM4MA==,9599,simonw,2018-05-29T19:41:59Z,2018-05-29T19:41:59Z,OWNER,Creating URLs using concatenation as seen in `('https://twitter.com/' || user) as user_url` is likely to have all sorts of useful applications for ad-hoc analysis.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327459829,URLify URLs in results from custom SQL statements / views, https://github.com/simonw/datasette/issues/296#issuecomment-392840811,https://api.github.com/repos/simonw/datasette/issues/296,392840811,MDEyOklzc3VlQ29tbWVudDM5Mjg0MDgxMQ==,9599,simonw,2018-05-29T16:26:27Z,2018-05-29T19:43:23Z,OWNER,"Since #275 will allow configs to be overridden at the table and database level it also makes sense to expose a completely evaluated list of configs at: * `/dbname/-/config` * `/dbname/tablename/-/config` Similar to https://fivethirtyeight.datasettes.com/-/config","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327395270,Per-database and per-table /-/ URL namespace, https://github.com/simonw/datasette/issues/296#issuecomment-392918311,https://api.github.com/repos/simonw/datasette/issues/296,392918311,MDEyOklzc3VlQ29tbWVudDM5MjkxODMxMQ==,9599,simonw,2018-05-29T19:44:33Z,2018-05-29T19:44:33Z,OWNER,Should the `tablename` ones also work for views and canned queries? Probably not.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327395270,Per-database and per-table /-/ URL namespace, https://github.com/simonw/datasette/issues/276#issuecomment-392969173,https://api.github.com/repos/simonw/datasette/issues/276,392969173,MDEyOklzc3VlQ29tbWVudDM5Mjk2OTE3Mw==,9599,simonw,2018-05-29T22:32:08Z,2018-05-29T22:32:08Z,OWNER,The more time I spend with SpatiaLite the more convinced I am that this should be default behavior. There's nothing useful about the binary Geometry representation - it's not even valid WKB. 
I'm on board with WKT as the default display in HTML and GeoJSON as the default for `.json`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/265#issuecomment-393003340,https://api.github.com/repos/simonw/datasette/issues/265,393003340,MDEyOklzc3VlQ29tbWVudDM5MzAwMzM0MA==,9599,simonw,2018-05-30T01:44:22Z,2018-05-30T01:44:22Z,OWNER,Funny you should mention that... I'm planning on doing that as part of the official Datasette website at some point soon. A Datasette instance that lists other Datasette instances feels pleasingly appropriate.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323677499,Add links to example Datasette instances to appropiate places in docs, https://github.com/simonw/datasette/issues/276#issuecomment-393014943,https://api.github.com/repos/simonw/datasette/issues/276,393014943,MDEyOklzc3VlQ29tbWVudDM5MzAxNDk0Mw==,9599,simonw,2018-05-30T02:59:53Z,2018-05-30T02:59:53Z,OWNER,I just realised a problem with GeoJSON is that it assumes that the underlying geometry is WGS 84 latitude/longitude points - but it's very possible for a SpatiaLite geometry to contain geometric data that's nothing to do with geospatial projections.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/266#issuecomment-393020749,https://api.github.com/repos/simonw/datasette/issues/266,393020749,MDEyOklzc3VlQ29tbWVudDM5MzAyMDc0OQ==,9599,simonw,2018-05-30T03:42:54Z,2018-05-30T03:42:54Z,OWNER,"Challenge: how to deal with tables where the name ends in `.csv`? I actually have one of these in the test suite at the moment: https://github.com/simonw/datasette/blob/d69ebce53385b7c6fafb85fdab3b136dbf3f332c/tests/fixtures.py#L234-L237","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/265#issuecomment-393064224,https://api.github.com/repos/simonw/datasette/issues/265,393064224,MDEyOklzc3VlQ29tbWVudDM5MzA2NDIyNA==,9599,simonw,2018-05-30T07:48:37Z,2018-05-30T07:48:37Z,OWNER,"https://datasette-registry.now.sh Is now live, powered by https://github.com/simonw/datasette-registry - still needs plenty of work but it's an interesting start.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323677499,Add links to example Datasette instances to appropiate places in docs, https://github.com/simonw/datasette/issues/276#issuecomment-393106520,https://api.github.com/repos/simonw/datasette/issues/276,393106520,MDEyOklzc3VlQ29tbWVudDM5MzEwNjUyMA==,45057,russss,2018-05-30T10:09:25Z,2018-05-30T10:09:25Z,CONTRIBUTOR,"I don't think it's unreasonable to only support spatialite geometries in a coordinate reference system which is at least transformable to WGS84. It would be nice to support different CRSes in the database so conversion to spatialite from the source data is lossless. 
I think the working CRS for datasette should be WGS84 though (leaflet requires it, for example) - it's just a case of calling `ST_Transform(geom, 4326)` on the column while we're loading the data.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/295#issuecomment-393534579,https://api.github.com/repos/simonw/datasette/issues/295,393534579,MDEyOklzc3VlQ29tbWVudDM5MzUzNDU3OQ==,9599,simonw,2018-05-31T13:44:15Z,2018-05-31T13:44:15Z,OWNER,I actually started doing this in 45e502aace6cc1198cc5f9a04d61b4a1860a012b,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327383759,Extract unit tests for inspect out to test_inspect.py, https://github.com/simonw/datasette/issues/243#issuecomment-393544357,https://api.github.com/repos/simonw/datasette/issues/243,393544357,MDEyOklzc3VlQ29tbWVudDM5MzU0NDM1Nw==,9599,simonw,2018-05-31T14:14:49Z,2018-05-31T14:14:49Z,OWNER,"Demo: https://datasette-publish-spatialite-demo.now.sh/spatialite-test-c88bc35?sql=select+AsText(Geometry)+from+HighWays+limit+1%3B Published using `datasette publish now --spatialite /tmp/spatialite-test.db`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",318737808,--spatialite option for datasette publish commands, https://github.com/simonw/datasette/issues/294#issuecomment-393547960,https://api.github.com/repos/simonw/datasette/issues/294,393547960,MDEyOklzc3VlQ29tbWVudDM5MzU0Nzk2MA==,9599,simonw,2018-05-31T14:25:43Z,2018-05-31T14:25:43Z,OWNER,"SpatialLite columns are actually quite a bit more interesting than this - they also have a `geometry_type` (point, polygon, linestring etc), a `coord_dimension` (usually 2 but can be higher) and an `srid`. 
For example: https://datasette-publish-spatialite-demo.now.sh/spatialite-test-c88bc35/geometry_columns ![2018-05-31 at 7 22 am](https://user-images.githubusercontent.com/9599/40787843-6f9600ee-64a3-11e8-84e5-64d7cc69603a.png) The SRID here is particularly interesting, because it helps hint at the fact that the results from these queries won't be latitude/longitude co-ordinates - which means that `AsGeoJSON()` won't return results that can be easily rendered by Leaflet: https://datasette-publish-spatialite-demo.now.sh/spatialite-test-c88bc35?sql=select+AsGeoJSON(Geometry)+from+HighWays%20limit1 Compare with https://timezones-api.now.sh/timezones-a99b2e3/geometry_columns: ![2018-05-31 at 7 25 am](https://user-images.githubusercontent.com/9599/40787991-d2650756-64a3-11e8-936e-2dcce7dd1515.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327365110,inspect should record column types, https://github.com/simonw/datasette/issues/294#issuecomment-393548602,https://api.github.com/repos/simonw/datasette/issues/294,393548602,MDEyOklzc3VlQ29tbWVudDM5MzU0ODYwMg==,9599,simonw,2018-05-31T14:27:41Z,2018-05-31T14:27:56Z,OWNER,Presumably the difference in primary key structure between those two is caused by the fact that the `spatialite-test` database (actually https://www.gaia-gis.it/spatialite-2.3.1/test-2.3.sqlite.gz downloaded from https://www.gaia-gis.it/spatialite-2.3.1/resources.html ) was created by a much older version of SpatialLite - presumably v2.3.1,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327365110,inspect should record column types, https://github.com/simonw/datasette/issues/294#issuecomment-393549215,https://api.github.com/repos/simonw/datasette/issues/294,393549215,MDEyOklzc3VlQ29tbWVudDM5MzU0OTIxNQ==,9599,simonw,2018-05-31T14:29:37Z,2018-05-31T14:29:37Z,OWNER,"Also of note: `spatialite-test` uses readable strings in the `type` column, while `timezones` has a `geometry_type` column with integers in it. 
Those integers are documented here: https://www.gaia-gis.it/fossil/libspatialite/wiki?name=switching-to-4.0 ![2018-05-31 at 7 29 am](https://user-images.githubusercontent.com/9599/40788210-5d0f0dd4-64a4-11e8-8141-0386b5c7b384.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327365110,inspect should record column types, https://github.com/simonw/datasette/issues/297#issuecomment-393554151,https://api.github.com/repos/simonw/datasette/issues/297,393554151,MDEyOklzc3VlQ29tbWVudDM5MzU1NDE1MQ==,9599,simonw,2018-05-31T14:44:37Z,2018-05-31T14:44:37Z,OWNER,I fixed this in https://github.com/simonw/datasette/commit/b18e4515855c3f1eeca3dfcccdbb6df05869084a,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327420945,datasette publish Dockerfile should use python:3.6-slim-stretch, https://github.com/simonw/datasette/issues/303#issuecomment-393557406,https://api.github.com/repos/simonw/datasette/issues/303,393557406,MDEyOklzc3VlQ29tbWVudDM5MzU1NzQwNg==,9599,simonw,2018-05-31T14:54:03Z,2018-05-31T14:54:03Z,OWNER,"Our test fixtures currently have a table with a name ending in `.csv`: https://github.com/simonw/datasette/blob/d69ebce53385b7c6fafb85fdab3b136dbf3f332c/tests/fixtures.py#L234-L237","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328172521,Support table names ending with .json or .csv, https://github.com/simonw/datasette/issues/294#issuecomment-393557968,https://api.github.com/repos/simonw/datasette/issues/294,393557968,MDEyOklzc3VlQ29tbWVudDM5MzU1Nzk2OA==,9599,simonw,2018-05-31T14:55:46Z,2018-05-31T14:55:46Z,OWNER,"I'm not sure what the best JSON shape for this would be considering the potential complexity of geospatial columns. 
I do think it's worth exposing these in the inspect JSON though, mainly so Datasette Registry can keep track of all of the openly available geodata out there.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327365110,inspect should record column types, https://github.com/simonw/datasette/issues/303#issuecomment-393599840,https://api.github.com/repos/simonw/datasette/issues/303,393599840,MDEyOklzc3VlQ29tbWVudDM5MzU5OTg0MA==,9599,simonw,2018-05-31T16:54:22Z,2018-05-31T16:54:32Z,OWNER,The interesting thing about this is that it requires URL routing to become aware of the names of all of the available tables.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328172521,Support table names ending with .json or .csv, https://github.com/simonw/datasette/issues/303#issuecomment-393600441,https://api.github.com/repos/simonw/datasette/issues/303,393600441,MDEyOklzc3VlQ29tbWVudDM5MzYwMDQ0MQ==,9599,simonw,2018-05-31T16:56:25Z,2018-05-31T16:57:41Z,OWNER,"Here's a nasty challenge: what happens if a database has the following two tables: * `blah` * `blah.json` What would the URL be for the JSON endpoint for the `blah` table?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328172521,Support table names ending with .json or .csv, https://github.com/simonw/datasette/issues/304#issuecomment-393610731,https://api.github.com/repos/simonw/datasette/issues/304,393610731,MDEyOklzc3VlQ29tbWVudDM5MzYxMDczMQ==,9599,simonw,2018-05-31T17:29:31Z,2018-05-31T17:30:05Z,OWNER,I prototyped this a while ago here https://github.com/simonw/datasette/commit/04476ead53758044a5f272ae8696b63d6703115e before we had the ``--config`` mechanism.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328229224,Ability to configure SQLite cache_size, https://github.com/simonw/datasette/issues/303#issuecomment-394037368,https://api.github.com/repos/simonw/datasette/issues/303,394037368,MDEyOklzc3VlQ29tbWVudDM5NDAzNzM2OA==,9599,simonw,2018-06-01T23:50:17Z,2018-06-01T23:50:35Z,OWNER,"Solution for the above: support an optional `?_format=json/csv` parameter on the regular table view. 
Then if you have tables with the above colliding names you can use `/db/blah.json?_format=json` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328172521,Support table names ending with .json or .csv, https://github.com/simonw/datasette/issues/304#issuecomment-394400419,https://api.github.com/repos/simonw/datasette/issues/304,394400419,MDEyOklzc3VlQ29tbWVudDM5NDQwMDQxOQ==,9599,simonw,2018-06-04T15:39:03Z,2018-06-04T15:39:03Z,OWNER,"In the interest of getting this shipped, I'm going to ignore the `3.7.10` issue.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328229224,Ability to configure SQLite cache_size, https://github.com/simonw/datasette/issues/304#issuecomment-394412217,https://api.github.com/repos/simonw/datasette/issues/304,394412217,MDEyOklzc3VlQ29tbWVudDM5NDQxMjIxNw==,9599,simonw,2018-06-04T16:13:32Z,2018-06-04T16:13:32Z,OWNER,Docs: http://datasette.readthedocs.io/en/latest/config.html#cache-size-kb,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328229224,Ability to configure SQLite cache_size, https://github.com/simonw/datasette/issues/302#issuecomment-394412784,https://api.github.com/repos/simonw/datasette/issues/302,394412784,MDEyOklzc3VlQ29tbWVudDM5NDQxMjc4NA==,9599,simonw,2018-06-04T16:15:22Z,2018-06-04T16:15:22Z,OWNER,I think this is related to #303,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328171513,test-2.3.sqlite database filename throws a 404, https://github.com/simonw/datasette/issues/266#issuecomment-394417567,https://api.github.com/repos/simonw/datasette/issues/266,394417567,MDEyOklzc3VlQ29tbWVudDM5NDQxNzU2Nw==,9599,simonw,2018-06-04T16:30:48Z,2018-06-04T16:32:55Z,OWNER,"When serving streaming responses, I need to check that a large CSV file doesn't completely max out the CPU in a way that is harmful to the rest of the instance. If it does, one option may be to insert an async sleep call in between each chunk that is streamed back. This could be controlled by a `csv_pause_ms` config setting, defaulting to maybe 5 but can be disabled entirely by setting to 0. 
That's only if testing proves that this is a necessary mechanism.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/272#issuecomment-394431323,https://api.github.com/repos/simonw/datasette/issues/272,394431323,MDEyOklzc3VlQ29tbWVudDM5NDQzMTMyMw==,9599,simonw,2018-06-04T17:17:37Z,2018-06-04T17:17:37Z,OWNER,I built this ASGI debugging tool to help with this migration: https://asgi-scope.now.sh/fivethirtyeight-34d6604/most-common-name%2Fsurnames.json?foo=bar&bazoeuto=onetuh&a=.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/292#issuecomment-392343839,https://api.github.com/repos/simonw/datasette/issues/292,392343839,MDEyOklzc3VlQ29tbWVudDM5MjM0MzgzOQ==,9599,simonw,2018-05-27T16:10:09Z,2018-06-04T17:38:04Z,OWNER,"The more efficient way of doing this kind of count would be to provide a mechanism which can also add extra fragments to a `GROUP BY` clause used for the `SELECT`. Or... how about a mechanism similar to Django's `prefetch_related` which lets you define extra queries that will be called with a list of primary keys (or values from other columns) and used to populate a new column? A little unconventional but could be extremely useful and efficient. Related to that: since the per-query overhead in SQLite is tiny, could even define an extra query to be run once-per-row before returning results.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/272#issuecomment-394503399,https://api.github.com/repos/simonw/datasette/issues/272,394503399,MDEyOklzc3VlQ29tbWVudDM5NDUwMzM5OQ==,9599,simonw,2018-06-04T21:20:14Z,2018-06-04T21:20:14Z,OWNER,Results of an extremely simple micro-benchmark comparing the two shows that uvicorn is at least as fast as Sanic (benchmarks a little faster with a very simple payload): https://gist.github.com/simonw/418950af178c01c416363cc057420851,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-394764713,https://api.github.com/repos/simonw/datasette/issues/272,394764713,MDEyOklzc3VlQ29tbWVudDM5NDc2NDcxMw==,9599,simonw,2018-06-05T15:58:54Z,2018-06-05T16:00:40Z,OWNER,"https://github.com/encode/uvicorn/blob/572b5fe6c811b63298d5350a06b664839624c860/uvicorn/run.py#L63 is how you start a Uvicorn server from code as opposed to the `uvicorn` CLI from uvicorn.run import UvicornServer UvicornServer().run(app, host=host, port=port) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-392118755,https://api.github.com/repos/simonw/datasette/issues/272,392118755,MDEyOklzc3VlQ29tbWVudDM5MjExODc1NQ==,9599,simonw,2018-05-25T16:56:40Z,2018-06-05T16:01:13Z,OWNER,"Thinking about this further, maybe I should embrace ASGI turtles-all-the-way-down and teach each datasette view class to take a scope to the constructor and act 
entirely as an ASGI component. Would be a nice way of diving deep into ASGI and I can add utility helpers for things like querystring evaluation as I need them.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/306#issuecomment-394894500,https://api.github.com/repos/simonw/datasette/issues/306,394894500,MDEyOklzc3VlQ29tbWVudDM5NDg5NDUwMA==,9599,simonw,2018-06-05T23:40:40Z,2018-06-05T23:40:40Z,OWNER,"Input: - function that says if a name is a valid database - Function that says if a table exists - URL Output: - view class - Arguments - Redirect (if it should redirect)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329661905,Custom URL routing with independent tests, https://github.com/simonw/datasette/issues/306#issuecomment-394895267,https://api.github.com/repos/simonw/datasette/issues/306,394895267,MDEyOklzc3VlQ29tbWVudDM5NDg5NTI2Nw==,9599,simonw,2018-06-05T23:45:26Z,2018-06-05T23:45:26Z,OWNER,To support a future where Datasette is an ASGI app that can be attached to a URL within a larger application the routing function should have the option to accept a path prefix which will then be automatically attached to any resulting redirects.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329661905,Custom URL routing with independent tests, https://github.com/simonw/datasette/issues/306#issuecomment-394894910,https://api.github.com/repos/simonw/datasette/issues/306,394894910,MDEyOklzc3VlQ29tbWVudDM5NDg5NDkxMA==,9599,simonw,2018-06-05T23:43:18Z,2018-06-05T23:49:41Z,OWNER,I'm going to use a named tuple for the output. That way I can support either tuple destructing or explicit property access on the returned value.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329661905,Custom URL routing with independent tests, https://github.com/simonw/datasette/issues/306#issuecomment-394895750,https://api.github.com/repos/simonw/datasette/issues/306,394895750,MDEyOklzc3VlQ29tbWVudDM5NDg5NTc1MA==,9599,simonw,2018-06-05T23:48:06Z,2018-06-06T23:50:31Z,OWNER,"A neat trick could be that if the router returns a redirect it could then resolve that redirect to see if it will 404 (or redirect itself) before returning that response. This would need its own counter to guard against infinite redirects. I'm not going to do this though: any view that results in a chain of redirects like this is a bug that should be fixed at the source.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329661905,Custom URL routing with independent tests, https://github.com/simonw/datasette/issues/306#issuecomment-395463497,https://api.github.com/repos/simonw/datasette/issues/306,395463497,MDEyOklzc3VlQ29tbWVudDM5NTQ2MzQ5Nw==,9599,simonw,2018-06-07T15:29:28Z,2018-06-07T15:29:28Z,OWNER,"I started sketching this out in a branch, see pull request #307 - but I've decided I don't like it. I'm going to close this ticket and stick with regular expression URL routing for the moment. 
If I change my mind in the future the code in #307 lives in separate files (`datasette/routes.py` and `tests/test_routes.py`) so bringing it back into the project will be trivial.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329661905,Custom URL routing with independent tests, https://github.com/simonw/datasette/pull/307#issuecomment-395463598,https://api.github.com/repos/simonw/datasette/issues/307,395463598,MDEyOklzc3VlQ29tbWVudDM5NTQ2MzU5OA==,9599,simonw,2018-06-07T15:29:41Z,2018-06-07T15:29:41Z,OWNER,Closing this pull request for reasons outlined here: https://github.com/simonw/datasette/issues/306#issuecomment-395463497,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",330323860,"Initial sketch of custom URL routing, refs #306", https://github.com/simonw/datasette/issues/305#issuecomment-396048471,https://api.github.com/repos/simonw/datasette/issues/305,396048471,MDEyOklzc3VlQ29tbWVudDM5NjA0ODQ3MQ==,9599,simonw,2018-06-10T13:16:13Z,2018-06-10T13:16:13Z,OWNER,https://github.com/kubernetes/community/blob/master/contributors/devel/help-wanted.md Is worth stealing from too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329147284,Add contributor guidelines to docs, https://github.com/simonw/datasette/issues/266#issuecomment-397534196,https://api.github.com/repos/simonw/datasette/issues/266,397534196,MDEyOklzc3VlQ29tbWVudDM5NzUzNDE5Ng==,9599,simonw,2018-06-15T07:12:16Z,2018-06-15T07:12:16Z,OWNER,"The first version of this is now shipped to master. I ended up rewriting most of the experimental branch to deal with the nasty issue described in #303 Demo is available on https://fivethirtyeight.datasettes.com/fivethirtyeight-ab24e01/most-common-name%2Fsurnames ![2018-06-15 at 12 11 am](https://user-images.githubusercontent.com/9599/41455090-bd5ece30-7030-11e8-8da4-11fbb1f2ef8b.png) Here's the CSV version of that page: https://fivethirtyeight.datasettes.com/fivethirtyeight-ab24e01/most-common-name%2Fsurnames.csv","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397534404,https://api.github.com/repos/simonw/datasette/issues/266,397534404,MDEyOklzc3VlQ29tbWVudDM5NzUzNDQwNA==,9599,simonw,2018-06-15T07:13:20Z,2018-06-15T07:13:20Z,OWNER,"Still to add: the streaming version that iterates through all of the pages, as seen in experimental commit https://github.com/simonw/datasette/commit/ced379ea325787b8c3bf0a614daba1fa4856a3bd","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397534498,https://api.github.com/repos/simonw/datasette/issues/266,397534498,MDEyOklzc3VlQ29tbWVudDM5NzUzNDQ5OA==,9599,simonw,2018-06-15T07:13:52Z,2018-06-15T07:13:52Z,OWNER,Also needs documentation.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, 
https://github.com/simonw/datasette/issues/266#issuecomment-389570841,https://api.github.com/repos/simonw/datasette/issues/266,389570841,MDEyOklzc3VlQ29tbWVudDM4OTU3MDg0MQ==,9599,simonw,2018-05-16T15:54:49Z,2018-06-15T07:41:09Z,OWNER,"At the most basic level, this will work based on an extension. Most places you currently put a `.json` extension should also allow a `.csv` extension. By default this will return the exact results you see on the current page (default max will remain 1000). ## Streaming all records Where things get interested is *streaming mode*. This will be an option which returns ALL matching records as a streaming CSV file, even if that ends up being millions of records. I think the best way to build this will be on top of the existing mechanism used to efficiently implement keyset pagination via `_next=` tokens. ## Expanding foreign keys For tables with foreign key references it would be useful if the CSV format could expand those references to include the labels from `label_column` - maybe via an additional `?_expand=1` option. When expanding each foreign key column will be shown twice: rowid,city_id,city_id_label,state","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397637302,https://api.github.com/repos/simonw/datasette/issues/233,397637302,MDEyOklzc3VlQ29tbWVudDM5NzYzNzMwMg==,9599,simonw,2018-06-15T14:24:08Z,2018-06-15T14:55:19Z,OWNER,"I'm going with the terminology ""labels"" here. You'll be able to add ``?_labels=1`` and the JSON will look something like this: ``` { ""rowid"": 233, ""TreeID"": 121240, ""qLegalStatus"": { ""value"" 2, ""label"": ""Private"" } ""qSpecies"": { ""value"": 16, ""label"": ""Sycamore"" } ""qAddress"": ""91 Commonwealth Ave"", ... } ``` I need this to help build foreign key expansions for CSV files, see #266 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397648080,https://api.github.com/repos/simonw/datasette/issues/233,397648080,MDEyOklzc3VlQ29tbWVudDM5NzY0ODA4MA==,9599,simonw,2018-06-15T14:56:21Z,2018-06-15T14:56:21Z,OWNER,"I considered including a `""table""` key like this: ``` ""qLegalStatus"": { ""value"" 2, ""label"": ""Private"", ""table"": ""qLegalStatus"" } ``` This would help generate the HTML links using just the JSON data. But... 
I realized that in a list of 50 rows that value would be duplicated 50 times which is a bit nasty.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397663968,https://api.github.com/repos/simonw/datasette/issues/233,397663968,MDEyOklzc3VlQ29tbWVudDM5NzY2Mzk2OA==,9599,simonw,2018-06-15T15:51:17Z,2018-06-15T15:51:17Z,OWNER,"Nearly done, but I need the HTML view to ignore the `?_labels=1` param (it throws an error at the moment).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397668427,https://api.github.com/repos/simonw/datasette/issues/233,397668427,MDEyOklzc3VlQ29tbWVudDM5NzY2ODQyNw==,9599,simonw,2018-06-15T16:07:43Z,2018-06-15T16:07:43Z,OWNER,Demo: https://datasette-json-labels-demo.now.sh/fixtures-fda0fea/facetable.json?_labels=1&_shape=array,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397729319,https://api.github.com/repos/simonw/datasette/issues/233,397729319,MDEyOklzc3VlQ29tbWVudDM5NzcyOTMxOQ==,9599,simonw,2018-06-15T20:10:24Z,2018-06-15T20:10:24Z,OWNER,I'm also going to add the ability to specify individual columns that you want to expand using `?_label=city_id&_label=state_id`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397729500,https://api.github.com/repos/simonw/datasette/issues/233,397729500,MDEyOklzc3VlQ29tbWVudDM5NzcyOTUwMA==,9599,simonw,2018-06-15T20:11:14Z,2018-06-15T20:11:14Z,OWNER,The `.json` and `.csv` links displayed on the table page should default to using `?_labels=1` if Datasette detects that there are foreign key expansions available for the page.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397729945,https://api.github.com/repos/simonw/datasette/issues/266,397729945,MDEyOklzc3VlQ29tbWVudDM5NzcyOTk0NQ==,9599,simonw,2018-06-15T20:13:05Z,2018-06-15T20:13:05Z,OWNER,"The ""This data as ..."" area of the page is getting a bit untidy, especially if I'm going to add other download options in the future. 
I think I'll move the HTML to the page footer (less concerns about taking up lots of space there) and then have a bit of JavaScript that turns it into a show/hide menu of some sort in its current location.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/pull/311#issuecomment-397823913,https://api.github.com/repos/simonw/datasette/issues/311,397823913,MDEyOklzc3VlQ29tbWVudDM5NzgyMzkxMw==,9599,simonw,2018-06-16T16:32:07Z,2018-06-16T16:48:48Z,OWNER,"Still todo: - [ ] HTML view to obey the ?_labels=1 param (it throws an error at the moment) - [ ] `?_label=one&_label=2` support for only expanding specific labels - [ ] Better docs","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",332998752,"?_labels=1 to expand foreign keys (in csv and json), refs #233", https://github.com/simonw/datasette/issues/233#issuecomment-397824991,https://api.github.com/repos/simonw/datasette/issues/233,397824991,MDEyOklzc3VlQ29tbWVudDM5NzgyNDk5MQ==,9599,simonw,2018-06-16T16:50:31Z,2018-06-16T16:50:42Z,OWNER,"I'm going to support `?_labels=` on HTML views, but I'll allow it to be used to turn them off (they are on by default) using `?_labels=off`. Related: 7e0caa1e62607c6579101cc0e62bec8899013715 where I added a new `value_as_boolean` helper extracted from how `--config` works in `cli.py`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/312#issuecomment-397825583,https://api.github.com/repos/simonw/datasette/issues/312,397825583,MDEyOklzc3VlQ29tbWVudDM5NzgyNTU4Mw==,9599,simonw,2018-06-16T17:00:12Z,2018-06-16T17:00:12Z,OWNER,This is already covered by #292 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333000163,"HTML, CSV and JSON views should support ?_col=&_col=", https://github.com/simonw/datasette/issues/233#issuecomment-397839482,https://api.github.com/repos/simonw/datasette/issues/233,397839482,MDEyOklzc3VlQ29tbWVudDM5NzgzOTQ4Mg==,9599,simonw,2018-06-16T21:21:03Z,2018-06-16T21:21:03Z,OWNER,Should facets always have their labels expanded or should they also obey the `_labels` and `_label` querystring arguments?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397839583,https://api.github.com/repos/simonw/datasette/issues/233,397839583,MDEyOklzc3VlQ29tbWVudDM5NzgzOTU4Mw==,9599,simonw,2018-06-16T21:23:14Z,2018-06-16T21:23:44Z,OWNER,"I'm a bit torn on naming - choices are: * `?_labels=on` and `?_label=col1&_label=col2` * `?_expands=on` (or `?_expand_all=on`) and `?_expand=col1&_expand=col2`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397840676,https://api.github.com/repos/simonw/datasette/issues/233,397840676,MDEyOklzc3VlQ29tbWVudDM5Nzg0MDY3Ng==,9599,simonw,2018-06-16T21:49:50Z,2018-06-16T21:49:50Z,OWNER,For 
the moment I'm going with `_labels=`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/pull/311#issuecomment-397841968,https://api.github.com/repos/simonw/datasette/issues/311,397841968,MDEyOklzc3VlQ29tbWVudDM5Nzg0MTk2OA==,9599,simonw,2018-06-16T22:20:31Z,2018-06-16T22:20:31Z,OWNER,I merged this manually in ed631e690b81e34fcaeaba1f16c9166f1c505990,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",332998752,"?_labels=1 to expand foreign keys (in csv and json), refs #233", https://github.com/simonw/datasette/issues/233#issuecomment-397842194,https://api.github.com/repos/simonw/datasette/issues/233,397842194,MDEyOklzc3VlQ29tbWVudDM5Nzg0MjE5NA==,9599,simonw,2018-06-16T22:26:21Z,2018-06-16T22:26:21Z,OWNER,"Some demos: * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List - regular HTML view * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List?_labels=off - no labels * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.json?_labels=on - JSON with all labels * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.json?_label=qSpecies&_shape=array - JSON with specific labels in array shape * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.csv?_labels=on - CSV with all labels * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.csv?_label=qSpecies - CSV with specific labels","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397842246,https://api.github.com/repos/simonw/datasette/issues/266,397842246,MDEyOklzc3VlQ29tbWVudDM5Nzg0MjI0Ng==,9599,simonw,2018-06-16T22:27:59Z,2018-06-16T22:27:59Z,OWNER,"Two demos of the new functionality in #233 as it applies to CSV: * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.csv?_labels=on - CSV with all foreign key columns expanded * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.csv?_label=qSpecies - CSV with specific columns expanded","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/313#issuecomment-397900434,https://api.github.com/repos/simonw/datasette/issues/313,397900434,MDEyOklzc3VlQ29tbWVudDM5NzkwMDQzNA==,9599,simonw,2018-06-17T19:23:23Z,2018-06-17T19:23:23Z,OWNER,This will require some relatively sophisticated Travis build steps. Useful docs: https://docs.travis-ci.com/user/build-stages/ - useful example: https://docs.travis-ci.com/user/build-stages/deploy-heroku/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333086005,Deploy demo of Datasette on every commit that passes tests, https://github.com/simonw/datasette/issues/313#issuecomment-397907987,https://api.github.com/repos/simonw/datasette/issues/313,397907987,MDEyOklzc3VlQ29tbWVudDM5NzkwNzk4Nw==,9599,simonw,2018-06-17T21:32:52Z,2018-06-17T21:32:52Z,OWNER,"This very nearly works... 
* https://latest.datasette.io/ * https://f0c1722.datasette.io/ But... https://f0c1722.datasette.io/-/versions isn't showing the correct note: ``` { ""datasette"": { ""version"": ""0.22.1"" } ... ``` There should be a `""note""` field there with the full commit hash.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333086005,Deploy demo of Datasette on every commit that passes tests, https://github.com/simonw/datasette/issues/313#issuecomment-397908093,https://api.github.com/repos/simonw/datasette/issues/313,397908093,MDEyOklzc3VlQ29tbWVudDM5NzkwODA5Mw==,9599,simonw,2018-06-17T21:34:52Z,2018-06-17T21:34:52Z,OWNER,"It looks like all of my test deploys ended up going to the same Zeit deployment ID: https://zeit.co/simonw/datasette-latest/rbmtcedvlj This is strange... the Dockerfile should be different for each one (due to the differing version-note). https://github.com/simonw/datasette/commit/db1e6bc182d11f333e6addaa1a6be87625a4e12b#diff-34418c57343344c73271e13b01b7fcd9R255","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333086005,Deploy demo of Datasette on every commit that passes tests, https://github.com/simonw/datasette/issues/313#issuecomment-397908185,https://api.github.com/repos/simonw/datasette/issues/313,397908185,MDEyOklzc3VlQ29tbWVudDM5NzkwODE4NQ==,9599,simonw,2018-06-17T21:36:50Z,2018-06-17T21:36:50Z,OWNER,"``` The command ""datasette publish now fixtures.db -m fixtures.json --token=$NOW_TOKEN --branch=$TRAVIS_COMMIT --version-note=$TRAVIS_COMMIT"" exited with 0. ``` Partial log of the ``datasette publish now`` output: ``` > Step 5/7 : RUN datasette inspect fixtures.db --inspect-file inspect-data.json > ---> Running in d373f330e53e > ---> 09bab386aaa3 > Removing intermediate container d373f330e53e > Step 6/7 : EXPOSE 8001 > ---> Running in e0fe37b3061c > ---> 47798440e214 > Removing intermediate container e0fe37b3061c > Step 7/7 : CMD datasette serve --host 0.0.0.0 fixtures.db --cors --port 8001 --inspect-file inspect-data.json --metadata metadata.json --version-note f0c17229b7a7914d3da02e087dfd0e25d8321448 ``` So it looks like `--version-note` is being correctly set there.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333086005,Deploy demo of Datasette on every commit that passes tests, https://github.com/simonw/datasette/issues/313#issuecomment-397908614,https://api.github.com/repos/simonw/datasette/issues/313,397908614,MDEyOklzc3VlQ29tbWVudDM5NzkwODYxNA==,9599,simonw,2018-06-17T21:44:51Z,2018-06-17T21:45:03Z,OWNER,"Aha! ```1.03s$ now alias --token=$NOW_TOKEN > Error! Couldn't find a deployment to alias. Please provide one as an argument. The command ""now alias --token=$NOW_TOKEN"" exited with 1. ``` That explains it. I need to set the same alias in my call to `datasette publish`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333086005,Deploy demo of Datasette on every commit that passes tests, https://github.com/simonw/datasette/issues/313#issuecomment-397908947,https://api.github.com/repos/simonw/datasette/issues/313,397908947,MDEyOklzc3VlQ29tbWVudDM5NzkwODk0Nw==,9599,simonw,2018-06-17T21:51:34Z,2018-06-17T21:51:34Z,OWNER,"That fixed it! 
https://958b75c.datasette.io/-/versions ``` { ""python"": { ""version"": ""3.6.5"", ""full"": ""3.6.5 (default, Jun 6 2018, 19:19:24) \n[GCC 6.3.0 20170516]"" }, ""datasette"": { ""version"": ""0+unknown"", ""note"": ""958b75c69841ef5913da86e0eb2df634a9b95fda"" }, ""sqlite"": { ""version"": ""3.16.2"", ""fts_versions"": [ ""FTS5"", ""FTS4"", ""FTS3"" ], ""extensions"": { ""json1"": null } } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333086005,Deploy demo of Datasette on every commit that passes tests, https://github.com/simonw/datasette/issues/266#issuecomment-397912840,https://api.github.com/repos/simonw/datasette/issues/266,397912840,MDEyOklzc3VlQ29tbWVudDM5NzkxMjg0MA==,9599,simonw,2018-06-17T23:13:35Z,2018-06-17T23:16:42Z,OWNER,"This worked! https://github.com/simonw/datasette/commit/5a0a82faf9cf9dd109d76181ed00eea19472087c - it spat out a 76MB CSV when I ran it against the sf-trees demo database. It was just a quick hack though - it currently ignores `_labels=` and `_dl=` which need to be supported. I'm going to add a config option for turning full CSV export off just in case any Datasette users are uncomfortable with URLs that churn out that much data in one go. ``` ConfigOption(""allow_csv_stream"", True, """""" Allow .csv?_stream=1 to download all rows (ignoring max_returned_rows) """""".strip()), ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397915258,https://api.github.com/repos/simonw/datasette/issues/266,397915258,MDEyOklzc3VlQ29tbWVudDM5NzkxNTI1OA==,9599,simonw,2018-06-18T00:01:05Z,2018-06-18T00:01:05Z,OWNER,Someone malicious could use a UNION to generate an unpleasantly large CSV response. I'll add another config setting which limits the response size to 100MB but can be turned off by setting it to 0.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397915403,https://api.github.com/repos/simonw/datasette/issues/266,397915403,MDEyOklzc3VlQ29tbWVudDM5NzkxNTQwMw==,9599,simonw,2018-06-18T00:03:17Z,2018-06-18T00:14:37Z,OWNER,"Since CSV streaming export doesn't work for custom SQL queries (since they don't support `_next=` pagination) there's no need to provide a option that disables streams just for custom SQL. Related: the UI should not show the option to download everything on custom SQL pages.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397916091,https://api.github.com/repos/simonw/datasette/issues/266,397916091,MDEyOklzc3VlQ29tbWVudDM5NzkxNjA5MQ==,9599,simonw,2018-06-18T00:13:43Z,2018-06-18T00:15:50Z,OWNER,I was also worried about the performance of pagination over custom `_sort` orders or views which use offset pagination - but Datasette's SQL time limits should prevent those from getting out of hand. 
This does mean that a streaming CSV file may be truncated with an error - if this happens we should ensure the error is written out as the last line of the CSV so anyone who tried to import it gets a relevant error message informing them that the export did not complete.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397916321,https://api.github.com/repos/simonw/datasette/issues/266,397916321,MDEyOklzc3VlQ29tbWVudDM5NzkxNjMyMQ==,9599,simonw,2018-06-18T00:17:44Z,2018-06-18T00:18:05Z,OWNER,The export UI could be a GET form controlling various parameters. This would discourage crawlers from hitting the export links and would also allow us to express the full range of export options.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397918264,https://api.github.com/repos/simonw/datasette/issues/266,397918264,MDEyOklzc3VlQ29tbWVudDM5NzkxODI2NA==,9599,simonw,2018-06-18T00:49:35Z,2018-06-18T00:49:35Z,OWNER,"Simpler design: the top of the page will link to basic .json and .csv and ""advanced"" - which will fragment link to an advanced export format the bottom of the page.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397923253,https://api.github.com/repos/simonw/datasette/issues/266,397923253,MDEyOklzc3VlQ29tbWVudDM5NzkyMzI1Mw==,9599,simonw,2018-06-18T01:49:52Z,2018-06-18T03:02:28Z,OWNER,Ideally the downloadable filenames of exported CSVs would differ across different querystring parameters. 
Maybe `Street_Trees-56cbd54.csv` where `56cbd54` is a hash of the querystring?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397949002,https://api.github.com/repos/simonw/datasette/issues/266,397949002,MDEyOklzc3VlQ29tbWVudDM5Nzk0OTAwMg==,9599,simonw,2018-06-18T05:53:17Z,2018-06-18T05:53:17Z,OWNER,"Advanced export pane: ![2018-06-17 at 10 52 pm](https://user-images.githubusercontent.com/9599/41520166-3809a45a-7281-11e8-9dfa-2b10f4cb9672.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397842667,https://api.github.com/repos/simonw/datasette/issues/266,397842667,MDEyOklzc3VlQ29tbWVudDM5Nzg0MjY2Nw==,9599,simonw,2018-06-16T22:38:15Z,2018-06-18T05:55:11Z,OWNER,"Still todo: - [x] Streaming version - [ ] Tidy up the ""This data as ..."" UI - [x] Default .csv (and .json) links to use `?_labels=on` (only if at least one foreign key detected) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397952129,https://api.github.com/repos/simonw/datasette/issues/266,397952129,MDEyOklzc3VlQ29tbWVudDM5Nzk1MjEyOQ==,9599,simonw,2018-06-18T06:15:36Z,2018-06-18T06:15:51Z,OWNER,Advanced export pane demo: https://latest.datasette.io/fixtures-35b6eb6/facetable?_size=4,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/316#issuecomment-398030903,https://api.github.com/repos/simonw/datasette/issues/316,398030903,MDEyOklzc3VlQ29tbWVudDM5ODAzMDkwMw==,132230,gavinband,2018-06-18T12:00:43Z,2018-06-18T12:00:43Z,NONE,"I should add that I'm using datasette version 0.22, Python 2.7.10 on Mac OS X. Happy to send more info if helpful.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333238932,datasette inspect takes a very long time on large dbs, https://github.com/simonw/datasette/issues/266#issuecomment-398098582,https://api.github.com/repos/simonw/datasette/issues/266,398098582,MDEyOklzc3VlQ29tbWVudDM5ODA5ODU4Mg==,9599,simonw,2018-06-18T15:40:32Z,2018-06-18T15:40:32Z,OWNER,This is now released in Datasette 0.23! http://datasette.readthedocs.io/en/latest/changelog.html#v0-23,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/316#issuecomment-398101670,https://api.github.com/repos/simonw/datasette/issues/316,398101670,MDEyOklzc3VlQ29tbWVudDM5ODEwMTY3MA==,9599,simonw,2018-06-18T15:49:35Z,2018-06-18T15:50:38Z,OWNER,"Wow, I've gone as high as 7GB but I've never tried it against 600GB. `datasette inspect` is indeed expected to take a long time for large databases. 
That's why it's available as a separate command: by running `datasette inspect` to generate `inspect-data.json` you can execute it just once against a large database and then have `datasette serve` take advantage of that cached metadata (hence avoiding `datasette serve` hanging on startup). As you spotted, most of the time is spent in those counts. I imagine you don't need those row counts in order for the rest of Datasette to function correctly (they are mainly used for display purposes - on the https://latest.datasette.io/fixtures index page for example). If your database changes infrequently, for the moment I recommend running `datasette inspect` once to generate the `inspect-data.json` file (let me know how long it takes) and then passing that file to `datasette serve mydb.db --inspect-file=inspect-data.json` If your database DOES change frequently then this workaround won't help you much. Let me know and I'll see how much work it would take to have those row counts be optional rather than required.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333238932,datasette inspect takes a very long time on large dbs, https://github.com/simonw/datasette/issues/265#issuecomment-398102537,https://api.github.com/repos/simonw/datasette/issues/265,398102537,MDEyOklzc3VlQ29tbWVudDM5ODEwMjUzNw==,9599,simonw,2018-06-18T15:52:15Z,2018-06-18T15:52:15Z,OWNER,https://latest.datasette.io/ now always hosts the latest version of the code. I've started linking to it from our documentation.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323677499,Add links to example Datasette instances to appropiate places in docs, https://github.com/simonw/datasette/issues/316#issuecomment-398109204,https://api.github.com/repos/simonw/datasette/issues/316,398109204,MDEyOklzc3VlQ29tbWVudDM5ODEwOTIwNA==,132230,gavinband,2018-06-18T16:12:45Z,2018-06-18T16:12:45Z,NONE,"Hi Simon, Thanks for the response. Ok I'll try running `datasette inspect` up front. In principle the db won't change. However, the site's in development and it's likely I'll need to add views and some auxiliary (smaller) tables as I go along. I will need to be careful with this if it involves an inspect step in each iteration, though. g. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333238932,datasette inspect takes a very long time on large dbs, https://github.com/simonw/datasette/issues/271#issuecomment-398133924,https://api.github.com/repos/simonw/datasette/issues/271,398133924,MDEyOklzc3VlQ29tbWVudDM5ODEzMzkyNA==,9599,simonw,2018-06-18T17:32:22Z,2018-06-18T17:32:22Z,OWNER,"As seen in #316 inspect is already taking a VERY long time to run against large (600GB) databases. To get this working I may have to make inspect an optional optimization and run introspection for columns and primary keys in demand. 
The one catch here is the `count(*)` queries - Datasette may need to learn not to return full table counts in circumstances where the count has not been pre-calculated and takes more than Xms to generate.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324162476,Mechanism for automatically picking up changes when on-disk .db file changes, https://github.com/simonw/datasette/issues/188#issuecomment-398778485,https://api.github.com/repos/simonw/datasette/issues/188,398778485,MDEyOklzc3VlQ29tbWVudDM5ODc3ODQ4NQ==,12617395,bsilverm,2018-06-20T14:48:39Z,2018-06-20T14:48:39Z,NONE,This would be a great feature to have!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309047460,Ability to bundle metadata and templates inside the SQLite file, https://github.com/simonw/datasette/issues/302#issuecomment-398825294,https://api.github.com/repos/simonw/datasette/issues/302,398825294,MDEyOklzc3VlQ29tbWVudDM5ODgyNTI5NA==,9599,simonw,2018-06-20T17:06:36Z,2018-06-20T17:06:36Z,OWNER,Still a bug in 0.23,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328171513,test-2.3.sqlite database filename throws a 404, https://github.com/simonw/datasette/issues/318#issuecomment-398973309,https://api.github.com/repos/simonw/datasette/issues/318,398973309,MDEyOklzc3VlQ29tbWVudDM5ODk3MzMwOQ==,9599,simonw,2018-06-21T04:35:12Z,2018-06-21T04:37:37Z,OWNER,"Demo of fix: the `on_earth` facet on https://latest.datasette.io/fixtures-cafd088/facetable?_facet=planet_int&_facet=on_earth&_facet=city_id ![2018-06-20 at 9 35 pm](https://user-images.githubusercontent.com/9599/41698208-ebb6b72a-74d1-11e8-9d85-de7600177f69.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334148669,Facets with value of 0 displayed incorrectly, https://github.com/simonw/datasette/issues/321#issuecomment-398976488,https://api.github.com/repos/simonw/datasette/issues/321,398976488,MDEyOklzc3VlQ29tbWVudDM5ODk3NjQ4OA==,9599,simonw,2018-06-21T04:59:33Z,2018-06-21T06:11:02Z,OWNER,"I've added this to the unit tests and the documentation. 
Docs: http://datasette.readthedocs.io/en/latest/sql_queries.html#canned-queries Canned query demo: https://latest.datasette.io/fixtures/neighborhood_search?text=town New unit test: https://github.com/simonw/datasette/blob/3683a6b626b2e79f4dc9600d45853ca4ae8de11a/tests/test_api.py#L333-L344 https://github.com/simonw/datasette/blob/3683a6b626b2e79f4dc9600d45853ca4ae8de11a/tests/fixtures.py#L145-L153","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334190959,Wildcard support in query parameters, https://github.com/simonw/datasette/issues/321#issuecomment-399098080,https://api.github.com/repos/simonw/datasette/issues/321,399098080,MDEyOklzc3VlQ29tbWVudDM5OTA5ODA4MA==,12617395,bsilverm,2018-06-21T13:10:48Z,2018-06-21T13:10:48Z,NONE,"Perfect, thank you!!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334190959,Wildcard support in query parameters, https://github.com/simonw/datasette/issues/321#issuecomment-399106871,https://api.github.com/repos/simonw/datasette/issues/321,399106871,MDEyOklzc3VlQ29tbWVudDM5OTEwNjg3MQ==,12617395,bsilverm,2018-06-21T13:39:37Z,2018-06-21T13:39:37Z,NONE,"One thing I've noticed with this approach is that the query is executed with no parameters which I do not believe was the case previously. In the case the table contains a lot of data, this adds some time executing the query before the user can enter their input and run it with the parameters they want.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334190959,Wildcard support in query parameters, https://github.com/simonw/datasette/issues/321#issuecomment-399126228,https://api.github.com/repos/simonw/datasette/issues/321,399126228,MDEyOklzc3VlQ29tbWVudDM5OTEyNjIyOA==,9599,simonw,2018-06-21T14:36:40Z,2018-06-21T14:36:53Z,OWNER,"This seems to fix that: ``` select neighborhood, facet_cities.name, state from facetable join facet_cities on facetable.city_id = facet_cities.id where :text != '' and neighborhood like '%' || :text || '%' order by neighborhood; ``` Compare this (with empty string): https://latest.datasette.io/fixtures-cafd088?sql=select+neighborhood%2C+facet_cities.name%2C+state%0D%0Afrom+facetable%0D%0A++++join+facet_cities+on+facetable.city_id+%3D+facet_cities.id%0D%0Awhere+%3Atext+%21%3D+%22%22+and+neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0D%0Aorder+by+neighborhood%3B To this: https://latest.datasette.io/fixtures-cafd088?sql=select+neighborhood%2C+facet_cities.name%2C+state%0D%0Afrom+facetable%0D%0A++++join+facet_cities+on+facetable.city_id+%3D+facet_cities.id%0D%0Awhere+%3Atext+%21%3D+%22%22+and+neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0D%0Aorder+by+neighborhood%3B&text=town","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334190959,Wildcard support in query parameters, https://github.com/simonw/datasette/issues/321#issuecomment-399129220,https://api.github.com/repos/simonw/datasette/issues/321,399129220,MDEyOklzc3VlQ29tbWVudDM5OTEyOTIyMA==,12617395,bsilverm,2018-06-21T14:45:02Z,2018-06-21T14:45:02Z,NONE,Those queries look identical. 
How can this be prevented if the queries are in a metadata.json file?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334190959,Wildcard support in query parameters, https://github.com/simonw/datasette/issues/309#issuecomment-399134680,https://api.github.com/repos/simonw/datasette/issues/309,399134680,MDEyOklzc3VlQ29tbWVudDM5OTEzNDY4MA==,9599,simonw,2018-06-21T14:59:57Z,2018-06-21T14:59:57Z,OWNER,I can use Sanic middleware for this: http://sanic.readthedocs.io/en/latest/sanic/middleware.html#responding-early,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",331343824,On 404s with a trailing slash redirect to that page without a trailing slash, https://github.com/simonw/datasette/issues/319#issuecomment-399139462,https://api.github.com/repos/simonw/datasette/issues/319,399139462,MDEyOklzc3VlQ29tbWVudDM5OTEzOTQ2Mg==,9599,simonw,2018-06-21T15:13:58Z,2018-06-21T15:13:58Z,OWNER,"Demo of fix: https://latest.datasette.io/fixtures-e14e080/searchable_tags ![2018-06-21 at 8 13 am](https://user-images.githubusercontent.com/9599/41728203-0b571e9a-752b-11e8-9702-9887e3ede5bc.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334149717,Incorrect display of compound primary keys with foreign key relationships, https://github.com/simonw/datasette/issues/309#issuecomment-399142274,https://api.github.com/repos/simonw/datasette/issues/309,399142274,MDEyOklzc3VlQ29tbWVudDM5OTE0MjI3NA==,9599,simonw,2018-06-21T15:22:02Z,2018-06-21T15:22:02Z,OWNER,Demo: https://latest.datasette.io/fixtures-e14e080/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",331343824,On 404s with a trailing slash redirect to that page without a trailing slash, https://github.com/simonw/datasette/issues/317#issuecomment-399144688,https://api.github.com/repos/simonw/datasette/issues/317,399144688,MDEyOklzc3VlQ29tbWVudDM5OTE0NDY4OA==,9599,simonw,2018-06-21T15:29:06Z,2018-06-21T15:29:16Z,OWNER,"From https://docs.travis-ci.com/user/deployment/pypi/ > Note that if your PyPI password contains special characters you need to escape them before encrypting your password. Some people have [reported difficulties](https://github.com/travis-ci/dpl/issues/377) connecting to PyPI with passwords containing anything except alphanumeric characters. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333326107,Travis CI fails to upload new releases to PyPI, https://github.com/simonw/datasette/issues/317#issuecomment-399150285,https://api.github.com/repos/simonw/datasette/issues/317,399150285,MDEyOklzc3VlQ29tbWVudDM5OTE1MDI4NQ==,9599,simonw,2018-06-21T15:45:47Z,2018-06-21T15:45:47Z,OWNER,That fixed it! 
https://travis-ci.org/simonw/datasette/jobs/395078407 ran successfully and https://pypi.org/project/datasette/ now hosts Datasette 0.23.1 deployed via Travis.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333326107,Travis CI fails to upload new releases to PyPI, https://github.com/simonw/datasette/issues/319#issuecomment-399154550,https://api.github.com/repos/simonw/datasette/issues/319,399154550,MDEyOklzc3VlQ29tbWVudDM5OTE1NDU1MA==,9599,simonw,2018-06-21T15:58:15Z,2018-06-21T15:58:15Z,OWNER,Fixed here too now: https://registry.datasette.io/registry-c10707b/datasette_tags,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334149717,Incorrect display of compound primary keys with foreign key relationships, https://github.com/simonw/datasette/issues/314#issuecomment-399156960,https://api.github.com/repos/simonw/datasette/issues/314,399156960,MDEyOklzc3VlQ29tbWVudDM5OTE1Njk2MA==,9599,simonw,2018-06-21T16:04:59Z,2018-06-21T16:04:59Z,OWNER,"Demo of fix: https://latest.datasette.io/fixtures-e14e080/simple_view ![2018-06-21 at 9 04 am](https://user-images.githubusercontent.com/9599/41731021-2be526aa-7532-11e8-9c3b-f787f918328e.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333096176,HTML table does not correctly display entirely blank rows, https://github.com/simonw/datasette/issues/259#issuecomment-399157944,https://api.github.com/repos/simonw/datasette/issues/259,399157944,MDEyOklzc3VlQ29tbWVudDM5OTE1Nzk0NA==,9599,simonw,2018-06-21T16:07:49Z,2018-06-21T16:07:49Z,OWNER,Thanks to #319 the test suite now includes a m2m table: https://latest.datasette.io/fixtures-e14e080/searchable_tags,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322787470,inspect() should detect many-to-many relationships, https://github.com/simonw/datasette/issues/321#issuecomment-399171239,https://api.github.com/repos/simonw/datasette/issues/321,399171239,MDEyOklzc3VlQ29tbWVudDM5OTE3MTIzOQ==,9599,simonw,2018-06-21T16:51:08Z,2018-06-21T16:51:29Z,OWNER,"I may have misunderstood your problem here. I understood that the problem is that when using the `""%"" || :text || ""%""` construct the first hit to that page (with an empty string for `:text`) results in a `where neighborhood like ""%%""` query which is slow because it matches every row in the database. My fix was to add this to the where clause: where :text != '' and ... Which means that when you first load the page the where fails to match any rows and you get no results (and hopefully instant loading times assuming SQLite is smart enough to optimize this away). 
That's why you don't see any rows returned on this page: https://latest.datasette.io/fixtures-cafd088?sql=select+neighborhood%2C+facet_cities.name%2C+state%0D%0Afrom+facetable%0D%0A++++join+facet_cities+on+facetable.city_id+%3D+facet_cities.id%0D%0Awhere+%3Atext+%21%3D+%22%22+and+neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0D%0Aorder+by+neighborhood%3B","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334190959,Wildcard support in query parameters, https://github.com/simonw/datasette/issues/321#issuecomment-398973176,https://api.github.com/repos/simonw/datasette/issues/321,398973176,MDEyOklzc3VlQ29tbWVudDM5ODk3MzE3Ng==,9599,simonw,2018-06-21T04:34:11Z,2018-06-21T16:53:57Z,OWNER,"This is a little bit fiddly, but it's possible to do it using SQLite string concatenation. Here's an example: ``` select * from facetable where neighborhood like ""%"" || :text || ""%""; ``` Try it here: https://latest.datasette.io/fixtures-35b6eb6?sql=select+*+from+facetable+where+neighborhood+like+%22%25%22+%7C%7C+%3Atext+%7C%7C+%22%25%22%3B&text=town ![2018-06-20 at 9 33 pm](https://user-images.githubusercontent.com/9599/41698185-a52143f2-74d1-11e8-8d16-32bfc4542104.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334190959,Wildcard support in query parameters, https://github.com/simonw/datasette/issues/321#issuecomment-399173916,https://api.github.com/repos/simonw/datasette/issues/321,399173916,MDEyOklzc3VlQ29tbWVudDM5OTE3MzkxNg==,12617395,bsilverm,2018-06-21T17:00:10Z,2018-06-21T17:00:10Z,NONE,"Oh I see.. My issue is that the query executes with an empty string prior to the user submitting the parameters. I'll try adding your workaround to some of my queries. 
Thanks again,","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334190959,Wildcard support in query parameters, https://github.com/simonw/datasette/issues/326#issuecomment-399721346,https://api.github.com/repos/simonw/datasette/issues/326,399721346,MDEyOklzc3VlQ29tbWVudDM5OTcyMTM0Ng==,9599,simonw,2018-06-24T01:10:26Z,2018-06-24T01:10:26Z,OWNER,"Demo: go to https://vega.github.io/editor/ and paste in the following: ``` { ""data"": { ""url"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight/twitter-ratio%2Fsenators.csv?_size=max&_sort_desc=replies"", ""format"": { ""type"": ""csv"" } }, ""mark"": ""bar"", ""encoding"": { ""x"": { ""field"": ""created_at"", ""type"": ""temporal"" }, ""y"": { ""field"": ""replies"", ""type"": ""quantitative"" }, ""color"": { ""field"": ""user"", ""type"": ""nominal"" } } } ``` ![2018-06-23 at 6 10 pm](https://user-images.githubusercontent.com/9599/41814923-b1613370-7710-11e8-94ac-5b87b0b629ed.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",335141434,CSV should respect --cors and return cors headers, https://github.com/simonw/datasette/issues/272#issuecomment-400166540,https://api.github.com/repos/simonw/datasette/issues/272,400166540,MDEyOklzc3VlQ29tbWVudDQwMDE2NjU0MA==,9599,simonw,2018-06-26T03:29:43Z,2018-06-26T03:29:43Z,OWNER,This looks VERY relevant: https://github.com/encode/starlette,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-400571521,https://api.github.com/repos/simonw/datasette/issues/272,400571521,MDEyOklzc3VlQ29tbWVudDQwMDU3MTUyMQ==,647359,tomchristie,2018-06-27T07:30:07Z,2018-06-27T07:30:07Z,NONE,"I’m up for helping with this. Looks like you’d need static files support, which I’m planning on adding a component for. Anything else obviously missing? For a quick overview it looks very doable - the test client ought to me your test cases stay roughly the same. Are you using any middleware or other components for the Sanic ecosystem? Do you use cookies or sessions at all?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/328#issuecomment-400903687,https://api.github.com/repos/simonw/datasette/issues/328,400903687,MDEyOklzc3VlQ29tbWVudDQwMDkwMzY4Nw==,9599,simonw,2018-06-28T04:00:18Z,2018-06-28T04:00:18Z,OWNER,Need to ship docker image: #57 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336464733,"Installation instructions, including how to use the docker image", https://github.com/simonw/datasette/issues/57#issuecomment-400903871,https://api.github.com/repos/simonw/datasette/issues/57,400903871,MDEyOklzc3VlQ29tbWVudDQwMDkwMzg3MQ==,9599,simonw,2018-06-28T04:01:38Z,2018-06-28T04:01:38Z,OWNER,"Shipped to Docker Hub: https://hub.docker.com/r/datasetteproject/datasette/ I did this manually the first time. 
I'll set Travis up to do this automatically in #329","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694,Ship a Docker image of the whole thing, https://github.com/simonw/datasette/issues/328#issuecomment-400904514,https://api.github.com/repos/simonw/datasette/issues/328,400904514,MDEyOklzc3VlQ29tbWVudDQwMDkwNDUxNA==,9599,simonw,2018-06-28T04:06:39Z,2018-06-28T04:06:39Z,OWNER,https://datasette.readthedocs.io/en/latest/installation.html#using-docker,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336464733,"Installation instructions, including how to use the docker image", https://github.com/simonw/datasette/pull/280#issuecomment-401003061,https://api.github.com/repos/simonw/datasette/issues/280,401003061,MDEyOklzc3VlQ29tbWVudDQwMTAwMzA2MQ==,9599,simonw,2018-06-28T11:26:23Z,2018-06-28T11:26:23Z,OWNER,I pushed this to Docker Hub https://hub.docker.com/r/datasetteproject/datasette/ and added notes on how to use it to the documentation: http://datasette.readthedocs.io/en/latest/installation.html#using-docker,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/issues/276#issuecomment-401310732,https://api.github.com/repos/simonw/datasette/issues/276,401310732,MDEyOklzc3VlQ29tbWVudDQwMTMxMDczMg==,82988,psychemedia,2018-06-29T10:05:04Z,2018-06-29T10:07:25Z,CONTRIBUTOR,"@russs Different map projections can presumably be handled on the client side using a leaflet plugin to transform the geometry (eg [kartena/Proj4Leaflet](https://kartena.github.io/Proj4Leaflet/)) although the leaflet side would need to detect or be informed of the original projection? Another possibility would be to provide an easy way/guidance for users to create an FK'd table containing the WGS84 projection of a non-WGS84 geometry in the original/principle table? This could then as a proxy for serving GeoJSON to the leaflet map?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-401312981,https://api.github.com/repos/simonw/datasette/issues/276,401312981,MDEyOklzc3VlQ29tbWVudDQwMTMxMjk4MQ==,45057,russss,2018-06-29T10:14:54Z,2018-06-29T10:14:54Z,CONTRIBUTOR,"> @RusSs Different map projections can presumably be handled on the client side using a leaflet plugin to transform the geometry (eg kartena/Proj4Leaflet) although the leaflet side would need to detect or be informed of the original projection? Well, as @simonw mentioned, GeoJSON only supports WGS84, and GeoJSON (and/or TopoJSON) is the standard we probably want to aim for. 
On-the-fly reprojection in spatialite is not an issue anyway, and in general I think you want to be serving stuff to web maps in WGS84 or Web Mercator.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/332#issuecomment-401477622,https://api.github.com/repos/simonw/datasette/issues/332,401477622,MDEyOklzc3VlQ29tbWVudDQwMTQ3NzYyMg==,9599,simonw,2018-06-29T21:23:17Z,2018-06-29T21:23:55Z,OWNER,"https://docs.python.org/3/library/json.html#json.dump > **json.dump**(obj, fp, *, skipkeys=False, ensure_ascii=True, check_circular=True, allow_nan=True, cls=None, indent=None, separators=None, default=None, sort_keys=False, **kw)¶ > If `allow_nan` is false (default: True), then it will be a ValueError to serialize out of range float values (nan, inf, -inf) in strict compliance of the JSON specification. If allow_nan is true, their JavaScript equivalents (NaN, Infinity, -Infinity) will be used.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/332#issuecomment-401478223,https://api.github.com/repos/simonw/datasette/issues/332,401478223,MDEyOklzc3VlQ29tbWVudDQwMTQ3ODIyMw==,9599,simonw,2018-06-29T21:26:12Z,2018-06-29T21:26:19Z,OWNER,"I'm not sure what the correct thing to do here is. I don't want to throw a `ValueError` when trying to render that data as JSON, but I also want to produce JSON that doesn't break when fetched by JavaScript.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/334#issuecomment-403526263,https://api.github.com/repos/simonw/datasette/issues/334,403526263,MDEyOklzc3VlQ29tbWVudDQwMzUyNjI2Mw==,9599,simonw,2018-07-09T15:49:01Z,2018-07-09T15:49:01Z,OWNER,Yup that's definitely a bug.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",339095976,extra_options not passed to heroku publisher, https://github.com/simonw/datasette/issues/325#issuecomment-403263890,https://api.github.com/repos/simonw/datasette/issues/325,403263890,MDEyOklzc3VlQ29tbWVudDQwMzI2Mzg5MA==,9599,simonw,2018-07-08T05:35:20Z,2018-07-09T17:28:27Z,OWNER,Fixed: https://v0-23-2.datasette.io/fixtures-e14e080/table%2Fwith%2Fslashes.csv / https://v0-23-2.datasette.io/fixtures-e14e080/table%2Fwith%2Fslashes.csv/3,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",335064777,Error on row page if table has slashes in the name and ends in .csv, https://github.com/simonw/datasette/issues/334#issuecomment-403672561,https://api.github.com/repos/simonw/datasette/issues/334,403672561,MDEyOklzc3VlQ29tbWVudDQwMzY3MjU2MQ==,9599,simonw,2018-07-10T01:45:28Z,2018-07-10T01:45:28Z,OWNER,"Tested with `datasette publish heroku fixtures.db --extra-options=""--config sql_time_limit_ms:4000""` https://blooming-anchorage-31561.herokuapp.com/-/config","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 
0}",339095976,extra_options not passed to heroku publisher, https://github.com/simonw/datasette/issues/323#issuecomment-403855639,https://api.github.com/repos/simonw/datasette/issues/323,403855639,MDEyOklzc3VlQ29tbWVudDQwMzg1NTYzOQ==,9599,simonw,2018-07-10T15:03:36Z,2018-07-10T15:03:36Z,OWNER,I'm satisified with the improvement we got from the pip wheel cache.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334698969,Speed up Travis CI builds, https://github.com/simonw/datasette/issues/330#issuecomment-403855963,https://api.github.com/repos/simonw/datasette/issues/330,403855963,MDEyOklzc3VlQ29tbWVudDQwMzg1NTk2Mw==,9599,simonw,2018-07-10T15:04:31Z,2018-07-10T15:04:31Z,OWNER,This relates to #276 - I'm definitely convinced now that displaying a giant `b'...'` blob on the page is not a useful default.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336924199,Limit text display in cells containing large amounts of text, https://github.com/simonw/datasette/issues/331#issuecomment-403856114,https://api.github.com/repos/simonw/datasette/issues/331,403856114,MDEyOklzc3VlQ29tbWVudDQwMzg1NjExNA==,9599,simonw,2018-07-10T15:04:56Z,2018-07-10T15:04:56Z,OWNER,Great idea.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336936010,Datasette throws error when loading spatialite db without extension loaded, https://github.com/simonw/datasette/issues/331#issuecomment-403858949,https://api.github.com/repos/simonw/datasette/issues/331,403858949,MDEyOklzc3VlQ29tbWVudDQwMzg1ODk0OQ==,9599,simonw,2018-07-10T15:12:53Z,2018-07-10T15:13:04Z,OWNER,"``` $ datasette airports.sqlite Serve! files=('airports.sqlite',) on port 8001 Usage: datasette airports.sqlite [OPTIONS] [FILES]... Error: It looks like you're trying to load a SpatiaLite database without first loading the SpatiaLite module. Read more: https://datasette.readthedocs.io/en/latest/spatialite.html ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336936010,Datasette throws error when loading spatialite db without extension loaded, https://github.com/simonw/datasette/issues/316#issuecomment-398133159,https://api.github.com/repos/simonw/datasette/issues/316,398133159,MDEyOklzc3VlQ29tbWVudDM5ODEzMzE1OQ==,9599,simonw,2018-06-18T17:29:59Z,2018-07-10T15:14:53Z,OWNER,"For #271 I've been contemplating having Datasette work against an on-disk database that gets modified without needing to restart the server. For that to work, I'll have to dramatically change the inspect() mechanism. It may be that inspect becomes an optional optimization in the future.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333238932,datasette inspect takes a very long time on large dbs, https://github.com/simonw/datasette/issues/335#issuecomment-403865063,https://api.github.com/repos/simonw/datasette/issues/335,403865063,MDEyOklzc3VlQ29tbWVudDQwMzg2NTA2Mw==,9599,simonw,2018-07-10T15:29:32Z,2018-07-10T15:29:32Z,OWNER,"Huh... from https://docs.brew.sh/Acceptable-Formulae > We frown on authors submitting their own work unless it is very popular. 
Marking this one as ""help wanted"" :)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",339505204,Package datasette for installation using homebrew, https://github.com/simonw/datasette/issues/335#issuecomment-403863927,https://api.github.com/repos/simonw/datasette/issues/335,403863927,MDEyOklzc3VlQ29tbWVudDQwMzg2MzkyNw==,9599,simonw,2018-07-10T15:26:27Z,2018-07-10T15:29:54Z,OWNER,Here are some useful examples of other Python apps that have been packaged using the recipe described above: https://github.com/Homebrew/homebrew-core/search?utf8=%E2%9C%93&q=virtualenv_install_with_resources&type=,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",339505204,Package datasette for installation using homebrew, https://github.com/simonw/datasette/issues/335#issuecomment-403866099,https://api.github.com/repos/simonw/datasette/issues/335,403866099,MDEyOklzc3VlQ29tbWVudDQwMzg2NjA5OQ==,9599,simonw,2018-07-10T15:32:14Z,2018-07-10T15:32:14Z,OWNER,"I can host a custom tap without needing to get anything accepted into homebrew-core: https://docs.brew.sh/How-to-Create-and-Maintain-a-Tap Since my principle goal here is ensuring an easy installation path for people who are familiar with `brew` but don't know how to use pip and Python 3 that could be a good option.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",339505204,Package datasette for installation using homebrew, https://github.com/simonw/datasette/issues/330#issuecomment-403868584,https://api.github.com/repos/simonw/datasette/issues/330,403868584,MDEyOklzc3VlQ29tbWVudDQwMzg2ODU4NA==,9599,simonw,2018-07-10T15:39:12Z,2018-07-10T16:21:08Z,OWNER,"I think this makes sense for the HTML view (not for JSON or CSV). 
It could be controlled by a new [config option](http://datasette.readthedocs.io/en/latest/config.html), `truncate_cells_html` - which is on by default but can be turned off.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336924199,Limit text display in cells containing large amounts of text, https://github.com/simonw/datasette/issues/330#issuecomment-403906747,https://api.github.com/repos/simonw/datasette/issues/330,403906747,MDEyOklzc3VlQ29tbWVudDQwMzkwNjc0Nw==,9599,simonw,2018-07-10T17:39:46Z,2018-07-10T17:39:46Z,OWNER,"``` datasette publish now timezones.db --spatialite \ --extra-options=""--config truncate_cells_html:200"" \ --name=datasette-issue-330-demo \ --branch=master ``` https://datasette-issue-330-demo-sbelwxttfn.now.sh/timezones-3cb9f64/timezones ![2018-07-10 at 10 39 am](https://user-images.githubusercontent.com/9599/42527428-7eabc6c8-842d-11e8-91ac-5666dbc5872c.png) But https://datasette-issue-330-demo-sbelwxttfn.now.sh/timezones-3cb9f64/timezones/1 displays the full blob.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336924199,Limit text display in cells containing large amounts of text, https://github.com/simonw/datasette/issues/330#issuecomment-403907193,https://api.github.com/repos/simonw/datasette/issues/330,403907193,MDEyOklzc3VlQ29tbWVudDQwMzkwNzE5Mw==,9599,simonw,2018-07-10T17:41:14Z,2018-07-10T17:41:14Z,OWNER,Documentation: http://datasette.readthedocs.io/en/latest/config.html#truncate-cells-html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336924199,Limit text display in cells containing large amounts of text, https://github.com/simonw/datasette/issues/191#issuecomment-403908704,https://api.github.com/repos/simonw/datasette/issues/191,403908704,MDEyOklzc3VlQ29tbWVudDQwMzkwODcwNA==,9599,simonw,2018-07-10T17:46:13Z,2018-07-10T17:46:13Z,OWNER,I consider this resolved by #46 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/139#issuecomment-403909389,https://api.github.com/repos/simonw/datasette/issues/139,403909389,MDEyOklzc3VlQ29tbWVudDQwMzkwOTM4OQ==,9599,simonw,2018-07-10T17:48:18Z,2018-07-10T17:48:18Z,OWNER,This is done! 
https://github.com/simonw/datasette-vega,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275493851,Build a visualization plugin for Vega, https://github.com/simonw/datasette/issues/143#issuecomment-403909469,https://api.github.com/repos/simonw/datasette/issues/143,403909469,MDEyOklzc3VlQ29tbWVudDQwMzkwOTQ2OQ==,9599,simonw,2018-07-10T17:48:34Z,2018-07-10T17:48:34Z,OWNER,This is now a dupe of https://github.com/simonw/datasette-vega/issues/4,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275939188,"Mechanism for ""suggested visualizations""", https://github.com/simonw/datasette/issues/87#issuecomment-403909671,https://api.github.com/repos/simonw/datasette/issues/87,403909671,MDEyOklzc3VlQ29tbWVudDQwMzkwOTY3MQ==,9599,simonw,2018-07-10T17:49:12Z,2018-07-10T17:49:12Z,OWNER,This was fixed by https://github.com/simonw/datasette/commit/6a32684ebba89dfe882e1147b23aa8778479f5d8#diff-354f30a63fb0907d4ad57269548329e3,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273709194,Configure Travis to release new tags to PyPI, https://github.com/simonw/datasette/issues/140#issuecomment-403910318,https://api.github.com/repos/simonw/datasette/issues/140,403910318,MDEyOklzc3VlQ29tbWVudDQwMzkxMDMxOA==,9599,simonw,2018-07-10T17:51:11Z,2018-07-10T17:51:11Z,OWNER,This would be a nice example plugin to demonstrate plugin configuration options in #231,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275755475,Heatmap visualization plugin, https://github.com/simonw/datasette/issues/27#issuecomment-403910774,https://api.github.com/repos/simonw/datasette/issues/27,403910774,MDEyOklzc3VlQ29tbWVudDQwMzkxMDc3NA==,9599,simonw,2018-07-10T17:52:41Z,2018-07-10T17:52:41Z,OWNER,I consider this handled by https://github.com/simonw/datasette-vega,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267886330,Ability to plot a simple graph, https://github.com/simonw/datasette/issues/137#issuecomment-345750135,https://api.github.com/repos/simonw/datasette/issues/137,345750135,MDEyOklzc3VlQ29tbWVudDM0NTc1MDEzNQ==,9599,simonw,2017-11-20T16:30:56Z,2018-07-10T17:53:13Z,OWNER,"One possible route: introduce prefixes eg `?a.Trees.age__gt=5&a.Trees._group_count=qSpecies&b.Trees.age__gt=10&b.Trees._group_count=qSpecies` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275415799,Ability to combine multiple SQL queries on a single graph, https://github.com/simonw/datasette/issues/140#issuecomment-403939399,https://api.github.com/repos/simonw/datasette/issues/140,403939399,MDEyOklzc3VlQ29tbWVudDQwMzkzOTM5OQ==,9599,simonw,2018-07-10T19:30:17Z,2018-07-10T19:30:41Z,OWNER,Building this using Svelte would also produce a neat example of a plugin that uses Svelte: https://svelte.technology/guide - and if I like it I might part datasette-vega to it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275755475,Heatmap visualization plugin, 
https://github.com/simonw/datasette/issues/272#issuecomment-403959704,https://api.github.com/repos/simonw/datasette/issues/272,403959704,MDEyOklzc3VlQ29tbWVudDQwMzk1OTcwNA==,9599,simonw,2018-07-10T20:44:47Z,2018-07-10T20:44:47Z,OWNER,"No cookies or sessions - no POST requests in fact, Datasette just cares about GET (path and querystring) and being able to return custom HTTP headers.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/336#issuecomment-403996143,https://api.github.com/repos/simonw/datasette/issues/336,403996143,MDEyOklzc3VlQ29tbWVudDQwMzk5NjE0Mw==,9599,simonw,2018-07-10T23:21:27Z,2018-07-10T23:21:27Z,OWNER,Easiest way to do this I think would be to make those help blocks separate files in the docs/ directory (publish-help.txt perhaps) and then include them with a sphinx directive: https://reinout.vanrees.org/weblog/2010/12/08/include-external-in-sphinx.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",340039409,Ensure --help examples in docs are always up to date, https://github.com/simonw/datasette/issues/337#issuecomment-404021589,https://api.github.com/repos/simonw/datasette/issues/337,404021589,MDEyOklzc3VlQ29tbWVudDQwNDAyMTU4OQ==,9599,simonw,2018-07-11T02:07:32Z,2018-07-11T02:07:32Z,OWNER,http://datasette.readthedocs.io/en/latest/publish.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",340065374,Documentation for datasette publish and datasette package, https://github.com/simonw/datasette/issues/336#issuecomment-404021890,https://api.github.com/repos/simonw/datasette/issues/336,404021890,MDEyOklzc3VlQ29tbWVudDQwNDAyMTg5MA==,9599,simonw,2018-07-11T02:09:25Z,2018-07-11T02:09:25Z,OWNER,"I decided against the unit tests, instead I have a new script called `./update-docs-help.sh` which I can run any time I want to refresh the included documentation: https://github.com/simonw/datasette/commit/aec3ae53237e43b0c268dbf9b58fa265ef38cfe1#diff-cb15a1e5a244bb82ad4afce67f252543","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",340039409,Ensure --help examples in docs are always up to date, https://github.com/simonw/datasette/issues/335#issuecomment-404208602,https://api.github.com/repos/simonw/datasette/issues/335,404208602,MDEyOklzc3VlQ29tbWVudDQwNDIwODYwMg==,9599,simonw,2018-07-11T15:20:12Z,2018-07-11T15:20:12Z,OWNER,Here's a good example of a homebrew tap: https://github.com/saulpw/homebrew-vd,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",339505204,Package datasette for installation using homebrew, https://github.com/simonw/datasette/issues/338#issuecomment-404209205,https://api.github.com/repos/simonw/datasette/issues/338,404209205,MDEyOklzc3VlQ29tbWVudDQwNDIwOTIwNQ==,9599,simonw,2018-07-11T15:21:47Z,2018-07-11T15:21:47Z,OWNER,"Oops, opened this in the wrong repo - moved it here: https://github.com/simonw/datasette-vega/issues/13","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",340282796,Only load vegaEmbed if charting tools are enabled, 
https://github.com/simonw/datasette/issues/339#issuecomment-404338345,https://api.github.com/repos/simonw/datasette/issues/339,404338345,MDEyOklzc3VlQ29tbWVudDQwNDMzODM0NQ==,9599,simonw,2018-07-11T23:09:24Z,2018-07-11T23:09:24Z,OWNER,"It sounds like you're running into the Sanic default response timeout value of 60 seconds: https://github.com/channelcat/sanic/blob/master/docs/sanic/config.md#builtin-configuration-values For the moment you can over-ride that using an environment variable like this: SANIC_RESPONSE_TIMEOUT=6000 datasette fivethirtyeight.db -p 8008 --config sql_time_limit_ms:600000","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",340396247,Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way, https://github.com/simonw/datasette/issues/272#issuecomment-404514973,https://api.github.com/repos/simonw/datasette/issues/272,404514973,MDEyOklzc3VlQ29tbWVudDQwNDUxNDk3Mw==,647359,tomchristie,2018-07-12T13:38:24Z,2018-07-12T13:38:24Z,NONE,"Okay. I reckon the latest version should have all the kinds of components you'd need: Recently added ASGI components for Routing and Static Files support, as well as making few tweaks to make sure requests and responses are instantiated efficiently. Don't have any redirect-to-slash / redirect-to-non-slash stuff out of the box yet, which it looks like you might miss.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/339#issuecomment-404565566,https://api.github.com/repos/simonw/datasette/issues/339,404565566,MDEyOklzc3VlQ29tbWVudDQwNDU2NTU2Ng==,9599,simonw,2018-07-12T16:08:42Z,2018-07-12T16:08:42Z,OWNER,I'm going to turn this into an issue about better supporting the above option.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",340396247,Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way, https://github.com/simonw/datasette/issues/332#issuecomment-402243153,https://api.github.com/repos/simonw/datasette/issues/332,402243153,MDEyOklzc3VlQ29tbWVudDQwMjI0MzE1Mw==,9599,simonw,2018-07-03T17:58:50Z,2018-07-12T16:10:39Z,OWNER,"I think I'm going to return `null` in the JSON for infinity/nan values by default, but if you send `_nan=1` I will instead return invalid JSON with `Infinity` or `NaN` in it (since you have opted in to getting those and hence should be able to handle them).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/332#issuecomment-404567587,https://api.github.com/repos/simonw/datasette/issues/332,404567587,MDEyOklzc3VlQ29tbWVudDQwNDU2NzU4Nw==,9599,simonw,2018-07-12T16:15:29Z,2018-07-12T16:17:54Z,OWNER,Here's how plotly handled this issue: https://github.com/plotly/plotly.py/pull/203 - see also https://github.com/plotly/plotly.py/blob/213602df6c89b45ce2b811ed2591171c961408e7/plotly/utils.py#L137,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, 
https://github.com/simonw/datasette/issues/332#issuecomment-404569003,https://api.github.com/repos/simonw/datasette/issues/332,404569003,MDEyOklzc3VlQ29tbWVudDQwNDU2OTAwMw==,9599,simonw,2018-07-12T16:20:06Z,2018-07-12T16:20:06Z,OWNER,And here's how django-rest-framework did it: https://github.com/encode/django-rest-framework/pull/4918/files,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/332#issuecomment-404574598,https://api.github.com/repos/simonw/datasette/issues/332,404574598,MDEyOklzc3VlQ29tbWVudDQwNDU3NDU5OA==,9599,simonw,2018-07-12T16:39:51Z,2018-07-12T16:39:51Z,OWNER,Since my data is all flat lists of values I don't think I need to customize the JSON encoder itself (no need to deal with nested values). I'll fix the data on its way into the encoder instead. This will also help if I decide to move to uJSON for better performance #48,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/339#issuecomment-404576136,https://api.github.com/repos/simonw/datasette/issues/339,404576136,MDEyOklzc3VlQ29tbWVudDQwNDU3NjEzNg==,12617395,bsilverm,2018-07-12T16:45:08Z,2018-07-12T16:45:08Z,NONE,Thanks for the quick reply. Looks like that is working well.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",340396247,Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way, https://github.com/simonw/datasette/issues/327#issuecomment-404923318,https://api.github.com/repos/simonw/datasette/issues/327,404923318,MDEyOklzc3VlQ29tbWVudDQwNDkyMzMxOA==,9599,simonw,2018-07-13T18:58:11Z,2018-07-13T18:58:11Z,OWNER,Relevant: https://code.fb.com/data-infrastructure/xars-a-more-efficient-open-source-system-for-self-contained-executables/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",335200136,Explore if SquashFS can be used to shrink size of packaged Docker containers, https://github.com/simonw/datasette/issues/342#issuecomment-404953877,https://api.github.com/repos/simonw/datasette/issues/342,404953877,MDEyOklzc3VlQ29tbWVudDQwNDk1Mzg3Nw==,9599,simonw,2018-07-13T21:05:12Z,2018-07-13T21:05:12Z,OWNER,That's a good idea. We already do this for tables - e.g. 
on https://fivethirtyeight.datasettes.com/fivethirtyeight-ac35616/most-common-name%2Fsurnames - so having it as an option for canned queries definitely makes sense.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",341123355,Requesting support for query description, https://github.com/simonw/datasette/issues/342#issuecomment-404954202,https://api.github.com/repos/simonw/datasette/issues/342,404954202,MDEyOklzc3VlQ29tbWVudDQwNDk1NDIwMg==,9599,simonw,2018-07-13T21:06:53Z,2018-07-13T21:07:13Z,OWNER,"https://timezones-api.now.sh/-/metadata currently shows this: ``` { ""databases"": { ""timezones"": { ""license"": ""ODbL"", ""license_url"": ""http://opendatacommons.org/licenses/odbl/"", ""queries"": { ""by_point"": ""select tzid\nfrom\n timezones\nwhere\n within(GeomFromText(\u0027POINT(\u0027 || :longitude || \u0027 \u0027 || :latitude || \u0027)\u0027), timezones.Geometry)\n and rowid in (\n SELECT pkid FROM idx_timezones_Geometry\n where xmin \u003c :longitude\n and xmax \u003e :longitude\n and ymin \u003c :latitude\n and ymax \u003e :latitude\n )"" }, ""source"": ""timezone-boundary-builder"", ""source_url"": ""https://github.com/evansiroky/timezone-boundary-builder"", ""tables"": { ""timezones"": { ""license"": ""ODbL"", ""license_url"": ""http://opendatacommons.org/licenses/odbl/"", ""sortable_columns"": [ ""tzid"" ], ""source"": ""timezone-boundary-builder"", ""source_url"": ""https://github.com/evansiroky/timezone-boundary-builder"" } } } }, ""license"": ""ODbL"", ""license_url"": ""http://opendatacommons.org/licenses/odbl/"", ""source"": ""timezone-boundary-builder"", ""source_url"": ""https://github.com/evansiroky/timezone-boundary-builder"", ""title"": ""OpenStreetMap Time Zone Boundaries"" } ``` We could support the value part of the `""queries""` array optionally being a dictionary with the same set of metadata fields supported for a table, plus a new `""sql""` key to hold the SQL for the query. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",341123355,Requesting support for query description, https://github.com/simonw/datasette/issues/342#issuecomment-404954672,https://api.github.com/repos/simonw/datasette/issues/342,404954672,MDEyOklzc3VlQ29tbWVudDQwNDk1NDY3Mg==,9599,simonw,2018-07-13T21:09:01Z,2018-07-13T21:09:01Z,OWNER,"So it would look like this: ``` { ""databases"": { ""timezones"": { ""license"": ""ODbL"", ""license_url"": ""http://opendatacommons.org/licenses/odbl/"", ""queries"": { ""by_point"": { ""title"": ""Timezones by point"", ""description"": ""Find the timezone for a latitude/longitude point"", ""sql"": ""select tzid\nfrom\n timezones\nwhere\n within(GeomFromText('POINT(' || :longitude || ' ' || :latitude || ')'), timezones.Geometry)\n and rowid in (\n SELECT pkid FROM idx_timezones_Geometry\n where xmin < :longitude\n and xmax > :longitude\n and ymin < :latitude\n and ymax > :latitude\n )"" } } } } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",341123355,Requesting support for query description, https://github.com/simonw/datasette/issues/344#issuecomment-405022335,https://api.github.com/repos/simonw/datasette/issues/344,405022335,MDEyOklzc3VlQ29tbWVudDQwNTAyMjMzNQ==,45057,russss,2018-07-14T13:00:48Z,2018-07-14T13:00:48Z,CONTRIBUTOR,"Looks like this was a red herring actually, and heroku had a blip when I was testing it...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",341229113,datasette publish heroku fails without name provided, https://github.com/simonw/datasette/pull/345#issuecomment-405025731,https://api.github.com/repos/simonw/datasette/issues/345,405025731,MDEyOklzc3VlQ29tbWVudDQwNTAyNTczMQ==,9599,simonw,2018-07-14T14:04:31Z,2018-07-14T14:04:31Z,OWNER,"Fantastic, we really needed this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",341235633,Allow app names for `datasette publish heroku`, https://github.com/simonw/datasette/issues/343#issuecomment-405026441,https://api.github.com/repos/simonw/datasette/issues/343,405026441,MDEyOklzc3VlQ29tbWVudDQwNTAyNjQ0MQ==,45057,russss,2018-07-14T14:17:14Z,2018-07-14T14:17:14Z,CONTRIBUTOR,This probably depends on #294.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",341228846,Render boolean fields better by default, https://github.com/simonw/datasette/issues/294#issuecomment-405026800,https://api.github.com/repos/simonw/datasette/issues/294,405026800,MDEyOklzc3VlQ29tbWVudDQwNTAyNjgwMA==,45057,russss,2018-07-14T14:24:31Z,2018-07-14T14:24:31Z,CONTRIBUTOR,"I had a quick look at this in relation to #343 and I feel like it might be worth modelling the inspected table metadata internally as an object rather than a dict. (We'd still have to serialise it back to JSON.) There are a few places where we rely on the structure of this metadata dict for various reasons, including in templates (and potentially also in user templates). 
It would be nice to have a reasonably well defined API for accessing metadata internally so that it's clearer what we're breaking.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327365110,inspect should record column types, https://github.com/simonw/datasette/issues/342#issuecomment-405138460,https://api.github.com/repos/simonw/datasette/issues/342,405138460,MDEyOklzc3VlQ29tbWVudDQwNTEzODQ2MA==,9599,simonw,2018-07-16T02:42:32Z,2018-07-16T02:42:32Z,OWNER,"Demos: * https://latest.datasette.io/fixtures/neighborhood_search * https://timezones-api.now.sh/timezones/by_point Documentation: http://datasette.readthedocs.io/en/latest/sql_queries.html#canned-queries","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",341123355,Requesting support for query description, https://github.com/simonw/datasette/issues/332#issuecomment-405968983,https://api.github.com/repos/simonw/datasette/issues/332,405968983,MDEyOklzc3VlQ29tbWVudDQwNTk2ODk4Mw==,9599,simonw,2018-07-18T15:18:57Z,2018-07-18T15:18:57Z,OWNER,Maybe argument should be `?_json_nan=1` since that makes it more explicitly obvious what is going on here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/308#issuecomment-405971920,https://api.github.com/repos/simonw/datasette/issues/308,405971920,MDEyOklzc3VlQ29tbWVudDQwNTk3MTkyMA==,9599,simonw,2018-07-18T15:27:12Z,2018-07-18T15:27:12Z,OWNER,"It looks like there are a few extra options we should support: https://devcenter.heroku.com/articles/heroku-cli-commands ``` -t, --team=team team to use --region=region specify region for the app to run in --space=space the private space to create the app in ``` Since these differ from the options for Zeit Now I think this means splitting up `datasette publish now` and `datasette publish Heroku` into separate subcommands.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",330826972,"Support extra Heroku apps:create options - region, space, team", https://github.com/simonw/datasette/issues/333#issuecomment-405975025,https://api.github.com/repos/simonw/datasette/issues/333,405975025,MDEyOklzc3VlQ29tbWVudDQwNTk3NTAyNQ==,9599,simonw,2018-07-18T15:36:11Z,2018-07-18T15:40:04Z,OWNER,"A `force_https_api_urls` config option would work here - if set, Datasette will ignore the incoming protocol and always use https. The `datasette deploy now` command could then add that as an option passed to `datasette serve`. This is the pattern which is producing incorrect URLs on Zeit Now, because the Sanic `request.url` property is not being correctly set. 
https://github.com/simonw/datasette/blob/6e37f091edec35e2706197489f54fff5d890c63c/datasette/views/table.py#L653-L655 Suggested help text: > Always use https:// for URLs output as part of Datasette API responses","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",338768551,Datasette on Zeit Now returns http URLs for facet and next links, https://github.com/simonw/datasette/issues/333#issuecomment-405988035,https://api.github.com/repos/simonw/datasette/issues/333,405988035,MDEyOklzc3VlQ29tbWVudDQwNTk4ODAzNQ==,9599,simonw,2018-07-18T16:12:35Z,2018-07-18T16:12:35Z,OWNER,"I'll add a `absolute_url(request, path)` method on the base view class which knows to check the new config option.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",338768551,Datasette on Zeit Now returns http URLs for facet and next links, https://github.com/simonw/datasette/issues/333#issuecomment-407109113,https://api.github.com/repos/simonw/datasette/issues/333,407109113,MDEyOklzc3VlQ29tbWVudDQwNzEwOTExMw==,9599,simonw,2018-07-23T15:59:02Z,2018-07-23T15:59:02Z,OWNER,I still need to modify `datasette publish now` to set this config option on the instances that it deploys.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",338768551,Datasette on Zeit Now returns http URLs for facet and next links, https://github.com/simonw/datasette/issues/332#issuecomment-407262311,https://api.github.com/repos/simonw/datasette/issues/332,407262311,MDEyOklzc3VlQ29tbWVudDQwNzI2MjMxMQ==,9599,simonw,2018-07-24T02:43:03Z,2018-07-24T02:43:03Z,OWNER,Actually SQLite doesn't handle NaN at all (it treats it as null) so I'm going to change this ticket to just deal with Infinity and -Infinity.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/332#issuecomment-407262436,https://api.github.com/repos/simonw/datasette/issues/332,407262436,MDEyOklzc3VlQ29tbWVudDQwNzI2MjQzNg==,9599,simonw,2018-07-24T02:43:50Z,2018-07-24T02:43:50Z,OWNER,I'm going with `_json_infinity=1` as the querystring argument.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/332#issuecomment-407262561,https://api.github.com/repos/simonw/datasette/issues/332,407262561,MDEyOklzc3VlQ29tbWVudDQwNzI2MjU2MQ==,9599,simonw,2018-07-24T02:44:39Z,2018-07-24T02:44:39Z,OWNER,According to https://www.mail-archive.com/sqlite-users@mailinglists.sqlite.org/msg110573.html you can insert Infinity/-Infinity in raw SQL (as used by our fixtures) using 1e999 and -1e999.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/332#issuecomment-407267707,https://api.github.com/repos/simonw/datasette/issues/332,407267707,MDEyOklzc3VlQ29tbWVudDQwNzI2NzcwNw==,9599,simonw,2018-07-24T03:20:08Z,2018-07-24T03:20:08Z,OWNER,"Demo: * 
https://700d83d.datasette.io/fixtures-dcc1dbf/infinity.json - Infinity converted to Null * https://700d83d.datasette.io/fixtures-dcc1dbf/infinity.json?_json_infinity=on - invalid JSON containing `Infinity` and `-Infinity`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/332#issuecomment-407267762,https://api.github.com/repos/simonw/datasette/issues/332,407267762,MDEyOklzc3VlQ29tbWVudDQwNzI2Nzc2Mg==,9599,simonw,2018-07-24T03:20:33Z,2018-07-24T03:20:33Z,OWNER,Documentation: http://datasette.readthedocs.io/en/latest/json_api.html#special-json-arguments,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/333#issuecomment-407267966,https://api.github.com/repos/simonw/datasette/issues/333,407267966,MDEyOklzc3VlQ29tbWVudDQwNzI2Nzk2Ng==,9599,simonw,2018-07-24T03:21:42Z,2018-07-24T03:21:42Z,OWNER,Demo: https://700d83d.datasette.io/fixtures-dcc1dbf/facetable.json?_facet=state&_size=5&_labels=on,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",338768551,Datasette on Zeit Now returns http URLs for facet and next links, https://github.com/simonw/datasette/issues/320#issuecomment-407269243,https://api.github.com/repos/simonw/datasette/issues/320,407269243,MDEyOklzc3VlQ29tbWVudDQwNzI2OTI0Mw==,9599,simonw,2018-07-24T03:30:32Z,2018-07-24T03:30:32Z,OWNER,"* No primary key => no ""object"" option: https://latest.datasette.io/fixtures-dcc1dbf/no_primary_key * Has a primary key => show ""object"" option: https://latest.datasette.io/fixtures-dcc1dbf/complex_foreign_keys * Has a next page => has ""stream all rows"" option: https://latest.datasette.io/fixtures-dcc1dbf/no_primary_key * Has foreign key references = show default-checked ""expand labels"" option: https://latest.datasette.io/fixtures-dcc1dbf/complex_foreign_keys * Does not have a next page => do not show ""stream all rows"" option: https://latest.datasette.io/fixtures-dcc1dbf/complex_foreign_keys ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334169932,Need unit tests covering the different states for the advanced export box, https://github.com/simonw/datasette/issues/298#issuecomment-407274059,https://api.github.com/repos/simonw/datasette/issues/298,407274059,MDEyOklzc3VlQ29tbWVudDQwNzI3NDA1OQ==,9599,simonw,2018-07-24T04:03:05Z,2018-07-24T04:03:05Z,OWNER,Demo: https://latest.datasette.io/fixtures-dcc1dbf?sql=select+%28%27https%3A%2F%2Ftwitter.com%2F%27+%7C%7C+%27simonw%27%29+as+user_url%3B,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327459829,URLify URLs in results from custom SQL statements / views, https://github.com/simonw/datasette/issues/329#issuecomment-407275996,https://api.github.com/repos/simonw/datasette/issues/329,407275996,MDEyOklzc3VlQ29tbWVudDQwNzI3NTk5Ng==,9599,simonw,2018-07-24T04:18:28Z,2018-07-24T04:18:28Z,OWNER,Hopefully this will do the trick: https://github.com/simonw/datasette/commit/2bdab66772dca51b0c729b4e1063610cb2edd890,"{""total_count"": 0, ""+1"": 0, 
""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336465018,Travis should push tagged images to Docker Hub for each release, https://github.com/simonw/datasette/issues/329#issuecomment-407280689,https://api.github.com/repos/simonw/datasette/issues/329,407280689,MDEyOklzc3VlQ29tbWVudDQwNzI4MDY4OQ==,9599,simonw,2018-07-24T04:52:58Z,2018-07-24T04:52:58Z,OWNER,"It almost worked... but I had to fix the `docker login` command: https://github.com/simonw/datasette/commit/3a46d5e3c4278e74c3694f36995ea134bff800bc Hopefully the next release will be published correctly.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336465018,Travis should push tagged images to Docker Hub for each release, https://github.com/simonw/datasette/issues/336#issuecomment-407450815,https://api.github.com/repos/simonw/datasette/issues/336,407450815,MDEyOklzc3VlQ29tbWVudDQwNzQ1MDgxNQ==,9599,simonw,2018-07-24T15:35:03Z,2018-07-24T15:35:03Z,OWNER,Actually I do like the idea of a unit test that reminds me if I've forgotten to update the included files.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",340039409,Ensure --help examples in docs are always up to date, https://github.com/simonw/datasette/issues/301#issuecomment-407979065,https://api.github.com/repos/simonw/datasette/issues/301,407979065,MDEyOklzc3VlQ29tbWVudDQwNzk3OTA2NQ==,9599,simonw,2018-07-26T05:17:34Z,2018-07-26T05:17:34Z,OWNER,This code now lives in https://github.com/simonw/datasette/blob/master/datasette/publish/heroku.py,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328155946,"--spatialite option for ""datasette publish heroku""", https://github.com/simonw/datasette/issues/217#issuecomment-407980050,https://api.github.com/repos/simonw/datasette/issues/217,407980050,MDEyOklzc3VlQ29tbWVudDQwNzk4MDA1MA==,9599,simonw,2018-07-26T05:24:17Z,2018-07-26T05:24:17Z,OWNER,Documentation: http://datasette.readthedocs.io/en/latest/plugins.html#publish-subcommand-publish,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314725342,Plugin support for datasette publish, https://github.com/simonw/datasette/pull/349#issuecomment-407980716,https://api.github.com/repos/simonw/datasette/issues/349,407980716,MDEyOklzc3VlQ29tbWVudDQwNzk4MDcxNg==,9599,simonw,2018-07-26T05:28:54Z,2018-07-26T05:28:54Z,OWNER,"Documentation here: http://datasette.readthedocs.io/en/latest/plugins.html#publish-subcommand-publish The best way to write a new publish plugin is to check out how the Heroku and Now default plugins are implemented: https://github.com/simonw/datasette/tree/master/datasette/publish","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",344695978,"publish_subcommand hook + default plugins mechanism, used for publish heroku/now", https://github.com/simonw/datasette/issues/348#issuecomment-407983375,https://api.github.com/repos/simonw/datasette/issues/348,407983375,MDEyOklzc3VlQ29tbWVudDQwNzk4MzM3NQ==,9599,simonw,2018-07-26T05:46:01Z,2018-07-26T05:46:01Z,OWNER,"Oops, forgot to commit those unit tests.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, 
""eyes"": 0}",344656114,"Unit tests for ""datasette publish""", https://github.com/simonw/datasette/issues/272#issuecomment-408097719,https://api.github.com/repos/simonw/datasette/issues/272,408097719,MDEyOklzc3VlQ29tbWVudDQwODA5NzcxOQ==,9599,simonw,2018-07-26T13:29:38Z,2018-07-26T13:29:38Z,OWNER,It looks like that's a bug in Starlette - filed here: https://github.com/encode/starlette/issues/32,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-408093480,https://api.github.com/repos/simonw/datasette/issues/272,408093480,MDEyOklzc3VlQ29tbWVudDQwODA5MzQ4MA==,9599,simonw,2018-07-26T13:15:55Z,2018-07-26T13:46:40Z,OWNER,"I'm now hacking around with an initial version of this in the [starlette branch](https://github.com/simonw/datasette/tree/starlette). Here's my work in progress, deployed using `datasette publish now fixtures.db -n datasette-starlette-demo --branch=starlette --extra-options=""--asgi""` https://datasette-starlette-demo.now.sh/ Lots more work to do - the CSS isn't being served correctly for example, it's showing this error when I hit `/-/static/app.css`: ``` INFO: 127.0.0.1 - ""GET /-/static/app.css HTTP/1.1"" 200 ERROR: Exception in ASGI application Traceback (most recent call last): File ""/Users/simonw/Dropbox/Development/datasette/venv/lib/python3.6/site-packages/uvicorn/protocols/http/httptools_impl.py"", line 363, in run_asgi result = await asgi(self.receive, self.send) File ""/Users/simonw/Dropbox/Development/datasette/venv/lib/python3.6/site-packages/starlette/staticfiles.py"", line 91, in __call__ await response(receive, send) File ""/Users/simonw/Dropbox/Development/datasette/venv/lib/python3.6/site-packages/starlette/response.py"", line 180, in __call__ {""type"": ""http.response.body"", ""body"": chunk, ""more_body"": False} File ""/Users/simonw/Dropbox/Development/datasette/venv/lib/python3.6/site-packages/uvicorn/protocols/http/httptools_impl.py"", line 483, in send raise RuntimeError(""Response content shorter than Content-Length"") RuntimeError: Response content shorter than Content-Length ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-408105251,https://api.github.com/repos/simonw/datasette/issues/272,408105251,MDEyOklzc3VlQ29tbWVudDQwODEwNTI1MQ==,9599,simonw,2018-07-26T13:54:06Z,2018-07-26T13:54:06Z,OWNER,"Tom shipped my fix for that bug already, so https://datasette-starlette-demo.now.sh/ is now serving CSS!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-408478935,https://api.github.com/repos/simonw/datasette/issues/272,408478935,MDEyOklzc3VlQ29tbWVudDQwODQ3ODkzNQ==,9599,simonw,2018-07-27T17:00:08Z,2018-07-27T17:00:08Z,OWNER,"Refs https://github.com/encode/uvicorn/issues/168","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, 
https://github.com/simonw/datasette/issues/299#issuecomment-408581551,https://api.github.com/repos/simonw/datasette/issues/299,408581551,MDEyOklzc3VlQ29tbWVudDQwODU4MTU1MQ==,9599,simonw,2018-07-28T04:24:05Z,2018-07-28T04:24:05Z,OWNER,New documentation is now online here: https://datasette.readthedocs.io/en/latest/pages.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327461381,Documentation covering ALL datasette URLs, https://github.com/simonw/datasette/issues/259#issuecomment-392214791,https://api.github.com/repos/simonw/datasette/issues/259,392214791,MDEyOklzc3VlQ29tbWVudDM5MjIxNDc5MQ==,9599,simonw,2018-05-25T23:43:15Z,2018-07-29T00:56:03Z,OWNER,"We may need to derive a usable name for each of these relationships that can be used in eg querystring parameters. The name of the join table is a reasonable choice here. Say the join table is called `event_tags` - the querystring for returning all events that are tagged `badger` could be `/db/events?_m2m_event_tags__tag=badger` perhaps? But what if `event_tags` has more than one foreign key back to `events`? Might need to specify the column in `events` that is referred back to by `event_tags` somehow in that case.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322787470,inspect() should detect many-to-many relationships, https://github.com/simonw/datasette/issues/259#issuecomment-409087501,https://api.github.com/repos/simonw/datasette/issues/259,409087501,MDEyOklzc3VlQ29tbWVudDQwOTA4NzUwMQ==,9599,simonw,2018-07-31T04:03:29Z,2018-07-31T04:03:29Z,OWNER,Parent ticket: #354,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322787470,inspect() should detect many-to-many relationships, https://github.com/simonw/datasette/issues/355#issuecomment-409087871,https://api.github.com/repos/simonw/datasette/issues/355,409087871,MDEyOklzc3VlQ29tbWVudDQwOTA4Nzg3MQ==,9599,simonw,2018-07-31T04:06:22Z,2018-07-31T04:06:22Z,OWNER,"I started playing with this in the `m2m` branch - work so far: https://github.com/simonw/datasette/compare/295d005ca48747faf046ed30c3c61e7563c61ed2...af4ce463e7518f9d7828b846efd5b528a1905eca Here's a demo: https://datasette-m2m-work-in-progress.now.sh/russian-ads-e8e09e2/ads?_m2m_ad_targets__target_id=ec3ac&_m2m_ad_targets__target_id=e128e","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",346027040,Table view should support filtering via many-to-many relationships, https://github.com/simonw/datasette/issues/356#issuecomment-409088967,https://api.github.com/repos/simonw/datasette/issues/356,409088967,MDEyOklzc3VlQ29tbWVudDQwOTA4ODk2Nw==,9599,simonw,2018-07-31T04:14:44Z,2018-07-31T04:14:44Z,OWNER,"Here's the query I'm playing with for facet counts: https://datasette-m2m-work-in-progress.now.sh/russian-ads-e8e09e2?sql=select+target_id%2C+count%28*%29+as+n+from+ad_targets%0D%0Awhere%0D%0A++target_id+not+in+%28%22ec3ac%22%2C+%22e128e%22%29%0D%0A++and+ad_id+in+%28select+ad_id+from+ad_targets+where+target_id+%3D+%22ec3ac%22%29%0D%0A++and+ad_id+in+%28select+ad_id+from+ad_targets+where+target_id+%3D+%22e128e%22%29%0D%0Agroup+by+target_id+order+by+n+desc%3B ``` select target_id, count(*) as n from ad_targets where target_id not in (""ec3ac"", ""e128e"") and ad_id in (select ad_id from ad_targets where target_id 
= ""ec3ac"") and ad_id in (select ad_id from ad_targets where target_id = ""e128e"") group by target_id order by n desc; ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",346028655,Ability to display facet counts for many-to-many relationships, https://github.com/simonw/datasette/issues/352#issuecomment-409715112,https://api.github.com/repos/simonw/datasette/issues/352,409715112,MDEyOklzc3VlQ29tbWVudDQwOTcxNTExMg==,9599,simonw,2018-08-01T20:41:04Z,2018-08-01T20:41:04Z,OWNER,The hook is currently only used on the custom SQL results page - it needs to run on table/view pages as well.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",345821500,render_cell(value) plugin hook, https://github.com/simonw/datasette/issues/352#issuecomment-410485995,https://api.github.com/repos/simonw/datasette/issues/352,410485995,MDEyOklzc3VlQ29tbWVudDQxMDQ4NTk5NQ==,9599,simonw,2018-08-05T00:16:21Z,2018-08-05T00:16:21Z,OWNER,"First plugin using this hook: https://github.com/simonw/datasette-json-html Hook documentation: http://datasette.readthedocs.io/en/latest/plugins.html#render-cell-value","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",345821500,render_cell(value) plugin hook, https://github.com/simonw/datasette/issues/352#issuecomment-410580202,https://api.github.com/repos/simonw/datasette/issues/352,410580202,MDEyOklzc3VlQ29tbWVudDQxMDU4MDIwMg==,9599,simonw,2018-08-06T03:39:40Z,2018-08-06T03:39:40Z,OWNER,I used `datasette-json-html` to build this: https://russian-ira-facebook-ads-datasette-whmbonekoj.now.sh/russian-ads-919cbfd/display_ads,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",345821500,render_cell(value) plugin hook, https://github.com/simonw/datasette/issues/357#issuecomment-410818501,https://api.github.com/repos/simonw/datasette/issues/357,410818501,MDEyOklzc3VlQ29tbWVudDQxMDgxODUwMQ==,9599,simonw,2018-08-06T19:04:54Z,2018-08-06T19:04:54Z,OWNER,Another potential use-case for this hook: loading metadata via a URL,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",348043884,Plugin hook for loading metadata.json, https://github.com/simonw/datasette/issues/174#issuecomment-412290986,https://api.github.com/repos/simonw/datasette/issues/174,412290986,MDEyOklzc3VlQ29tbWVudDQxMjI5MDk4Ng==,9599,simonw,2018-08-11T17:46:51Z,2018-08-11T17:46:51Z,OWNER,This was fixed in https://github.com/simonw/datasette/commit/89d9fbb91bfc0dd9091b34dbf3cf540ab849cc44,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",281197863,License/Source in footer should inherit from top level, https://github.com/simonw/datasette/issues/188#issuecomment-412291327,https://api.github.com/repos/simonw/datasette/issues/188,412291327,MDEyOklzc3VlQ29tbWVudDQxMjI5MTMyNw==,9599,simonw,2018-08-11T17:53:17Z,2018-08-11T17:53:17Z,OWNER,"Potential problem: the existing `metadata.json` format looks like this: ``` { ""title"": ""Custom title for your index page"", ""description"": ""Some description text can go here"", ""license"": ""ODbL"", ""license_url"": ""https://opendatacommons.org/licenses/odbl/"", ""databases"": { ""database1"": { ""source"": 
""Alternative source"", ""source_url"": ""http://example.com/"", ""tables"": { ""example_table"": { ""description_html"": ""Custom table description"", ""license"": ""CC BY 3.0 US"", ""license_url"": ""https://creativecommons.org/licenses/by/3.0/us/"" } } } } } ``` This doesn't make sense for metadata that is bundled with a specific database - there's no point in having the `databases` key, instead the content of `database1` in the above example should be at the top level. This also means that if you rename the `*.db` file you won't have to edit its metadata at the same time. Calling such an embedded file `metadata.json` when the shape is different could be confusing. Maybe call it `database-metadata.json` instead.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309047460,Ability to bundle metadata and templates inside the SQLite file, https://github.com/simonw/datasette/issues/231#issuecomment-412291395,https://api.github.com/repos/simonw/datasette/issues/231,412291395,MDEyOklzc3VlQ29tbWVudDQxMjI5MTM5NQ==,9599,simonw,2018-08-11T17:54:41Z,2018-08-11T17:54:41Z,OWNER,"I'm going to separate the issue of enabling and disabling plugins from the existence of the `plugins` key. The format will simply be: ``` { ""plugins"": { ""name-of-plugin"": { ... any structures you like go here, defined by the plugin ... } } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316323336,metadata.json support for plugin configuration options, https://github.com/simonw/datasette/issues/238#issuecomment-412291437,https://api.github.com/repos/simonw/datasette/issues/238,412291437,MDEyOklzc3VlQ29tbWVudDQxMjI5MTQzNw==,9599,simonw,2018-08-11T17:55:26Z,2018-08-11T18:02:48Z,OWNER,"On further thought, I'd much rather implement this using some kind of metadata plugin hook - see #357","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317714268,External metadata.json, https://github.com/simonw/datasette/issues/185#issuecomment-412299013,https://api.github.com/repos/simonw/datasette/issues/185,412299013,MDEyOklzc3VlQ29tbWVudDQxMjI5OTAxMw==,9599,simonw,2018-08-11T20:14:54Z,2018-08-11T20:14:54Z,OWNER,"I've been worrying about how this one relates to #260 - I'd like to validate metadata (to help protect against people e.g. misspelling `license_url` and then being confused when their license isn't displayed properly), but this issue requests the ability to add arbitrary additional keys to the metadata structure. I think the solution is to introduce a metadata key called `extra_metadata_keys` which allows you to specifically list the extra keys that you want to enable. 
Something like this: ``` { ""title"": ""My title"", ""source"": ""Source"", ""source_url"": ""https://www.example.com/"", ""release_date"": ""2018-04-01"", ""extra_metadata_keys"": [""release_date""] } ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/359#issuecomment-412356537,https://api.github.com/repos/simonw/datasette/issues/359,412356537,MDEyOklzc3VlQ29tbWVudDQxMjM1NjUzNw==,9599,simonw,2018-08-12T17:01:21Z,2018-08-12T17:01:39Z,OWNER,"Example table: https://latest-code.datasette.io/code/definitions Here's a query that does facet counting against that column: https://latest-code.datasette.io/code-a26fa3c?sql=select+count%28*%29+as+n%2C+j.value+from+definitions+join+json_each%28params%29+j+group+by+j.value+order+by+n+desc%3B ``` select count(*) as n, j.value from definitions join json_each(params) j group by j.value order by n desc; ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",349827640,Faceted browse against a JSON list of tags, https://github.com/simonw/datasette/issues/359#issuecomment-412356746,https://api.github.com/repos/simonw/datasette/issues/359,412356746,MDEyOklzc3VlQ29tbWVudDQxMjM1Njc0Ng==,9599,simonw,2018-08-12T17:05:00Z,2018-08-12T17:05:00Z,OWNER,"And here's the query for pulling back every record tagged with a specific tag: https://latest-code.datasette.io/code-a26fa3c?sql=select+*+from+definitions+where+rowid+in+%28%0D%0A++select+definitions.rowid%0D%0A++from+definitions+join+json_each%28params%29+j%0D%0A++where+j.value+%3D+%3Atag%0D%0A%29&tag=filename ``` select * from definitions where rowid in ( select definitions.rowid from definitions join json_each(params) j where j.value = :tag ) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",349827640,Faceted browse against a JSON list of tags, https://github.com/simonw/datasette/issues/359#issuecomment-412357691,https://api.github.com/repos/simonw/datasette/issues/359,412357691,MDEyOklzc3VlQ29tbWVudDQxMjM1NzY5MQ==,9599,simonw,2018-08-12T17:17:29Z,2018-08-12T17:17:29Z,OWNER,"Note that there doesn't seem to be a way to use indexes (even [indexes on expressions](https://www.sqlite.org/expridx.html)) to speed these up, so this will only ever be effective on smaller data sets, probably in the 10,000-100,000 range. 
Datasette is often used with smaller data sets so this is still worth pursuing.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",349827640,Faceted browse against a JSON list of tags, https://github.com/simonw/datasette/issues/185#issuecomment-412663658,https://api.github.com/repos/simonw/datasette/issues/185,412663658,MDEyOklzc3VlQ29tbWVudDQxMjY2MzY1OA==,222245,carlmjohnson,2018-08-13T21:04:11Z,2018-08-13T21:04:11Z,NONE,That seems good to me.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/360#issuecomment-413386332,https://api.github.com/repos/simonw/datasette/issues/360,413386332,MDEyOklzc3VlQ29tbWVudDQxMzM4NjMzMg==,9599,simonw,2018-08-16T00:51:00Z,2018-08-16T00:51:00Z,OWNER,Relevant: https://github.com/coleifer/pysqlite3/issues/2,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",351017129,Use pysqlite3 if available, https://github.com/simonw/datasette/issues/360#issuecomment-413387424,https://api.github.com/repos/simonw/datasette/issues/360,413387424,MDEyOklzc3VlQ29tbWVudDQxMzM4NzQyNA==,9599,simonw,2018-08-16T00:57:25Z,2018-08-16T00:57:25Z,OWNER,"I deployed a working demo of this here: https://pysqlite3-datasette.now.sh I used this command to deploy it: datasette publish now \ fixtures.db fivethirtyeight.db \ --branch=pysqlite3 \ --install=https://github.com/karlb/pysqlite3/archive/master.zip \ -n pysqlite3-datasette https://pysqlite3-datasette.now.sh/-/versions confirms version of SQLite is `3.25.0`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",351017129,Use pysqlite3 if available, https://github.com/simonw/datasette/issues/360#issuecomment-413396812,https://api.github.com/repos/simonw/datasette/issues/360,413396812,MDEyOklzc3VlQ29tbWVudDQxMzM5NjgxMg==,9599,simonw,2018-08-16T01:50:42Z,2018-08-16T01:50:42Z,OWNER,"Now that this has merged into master the command for deploying it can use `--branch=master` instead: datasette publish now \ fixtures.db fivethirtyeight.db \ --branch=master \ --install=https://github.com/karlb/pysqlite3/archive/master.zip \ -n pysqlite3-datasette ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",351017129,Use pysqlite3 if available, https://github.com/simonw/datasette/issues/267#issuecomment-414860009,https://api.github.com/repos/simonw/datasette/issues/267,414860009,MDEyOklzc3VlQ29tbWVudDQxNDg2MDAwOQ==,78156,annapowellsmith,2018-08-21T23:57:51Z,2018-08-21T23:57:51Z,NONE,"Looks to me like hashing, redirects and caching were documented as part of https://github.com/simonw/datasette/commit/788a542d3c739da5207db7d1fb91789603cdd336#diff-3021b0e065dce289c34c3b49b3952a07 - so perhaps this can be closed? 
:tada:","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323716411,"Documentation for URL hashing, redirects and cache policy", https://github.com/simonw/datasette/issues/350#issuecomment-416659043,https://api.github.com/repos/simonw/datasette/issues/350,416659043,MDEyOklzc3VlQ29tbWVudDQxNjY1OTA0Mw==,9599,simonw,2018-08-28T16:48:19Z,2018-08-28T16:48:19Z,OWNER,Closed in https://github.com/simonw/datasette/commit/0bd41d4cb0a42d7d2baf8b49675418d1482ae39b,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",344701755,Don't list default plugins on /-/plugins, https://github.com/simonw/datasette/issues/350#issuecomment-416667565,https://api.github.com/repos/simonw/datasette/issues/350,416667565,MDEyOklzc3VlQ29tbWVudDQxNjY2NzU2NQ==,9599,simonw,2018-08-28T17:13:50Z,2018-08-28T17:13:50Z,OWNER,https://b7257a2.datasette.io/-/plugins is now correctly returning an empty list.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",344701755,Don't list default plugins on /-/plugins, https://github.com/simonw/datasette/issues/362#issuecomment-416727898,https://api.github.com/repos/simonw/datasette/issues/362,416727898,MDEyOklzc3VlQ29tbWVudDQxNjcyNzg5OA==,9599,simonw,2018-08-28T20:24:00Z,2018-08-28T20:24:00Z,OWNER,"Are you talking about these filters here? ![2018-08-28 at 9 22 pm](https://user-images.githubusercontent.com/9599/44748784-8688cb00-ab08-11e8-8baf-ace2e04e181f.png) I haven't thought much about how those could be made more usable - right now they basically expose all available options, but customizing them for particular use-cases is certainly an interesting potential space. 
Could you sketch out a bit more about how your ideal interface here would work?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",352768017,Add option to include/exclude columns in search filters, https://github.com/simonw/datasette/pull/363#issuecomment-417684877,https://api.github.com/repos/simonw/datasette/issues/363,417684877,MDEyOklzc3VlQ29tbWVudDQxNzY4NDg3Nw==,436032,kevboh,2018-08-31T14:39:45Z,2018-08-31T14:39:45Z,NONE,"It looks like the check passed, not sure why it's showing as running in GH.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",355299310,Search all apps during heroku publish, https://github.com/simonw/datasette/issues/308#issuecomment-418106781,https://api.github.com/repos/simonw/datasette/issues/308,418106781,MDEyOklzc3VlQ29tbWVudDQxODEwNjc4MQ==,9599,simonw,2018-09-03T12:53:21Z,2018-09-03T12:53:21Z,OWNER,Now that I've split the heroku command out into a separate default plugin this is a much easier thing to add: https://github.com/simonw/datasette/blob/master/datasette/publish/heroku.py,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",330826972,"Support extra Heroku apps:create options - region, space, team", https://github.com/simonw/datasette/issues/272#issuecomment-418695115,https://api.github.com/repos/simonw/datasette/issues/272,418695115,MDEyOklzc3VlQ29tbWVudDQxODY5NTExNQ==,647359,tomchristie,2018-09-05T11:21:25Z,2018-09-05T11:21:25Z,NONE,"Some notes: * Starlette just got a bump to 0.3.0 - there's some renamings in there. It's got enough functionality now that you can treat it either as a framework or as a toolkit. Either way the component design is all just *here's an ASGI app* all the way through. * Uvicorn got a bump to 0.3.3 - Removed some cyclical references that were causing garbage collection to impact performance. Ought to be a decent speed bump. * Wrt. passing config - Either use a single envvar that points to a config, or use multiple envvars for the config. Uvicorn could get a flag to read a `.env` file, but I don't see ASGI itself having a specific interface there.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/pull/293#issuecomment-420295524,https://api.github.com/repos/simonw/datasette/issues/293,420295524,MDEyOklzc3VlQ29tbWVudDQyMDI5NTUyNA==,11912854,jsancho-gpl,2018-09-11T14:32:45Z,2018-09-11T14:32:45Z,NONE,I close this PR because it's better to use the new one #364 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326987229,Support for external database connectors, https://github.com/simonw/datasette/issues/329#issuecomment-422821483,https://api.github.com/repos/simonw/datasette/issues/329,422821483,MDEyOklzc3VlQ29tbWVudDQyMjgyMTQ4Mw==,418191,jaywgraves,2018-09-19T14:17:42Z,2018-09-19T14:17:42Z,CONTRIBUTOR,"I'm using the docker image (0.23.2) and notice some differences/bugs between the docs and the published version with canned queries. (submitted a tiny doc fix also) I was able to build the docker container locally using `master` and I'm using that for now. Would it be possible to manually push 0.24 to DockerHub until the TravisCI stuff is fixed? 
I would like to run this in our Kubernetes cluster but don't want to publish a version in our internal registry if I don't have to. Thanks!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336465018,Travis should push tagged images to Docker Hub for each release, https://github.com/simonw/datasette/pull/365#issuecomment-422885014,https://api.github.com/repos/simonw/datasette/issues/365,422885014,MDEyOklzc3VlQ29tbWVudDQyMjg4NTAxNA==,9599,simonw,2018-09-19T17:15:16Z,2018-09-19T17:15:16Z,OWNER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",361764460,fix small doc typo, https://github.com/simonw/datasette/issues/329#issuecomment-422903031,https://api.github.com/repos/simonw/datasette/issues/329,422903031,MDEyOklzc3VlQ29tbWVudDQyMjkwMzAzMQ==,9599,simonw,2018-09-19T18:07:09Z,2018-09-19T18:07:09Z,OWNER,"The new 0.25 release has been successfully pushed to Docker Hub! https://hub.docker.com/r/datasetteproject/datasette/tags/ One catch: it looks like it didn't update the ""latest"" tag to point at it. Looking into that now.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336465018,Travis should push tagged images to Docker Hub for each release, https://github.com/simonw/datasette/issues/329#issuecomment-422908130,https://api.github.com/repos/simonw/datasette/issues/329,422908130,MDEyOklzc3VlQ29tbWVudDQyMjkwODEzMA==,9599,simonw,2018-09-19T18:23:02Z,2018-09-19T18:23:02Z,OWNER,"I fixed that by running the following on my laptop: $ docker pull datasetteproject/datasette:0.25 $ docker tag datasetteproject/datasette:0.25 datasetteproject/datasette:latest $ docker push datasetteproject/datasette The `latest` tag now points to the most recent release.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336465018,Travis should push tagged images to Docker Hub for each release, https://github.com/simonw/datasette/issues/329#issuecomment-422915450,https://api.github.com/repos/simonw/datasette/issues/329,422915450,MDEyOklzc3VlQ29tbWVudDQyMjkxNTQ1MA==,418191,jaywgraves,2018-09-19T18:45:02Z,2018-09-20T10:50:50Z,CONTRIBUTOR,"That works for me. Was able to pull the public image and no errors on my canned query. (~although a small rendering bug. I'll create an issue and if I have time today, a PR to fix~ this turned out to be my error.) Thanks for the quick response!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336465018,Travis should push tagged images to Docker Hub for each release, https://github.com/simonw/datasette/issues/292#issuecomment-423543060,https://api.github.com/repos/simonw/datasette/issues/292,423543060,MDEyOklzc3VlQ29tbWVudDQyMzU0MzA2MA==,9599,simonw,2018-09-21T14:06:31Z,2018-09-21T14:09:06Z,OWNER,"I keep on finding new reasons that I want this. The latest is that I'm playing with the more advanced features of FTS5 - in particular the highlight() function and the ability to sort by rank. The problem is... in order to do this, I need to properly join against the `_fts` table. 
Here's an example query: select highlight(events_fts, 0, '', ''), events_fts.rank, events.* from events join events_fts on events.rowid = events_fts.rowid where events_fts match :search order by rank Note that this is a different query from the usual FTS one (which does `where rowid in (select rowid from events_fts...)`) because I need the rank column somewhere I can sort against. I'd like to be able to use this on the table view page so I can get faceting etc for free, but this is a completely different query from the default. Maybe I need a way to customize the entire query? That feels weird though - why am I not using a view in that case? Answer: because views can't accept `:search` style parameters. I could use a canned query, but canned queries don't get faceting etc.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/328#issuecomment-427261369,https://api.github.com/repos/simonw/datasette/issues/328,427261369,MDEyOklzc3VlQ29tbWVudDQyNzI2MTM2OQ==,13698964,chmaynard,2018-10-05T06:37:06Z,2018-10-05T06:37:06Z,NONE,"``` ~ $ docker pull datasetteproject/datasette ~ $ docker run -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db Usage: datasette -p [OPTIONS] [FILES]... Error: Invalid value for ""files"": Path ""/mnt/fixtures.db"" does not exist. ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336464733,"Installation instructions, including how to use the docker image", https://github.com/simonw/datasette/issues/187#issuecomment-427943710,https://api.github.com/repos/simonw/datasette/issues/187,427943710,MDEyOklzc3VlQ29tbWVudDQyNzk0MzcxMA==,1583271,progpow,2018-10-08T18:58:05Z,2018-10-08T18:58:05Z,NONE,"I have same error: ``` Collecting uvloop Using cached https://files.pythonhosted.org/packages/5c/37/6daa39aac42b2deda6ee77f408bec0419b600e27b89b374b0d440af32b10/uvloop-0.11.2.tar.gz Complete output from command python setup.py egg_info: Traceback (most recent call last): File """", line 1, in File ""C:\Users\sageev\AppData\Local\Temp\pip-install-bq64l8jy\uvloop\setup.py"", line 15, in raise RuntimeError('uvloop does not support Windows at the moment') RuntimeError: uvloop does not support Windows at the moment ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/datasette/issues/366#issuecomment-429737929,https://api.github.com/repos/simonw/datasette/issues/366,429737929,MDEyOklzc3VlQ29tbWVudDQyOTczNzkyOQ==,416374,gfrmin,2018-10-15T07:32:57Z,2018-10-15T07:32:57Z,CONTRIBUTOR,"Very hacky solution is to write now.json file forcing the usage of v1 of Zeit cloud, see https://github.com/slygent/datasette/commit/3ab824793ec6534b6dd87078aa46b11c4fa78ea3 This does work, at least.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",369716228,Default built image size over Zeit Now 100MiB limit, 
https://github.com/simonw/datasette/issues/176#issuecomment-431867885,https://api.github.com/repos/simonw/datasette/issues/176,431867885,MDEyOklzc3VlQ29tbWVudDQzMTg2Nzg4NQ==,634572,eads,2018-10-22T15:24:57Z,2018-10-22T15:24:57Z,NONE,"I'd like this as well. It would let me access Datasette-driven projects from GatsbyJS the same way I can access Postgres DBs via Hasura. While I don't see SQLite replacing Postgres for the 50m row datasets I sometimes have to work with, there's a whole class of smaller datasets that are great with Datasette but currently would find another option.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/366#issuecomment-433680598,https://api.github.com/repos/simonw/datasette/issues/366,433680598,MDEyOklzc3VlQ29tbWVudDQzMzY4MDU5OA==,9599,simonw,2018-10-28T06:38:43Z,2018-10-28T06:38:43Z,OWNER,I've just started running into this as well. Looks like I'll have to anchor to v1 for the moment - I'm hoping the discussion on https://github.com/zeit/now-cli/issues/1523 encourages an increase in this limit policy :/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",369716228,Default built image size over Zeit Now 100MiB limit, https://github.com/simonw/datasette/issues/371#issuecomment-435767775,https://api.github.com/repos/simonw/datasette/issues/371,435767775,MDEyOklzc3VlQ29tbWVudDQzNTc2Nzc3NQ==,9599,simonw,2018-11-05T06:27:33Z,2018-11-05T06:27:33Z,OWNER,"This would be fantastic - that tutorial looks like many of the details needed for this. Do you know if Digital Ocean have the ability to provision URLs for a droplet without you needing to buy your own domain name? Heroku have https://example.herokuapp.com/ and Zeit have https://blah.now.sh/ - does Digital Ocean have an equivalent? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377156339,datasette publish digitalocean plugin, https://github.com/simonw/datasette/issues/369#issuecomment-435767827,https://api.github.com/repos/simonw/datasette/issues/369,435767827,MDEyOklzc3VlQ29tbWVudDQzNTc2NzgyNw==,9599,simonw,2018-11-05T06:27:55Z,2018-11-05T06:28:48Z,OWNER,"This is a good idea. 
Basically a version of this bug but on the custom SQL query page: ![2018-11-04 at 10 28 pm](https://user-images.githubusercontent.com/9599/47981499-fd9a8c80-e080-11e8-9c59-00e626d3aa4c.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",374953006,Interface should show same JSON shape options for custom SQL queries, https://github.com/simonw/datasette/issues/369#issuecomment-435768450,https://api.github.com/repos/simonw/datasette/issues/369,435768450,MDEyOklzc3VlQ29tbWVudDQzNTc2ODQ1MA==,416374,gfrmin,2018-11-05T06:31:59Z,2018-11-05T06:31:59Z,CONTRIBUTOR,"That would be ideal, but you know better than me whether the CSV streaming trick works for custom SQL queries.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",374953006,Interface should show same JSON shape options for custom SQL queries, https://github.com/simonw/datasette/issues/329#issuecomment-435772031,https://api.github.com/repos/simonw/datasette/issues/329,435772031,MDEyOklzc3VlQ29tbWVudDQzNTc3MjAzMQ==,9599,simonw,2018-11-05T06:53:28Z,2018-11-05T06:54:10Z,OWNER,"This works now! The `0.25.1` release was the first release which successfully pushed to Docker Hub: https://hub.docker.com/r/datasetteproject/datasette/tags/ ![2018-11-04 at 10 53 pm](https://user-images.githubusercontent.com/9599/47982395-70593700-e084-11e8-8870-9100677c2bde.png) Here's the log from the successful Travis release job: https://travis-ci.org/simonw/datasette/jobs/450714602 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336465018,Travis should push tagged images to Docker Hub for each release, https://github.com/simonw/datasette/issues/371#issuecomment-435862009,https://api.github.com/repos/simonw/datasette/issues/371,435862009,MDEyOklzc3VlQ29tbWVudDQzNTg2MjAwOQ==,82988,psychemedia,2018-11-05T12:48:35Z,2018-11-05T12:48:35Z,CONTRIBUTOR,I think you need to register a domain name you own separately in order to get a non-IP address address? 
https://www.digitalocean.com/docs/networking/dns/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377156339,datasette publish digitalocean plugin, https://github.com/simonw/datasette/issues/370#issuecomment-435974786,https://api.github.com/repos/simonw/datasette/issues/370,435974786,MDEyOklzc3VlQ29tbWVudDQzNTk3NDc4Ng==,9599,simonw,2018-11-05T18:06:56Z,2018-11-05T18:06:56Z,OWNER,"I've been thinking a bit about ways of using Jupyter Notebook more effectively with Datasette (thinks like a `publish_dataframes(df1, df2, df3)` function which publishes some Pandas dataframes and returns you a URL to a new hosted Datasette instance) but you're right, Jupyter Lab is potentially a much more interesting fit.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377155320,Integration with JupyterLab, https://github.com/simonw/datasette/issues/374#issuecomment-435976262,https://api.github.com/repos/simonw/datasette/issues/374,435976262,MDEyOklzc3VlQ29tbWVudDQzNTk3NjI2Mg==,9599,simonw,2018-11-05T18:11:10Z,2018-11-05T18:11:10Z,OWNER,"I think there is a useful way forward here though: the image size may be limited to 100MB, but once the instance launches it gets access to a filesystem with a lot more space than that (possibly as much as 15GB given my initial poking around). So... one potential solution here is to teach Datasette to launch from a smaller image and then download a larger SQLite file from a known URL as part of its initial startup. Combined with the ability to get Now to always run at least one copy of an instance this could allow Datasette to host much larger SQLite databases on that platform while playing nicely with the Zeit v2 platform. See also https://github.com/zeit/now-cli/issues/1523","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377518499,Get Datasette working with Zeit Now v2's 100MB image size limit, https://github.com/simonw/datasette/issues/370#issuecomment-436037692,https://api.github.com/repos/simonw/datasette/issues/370,436037692,MDEyOklzc3VlQ29tbWVudDQzNjAzNzY5Mg==,82988,psychemedia,2018-11-05T21:15:47Z,2018-11-05T21:18:37Z,CONTRIBUTOR,"In terms of integration with `pandas`, I was pondering two different ways `datasette`/`csvs_to_sqlite` integration may work: - like [`pandasql`](https://github.com/yhat/pandasql), to provide a SQL query layer either by a direct connection to the sqlite db or via `datasette` API; - as an improvement of `pandas.to_sql()`, which is a bit ropey (e.g. `pandas.to_sql_from_csvs()`, routing the dataframe to sqlite via `csvs_tosqlite` rather than the dodgy mapping that `pandas` supports). The `pandas.publish_*` idea could be quite interesting though... 
Would it be useful/fruitful to think about `publish_` as a complement to [`pandas.to_`](https://pandas.pydata.org/pandas-docs/stable/api.html#id12)?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377155320,Integration with JupyterLab, https://github.com/simonw/datasette/issues/370#issuecomment-436042445,https://api.github.com/repos/simonw/datasette/issues/370,436042445,MDEyOklzc3VlQ29tbWVudDQzNjA0MjQ0NQ==,82988,psychemedia,2018-11-05T21:30:42Z,2018-11-05T21:31:48Z,CONTRIBUTOR,"Another route would be something like creating a `datasette` IPython magic for notebooks to take a dataframe and easily render it as a `datasette`. You'd need to run the app in the background rather than block execution in the notebook. Related to that, or to publishing a dataframe in notebook cell for use in other cells in a non-blocking way, there may be cribs in something like https://github.com/micahscopes/nbmultitask .","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377155320,Integration with JupyterLab, https://github.com/simonw/datasette/issues/227#issuecomment-439194286,https://api.github.com/repos/simonw/datasette/issues/227,439194286,MDEyOklzc3VlQ29tbWVudDQzOTE5NDI4Ng==,222245,carlmjohnson,2018-11-15T21:20:37Z,2018-11-15T21:20:37Z,NONE,I'm diving back into https://salaries.news.baltimoresun.com and what I really want is the ability to inject the request into my context.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/120#issuecomment-439421164,https://api.github.com/repos/simonw/datasette/issues/120,439421164,MDEyOklzc3VlQ29tbWVudDQzOTQyMTE2NA==,36796532,ad-si,2018-11-16T15:05:18Z,2018-11-16T15:05:18Z,NONE,This would be an awesome feature ❤️ ,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275087397,Plugin that adds an authentication layer of some sort, https://github.com/simonw/datasette/issues/374#issuecomment-439762759,https://api.github.com/repos/simonw/datasette/issues/374,439762759,MDEyOklzc3VlQ29tbWVudDQzOTc2Mjc1OQ==,9599,simonw,2018-11-19T03:41:36Z,2018-11-19T03:41:36Z,OWNER,"It turned out Zeit didn't end up shipping the new 100MB-limit Docker-based Zeit 2.0 after all - they ended up going in a completely different direction, towards lambdas instead (which don't really fit the Datasette model): https://zeit.co/blog/now-2 But... as far as I can tell they have introduced the 100MB image size for all free Zeit accounts ever against their 1.0 platform. So we still need to solve this, or free Zeit users won't be able to use `datasette publish now` even while 1.0 is still available. I made some notes on this here: https://simonwillison.net/2018/Nov/19/smaller-python-docker-images/ I've got it working for the Datasette Publish webapp, but I still need to fix `datasette publish now` to create much smaller patterns. 
I know how to do this for regular datasette, but I haven't yet figured out an Alpine Linux pattern for spatialite extras: https://github.com/simonw/datasette/blob/5e3a432a0caa23837fa58134f69e2f82e4f632a6/datasette/utils.py#L287-L300","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377518499,Get Datasette working with Zeit Now v2's 100MB image size limit, https://github.com/simonw/datasette/issues/374#issuecomment-439763196,https://api.github.com/repos/simonw/datasette/issues/374,439763196,MDEyOklzc3VlQ29tbWVudDQzOTc2MzE5Ng==,9599,simonw,2018-11-19T03:45:13Z,2018-11-19T03:45:13Z,OWNER,This looks like it might be a recipe for spatialite Python on Alpine Linux: https://github.com/bentrm/geopython/blob/8e52062d9545f4b7c1f04a3516354a5a9155e31f/Dockerfile,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377518499,Get Datasette working with Zeit Now v2's 100MB image size limit, https://github.com/simonw/datasette/issues/374#issuecomment-439763268,https://api.github.com/repos/simonw/datasette/issues/374,439763268,MDEyOklzc3VlQ29tbWVudDQzOTc2MzI2OA==,9599,simonw,2018-11-19T03:45:44Z,2018-11-19T03:45:44Z,OWNER,Another example that might be useful: https://github.com/poc-flask/alpine/blob/8e9f48a2351e106347dab36d08cf21dee865993e/Dockerfile,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377518499,Get Datasette working with Zeit Now v2's 100MB image size limit, https://github.com/simonw/datasette/pull/389#issuecomment-440128762,https://api.github.com/repos/simonw/datasette/issues/389,440128762,MDEyOklzc3VlQ29tbWVudDQ0MDEyODc2Mg==,9599,simonw,2018-11-20T03:52:11Z,2018-11-20T03:52:11Z,OWNER,"The problem is Sanic. 
Here's the error I'm getting: ``` (venv) datasette $ pytest -x ============================================================= test session starts ============================================================== platform darwin -- Python 3.7.1, pytest-4.0.0, py-1.7.0, pluggy-0.8.0 rootdir: /Users/simonw/Dropbox/Development/datasette, inifile: collected 258 items tests/test_api.py ...................F =================================================================== FAILURES =================================================================== _______________________________________________________ test_table_with_slashes_in_name ________________________________________________________ app_client = def test_table_with_slashes_in_name(app_client): response = app_client.get('/fixtures/table%2Fwith%2Fslashes.csv?_shape=objects&_format=json') > assert response.status == 200 E AssertionError: assert 404 == 200 ``` That's because something about how Sanic handles escape characters in URLs changed between 0.7.0 and 0.8.3.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",382471625,Bump dependency versions, https://github.com/simonw/datasette/pull/390#issuecomment-447677798,https://api.github.com/repos/simonw/datasette/issues/390,447677798,MDEyOklzc3VlQ29tbWVudDQ0NzY3Nzc5OA==,9599,simonw,2018-12-16T21:32:45Z,2018-12-16T21:32:45Z,OWNER,Thanks for spotting this!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",386459810,tiny typo in customization docs, https://github.com/simonw/datasette/issues/374#issuecomment-448437245,https://api.github.com/repos/simonw/datasette/issues/374,448437245,MDEyOklzc3VlQ29tbWVudDQ0ODQzNzI0NQ==,9599,simonw,2018-12-19T01:35:59Z,2018-12-19T01:35:59Z,OWNER,"Closing this as Zeit went on a different direction with Now v2, so the 100MB limit is no longer a concern.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377518499,Get Datasette working with Zeit Now v2's 100MB image size limit, https://github.com/simonw/datasette/issues/393#issuecomment-450943172,https://api.github.com/repos/simonw/datasette/issues/393,450943172,MDEyOklzc3VlQ29tbWVudDQ1MDk0MzE3Mg==,9599,simonw,2019-01-02T18:28:43Z,2019-01-02T18:28:43Z,OWNER,"Definitely a bug, thanks.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",395236066,"CSV export in ""Advanced export"" pane doesn't respect query", https://github.com/simonw/datasette/issues/393#issuecomment-450943632,https://api.github.com/repos/simonw/datasette/issues/393,450943632,MDEyOklzc3VlQ29tbWVudDQ1MDk0MzYzMg==,9599,simonw,2019-01-02T18:30:20Z,2019-01-02T18:30:20Z,OWNER,"This is the code which is meant to add those options as hidden form fields: https://github.com/simonw/datasette/blob/fe5b6ea95a973534fe8a44907c0ea2449aae7602/datasette/templates/table.html#L150-L155 It's clearly not working. 
Need to fix this and add a corresponding unit test.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",395236066,"CSV export in ""Advanced export"" pane doesn't respect query", https://github.com/simonw/datasette/issues/393#issuecomment-450944166,https://api.github.com/repos/simonw/datasette/issues/393,450944166,MDEyOklzc3VlQ29tbWVudDQ1MDk0NDE2Ng==,9599,simonw,2019-01-02T18:32:12Z,2019-01-02T18:32:12Z,OWNER,"Here's the test that needs updating: https://github.com/simonw/datasette/blob/8b8ae55e7c8b9e1dceef53f55a330b596ca44d41/tests/test_html.py#L427-L435","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",395236066,"CSV export in ""Advanced export"" pane doesn't respect query", https://github.com/simonw/datasette/issues/391#issuecomment-450964512,https://api.github.com/repos/simonw/datasette/issues/391,450964512,MDEyOklzc3VlQ29tbWVudDQ1MDk2NDUxMg==,9599,simonw,2019-01-02T19:45:12Z,2019-01-02T19:45:12Z,OWNER,"Thanks, I've fixed this. I had to re-alias it against now: ``` ~ $ now alias google-trends-pnwhfwvgqf.now.sh https://google-trends.datasettes.com/ > Assigning alias google-trends.datasettes.com to deployment google-trends-pnwhfwvgqf.now.sh > Certificate for google-trends.datasettes.com (cert_uXaADIuNooHS3tZ) created [18s] > Success! google-trends.datasettes.com now points to google-trends-pnwhfwvgqf.now.sh [20s] ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",392610803,Google Trends example doesn’t work, https://github.com/simonw/datasette/issues/393#issuecomment-451046123,https://api.github.com/repos/simonw/datasette/issues/393,451046123,MDEyOklzc3VlQ29tbWVudDQ1MTA0NjEyMw==,9599,simonw,2019-01-03T03:05:07Z,2019-01-03T03:05:07Z,OWNER,The fix was released as part of Datasette 0.26 - you can see the fix working here: https://v0-26.datasette.io/fixtures-dd88475/facetable?_facet=planet_int&planet_int=1#export,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",395236066,"CSV export in ""Advanced export"" pane doesn't respect query", https://github.com/simonw/datasette/issues/393#issuecomment-451047426,https://api.github.com/repos/simonw/datasette/issues/393,451047426,MDEyOklzc3VlQ29tbWVudDQ1MTA0NzQyNg==,9599,simonw,2019-01-03T03:19:04Z,2019-01-03T03:19:04Z,OWNER,https://fivethirtyeight.datasettes.com/-/versions is now running 0.26 - so your initial bug demo is now fixed: https://fivethirtyeight.datasettes.com/fivethirtyeight-c300360/classic-rock%2Fclassic-rock-song-list?Release+Year__exact=1989#export,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",395236066,"CSV export in ""Advanced export"" pane doesn't respect query", https://github.com/simonw/datasette/issues/393#issuecomment-451415063,https://api.github.com/repos/simonw/datasette/issues/393,451415063,MDEyOklzc3VlQ29tbWVudDQ1MTQxNTA2Mw==,1727065,ltrgoddard,2019-01-04T11:04:08Z,2019-01-04T11:04:08Z,NONE,Awesome - will get myself up and running on 0.26,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",395236066,"CSV export in ""Advanced export"" pane doesn't respect query", 
https://github.com/simonw/datasette/issues/394#issuecomment-451704724,https://api.github.com/repos/simonw/datasette/issues/394,451704724,MDEyOklzc3VlQ29tbWVudDQ1MTcwNDcyNA==,9599,simonw,2019-01-06T00:32:23Z,2019-01-06T00:33:44Z,OWNER,"I found a really nice pattern for writing the unit tests for this (though it would look even nicer with a solution to #395) ```python @pytest.mark.parametrize(""prefix"", [""/prefix/"", ""https://example.com/""]) @pytest.mark.parametrize(""path"", [ ""/"", ""/fixtures"", ""/fixtures/compound_three_primary_keys"", ""/fixtures/compound_three_primary_keys/a,a,a"", ""/fixtures/paginated_view"", ]) def test_url_prefix_config(prefix, path): for client in make_app_client(config={ ""url_prefix"": prefix, }): response = client.get(path) soup = Soup(response.body, ""html.parser"") for a in soup.findAll(""a""): href = a[""href""] if href not in { ""https://github.com/simonw/datasette"", ""https://github.com/simonw/datasette/blob/master/LICENSE"", ""https://github.com/simonw/datasette/blob/master/tests/fixtures.py"", }: assert href.startswith(prefix), (href, a.parent) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",396212021,base_url configuration setting, https://github.com/simonw/datasette/issues/397#issuecomment-453251589,https://api.github.com/repos/simonw/datasette/issues/397,453251589,MDEyOklzc3VlQ29tbWVudDQ1MzI1MTU4OQ==,9599,simonw,2019-01-10T20:59:42Z,2019-01-10T20:59:42Z,OWNER,"What version of SQLite are you seeing in Datasette? You can tell by hitting http://localhost:8001/-/versions - e.g. here: https://latest.datasette.io/-/versions My best guess is that your Python SQLite module is running an older version that doesn't support window functions. One way you can fix that is with the `pysqlite3` module - try running this in your virtual environment: pip install git+git://github.com/karlb/pysqlite3 That's using a fork of the official module that embeds a full recent SQLite. See this issue thread for more details: https://github.com/coleifer/pysqlite3/issues/2","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",397129564,Update official datasetteproject/datasette Docker container to SQLite 3.26.0, https://github.com/simonw/datasette/issues/397#issuecomment-453252024,https://api.github.com/repos/simonw/datasette/issues/397,453252024,MDEyOklzc3VlQ29tbWVudDQ1MzI1MjAyNA==,9599,simonw,2019-01-10T21:00:57Z,2019-01-10T21:00:57Z,OWNER,"Oh I just saw you're using the official Datasette docker package - yeah, that's not bundled with a recent SQLite at the moment. 
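As a quick sanity check, this is roughly what the SQLite entry in `/-/versions` reflects - the same numbers are available from a Python prompt inside the container:

```python
import sqlite3

print(sqlite3.sqlite_version)  # version of the SQLite library this Python is linked against
print(sqlite3.version)         # version of the sqlite3 module itself
```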
We should update that: https://github.com/simonw/datasette/blob/5b026115126bedbb66457767e169139146d1c9fd/Dockerfile#L9-L11","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",397129564,Update official datasetteproject/datasette Docker container to SQLite 3.26.0, https://github.com/simonw/datasette/issues/271#issuecomment-453262703,https://api.github.com/repos/simonw/datasette/issues/271,453262703,MDEyOklzc3VlQ29tbWVudDQ1MzI2MjcwMw==,9599,simonw,2019-01-10T21:35:18Z,2019-01-10T21:35:18Z,OWNER,It turns out this was much easier to support than I expected: https://github.com/simonw/datasette/commit/eac08f0dfc61a99e8887442fc247656d419c76f8,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324162476,Mechanism for automatically picking up changes when on-disk .db file changes, https://github.com/simonw/datasette/issues/396#issuecomment-453324601,https://api.github.com/repos/simonw/datasette/issues/396,453324601,MDEyOklzc3VlQ29tbWVudDQ1MzMyNDYwMQ==,9599,simonw,2019-01-11T00:55:21Z,2019-01-11T00:55:21Z,OWNER,Demo: https://latest.datasette.io/-/versions,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",397098882,Add pragma compile_options output to /-/versions, https://github.com/simonw/datasette/issues/397#issuecomment-453330680,https://api.github.com/repos/simonw/datasette/issues/397,453330680,MDEyOklzc3VlQ29tbWVudDQ1MzMzMDY4MA==,9599,simonw,2019-01-11T01:17:11Z,2019-01-11T01:25:33Z,OWNER,"If you pull [the latest image](https://hub.docker.com/r/datasetteproject/datasette) you should get the right SQLite version now: docker pull datasetteproject/datasette docker run -p 8001:8001 \ datasetteproject/datasette \ datasette -p 8001 -h 0.0.0.0 http://0.0.0.0:8001/-/versions now gives me: ``` ""version"": ""3.26.0"" ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",397129564,Update official datasetteproject/datasette Docker container to SQLite 3.26.0, https://github.com/simonw/datasette/issues/400#issuecomment-453795040,https://api.github.com/repos/simonw/datasette/issues/400,453795040,MDEyOklzc3VlQ29tbWVudDQ1Mzc5NTA0MA==,9599,simonw,2019-01-13T01:46:30Z,2019-01-13T01:46:30Z,OWNER,I'm really excited about this - it looks like it could be a great plugin.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",398559195,datasette publish cloudrun plugin, https://github.com/simonw/datasette/issues/399#issuecomment-453874429,https://api.github.com/repos/simonw/datasette/issues/399,453874429,MDEyOklzc3VlQ29tbWVudDQ1Mzg3NDQyOQ==,9599,simonw,2019-01-13T23:09:09Z,2019-01-13T23:09:09Z,OWNER,"It looks like there are two reasons for this: - The `.git` directory was listed in `.dockerignore` so it wasn't being copied into the build process - The docker build stage wasn't installing the `git` executable, so it couldn't read the current version ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",398089089,/-/versions for official Docker image returns wrong Datasette version, 
https://github.com/simonw/datasette/issues/399#issuecomment-453876023,https://api.github.com/repos/simonw/datasette/issues/399,453876023,MDEyOklzc3VlQ29tbWVudDQ1Mzg3NjAyMw==,9599,simonw,2019-01-13T23:31:59Z,2019-01-13T23:31:59Z,OWNER,"``` docker pull datasetteproject/datasette docker run -p 8001:8001 datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 ``` http://0.0.0.0:8001/-/versions now returns: ``` { ""datasette"": { ""version"": ""0.26.2+0.ga418c8b.dirty"" }, ``` I'm not sure why it's showing `.dirty` there. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",398089089,/-/versions for official Docker image returns wrong Datasette version, https://github.com/simonw/datasette/issues/402#issuecomment-455223551,https://api.github.com/repos/simonw/datasette/issues/402,455223551,MDEyOklzc3VlQ29tbWVudDQ1NTIyMzU1MQ==,9599,simonw,2019-01-17T15:55:06Z,2019-01-17T15:55:06Z,OWNER,"It's new in SQLite 3.26.0 so I will need to figure out how to only apply it in that version or higher. https://sqlite.org/releaselog/3_26_0.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",400340905,Use SQLITE_DBCONFIG_DEFENSIVE plus other recommendations from SQLite security docs, https://github.com/simonw/datasette/issues/402#issuecomment-455224327,https://api.github.com/repos/simonw/datasette/issues/402,455224327,MDEyOklzc3VlQ29tbWVudDQ1NTIyNDMyNw==,9599,simonw,2019-01-17T15:56:57Z,2019-01-17T15:56:57Z,OWNER,https://sqlite.org/security.html has other recommendations for apps that accept SQLite files from untrusted sources that we should apply.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",400340905,Use SQLITE_DBCONFIG_DEFENSIVE plus other recommendations from SQLite security docs, https://github.com/simonw/datasette/issues/401#issuecomment-455230501,https://api.github.com/repos/simonw/datasette/issues/401,455230501,MDEyOklzc3VlQ29tbWVudDQ1NTIzMDUwMQ==,9599,simonw,2019-01-17T16:12:59Z,2019-01-17T16:12:59Z,OWNER,"Datasette-cluster-map doesn't use the new plugin configuration mechanism yet - it really should!
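Broadly, the mechanism is that plugin settings live in `metadata.json` under a `plugins` key, keyed by plugin name - something of this shape (the option names shown here are illustrative, not necessarily what datasette-cluster-map will end up using):

```json
{
    ""plugins"": {
        ""datasette-cluster-map"": {
            ""latitude_column"": ""lat"",
            ""longitude_column"": ""lng""
        }
    }
}
```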
The best example of how to use this mechanism right now is embedded in the Datasette unit tests: https://github.com/simonw/datasette/blob/b7257a21bf3dfa7353980f343c83a616da44daa7/tests/fixtures.py#L266-L270 https://github.com/simonw/datasette/blob/b7257a21bf3dfa7353980f343c83a616da44daa7/tests/test_plugins.py#L139-L145","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",400229984,How to pass configuration to plugins?, https://github.com/simonw/datasette/issues/402#issuecomment-455231411,https://api.github.com/repos/simonw/datasette/issues/402,455231411,MDEyOklzc3VlQ29tbWVudDQ1NTIzMTQxMQ==,9599,simonw,2019-01-17T16:15:21Z,2019-01-17T16:15:21Z,OWNER,Unfortunately it looks like there isn't currently a mechanism in the Python sqlite3 library for setting configuration flags like SQLITE_DBCONFIG_DEFENSIVE,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",400340905,Use SQLITE_DBCONFIG_DEFENSIVE plus other recommendations from SQLite security docs, https://github.com/simonw/datasette/issues/401#issuecomment-455445069,https://api.github.com/repos/simonw/datasette/issues/401,455445069,MDEyOklzc3VlQ29tbWVudDQ1NTQ0NTA2OQ==,9599,simonw,2019-01-18T06:49:07Z,2019-01-18T06:49:07Z,OWNER,I've released a new version of the datasette-cluster-map plugin to illustrate how plugin configuration can work: https://github.com/simonw/datasette-cluster-map/commit/fcc86c450e3df3e6b81c41f31df458923181527a,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",400229984,How to pass configuration to plugins?, https://github.com/simonw/datasette/issues/403#issuecomment-455445392,https://api.github.com/repos/simonw/datasette/issues/403,455445392,MDEyOklzc3VlQ29tbWVudDQ1NTQ0NTM5Mg==,9599,simonw,2019-01-18T06:51:14Z,2019-01-18T06:51:14Z,OWNER,"I talk about that a bit here: https://simonwillison.net/2018/Oct/4/datasette-ideas/#Bundling_the_data_with_the_code One of the key ideas behind Datasette is that if your data is read-only you can package it up with the rest of your code - so the normal limitations that apply with hosting services like now.sh no longer prevent you from including a database. The SQLite database is just another static binary file that gets packaged up as part of your deployment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",400511206,How does persistence work?, https://github.com/simonw/datasette/issues/401#issuecomment-455520561,https://api.github.com/repos/simonw/datasette/issues/401,455520561,MDEyOklzc3VlQ29tbWVudDQ1NTUyMDU2MQ==,1055831,dazzag24,2019-01-18T11:48:13Z,2019-01-18T11:48:13Z,NONE,"Thanks. I'll take a look at your changes. I must admit I was struggling to see how to pass info from the python code in __init__.py into the javascript document.addEventListener function.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",400229984,How to pass configuration to plugins?, https://github.com/simonw/datasette/issues/403#issuecomment-455752238,https://api.github.com/repos/simonw/datasette/issues/403,455752238,MDEyOklzc3VlQ29tbWVudDQ1NTc1MjIzOA==,1794527,ccorcos,2019-01-19T05:47:55Z,2019-01-19T05:47:55Z,NONE,Ah. That makes much more sense. 
Interesting approach.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",400511206,How does persistence work?, https://github.com/simonw/datasette/issues/405#issuecomment-457975075,https://api.github.com/repos/simonw/datasette/issues/405,457975075,MDEyOklzc3VlQ29tbWVudDQ1Nzk3NTA3NQ==,9599,simonw,2019-01-28T01:41:51Z,2019-01-28T01:41:51Z,OWNER,Implemented in https://github.com/simonw/datasette/commit/b5dd83981a7dbff571284d4d90a950c740245b05,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403617881,.json?_nl=on option for exporting newline-delimited JSON, https://github.com/simonw/datasette/issues/405#issuecomment-457975857,https://api.github.com/repos/simonw/datasette/issues/405,457975857,MDEyOklzc3VlQ29tbWVudDQ1Nzk3NTg1Nw==,9599,simonw,2019-01-28T01:48:37Z,2019-01-28T01:49:00Z,OWNER,"Demo: https://latest.datasette.io/fixtures-dd88475/facetable.json?_shape=array&_nl=on Also https://b5dd839.datasette.io/fixtures-dd88475/facetable.json?_shape=array&_nl=on","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403617881,.json?_nl=on option for exporting newline-delimited JSON, https://github.com/simonw/datasette/pull/404#issuecomment-457976864,https://api.github.com/repos/simonw/datasette/issues/404,457976864,MDEyOklzc3VlQ29tbWVudDQ1Nzk3Njg2NA==,9599,simonw,2019-01-28T01:56:55Z,2019-01-28T01:56:55Z,OWNER,"This failed in Python 3.5: ``` File ""/home/travis/virtualenv/python3.5.6/lib/python3.5/site-packages/jinja2/environment.py"", line 1020, in render_async raise NotImplementedError('This feature is not available for this ' NotImplementedError: This feature is not available for this version of Python ``` It looks like this is caused by this feature detection code: https://github.com/pallets/jinja/blob/a7ba0b637805c53d442e975e3864d3ea38d8743f/jinja2/utils.py#L633-L638","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403499298,Experiment: run Jinja in async mode, https://github.com/simonw/sqlite-utils/issues/6#issuecomment-457978729,https://api.github.com/repos/simonw/sqlite-utils/issues/6,457978729,MDEyOklzc3VlQ29tbWVudDQ1Nzk3ODcyOQ==,9599,simonw,2019-01-28T02:12:19Z,2019-01-28T02:12:19Z,OWNER,Will need to solve #7 for this to become truly efficient.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403624090,"""sqlite-utils insert"" should support newline-delimited JSON", https://github.com/simonw/sqlite-utils/issues/7#issuecomment-457980966,https://api.github.com/repos/simonw/sqlite-utils/issues/7,457980966,MDEyOklzc3VlQ29tbWVudDQ1Nzk4MDk2Ng==,9599,simonw,2019-01-28T02:29:32Z,2019-01-28T02:29:32Z,OWNER,"Remember to remove this TODO (and turn the `[]` into `()` on this line) as part of this task: https://github.com/simonw/sqlite-utils/blob/5309c5c7755818323a0f5353bad0de98ecc866be/sqlite_utils/cli.py#L78-L80","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403625674,.insert_all() should accept a generator and process it efficiently, 
https://github.com/simonw/sqlite-utils/issues/7#issuecomment-458011885,https://api.github.com/repos/simonw/sqlite-utils/issues/7,458011885,MDEyOklzc3VlQ29tbWVudDQ1ODAxMTg4NQ==,9599,simonw,2019-01-28T06:25:48Z,2019-01-28T06:25:48Z,OWNER,Re-opening for the second bit involving the cli tool.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403625674,.insert_all() should accept a generator and process it efficiently, https://github.com/simonw/sqlite-utils/issues/7#issuecomment-458011906,https://api.github.com/repos/simonw/sqlite-utils/issues/7,458011906,MDEyOklzc3VlQ29tbWVudDQ1ODAxMTkwNg==,9599,simonw,2019-01-28T06:25:55Z,2019-01-28T06:25:55Z,OWNER,"I tested this with a script called `churn_em_out.py` ``` i = 0 while True: i += 1 print( '{""id"": I, ""another"": ""row"", ""number"": J}'.replace(""I"", str(i)).replace( ""J"", str(i + 1) ) ) ``` Then I ran this: ``` python churn_em_out.py | \ sqlite-utils insert /tmp/getbig.db stats - \ --nl --batch-size=10000 ``` And used `watch 'ls -lah /tmp/getbig.db'` to watch the file growing as it had 10,000 lines of junk committed in batches. The memory used by the process never grew about around 50MB.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403625674,.insert_all() should accept a generator and process it efficiently, https://github.com/simonw/datasette/issues/160#issuecomment-459915995,https://api.github.com/repos/simonw/datasette/issues/160,459915995,MDEyOklzc3VlQ29tbWVudDQ1OTkxNTk5NQ==,82988,psychemedia,2019-02-02T00:43:16Z,2019-02-02T00:58:20Z,CONTRIBUTOR,"Do you have any simple working examples of how to use `--static`? Inspection of default served files suggests locations such as `http://example.com/-/static/app.css?0e06ee`. If `datasette` is being proxied to `http://example.com/foo/datasette`, what form should arguments to `--static` take so that static files are correctly referenced? 
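For the simple unproxied case I am assuming the `mount:directory` form, something like:

```
datasette demo.db --static assets:static-files/
```

which (if I have understood it correctly) serves the contents of `static-files/` at `/assets/` - but it is not obvious how that mount should be expressed once everything sits behind a proxy prefix.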
Use case is here: https://github.com/psychemedia/jupyterserverproxy-datasette-demo Trying to do a really simple `datasette` demo in MyBinder using jupyter-server-proxy.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/pull/407#issuecomment-460897973,https://api.github.com/repos/simonw/datasette/issues/407,460897973,MDEyOklzc3VlQ29tbWVudDQ2MDg5Nzk3Mw==,9599,simonw,2019-02-06T04:31:30Z,2019-02-06T04:31:30Z,OWNER,This helped me figure out what to do: https://github.com/heroku/heroku-builds/issues/36,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",407073223,Heroku --include-vcs-ignore, https://github.com/simonw/datasette/issues/398#issuecomment-460901857,https://api.github.com/repos/simonw/datasette/issues/398,460901857,MDEyOklzc3VlQ29tbWVudDQ2MDkwMTg1Nw==,9599,simonw,2019-02-06T05:01:19Z,2019-02-06T05:01:19Z,OWNER,"I'd really like to use the content-length header here, but Sanic hasn't yet fixed the bug I filed about it: https://github.com/huge-success/sanic/issues/1194","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",398011658,Ensure downloading a 100+MB SQLite database file works, https://github.com/simonw/datasette/issues/172#issuecomment-460902824,https://api.github.com/repos/simonw/datasette/issues/172,460902824,MDEyOklzc3VlQ29tbWVudDQ2MDkwMjgyNA==,9599,simonw,2019-02-06T05:09:05Z,2019-02-06T05:09:05Z,OWNER,"Demo: https://latest.datasette.io/fixtures-dd88475 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280896290,Show size of .db file next to download link, https://github.com/simonw/datasette/issues/187#issuecomment-463917744,https://api.github.com/repos/simonw/datasette/issues/187,463917744,MDEyOklzc3VlQ29tbWVudDQ2MzkxNzc0NA==,4190962,phoenixjun,2019-02-15T05:58:44Z,2019-02-15T05:58:44Z,NONE,is this supported or not? you can comment if it is not supported so that people like me can stop trying.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/sqlite-utils/issues/8#issuecomment-464341721,https://api.github.com/repos/simonw/sqlite-utils/issues/8,464341721,MDEyOklzc3VlQ29tbWVudDQ2NDM0MTcyMQ==,82988,psychemedia,2019-02-16T12:08:41Z,2019-02-16T12:08:41Z,NONE,We also get an error if a column name contains a `.`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403922644,Problems handling column names containing spaces or - , https://github.com/simonw/datasette/issues/187#issuecomment-466325528,https://api.github.com/repos/simonw/datasette/issues/187,466325528,MDEyOklzc3VlQ29tbWVudDQ2NjMyNTUyOA==,2892252,fkuhn,2019-02-22T09:03:50Z,2019-02-22T09:03:50Z,NONE,"I ran into the same issue when trying to install datasette on windows after successfully using it on linux. Unfortunately, there has not been any progress in implementing uvloop for windows - so I recommend not using it on Windows.
You can read about this issue here: [https://github.com/MagicStack/uvloop/issues/14](url)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/sqlite-utils/issues/8#issuecomment-466695500,https://api.github.com/repos/simonw/sqlite-utils/issues/8,466695500,MDEyOklzc3VlQ29tbWVudDQ2NjY5NTUwMA==,9599,simonw,2019-02-23T21:09:03Z,2019-02-23T21:09:03Z,OWNER,"Fixed in https://github.com/simonw/sqlite-utils/commit/228d595f7d10994f34e948888093c2cd290267c4 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403922644,Problems handling column names containing spaces or - , https://github.com/simonw/sqlite-utils/issues/11#issuecomment-466695672,https://api.github.com/repos/simonw/sqlite-utils/issues/11,466695672,MDEyOklzc3VlQ29tbWVudDQ2NjY5NTY3Mg==,9599,simonw,2019-02-23T21:10:23Z,2019-02-23T21:10:23Z,OWNER,"Rough sketch: ``` +try: + import numpy +except ImportError: + numpy = None + Column = namedtuple( ""Column"", (""cid"", ""name"", ""type"", ""notnull"", ""default_value"", ""is_pk"") ) @@ -70,6 +79,22 @@ class Database: datetime.time: ""TEXT"", None.__class__: ""TEXT"", } + # If numpy is available, add more types + if numpy: + col_type_mapping.update({ + numpy.int8: ""INTEGER"", + numpy.int16: ""INTEGER"", + numpy.int32: ""INTEGER"", + numpy.int64: ""INTEGER"", + numpy.uint8: ""INTEGER"", + numpy.uint16: ""INTEGER"", + numpy.uint32: ""INTEGER"", + numpy.uint64: ""INTEGER"", + numpy.float16: ""FLOAT"", + numpy.float32: ""FLOAT"", + numpy.float64: ""FLOAT"", + numpy.float128: ""FLOAT"", + }) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413740684,Detect numpy types when creating tables, https://github.com/simonw/sqlite-utils/issues/11#issuecomment-466695695,https://api.github.com/repos/simonw/sqlite-utils/issues/11,466695695,MDEyOklzc3VlQ29tbWVudDQ2NjY5NTY5NQ==,9599,simonw,2019-02-23T21:10:35Z,2019-02-23T21:10:35Z,OWNER,Need to test this both with and without `numpy` installed.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413740684,Detect numpy types when creating tables, https://github.com/simonw/sqlite-utils/issues/13#issuecomment-466732039,https://api.github.com/repos/simonw/sqlite-utils/issues/13,466732039,MDEyOklzc3VlQ29tbWVudDQ2NjczMjAzOQ==,9599,simonw,2019-02-24T04:07:57Z,2019-02-24T04:07:57Z,OWNER,"Example: http://api.nobelprize.org/v1/laureate.json This includes affiliations which look like this: ""affiliations"": [ { ""name"": ""Sorbonne University"", ""city"": ""Paris"", ""country"": ""France"" } ]","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413779210,Ability to automatically create IDs from content hash of row, https://github.com/simonw/sqlite-utils/issues/10#issuecomment-466794069,https://api.github.com/repos/simonw/sqlite-utils/issues/10,466794069,MDEyOklzc3VlQ29tbWVudDQ2Njc5NDA2OQ==,9599,simonw,2019-02-24T16:55:37Z,2019-02-24T16:55:37Z,OWNER,"This was fixed by https://github.com/simonw/sqlite-utils/commit/228d595f7d10994f34e948888093c2cd290267c4 - see also #8 ``` >>> db = sqlite_utils.Database("":memory:"") >>> dfX=pd.DataFrame({'order':range(3),'col2':range(3)}) >>> 
db[""test""].upsert_all(dfX.to_dict(orient='records')) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",411066700,Error in upsert if column named 'order', https://github.com/simonw/sqlite-utils/issues/14#issuecomment-466794369,https://api.github.com/repos/simonw/sqlite-utils/issues/14,466794369,MDEyOklzc3VlQ29tbWVudDQ2Njc5NDM2OQ==,9599,simonw,2019-02-24T16:59:11Z,2019-02-24T16:59:43Z,OWNER,"https://www.sqlite.org/lang_createindex.html ![image](https://user-images.githubusercontent.com/9599/53302378-72512c80-3812-11e9-8828-46a03d893879.png) May as well support ``--if-not-exists`` as well.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413842611,Utilities for adding indexes, https://github.com/simonw/sqlite-utils/issues/14#issuecomment-466800090,https://api.github.com/repos/simonw/sqlite-utils/issues/14,466800090,MDEyOklzc3VlQ29tbWVudDQ2NjgwMDA5MA==,9599,simonw,2019-02-24T18:01:10Z,2019-02-24T18:01:10Z,OWNER,"The `WHERE` clause can be used to create partial indexes: https://www.sqlite.org/partialindex.html I'm going to ignore it for the moment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413842611,Utilities for adding indexes, https://github.com/simonw/sqlite-utils/issues/14#issuecomment-466800210,https://api.github.com/repos/simonw/sqlite-utils/issues/14,466800210,MDEyOklzc3VlQ29tbWVudDQ2NjgwMDIxMA==,9599,simonw,2019-02-24T18:02:23Z,2019-02-24T18:02:23Z,OWNER,Likewise I'm going to ignore indexes on expressions (as opposed to just columns): https://www.sqlite.org/expridx.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413842611,Utilities for adding indexes, https://github.com/simonw/sqlite-utils/issues/2#issuecomment-466807308,https://api.github.com/repos/simonw/sqlite-utils/issues/2,466807308,MDEyOklzc3VlQ29tbWVudDQ2NjgwNzMwOA==,9599,simonw,2019-02-24T19:18:19Z,2019-02-24T19:18:25Z,OWNER,"Python API: db[""articles""].add_foreign_key(""author_id"", ""authors"", ""id"") CLI: $ sqlite-utils add-foreign-key articles author_id authors id ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",349850687,Mechanism for adding foreign keys to an existing table, https://github.com/simonw/sqlite-utils/issues/17#issuecomment-466820167,https://api.github.com/repos/simonw/sqlite-utils/issues/17,466820167,MDEyOklzc3VlQ29tbWVudDQ2NjgyMDE2Nw==,9599,simonw,2019-02-24T21:42:33Z,2019-02-24T21:42:33Z,OWNER,"It looks like the type information isn't actually used for anything at all, so this: https://github.com/simonw/sqlite-utils/blob/f8d3b7cfe5c1950b0749d40eb2640df50b52f651/tests/test_create.py#L97-L103 Could actually be written like this: ``` fresh_db[""m2m""].insert( {""one_id"": 1, ""two_id"": 1}, foreign_keys=( (""one_id"", ""one"", ""id""), (""two_id"", ""two"", ""id""), ), ) ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413868452,Improve and document foreign_keys=... 
argument to insert/create/etc, https://github.com/simonw/sqlite-utils/issues/17#issuecomment-466820188,https://api.github.com/repos/simonw/sqlite-utils/issues/17,466820188,MDEyOklzc3VlQ29tbWVudDQ2NjgyMDE4OA==,9599,simonw,2019-02-24T21:42:50Z,2019-02-24T21:42:50Z,OWNER,Sanity checking those foreign keys would be worthwhile.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413868452,Improve and document foreign_keys=... argument to insert/create/etc, https://github.com/simonw/sqlite-utils/issues/17#issuecomment-466821200,https://api.github.com/repos/simonw/sqlite-utils/issues/17,466821200,MDEyOklzc3VlQ29tbWVudDQ2NjgyMTIwMA==,9599,simonw,2019-02-24T21:55:08Z,2019-02-24T21:55:54Z,OWNER,"This involves a breaking API change. I need to call that out in the README and also fix my two other projects which use the old four-tuple version of `foreign_keys=`: https://github.com/simonw/db-to-sqlite/blob/c2f8e93bc6bbdfd135de3656ea0f497859ae49ff/db_to_sqlite/cli.py#L30-L42 And https://github.com/simonw/russian-ira-facebook-ads-datasette/blob/e7106710abdd7bdcae035bedd8bdaba75ae56a12/fetch_and_build_russian_ads.py#L71-L74 I'll also need to set a minimum version for `sqlite-utils` in the `db-to-sqlite` setup.py: https://github.com/simonw/db-to-sqlite/blob/c2f8e93bc6bbdfd135de3656ea0f497859ae49ff/setup.py#L25","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413868452,Improve and document foreign_keys=... argument to insert/create/etc, https://github.com/simonw/sqlite-utils/issues/17#issuecomment-466823422,https://api.github.com/repos/simonw/sqlite-utils/issues/17,466823422,MDEyOklzc3VlQ29tbWVudDQ2NjgyMzQyMg==,9599,simonw,2019-02-24T22:20:05Z,2019-02-24T22:20:05Z,OWNER,Re-opening this until I've fixed the other two projects.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413868452,Improve and document foreign_keys=... argument to insert/create/etc, https://github.com/simonw/sqlite-utils/issues/17#issuecomment-466827533,https://api.github.com/repos/simonw/sqlite-utils/issues/17,466827533,MDEyOklzc3VlQ29tbWVudDQ2NjgyNzUzMw==,9599,simonw,2019-02-24T23:03:29Z,2019-02-24T23:03:29Z,OWNER,Need to put out a new release of `sqlite-utils` so `db-to-sqlite` can depend on it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413868452,Improve and document foreign_keys=... argument to insert/create/etc, https://github.com/simonw/sqlite-utils/issues/17#issuecomment-466828503,https://api.github.com/repos/simonw/sqlite-utils/issues/17,466828503,MDEyOklzc3VlQ29tbWVudDQ2NjgyODUwMw==,9599,simonw,2019-02-24T23:15:26Z,2019-02-24T23:15:26Z,OWNER,Released: https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v0-14,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413868452,Improve and document foreign_keys=... 
argument to insert/create/etc, https://github.com/simonw/sqlite-utils/issues/17#issuecomment-466830869,https://api.github.com/repos/simonw/sqlite-utils/issues/17,466830869,MDEyOklzc3VlQ29tbWVudDQ2NjgzMDg2OQ==,9599,simonw,2019-02-24T23:45:48Z,2019-02-24T23:45:48Z,OWNER,Both projects have been upgraded.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413868452,Improve and document foreign_keys=... argument to insert/create/etc, https://github.com/simonw/datasette/issues/187#issuecomment-467264937,https://api.github.com/repos/simonw/datasette/issues/187,467264937,MDEyOklzc3VlQ29tbWVudDQ2NzI2NDkzNw==,9599,simonw,2019-02-26T02:14:28Z,2019-02-26T02:14:28Z,OWNER,I'm working on a port of Datasette to Starlette which I think would fix this issue: https://github.com/encode/starlette,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/datasette/issues/409#issuecomment-472844001,https://api.github.com/repos/simonw/datasette/issues/409,472844001,MDEyOklzc3VlQ29tbWVudDQ3Mjg0NDAwMQ==,43100,Uninen,2019-03-14T13:04:20Z,2019-03-14T13:04:42Z,NONE,It seems this affects the Datasette Publish -site as well: https://github.com/simonw/datasette-publish-support/issues/3,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",408376825,Zeit API v1 does not work for new users - need to migrate to v2, https://github.com/simonw/datasette/issues/409#issuecomment-472875713,https://api.github.com/repos/simonw/datasette/issues/409,472875713,MDEyOklzc3VlQ29tbWVudDQ3Mjg3NTcxMw==,209967,michaelmcandrew,2019-03-14T14:14:39Z,2019-03-14T14:14:39Z,NONE,also linking this zeit issue in case it is helpful: https://github.com/zeit/now-examples/issues/163#issuecomment-440125769,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",408376825,Zeit API v1 does not work for new users - need to migrate to v2, https://github.com/simonw/datasette/pull/416#issuecomment-473154643,https://api.github.com/repos/simonw/datasette/issues/416,473154643,MDEyOklzc3VlQ29tbWVudDQ3MzE1NDY0Mw==,9599,simonw,2019-03-15T04:27:47Z,2019-03-15T04:28:00Z,OWNER,"Deployed a demo: https://datasette-optional-hash-demo.now.sh/ datasette publish now \ ../demo-databses/russian-ads.db \ ../demo-databses/polar-bears.db \ --branch=optional-hash \ -n datasette-optional-hash \ --alias datasette-optional-hash-demo \ --install=datasette-cluster-map \ --install=datasette-json-html ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421348146,URL hashing now optional: turn on with --config hash_urls:1 (#418), https://github.com/simonw/datasette/pull/416#issuecomment-473156513,https://api.github.com/repos/simonw/datasette/issues/416,473156513,MDEyOklzc3VlQ29tbWVudDQ3MzE1NjUxMw==,9599,simonw,2019-03-15T04:40:29Z,2019-03-15T04:40:29Z,OWNER,"Still TODO: need to figure out what to do about cache TTL. Defaulting to 365 days no longer makes sense without the hash_urls setting. Maybe drop that setting default to 0? 
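In practice the difference comes down to the `Cache-Control` header we send - roughly this (values illustrative):

```
Cache-Control: max-age=31536000   # hashed URL: content at this URL can never change
Cache-Control: max-age=0          # non-hashed URL, if the default drops to 0
```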
Here's the setting: https://github.com/simonw/datasette/blob/9743e1d91b5f0a2b3c1c0bd6ffce8739341f43c4/datasette/app.py#L84-L86 And here's where it takes effect: https://github.com/simonw/datasette/blob/4462a5ab2817ac0d9ffe20dafbbf27c5c5b81466/datasette/views/base.py#L491-L501","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421348146,URL hashing now optional: turn on with --config hash_urls:1 (#418), https://github.com/simonw/datasette/issues/414#issuecomment-473156774,https://api.github.com/repos/simonw/datasette/issues/414,473156774,MDEyOklzc3VlQ29tbWVudDQ3MzE1Njc3NA==,9599,simonw,2019-03-15T04:42:06Z,2019-03-15T04:42:06Z,OWNER,"This has been bothering me as well, especially when I try to install `datasette` and `sqlite-utils` at the same time.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",415575624,datasette requires specific version of Click, https://github.com/simonw/datasette/issues/411#issuecomment-473156905,https://api.github.com/repos/simonw/datasette/issues/411,473156905,MDEyOklzc3VlQ29tbWVudDQ3MzE1NjkwNQ==,9599,simonw,2019-03-15T04:42:58Z,2019-03-15T04:42:58Z,OWNER,"Have you tried this? MakePoint(:Long || "", "" || :Lat) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",410384988,How to pass named parameter into spatialite MakePoint() function, https://github.com/simonw/datasette/issues/415#issuecomment-473157770,https://api.github.com/repos/simonw/datasette/issues/415,473157770,MDEyOklzc3VlQ29tbWVudDQ3MzE1Nzc3MA==,9599,simonw,2019-03-15T04:49:03Z,2019-03-15T04:49:03Z,OWNER,"Interesting idea. I can see how this would make sense if you are dealing with really long SQL queries.
My own example of a long query that might benefit from this: https://russian-ads-demo.herokuapp.com/russian-ads-a42c4e8?sql=select%0D%0A++++target_id%2C%0D%0A++++targets.name%2C%0D%0A++++count(*)+as+n%2C%0D%0A++++json_object(%0D%0A++++++++%22href%22%2C+%22%2Frussian-ads%2Ffaceted-targets%3Ftargets%3D%22+||+%0D%0A++++++++++++json_insert(%3Atargets%2C+%27%24[%27+||+json_array_length(%3Atargets)+||+%27]%27%2C+target_id)%0D%0A++++++++%2C%0D%0A++++++++%22label%22%2C+json_insert(%3Atargets%2C+%27%24[%27+||+json_array_length(%3Atargets)+||+%27]%27%2C+target_id)%0D%0A++++)+as+apply_this_facet%2C%0D%0A++++json_object(%0D%0A++++++++%22href%22%2C+%22%2Frussian-ads%2Fdisplay_ads%3F_targets_json%3D%22+||+%0D%0A++++++++++++json_insert(%3Atargets%2C+%27%24[%27+||+json_array_length(%3Atargets)+||+%27]%27%2C+target_id)%0D%0A++++++++%2C%0D%0A++++++++%22label%22%2C+%22See+%22+||+count(*)+||+%22+ads+matching+%22+||+json_insert(%3Atargets%2C+%27%24[%27+||+json_array_length(%3Atargets)+||+%27]%27%2C+target_id)%0D%0A++++)+as+browse_these_ads%0D%0Afrom+ad_targets%0D%0Ajoin+targets+on+ad_targets.target_id+%3D+targets.id%0D%0Awhere%0D%0A++++json_array_length(%3Atargets)+%3D%3D+0+or%0D%0A++++ad_id+in+(%0D%0A++++++++select+ad_id%0D%0A++++++++from+%22ad_targets%22%0D%0A++++++++where+%22ad_targets%22.target_id+in+(select+value+from+json_each(%3Atargets))%0D%0A++++++++group+by+%22ad_targets%22.ad_id%0D%0A++++++++having+count(distinct+%22ad_targets%22.target_id)+%3D+json_array_length(%3Atargets)%0D%0A++++)%0D%0A++++and+target_id+not+in+(select+value+from+json_each(%3Atargets))%0D%0Agroup+by%0D%0A++++target_id+order+by+n+desc%0D%0A&targets=[%22e6200%22] Having a `show/hide` link would be an easy way to support this in the UI, and those could add/remove a `_hide_sql=1` parameter.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",418329842,Add query parameter to hide SQL textarea, https://github.com/simonw/datasette/issues/412#issuecomment-473158506,https://api.github.com/repos/simonw/datasette/issues/412,473158506,MDEyOklzc3VlQ29tbWVudDQ3MzE1ODUwNg==,9599,simonw,2019-03-15T04:53:53Z,2019-03-15T04:53:53Z,OWNER,"I've been thinking about how Datasette instances could query each other for a while - it's a really interesting direction. There are some tricky problems to solve to get this to work. There's a SQLite mechanism called ""virtual table functions"" which can implement things like this, but it's not supported by Python's `sqlite3` module out of the box. https://github.com/coleifer/sqlite-vtfunc is a library that enables this feature. I experimented with using that to implement a function that scrapes HTML content (with an eye to accessing data from other APIs and Datasette instances) a while ago: https://github.com/coleifer/sqlite-vtfunc/issues/6 The bigger challenge is how to get this kind of thing to behave well within a Python 3 async environment. I have some ideas here but they're going to require some very crafty engineering.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",411257981,Linked Data(sette), https://github.com/simonw/datasette/pull/416#issuecomment-473159679,https://api.github.com/repos/simonw/datasette/issues/416,473159679,MDEyOklzc3VlQ29tbWVudDQ3MzE1OTY3OQ==,9599,simonw,2019-03-15T05:01:27Z,2019-03-15T05:01:27Z,OWNER,"Also: if the option is False and the user visits a URL with a hash in it, should we redirect them? 
I'm inclined to say no: furthermore, I'd be OK continuing to serve a far-future cache header for that case.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421348146,URL hashing now optional: turn on with --config hash_urls:1 (#418), https://github.com/simonw/datasette/pull/413#issuecomment-473160476,https://api.github.com/repos/simonw/datasette/issues/413,473160476,MDEyOklzc3VlQ29tbWVudDQ3MzE2MDQ3Ng==,9599,simonw,2019-03-15T05:06:37Z,2019-03-15T05:06:37Z,OWNER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413887019,Update spatialite.rst, https://github.com/simonw/datasette/pull/416#issuecomment-473160702,https://api.github.com/repos/simonw/datasette/issues/416,473160702,MDEyOklzc3VlQ29tbWVudDQ3MzE2MDcwMg==,9599,simonw,2019-03-15T05:08:13Z,2019-03-15T05:08:13Z,OWNER,This also needs extensive tests to ensure that with the option turned on all of the redirects behave as they should.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421348146,URL hashing now optional: turn on with --config hash_urls:1 (#418), https://github.com/simonw/datasette/issues/415#issuecomment-473164038,https://api.github.com/repos/simonw/datasette/issues/415,473164038,MDEyOklzc3VlQ29tbWVudDQ3MzE2NDAzOA==,9599,simonw,2019-03-15T05:31:21Z,2019-03-15T05:31:21Z,OWNER,"Demo: https://latest.datasette.io/fixtures-dd88475?sql=select+%2A+from+sortable+order+by+pk1%2C+pk2+limit+101 v.s. https://latest.datasette.io/fixtures-dd88475?sql=select+%2A+from+sortable+order+by+pk1%2C+pk2+limit+101&_hide_sql=1 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",418329842,Add query parameter to hide SQL textarea, https://github.com/simonw/datasette/issues/415#issuecomment-473217334,https://api.github.com/repos/simonw/datasette/issues/415,473217334,MDEyOklzc3VlQ29tbWVudDQ3MzIxNzMzNA==,36796532,ad-si,2019-03-15T09:30:57Z,2019-03-15T09:30:57Z,NONE,"Awesome, thanks! 😁 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",418329842,Add query parameter to hide SQL textarea, https://github.com/simonw/datasette/issues/417#issuecomment-473308631,https://api.github.com/repos/simonw/datasette/issues/417,473308631,MDEyOklzc3VlQ29tbWVudDQ3MzMwODYzMQ==,9599,simonw,2019-03-15T14:32:13Z,2019-03-15T14:32:13Z,OWNER,"This would allow Datasette to be easily used as a ""data library"" (like a data warehouse but less expectation of big data querying technology such as Presto). 
One of the things I learned at the NICAR CAR 2019 conference in Newport Beach is that there is a very real need for some kind of easily accessible data library at most newsrooms.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421546944,Datasette Library, https://github.com/simonw/datasette/pull/416#issuecomment-473310026,https://api.github.com/repos/simonw/datasette/issues/416,473310026,MDEyOklzc3VlQ29tbWVudDQ3MzMxMDAyNg==,9599,simonw,2019-03-15T14:35:53Z,2019-03-15T14:35:53Z,OWNER,See #418 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421348146,URL hashing now optional: turn on with --config hash_urls:1 (#418), https://github.com/simonw/datasette/issues/123#issuecomment-473313975,https://api.github.com/repos/simonw/datasette/issues/123,473313975,MDEyOklzc3VlQ29tbWVudDQ3MzMxMzk3NQ==,9599,simonw,2019-03-15T14:45:46Z,2019-03-15T14:45:46Z,OWNER,"I'm reopening this one as part of #417. Further experience with Python's CSV standard library module has convinced me that pandas is not a required dependency for this. My [sqlite-utils](https://github.com/simonw/sqlite-utils) package can do most of the work here with very few dependencies.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats, https://github.com/simonw/datasette/issues/419#issuecomment-473708941,https://api.github.com/repos/simonw/datasette/issues/419,473708941,MDEyOklzc3VlQ29tbWVudDQ3MzcwODk0MQ==,9599,simonw,2019-03-17T19:58:11Z,2019-03-17T19:58:11Z,OWNER,"Some problems to solve: * Right now Datasette assumes it can always show the count of rows in a table, because this has been pre-calculated. If a database is mutable the pre-calculation trick no longer works, and for giant tables a `select count(*) from X` query can be expensive to run. Maybe we set a time limit on these? If time limit expires show ""many rows""? * Maintaining a content hash of the table no longer makes sense if it is changing (though interestingly there's a `.sha3sum` built-in SQLite CLI command which takes a hash of the content and stays the same even through vacuum runs). Without that we need a different mechanism for calculating table colours. It also means that we can't do the special dbname-hash URL trick (see #418) at all if the database is opened as mutable.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421551434,"Default to opening files in mutable mode, special option for immutable files", https://github.com/simonw/datasette/issues/418#issuecomment-473709815,https://api.github.com/repos/simonw/datasette/issues/418,473709815,MDEyOklzc3VlQ29tbWVudDQ3MzcwOTgxNQ==,9599,simonw,2019-03-17T20:08:31Z,2019-03-17T20:08:31Z,OWNER,"In #419 I'm now proposing that Datasette default to opening files in ""mutable"" mode, in which case it would not make sense to support hash URLs for those files at all. 
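For reference, the two modes map directly onto SQLite URI parameters - roughly this, using Python's `sqlite3` module:

```python
import sqlite3

# immutable: SQLite assumes the file will not be changed by any process at all
immutable_conn = sqlite3.connect('file:fixtures.db?immutable=1', uri=True)

# read-only for this connection, but still mutable: other processes may write to it
readonly_conn = sqlite3.connect('file:fixtures.db?mode=ro', uri=True)
```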
So actually this feature will only be available for files that are explicitly opened in immutable mode.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421548881,Hashed URLs should be optional, https://github.com/simonw/datasette/issues/419#issuecomment-473709883,https://api.github.com/repos/simonw/datasette/issues/419,473709883,MDEyOklzc3VlQ29tbWVudDQ3MzcwOTg4Mw==,9599,simonw,2019-03-17T20:09:47Z,2019-03-17T20:37:45Z,OWNER,"Could I persist the last calculated count for a table and somehow detect if that table has been changed in any way by another process, hence invalidating the cached count (and potentially scheduling a new count)? https://www.sqlite.org/c3ref/update_hook.html says that `sqlite3_update_hook()` can be used to register a handler invoked on almost all update/insert/delete operations to a specific table... except that it misses out on deletes triggered by `ON CONFLICT REPLACE` and only works for `ROWID` tables. Also this hook is not exposed in the Python `sqlite3` library - though it may be available using some terrifying `ctypes` hacks: https://stackoverflow.com/a/16920926 So on further research, I think the answer is *no*: I should assume that it won't be possible to cache counts and magically invalidate the cache when the underlying file is changed by another process. Instead I need to assume that counts will be an expensive operation. As such, I can introduce a time limit on counts and use that anywhere a count is displayed. If the time limit is exceeded by the `count(*)` query I can show ""many"" instead. That said... running `count(*)` against a table with 200,000 rows only takes about 3ms, so even a timeout of 20ms is likely to work fine for tables of around a million rows. It would be really neat if I could generate a lower bound count in a limited amount of time. If I counted up to 4m rows before the timeout I could show ""more than 4m rows"". No idea if that would be possible though. Relevant: https://stackoverflow.com/questions/8988915/sqlite-count-slow-on-big-tables - reports of very slow counts on a 6GB database file. Consensus seems to be ""yeah, that's just how SQLite is built"" - though there was a suggestion that you can use `select max(ROWID) from table` provided you are certain there have been no deletions. Also relevant: http://sqlite.1065341.n5.nabble.com/sqlite3-performance-on-select-count-very-slow-for-16-GB-file-td80176.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421551434,"Default to opening files in mutable mode, special option for immutable files", https://github.com/simonw/datasette/issues/419#issuecomment-473712820,https://api.github.com/repos/simonw/datasette/issues/419,473712820,MDEyOklzc3VlQ29tbWVudDQ3MzcxMjgyMA==,9599,simonw,2019-03-17T20:43:23Z,2019-03-17T20:43:51Z,OWNER,"So the differences here are: * For immutable databases we calculate content hash and table counts; mutable databases we do not * Immutable databases open with `file:{}?immutable=1`, mutable databases open with `file:{}?mode=ro` * Anywhere that shows a table count now needs to call a new method which knows to run `count(*)` with a timeout for mutable databases, or read from the precalculated counts for immutable databases * The url-hash option should no longer be available at all for mutable databases * New command-line tool syntax: `datasette mutable.db` vs.
`datasette -i immutable.db`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421551434,"Default to opening files in mutable mode, special option for immutable files", https://github.com/simonw/datasette/issues/419#issuecomment-473713363,https://api.github.com/repos/simonw/datasette/issues/419,473713363,MDEyOklzc3VlQ29tbWVudDQ3MzcxMzM2Mw==,9599,simonw,2019-03-17T20:49:39Z,2019-03-17T20:52:46Z,OWNER,"And a really important difference: the whole model of caching inspect data no longer works for mutable files, because another process might make a change to the database schema (adding a new table for example). https://fivethirtyeight.datasettes.com/-/inspect So everywhere that uses `self.ds.inspect()` right now will have to change to calling a routine which knows the difference between mutable and immutable databases and queries for live schema data for mutables while using a cache for immutables. I'll track this as a separate ticket.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421551434,"Default to opening files in mutable mode, special option for immutable files", https://github.com/simonw/datasette/issues/420#issuecomment-473713946,https://api.github.com/repos/simonw/datasette/issues/420,473713946,MDEyOklzc3VlQ29tbWVudDQ3MzcxMzk0Ng==,9599,simonw,2019-03-17T20:56:38Z,2019-03-17T20:58:17Z,OWNER,"Some examples: https://github.com/simonw/datasette/blob/1f54e092306b208125f39d06712b02895eb75168/datasette/views/table.py#L34-L40 https://github.com/simonw/datasette/blob/1f54e092306b208125f39d06712b02895eb75168/datasette/views/table.py#L45-L48 https://github.com/simonw/datasette/blob/1f54e092306b208125f39d06712b02895eb75168/datasette/views/table.py#L62-L65 https://github.com/simonw/datasette/blob/1f54e092306b208125f39d06712b02895eb75168/datasette/views/table.py#L112-L123 https://github.com/simonw/datasette/blob/1f54e092306b208125f39d06712b02895eb75168/datasette/views/index.py#L11-L19 https://github.com/simonw/datasette/blob/afe9aa3ae03c485c5d6652741438d09445a486c1/datasette/views/base.py#L143-L147 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421971339,Fix all the places that currently use .inspect() data, https://github.com/simonw/datasette/pull/416#issuecomment-473714545,https://api.github.com/repos/simonw/datasette/issues/416,473714545,MDEyOklzc3VlQ29tbWVudDQ3MzcxNDU0NQ==,9599,simonw,2019-03-17T21:03:08Z,2019-03-17T21:04:17Z,OWNER,I'm going to introduce a new config setting: `default_cache_ttl_hashed` - and set the default value for `default_cache_ttl` to 10s (to protect against dog-piling).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421348146,URL hashing now optional: turn on with --config hash_urls:1 (#418), https://github.com/simonw/datasette/pull/416#issuecomment-473715254,https://api.github.com/repos/simonw/datasette/issues/416,473715254,MDEyOklzc3VlQ29tbWVudDQ3MzcxNTI1NA==,9599,simonw,2019-03-17T21:11:37Z,2019-03-17T21:11:37Z,OWNER,The code for this has got a bit tricky. I need to make a decision at some point as to if the current request is a hashed_url request (if it includes a DB hash in the URL which is the current correct hash). 
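The shape I want is roughly this (sketch only - not the real code):

```python
# illustrative sketch - the names echo the config settings described above
if request_is_hashed:
    ttl = config['default_cache_ttl_hashed']
else:
    ttl = config['default_cache_ttl']
if ttl:
    response.headers['Cache-Control'] = 'max-age={}'.format(ttl)
```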
I then need to be able to use that fact to decide which default TTL value to apply when returning the response.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421348146,URL hashing now optional: turn on with --config hash_urls:1 (#418), https://github.com/simonw/datasette/pull/416#issuecomment-473717052,https://api.github.com/repos/simonw/datasette/issues/416,473717052,MDEyOklzc3VlQ29tbWVudDQ3MzcxNzA1Mg==,9599,simonw,2019-03-17T21:32:24Z,2019-03-17T21:33:16Z,OWNER,"Since this feature is now controlled by a config setting, I'm inclined to make it also available via a URL parameter. If you hit this URL: /fixtures/table.json?_hash=1 We can redirect to: /fixtures-c2342/table.json In this way developers can opt-in to a hashed (and hence far-future cached) response on a per-query basis. This option won't be available against mutable databases though, which are coming in #419 This means that the `hash_urls:1` config basically has the effect of assuming `?_hash=1` on all URLs to mutable databases.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421348146,URL hashing now optional: turn on with --config hash_urls:1 (#418), https://github.com/simonw/datasette/issues/417#issuecomment-473312514,https://api.github.com/repos/simonw/datasette/issues/417,473312514,MDEyOklzc3VlQ29tbWVudDQ3MzMxMjUxNA==,9599,simonw,2019-03-15T14:42:07Z,2019-03-17T22:12:30Z,OWNER,"A neat ability of Datasette Library would be if it can work against other files that have been dropped into the folder. In particular: if a user drops a CSV file into the folder, how about automatically converting that CSV file to SQLite using [sqlite-utils](https://github.com/simonw/sqlite-utils)?","{""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421546944,Datasette Library,