html_url,issue_url,id,node_id,user,created_at,updated_at,author_association,body,reactions,issue,performed_via_github_app https://github.com/simonw/datasette/issues/20#issuecomment-343581130,https://api.github.com/repos/simonw/datasette/issues/20,343581130,MDEyOklzc3VlQ29tbWVudDM0MzU4MTEzMA==,9599,2017-11-10T20:44:38Z,2017-11-10T20:44:38Z,OWNER,"I'm going to handle this a different way. I'm going to support a local history of your own queries stored in localStorage, but if you want to share a query you have to do it with a URL. If people really want canned query support, they can do that using custom templates - see #12 - or by adding views to their database before they publish it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136, https://github.com/simonw/datasette/issues/21#issuecomment-343581332,https://api.github.com/repos/simonw/datasette/issues/21,343581332,MDEyOklzc3VlQ29tbWVudDM0MzU4MTMzMg==,9599,2017-11-10T20:45:42Z,2017-11-10T20:45:42Z,OWNER,I'm not going to use Sanic's mechanism for this. 
I'll use arguments passed to my cli instead.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267769034, https://github.com/simonw/datasette/issues/16#issuecomment-343643332,https://api.github.com/repos/simonw/datasette/issues/16,343643332,MDEyOklzc3VlQ29tbWVudDM0MzY0MzMzMg==,9599,2017-11-11T06:00:04Z,2017-11-11T06:00:04Z,OWNER,"Here's what a table looks like now at a smaller screen size: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267726219, https://github.com/simonw/datasette/issues/54#issuecomment-343644891,https://api.github.com/repos/simonw/datasette/issues/54,343644891,MDEyOklzc3VlQ29tbWVudDM0MzY0NDg5MQ==,9599,2017-11-11T06:39:54Z,2017-11-11T06:39:54Z,OWNER,"I can detect something is a view like this: SELECT name from sqlite_master WHERE type ='view'; ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273121803, https://github.com/simonw/datasette/issues/26#issuecomment-343644976,https://api.github.com/repos/simonw/datasette/issues/26,343644976,MDEyOklzc3VlQ29tbWVudDM0MzY0NDk3Ng==,9599,2017-11-11T06:42:23Z,2017-11-11T06:42:23Z,OWNER,"Simplest version of this: 1. Create a temporary directory 2. Write a Dockerfile into it that pulls an image and pip installs datasette 3. Add symlinks to the DBs they listed (so we don't have to copy them) 4. Shell out to ""now"" 5. Done! 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267861210, https://github.com/simonw/datasette/issues/26#issuecomment-343645249,https://api.github.com/repos/simonw/datasette/issues/26,343645249,MDEyOklzc3VlQ29tbWVudDM0MzY0NTI0OQ==,9599,2017-11-11T06:48:59Z,2017-11-11T06:48:59Z,OWNER,"Doing this works: import os os.link('/tmp/databases/northwind.db', '/tmp/tmp-blah/northwind.db') That creates a link in tmp-blah - and then when I delete that entire directory like so: import shutil shutil.rmtree('/tmp/tmp-blah') The original database is not deleted, just the link.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267861210, https://github.com/simonw/datasette/issues/26#issuecomment-343645327,https://api.github.com/repos/simonw/datasette/issues/26,343645327,MDEyOklzc3VlQ29tbWVudDM0MzY0NTMyNw==,9599,2017-11-11T06:51:16Z,2017-11-11T06:51:16Z,OWNER,"I can create the temporary directory like so: import tempfile t = tempfile.TemporaryDirectory() t t.name '/var/folders/w9/0xm39tk94ng9h52g06z4b54c0000gp/T/tmpkym70wlp' And then to delete it all: t.cleanup() ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267861210, https://github.com/simonw/datasette/issues/40#issuecomment-343646740,https://api.github.com/repos/simonw/datasette/issues/40,343646740,MDEyOklzc3VlQ29tbWVudDM0MzY0Njc0MA==,9599,2017-11-11T07:27:33Z,2017-11-11T07:27:33Z,OWNER,I'm happy with this now that I've implemented the publish command in #26 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268470572, 
https://github.com/simonw/datasette/issues/47#issuecomment-343647102,https://api.github.com/repos/simonw/datasette/issues/47,343647102,MDEyOklzc3VlQ29tbWVudDM0MzY0NzEwMg==,9599,2017-11-11T07:36:00Z,2017-11-11T07:36:00Z,OWNER,"http://2016.padjo.org/tutorials/data-primer-census-acs1-demographics/ has a sqlite database: http://2016.padjo.org/files/data/starterpack/census-acs-1year/acs-1-year-2015.sqlite I tested this by deploying it here: https://datasette-fewuggrvwr.now.sh/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271831408, https://github.com/simonw/datasette/issues/16#issuecomment-343647300,https://api.github.com/repos/simonw/datasette/issues/16,343647300,MDEyOklzc3VlQ29tbWVudDM0MzY0NzMwMA==,9599,2017-11-11T07:41:19Z,2017-11-11T07:53:09Z,OWNER,"Still needed: - [ ] A link to the homepage from some kind of navigation bar in the header - [ ] link to github.com/simonw/datasette in the footer - [ ] Slightly better titles (maybe ditch the visited link colours for titles only? should keep those for primary key links) - [ ] Links to the .json and .jsono versions of every view","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267726219, https://github.com/simonw/datasette/issues/14#issuecomment-343675165,https://api.github.com/repos/simonw/datasette/issues/14,343675165,MDEyOklzc3VlQ29tbWVudDM0MzY3NTE2NQ==,9599,2017-11-11T16:07:10Z,2017-11-11T16:07:10Z,OWNER,The plugin system can also allow alternative providers for the `publish` command - e.g. 
maybe hook up hyper.sh as an option for publishing containers.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940, https://github.com/simonw/datasette/issues/59#issuecomment-343676574,https://api.github.com/repos/simonw/datasette/issues/59,343676574,MDEyOklzc3VlQ29tbWVudDM0MzY3NjU3NA==,9599,2017-11-11T16:29:48Z,2017-11-11T16:29:48Z,OWNER,See also #14,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273157085, https://github.com/simonw/datasette/issues/60#issuecomment-343683566,https://api.github.com/repos/simonw/datasette/issues/60,343683566,MDEyOklzc3VlQ29tbWVudDM0MzY4MzU2Ng==,9599,2017-11-11T18:12:24Z,2017-11-11T18:12:24Z,OWNER,"I’m going to solve this by making it an optional argument you can pass to the serve command. Then the Dockerfile can still build and use it but it won’t interfere with tests or dev. If argument is not passed, we will calculate hashes on startup and calculate table row counts on demand. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273163905, https://github.com/simonw/datasette/issues/47#issuecomment-343690060,https://api.github.com/repos/simonw/datasette/issues/47,343690060,MDEyOklzc3VlQ29tbWVudDM0MzY5MDA2MA==,9599,2017-11-11T19:56:08Z,2017-11-11T19:56:08Z,OWNER," ""parlgov-development.db"": { ""url"": ""http://www.parlgov.org/"" }, ""nhsadmin.sqlite"": { ""url"": ""https://github.com/psychemedia/openHealthDataDoodles"" }","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271831408, https://github.com/simonw/datasette/issues/16#issuecomment-343691342,https://api.github.com/repos/simonw/datasette/issues/16,343691342,MDEyOklzc3VlQ29tbWVudDM0MzY5MTM0Mg==,9599,2017-11-11T20:19:07Z,2017-11-11T20:19:07Z,OWNER,"Closing this, opening a fresh ticket for the navigation stuff.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267726219, https://github.com/simonw/datasette/issues/63#issuecomment-343697291,https://api.github.com/repos/simonw/datasette/issues/63,343697291,MDEyOklzc3VlQ29tbWVudDM0MzY5NzI5MQ==,9599,2017-11-11T22:05:06Z,2017-11-11T22:11:49Z,OWNER,"I'm going to bundle sql and sql_params together into a query nested object like this: { ""query"": { ""sql"": ""select ..."", ""params"": { ""p0"": ""blah"" } } }","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273174447, https://github.com/simonw/datasette/issues/50#issuecomment-343698214,https://api.github.com/repos/simonw/datasette/issues/50,343698214,MDEyOklzc3VlQ29tbWVudDM0MzY5ODIxNA==,9599,2017-11-11T22:23:21Z,2017-11-11T22:23:21Z,OWNER,"I'm closing #50 - more tests will be added in the future, but the framework is neatly in place for them now. 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272694136, https://github.com/simonw/datasette/issues/53#issuecomment-343699115,https://api.github.com/repos/simonw/datasette/issues/53,343699115,MDEyOklzc3VlQ29tbWVudDM0MzY5OTExNQ==,9599,2017-11-11T22:41:38Z,2017-11-11T22:41:38Z,OWNER,This needs to incorporate a sensible way of presenting custom SQL query results too. And let's get a textarea in there for executing SQL while we're at it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273054652, https://github.com/simonw/datasette/issues/47#issuecomment-343705966,https://api.github.com/repos/simonw/datasette/issues/47,343705966,MDEyOklzc3VlQ29tbWVudDM0MzcwNTk2Ng==,9599,2017-11-12T01:00:20Z,2017-11-12T01:00:20Z,OWNER,https://github.com/fivethirtyeight/data has a ton of CSVs,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271831408, https://github.com/simonw/datasette/issues/53#issuecomment-343707624,https://api.github.com/repos/simonw/datasette/issues/53,343707624,MDEyOklzc3VlQ29tbWVudDM0MzcwNzYyNA==,9599,2017-11-12T01:47:45Z,2017-11-12T01:47:45Z,OWNER,Split the SQL thing out into #65 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273054652, https://github.com/simonw/datasette/issues/53#issuecomment-343707676,https://api.github.com/repos/simonw/datasette/issues/53,343707676,MDEyOklzc3VlQ29tbWVudDM0MzcwNzY3Ng==,9599,2017-11-12T01:49:07Z,2017-11-12T01:49:07Z,OWNER,"Here's the new design: Also lists views at the bottom (refs #54): ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273054652, 
https://github.com/simonw/datasette/issues/42#issuecomment-343708447,https://api.github.com/repos/simonw/datasette/issues/42,343708447,MDEyOklzc3VlQ29tbWVudDM0MzcwODQ0Nw==,9599,2017-11-12T02:12:15Z,2017-11-12T02:12:15Z,OWNER,I ditched the metadata file concept.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268591332, https://github.com/simonw/datasette/issues/65#issuecomment-343709217,https://api.github.com/repos/simonw/datasette/issues/65,343709217,MDEyOklzc3VlQ29tbWVudDM0MzcwOTIxNw==,9599,2017-11-12T02:36:37Z,2017-11-12T02:36:37Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273191608, https://github.com/simonw/datasette/issues/25#issuecomment-343715915,https://api.github.com/repos/simonw/datasette/issues/25,343715915,MDEyOklzc3VlQ29tbWVudDM0MzcxNTkxNQ==,9599,2017-11-12T06:08:28Z,2017-11-12T06:08:28Z,OWNER," con = sqlite3.connect('existing_db.db') with open('dump.sql', 'w') as f: for line in con.iterdump(): f.write('%s\n' % line) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267857622, https://github.com/simonw/datasette/issues/42#issuecomment-343752404,https://api.github.com/repos/simonw/datasette/issues/42,343752404,MDEyOklzc3VlQ29tbWVudDM0Mzc1MjQwNA==,9599,2017-11-12T17:20:10Z,2017-11-12T17:20:10Z,OWNER,"Re-opening this - I've decided to bring back this concept, see #68 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268591332, https://github.com/simonw/datasette/issues/69#issuecomment-343752579,https://api.github.com/repos/simonw/datasette/issues/69,343752579,MDEyOklzc3VlQ29tbWVudDM0Mzc1MjU3OQ==,9599,2017-11-12T17:22:39Z,2017-11-12T17:22:39Z,OWNER,"By default I'll allow LIMIT and OFFSET up to a maximum of 
X (where X is let's say 50,000 to start with, but can be custom configured to a larger number or set to None for no limit).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273248366, https://github.com/simonw/datasette/issues/66#issuecomment-343752683,https://api.github.com/repos/simonw/datasette/issues/66,343752683,MDEyOklzc3VlQ29tbWVudDM0Mzc1MjY4Mw==,9599,2017-11-12T17:24:05Z,2017-11-12T17:24:21Z,OWNER,"Maybe SQL views should have their own Sanic view class (`ViewView` is kinda funny), subclassed from `TableView`?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273191806, https://github.com/simonw/datasette/issues/68#issuecomment-343753999,https://api.github.com/repos/simonw/datasette/issues/68,343753999,MDEyOklzc3VlQ29tbWVudDM0Mzc1Mzk5OQ==,9599,2017-11-12T17:45:21Z,2017-11-12T19:38:33Z,OWNER,"For initial launch, I could just support this as some optional command line arguments you pass to the publish command: datasette publish data.db --title=""Title"" --source=""url""","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273247186, https://github.com/simonw/datasette/issues/68#issuecomment-343754058,https://api.github.com/repos/simonw/datasette/issues/68,343754058,MDEyOklzc3VlQ29tbWVudDM0Mzc1NDA1OA==,9599,2017-11-12T17:46:13Z,2017-11-12T17:46:13Z,OWNER,I’m going to store this stuff in a file called metadata.json and move the existing automatically generated metadata to a file called build.json,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273247186, 
https://github.com/simonw/datasette/issues/57#issuecomment-343769692,https://api.github.com/repos/simonw/datasette/issues/57,343769692,MDEyOklzc3VlQ29tbWVudDM0Mzc2OTY5Mg==,9599,2017-11-12T21:32:36Z,2017-11-12T21:32:36Z,OWNER,I have created a Docker Hub public repository for this: https://hub.docker.com/r/simonwillison/datasette/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694, https://github.com/simonw/datasette/issues/69#issuecomment-343780039,https://api.github.com/repos/simonw/datasette/issues/69,343780039,MDEyOklzc3VlQ29tbWVudDM0Mzc4MDAzOQ==,9599,2017-11-13T00:05:27Z,2017-11-13T00:05:27Z,OWNER,"I think the only safe way to do this is using SQLite `.fetchmany(1000)` - I can't guarantee that the user has not entered SQL that will outfox a limit in some way. So instead of attempting to edit their SQL, I'll always return 1001 records and let them know if they went over 1000 or not.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273248366, https://github.com/simonw/datasette/issues/71#issuecomment-343780141,https://api.github.com/repos/simonw/datasette/issues/71,343780141,MDEyOklzc3VlQ29tbWVudDM0Mzc4MDE0MQ==,9599,2017-11-13T00:06:52Z,2017-11-13T00:06:52Z,OWNER,I've registered datasettes.com as a domain name for doing this. 
Now setting it up so Cloudflare and Now can serve content from it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840, https://github.com/simonw/datasette/issues/71#issuecomment-343780539,https://api.github.com/repos/simonw/datasette/issues/71,343780539,MDEyOklzc3VlQ29tbWVudDM0Mzc4MDUzOQ==,9599,2017-11-13T00:13:29Z,2017-11-13T00:19:46Z,OWNER,"https://zeit.co/docs/features/dns is docs now domain add -e datasettes.com I had to set up a custom TXT record on `_now.datasettes.com` to get this to work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840, https://github.com/simonw/datasette/issues/71#issuecomment-343780671,https://api.github.com/repos/simonw/datasette/issues/71,343780671,MDEyOklzc3VlQ29tbWVudDM0Mzc4MDY3MQ==,9599,2017-11-13T00:15:21Z,2017-11-13T00:17:37Z,OWNER,- [x] Redirect https://datasettes.com/ and https://www.datasettes.com/ to https://github.com/simonw/datasette,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840, https://github.com/simonw/datasette/issues/71#issuecomment-343780814,https://api.github.com/repos/simonw/datasette/issues/71,343780814,MDEyOklzc3VlQ29tbWVudDM0Mzc4MDgxNA==,9599,2017-11-13T00:17:50Z,2017-11-13T00:18:19Z,OWNER,"Achieved those redirects using Cloudflare ""page rules"": https://www.cloudflare.com/a/page-rules/datasettes.com","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840, https://github.com/simonw/datasette/issues/71#issuecomment-343781030,https://api.github.com/repos/simonw/datasette/issues/71,343781030,MDEyOklzc3VlQ29tbWVudDM0Mzc4MTAzMA==,9599,2017-11-13T00:21:05Z,2017-11-13T02:09:32Z,OWNER,"- [x] Have `now domain add -e datasettes.com` run without errors 
(hopefully just a matter of waiting for the DNS to update) - [x] Alias an example dataset hosted on Now on a datasettes.com subdomain - [x] Confirm that HTTP caching and HTTP/2 redirect pushing works as expected - this may require another page rule","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840, https://github.com/simonw/datasette/issues/71#issuecomment-343788581,https://api.github.com/repos/simonw/datasette/issues/71,343788581,MDEyOklzc3VlQ29tbWVudDM0Mzc4ODU4MQ==,9599,2017-11-13T01:48:17Z,2017-11-13T01:48:17Z,OWNER,"I had to add a rule like this to get letsencrypt certificates on now.sh working: https://github.com/zeit/now-cli/issues/188#issuecomment-270105052 I also have to flip this switch off every time I want to add a new alias: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840, https://github.com/simonw/datasette/issues/71#issuecomment-343788780,https://api.github.com/repos/simonw/datasette/issues/71,343788780,MDEyOklzc3VlQ29tbWVudDM0Mzc4ODc4MA==,9599,2017-11-13T01:50:01Z,2017-11-13T01:50:01Z,OWNER,"Added another page rule in order to get Cloudflare to always obey cache headers sent by the server: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840, https://github.com/simonw/datasette/issues/71#issuecomment-343788817,https://api.github.com/repos/simonw/datasette/issues/71,343788817,MDEyOklzc3VlQ29tbWVudDM0Mzc4ODgxNw==,9599,2017-11-13T01:50:27Z,2017-11-13T01:50:27Z,OWNER,https://fivethirtyeight.datasettes.com/ is now up and running.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840, 
https://github.com/simonw/datasette/issues/71#issuecomment-343789162,https://api.github.com/repos/simonw/datasette/issues/71,343789162,MDEyOklzc3VlQ29tbWVudDM0Mzc4OTE2Mg==,9599,2017-11-13T01:53:29Z,2017-11-13T01:53:29Z,OWNER,"``` $ curl -i 'https://fivethirtyeight.datasettes.com/fivethirtyeight-75d605c/obama-commutations%2Fobama_commutations.csv.jsono' HTTP/1.1 200 OK Date: Mon, 13 Nov 2017 01:50:57 GMT Content-Type: application/json Transfer-Encoding: chunked Connection: keep-alive Set-Cookie: __cfduid=de836090f3e12a60579cc7a1696cf0d9e1510537857; expires=Tue, 13-Nov-18 01:50:57 GMT; path=/; domain=.datasettes.com; HttpOnly; Secure Access-Control-Allow-Origin: * Cache-Control: public, max-age=31536000 X-Now-Region: now-sfo CF-Cache-Status: HIT Expires: Tue, 13 Nov 2018 01:50:57 GMT Server: cloudflare-nginx CF-RAY: 3bce154a6d9293b4-SJC {""database"": ""fivethirtyeight"", ""table"": ""obama-commutations/obama_commutations.csv""...```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840, https://github.com/simonw/datasette/issues/71#issuecomment-343790984,https://api.github.com/repos/simonw/datasette/issues/71,343790984,MDEyOklzc3VlQ29tbWVudDM0Mzc5MDk4NA==,9599,2017-11-13T02:09:34Z,2017-11-13T02:09:34Z,OWNER,"HTTP/2 push totally worked on the redirect! fetch('https://fivethirtyeight.datasettes.com/fivethirtyeight/riddler-pick-lowest%2Flow_numbers.csv.jsono').then(r => r.json()).then(console.log) Meanwhile, in the network pane... 
","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273278840, https://github.com/simonw/datasette/issues/68#issuecomment-343791348,https://api.github.com/repos/simonw/datasette/issues/68,343791348,MDEyOklzc3VlQ29tbWVudDM0Mzc5MTM0OA==,9599,2017-11-13T02:12:58Z,2017-11-13T02:12:58Z,OWNER,I should use this on https://fivethirtyeight.datasettes.com/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273247186, https://github.com/simonw/datasette/issues/73#issuecomment-343801392,https://api.github.com/repos/simonw/datasette/issues/73,343801392,MDEyOklzc3VlQ29tbWVudDM0MzgwMTM5Mg==,9599,2017-11-13T03:36:47Z,2017-11-13T03:36:47Z,OWNER,"While I’m at it, let’s allow people to opt out of HTTP/2 push with a ?_nopush=1 argument too - in case they decide they don’t want to receive large 302 responses.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273296178, https://github.com/simonw/datasette/issues/68#issuecomment-343951751,https://api.github.com/repos/simonw/datasette/issues/68,343951751,MDEyOklzc3VlQ29tbWVudDM0Mzk1MTc1MQ==,9599,2017-11-13T15:21:04Z,2017-11-13T15:21:04Z,OWNER,"For first version, I'm just supporting title, source and license information at the database level.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273247186, https://github.com/simonw/datasette/issues/67#issuecomment-343961784,https://api.github.com/repos/simonw/datasette/issues/67,343961784,MDEyOklzc3VlQ29tbWVudDM0Mzk2MTc4NA==,9599,2017-11-13T15:50:50Z,2017-11-13T15:50:50Z,OWNER,"`datasette package ...` - same arguments as `datasette publish`. 
Creates Docker container in your local repo, optionally tagged with `--tag`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273192789, https://github.com/simonw/datasette/issues/67#issuecomment-343967020,https://api.github.com/repos/simonw/datasette/issues/67,343967020,MDEyOklzc3VlQ29tbWVudDM0Mzk2NzAyMA==,9599,2017-11-13T16:06:10Z,2017-11-13T16:06:10Z,OWNER,http://odewahn.github.io/docker-jumpstart/example.html is helpful,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273192789, https://github.com/simonw/datasette/issues/75#issuecomment-344000982,https://api.github.com/repos/simonw/datasette/issues/75,344000982,MDEyOklzc3VlQ29tbWVudDM0NDAwMDk4Mg==,9599,2017-11-13T17:50:27Z,2017-11-13T17:50:27Z,OWNER,"This is necessary because one of the fun things to do with this tool is run it locally, e.g.: datasette ~/Library/Application\ Support/Google/Chrome/Default/History -p 8003 BUT... if we enable CORS by default, an evil site could try sniffing for localhost:8003 and attempt to steal data. 
So we'll enable the CORS headers only if `--cors` is provided to the command, and then use that command in the default Dockerfile.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273509159, https://github.com/simonw/datasette/issues/51#issuecomment-344017088,https://api.github.com/repos/simonw/datasette/issues/51,344017088,MDEyOklzc3VlQ29tbWVudDM0NDAxNzA4OA==,9599,2017-11-13T18:44:23Z,2017-11-13T18:44:23Z,OWNER,Implemented in https://github.com/simonw/datasette/commit/e838bd743d31358b362875854a0ac5e78047727f,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272735257, https://github.com/simonw/datasette/issues/74#issuecomment-344018680,https://api.github.com/repos/simonw/datasette/issues/74,344018680,MDEyOklzc3VlQ29tbWVudDM0NDAxODY4MA==,9599,2017-11-13T18:49:58Z,2017-11-13T18:49:58Z,OWNER,Turns out it does this already: https://github.com/simonw/datasette/blob/6b3b05b6db0d2a7b7cec8b8dbb4ddc5e12a376b2/datasette/app.py#L96-L107,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273296684, https://github.com/simonw/datasette/issues/69#issuecomment-344019631,https://api.github.com/repos/simonw/datasette/issues/69,344019631,MDEyOklzc3VlQ29tbWVudDM0NDAxOTYzMQ==,9599,2017-11-13T18:53:13Z,2017-11-13T18:53:13Z,OWNER,I'm going with a page size of 100 and a max limit of 1000,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273248366, https://github.com/simonw/datasette/issues/69#issuecomment-344048656,https://api.github.com/repos/simonw/datasette/issues/69,344048656,MDEyOklzc3VlQ29tbWVudDM0NDA0ODY1Ng==,9599,2017-11-13T20:32:47Z,2017-11-13T20:32:47Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, 
""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273248366, https://github.com/simonw/datasette/issues/55#issuecomment-344060070,https://api.github.com/repos/simonw/datasette/issues/55,344060070,MDEyOklzc3VlQ29tbWVudDM0NDA2MDA3MA==,9599,2017-11-13T21:14:13Z,2017-11-13T21:14:13Z,OWNER,"I'm going to add some extra metadata to setup.py and then tag this as version 0.8: git tag 0.8 git push --tags Then to ship to PyPI: python setup.py bdist_wheel twine register dist/datasette-0.8-py3-none-any.whl twine upload dist/datasette-0.8-py3-none-any.whl ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127117, https://github.com/simonw/datasette/issues/55#issuecomment-344061762,https://api.github.com/repos/simonw/datasette/issues/55,344061762,MDEyOklzc3VlQ29tbWVudDM0NDA2MTc2Mg==,9599,2017-11-13T21:19:43Z,2017-11-13T21:19:43Z,OWNER,And we're live! https://pypi.python.org/pypi/datasette,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127117, https://github.com/simonw/datasette/issues/80#issuecomment-344074443,https://api.github.com/repos/simonw/datasette/issues/80,344074443,MDEyOklzc3VlQ29tbWVudDM0NDA3NDQ0Mw==,9599,2017-11-13T22:04:54Z,2017-11-13T22:05:02Z,OWNER,"The fivethirtyeight dataset: datasette publish now --name fivethirtyeight --metadata metadata.json fivethirtyeight.db now alias https://fivethirtyeight-jyqfudvjli.now.sh fivethirtyeight.datasettes.com And parlgov: datasette publish now parlgov.db --name=parlgov --metadata=parlgov.json now alias https://parlgov-hqvxuhmbyh.now.sh parlgov.datasettes.com ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273569477, 
https://github.com/simonw/datasette/issues/80#issuecomment-344075696,https://api.github.com/repos/simonw/datasette/issues/80,344075696,MDEyOklzc3VlQ29tbWVudDM0NDA3NTY5Ng==,9599,2017-11-13T22:09:46Z,2017-11-13T22:09:46Z,OWNER,"Parlgov was throwing errors on one of the views, which takes longer than 1000ms to execute - so I added the ability to customize the time limit in https://github.com/simonw/datasette/commit/1e698787a4dd6df0432021a6814c446c8b69bba2 datasette publish now parlgov.db --metadata parlgov.json --name parlgov --extra-options=""--sql_time_limit_ms=3500"" now alias https://parlgov-nvkcowlixq.now.sh parlgov.datasettes.com https://parlgov.datasettes.com/parlgov-25f9855/view_cabinet now returns in just over 2.5s ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273569477, https://github.com/simonw/datasette/pull/81#issuecomment-344076554,https://api.github.com/repos/simonw/datasette/issues/81,344076554,MDEyOklzc3VlQ29tbWVudDM0NDA3NjU1NA==,9599,2017-11-13T22:12:57Z,2017-11-13T22:12:57Z,OWNER,"Hah, I haven't even announced this yet :) Travis is upset because I'm using SQL in the tests which isn't compatible with their version of Python 3.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273595473, https://github.com/simonw/datasette/issues/59#issuecomment-344081876,https://api.github.com/repos/simonw/datasette/issues/59,344081876,MDEyOklzc3VlQ29tbWVudDM0NDA4MTg3Ng==,9599,2017-11-13T22:33:43Z,2017-11-13T22:33:43Z,OWNER,The `datasette package` command introduced in 4143e3b45c16cbae5e3e3419ef479a71810e7df3 is relevant here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273157085, 
https://github.com/simonw/datasette/issues/82#issuecomment-344118849,https://api.github.com/repos/simonw/datasette/issues/82,344118849,MDEyOklzc3VlQ29tbWVudDM0NDExODg0OQ==,9599,2017-11-14T01:46:10Z,2017-11-14T01:46:10Z,OWNER,Did this: https://simonwillison.net/2017/Nov/13/datasette/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273596159, https://github.com/simonw/datasette/issues/47#issuecomment-344132481,https://api.github.com/repos/simonw/datasette/issues/47,344132481,MDEyOklzc3VlQ29tbWVudDM0NDEzMjQ4MQ==,9599,2017-11-14T03:08:13Z,2017-11-14T03:08:13Z,OWNER,I ended up shipping with https://fivethirtyeight.datasettes.com/ and https://parlgov.datasettes.com/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271831408, https://github.com/simonw/datasette/issues/59#issuecomment-344141199,https://api.github.com/repos/simonw/datasette/issues/59,344141199,MDEyOklzc3VlQ29tbWVudDM0NDE0MTE5OQ==,9599,2017-11-14T04:13:11Z,2017-11-14T04:13:11Z,OWNER,"I managed to do this manually: datasette package ~/parlgov-db/parlgov.db --metadata=parlgov.json # Output 8758ec31dda3 as the new image ID docker save 8758ec31dda3 > /tmp/my-image # I could have just piped this straight to hyper cat /tmp/my-image | hyper load # Now start the container running in hyper hyper run -d -p 80:8001 --name parlgov 8758ec31dda3 # We need to assign an IP address so we can see it hyper fip allocate 1 # Outputs 199.245.58.78 hyper fip attach 199.245.58.78 parlgov At this point, visiting the IP address in a browser showed the parlgov UI. To clean up... 
hyper fip detach parlgov hyper fip release 199.245.58.78 hyper stop parlgov hyper rm parlgov ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273157085, https://github.com/simonw/datasette/issues/79#issuecomment-344141515,https://api.github.com/repos/simonw/datasette/issues/79,344141515,MDEyOklzc3VlQ29tbWVudDM0NDE0MTUxNQ==,9599,2017-11-14T04:16:01Z,2017-11-14T04:16:01Z,OWNER,This is probably a bit too much for the README - I should get readthedocs working.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273569068, https://github.com/simonw/datasette/issues/57#issuecomment-344149165,https://api.github.com/repos/simonw/datasette/issues/57,344149165,MDEyOklzc3VlQ29tbWVudDM0NDE0OTE2NQ==,9599,2017-11-14T05:16:34Z,2017-11-14T05:17:14Z,OWNER,"I’m intrigued by this pattern: https://github.com/macropin/datasette/blob/147195c2fdfa2b984d8f9fc1c6cab6634970a056/Dockerfile#L8 What’s the benefit of doing that? 
Does it result in a smaller image size?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694, https://github.com/simonw/datasette/issues/46#issuecomment-344161371,https://api.github.com/repos/simonw/datasette/issues/46,344161371,MDEyOklzc3VlQ29tbWVudDM0NDE2MTM3MQ==,9599,2017-11-14T06:42:15Z,2017-11-14T06:42:15Z,OWNER,http://charlesleifer.com/blog/going-fast-with-sqlite-and-python/ is useful here too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468, https://github.com/simonw/datasette/issues/46#issuecomment-344161430,https://api.github.com/repos/simonw/datasette/issues/46,344161430,MDEyOklzc3VlQ29tbWVudDM0NDE2MTQzMA==,9599,2017-11-14T06:42:44Z,2017-11-14T06:42:44Z,OWNER,Also requested on Twitter: https://twitter.com/DenubisX/status/930322813864439808,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468, https://github.com/simonw/datasette/issues/27#issuecomment-344179878,https://api.github.com/repos/simonw/datasette/issues/27,344179878,MDEyOklzc3VlQ29tbWVudDM0NDE3OTg3OA==,9599,2017-11-14T08:21:22Z,2017-11-14T08:21:22Z,OWNER,https://github.com/frappe/charts perhaps ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267886330, https://github.com/simonw/datasette/issues/43#issuecomment-344180866,https://api.github.com/repos/simonw/datasette/issues/43,344180866,MDEyOklzc3VlQ29tbWVudDM0NDE4MDg2Ng==,9599,2017-11-14T08:25:37Z,2017-11-14T08:25:37Z,OWNER,"This isn’t necessary - restarting the server is fast and easy, and I’ve not found myself needing this at all during development.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 
0}",268592894, https://github.com/simonw/datasette/issues/57#issuecomment-344185817,https://api.github.com/repos/simonw/datasette/issues/57,344185817,MDEyOklzc3VlQ29tbWVudDM0NDE4NTgxNw==,9599,2017-11-14T08:46:24Z,2017-11-14T08:46:24Z,OWNER,Thanks for the explanation! Please do start a pull request. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694, https://github.com/simonw/datasette/issues/30#issuecomment-344352573,https://api.github.com/repos/simonw/datasette/issues/30,344352573,MDEyOklzc3VlQ29tbWVudDM0NDM1MjU3Mw==,9599,2017-11-14T18:29:01Z,2017-11-14T18:29:01Z,OWNER,This is a dupe of #85 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268078453, https://github.com/simonw/datasette/issues/93#issuecomment-344409906,https://api.github.com/repos/simonw/datasette/issues/93,344409906,MDEyOklzc3VlQ29tbWVudDM0NDQwOTkwNg==,9599,2017-11-14T21:47:02Z,2017-11-14T21:47:02Z,OWNER,"Even without bundling in the database file itself, I'd love to have a standalone binary version of the core `datasette` CLI utility. 
I think Sanic may have some complex dependencies, but I've never tried pyinstaller so I don't know how easy or hard it would be to get this working.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952, https://github.com/simonw/datasette/issues/93#issuecomment-344415756,https://api.github.com/repos/simonw/datasette/issues/93,344415756,MDEyOklzc3VlQ29tbWVudDM0NDQxNTc1Ng==,9599,2017-11-14T22:09:13Z,2017-11-14T22:09:13Z,OWNER,Looks like we'd need to use this recipe: https://github.com/pyinstaller/pyinstaller/wiki/Recipe-Setuptools-Entry-Point,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952, https://github.com/simonw/datasette/issues/93#issuecomment-344426887,https://api.github.com/repos/simonw/datasette/issues/93,344426887,MDEyOklzc3VlQ29tbWVudDM0NDQyNjg4Nw==,9599,2017-11-14T22:51:46Z,2017-11-14T22:51:46Z,OWNER,"That didn't quite work for me. 
It built me a `dist/datasette` executable but when I try to run it I get an error: $ pwd /Users/simonw/Dropbox/Development/datasette $ source venv/bin/activate $ pyinstaller -F --add-data datasette/templates:datasette/templates --add-data datasette/static:datasette/static /Users/simonw/Dropbox/Development/datasette/venv/bin/datasette $ dist/datasette --help Traceback (most recent call last): File ""datasette"", line 11, in File ""site-packages/pkg_resources/__init__.py"", line 572, in load_entry_point File ""site-packages/pkg_resources/__init__.py"", line 564, in get_distribution File ""site-packages/pkg_resources/__init__.py"", line 436, in get_provider File ""site-packages/pkg_resources/__init__.py"", line 984, in require File ""site-packages/pkg_resources/__init__.py"", line 870, in resolve pkg_resources.DistributionNotFound: The 'datasette' distribution was not found and is required by the application [99117] Failed to execute script datasette ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952, https://github.com/simonw/datasette/issues/88#issuecomment-344427448,https://api.github.com/repos/simonw/datasette/issues/88,344427448,MDEyOklzc3VlQ29tbWVudDM0NDQyNzQ0OA==,9599,2017-11-14T22:54:06Z,2017-11-14T22:54:06Z,OWNER,Hooray! 
First dataset that wasn't deployed by me :) https://github.com/simonw/datasette/wiki/Datasettes,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273775212, https://github.com/simonw/datasette/issues/88#issuecomment-344427560,https://api.github.com/repos/simonw/datasette/issues/88,344427560,MDEyOklzc3VlQ29tbWVudDM0NDQyNzU2MA==,9599,2017-11-14T22:54:33Z,2017-11-14T22:54:33Z,OWNER,I'm getting an internal server error on http://run.plnkr.co/preview/cj9zlf1qc0003414y90ajkwpk/ at the moment,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273775212, https://github.com/simonw/datasette/issues/14#issuecomment-344438724,https://api.github.com/repos/simonw/datasette/issues/14,344438724,MDEyOklzc3VlQ29tbWVudDM0NDQzODcyNA==,9599,2017-11-14T23:47:54Z,2017-11-14T23:47:54Z,OWNER,"Plugins should be able to interact with the build step. This would give plugins an opportunity to modify the SQL databases and help prepare them for serving - for example, a full-text search plugin might create additional FTS tables, or a mapping plugin might pre-calculate a bunch of geohashes for tables that have latitude/longitude values. Plugins could really take advantage of the immutable nature of the dataset here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940, https://github.com/simonw/datasette/issues/93#issuecomment-344440658,https://api.github.com/repos/simonw/datasette/issues/93,344440658,MDEyOklzc3VlQ29tbWVudDM0NDQ0MDY1OA==,9599,2017-11-14T23:58:07Z,2017-11-14T23:58:07Z,OWNER,It's a shame pyinstaller can't act as a cross-compiler - so I don't think I can get Travis CI to build packages. 
But it's fantastic that it's possible to turn the tool into a standalone executable!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273944952, https://github.com/simonw/datasette/issues/85#issuecomment-344452063,https://api.github.com/repos/simonw/datasette/issues/85,344452063,MDEyOklzc3VlQ29tbWVudDM0NDQ1MjA2Mw==,9599,2017-11-15T01:03:03Z,2017-11-15T01:03:03Z,OWNER,"This can work in reverse too. If you view the row page for something that has foreign keys against it, we can show you “53 items in TABLE link to this” and provide a link to view them all. That count query could be prohibitively expensive. To counter that, we could run the count query via Ajax and set a strict time limit on it. See #95","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273678673, https://github.com/simonw/datasette/issues/85#issuecomment-344452326,https://api.github.com/repos/simonw/datasette/issues/85,344452326,MDEyOklzc3VlQ29tbWVudDM0NDQ1MjMyNg==,9599,2017-11-15T01:04:38Z,2017-11-15T01:04:38Z,OWNER,This will work well in conjunction with https://github.com/simonw/csvs-to-sqlite/issues/2,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273678673, https://github.com/simonw/datasette/pull/89#issuecomment-344462277,https://api.github.com/repos/simonw/datasette/issues/89,344462277,MDEyOklzc3VlQ29tbWVudDM0NDQ2MjI3Nw==,9599,2017-11-15T02:02:52Z,2017-11-15T02:02:52Z,OWNER,"This is exactly what I was after, thanks!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273816720, 
https://github.com/simonw/datasette/issues/13#issuecomment-344462608,https://api.github.com/repos/simonw/datasette/issues/13,344462608,MDEyOklzc3VlQ29tbWVudDM0NDQ2MjYwOA==,9599,2017-11-15T02:04:51Z,2017-11-15T02:04:51Z,OWNER,"Fixed in https://github.com/simonw/datasette/commit/8252daa4c14d73b4b69e3f2db4576bb39d73c070 - thanks, @tomdyson!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267542338, https://github.com/simonw/datasette/issues/95#issuecomment-344463436,https://api.github.com/repos/simonw/datasette/issues/95,344463436,MDEyOklzc3VlQ29tbWVudDM0NDQ2MzQzNg==,9599,2017-11-15T02:10:10Z,2017-11-15T02:10:10Z,OWNER,"This means clients can ask questions but say ""don't bother if it takes longer than X"" - which is really handy when you're working against unknown databases that might be small or might be enormous.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273998513, https://github.com/simonw/datasette/pull/94#issuecomment-344472313,https://api.github.com/repos/simonw/datasette/issues/94,344472313,MDEyOklzc3VlQ29tbWVudDM0NDQ3MjMxMw==,9599,2017-11-15T03:08:00Z,2017-11-15T03:08:00Z,OWNER,"Works for me. I'm going to land this. Just one thing: simonw$ docker run --rm -t -i -p 9001:8001 c408e8cfbe40 datasette publish now The publish command requires ""now"" to be installed and configured Follow the instructions at https://zeit.co/now#whats-now Maybe we should have the Docker container install the ""now"" client? Not sure how much size that would add though. 
I think it's OK without for the moment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273961179, https://github.com/simonw/datasette/issues/25#issuecomment-344487639,https://api.github.com/repos/simonw/datasette/issues/25,344487639,MDEyOklzc3VlQ29tbWVudDM0NDQ4NzYzOQ==,9599,2017-11-15T05:11:11Z,2017-11-15T05:11:11Z,OWNER,"Since you can already download the database directly, I'm not going to bother with this one.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267857622, https://github.com/simonw/datasette/issues/85#issuecomment-344657040,https://api.github.com/repos/simonw/datasette/issues/85,344657040,MDEyOklzc3VlQ29tbWVudDM0NDY1NzA0MA==,9599,2017-11-15T16:56:48Z,2017-11-15T16:56:48Z,OWNER,"Since detecting foreign keys that point to a specific table is a bit expensive (you have to call a PRAGMA on every other table) I’m going to add this to the build/inspect stage. Idea: if we detect that the foreign key table only has one other column in it (id, name) AND we know that the id is the primary key, we can add an efficient lookup on the table list view and prefetch a dictionary mapping IDs to their value. Then we can feed that dictionary in as extra template context and use it to render labeled hyperlinks in the corresponding column. 
This means our build step should also cache which columns are indexed, and add a “label_column” property for tables with an obvious name column.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273678673, https://github.com/simonw/datasette/issues/90#issuecomment-344667202,https://api.github.com/repos/simonw/datasette/issues/90,344667202,MDEyOklzc3VlQ29tbWVudDM0NDY2NzIwMg==,9599,2017-11-15T17:29:38Z,2017-11-15T17:29:38Z,OWNER,@jacobian points out that a buildpack may be a better fit than a Docker container for implementing this: https://twitter.com/jacobian/status/930849058465255424,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123, https://github.com/simonw/datasette/issues/90#issuecomment-344680385,https://api.github.com/repos/simonw/datasette/issues/90,344680385,MDEyOklzc3VlQ29tbWVudDM0NDY4MDM4NQ==,9599,2017-11-15T18:14:11Z,2017-11-15T18:14:11Z,OWNER,"Maybe we don’t even need a buildpack... 
we could create a temporary directory, set up a classic heroku app with the datasette serve command in the Procfile and then git push to deploy.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123, https://github.com/simonw/datasette/issues/90#issuecomment-344686483,https://api.github.com/repos/simonw/datasette/issues/90,344686483,MDEyOklzc3VlQ29tbWVudDM0NDY4NjQ4Mw==,9599,2017-11-15T18:36:23Z,2017-11-15T18:36:23Z,OWNER,The “datasette build” command would need to run in a bin/post_compile script eg https://github.com/simonw/simonwillisonblog/blob/cloudflare-ips/bin/post_compile,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123, https://github.com/simonw/datasette/issues/90#issuecomment-344687328,https://api.github.com/repos/simonw/datasette/issues/90,344687328,MDEyOklzc3VlQ29tbWVudDM0NDY4NzMyOA==,9599,2017-11-15T18:39:14Z,2017-11-15T18:39:49Z,OWNER,"By default the command could use a temporary directory that gets cleaned up after the deploy, but we could allow users to opt in to keeping the generated directory like so: datasette publish heroku mydb.db -d ~/dev/my-heroku-app This would create the my-heroku-app folder so you can later execute further git deploys from there.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123, https://github.com/simonw/datasette/pull/107#issuecomment-344770170,https://api.github.com/repos/simonw/datasette/issues/107,344770170,MDEyOklzc3VlQ29tbWVudDM0NDc3MDE3MA==,9599,2017-11-16T00:01:00Z,2017-11-16T00:01:22Z,OWNER,"It is - but I think this will break on this line since it expects two format string parameters: https://github.com/simonw/datasette/blob/f45ca30f91b92ac68adaba893bf034f13ec61ced/datasette/utils.py#L61 Needs unit tests too, which live here: 
https://github.com/simonw/datasette/blob/f45ca30f91b92ac68adaba893bf034f13ec61ced/tests/test_utils.py#L49","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274343647, https://github.com/simonw/datasette/issues/100#issuecomment-344771130,https://api.github.com/repos/simonw/datasette/issues/100,344771130,MDEyOklzc3VlQ29tbWVudDM0NDc3MTEzMA==,9599,2017-11-16T00:06:00Z,2017-11-16T00:06:00Z,OWNER,"Aha... it looks like this is a Jinja version problem: https://github.com/ansible/ansible/issues/25381#issuecomment-306492389 Datasette depends on sanic-jinja2 - and that doesn't depend on a particular jinja2 version: https://github.com/lixxu/sanic-jinja2/blob/7e9520850d8c6bb66faf43b7f252593d7efe3452/setup.py#L22 So if you have an older version of Jinja installed, stuff breaks.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274160723, https://github.com/simonw/datasette/issues/96#issuecomment-344786528,https://api.github.com/repos/simonw/datasette/issues/96,344786528,MDEyOklzc3VlQ29tbWVudDM0NDc4NjUyOA==,9599,2017-11-16T01:32:41Z,2017-11-16T01:32:41Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274001453, https://github.com/simonw/datasette/issues/96#issuecomment-344788435,https://api.github.com/repos/simonw/datasette/issues/96,344788435,MDEyOklzc3VlQ29tbWVudDM0NDc4ODQzNQ==,9599,2017-11-16T01:43:52Z,2017-11-16T01:43:52Z,OWNER,Demo: 
https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+name%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Animal+name%22%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalName%22%29+as+name+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+AnimalBreed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5BMitcham-dog-registrations-2015%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_NAME%22%29+as+name+from+%5Bburnside-dog-registrations-2015%5D+where+DOG_BREED+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Animal_Name%22%29+as+name+from+%5Bcity-of-playford-2015-dog-registration%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where%22Breed+Description%22+like+%3Abreed%0D%0A%0D%0A%29+group+by+name+order+by+n+desc%3B&breed=chihuahua,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274001453, https://github.com/simonw/datasette/issues/96#issuecomment-344788763,https://api.github.com/repos/simonw/datasette/issues/96,344788763,MDEyOklzc3VlQ29tbWVudDM0NDc4ODc2Mw==,9599,2017-11-16T01:45:51Z,2017-11-16T01:45:51Z,OWNER,Another demo - this time it lets you search by name and see the most popular breeds with that name: 
https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+breed%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Breed%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+%22Animal+name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalBreed%22%29+as+breed+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+%22AnimalName%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed%22%29+as+breed+from+%5BMitcham-dog-registrations-2015%5D+where+%22Animal+Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_BREED%22%29+as+breed+from+%5Bburnside-dog-registrations-2015%5D+where+%22DOG_NAME%22+like+%3Aname%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5Bcity-of-playford-2015-dog-registration%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed+Description%22%29+as+breed+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where+%22Animal+Name%22+like+%3Aname%0D%0A%0D%0A%29+group+by+breed+order+by+n+desc%3B&name=rex,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274001453, https://github.com/simonw/datasette/issues/46#issuecomment-344975156,https://api.github.com/repos/simonw/datasette/issues/46,344975156,MDEyOklzc3VlQ29tbWVudDM0NDk3NTE1Ng==,9599,2017-11-16T16:19:44Z,2017-11-16T16:19:44Z,OWNER,"That's fantastic! Thank you very much for that. 
Do you know if it's possible to view the Dockerfile used by https://hub.docker.com/r/prolocutor/python3-sqlite-ext/ ?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468, https://github.com/simonw/datasette/issues/46#issuecomment-344976104,https://api.github.com/repos/simonw/datasette/issues/46,344976104,MDEyOklzc3VlQ29tbWVudDM0NDk3NjEwNA==,9599,2017-11-16T16:22:45Z,2017-11-16T16:22:45Z,OWNER,Found a relevant Dockerfile on Reddit: https://www.reddit.com/r/Python/comments/5unkb3/install_sqlite3_on_python_3/ddzdz2b/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468, https://github.com/simonw/datasette/issues/46#issuecomment-344976882,https://api.github.com/repos/simonw/datasette/issues/46,344976882,MDEyOklzc3VlQ29tbWVudDM0NDk3Njg4Mg==,9599,2017-11-16T16:25:07Z,2017-11-16T16:25:07Z,OWNER,Maybe part of the solution here is to add a `--load-extension` argument to `datasette` - so when you run the command you can specify SQLite extensions that should be loaded. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468, https://github.com/simonw/datasette/issues/109#issuecomment-344986423,https://api.github.com/repos/simonw/datasette/issues/109,344986423,MDEyOklzc3VlQ29tbWVudDM0NDk4NjQyMw==,9599,2017-11-16T16:53:26Z,2017-11-16T16:53:26Z,OWNER,http://datasette.readthedocs.io/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274378301, https://github.com/simonw/datasette/issues/110#issuecomment-344988263,https://api.github.com/repos/simonw/datasette/issues/110,344988263,MDEyOklzc3VlQ29tbWVudDM0NDk4ODI2Mw==,9599,2017-11-16T16:58:48Z,2017-11-16T16:58:48Z,OWNER,"Here's how I tested this. 
First I downloaded and started a docker container using https://hub.docker.com/r/prolocutor/python3-sqlite-ext - which includes the compiled spatialite extension. This downloads it, then starts a shell in that container. docker run -it -p 8018:8018 prolocutor/python3-sqlite-ext:3.5.1-spatialite /bin/sh Installed a pre-release build of datasette which includes the new `--load-extension` option. pip install https://static.simonwillison.net/static/2017/datasette-0.13-py3-none-any.whl Now grab a sample database from https://www.gaia-gis.it/spatialite-2.3.1/resources.html - and unzip and rename it (datasette doesn't yet like databases with dots in their filename): wget http://www.gaia-gis.it/spatialite-2.3.1/test-2.3.sqlite.gz gunzip test-2.3.sqlite.gz mv test-2.3.sqlite test23.sqlite Now start datasette on port 8018 (the port I exposed earlier) with the extension loaded: datasette test23.sqlite -p 8018 -h 0.0.0.0 --load-extension /usr/local/lib/mod_spatialite.so Now I can confirm that it worked: http://localhost:8018/test23-c88bc35?sql=select+ST_AsText%28Geometry%29+from+HighWays+limit+1 If I run datasette without `--load-extension` I get this: datasette test23.sqlite -p 8018 -h 0.0.0.0 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274578142, https://github.com/simonw/datasette/issues/46#issuecomment-344988591,https://api.github.com/repos/simonw/datasette/issues/46,344988591,MDEyOklzc3VlQ29tbWVudDM0NDk4ODU5MQ==,9599,2017-11-16T16:59:51Z,2017-11-16T16:59:51Z,OWNER,"OK, `--load-extension` is now a supported command line option - see #110 which includes my notes on how I manually tested it using the `prolocutor/python3-sqlite-ext` Docker image.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468, 
https://github.com/simonw/datasette/issues/46#issuecomment-344989340,https://api.github.com/repos/simonw/datasette/issues/46,344989340,MDEyOklzc3VlQ29tbWVudDM0NDk4OTM0MA==,9599,2017-11-16T17:02:07Z,2017-11-16T17:02:07Z,OWNER,The fact that `prolocutor/python3-sqlite-ext` doesn't provide a visible Dockerfile and hasn't been updated in two years makes me hesitant to bake it into datasette itself. I'd rather put together a Dockerfile that enables the necessary extensions and can live in the datasette repository itself.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468, https://github.com/simonw/datasette/issues/46#issuecomment-344995571,https://api.github.com/repos/simonw/datasette/issues/46,344995571,MDEyOklzc3VlQ29tbWVudDM0NDk5NTU3MQ==,9599,2017-11-16T17:22:32Z,2017-11-16T17:22:32Z,OWNER,The JSON extension would be very worthwhile too: https://www.sqlite.org/json1.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",271301468, https://github.com/simonw/datasette/issues/111#issuecomment-345013127,https://api.github.com/repos/simonw/datasette/issues/111,345013127,MDEyOklzc3VlQ29tbWVudDM0NTAxMzEyNw==,9599,2017-11-16T18:23:56Z,2017-11-16T18:23:56Z,OWNER,Having this as a global option may not make sense when publishing multiple databases. We can revisit that when we implement per-database and per-table metadata.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274615452,