{"html_url": "https://github.com/simonw/datasette/issues/26#issuecomment-343645249", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/26", "id": 343645249, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzY0NTI0OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-11T06:48:59Z", "updated_at": "2017-11-11T06:48:59Z", "author_association": "OWNER", "body": "Doing this works:\r\n\r\n import os\r\n os.link('/tmp/databases/northwind.db', '/tmp/tmp-blah/northwind.db')\r\n\r\nThat creates a link in tmp-blah - and then when I delete that entire directory like so:\r\n\r\n import shutil\r\n shutil.rmtree('/tmp/tmp-blah')\r\n\r\nThe original database is not deleted, just the link.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 267861210, "label": "Command line tool for uploading one or more DBs to Now"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/26#issuecomment-343645327", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/26", "id": 343645327, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzY0NTMyNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-11T06:51:16Z", "updated_at": "2017-11-11T06:51:16Z", "author_association": "OWNER", "body": "I can create the temporary directory like so:\r\n\r\n import tempfile\r\n t = tempfile.TemporaryDirectory()\r\n t\r\n \r\n t.name\r\n '/var/folders/w9/0xm39tk94ng9h52g06z4b54c0000gp/T/tmpkym70wlp'\r\n\r\nAnd then to delete it all:\r\n\r\n t.cleanup()\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 267861210, "label": "Command line tool for uploading one or more DBs to Now"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/40#issuecomment-343646740", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/40", "id": 343646740, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzY0Njc0MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-11T07:27:33Z", "updated_at": "2017-11-11T07:27:33Z", "author_association": "OWNER", "body": "I'm happy with this now that I've implemented the publish command in #26 ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 268470572, "label": "Implement command-line tool interface"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/47#issuecomment-343647102", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/47", "id": 343647102, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzY0NzEwMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-11T07:36:00Z", "updated_at": "2017-11-11T07:36:00Z", "author_association": "OWNER", "body": "http://2016.padjo.org/tutorials/data-primer-census-acs1-demographics/ has a sqlite database: http://2016.padjo.org/files/data/starterpack/census-acs-1year/acs-1-year-2015.sqlite\r\n\r\nI tested this by deploying it here: https://datasette-fewuggrvwr.now.sh/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 271831408, "label": "Create neat example database"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/simonw/datasette/issues/16#issuecomment-343647300", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/16", "id": 343647300, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzY0NzMwMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-11T07:41:19Z", "updated_at": "2017-11-11T07:53:09Z", "author_association": "OWNER", "body": "Still needed:\r\n\r\n- [ ] A link to the homepage from some kind of navigation bar in the header\r\n- [ ] link to github.com/simonw/datasette in the footer\r\n- [ ] Slightly better titles (maybe ditch the visited link colours for titles only? should keep those for primary key links)\r\n- [ ] Links to the .json and .jsono versions of every view", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 267726219, "label": "Default HTML/CSS needs to look reasonable and be responsive"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/14#issuecomment-343675165", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/14", "id": 343675165, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzY3NTE2NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-11T16:07:10Z", "updated_at": "2017-11-11T16:07:10Z", "author_association": "OWNER", "body": "The plugin system can also allow alternative providers for the `publish` command - e.g. maybe hook up hyper.sh as an option for publishing containers.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 267707940, "label": "Datasette Plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/59#issuecomment-343676574", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/59", "id": 343676574, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzY3NjU3NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-11T16:29:48Z", "updated_at": "2017-11-11T16:29:48Z", "author_association": "OWNER", "body": "See also #14", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273157085, "label": "datasette publish hyper"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/60#issuecomment-343683566", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/60", "id": 343683566, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzY4MzU2Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-11T18:12:24Z", "updated_at": "2017-11-11T18:12:24Z", "author_association": "OWNER", "body": "I\u2019m going to solve this by making it an optional argument you can pass to the serve command. 
Then the Dockerfile can still build and use it but it won\u2019t interfere with tests or dev.\r\n\r\nIf argument is not passed, we will calculate hashes on startup and calculate table row counts on demand.\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273163905, "label": "Rethink how metadata is generated and stored"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/47#issuecomment-343690060", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/47", "id": 343690060, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzY5MDA2MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-11T19:56:08Z", "updated_at": "2017-11-11T19:56:08Z", "author_association": "OWNER", "body": " \"parlgov-development.db\": {\r\n \"url\": \"http://www.parlgov.org/\"\r\n },\r\n \"nhsadmin.sqlite\": {\r\n \"url\": \"https://github.com/psychemedia/openHealthDataDoodles\"\r\n }", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 271831408, "label": "Create neat example database"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/16#issuecomment-343691342", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/16", "id": 343691342, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzY5MTM0Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-11T20:19:07Z", "updated_at": "2017-11-11T20:19:07Z", "author_association": "OWNER", "body": "Closing this, opening a fresh ticket for the navigation stuff.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 267726219, "label": "Default HTML/CSS needs to look reasonable and be responsive"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/63#issuecomment-343697291", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/63", "id": 343697291, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzY5NzI5MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-11T22:05:06Z", "updated_at": "2017-11-11T22:11:49Z", "author_association": "OWNER", "body": "I'm going to bundle sql and sql_params together into a query nested object like this:\r\n\r\n {\r\n \"query\": {\r\n \"sql\": \"select ...\",\r\n \"params\": {\r\n \"p0\": \"blah\"\r\n }\r\n }\r\n }", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273174447, "label": "Review design of JSON output"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/50#issuecomment-343698214", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/50", "id": 343698214, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzY5ODIxNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-11T22:23:21Z", "updated_at": "2017-11-11T22:23:21Z", "author_association": "OWNER", "body": "I'm closing #50 - more tests will be added in the future, but the framework is neatly in place for them now.\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 272694136, 
"label": "Unit tests against application itself"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/53#issuecomment-343699115", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/53", "id": 343699115, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzY5OTExNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-11T22:41:38Z", "updated_at": "2017-11-11T22:41:38Z", "author_association": "OWNER", "body": "This needs to incorporate a sensible way of presenting custom SQL query results too. And let's get a textarea in there for executing SQL while we're at it.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273054652, "label": "Implement a better database index page"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/47#issuecomment-343705966", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/47", "id": 343705966, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzcwNTk2Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-12T01:00:20Z", "updated_at": "2017-11-12T01:00:20Z", "author_association": "OWNER", "body": "https://github.com/fivethirtyeight/data has a ton of CSVs", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 271831408, "label": "Create neat example database"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/53#issuecomment-343707624", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/53", "id": 343707624, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzcwNzYyNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-12T01:47:45Z", "updated_at": "2017-11-12T01:47:45Z", "author_association": "OWNER", "body": "Split the SQL thing out into #65 ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273054652, "label": "Implement a better database index page"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/53#issuecomment-343707676", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/53", "id": 343707676, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzcwNzY3Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-12T01:49:07Z", "updated_at": "2017-11-12T01:49:07Z", "author_association": "OWNER", "body": "Here's the new design:\r\n\r\n\"parlgov-development\"\r\n\r\nAlso lists views at the bottom (refs #54):\r\n\r\n\"parlgov-development\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273054652, "label": "Implement a better database index page"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/42#issuecomment-343708447", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/42", "id": 343708447, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzcwODQ0Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-12T02:12:15Z", "updated_at": "2017-11-12T02:12:15Z", "author_association": "OWNER", "body": "I ditched the metadata file concept.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 
0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 268591332, "label": "Homepage UI for editing metadata file"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/65#issuecomment-343709217", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/65", "id": 343709217, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzcwOTIxNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-12T02:36:37Z", "updated_at": "2017-11-12T02:36:37Z", "author_association": "OWNER", "body": "\"nhsadmin\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273191608, "label": "Re-implement ?sql= mode"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/25#issuecomment-343715915", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/25", "id": 343715915, "node_id": "MDEyOklzc3VlQ29tbWVudDM0MzcxNTkxNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-12T06:08:28Z", "updated_at": "2017-11-12T06:08:28Z", "author_association": "OWNER", "body": " con = sqlite3.connect('existing_db.db')\r\n with open('dump.sql', 'w') as f:\r\n for line in con.iterdump():\r\n f.write('%s\\n' % line)\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 267857622, "label": "Endpoint that returns SQL ready to be piped into DB"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/42#issuecomment-343752404", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/42", "id": 343752404, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc1MjQwNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-12T17:20:10Z", "updated_at": "2017-11-12T17:20:10Z", "author_association": "OWNER", "body": "Re-opening this - I've decided to bring back this concept, see #68 ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 268591332, "label": "Homepage UI for editing metadata file"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/69#issuecomment-343752579", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/69", "id": 343752579, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc1MjU3OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-12T17:22:39Z", "updated_at": "2017-11-12T17:22:39Z", "author_association": "OWNER", "body": "By default I'll allow LIMIT and OFFSET up to a maximum of X (where X is let's say 50,000 to start with, but can be custom configured to a larger number or set to None for no limit).", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273248366, "label": "Enforce pagination (or at least limits) for arbitrary custom SQL"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/66#issuecomment-343752683", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/66", "id": 343752683, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc1MjY4Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-12T17:24:05Z", "updated_at": 
"2017-11-12T17:24:21Z", "author_association": "OWNER", "body": "Maybe SQL views should have their own Sanic view class (`ViewView` is kinda funny), subclassed from `TableView`?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273191806, "label": "Show table SQL on table page"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/68#issuecomment-343753999", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/68", "id": 343753999, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc1Mzk5OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-12T17:45:21Z", "updated_at": "2017-11-12T19:38:33Z", "author_association": "OWNER", "body": "For initial launch, I could just support this as some optional command line arguments you pass to the publish command:\r\n\r\n datasette publish data.db --title=\"Title\" --source=\"url\"", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273247186, "label": "Support for title/source/license metadata"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/68#issuecomment-343754058", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/68", "id": 343754058, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc1NDA1OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-12T17:46:13Z", "updated_at": "2017-11-12T17:46:13Z", "author_association": "OWNER", "body": "I\u2019m going to store this stuff in a file called metadata.json and move the existing automatically generated metadata to a file called build.json", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273247186, "label": "Support for title/source/license metadata"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/57#issuecomment-343769692", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/57", "id": 343769692, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc2OTY5Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-12T21:32:36Z", "updated_at": "2017-11-12T21:32:36Z", "author_association": "OWNER", "body": "I have created a Docker Hub public repository for this: https://hub.docker.com/r/simonwillison/datasette/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273127694, "label": "Ship a Docker image of the whole thing"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/69#issuecomment-343780039", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/69", "id": 343780039, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc4MDAzOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T00:05:27Z", "updated_at": "2017-11-13T00:05:27Z", "author_association": "OWNER", "body": "I think the only safe way to do this is using SQLite `.fetchmany(1000)` - I can't guarantee that the user has not entered SQL that will outfox a limit in some way. 
So instead of attempting to edit their SQL, I'll always return 1001 records and let them know if they went over 1000 or not.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273248366, "label": "Enforce pagination (or at least limits) for arbitrary custom SQL"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/71#issuecomment-343780141", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/71", "id": 343780141, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc4MDE0MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T00:06:52Z", "updated_at": "2017-11-13T00:06:52Z", "author_association": "OWNER", "body": "I've registered datasettes.com as a domain name for doing this. Now setting it up so Cloudflare and Now can serve content from it.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273278840, "label": "Set up some example datasets on a Cloudflare-backed domain"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/71#issuecomment-343780539", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/71", "id": 343780539, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc4MDUzOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T00:13:29Z", "updated_at": "2017-11-13T00:19:46Z", "author_association": "OWNER", "body": "https://zeit.co/docs/features/dns is docs\r\n\r\n now domain add -e datasettes.com\r\n\r\nI had to set up a custom TXT record on `_now.datasettes.com` to get this to work.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273278840, "label": "Set up some example datasets on a Cloudflare-backed domain"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/71#issuecomment-343780671", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/71", "id": 343780671, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc4MDY3MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T00:15:21Z", "updated_at": "2017-11-13T00:17:37Z", "author_association": "OWNER", "body": "- [x] Redirect https://datasettes.com/ and https://www.datasettes.com/ to https://github.com/simonw/datasette", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273278840, "label": "Set up some example datasets on a Cloudflare-backed domain"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/71#issuecomment-343780814", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/71", "id": 343780814, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc4MDgxNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T00:17:50Z", "updated_at": "2017-11-13T00:18:19Z", "author_association": "OWNER", "body": "Achieved those redirects using Cloudflare \"page rules\": https://www.cloudflare.com/a/page-rules/datasettes.com", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273278840, "label": "Set up some example datasets 
on a Cloudflare-backed domain"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/71#issuecomment-343781030", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/71", "id": 343781030, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc4MTAzMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T00:21:05Z", "updated_at": "2017-11-13T02:09:32Z", "author_association": "OWNER", "body": "- [x] Have `now domain add -e datasettes.com` run without errors (hopefully just a matter of waiting for the DNS to update)\r\n- [x] Alias an example dataset hosted on Now on a datasettes.com subdomain\r\n- [x] Confirm that HTTP caching and HTTP/2 redirect pushing works as expected - this may require another page rule", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273278840, "label": "Set up some example datasets on a Cloudflare-backed domain"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/71#issuecomment-343788581", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/71", "id": 343788581, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc4ODU4MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T01:48:17Z", "updated_at": "2017-11-13T01:48:17Z", "author_association": "OWNER", "body": "I had to add a rule like this to get letsencrypt certificates on now.sh working: https://github.com/zeit/now-cli/issues/188#issuecomment-270105052\r\n\r\n\"page_rules__datasettes_com___cloudflare_-_web_performance___security\"\r\n\r\nI also have to flip this switch off every time I want to add a new alias:\r\n\r\n\"crypto__datasettes_com___cloudflare_-_web_performance___security\"\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273278840, "label": "Set up some example datasets on a Cloudflare-backed domain"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/71#issuecomment-343788780", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/71", "id": 343788780, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc4ODc4MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T01:50:01Z", "updated_at": "2017-11-13T01:50:01Z", "author_association": "OWNER", "body": "Added another page rule in order to get Cloudflare to always obey cache headers sent by the server:\r\n\r\n\"page_rules__datasettes_com___cloudflare_-_web_performance___security\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273278840, "label": "Set up some example datasets on a Cloudflare-backed domain"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/71#issuecomment-343788817", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/71", "id": 343788817, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc4ODgxNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T01:50:27Z", "updated_at": "2017-11-13T01:50:27Z", "author_association": "OWNER", "body": "https://fivethirtyeight.datasettes.com/ is now up and running.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, 
\"eyes\": 0}", "issue": {"value": 273278840, "label": "Set up some example datasets on a Cloudflare-backed domain"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/71#issuecomment-343789162", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/71", "id": 343789162, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc4OTE2Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T01:53:29Z", "updated_at": "2017-11-13T01:53:29Z", "author_association": "OWNER", "body": "```\r\n$ curl -i 'https://fivethirtyeight.datasettes.com/fivethirtyeight-75d605c/obama-commutations%2Fobama_commutations.csv.jsono'\r\nHTTP/1.1 200 OK\r\nDate: Mon, 13 Nov 2017 01:50:57 GMT\r\nContent-Type: application/json\r\nTransfer-Encoding: chunked\r\nConnection: keep-alive\r\nSet-Cookie: __cfduid=de836090f3e12a60579cc7a1696cf0d9e1510537857; expires=Tue, 13-Nov-18 01:50:57 GMT; path=/; domain=.datasettes.com; HttpOnly; Secure\r\nAccess-Control-Allow-Origin: *\r\nCache-Control: public, max-age=31536000\r\nX-Now-Region: now-sfo\r\nCF-Cache-Status: HIT\r\nExpires: Tue, 13 Nov 2018 01:50:57 GMT\r\nServer: cloudflare-nginx\r\nCF-RAY: 3bce154a6d9293b4-SJC\r\n\r\n{\"database\": \"fivethirtyeight\", \"table\": \"obama-commutations/obama_commutations.csv\"...```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273278840, "label": "Set up some example datasets on a Cloudflare-backed domain"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/71#issuecomment-343790984", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/71", "id": 343790984, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc5MDk4NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T02:09:34Z", "updated_at": "2017-11-13T02:09:34Z", "author_association": "OWNER", "body": "HTTP/2 push totally worked on the redirect!\r\n\r\n fetch('https://fivethirtyeight.datasettes.com/fivethirtyeight/riddler-pick-lowest%2Flow_numbers.csv.jsono').then(r => r.json()).then(console.log)\r\n\r\n\"eventbrite_api___v3_destination_search_\"\r\n\r\nMeanwhile, in the network pane...\r\n\r\n\"eventbrite_api___v3_destination_search_\"\r\n\r\n\"eventbrite_api___v3_destination_search_\"\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273278840, "label": "Set up some example datasets on a Cloudflare-backed domain"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/68#issuecomment-343791348", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/68", "id": 343791348, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzc5MTM0OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T02:12:58Z", "updated_at": "2017-11-13T02:12:58Z", "author_association": "OWNER", "body": "I should use this on https://fivethirtyeight.datasettes.com/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273247186, "label": "Support for title/source/license metadata"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/73#issuecomment-343801392", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/73", "id": 343801392, 
"node_id": "MDEyOklzc3VlQ29tbWVudDM0MzgwMTM5Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T03:36:47Z", "updated_at": "2017-11-13T03:36:47Z", "author_association": "OWNER", "body": "While I\u2019m at it, let\u2019s allow people to opt out of HTTP/2 push with a ?_nopush=1 argument too - in case they decide they don\u2019t want to receive large 302 responses.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273296178, "label": "_nocache=1 query string option for use with sort-by-random"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/68#issuecomment-343951751", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/68", "id": 343951751, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzk1MTc1MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T15:21:04Z", "updated_at": "2017-11-13T15:21:04Z", "author_association": "OWNER", "body": "For first version, I'm just supporting title, source and license information at the database level.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273247186, "label": "Support for title/source/license metadata"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/67#issuecomment-343961784", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/67", "id": 343961784, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzk2MTc4NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T15:50:50Z", "updated_at": "2017-11-13T15:50:50Z", "author_association": "OWNER", "body": "`datasette package ...` - same arguments as `datasette publish`. Creates Docker container in your local repo, optionally tagged with `--tag`", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273192789, "label": "Command that builds a local docker container"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/67#issuecomment-343967020", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/67", "id": 343967020, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Mzk2NzAyMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T16:06:10Z", "updated_at": "2017-11-13T16:06:10Z", "author_association": "OWNER", "body": "http://odewahn.github.io/docker-jumpstart/example.html is helpful", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273192789, "label": "Command that builds a local docker container"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/75#issuecomment-344000982", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/75", "id": 344000982, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDAwMDk4Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T17:50:27Z", "updated_at": "2017-11-13T17:50:27Z", "author_association": "OWNER", "body": "This is necessary because one of the fun things to do with this tool is run it locally, e.g.:\r\n\r\n datasette ~/Library/Application\\ Support/Google/Chrome/Default/History -p 8003\r\n\r\nBUT... 
if we enable CORS by default, an evil site could try sniffing for localhost:8003 and attempt to steal data.\r\n\r\nSo we'll enable the CORS headers only if `--cors` is provided to the command, and then use that command in the default Dockerfile.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273509159, "label": "Add --cors argument to serve"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/51#issuecomment-344017088", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/51", "id": 344017088, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDAxNzA4OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T18:44:23Z", "updated_at": "2017-11-13T18:44:23Z", "author_association": "OWNER", "body": "Implemented in https://github.com/simonw/datasette/commit/e838bd743d31358b362875854a0ac5e78047727f", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 272735257, "label": "Make a proper README"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/74#issuecomment-344018680", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/74", "id": 344018680, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDAxODY4MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T18:49:58Z", "updated_at": "2017-11-13T18:49:58Z", "author_association": "OWNER", "body": "Turns out it does this already: https://github.com/simonw/datasette/blob/6b3b05b6db0d2a7b7cec8b8dbb4ddc5e12a376b2/datasette/app.py#L96-L107", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273296684, "label": "Send a 302 redirect to the new hash for hits to old hashes"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/69#issuecomment-344019631", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/69", "id": 344019631, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDAxOTYzMQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T18:53:13Z", "updated_at": "2017-11-13T18:53:13Z", "author_association": "OWNER", "body": "I'm going with a page size of 100 and a max limit of 1000", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273248366, "label": "Enforce pagination (or at least limits) for arbitrary custom SQL"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/69#issuecomment-344048656", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/69", "id": 344048656, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDA0ODY1Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T20:32:47Z", "updated_at": "2017-11-13T20:32:47Z", "author_association": "OWNER", "body": "\"ak\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273248366, "label": "Enforce pagination (or at least limits) for arbitrary custom SQL"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/55#issuecomment-344060070", 
"issue_url": "https://api.github.com/repos/simonw/datasette/issues/55", "id": 344060070, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDA2MDA3MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T21:14:13Z", "updated_at": "2017-11-13T21:14:13Z", "author_association": "OWNER", "body": "I'm going to add some extra metadata to setup.py and then tag this as version 0.8:\r\n\r\n git tag 0.8\r\n git push --tags\r\n\r\nThen to ship to PyPI:\r\n\r\n python setup.py bdist_wheel\r\n twine register dist/datasette-0.8-py3-none-any.whl\r\n twine upload dist/datasette-0.8-py3-none-any.whl\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273127117, "label": "Ship first version to PyPI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/55#issuecomment-344061762", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/55", "id": 344061762, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDA2MTc2Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T21:19:43Z", "updated_at": "2017-11-13T21:19:43Z", "author_association": "OWNER", "body": "And we're live! https://pypi.python.org/pypi/datasette", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273127117, "label": "Ship first version to PyPI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/80#issuecomment-344074443", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/80", "id": 344074443, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDA3NDQ0Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T22:04:54Z", "updated_at": "2017-11-13T22:05:02Z", "author_association": "OWNER", "body": "The fivethirtyeight dataset:\r\n\r\n datasette publish now --name fivethirtyeight --metadata metadata.json fivethirtyeight.db\r\n now alias https://fivethirtyeight-jyqfudvjli.now.sh fivethirtyeight.datasettes.com\r\n\r\nAnd parlgov:\r\n\r\n datasette publish now parlgov.db --name=parlgov --metadata=parlgov.json \r\n now alias https://parlgov-hqvxuhmbyh.now.sh parlgov.datasettes.com\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273569477, "label": "Deploy final versions of fivethirtyeight and parlgov datasets (with view pagination)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/80#issuecomment-344075696", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/80", "id": 344075696, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDA3NTY5Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T22:09:46Z", "updated_at": "2017-11-13T22:09:46Z", "author_association": "OWNER", "body": "Parlgov was throwing errors on one of the views, which takes longer than 1000ms to execute - so I added the ability to customize the time limit in https://github.com/simonw/datasette/commit/1e698787a4dd6df0432021a6814c446c8b69bba2\r\n\r\n datasette publish now parlgov.db --metadata parlgov.json --name parlgov --extra-options=\"--sql_time_limit_ms=3500\"\r\n now alias https://parlgov-nvkcowlixq.now.sh parlgov.datasettes.com\r\n\r\nhttps://parlgov.datasettes.com/parlgov-25f9855/view_cabinet now returns in just over 2.5s\r\n", 
"reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273569477, "label": "Deploy final versions of fivethirtyeight and parlgov datasets (with view pagination)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/81#issuecomment-344076554", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/81", "id": 344076554, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDA3NjU1NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T22:12:57Z", "updated_at": "2017-11-13T22:12:57Z", "author_association": "OWNER", "body": "Hah, I haven't even announced this yet :) Travis is upset because I'm using SQL in the tests which isn't compatible with their version of Python 3.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273595473, "label": ":fire: Removes DS_Store"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/59#issuecomment-344081876", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/59", "id": 344081876, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDA4MTg3Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-13T22:33:43Z", "updated_at": "2017-11-13T22:33:43Z", "author_association": "OWNER", "body": "The `datasette package` command introduced in 4143e3b45c16cbae5e3e3419ef479a71810e7df3 is relevant here.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273157085, "label": "datasette publish hyper"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/82#issuecomment-344118849", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/82", "id": 344118849, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDExODg0OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T01:46:10Z", "updated_at": "2017-11-14T01:46:10Z", "author_association": "OWNER", "body": "Did this: https://simonwillison.net/2017/Nov/13/datasette/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273596159, "label": "Post a blog entry announcing it to the world"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/47#issuecomment-344132481", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/47", "id": 344132481, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDEzMjQ4MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T03:08:13Z", "updated_at": "2017-11-14T03:08:13Z", "author_association": "OWNER", "body": "I ended up shipping with https://fivethirtyeight.datasettes.com/ and https://parlgov.datasettes.com/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 271831408, "label": "Create neat example database"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/59#issuecomment-344141199", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/59", "id": 344141199, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDE0MTE5OQ==", "user": {"value": 9599, "label": "simonw"}, 
"created_at": "2017-11-14T04:13:11Z", "updated_at": "2017-11-14T04:13:11Z", "author_association": "OWNER", "body": "I managed to do this manually:\r\n\r\n datasette package ~/parlgov-db/parlgov.db --metadata=parlgov.json\r\n # Output 8758ec31dda3 as the new image ID\r\n docker save 8758ec31dda3 > /tmp/my-image\r\n # I could have just piped this straight to hyper\r\n cat /tmp/my-image | hyper load\r\n # Now start the container running in hyper\r\n hyper run -d -p 80:8001 --name parlgov 8758ec31dda3\r\n # We need to assign an IP address so we can see it\r\n hyper fip allocate 1\r\n # Outputs 199.245.58.78\r\n hyper fip attach 199.245.58.78 parlgov\r\n\r\nAt this point, visiting the IP address in a browser showed the parlgov UI.\r\n\r\nTo clean up...\r\n\r\n hyper hyper fip detach parlgov\r\n hyper fip release 199.245.58.78\r\n hyper stop parlgov\r\n hyper rm parlgov\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273157085, "label": "datasette publish hyper"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/79#issuecomment-344141515", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/79", "id": 344141515, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDE0MTUxNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T04:16:01Z", "updated_at": "2017-11-14T04:16:01Z", "author_association": "OWNER", "body": "This is probably a bit too much for the README - I should get readthedocs working.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273569068, "label": "Add more detailed API documentation to the README"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/57#issuecomment-344149165", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/57", "id": 344149165, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDE0OTE2NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T05:16:34Z", "updated_at": "2017-11-14T05:17:14Z", "author_association": "OWNER", "body": "I\u2019m intrigued by this pattern: \r\n\r\nhttps://github.com/macropin/datasette/blob/147195c2fdfa2b984d8f9fc1c6cab6634970a056/Dockerfile#L8\r\n\r\nWhat\u2019s the benefit of doing that? Does it result in a smaller image size?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273127694, "label": "Ship a Docker image of the whole thing"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/46#issuecomment-344161226", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/46", "id": 344161226, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDE2MTIyNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T06:41:21Z", "updated_at": "2017-11-14T06:41:21Z", "author_association": "OWNER", "body": "Spatial extensions would be really useful too. 
https://www.gaia-gis.it/spatialite-2.1/SpatiaLite-manual.html", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 271301468, "label": "Dockerfile should build more recent SQLite with FTS5 and spatialite support"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/46#issuecomment-344161371", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/46", "id": 344161371, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDE2MTM3MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T06:42:15Z", "updated_at": "2017-11-14T06:42:15Z", "author_association": "OWNER", "body": "http://charlesleifer.com/blog/going-fast-with-sqlite-and-python/ is useful here too.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 271301468, "label": "Dockerfile should build more recent SQLite with FTS5 and spatialite support"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/46#issuecomment-344161430", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/46", "id": 344161430, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDE2MTQzMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T06:42:44Z", "updated_at": "2017-11-14T06:42:44Z", "author_association": "OWNER", "body": "Also requested on Twitter: https://twitter.com/DenubisX/status/930322813864439808", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 271301468, "label": "Dockerfile should build more recent SQLite with FTS5 and spatialite support"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/27#issuecomment-344179878", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/27", "id": 344179878, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDE3OTg3OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T08:21:22Z", "updated_at": "2017-11-14T08:21:22Z", "author_association": "OWNER", "body": "https://github.com/frappe/charts perhaps ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 267886330, "label": "Ability to plot a simple graph"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/43#issuecomment-344180866", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/43", "id": 344180866, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDE4MDg2Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T08:25:37Z", "updated_at": "2017-11-14T08:25:37Z", "author_association": "OWNER", "body": "This isn\u2019t necessary - restarting the server is fast and easy, and I\u2019ve not found myself needing this at all during development.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 268592894, "label": "While running, server should spot new db files added to its directory "}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/57#issuecomment-344185817", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/57", "id": 344185817, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDE4NTgxNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T08:46:24Z", "updated_at": "2017-11-14T08:46:24Z", "author_association": "OWNER", "body": "Thanks for the explanation! Please do start a pull request. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273127694, "label": "Ship a Docker image of the whole thing"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/30#issuecomment-344352573", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/30", "id": 344352573, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDM1MjU3Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T18:29:01Z", "updated_at": "2017-11-14T18:29:01Z", "author_association": "OWNER", "body": "This is a dupe of #85 ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 268078453, "label": "Do something neat with foreign keys"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/93#issuecomment-344409906", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/93", "id": 344409906, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQwOTkwNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T21:47:02Z", "updated_at": "2017-11-14T21:47:02Z", "author_association": "OWNER", "body": "Even without bundling in the database file itself, I'd love to have a standalone binary version of the core `datasette` CLI utility.\r\n\r\nI think Sanic may have some complex dependencies, but I've never tried pyinstaller so I don't know how easy or hard it would be to get this working.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273944952, "label": "Package as standalone binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/93#issuecomment-344415756", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/93", "id": 344415756, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQxNTc1Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T22:09:13Z", "updated_at": "2017-11-14T22:09:13Z", "author_association": "OWNER", "body": "Looks like we'd need to use this recipe: https://github.com/pyinstaller/pyinstaller/wiki/Recipe-Setuptools-Entry-Point", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273944952, "label": "Package as standalone binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/93#issuecomment-344426887", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/93", "id": 344426887, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQyNjg4Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T22:51:46Z", "updated_at": "2017-11-14T22:51:46Z", "author_association": "OWNER", "body": "That didn't quite work for me. 
It built me a `dist/datasette` executable but when I try to run it I get an error:\r\n\r\n $ pwd\r\n /Users/simonw/Dropbox/Development/datasette\r\n $ source venv/bin/activate\r\n $ pyinstaller -F --add-data datasette/templates:datasette/templates --add-data datasette/static:datasette/static /Users/simonw/Dropbox/Development/datasette/venv/bin/datasette\r\n $ dist/datasette --help\r\n Traceback (most recent call last):\r\n File \"datasette\", line 11, in \r\n File \"site-packages/pkg_resources/__init__.py\", line 572, in load_entry_point\r\n File \"site-packages/pkg_resources/__init__.py\", line 564, in get_distribution\r\n File \"site-packages/pkg_resources/__init__.py\", line 436, in get_provider\r\n File \"site-packages/pkg_resources/__init__.py\", line 984, in require\r\n File \"site-packages/pkg_resources/__init__.py\", line 870, in resolve\r\n pkg_resources.DistributionNotFound: The 'datasette' distribution was not found and is required by the application\r\n [99117] Failed to execute script datasette\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273944952, "label": "Package as standalone binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/88#issuecomment-344427448", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/88", "id": 344427448, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQyNzQ0OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T22:54:06Z", "updated_at": "2017-11-14T22:54:06Z", "author_association": "OWNER", "body": "Hooray! First dataset that wasn't deployed by me :) https://github.com/simonw/datasette/wiki/Datasettes", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273775212, "label": "Add NHS England Hospitals example to wiki"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/88#issuecomment-344427560", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/88", "id": 344427560, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQyNzU2MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T22:54:33Z", "updated_at": "2017-11-14T22:54:33Z", "author_association": "OWNER", "body": "I'm getting an internal server error on http://run.plnkr.co/preview/cj9zlf1qc0003414y90ajkwpk/ at the moment", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273775212, "label": "Add NHS England Hospitals example to wiki"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/14#issuecomment-344438724", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/14", "id": 344438724, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQzODcyNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T23:47:54Z", "updated_at": "2017-11-14T23:47:54Z", "author_association": "OWNER", "body": "Plugins should be able to interact with the build step. This would give plugins an opportunity to modify the SQL databases and help prepare them for serving - for example, a full-text search plugin might create additional FTS tables, or a mapping plugin might pre-calculate a bunch of geohashes for tables that have latitude/longitude values. 
Plugins could really take advantage of the immutable nature of the dataset here.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 267707940, "label": "Datasette Plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/93#issuecomment-344440377", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/93", "id": 344440377, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQ0MDM3Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T23:56:35Z", "updated_at": "2017-11-14T23:56:35Z", "author_association": "OWNER", "body": "It worked!\r\n\r\n $ pyinstaller -F \\\r\n --add-data /usr/local/lib/python3.5/site-packages/datasette/templates:datasette/templates \\\r\n --add-data /usr/local/lib/python3.5/site-packages/datasette/static:datasette/static \\\r\n /usr/local/bin/datasette\r\n\r\n $ file dist/datasette \r\n dist/datasette: Mach-O 64-bit executable x86_64\r\n $ dist/datasette --help\r\n Usage: datasette [OPTIONS] COMMAND [ARGS]...\r\n\r\n Datasette!\r\n\r\n Options:\r\n --help Show this message and exit.\r\n\r\n Commands:\r\n serve* Serve up specified SQLite database files with...\r\n build\r\n package Package specified SQLite files into a new...\r\n publish Publish specified SQLite database files to...\r\n", "reactions": "{\"total_count\": 3, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 3, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273944952, "label": "Package as standalone binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/93#issuecomment-344440658", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/93", "id": 344440658, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQ0MDY1OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T23:58:07Z", "updated_at": "2017-11-14T23:58:07Z", "author_association": "OWNER", "body": "It's a shame pyinstaller can't act as a cross-compiler - so I don't think I can get Travis CI to build packages. But it's fantastic that it's possible to turn the tool into a standalone executable!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273944952, "label": "Package as standalone binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/85#issuecomment-344452063", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/85", "id": 344452063, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQ1MjA2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-15T01:03:03Z", "updated_at": "2017-11-15T01:03:03Z", "author_association": "OWNER", "body": "This can work in reverse too. If you view the row page for something that has foreign keys against it, we can show you \u201c53 items in TABLE link to this\u201d and provide a link to view them all.\r\n\r\nThat count query could be prohibitively expensive. To counter that, we could run the count query via Ajax and set a strict time limit on it. 
See #95", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273678673, "label": "Detect foreign keys and use them to link HTML pages together"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/85#issuecomment-344452326", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/85", "id": 344452326, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQ1MjMyNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-15T01:04:38Z", "updated_at": "2017-11-15T01:04:38Z", "author_association": "OWNER", "body": "This will work well in conjunction with https://github.com/simonw/csvs-to-sqlite/issues/2", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273678673, "label": "Detect foreign keys and use them to link HTML pages together"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/89#issuecomment-344462277", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/89", "id": 344462277, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQ2MjI3Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-15T02:02:52Z", "updated_at": "2017-11-15T02:02:52Z", "author_association": "OWNER", "body": "This is exactly what I was after, thanks!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273816720, "label": "SQL syntax highlighting with CodeMirror"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/13#issuecomment-344462608", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/13", "id": 344462608, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQ2MjYwOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-15T02:04:51Z", "updated_at": "2017-11-15T02:04:51Z", "author_association": "OWNER", "body": "Fixed in https://github.com/simonw/datasette/commit/8252daa4c14d73b4b69e3f2db4576bb39d73c070 - thanks, @tomdyson!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 267542338, "label": "Add a syntax highlighting SQL editor"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/95#issuecomment-344463436", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/95", "id": 344463436, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQ2MzQzNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-15T02:10:10Z", "updated_at": "2017-11-15T02:10:10Z", "author_association": "OWNER", "body": "This means clients can ask questions but say \"don't bother if it takes longer than X\" - which is really handy when you're working against unknown databases that might be small or might be enormous.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273998513, "label": "Allow shorter time limits to be set using a ?_sql_time_limit_ms =20 query string limit"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/94#issuecomment-344472313", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/94", "id": 344472313, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQ3MjMxMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-15T03:08:00Z", "updated_at": "2017-11-15T03:08:00Z", "author_association": "OWNER", "body": "Works for me. I'm going to land this.\r\n\r\nJust one thing:\r\n\r\n simonw$ docker run --rm -t -i -p 9001:8001 c408e8cfbe40 datasette publish now\r\n The publish command requires \"now\" to be installed and configured \r\n Follow the instructions at https://zeit.co/now#whats-now\r\n\r\nMaybe we should have the Docker container install the \"now\" client? Not sure how much size that would add though. I think it's OK without for the moment.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273961179, "label": "Initial add simple prod ready Dockerfile refs #57"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/25#issuecomment-344487639", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/25", "id": 344487639, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQ4NzYzOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-15T05:11:11Z", "updated_at": "2017-11-15T05:11:11Z", "author_association": "OWNER", "body": "Since you can already download the database directly, I'm not going to bother with this one.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 267857622, "label": "Endpoint that returns SQL ready to be piped into DB"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/85#issuecomment-344657040", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/85", "id": 344657040, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDY1NzA0MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-15T16:56:48Z", "updated_at": "2017-11-15T16:56:48Z", "author_association": "OWNER", "body": "Since detecting foreign keys that point to a specific table is a bit expensive (you have to call a PRAGMA on every other table) I\u2019m going to add this to the build/inspect stage.\r\n\r\nIdea: if we detect that the foreign key table only has one other column in it (id, name) AND we know that the id is the primary key, we can add an efficient lookup on the table list view and prefetch a dictionary mapping IDs to their value. 
Then we can feed that dictionary in as extra template context and use it to render labeled hyperlinks in the corresponding column.\r\n\r\nThis means our build step should also cache which columns are indexed, and add a \u201clabel_column\u201d property for tables with an obvious label column.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273678673, "label": "Detect foreign keys and use them to link HTML pages together"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/90#issuecomment-344667202", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/90", "id": 344667202, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDY2NzIwMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-15T17:29:38Z", "updated_at": "2017-11-15T17:29:38Z", "author_association": "OWNER", "body": "@jacobian points out that a buildpack may be a better fit than a Docker container for implementing this: https://twitter.com/jacobian/status/930849058465255424", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273846123, "label": "datasette publish heroku"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/90#issuecomment-344680385", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/90", "id": 344680385, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDY4MDM4NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-15T18:14:11Z", "updated_at": "2017-11-15T18:14:11Z", "author_association": "OWNER", "body": "Maybe we don\u2019t even need a buildpack... 
we could create a temporary directory, set up a classic heroku app with the datasette serve command in the Procfile and then git push to deploy.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273846123, "label": "datasette publish heroku"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/90#issuecomment-344686483", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/90", "id": 344686483, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDY4NjQ4Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-15T18:36:23Z", "updated_at": "2017-11-15T18:36:23Z", "author_association": "OWNER", "body": "The \u201cdatasette build\u201d command would need to run in a bin/post_compile script, e.g. https://github.com/simonw/simonwillisonblog/blob/cloudflare-ips/bin/post_compile", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273846123, "label": "datasette publish heroku"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/90#issuecomment-344687328", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/90", "id": 344687328, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDY4NzMyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-15T18:39:14Z", "updated_at": "2017-11-15T18:39:49Z", "author_association": "OWNER", "body": "By default the command could use a temporary directory that gets cleaned up after the deploy, but we could allow users to opt in to keeping the generated directory like so:\r\n\r\n datasette publish heroku mydb.db -d ~/dev/my-heroku-app\r\n\r\nThis would create the my-heroku-app folder so you can later execute further git deploys from there.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273846123, "label": "datasette publish heroku"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/107#issuecomment-344770170", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/107", "id": 344770170, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDc3MDE3MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-16T00:01:00Z", "updated_at": "2017-11-16T00:01:22Z", "author_association": "OWNER", "body": "It is - but I think this will break on this line since it expects two format string parameters:\r\n\r\nhttps://github.com/simonw/datasette/blob/f45ca30f91b92ac68adaba893bf034f13ec61ced/datasette/utils.py#L61\r\n\r\nNeeds unit tests too, which live here: https://github.com/simonw/datasette/blob/f45ca30f91b92ac68adaba893bf034f13ec61ced/tests/test_utils.py#L49", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 274343647, "label": "add support for ?field__isnull=1"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/100#issuecomment-344771130", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/100", "id": 344771130, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDc3MTEzMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-16T00:06:00Z", "updated_at": "2017-11-16T00:06:00Z", 
"author_association": "OWNER", "body": "Aha... it looks like this is a Jinja version problem: https://github.com/ansible/ansible/issues/25381#issuecomment-306492389\r\n\r\nDatasette depends on sanic-jinja2 - and that doesn't depend on a particular jinja2 version: https://github.com/lixxu/sanic-jinja2/blob/7e9520850d8c6bb66faf43b7f252593d7efe3452/setup.py#L22\r\n\r\nSo if you have an older version of Jinja installed, stuff breaks.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 274160723, "label": "TemplateAssertionError: no filter named 'tojson'"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/96#issuecomment-344786528", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/96", "id": 344786528, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDc4NjUyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-16T01:32:41Z", "updated_at": "2017-11-16T01:32:41Z", "author_association": "OWNER", "body": "\"australian-dogs\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 274001453, "label": "UI for editing named parameters"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/96#issuecomment-344788435", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/96", "id": 344788435, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDc4ODQzNQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-16T01:43:52Z", "updated_at": "2017-11-16T01:43:52Z", "author_association": "OWNER", "body": "Demo: https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+name%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Animal+name%22%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalName%22%29+as+name+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+AnimalBreed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5BMitcham-dog-registrations-2015%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_NAME%22%29+as+name+from+%5Bburnside-dog-registrations-2015%5D+where+DOG_BREED+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Animal_Name%22%29+as+name+from+%5Bcity-of-playford-2015-dog-registration%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where%22Breed+Description%22+like+%3Abreed%0D%0A%0D%0A%29+group+by+name+order+by+n+desc%3B&breed=chihuahua", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 274001453, "label": "UI for editing named parameters"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/96#issuecomment-344788763", 
"issue_url": "https://api.github.com/repos/simonw/datasette/issues/96", "id": 344788763, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDc4ODc2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-16T01:45:51Z", "updated_at": "2017-11-16T01:45:51Z", "author_association": "OWNER", "body": "Another demo - this time it lets you search by name and see the most popular breeds with that name: https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+breed%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Breed%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+%22Animal+name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalBreed%22%29+as+breed+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+%22AnimalName%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed%22%29+as+breed+from+%5BMitcham-dog-registrations-2015%5D+where+%22Animal+Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_BREED%22%29+as+breed+from+%5Bburnside-dog-registrations-2015%5D+where+%22DOG_NAME%22+like+%3Aname%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5Bcity-of-playford-2015-dog-registration%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed+Description%22%29+as+breed+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where+%22Animal+Name%22+like+%3Aname%0D%0A%0D%0A%29+group+by+breed+order+by+n+desc%3B&name=rex", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 274001453, "label": "UI for editing named parameters"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/46#issuecomment-344975156", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/46", "id": 344975156, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDk3NTE1Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-16T16:19:44Z", "updated_at": "2017-11-16T16:19:44Z", "author_association": "OWNER", "body": "That's fantastic! Thank you very much for that. 
\r\n\r\nDo you know if it's possible to view the Dockerfile used by https://hub.docker.com/r/prolocutor/python3-sqlite-ext/ ?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 271301468, "label": "Dockerfile should build more recent SQLite with FTS5 and spatialite support"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/46#issuecomment-344976104", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/46", "id": 344976104, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDk3NjEwNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-16T16:22:45Z", "updated_at": "2017-11-16T16:22:45Z", "author_association": "OWNER", "body": "Found a relevant Dockerfile on Reddit: https://www.reddit.com/r/Python/comments/5unkb3/install_sqlite3_on_python_3/ddzdz2b/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 271301468, "label": "Dockerfile should build more recent SQLite with FTS5 and spatialite support"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/46#issuecomment-344976882", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/46", "id": 344976882, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDk3Njg4Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-16T16:25:07Z", "updated_at": "2017-11-16T16:25:07Z", "author_association": "OWNER", "body": "Maybe part of the solution here is to add a `--load-extension` argument to `datasette` - so when you run the command you can specify SQLite extensions that should be loaded. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 271301468, "label": "Dockerfile should build more recent SQLite with FTS5 and spatialite support"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/109#issuecomment-344986423", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/109", "id": 344986423, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDk4NjQyMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-16T16:53:26Z", "updated_at": "2017-11-16T16:53:26Z", "author_association": "OWNER", "body": "http://datasette.readthedocs.io/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 274378301, "label": "Set up readthedocs"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/110#issuecomment-344988263", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/110", "id": 344988263, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDk4ODI2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-16T16:58:48Z", "updated_at": "2017-11-16T16:58:48Z", "author_association": "OWNER", "body": "Here's how I tested this.\r\n\r\nFirst I downloaded and started a docker container using https://hub.docker.com/r/prolocutor/python3-sqlite-ext - which includes the compiled spatialite extension. 
This downloads it, then starts a shell in that container.\r\n\r\n docker run -it -p 8018:8018 prolocutor/python3-sqlite-ext:3.5.1-spatialite /bin/sh\r\n\r\nInstalled a pre-release build of datasette which includes the new `--load-extension` option.\r\n\r\n pip install https://static.simonwillison.net/static/2017/datasette-0.13-py3-none-any.whl\r\n\r\nNow grab a sample database from https://www.gaia-gis.it/spatialite-2.3.1/resources.html - and unzip and rename it (datasette doesn't yet like databases with dots in their filename):\r\n\r\n wget http://www.gaia-gis.it/spatialite-2.3.1/test-2.3.sqlite.gz\r\n gunzip test-2.3.sqlite.gz\r\n mv test-2.3.sqlite test23.sqlite\r\n\r\nNow start datasette on port 8018 (the port I exposed earlier) with the extension loaded:\r\n\r\n datasette test23.sqlite -p 8018 -h 0.0.0.0 --load-extension /usr/local/lib/mod_spatialite.so\r\n\r\nNow I can confirm that it worked:\r\n\r\nhttp://localhost:8018/test23-c88bc35?sql=select+ST_AsText%28Geometry%29+from+HighWays+limit+1\r\n\r\n\"test23\"\r\n\r\nIf I run datasette without `--load-extension` I get this:\r\n\r\n datasette test23.sqlite -p 8018 -h 0.0.0.0\r\n\r\n\"test23_and_turn_on_auto-escaping_in_jinja_\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 274578142, "label": "Add --load-extension option to datasette for loading extra SQLite extensions"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/46#issuecomment-344988591", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/46", "id": 344988591, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDk4ODU5MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-16T16:59:51Z", "updated_at": "2017-11-16T16:59:51Z", "author_association": "OWNER", "body": "OK, `--load-extension` is now a supported command line option - see #110 which includes my notes on how I manually tested it using the `prolocutor/python3-sqlite-ext` Docker image.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 271301468, "label": "Dockerfile should build more recent SQLite with FTS5 and spatialite support"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/46#issuecomment-344989340", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/46", "id": 344989340, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDk4OTM0MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-16T17:02:07Z", "updated_at": "2017-11-16T17:02:07Z", "author_association": "OWNER", "body": "The fact that `prolocutor/python3-sqlite-ext` doesn't provide a visible Dockerfile and hasn't been updated in two years makes me hesitant to bake it into datasette itself. 
I'd rather put together a Dockerfile that enables the necessary extensions and can live in the datasette repository itself.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 271301468, "label": "Dockerfile should build more recent SQLite with FTS5 and spatialite support"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/46#issuecomment-344995571", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/46", "id": 344995571, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDk5NTU3MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-16T17:22:32Z", "updated_at": "2017-11-16T17:22:32Z", "author_association": "OWNER", "body": "The JSON extension would be very worthwhile too: https://www.sqlite.org/json1.html", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 271301468, "label": "Dockerfile should build more recent SQLite with FTS5 and spatialite support"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/111#issuecomment-345013127", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/111", "id": 345013127, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NTAxMzEyNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-16T18:23:56Z", "updated_at": "2017-11-16T18:23:56Z", "author_association": "OWNER", "body": "Having this as a global option may not make sense when publishing multiple databases. We can revisit that when we implement per-database and per-table metadata.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 274615452, "label": "Add \u201cupdated\u201d to metadata"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/110#issuecomment-345017256", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/110", "id": 345017256, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NTAxNzI1Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-16T18:38:30Z", "updated_at": "2017-11-16T18:38:30Z", "author_association": "OWNER", "body": "To finish up, I committed the image I created in the above so I can run it again in the future:\r\n\r\n docker commit $(docker ps -lq) datasette-sqlite\r\n\r\nNow I can run it like this:\r\n\r\n docker run -it -p 8018:8018 datasette-sqlite datasette /tmp/test23.sqlite -p 8018 -h 0.0.0.0 --load-extension /usr/local/lib/mod_spatialite.so\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 274578142, "label": "Add --load-extension option to datasette for loading extra SQLite extensions"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/14#issuecomment-345067498", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/14", "id": 345067498, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NTA2NzQ5OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-16T21:25:32Z", "updated_at": "2017-11-16T21:26:22Z", "author_association": "OWNER", "body": "For visualizations, Google Maps should be made available as a plugin. 
The default visualizations can use Leaflet and OpenStreetMap, but there's no reason not to make Google Maps available as a plugin, especially if the plugin can provide a mechanism for configuring the necessary API key.\r\n\r\nI'm particularly excited about the Google Maps heatmap visualization https://developers.google.com/maps/documentation/javascript/heatmaplayer as seen on http://mochimachine.org/wasteland/", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 267707940, "label": "Datasette Plugins"}, "performed_via_github_app": null}