issue_comments
9,947 rows
id | html_url | issue_url | node_id | user | created_at | updated_at | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
343780039 | https://github.com/simonw/datasette/issues/69#issuecomment-343780039 | https://api.github.com/repos/simonw/datasette/issues/69 | MDEyOklzc3VlQ29tbWVudDM0Mzc4MDAzOQ== | simonw 9599 | 2017-11-13T00:05:27Z | 2017-11-13T00:05:27Z | OWNER | I think the only safe way to do this is using SQLite `.fetchmany(1000)` - I can't guarantee that the user has not entered SQL that will outfox a limit in some way. So instead of attempting to edit their SQL, I'll always return 1001 records and let them know if they went over 1000 or not. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Enforce pagination (or at least limits) for arbitrary custom SQL 273248366 | |
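A minimal sketch of that fetch-one-extra-row approach (illustrative only; the constant name and helper below are assumptions, not Datasette's actual code):

```
import sqlite3

MAX_RETURNED_ROWS = 1000  # illustrative cap, mirroring the 1,000 row limit discussed above

def run_custom_sql(conn, sql):
    """Run arbitrary user SQL without rewriting it, but cap the rows returned.

    Fetching MAX_RETURNED_ROWS + 1 rows lets us detect truncation without
    trusting a LIMIT clause injected into untrusted SQL.
    """
    rows = conn.execute(sql).fetchmany(MAX_RETURNED_ROWS + 1)
    truncated = len(rows) > MAX_RETURNED_ROWS
    return rows[:MAX_RETURNED_ROWS], truncated

# rows, truncated = run_custom_sql(conn, "select * from some_table")
# if truncated, the UI can note that results were cut off at 1,000 rows
```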
343780141 | https://github.com/simonw/datasette/issues/71#issuecomment-343780141 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4MDE0MQ== | simonw 9599 | 2017-11-13T00:06:52Z | 2017-11-13T00:06:52Z | OWNER | I've registered datasettes.com as a domain name for doing this. Now setting it up so Cloudflare and Now can serve content from it. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343780539 | https://github.com/simonw/datasette/issues/71#issuecomment-343780539 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4MDUzOQ== | simonw 9599 | 2017-11-13T00:13:29Z | 2017-11-13T00:19:46Z | OWNER | https://zeit.co/docs/features/dns has the docs. To get `now domain add -e datasettes.com` working I had to set up a custom TXT record on `_now.datasettes.com`. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343780671 | https://github.com/simonw/datasette/issues/71#issuecomment-343780671 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4MDY3MQ== | simonw 9599 | 2017-11-13T00:15:21Z | 2017-11-13T00:17:37Z | OWNER | - [x] Redirect https://datasettes.com/ and https://www.datasettes.com/ to https://github.com/simonw/datasette | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343780814 | https://github.com/simonw/datasette/issues/71#issuecomment-343780814 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4MDgxNA== | simonw 9599 | 2017-11-13T00:17:50Z | 2017-11-13T00:18:19Z | OWNER | Achieved those redirects using Cloudflare "page rules": https://www.cloudflare.com/a/page-rules/datasettes.com | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343781030 | https://github.com/simonw/datasette/issues/71#issuecomment-343781030 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4MTAzMA== | simonw 9599 | 2017-11-13T00:21:05Z | 2017-11-13T02:09:32Z | OWNER | - [x] Have `now domain add -e datasettes.com` run without errors (hopefully just a matter of waiting for the DNS to update) - [x] Alias an example dataset hosted on Now on a datasettes.com subdomain - [x] Confirm that HTTP caching and HTTP/2 redirect pushing works as expected - this may require another page rule | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343788581 | https://github.com/simonw/datasette/issues/71#issuecomment-343788581 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4ODU4MQ== | simonw 9599 | 2017-11-13T01:48:17Z | 2017-11-13T01:48:17Z | OWNER | I had to add a rule like this to get letsencrypt certificates on now.sh working: https://github.com/zeit/now-cli/issues/188#issuecomment-270105052 <img width="975" alt="page_rules__datasettes_com___cloudflare_-_web_performance___security" src="https://user-images.githubusercontent.com/9599/32706131-c3d88742-c7cf-11e7-8d39-07d3554ce3cf.png"> I also have to flip this switch off every time I want to add a new alias: <img width="993" alt="crypto__datasettes_com___cloudflare_-_web_performance___security" src="https://user-images.githubusercontent.com/9599/32706326-a8ba1320-c7d1-11e7-8846-eb1e62efa3ed.png"> | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343788780 | https://github.com/simonw/datasette/issues/71#issuecomment-343788780 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4ODc4MA== | simonw 9599 | 2017-11-13T01:50:01Z | 2017-11-13T01:50:01Z | OWNER | Added another page rule in order to get Cloudflare to always obey cache headers sent by the server: <img width="978" alt="page_rules__datasettes_com___cloudflare_-_web_performance___security" src="https://user-images.githubusercontent.com/9599/32706355-ded7c60a-c7d1-11e7-93da-20989f40d527.png"> | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343788817 | https://github.com/simonw/datasette/issues/71#issuecomment-343788817 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4ODgxNw== | simonw 9599 | 2017-11-13T01:50:27Z | 2017-11-13T01:50:27Z | OWNER | https://fivethirtyeight.datasettes.com/ is now up and running. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343789162 | https://github.com/simonw/datasette/issues/71#issuecomment-343789162 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4OTE2Mg== | simonw 9599 | 2017-11-13T01:53:29Z | 2017-11-13T01:53:29Z | OWNER | ``` $ curl -i 'https://fivethirtyeight.datasettes.com/fivethirtyeight-75d605c/obama-commutations%2Fobama_commutations.csv.jsono' HTTP/1.1 200 OK Date: Mon, 13 Nov 2017 01:50:57 GMT Content-Type: application/json Transfer-Encoding: chunked Connection: keep-alive Set-Cookie: __cfduid=de836090f3e12a60579cc7a1696cf0d9e1510537857; expires=Tue, 13-Nov-18 01:50:57 GMT; path=/; domain=.datasettes.com; HttpOnly; Secure Access-Control-Allow-Origin: * Cache-Control: public, max-age=31536000 X-Now-Region: now-sfo CF-Cache-Status: HIT Expires: Tue, 13 Nov 2018 01:50:57 GMT Server: cloudflare-nginx CF-RAY: 3bce154a6d9293b4-SJC {"database": "fivethirtyeight", "table": "obama-commutations/obama_commutations.csv"...``` | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343790984 | https://github.com/simonw/datasette/issues/71#issuecomment-343790984 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc5MDk4NA== | simonw 9599 | 2017-11-13T02:09:34Z | 2017-11-13T02:09:34Z | OWNER | HTTP/2 push totally worked on the redirect! fetch('https://fivethirtyeight.datasettes.com/fivethirtyeight/riddler-pick-lowest%2Flow_numbers.csv.jsono').then(r => r.json()).then(console.log) <img width="735" alt="eventbrite_api___v3_destination_search_" src="https://user-images.githubusercontent.com/9599/32706785-6c8dc178-c7d4-11e7-9130-8b6ec84a7430.png"> Meanwhile, in the network pane... <img width="770" alt="eventbrite_api___v3_destination_search_" src="https://user-images.githubusercontent.com/9599/32706803-7f92aac2-c7d4-11e7-9b01-c87e3d2c3891.png"> <img width="771" alt="eventbrite_api___v3_destination_search_" src="https://user-images.githubusercontent.com/9599/32706821-95106bb4-c7d4-11e7-9ad5-31107faa5795.png"> | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343791348 | https://github.com/simonw/datasette/issues/68#issuecomment-343791348 | https://api.github.com/repos/simonw/datasette/issues/68 | MDEyOklzc3VlQ29tbWVudDM0Mzc5MTM0OA== | simonw 9599 | 2017-11-13T02:12:58Z | 2017-11-13T02:12:58Z | OWNER | I should use this on https://fivethirtyeight.datasettes.com/ | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Support for title/source/license metadata 273247186 | |
343801392 | https://github.com/simonw/datasette/issues/73#issuecomment-343801392 | https://api.github.com/repos/simonw/datasette/issues/73 | MDEyOklzc3VlQ29tbWVudDM0MzgwMTM5Mg== | simonw 9599 | 2017-11-13T03:36:47Z | 2017-11-13T03:36:47Z | OWNER | While I’m at it, let’s allow people to opt out of HTTP/2 push with a ?_nopush=1 argument too - in case they decide they don’t want to receive large 302 responses. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | _nocache=1 query string option for use with sort-by-random 273296178 | |
343951751 | https://github.com/simonw/datasette/issues/68#issuecomment-343951751 | https://api.github.com/repos/simonw/datasette/issues/68 | MDEyOklzc3VlQ29tbWVudDM0Mzk1MTc1MQ== | simonw 9599 | 2017-11-13T15:21:04Z | 2017-11-13T15:21:04Z | OWNER | For first version, I'm just supporting title, source and license information at the database level. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Support for title/source/license metadata 273247186 | |
343961784 | https://github.com/simonw/datasette/issues/67#issuecomment-343961784 | https://api.github.com/repos/simonw/datasette/issues/67 | MDEyOklzc3VlQ29tbWVudDM0Mzk2MTc4NA== | simonw 9599 | 2017-11-13T15:50:50Z | 2017-11-13T15:50:50Z | OWNER | `datasette package ...` - same arguments as `datasette publish`. Creates Docker container in your local repo, optionally tagged with `--tag` | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Command that builds a local docker container 273192789 | |
343967020 | https://github.com/simonw/datasette/issues/67#issuecomment-343967020 | https://api.github.com/repos/simonw/datasette/issues/67 | MDEyOklzc3VlQ29tbWVudDM0Mzk2NzAyMA== | simonw 9599 | 2017-11-13T16:06:10Z | 2017-11-13T16:06:10Z | OWNER | http://odewahn.github.io/docker-jumpstart/example.html is helpful | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Command that builds a local docker container 273192789 | |
344000982 | https://github.com/simonw/datasette/issues/75#issuecomment-344000982 | https://api.github.com/repos/simonw/datasette/issues/75 | MDEyOklzc3VlQ29tbWVudDM0NDAwMDk4Mg== | simonw 9599 | 2017-11-13T17:50:27Z | 2017-11-13T17:50:27Z | OWNER | This is necessary because one of the fun things to do with this tool is run it locally, e.g.: datasette ~/Library/Application\ Support/Google/Chrome/Default/History -p 8003 BUT... if we enable CORS by default, an evil site could try sniffing for localhost:8003 and attempt to steal data. So we'll enable the CORS headers only if `--cors` is provided to the command, and then use that command in the default Dockerfile. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add --cors argument to serve 273509159 | |
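A rough sketch of that opt-in behaviour (plain Python, not Datasette's actual Sanic wiring; the flag plumbing is an assumption):

```
def apply_cors_headers(headers, cors_enabled):
    """Only emit the permissive CORS header when --cors was explicitly passed.

    Without the flag, a page on a hostile origin that guesses localhost:8003
    cannot read responses cross-origin, because no Access-Control-Allow-Origin
    header is ever sent.
    """
    if cors_enabled:
        headers["Access-Control-Allow-Origin"] = "*"
    return headers

# datasette serve data.db          -> no CORS header, safer for local browsing
# datasette serve data.db --cors   -> header added, as used in the default Dockerfile
```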
344017088 | https://github.com/simonw/datasette/issues/51#issuecomment-344017088 | https://api.github.com/repos/simonw/datasette/issues/51 | MDEyOklzc3VlQ29tbWVudDM0NDAxNzA4OA== | simonw 9599 | 2017-11-13T18:44:23Z | 2017-11-13T18:44:23Z | OWNER | Implemented in https://github.com/simonw/datasette/commit/e838bd743d31358b362875854a0ac5e78047727f | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Make a proper README 272735257 | |
344018680 | https://github.com/simonw/datasette/issues/74#issuecomment-344018680 | https://api.github.com/repos/simonw/datasette/issues/74 | MDEyOklzc3VlQ29tbWVudDM0NDAxODY4MA== | simonw 9599 | 2017-11-13T18:49:58Z | 2017-11-13T18:49:58Z | OWNER | Turns out it does this already: https://github.com/simonw/datasette/blob/6b3b05b6db0d2a7b7cec8b8dbb4ddc5e12a376b2/datasette/app.py#L96-L107 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Send a 302 redirect to the new hash for hits to old hashes 273296684 | |
344019631 | https://github.com/simonw/datasette/issues/69#issuecomment-344019631 | https://api.github.com/repos/simonw/datasette/issues/69 | MDEyOklzc3VlQ29tbWVudDM0NDAxOTYzMQ== | simonw 9599 | 2017-11-13T18:53:13Z | 2017-11-13T18:53:13Z | OWNER | I'm going with a page size of 100 and a max limit of 1000 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Enforce pagination (or at least limits) for arbitrary custom SQL 273248366 | |
344048656 | https://github.com/simonw/datasette/issues/69#issuecomment-344048656 | https://api.github.com/repos/simonw/datasette/issues/69 | MDEyOklzc3VlQ29tbWVudDM0NDA0ODY1Ng== | simonw 9599 | 2017-11-13T20:32:47Z | 2017-11-13T20:32:47Z | OWNER | <img width="908" alt="ak" src="https://user-images.githubusercontent.com/9599/32745247-071aa764-c867-11e7-9748-88e22f5eee57.png"> | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Enforce pagination (or at least limits) for arbitrary custom SQL 273248366 | |
344060070 | https://github.com/simonw/datasette/issues/55#issuecomment-344060070 | https://api.github.com/repos/simonw/datasette/issues/55 | MDEyOklzc3VlQ29tbWVudDM0NDA2MDA3MA== | simonw 9599 | 2017-11-13T21:14:13Z | 2017-11-13T21:14:13Z | OWNER | I'm going to add some extra metadata to setup.py and then tag this as version 0.8: `git tag 0.8` and `git push --tags`. Then to ship to PyPI: `python setup.py bdist_wheel`, `twine register dist/datasette-0.8-py3-none-any.whl`, `twine upload dist/datasette-0.8-py3-none-any.whl` | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Ship first version to PyPI 273127117 | |
344061762 | https://github.com/simonw/datasette/issues/55#issuecomment-344061762 | https://api.github.com/repos/simonw/datasette/issues/55 | MDEyOklzc3VlQ29tbWVudDM0NDA2MTc2Mg== | simonw 9599 | 2017-11-13T21:19:43Z | 2017-11-13T21:19:43Z | OWNER | And we're live! https://pypi.python.org/pypi/datasette | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Ship first version to PyPI 273127117 | |
344074443 | https://github.com/simonw/datasette/issues/80#issuecomment-344074443 | https://api.github.com/repos/simonw/datasette/issues/80 | MDEyOklzc3VlQ29tbWVudDM0NDA3NDQ0Mw== | simonw 9599 | 2017-11-13T22:04:54Z | 2017-11-13T22:05:02Z | OWNER | The fivethirtyeight dataset: `datasette publish now --name fivethirtyeight --metadata metadata.json fivethirtyeight.db` then `now alias https://fivethirtyeight-jyqfudvjli.now.sh fivethirtyeight.datasettes.com`. And parlgov: `datasette publish now parlgov.db --name=parlgov --metadata=parlgov.json` then `now alias https://parlgov-hqvxuhmbyh.now.sh parlgov.datasettes.com` | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Deploy final versions of fivethirtyeight and parlgov datasets (with view pagination) 273569477 | |
344075696 | https://github.com/simonw/datasette/issues/80#issuecomment-344075696 | https://api.github.com/repos/simonw/datasette/issues/80 | MDEyOklzc3VlQ29tbWVudDM0NDA3NTY5Ng== | simonw 9599 | 2017-11-13T22:09:46Z | 2017-11-13T22:09:46Z | OWNER | Parlgov was throwing errors on one of the views, which takes longer than 1000ms to execute - so I added the ability to customize the time limit in https://github.com/simonw/datasette/commit/1e698787a4dd6df0432021a6814c446c8b69bba2 - `datasette publish now parlgov.db --metadata parlgov.json --name parlgov --extra-options="--sql_time_limit_ms=3500"` then `now alias https://parlgov-nvkcowlixq.now.sh parlgov.datasettes.com`. https://parlgov.datasettes.com/parlgov-25f9855/view_cabinet now returns in just over 2.5s | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Deploy final versions of fivethirtyeight and parlgov datasets (with view pagination) 273569477 | |
344076554 | https://github.com/simonw/datasette/pull/81#issuecomment-344076554 | https://api.github.com/repos/simonw/datasette/issues/81 | MDEyOklzc3VlQ29tbWVudDM0NDA3NjU1NA== | simonw 9599 | 2017-11-13T22:12:57Z | 2017-11-13T22:12:57Z | OWNER | Hah, I haven't even announced this yet :) Travis is upset because I'm using SQL in the tests which isn't compatible with their version of Python 3. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | :fire: Removes DS_Store 273595473 | |
344081876 | https://github.com/simonw/datasette/issues/59#issuecomment-344081876 | https://api.github.com/repos/simonw/datasette/issues/59 | MDEyOklzc3VlQ29tbWVudDM0NDA4MTg3Ng== | simonw 9599 | 2017-11-13T22:33:43Z | 2017-11-13T22:33:43Z | OWNER | The `datasette package` command introduced in 4143e3b45c16cbae5e3e3419ef479a71810e7df3 is relevant here. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | datasette publish hyper 273157085 | |
344118849 | https://github.com/simonw/datasette/issues/82#issuecomment-344118849 | https://api.github.com/repos/simonw/datasette/issues/82 | MDEyOklzc3VlQ29tbWVudDM0NDExODg0OQ== | simonw 9599 | 2017-11-14T01:46:10Z | 2017-11-14T01:46:10Z | OWNER | Did this: https://simonwillison.net/2017/Nov/13/datasette/ | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Post a blog entry announcing it to the world 273596159 | |
344125441 | https://github.com/simonw/datasette/pull/81#issuecomment-344125441 | https://api.github.com/repos/simonw/datasette/issues/81 | MDEyOklzc3VlQ29tbWVudDM0NDEyNTQ0MQ== | jefftriplett 50527 | 2017-11-14T02:24:54Z | 2017-11-14T02:24:54Z | CONTRIBUTOR | Oops, sorry if I jumped the gun. I saw the project in my GitHub activity feed and saw some low-hanging fruit :) | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | :fire: Removes DS_Store 273595473 | |
344132481 | https://github.com/simonw/datasette/issues/47#issuecomment-344132481 | https://api.github.com/repos/simonw/datasette/issues/47 | MDEyOklzc3VlQ29tbWVudDM0NDEzMjQ4MQ== | simonw 9599 | 2017-11-14T03:08:13Z | 2017-11-14T03:08:13Z | OWNER | I ended up shipping with https://fivethirtyeight.datasettes.com/ and https://parlgov.datasettes.com/ | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Create neat example database 271831408 | |
344141199 | https://github.com/simonw/datasette/issues/59#issuecomment-344141199 | https://api.github.com/repos/simonw/datasette/issues/59 | MDEyOklzc3VlQ29tbWVudDM0NDE0MTE5OQ== | simonw 9599 | 2017-11-14T04:13:11Z | 2017-11-14T04:13:11Z | OWNER | I managed to do this manually: datasette package ~/parlgov-db/parlgov.db --metadata=parlgov.json # Output 8758ec31dda3 as the new image ID docker save 8758ec31dda3 > /tmp/my-image # I could have just piped this straight to hyper cat /tmp/my-image | hyper load # Now start the container running in hyper hyper run -d -p 80:8001 --name parlgov 8758ec31dda3 # We need to assign an IP address so we can see it hyper fip allocate 1 # Outputs 199.245.58.78 hyper fip attach 199.245.58.78 parlgov At this point, visiting the IP address in a browser showed the parlgov UI. To clean up... hyper hyper fip detach parlgov hyper fip release 199.245.58.78 hyper stop parlgov hyper rm parlgov | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | datasette publish hyper 273157085 | |
344141515 | https://github.com/simonw/datasette/issues/79#issuecomment-344141515 | https://api.github.com/repos/simonw/datasette/issues/79 | MDEyOklzc3VlQ29tbWVudDM0NDE0MTUxNQ== | simonw 9599 | 2017-11-14T04:16:01Z | 2017-11-14T04:16:01Z | OWNER | This is probably a bit too much for the README - I should get readthedocs working. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add more detailed API documentation to the README 273569068 | |
344145265 | https://github.com/simonw/datasette/issues/57#issuecomment-344145265 | https://api.github.com/repos/simonw/datasette/issues/57 | MDEyOklzc3VlQ29tbWVudDM0NDE0NTI2NQ== | macropin 247192 | 2017-11-14T04:45:38Z | 2017-11-14T04:45:38Z | CONTRIBUTOR | I'm happy to contribute this. Just let me know if you want a Dockerfile for development or production purposes, or both. If it's prod then we can just pip install the source from pypi, otherwise for dev we'll need a `requirements.txt` to speed up rebuilds. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Ship a Docker image of the whole thing 273127694 | |
344147583 | https://github.com/simonw/datasette/issues/57#issuecomment-344147583 | https://api.github.com/repos/simonw/datasette/issues/57 | MDEyOklzc3VlQ29tbWVudDM0NDE0NzU4Mw== | macropin 247192 | 2017-11-14T05:03:47Z | 2017-11-14T05:03:47Z | CONTRIBUTOR | Let me know if you'd like a PR. The image is usable as `docker run --rm -t -i -p 9000:8001 -v $(pwd)/db:/db datasette datasette serve /db/chinook.db` | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Ship a Docker image of the whole thing 273127694 | |
344149165 | https://github.com/simonw/datasette/issues/57#issuecomment-344149165 | https://api.github.com/repos/simonw/datasette/issues/57 | MDEyOklzc3VlQ29tbWVudDM0NDE0OTE2NQ== | simonw 9599 | 2017-11-14T05:16:34Z | 2017-11-14T05:17:14Z | OWNER | I’m intrigued by this pattern: https://github.com/macropin/datasette/blob/147195c2fdfa2b984d8f9fc1c6cab6634970a056/Dockerfile#L8 What’s the benefit of doing that? Does it result in a smaller image size? | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Ship a Docker image of the whole thing 273127694 | |
344151223 | https://github.com/simonw/datasette/issues/57#issuecomment-344151223 | https://api.github.com/repos/simonw/datasette/issues/57 | MDEyOklzc3VlQ29tbWVudDM0NDE1MTIyMw== | macropin 247192 | 2017-11-14T05:32:28Z | 2017-11-14T05:33:03Z | CONTRIBUTOR | The pattern is called "multi-stage builds". And the result is a svelte 226MB image (201MB for 3.6-slim) vs 700MB+ for the full image. It's possible to get it even smaller, but that takes a lot more work. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Ship a Docker image of the whole thing 273127694 | |
344161226 | https://github.com/simonw/datasette/issues/46#issuecomment-344161226 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDE2MTIyNg== | simonw 9599 | 2017-11-14T06:41:21Z | 2017-11-14T06:41:21Z | OWNER | Spatial extensions would be really useful too. https://www.gaia-gis.it/spatialite-2.1/SpatiaLite-manual.html | {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
344161371 | https://github.com/simonw/datasette/issues/46#issuecomment-344161371 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDE2MTM3MQ== | simonw 9599 | 2017-11-14T06:42:15Z | 2017-11-14T06:42:15Z | OWNER | http://charlesleifer.com/blog/going-fast-with-sqlite-and-python/ is useful here too. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
344161430 | https://github.com/simonw/datasette/issues/46#issuecomment-344161430 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDE2MTQzMA== | simonw 9599 | 2017-11-14T06:42:44Z | 2017-11-14T06:42:44Z | OWNER | Also requested on Twitter: https://twitter.com/DenubisX/status/930322813864439808 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
344179878 | https://github.com/simonw/datasette/issues/27#issuecomment-344179878 | https://api.github.com/repos/simonw/datasette/issues/27 | MDEyOklzc3VlQ29tbWVudDM0NDE3OTg3OA== | simonw 9599 | 2017-11-14T08:21:22Z | 2017-11-14T08:21:22Z | OWNER | https://github.com/frappe/charts perhaps | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Ability to plot a simple graph 267886330 | |
344180866 | https://github.com/simonw/datasette/issues/43#issuecomment-344180866 | https://api.github.com/repos/simonw/datasette/issues/43 | MDEyOklzc3VlQ29tbWVudDM0NDE4MDg2Ng== | simonw 9599 | 2017-11-14T08:25:37Z | 2017-11-14T08:25:37Z | OWNER | This isn’t necessary - restarting the server is fast and easy, and I’ve not found myself needing this at all during development. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | While running, server should spot new db files added to its directory 268592894 | |
344185817 | https://github.com/simonw/datasette/issues/57#issuecomment-344185817 | https://api.github.com/repos/simonw/datasette/issues/57 | MDEyOklzc3VlQ29tbWVudDM0NDE4NTgxNw== | simonw 9599 | 2017-11-14T08:46:24Z | 2017-11-14T08:46:24Z | OWNER | Thanks for the explanation! Please do start a pull request. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Ship a Docker image of the whole thing 273127694 | |
344352573 | https://github.com/simonw/datasette/issues/30#issuecomment-344352573 | https://api.github.com/repos/simonw/datasette/issues/30 | MDEyOklzc3VlQ29tbWVudDM0NDM1MjU3Mw== | simonw 9599 | 2017-11-14T18:29:01Z | 2017-11-14T18:29:01Z | OWNER | This is a dupe of #85 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Do something neat with foreign keys 268078453 | |
344409906 | https://github.com/simonw/datasette/issues/93#issuecomment-344409906 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDQwOTkwNg== | simonw 9599 | 2017-11-14T21:47:02Z | 2017-11-14T21:47:02Z | OWNER | Even without bundling in the database file itself, I'd love to have a standalone binary version of the core `datasette` CLI utility. I think Sanic may have some complex dependencies, but I've never tried pyinstaller so I don't know how easy or hard it would be to get this working. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Package as standalone binary 273944952 | |
344415756 | https://github.com/simonw/datasette/issues/93#issuecomment-344415756 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDQxNTc1Ng== | simonw 9599 | 2017-11-14T22:09:13Z | 2017-11-14T22:09:13Z | OWNER | Looks like we'd need to use this recipe: https://github.com/pyinstaller/pyinstaller/wiki/Recipe-Setuptools-Entry-Point | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Package as standalone binary 273944952 | |
344424382 | https://github.com/simonw/datasette/issues/93#issuecomment-344424382 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDQyNDM4Mg== | atomotic 67420 | 2017-11-14T22:42:16Z | 2017-11-14T22:42:16Z | NONE | tried quickly, this seems working: ``` ~ pip3 install pyinstaller ~ pyinstaller -F --add-data /usr/local/lib/python3.6/site-packages/datasette/templates:datasette/templates --add-data /usr/local/lib/python3.6/site-packages/datasette/static:datasette/static /usr/local/bin/datasette ~ du -h dist/datasette 6.8M dist/datasette ~ file dist/datasette dist/datasette: Mach-O 64-bit executable x86_64 ``` | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Package as standalone binary 273944952 | |
344426887 | https://github.com/simonw/datasette/issues/93#issuecomment-344426887 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDQyNjg4Nw== | simonw 9599 | 2017-11-14T22:51:46Z | 2017-11-14T22:51:46Z | OWNER | That didn't quite work for me. It built me a `dist/datasette` executable but when I try to run it I get an error: $ pwd /Users/simonw/Dropbox/Development/datasette $ source venv/bin/activate $ pyinstaller -F --add-data datasette/templates:datasette/templates --add-data datasette/static:datasette/static /Users/simonw/Dropbox/Development/datasette/venv/bin/datasette $ dist/datasette --help Traceback (most recent call last): File "datasette", line 11, in <module> File "site-packages/pkg_resources/__init__.py", line 572, in load_entry_point File "site-packages/pkg_resources/__init__.py", line 564, in get_distribution File "site-packages/pkg_resources/__init__.py", line 436, in get_provider File "site-packages/pkg_resources/__init__.py", line 984, in require File "site-packages/pkg_resources/__init__.py", line 870, in resolve pkg_resources.DistributionNotFound: The 'datasette' distribution was not found and is required by the application [99117] Failed to execute script datasette | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Package as standalone binary 273944952 | |
344427448 | https://github.com/simonw/datasette/issues/88#issuecomment-344427448 | https://api.github.com/repos/simonw/datasette/issues/88 | MDEyOklzc3VlQ29tbWVudDM0NDQyNzQ0OA== | simonw 9599 | 2017-11-14T22:54:06Z | 2017-11-14T22:54:06Z | OWNER | Hooray! First dataset that wasn't deployed by me :) https://github.com/simonw/datasette/wiki/Datasettes | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add NHS England Hospitals example to wiki 273775212 | |
344427560 | https://github.com/simonw/datasette/issues/88#issuecomment-344427560 | https://api.github.com/repos/simonw/datasette/issues/88 | MDEyOklzc3VlQ29tbWVudDM0NDQyNzU2MA== | simonw 9599 | 2017-11-14T22:54:33Z | 2017-11-14T22:54:33Z | OWNER | I'm getting an internal server error on http://run.plnkr.co/preview/cj9zlf1qc0003414y90ajkwpk/ at the moment | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add NHS England Hospitals example to wiki 273775212 | |
344430299 | https://github.com/simonw/datasette/issues/93#issuecomment-344430299 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDQzMDI5OQ== | atomotic 67420 | 2017-11-14T23:06:33Z | 2017-11-14T23:06:33Z | NONE | I will take a better look tomorrow - it's late and I surely made some mistake. https://asciinema.org/a/ZyAWbetrlriDadwWyVPUWB94H | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Package as standalone binary 273944952 | |
344430689 | https://github.com/simonw/datasette/issues/88#issuecomment-344430689 | https://api.github.com/repos/simonw/datasette/issues/88 | MDEyOklzc3VlQ29tbWVudDM0NDQzMDY4OQ== | tomdyson 15543 | 2017-11-14T23:08:22Z | 2017-11-14T23:08:22Z | CONTRIBUTOR | > I'm getting an internal server error on http://run.plnkr.co/preview/cj9zlf1qc0003414y90ajkwpk/ at the moment Sorry about that - here's a working version on Netlify: https://nhs-england-map.netlify.com | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add NHS England Hospitals example to wiki 273775212 | |
344438724 | https://github.com/simonw/datasette/issues/14#issuecomment-344438724 | https://api.github.com/repos/simonw/datasette/issues/14 | MDEyOklzc3VlQ29tbWVudDM0NDQzODcyNA== | simonw 9599 | 2017-11-14T23:47:54Z | 2017-11-14T23:47:54Z | OWNER | Plugins should be able to interact with the build step. This would give plugins an opportunity to modify the SQL databases and help prepare them for serving - for example, a full-text search plugin might create additional FTS tables, or a mapping plugin might pre-calculate a bunch of geohashes for tables that have latitude/longitude values. Plugins could really take advantage of the immutable nature of the dataset here. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Datasette Plugins 267707940 | |
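As a concrete illustration of the FTS case, a build-step hook could add a search table alongside the original data. This is only a sketch - the hook name, and the `documents` table with `title`/`body` columns, are hypothetical, since the plugin system described here did not exist yet:

```
import sqlite3

def prepare_database(path):
    """Hypothetical build-step plugin hook: add an FTS index for a documents table.

    Because the published database is treated as immutable, this only ever
    needs to run once, at build/packaging time.
    """
    conn = sqlite3.connect(path)
    conn.executescript("""
        CREATE VIRTUAL TABLE IF NOT EXISTS documents_fts USING fts4(title, body);
        INSERT INTO documents_fts (rowid, title, body)
            SELECT rowid, title, body FROM documents;
    """)
    conn.commit()
    conn.close()
```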
344440377 | https://github.com/simonw/datasette/issues/93#issuecomment-344440377 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDQ0MDM3Nw== | simonw 9599 | 2017-11-14T23:56:35Z | 2017-11-14T23:56:35Z | OWNER | It worked! $ pyinstaller -F \ --add-data /usr/local/lib/python3.5/site-packages/datasette/templates:datasette/templates \ --add-data /usr/local/lib/python3.5/site-packages/datasette/static:datasette/static \ /usr/local/bin/datasette $ file dist/datasette dist/datasette: Mach-O 64-bit executable x86_64 $ dist/datasette --help Usage: datasette [OPTIONS] COMMAND [ARGS]... Datasette! Options: --help Show this message and exit. Commands: serve* Serve up specified SQLite database files with... build package Package specified SQLite files into a new... publish Publish specified SQLite database files to... | {"total_count": 3, "+1": 0, "-1": 0, "laugh": 0, "hooray": 3, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Package as standalone binary 273944952 | |
344440658 | https://github.com/simonw/datasette/issues/93#issuecomment-344440658 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDQ0MDY1OA== | simonw 9599 | 2017-11-14T23:58:07Z | 2017-11-14T23:58:07Z | OWNER | It's a shame pyinstaller can't act as a cross-compiler - so I don't think I can get Travis CI to build packages. But it's fantastic that it's possible to turn the tool into a standalone executable! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Package as standalone binary 273944952 | |
344452063 | https://github.com/simonw/datasette/issues/85#issuecomment-344452063 | https://api.github.com/repos/simonw/datasette/issues/85 | MDEyOklzc3VlQ29tbWVudDM0NDQ1MjA2Mw== | simonw 9599 | 2017-11-15T01:03:03Z | 2017-11-15T01:03:03Z | OWNER | This can work in reverse too. If you view the row page for something that has foreign keys against it, we can show you “53 items in TABLE link to this” and provide a link to view them all. That count query could be prohibitively expensive. To counter that, we could run the count query via Ajax and set a strict time limit on it. See #95 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Detect foreign keys and use them to link HTML pages together 273678673 | |
344452326 | https://github.com/simonw/datasette/issues/85#issuecomment-344452326 | https://api.github.com/repos/simonw/datasette/issues/85 | MDEyOklzc3VlQ29tbWVudDM0NDQ1MjMyNg== | simonw 9599 | 2017-11-15T01:04:38Z | 2017-11-15T01:04:38Z | OWNER | This will work well in conjunction with https://github.com/simonw/csvs-to-sqlite/issues/2 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Detect foreign keys and use them to link HTML pages together 273678673 | |
344462277 | https://github.com/simonw/datasette/pull/89#issuecomment-344462277 | https://api.github.com/repos/simonw/datasette/issues/89 | MDEyOklzc3VlQ29tbWVudDM0NDQ2MjI3Nw== | simonw 9599 | 2017-11-15T02:02:52Z | 2017-11-15T02:02:52Z | OWNER | This is exactly what I was after, thanks! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | SQL syntax highlighting with CodeMirror 273816720 | |
344462608 | https://github.com/simonw/datasette/issues/13#issuecomment-344462608 | https://api.github.com/repos/simonw/datasette/issues/13 | MDEyOklzc3VlQ29tbWVudDM0NDQ2MjYwOA== | simonw 9599 | 2017-11-15T02:04:51Z | 2017-11-15T02:04:51Z | OWNER | Fixed in https://github.com/simonw/datasette/commit/8252daa4c14d73b4b69e3f2db4576bb39d73c070 - thanks, @tomdyson! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add a syntax highlighting SQL editor 267542338 | |
344463436 | https://github.com/simonw/datasette/issues/95#issuecomment-344463436 | https://api.github.com/repos/simonw/datasette/issues/95 | MDEyOklzc3VlQ29tbWVudDM0NDQ2MzQzNg== | simonw 9599 | 2017-11-15T02:10:10Z | 2017-11-15T02:10:10Z | OWNER | This means clients can ask questions but say "don't bother if it takes longer than X" - which is really handy when you're working against unknown databases that might be small or might be enormous. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Allow shorter time limits to be set using a ?_sql_time_limit_ms =20 query string limit 273998513 | |
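One way to enforce that kind of per-request budget with Python's sqlite3 module is a progress handler that aborts the statement once a deadline passes. A sketch only, not necessarily how Datasette implements `?_sql_time_limit_ms`:

```
import sqlite3
import time

def execute_with_time_limit(conn, sql, limit_ms):
    """Abort a query if it runs longer than limit_ms milliseconds.

    SQLite calls the progress handler every N virtual-machine instructions;
    returning a truthy value aborts the current statement, which surfaces as
    sqlite3.OperationalError: interrupted.
    """
    deadline = time.monotonic() + limit_ms / 1000
    conn.set_progress_handler(lambda: time.monotonic() > deadline, 1000)
    try:
        return conn.execute(sql).fetchall()
    finally:
        conn.set_progress_handler(None, 1000)

# A client asking "don't bother if it takes longer than 20ms" maps to
# execute_with_time_limit(conn, sql, 20), i.e. ?_sql_time_limit_ms=20.
```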
344472313 | https://github.com/simonw/datasette/pull/94#issuecomment-344472313 | https://api.github.com/repos/simonw/datasette/issues/94 | MDEyOklzc3VlQ29tbWVudDM0NDQ3MjMxMw== | simonw 9599 | 2017-11-15T03:08:00Z | 2017-11-15T03:08:00Z | OWNER | Works for me. I'm going to land this. Just one thing: simonw$ docker run --rm -t -i -p 9001:8001 c408e8cfbe40 datasette publish now The publish command requires "now" to be installed and configured Follow the instructions at https://zeit.co/now#whats-now Maybe we should have the Docker container install the "now" client? Not sure how much size that would add though. I think it's OK without for the moment. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Initial add simple prod ready Dockerfile refs #57 273961179 | |
344487639 | https://github.com/simonw/datasette/issues/25#issuecomment-344487639 | https://api.github.com/repos/simonw/datasette/issues/25 | MDEyOklzc3VlQ29tbWVudDM0NDQ4NzYzOQ== | simonw 9599 | 2017-11-15T05:11:11Z | 2017-11-15T05:11:11Z | OWNER | Since you can already download the database directly, I'm not going to bother with this one. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Endpoint that returns SQL ready to be piped into DB 267857622 | |
344516406 | https://github.com/simonw/datasette/issues/93#issuecomment-344516406 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDUxNjQwNg== | atomotic 67420 | 2017-11-15T08:09:41Z | 2017-11-15T08:09:41Z | NONE | Actually you can use Travis to build for Linux/macOS and [AppVeyor](https://www.appveyor.com/) to build for Windows. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Package as standalone binary 273944952 | |
344597274 | https://github.com/simonw/datasette/issues/101#issuecomment-344597274 | https://api.github.com/repos/simonw/datasette/issues/101 | MDEyOklzc3VlQ29tbWVudDM0NDU5NzI3NA== | eaubin 450244 | 2017-11-15T13:48:55Z | 2017-11-15T13:48:55Z | NONE | This is a duplicate of https://github.com/simonw/datasette/issues/100 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | TemplateAssertionError: no filter named 'tojson' 274161964 | |
344657040 | https://github.com/simonw/datasette/issues/85#issuecomment-344657040 | https://api.github.com/repos/simonw/datasette/issues/85 | MDEyOklzc3VlQ29tbWVudDM0NDY1NzA0MA== | simonw 9599 | 2017-11-15T16:56:48Z | 2017-11-15T16:56:48Z | OWNER | Since detecting foreign keys that point to a specific table is a bit expensive (you have to call a PRAGMA on every other table) I’m going to add this to the build/inspect stage. Idea: if we detect that the foreign key table only has one other column in it (id, name) AND we know that the id is the primary key, we can add an efficient lookup on the table list view and prefetch a dictionary mapping IDs to their value. Then we can feed that dictionary in as extra template context and use it to render labeled hyperlinks in the corresponding column. This means our build step should also cache which columns are indexed, and add a “label_column” property for tables with an obvious label column. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Detect foreign keys and use them to link HTML pages together 273678673 | |
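Both pieces of that detection can be done with SQLite pragmas alone; a rough sketch (the two-column heuristic follows the comment above, the helper functions themselves are illustrative):

```
import sqlite3

def guess_label_column(conn, table):
    """If the table has exactly two columns and one is the primary key,
    return the other column (e.g. name) to use as a human-readable label."""
    columns = conn.execute("PRAGMA table_info([{}])".format(table)).fetchall()
    if len(columns) != 2:
        return None
    non_pk = [col[1] for col in columns if not col[5]]  # col[1]=name, col[5]=pk flag
    return non_pk[0] if len(non_pk) == 1 else None

def incoming_foreign_keys(conn, target_table):
    """Find every (table, column) that points at target_table - the expensive
    scan over every other table that belongs in the build/inspect stage."""
    results = []
    tables = [r[0] for r in conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    for table in tables:
        for fk in conn.execute("PRAGMA foreign_key_list([{}])".format(table)):
            if fk[2] == target_table:            # fk[2] = referenced table
                results.append((table, fk[3]))   # fk[3] = referencing column
    return results
```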
344667202 | https://github.com/simonw/datasette/issues/90#issuecomment-344667202 | https://api.github.com/repos/simonw/datasette/issues/90 | MDEyOklzc3VlQ29tbWVudDM0NDY2NzIwMg== | simonw 9599 | 2017-11-15T17:29:38Z | 2017-11-15T17:29:38Z | OWNER | @jacobian points out that a buildpack may be a better fit than a Docker container for implementing this: https://twitter.com/jacobian/status/930849058465255424 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | datasette publish heroku 273846123 | |
344680385 | https://github.com/simonw/datasette/issues/90#issuecomment-344680385 | https://api.github.com/repos/simonw/datasette/issues/90 | MDEyOklzc3VlQ29tbWVudDM0NDY4MDM4NQ== | simonw 9599 | 2017-11-15T18:14:11Z | 2017-11-15T18:14:11Z | OWNER | Maybe we don’t even need a buildpack... we could create a temporary directory, set up a classic heroku app with the datasette serve command in the Procfile and then git push to deploy. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | datasette publish heroku 273846123 | |
344686483 | https://github.com/simonw/datasette/issues/90#issuecomment-344686483 | https://api.github.com/repos/simonw/datasette/issues/90 | MDEyOklzc3VlQ29tbWVudDM0NDY4NjQ4Mw== | simonw 9599 | 2017-11-15T18:36:23Z | 2017-11-15T18:36:23Z | OWNER | The “datasette build” command would need to run in a bin/post_compile script eg https://github.com/simonw/simonwillisonblog/blob/cloudflare-ips/bin/post_compile | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | datasette publish heroku 273846123 | |
344687328 | https://github.com/simonw/datasette/issues/90#issuecomment-344687328 | https://api.github.com/repos/simonw/datasette/issues/90 | MDEyOklzc3VlQ29tbWVudDM0NDY4NzMyOA== | simonw 9599 | 2017-11-15T18:39:14Z | 2017-11-15T18:39:49Z | OWNER | By default the command could use a temporary directory that gets cleaned up after the deploy, but we could allow users to opt in to keeping the generated directory like so: `datasette publish heroku mydb.db -d ~/dev/my-heroku-app` - this would create the my-heroku-app folder so you can later execute further git deploys from there. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | datasette publish heroku 273846123 | |
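A sketch of the temporary-directory flow described in the last few comments - the file layout and exact commands are assumptions about how such a publish command could work, not the eventual implementation:

```
import pathlib
import shutil
import subprocess
import tempfile

def publish_to_heroku(db_path, keep_dir=None):
    """Build a throwaway git repo containing the database plus a Procfile,
    then let an ordinary `git push heroku master` deploy do the rest."""
    directory = pathlib.Path(keep_dir or tempfile.mkdtemp())
    directory.mkdir(parents=True, exist_ok=True)
    shutil.copy(db_path, str(directory / "data.db"))
    (directory / "requirements.txt").write_text("datasette\n")
    (directory / "Procfile").write_text(
        "web: datasette serve data.db -h 0.0.0.0 -p $PORT\n"
    )
    for cmd in (
        ["git", "init"],
        ["git", "add", "."],
        ["git", "commit", "-m", "Deploy datasette"],
        ["heroku", "apps:create"],           # adds a `heroku` git remote
        ["git", "push", "heroku", "master"],
    ):
        subprocess.check_call(cmd, cwd=str(directory))
```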
344710204 | https://github.com/simonw/datasette/pull/104#issuecomment-344710204 | https://api.github.com/repos/simonw/datasette/issues/104 | MDEyOklzc3VlQ29tbWVudDM0NDcxMDIwNA== | jacobian 21148 | 2017-11-15T19:57:50Z | 2017-11-15T19:57:50Z | CONTRIBUTOR | A first basic stab at making this work, just to prove the approach. Right now this requires [a Heroku CLI plugin](https://github.com/heroku/heroku-builds), which seems pretty unreasonable. I think this can be replaced with direct API calls, which could clean up a lot of things. But I wanted to prove it worked first, and it does. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | [WIP] Add publish to heroku support 274284246 | |
344770170 | https://github.com/simonw/datasette/pull/107#issuecomment-344770170 | https://api.github.com/repos/simonw/datasette/issues/107 | MDEyOklzc3VlQ29tbWVudDM0NDc3MDE3MA== | simonw 9599 | 2017-11-16T00:01:00Z | 2017-11-16T00:01:22Z | OWNER | It is - but I think this will break on this line since it expects two format string parameters: https://github.com/simonw/datasette/blob/f45ca30f91b92ac68adaba893bf034f13ec61ced/datasette/utils.py#L61 Needs unit tests too, which live here: https://github.com/simonw/datasette/blob/f45ca30f91b92ac68adaba893bf034f13ec61ced/tests/test_utils.py#L49 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | add support for ?field__isnull=1 274343647 | |
344771130 | https://github.com/simonw/datasette/issues/100#issuecomment-344771130 | https://api.github.com/repos/simonw/datasette/issues/100 | MDEyOklzc3VlQ29tbWVudDM0NDc3MTEzMA== | simonw 9599 | 2017-11-16T00:06:00Z | 2017-11-16T00:06:00Z | OWNER | Aha... it looks like this is a Jinja version problem: https://github.com/ansible/ansible/issues/25381#issuecomment-306492389 Datasette depends on sanic-jinja2 - and that doesn't depend on a particular jinja2 version: https://github.com/lixxu/sanic-jinja2/blob/7e9520850d8c6bb66faf43b7f252593d7efe3452/setup.py#L22 So if you have an older version of Jinja installed, stuff breaks. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | TemplateAssertionError: no filter named 'tojson' 274160723 | |
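The simple guard against that is pinning a Jinja2 new enough to ship the `tojson` filter (it arrived in Jinja2 2.9) - shown here as a hypothetical setup.py fragment, not the actual fix that was applied:

```
# setup.py (fragment) - pin Jinja2 so the |tojson template filter is available
install_requires = [
    "Jinja2>=2.9",
]
```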
344786528 | https://github.com/simonw/datasette/issues/96#issuecomment-344786528 | https://api.github.com/repos/simonw/datasette/issues/96 | MDEyOklzc3VlQ29tbWVudDM0NDc4NjUyOA== | simonw 9599 | 2017-11-16T01:32:41Z | 2017-11-16T01:32:41Z | OWNER | <img width="733" alt="australian-dogs" src="https://user-images.githubusercontent.com/9599/32869280-f82fa176-ca2a-11e7-8ad1-ac2a13c85089.png"> | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | UI for editing named parameters 274001453 | |
344788435 | https://github.com/simonw/datasette/issues/96#issuecomment-344788435 | https://api.github.com/repos/simonw/datasette/issues/96 | MDEyOklzc3VlQ29tbWVudDM0NDc4ODQzNQ== | simonw 9599 | 2017-11-16T01:43:52Z | 2017-11-16T01:43:52Z | OWNER | Demo: https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+name%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Animal+name%22%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalName%22%29+as+name+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+AnimalBreed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5BMitcham-dog-registrations-2015%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_NAME%22%29+as+name+from+%5Bburnside-dog-registrations-2015%5D+where+DOG_BREED+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Animal_Name%22%29+as+name+from+%5Bcity-of-playford-2015-dog-registration%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where%22Breed+Description%22+like+%3Abreed%0D%0A%0D%0A%29+group+by+name+order+by+n+desc%3B&breed=chihuahua | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | UI for editing named parameters 274001453 | |
344788763 | https://github.com/simonw/datasette/issues/96#issuecomment-344788763 | https://api.github.com/repos/simonw/datasette/issues/96 | MDEyOklzc3VlQ29tbWVudDM0NDc4ODc2Mw== | simonw 9599 | 2017-11-16T01:45:51Z | 2017-11-16T01:45:51Z | OWNER | Another demo - this time it lets you search by name and see the most popular breeds with that name: https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+breed%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Breed%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+%22Animal+name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalBreed%22%29+as+breed+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+%22AnimalName%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed%22%29+as+breed+from+%5BMitcham-dog-registrations-2015%5D+where+%22Animal+Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_BREED%22%29+as+breed+from+%5Bburnside-dog-registrations-2015%5D+where+%22DOG_NAME%22+like+%3Aname%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5Bcity-of-playford-2015-dog-registration%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed+Description%22%29+as+breed+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where+%22Animal+Name%22+like+%3Aname%0D%0A%0D%0A%29+group+by+breed+order+by+n+desc%3B&name=rex | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | UI for editing named parameters 274001453 | |
344810525 | https://github.com/simonw/datasette/issues/46#issuecomment-344810525 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDgxMDUyNQ== | ingenieroariel 54999 | 2017-11-16T04:11:25Z | 2017-11-16T04:11:25Z | CONTRIBUTOR | @simonw On the spatialite support, here is some info to make it work and a screenshot: <img width="1230" alt="screen shot 2017-11-15 at 11 08 14 pm" src="https://user-images.githubusercontent.com/54999/32873420-f8a6d5a0-ca59-11e7-8a73-7d58d467e413.png"> I used the following Dockerfile: ``` FROM prolocutor/python3-sqlite-ext:3.5.1-spatialite as build RUN mkdir /code ADD . /code/ RUN pip install /code/ EXPOSE 8001 CMD ["datasette", "serve", "/code/ne.sqlite", "--host", "0.0.0.0"] ``` and added this to `prepare_connection`: ``` conn.enable_load_extension(True) conn.execute("SELECT load_extension('/usr/local/lib/mod_spatialite.so')") ``` | {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 1, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
344811268 | https://github.com/simonw/datasette/pull/107#issuecomment-344811268 | https://api.github.com/repos/simonw/datasette/issues/107 | MDEyOklzc3VlQ29tbWVudDM0NDgxMTI2OA== | raynae 3433657 | 2017-11-16T04:17:45Z | 2017-11-16T04:17:45Z | CONTRIBUTOR | Thanks for the guidance. I added a unit test and made a slight change to utils.py. I didn't realize this, but evidently string.format only complains if you supply fewer arguments than there are format placeholders, so the original commit worked, but was adding a superfluous named param. I added a conditional that prevents the named param from being created and ensures the correct number of args are passed to string.format. It has the side effect of hiding the SQL query in /templates/table.html when there are no other where clauses - not sure if that's the desired outcome here. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | add support for ?field__isnull=1 274343647 | |
344864254 | https://github.com/simonw/datasette/issues/100#issuecomment-344864254 | https://api.github.com/repos/simonw/datasette/issues/100 | MDEyOklzc3VlQ29tbWVudDM0NDg2NDI1NA== | coisnepe 13304454 | 2017-11-16T09:25:10Z | 2017-11-16T09:25:10Z | NONE | @simonw I see. I upgraded sanic-jinja2 and jinja2: it now works flawlessly. Thank you! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | TemplateAssertionError: no filter named 'tojson' 274160723 | |
344975156 | https://github.com/simonw/datasette/issues/46#issuecomment-344975156 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDk3NTE1Ng== | simonw 9599 | 2017-11-16T16:19:44Z | 2017-11-16T16:19:44Z | OWNER | That's fantastic! Thank you very much for that. Do you know if it's possible to view the Dockerfile used by https://hub.docker.com/r/prolocutor/python3-sqlite-ext/ ? | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
344976104 | https://github.com/simonw/datasette/issues/46#issuecomment-344976104 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDk3NjEwNA== | simonw 9599 | 2017-11-16T16:22:45Z | 2017-11-16T16:22:45Z | OWNER | Found a relevant Dockerfile on Reddit: https://www.reddit.com/r/Python/comments/5unkb3/install_sqlite3_on_python_3/ddzdz2b/ | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
344976882 | https://github.com/simonw/datasette/issues/46#issuecomment-344976882 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDk3Njg4Mg== | simonw 9599 | 2017-11-16T16:25:07Z | 2017-11-16T16:25:07Z | OWNER | Maybe part of the solution here is to add a `--load-extension` argument to `datasette` - so when you run the command you can specify SQLite extensions that should be loaded. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
344986423 | https://github.com/simonw/datasette/issues/109#issuecomment-344986423 | https://api.github.com/repos/simonw/datasette/issues/109 | MDEyOklzc3VlQ29tbWVudDM0NDk4NjQyMw== | simonw 9599 | 2017-11-16T16:53:26Z | 2017-11-16T16:53:26Z | OWNER | http://datasette.readthedocs.io/ | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up readthedocs 274378301 | |
344988263 | https://github.com/simonw/datasette/issues/110#issuecomment-344988263 | https://api.github.com/repos/simonw/datasette/issues/110 | MDEyOklzc3VlQ29tbWVudDM0NDk4ODI2Mw== | simonw 9599 | 2017-11-16T16:58:48Z | 2017-11-16T16:58:48Z | OWNER | Here's how I tested this. First I downloaded and started a docker container using https://hub.docker.com/r/prolocutor/python3-sqlite-ext - which includes the compiled spatialite extension. This downloads it, then starts a shell in that container. docker run -it -p 8018:8018 prolocutor/python3-sqlite-ext:3.5.1-spatialite /bin/sh Installed a pre-release build of datasette which includes the new `--load-extension` option. pip install https://static.simonwillison.net/static/2017/datasette-0.13-py3-none-any.whl Now grab a sample database from https://www.gaia-gis.it/spatialite-2.3.1/resources.html - and unzip and rename it (datasette doesn't yet like databases with dots in their filename): wget http://www.gaia-gis.it/spatialite-2.3.1/test-2.3.sqlite.gz gunzip test-2.3.sqlite.gz mv test-2.3.sqlite test23.sqlite Now start datasette on port 8018 (the port I exposed earlier) with the extension loaded: datasette test23.sqlite -p 8018 -h 0.0.0.0 --load-extension /usr/local/lib/mod_spatialite.so Now I can confirm that it worked: http://localhost:8018/test23-c88bc35?sql=select+ST_AsText%28Geometry%29+from+HighWays+limit+1 <img width="743" alt="test23" src="https://user-images.githubusercontent.com/9599/32904449-39789a3a-caac-11e7-9531-b49f06051e34.png"> If I run datasette without `--load-extension` I get this: datasette test23.sqlite -p 8018 -h 0.0.0.0 <img width="747" alt="test23_and_turn_on_auto-escaping_in_jinja_ _simonw_datasette_82261a6" src="https://user-images.githubusercontent.com/9599/32904508-54e53d78-caac-11e7-9f80-4d96e9f9fb5f.png"> | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add --load-extension option to datasette for loading extra SQLite extensions 274578142 | |
344988591 | https://github.com/simonw/datasette/issues/46#issuecomment-344988591 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDk4ODU5MQ== | simonw 9599 | 2017-11-16T16:59:51Z | 2017-11-16T16:59:51Z | OWNER | OK, `--load-extension` is now a supported command line option - see #110 which includes my notes on how I manually tested it using the `prolocutor/python3-sqlite-ext` Docker image. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
344989340 | https://github.com/simonw/datasette/issues/46#issuecomment-344989340 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDk4OTM0MA== | simonw 9599 | 2017-11-16T17:02:07Z | 2017-11-16T17:02:07Z | OWNER | The fact that `prolocutor/python3-sqlite-ext` doesn't provide a visible Dockerfile and hasn't been updated in two years makes me hesitant to bake it into datasette itself. I'd rather put together a Dockerfile that enables the necessary extensions and can live in the datasette repository itself. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
344995571 | https://github.com/simonw/datasette/issues/46#issuecomment-344995571 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDk5NTU3MQ== | simonw 9599 | 2017-11-16T17:22:32Z | 2017-11-16T17:22:32Z | OWNER | The JSON extension would be very worthwhile too: https://www.sqlite.org/json1.html | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
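For reference, a quick way to exercise the json1 functions once they are compiled in is the standard `sqlite3` module; this is only an illustration and raises `OperationalError: no such function` on builds without the extension:

    import sqlite3

    conn = sqlite3.connect(":memory:")

    # json() normalises a JSON string; json_extract() pulls out values by path.
    print(conn.execute("""SELECT json(' { "this" : "is", "a": [ "test" ] } ')""").fetchone())
    print(conn.execute(
        """SELECT json_extract('{"name": "datasette", "tags": ["sqlite"]}', '$.tags[0]')"""
    ).fetchone())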
345002908 | https://github.com/simonw/datasette/issues/46#issuecomment-345002908 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NTAwMjkwOA== | ingenieroariel 54999 | 2017-11-16T17:47:49Z | 2017-11-16T17:47:49Z | CONTRIBUTOR | I'll try to find alternatives to the Dockerfile option - I also think we should not use that old one without sources or license. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
345013127 | https://github.com/simonw/datasette/issues/111#issuecomment-345013127 | https://api.github.com/repos/simonw/datasette/issues/111 | MDEyOklzc3VlQ29tbWVudDM0NTAxMzEyNw== | simonw 9599 | 2017-11-16T18:23:56Z | 2017-11-16T18:23:56Z | OWNER | Having this as a global option may not make sense when publishing multiple databases. We can revisit that when we implement per-database and per-table metadata. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add “updated” to metadata 274615452 | |
345017256 | https://github.com/simonw/datasette/issues/110#issuecomment-345017256 | https://api.github.com/repos/simonw/datasette/issues/110 | MDEyOklzc3VlQ29tbWVudDM0NTAxNzI1Ng== | simonw 9599 | 2017-11-16T18:38:30Z | 2017-11-16T18:38:30Z | OWNER | To finish up, I committed the image I created above so I can run it again in the future: docker commit $(docker ps -lq) datasette-sqlite Now I can run it like this: docker run -it -p 8018:8018 datasette-sqlite datasette /tmp/test23.sqlite -p 8018 -h 0.0.0.0 --load-extension /usr/local/lib/mod_spatialite.so | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add --load-extension option to datasette for loading extra SQLite extensions 274578142 | |
345067498 | https://github.com/simonw/datasette/issues/14#issuecomment-345067498 | https://api.github.com/repos/simonw/datasette/issues/14 | MDEyOklzc3VlQ29tbWVudDM0NTA2NzQ5OA== | simonw 9599 | 2017-11-16T21:25:32Z | 2017-11-16T21:26:22Z | OWNER | For visualizations, Google Maps should be made available as a plugin. The default visualizations can use Leaflet and Open Street Map, but there's no reason not to make Google Maps available as a plugin, especially if the plugin can provide a mechanism for configuring the necessary API key. I'm particularly excited about the Google Maps heatmap visualization https://developers.google.com/maps/documentation/javascript/heatmaplayer as seen on http://mochimachine.org/wasteland/ | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Datasette Plugins 267707940 | |
345108644 | https://github.com/simonw/datasette/pull/107#issuecomment-345108644 | https://api.github.com/repos/simonw/datasette/issues/107 | MDEyOklzc3VlQ29tbWVudDM0NTEwODY0NA== | simonw 9599 | 2017-11-17T00:34:46Z | 2017-11-17T00:34:46Z | OWNER | Looks like your tests are failing because of a bug which I fixed in https://github.com/simonw/datasette/commit/9199945a1bcec4852e1cb866eb3642614dd32a48 - if you rebase to master the tests should pass. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | add support for ?field__isnull=1 274343647 | |
345117690 | https://github.com/simonw/datasette/pull/107#issuecomment-345117690 | https://api.github.com/repos/simonw/datasette/issues/107 | MDEyOklzc3VlQ29tbWVudDM0NTExNzY5MA== | raynae 3433657 | 2017-11-17T01:29:41Z | 2017-11-17T01:29:41Z | CONTRIBUTOR | Thanks for bearing with me. I was getting a message about my branch diverging when I tried to push after rebasing, so I merged master into isnull, seems like that did the trick. Let me know if I should make any corrections. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | add support for ?field__isnull=1 274343647 | |
345138134 | https://github.com/simonw/datasette/pull/114#issuecomment-345138134 | https://api.github.com/repos/simonw/datasette/issues/114 | MDEyOklzc3VlQ29tbWVudDM0NTEzODEzNA== | simonw 9599 | 2017-11-17T03:50:38Z | 2017-11-17T03:50:38Z | OWNER | Fantastic! Thank you very much. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add spatialite, switch to debian and local build 274733145 | |
345138347 | https://github.com/simonw/datasette/issues/46#issuecomment-345138347 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NTEzODM0Nw== | simonw 9599 | 2017-11-17T03:52:25Z | 2017-11-17T03:52:25Z | OWNER | We now have a Dockerfile that compiles spatialite! https://github.com/simonw/datasette/pull/114/commits/6c6b63d890529eeefcefb7ab126ea3bd7b2315c1 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
345150048 | https://github.com/simonw/datasette/issues/85#issuecomment-345150048 | https://api.github.com/repos/simonw/datasette/issues/85 | MDEyOklzc3VlQ29tbWVudDM0NTE1MDA0OA== | simonw 9599 | 2017-11-17T05:35:25Z | 2017-11-17T05:35:25Z | OWNER | `csvs-to-sqlite` is now capable of generating databases with foreign key lookup tables: https://github.com/simonw/csvs-to-sqlite/releases/tag/0.3 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Detect foreign keys and use them to link HTML pages together 273678673 | |
345242447 | https://github.com/simonw/datasette/issues/85#issuecomment-345242447 | https://api.github.com/repos/simonw/datasette/issues/85 | MDEyOklzc3VlQ29tbWVudDM0NTI0MjQ0Nw== | simonw 9599 | 2017-11-17T13:22:33Z | 2017-11-17T13:23:14Z | OWNER | I could support explicit label columns using additional arguments to `datasette serve`: datasette serve mydb.db --label-column mydb:table1:name --label-column mydb:table2:title This would mean "in mydb, set the label column for table1 to name, and the label column for table2 to title". | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Detect foreign keys and use them to link HTML pages together 273678673 | |
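A sketch of how the proposed `db:table:column` values could be parsed into a lookup structure - purely an illustration of the option syntax above, not datasette's actual implementation:

    # Values as they might arrive from repeated --label-column options.
    raw_options = ["mydb:table1:name", "mydb:table2:title"]

    # Build {database: {table: label_column}} from the db:table:column triples.
    label_columns = {}
    for value in raw_options:
        database, table, column = value.split(":", 2)
        label_columns.setdefault(database, {})[table] = column

    print(label_columns)
    # {'mydb': {'table1': 'name', 'table2': 'title'}}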
345255655 | https://github.com/simonw/datasette/issues/112#issuecomment-345255655 | https://api.github.com/repos/simonw/datasette/issues/112 | MDEyOklzc3VlQ29tbWVudDM0NTI1NTY1NQ== | simonw 9599 | 2017-11-17T14:19:23Z | 2017-11-17T14:19:23Z | OWNER | I tested this by first building and running a container using the new Dockerfile from #114: docker build . docker run -it -p 8001:8001 6c9ca7e29181 /bin/sh Then I ran this inside the container itself: apt update && apt-get install wget -y \ && wget http://www.gaia-gis.it/spatialite-2.3.1/test-2.3.sqlite.gz \ && gunzip test-2.3.sqlite.gz \ && mv test-2.3.sqlite test23.sqlite \ && datasette -h 0.0.0.0 test23.sqlite I visited this URL to confirm I got an error due to spatialite not being loaded: http://localhost:8001/test23-c88bc35?sql=select+ST_AsText%28Geometry%29+from+HighWays+limit+1 Then I checked that loading it with `--load-extension` worked correctly: datasette -h 0.0.0.0 test23.sqlite \ --load-extension=/usr/lib/x86_64-linux-gnu/mod_spatialite.so Then, finally, I tested it with the new environment variable option: SQLITE_EXTENSIONS=/usr/lib/x86_64-linux-gnu/mod_spatialite.so \ datasette -h 0.0.0.0 test23.sqlite Running it with an invalid environment variable option shows an error: $ SQLITE_EXTENSIONS=/usr/lib/x86_64-linux-gnu/blah.so datasette \ -h 0.0.0.0 test23.sqlite Usage: datasette -h [OPTIONS] [FILES]... Error: Invalid value for "--load-extension": Path "/usr/lib/x86_64-linux-gnu/blah.so" does not exist. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Allow --load-extension to be set via environment variables 274617240 | |
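The usage and error output above is the style click produces, and click can bind an option to an environment variable directly. A hedged sketch of the general pattern - the option and variable names mirror the ones tested above, but this is not a copy of datasette's actual code:

    import click

    @click.command()
    @click.argument("files", nargs=-1)
    @click.option(
        "--load-extension",
        envvar="SQLITE_EXTENSIONS",
        multiple=True,
        type=click.Path(exists=True),
        help="Path to a SQLite extension to load",
    )
    def cli(files, load_extension):
        # click falls back to SQLITE_EXTENSIONS when the flag is omitted, and the
        # Path(exists=True) type produces the "does not exist" error shown above.
        click.echo(f"files={files} extensions={load_extension}")

    if __name__ == "__main__":
        cli()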
345256576 | https://github.com/simonw/datasette/pull/115#issuecomment-345256576 | https://api.github.com/repos/simonw/datasette/issues/115 | MDEyOklzc3VlQ29tbWVudDM0NTI1NjU3Ng== | simonw 9599 | 2017-11-17T14:22:51Z | 2017-11-17T14:22:51Z | OWNER | This is great - I've been frustrated by how CodeMirror prevents me from hitting tab-enter to activate the "Run SQL" button. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add keyboard shortcut to execute SQL query 274877366 | |
345259115 | https://github.com/simonw/datasette/issues/46#issuecomment-345259115 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NTI1OTExNQ== | simonw 9599 | 2017-11-17T14:32:12Z | 2017-11-17T14:32:12Z | OWNER | OK, I can confirm that the version in the new docker container supports FTS5, JSON *and* spatialite! Notes on how I built the container and tested the spatialite extension are here: https://github.com/simonw/datasette/issues/112#issuecomment-345255655 To confirm that JSON and FTS5 are working, I ran the following: $ docker run -it -p 8001:8001 6c9ca7e29181 python Python 3.6.3 (default, Nov 4 2017, 14:24:48) [GCC 6.3.0 20170516] on linux Type "help", "copyright", "credits" or "license" for more information. >>> import sqlite3 >>> sqlite3.connect(':memory:').execute('CREATE VIRTUAL TABLE email USING fts5(sender, title, body);') <sqlite3.Cursor object at 0x7f2d90839960> >>> list(sqlite3.connect(':memory:').execute('''SELECT json(' { "this" : "is", "a": [ "test" ] } ') ''')) [('{"this":"is","a":["test"]}',)] If I do the same thing in python3 on my OS X laptop directly, I get this: $ python3 Python 3.5.1 (default, Apr 18 2016, 11:46:32) [GCC 4.2.1 Compatible Apple LLVM 7.3.0 (clang-703.0.29)] on darwin Type "help", "copyright", "credits" or "license" for more information. >>> import sqlite3 >>> sqlite3.connect(':memory:').execute('CREATE VIRTUAL TABLE email USING fts5(sender, title, body);') Traceback (most recent call last): File "<stdin>", line 1, in <module> sqlite3.OperationalError: no such module: fts5 >>> list(sqlite3.connect(':memory:').execute('''SELECT json(' { "this" : "is", "a": [ "test" ] } ') ''')) Traceback (most recent call last): File "<stdin>", line 1, in <module> sqlite3.OperationalError: no such function: json | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
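An even quicker check is to ask SQLite which compile-time options it was built with. A small sketch - the exact option names depend on the build, but builds with FTS5 and JSON1 report `ENABLE_FTS5` and `ENABLE_JSON1` (very recent SQLite versions ship JSON support without a separate flag):

    import sqlite3

    conn = sqlite3.connect(":memory:")

    # compile_options lists flags such as ENABLE_FTS5 and ENABLE_JSON1 when present.
    options = [row[0] for row in conn.execute("PRAGMA compile_options")]
    print("FTS5:", any("FTS5" in option for option in options))
    print("JSON1:", any("JSON1" in option for option in options))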
345260784 | https://github.com/simonw/datasette/issues/64#issuecomment-345260784 | https://api.github.com/repos/simonw/datasette/issues/64 | MDEyOklzc3VlQ29tbWVudDM0NTI2MDc4NA== | simonw 9599 | 2017-11-17T14:38:21Z | 2017-11-17T14:38:21Z | OWNER | This was fixed by ed2b3f25beac720f14869350baacc5f62b065194 in #107 - thanks @raynae! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Support for ?field__isnull=1 or similar 273181020 | |
345262738 | https://github.com/simonw/datasette/issues/36#issuecomment-345262738 | https://api.github.com/repos/simonw/datasette/issues/36 | MDEyOklzc3VlQ29tbWVudDM0NTI2MjczOA== | simonw 9599 | 2017-11-17T14:45:37Z | 2017-11-17T14:45:37Z | OWNER | Consider for example https://fivethirtyeight.datasettes.com/fivethirtyeight/inconvenient-sequel%2Fratings <img width="719" alt="fivethirtyeight__inconvenient-sequel_ratings" src="https://user-images.githubusercontent.com/9599/32952559-82b81ea8-cb62-11e7-817a-45c8bba7e9a2.png"> The idea here is to be able to support querystring parameters like this: * `?timestamp___date=2017-07-17` - return every item where the timestamp falls on that date * `?timestamp___year=2017` - return every item where the timestamp falls within 2017 * `?timestamp___month=1` - return every item where the month component is January * `?timestamp___day=10` - return every item where the day-of-the-month component is 10 This is similar to #64 but a fair bit more complicated. SQLite date functions are documented here: https://sqlite.org/lang_datefunc.html | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | date, year, month and day querystring lookups 268262480 |
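A sketch of how those lookups could translate into SQLite's date functions - illustrative WHERE clauses only, not how datasette implements them:

    import sqlite3

    # Hypothetical mapping from the proposed lookups to SQLite date-function clauses.
    LOOKUPS = {
        "date": "date({col}) = :value",                              # ?timestamp___date=2017-07-17
        "year": "strftime('%Y', {col}) = :value",                    # ?timestamp___year=2017
        "month": "CAST(strftime('%m', {col}) AS INTEGER) = :value",  # ?timestamp___month=1
        "day": "CAST(strftime('%d', {col}) AS INTEGER) = :value",    # ?timestamp___day=10
    }

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE ratings (timestamp TEXT)")
    conn.execute("INSERT INTO ratings VALUES ('2017-07-17 12:00:00')")

    clause = LOOKUPS["year"].format(col="timestamp")
    rows = conn.execute(f"SELECT * FROM ratings WHERE {clause}", {"value": "2017"}).fetchall()
    print(rows)  # [('2017-07-17 12:00:00',)]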
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);