issue_comments
9,947 rows sorted by user
id | html_url | issue_url | node_id | user ▼ | created_at | updated_at | author_association | body | reactions | issue | performed_via_github_app |
---|---|---|---|---|---|---|---|---|---|---|---|
343675165 | https://github.com/simonw/datasette/issues/14#issuecomment-343675165 | https://api.github.com/repos/simonw/datasette/issues/14 | MDEyOklzc3VlQ29tbWVudDM0MzY3NTE2NQ== | simonw 9599 | 2017-11-11T16:07:10Z | 2017-11-11T16:07:10Z | OWNER | The plugin system can also allow alternative providers for the `publish` command - e.g. maybe hook up hyper.sh as an option for publishing containers. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Datasette Plugins 267707940 | |
343676574 | https://github.com/simonw/datasette/issues/59#issuecomment-343676574 | https://api.github.com/repos/simonw/datasette/issues/59 | MDEyOklzc3VlQ29tbWVudDM0MzY3NjU3NA== | simonw 9599 | 2017-11-11T16:29:48Z | 2017-11-11T16:29:48Z | OWNER | See also #14 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | datasette publish hyper 273157085 | |
343683566 | https://github.com/simonw/datasette/issues/60#issuecomment-343683566 | https://api.github.com/repos/simonw/datasette/issues/60 | MDEyOklzc3VlQ29tbWVudDM0MzY4MzU2Ng== | simonw 9599 | 2017-11-11T18:12:24Z | 2017-11-11T18:12:24Z | OWNER | I’m going to solve this by making it an optional argument you can pass to the serve command. Then the Dockerfile can still build and use it but it won’t interfere with tests or dev. If argument is not passed, we will calculate hashes on startup and calculate table row counts on demand. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Rethink how metadata is generated and stored 273163905 | |
343690060 | https://github.com/simonw/datasette/issues/47#issuecomment-343690060 | https://api.github.com/repos/simonw/datasette/issues/47 | MDEyOklzc3VlQ29tbWVudDM0MzY5MDA2MA== | simonw 9599 | 2017-11-11T19:56:08Z | 2017-11-11T19:56:08Z | OWNER | "parlgov-development.db": { "url": "http://www.parlgov.org/" }, "nhsadmin.sqlite": { "url": "https://github.com/psychemedia/openHealthDataDoodles" } | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Create neat example database 271831408 | |
343691342 | https://github.com/simonw/datasette/issues/16#issuecomment-343691342 | https://api.github.com/repos/simonw/datasette/issues/16 | MDEyOklzc3VlQ29tbWVudDM0MzY5MTM0Mg== | simonw 9599 | 2017-11-11T20:19:07Z | 2017-11-11T20:19:07Z | OWNER | Closing this, opening a fresh ticket for the navigation stuff. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Default HTML/CSS needs to look reasonable and be responsive 267726219 | |
343697291 | https://github.com/simonw/datasette/issues/63#issuecomment-343697291 | https://api.github.com/repos/simonw/datasette/issues/63 | MDEyOklzc3VlQ29tbWVudDM0MzY5NzI5MQ== | simonw 9599 | 2017-11-11T22:05:06Z | 2017-11-11T22:11:49Z | OWNER | I'm going to bundle sql and sql_params together into a query nested object like this: { "query": { "sql": "select ...", "params": { "p0": "blah" } } } | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Review design of JSON output 273174447 | |
343698214 | https://github.com/simonw/datasette/issues/50#issuecomment-343698214 | https://api.github.com/repos/simonw/datasette/issues/50 | MDEyOklzc3VlQ29tbWVudDM0MzY5ODIxNA== | simonw 9599 | 2017-11-11T22:23:21Z | 2017-11-11T22:23:21Z | OWNER | I'm closing #50 - more tests will be added in the future, but the framework is neatly in place for them now. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Unit tests against application itself 272694136 | |
343699115 | https://github.com/simonw/datasette/issues/53#issuecomment-343699115 | https://api.github.com/repos/simonw/datasette/issues/53 | MDEyOklzc3VlQ29tbWVudDM0MzY5OTExNQ== | simonw 9599 | 2017-11-11T22:41:38Z | 2017-11-11T22:41:38Z | OWNER | This needs to incorporate a sensible way of presenting custom SQL query results too. And let's get a textarea in there for executing SQL while we're at it. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Implement a better database index page 273054652 | |
343705966 | https://github.com/simonw/datasette/issues/47#issuecomment-343705966 | https://api.github.com/repos/simonw/datasette/issues/47 | MDEyOklzc3VlQ29tbWVudDM0MzcwNTk2Ng== | simonw 9599 | 2017-11-12T01:00:20Z | 2017-11-12T01:00:20Z | OWNER | https://github.com/fivethirtyeight/data has a ton of CSVs | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Create neat example database 271831408 | |
343707624 | https://github.com/simonw/datasette/issues/53#issuecomment-343707624 | https://api.github.com/repos/simonw/datasette/issues/53 | MDEyOklzc3VlQ29tbWVudDM0MzcwNzYyNA== | simonw 9599 | 2017-11-12T01:47:45Z | 2017-11-12T01:47:45Z | OWNER | Split the SQL thing out into #65 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Implement a better database index page 273054652 | |
343707676 | https://github.com/simonw/datasette/issues/53#issuecomment-343707676 | https://api.github.com/repos/simonw/datasette/issues/53 | MDEyOklzc3VlQ29tbWVudDM0MzcwNzY3Ng== | simonw 9599 | 2017-11-12T01:49:07Z | 2017-11-12T01:49:07Z | OWNER | Here's the new design: <img width="691" alt="parlgov-development" src="https://user-images.githubusercontent.com/9599/32695161-82821226-c708-11e7-835c-b3d91850b2e0.png"> Also lists views at the bottom (refs #54): <img width="345" alt="parlgov-development" src="https://user-images.githubusercontent.com/9599/32695164-99efa7de-c708-11e7-8272-bc5f5b870b84.png"> | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Implement a better database index page 273054652 | |
343708447 | https://github.com/simonw/datasette/issues/42#issuecomment-343708447 | https://api.github.com/repos/simonw/datasette/issues/42 | MDEyOklzc3VlQ29tbWVudDM0MzcwODQ0Nw== | simonw 9599 | 2017-11-12T02:12:15Z | 2017-11-12T02:12:15Z | OWNER | I ditched the metadata file concept. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Homepage UI for editing metadata file 268591332 | |
343709217 | https://github.com/simonw/datasette/issues/65#issuecomment-343709217 | https://api.github.com/repos/simonw/datasette/issues/65 | MDEyOklzc3VlQ29tbWVudDM0MzcwOTIxNw== | simonw 9599 | 2017-11-12T02:36:37Z | 2017-11-12T02:36:37Z | OWNER | <img width="982" alt="nhsadmin" src="https://user-images.githubusercontent.com/9599/32695392-3ea12612-c70f-11e7-873b-9e6ad2c869e7.png"> | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Re-implement ?sql= mode 273191608 | |
343715915 | https://github.com/simonw/datasette/issues/25#issuecomment-343715915 | https://api.github.com/repos/simonw/datasette/issues/25 | MDEyOklzc3VlQ29tbWVudDM0MzcxNTkxNQ== | simonw 9599 | 2017-11-12T06:08:28Z | 2017-11-12T06:08:28Z | OWNER | con = sqlite3.connect('existing_db.db') with open('dump.sql', 'w') as f: for line in con.iterdump(): f.write('%s\n' % line) | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Endpoint that returns SQL ready to be piped into DB 267857622 | |
343752404 | https://github.com/simonw/datasette/issues/42#issuecomment-343752404 | https://api.github.com/repos/simonw/datasette/issues/42 | MDEyOklzc3VlQ29tbWVudDM0Mzc1MjQwNA== | simonw 9599 | 2017-11-12T17:20:10Z | 2017-11-12T17:20:10Z | OWNER | Re-opening this - I've decided to bring back this concept, see #68 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Homepage UI for editing metadata file 268591332 | |
343752579 | https://github.com/simonw/datasette/issues/69#issuecomment-343752579 | https://api.github.com/repos/simonw/datasette/issues/69 | MDEyOklzc3VlQ29tbWVudDM0Mzc1MjU3OQ== | simonw 9599 | 2017-11-12T17:22:39Z | 2017-11-12T17:22:39Z | OWNER | By default I'll allow LIMIT and OFFSET up to a maximum of X (where X is let's say 50,000 to start with, but can be custom configured to a larger number or set to None for no limit). | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Enforce pagination (or at least limits) for arbitrary custom SQL 273248366 | |
343752683 | https://github.com/simonw/datasette/issues/66#issuecomment-343752683 | https://api.github.com/repos/simonw/datasette/issues/66 | MDEyOklzc3VlQ29tbWVudDM0Mzc1MjY4Mw== | simonw 9599 | 2017-11-12T17:24:05Z | 2017-11-12T17:24:21Z | OWNER | Maybe SQL views should have their own Sanic view class (`ViewView` is kinda funny), subclassed from `TableView`? | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Show table SQL on table page 273191806 | |
343753999 | https://github.com/simonw/datasette/issues/68#issuecomment-343753999 | https://api.github.com/repos/simonw/datasette/issues/68 | MDEyOklzc3VlQ29tbWVudDM0Mzc1Mzk5OQ== | simonw 9599 | 2017-11-12T17:45:21Z | 2017-11-12T19:38:33Z | OWNER | For initial launch, I could just support this as some optional command line arguments you pass to the publish command: datasette publish data.db --title="Title" --source="url" | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Support for title/source/license metadata 273247186 | |
343754058 | https://github.com/simonw/datasette/issues/68#issuecomment-343754058 | https://api.github.com/repos/simonw/datasette/issues/68 | MDEyOklzc3VlQ29tbWVudDM0Mzc1NDA1OA== | simonw 9599 | 2017-11-12T17:46:13Z | 2017-11-12T17:46:13Z | OWNER | I’m going to store this stuff in a file called metadata.json and move the existing automatically generated metadata to a file called build.json | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Support for title/source/license metadata 273247186 | |
343769692 | https://github.com/simonw/datasette/issues/57#issuecomment-343769692 | https://api.github.com/repos/simonw/datasette/issues/57 | MDEyOklzc3VlQ29tbWVudDM0Mzc2OTY5Mg== | simonw 9599 | 2017-11-12T21:32:36Z | 2017-11-12T21:32:36Z | OWNER | I have created a Docker Hub public repository for this: https://hub.docker.com/r/simonwillison/datasette/ | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Ship a Docker image of the whole thing 273127694 | |
343780039 | https://github.com/simonw/datasette/issues/69#issuecomment-343780039 | https://api.github.com/repos/simonw/datasette/issues/69 | MDEyOklzc3VlQ29tbWVudDM0Mzc4MDAzOQ== | simonw 9599 | 2017-11-13T00:05:27Z | 2017-11-13T00:05:27Z | OWNER | I think the only safe way to do this is using SQLite `.fetchmany(1000)` - I can't guarantee that the user has not entered SQL that will outfox a limit in some way. So instead of attempting to edit their SQL, I'll always return 1001 records and let them know if they went over 1000 or not. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Enforce pagination (or at least limits) for arbitrary custom SQL 273248366 | |
343780141 | https://github.com/simonw/datasette/issues/71#issuecomment-343780141 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4MDE0MQ== | simonw 9599 | 2017-11-13T00:06:52Z | 2017-11-13T00:06:52Z | OWNER | I've registered datasettes.com as a domain name for doing this. Now setting it up so Cloudflare and Now can serve content from it. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343780539 | https://github.com/simonw/datasette/issues/71#issuecomment-343780539 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4MDUzOQ== | simonw 9599 | 2017-11-13T00:13:29Z | 2017-11-13T00:19:46Z | OWNER | https://zeit.co/docs/features/dns is docs now domain add -e datasettes.com I had to set up a custom TXT record on `_now.datasettes.com` to get this to work. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343780671 | https://github.com/simonw/datasette/issues/71#issuecomment-343780671 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4MDY3MQ== | simonw 9599 | 2017-11-13T00:15:21Z | 2017-11-13T00:17:37Z | OWNER | - [x] Redirect https://datasettes.com/ and https://www.datasettes.com/ to https://github.com/simonw/datasette | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343780814 | https://github.com/simonw/datasette/issues/71#issuecomment-343780814 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4MDgxNA== | simonw 9599 | 2017-11-13T00:17:50Z | 2017-11-13T00:18:19Z | OWNER | Achieved those redirects using Cloudflare "page rules": https://www.cloudflare.com/a/page-rules/datasettes.com | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343781030 | https://github.com/simonw/datasette/issues/71#issuecomment-343781030 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4MTAzMA== | simonw 9599 | 2017-11-13T00:21:05Z | 2017-11-13T02:09:32Z | OWNER | - [x] Have `now domain add -e datasettes.com` run without errors (hopefully just a matter of waiting for the DNS to update) - [x] Alias an example dataset hosted on Now on a datasettes.com subdomain - [x] Confirm that HTTP caching and HTTP/2 redirect pushing works as expected - this may require another page rule | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343788581 | https://github.com/simonw/datasette/issues/71#issuecomment-343788581 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4ODU4MQ== | simonw 9599 | 2017-11-13T01:48:17Z | 2017-11-13T01:48:17Z | OWNER | I had to add a rule like this to get letsencrypt certificates on now.sh working: https://github.com/zeit/now-cli/issues/188#issuecomment-270105052 <img width="975" alt="page_rules__datasettes_com___cloudflare_-_web_performance___security" src="https://user-images.githubusercontent.com/9599/32706131-c3d88742-c7cf-11e7-8d39-07d3554ce3cf.png"> I also have to flip this switch off every time I want to add a new alias: <img width="993" alt="crypto__datasettes_com___cloudflare_-_web_performance___security" src="https://user-images.githubusercontent.com/9599/32706326-a8ba1320-c7d1-11e7-8846-eb1e62efa3ed.png"> | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343788780 | https://github.com/simonw/datasette/issues/71#issuecomment-343788780 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4ODc4MA== | simonw 9599 | 2017-11-13T01:50:01Z | 2017-11-13T01:50:01Z | OWNER | Added another page rule in order to get Cloudflare to always obey cache headers sent by the server: <img width="978" alt="page_rules__datasettes_com___cloudflare_-_web_performance___security" src="https://user-images.githubusercontent.com/9599/32706355-ded7c60a-c7d1-11e7-93da-20989f40d527.png"> | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343788817 | https://github.com/simonw/datasette/issues/71#issuecomment-343788817 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4ODgxNw== | simonw 9599 | 2017-11-13T01:50:27Z | 2017-11-13T01:50:27Z | OWNER | https://fivethirtyeight.datasettes.com/ is now up and running. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343789162 | https://github.com/simonw/datasette/issues/71#issuecomment-343789162 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc4OTE2Mg== | simonw 9599 | 2017-11-13T01:53:29Z | 2017-11-13T01:53:29Z | OWNER | ``` $ curl -i 'https://fivethirtyeight.datasettes.com/fivethirtyeight-75d605c/obama-commutations%2Fobama_commutations.csv.jsono' HTTP/1.1 200 OK Date: Mon, 13 Nov 2017 01:50:57 GMT Content-Type: application/json Transfer-Encoding: chunked Connection: keep-alive Set-Cookie: __cfduid=de836090f3e12a60579cc7a1696cf0d9e1510537857; expires=Tue, 13-Nov-18 01:50:57 GMT; path=/; domain=.datasettes.com; HttpOnly; Secure Access-Control-Allow-Origin: * Cache-Control: public, max-age=31536000 X-Now-Region: now-sfo CF-Cache-Status: HIT Expires: Tue, 13 Nov 2018 01:50:57 GMT Server: cloudflare-nginx CF-RAY: 3bce154a6d9293b4-SJC {"database": "fivethirtyeight", "table": "obama-commutations/obama_commutations.csv"...``` | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343790984 | https://github.com/simonw/datasette/issues/71#issuecomment-343790984 | https://api.github.com/repos/simonw/datasette/issues/71 | MDEyOklzc3VlQ29tbWVudDM0Mzc5MDk4NA== | simonw 9599 | 2017-11-13T02:09:34Z | 2017-11-13T02:09:34Z | OWNER | HTTP/2 push totally worked on the redirect! fetch('https://fivethirtyeight.datasettes.com/fivethirtyeight/riddler-pick-lowest%2Flow_numbers.csv.jsono').then(r => r.json()).then(console.log) <img width="735" alt="eventbrite_api___v3_destination_search_" src="https://user-images.githubusercontent.com/9599/32706785-6c8dc178-c7d4-11e7-9130-8b6ec84a7430.png"> Meanwhile, in the network pane... <img width="770" alt="eventbrite_api___v3_destination_search_" src="https://user-images.githubusercontent.com/9599/32706803-7f92aac2-c7d4-11e7-9b01-c87e3d2c3891.png"> <img width="771" alt="eventbrite_api___v3_destination_search_" src="https://user-images.githubusercontent.com/9599/32706821-95106bb4-c7d4-11e7-9ad5-31107faa5795.png"> | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up some example datasets on a Cloudflare-backed domain 273278840 | |
343791348 | https://github.com/simonw/datasette/issues/68#issuecomment-343791348 | https://api.github.com/repos/simonw/datasette/issues/68 | MDEyOklzc3VlQ29tbWVudDM0Mzc5MTM0OA== | simonw 9599 | 2017-11-13T02:12:58Z | 2017-11-13T02:12:58Z | OWNER | I should use this on https://fivethirtyeight.datasettes.com/ | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Support for title/source/license metadata 273247186 | |
343801392 | https://github.com/simonw/datasette/issues/73#issuecomment-343801392 | https://api.github.com/repos/simonw/datasette/issues/73 | MDEyOklzc3VlQ29tbWVudDM0MzgwMTM5Mg== | simonw 9599 | 2017-11-13T03:36:47Z | 2017-11-13T03:36:47Z | OWNER | While I’m at it, let’s allow people to opt out of HTTP/2 push with a ?_nopush=1 argument too - in case they decide they don’t want to receive large 302 responses. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | _nocache=1 query string option for use with sort-by-random 273296178 | |
343951751 | https://github.com/simonw/datasette/issues/68#issuecomment-343951751 | https://api.github.com/repos/simonw/datasette/issues/68 | MDEyOklzc3VlQ29tbWVudDM0Mzk1MTc1MQ== | simonw 9599 | 2017-11-13T15:21:04Z | 2017-11-13T15:21:04Z | OWNER | For first version, I'm just supporting title, source and license information at the database level. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Support for title/source/license metadata 273247186 | |
343961784 | https://github.com/simonw/datasette/issues/67#issuecomment-343961784 | https://api.github.com/repos/simonw/datasette/issues/67 | MDEyOklzc3VlQ29tbWVudDM0Mzk2MTc4NA== | simonw 9599 | 2017-11-13T15:50:50Z | 2017-11-13T15:50:50Z | OWNER | `datasette package ...` - same arguments as `datasette publish`. Creates Docker container in your local repo, optionally tagged with `--tag` | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Command that builds a local docker container 273192789 | |
343967020 | https://github.com/simonw/datasette/issues/67#issuecomment-343967020 | https://api.github.com/repos/simonw/datasette/issues/67 | MDEyOklzc3VlQ29tbWVudDM0Mzk2NzAyMA== | simonw 9599 | 2017-11-13T16:06:10Z | 2017-11-13T16:06:10Z | OWNER | http://odewahn.github.io/docker-jumpstart/example.html is helpful | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Command that builds a local docker container 273192789 | |
344000982 | https://github.com/simonw/datasette/issues/75#issuecomment-344000982 | https://api.github.com/repos/simonw/datasette/issues/75 | MDEyOklzc3VlQ29tbWVudDM0NDAwMDk4Mg== | simonw 9599 | 2017-11-13T17:50:27Z | 2017-11-13T17:50:27Z | OWNER | This is necessary because one of the fun things to do with this tool is run it locally, e.g.: datasette ~/Library/Application\ Support/Google/Chrome/Default/History -p 8003 BUT... if we enable CORS by default, an evil site could try sniffing for localhost:8003 and attempt to steal data. So we'll enable the CORS headers only if `--cors` is provided to the command, and then use that command in the default Dockerfile. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add --cors argument to serve 273509159 | |
344017088 | https://github.com/simonw/datasette/issues/51#issuecomment-344017088 | https://api.github.com/repos/simonw/datasette/issues/51 | MDEyOklzc3VlQ29tbWVudDM0NDAxNzA4OA== | simonw 9599 | 2017-11-13T18:44:23Z | 2017-11-13T18:44:23Z | OWNER | Implemented in https://github.com/simonw/datasette/commit/e838bd743d31358b362875854a0ac5e78047727f | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Make a proper README 272735257 | |
344018680 | https://github.com/simonw/datasette/issues/74#issuecomment-344018680 | https://api.github.com/repos/simonw/datasette/issues/74 | MDEyOklzc3VlQ29tbWVudDM0NDAxODY4MA== | simonw 9599 | 2017-11-13T18:49:58Z | 2017-11-13T18:49:58Z | OWNER | Turns out it does this already: https://github.com/simonw/datasette/blob/6b3b05b6db0d2a7b7cec8b8dbb4ddc5e12a376b2/datasette/app.py#L96-L107 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Send a 302 redirect to the new hash for hits to old hashes 273296684 | |
344019631 | https://github.com/simonw/datasette/issues/69#issuecomment-344019631 | https://api.github.com/repos/simonw/datasette/issues/69 | MDEyOklzc3VlQ29tbWVudDM0NDAxOTYzMQ== | simonw 9599 | 2017-11-13T18:53:13Z | 2017-11-13T18:53:13Z | OWNER | I'm going with a page size of 100 and a max limit of 1000 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Enforce pagination (or at least limits) for arbitrary custom SQL 273248366 | |
344048656 | https://github.com/simonw/datasette/issues/69#issuecomment-344048656 | https://api.github.com/repos/simonw/datasette/issues/69 | MDEyOklzc3VlQ29tbWVudDM0NDA0ODY1Ng== | simonw 9599 | 2017-11-13T20:32:47Z | 2017-11-13T20:32:47Z | OWNER | <img width="908" alt="ak" src="https://user-images.githubusercontent.com/9599/32745247-071aa764-c867-11e7-9748-88e22f5eee57.png"> | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Enforce pagination (or at least limits) for arbitrary custom SQL 273248366 | |
344060070 | https://github.com/simonw/datasette/issues/55#issuecomment-344060070 | https://api.github.com/repos/simonw/datasette/issues/55 | MDEyOklzc3VlQ29tbWVudDM0NDA2MDA3MA== | simonw 9599 | 2017-11-13T21:14:13Z | 2017-11-13T21:14:13Z | OWNER | I'm going to add some extra metadata to setup.py and then tag this as version 0.8: git tag 0.8 git push --tags Then to ship to PyPI: python setup.py bdist_wheel twine register dist/datasette-0.8-py3-none-any.whl twine upload dist/datasette-0.8-py3-none-any.whl | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Ship first version to PyPI 273127117 | |
344061762 | https://github.com/simonw/datasette/issues/55#issuecomment-344061762 | https://api.github.com/repos/simonw/datasette/issues/55 | MDEyOklzc3VlQ29tbWVudDM0NDA2MTc2Mg== | simonw 9599 | 2017-11-13T21:19:43Z | 2017-11-13T21:19:43Z | OWNER | And we're live! https://pypi.python.org/pypi/datasette | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Ship first version to PyPI 273127117 | |
344074443 | https://github.com/simonw/datasette/issues/80#issuecomment-344074443 | https://api.github.com/repos/simonw/datasette/issues/80 | MDEyOklzc3VlQ29tbWVudDM0NDA3NDQ0Mw== | simonw 9599 | 2017-11-13T22:04:54Z | 2017-11-13T22:05:02Z | OWNER | The fivethirtyeight dataset: datasette publish now --name fivethirtyeight --metadata metadata.json fivethirtyeight.db now alias https://fivethirtyeight-jyqfudvjli.now.sh fivethirtyeight.datasettes.com And parlgov: datasette publish now parlgov.db --name=parlgov --metadata=parlgov.json now alias https://parlgov-hqvxuhmbyh.now.sh parlgov.datasettes.com | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Deploy final versions of fivethirtyeight and parlgov datasets (with view pagination) 273569477 | |
344075696 | https://github.com/simonw/datasette/issues/80#issuecomment-344075696 | https://api.github.com/repos/simonw/datasette/issues/80 | MDEyOklzc3VlQ29tbWVudDM0NDA3NTY5Ng== | simonw 9599 | 2017-11-13T22:09:46Z | 2017-11-13T22:09:46Z | OWNER | Parlgov was throwing errors on one of the views, which takes longer than 1000ms to execute - so I added the ability to customize the time limit in https://github.com/simonw/datasette/commit/1e698787a4dd6df0432021a6814c446c8b69bba2 datasette publish now parlgov.db --metadata parlgov.json --name parlgov --extra-options="--sql_time_limit_ms=3500" now alias https://parlgov-nvkcowlixq.now.sh parlgov.datasettes.com https://parlgov.datasettes.com/parlgov-25f9855/view_cabinet now returns in just over 2.5s | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Deploy final versions of fivethirtyeight and parlgov datasets (with view pagination) 273569477 | |
344076554 | https://github.com/simonw/datasette/pull/81#issuecomment-344076554 | https://api.github.com/repos/simonw/datasette/issues/81 | MDEyOklzc3VlQ29tbWVudDM0NDA3NjU1NA== | simonw 9599 | 2017-11-13T22:12:57Z | 2017-11-13T22:12:57Z | OWNER | Hah, I haven't even announced this yet :) Travis is upset because I'm using SQL in the tests which isn't compatible with their version of Python 3. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | :fire: Removes DS_Store 273595473 | |
344081876 | https://github.com/simonw/datasette/issues/59#issuecomment-344081876 | https://api.github.com/repos/simonw/datasette/issues/59 | MDEyOklzc3VlQ29tbWVudDM0NDA4MTg3Ng== | simonw 9599 | 2017-11-13T22:33:43Z | 2017-11-13T22:33:43Z | OWNER | The `datasette package` command introduced in 4143e3b45c16cbae5e3e3419ef479a71810e7df3 is relevant here. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | datasette publish hyper 273157085 | |
344118849 | https://github.com/simonw/datasette/issues/82#issuecomment-344118849 | https://api.github.com/repos/simonw/datasette/issues/82 | MDEyOklzc3VlQ29tbWVudDM0NDExODg0OQ== | simonw 9599 | 2017-11-14T01:46:10Z | 2017-11-14T01:46:10Z | OWNER | Did this: https://simonwillison.net/2017/Nov/13/datasette/ | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Post a blog entry announcing it to the world 273596159 | |
344132481 | https://github.com/simonw/datasette/issues/47#issuecomment-344132481 | https://api.github.com/repos/simonw/datasette/issues/47 | MDEyOklzc3VlQ29tbWVudDM0NDEzMjQ4MQ== | simonw 9599 | 2017-11-14T03:08:13Z | 2017-11-14T03:08:13Z | OWNER | I ended up shipping with https://fivethirtyeight.datasettes.com/ and https://parlgov.datasettes.com/ | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Create neat example database 271831408 | |
344141199 | https://github.com/simonw/datasette/issues/59#issuecomment-344141199 | https://api.github.com/repos/simonw/datasette/issues/59 | MDEyOklzc3VlQ29tbWVudDM0NDE0MTE5OQ== | simonw 9599 | 2017-11-14T04:13:11Z | 2017-11-14T04:13:11Z | OWNER | I managed to do this manually: datasette package ~/parlgov-db/parlgov.db --metadata=parlgov.json # Output 8758ec31dda3 as the new image ID docker save 8758ec31dda3 > /tmp/my-image # I could have just piped this straight to hyper cat /tmp/my-image | hyper load # Now start the container running in hyper hyper run -d -p 80:8001 --name parlgov 8758ec31dda3 # We need to assign an IP address so we can see it hyper fip allocate 1 # Outputs 199.245.58.78 hyper fip attach 199.245.58.78 parlgov At this point, visiting the IP address in a browser showed the parlgov UI. To clean up... hyper hyper fip detach parlgov hyper fip release 199.245.58.78 hyper stop parlgov hyper rm parlgov | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | datasette publish hyper 273157085 | |
344141515 | https://github.com/simonw/datasette/issues/79#issuecomment-344141515 | https://api.github.com/repos/simonw/datasette/issues/79 | MDEyOklzc3VlQ29tbWVudDM0NDE0MTUxNQ== | simonw 9599 | 2017-11-14T04:16:01Z | 2017-11-14T04:16:01Z | OWNER | This is probably a bit too much for the README - I should get readthedocs working. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add more detailed API documentation to the README 273569068 | |
344149165 | https://github.com/simonw/datasette/issues/57#issuecomment-344149165 | https://api.github.com/repos/simonw/datasette/issues/57 | MDEyOklzc3VlQ29tbWVudDM0NDE0OTE2NQ== | simonw 9599 | 2017-11-14T05:16:34Z | 2017-11-14T05:17:14Z | OWNER | I’m intrigued by this pattern: https://github.com/macropin/datasette/blob/147195c2fdfa2b984d8f9fc1c6cab6634970a056/Dockerfile#L8 What’s the benefit of doing that? Does it result in a smaller image size? | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Ship a Docker image of the whole thing 273127694 | |
344161226 | https://github.com/simonw/datasette/issues/46#issuecomment-344161226 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDE2MTIyNg== | simonw 9599 | 2017-11-14T06:41:21Z | 2017-11-14T06:41:21Z | OWNER | Spatial extensions would be really useful too. https://www.gaia-gis.it/spatialite-2.1/SpatiaLite-manual.html | {"total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
344161371 | https://github.com/simonw/datasette/issues/46#issuecomment-344161371 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDE2MTM3MQ== | simonw 9599 | 2017-11-14T06:42:15Z | 2017-11-14T06:42:15Z | OWNER | http://charlesleifer.com/blog/going-fast-with-sqlite-and-python/ is useful here too. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
344161430 | https://github.com/simonw/datasette/issues/46#issuecomment-344161430 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDE2MTQzMA== | simonw 9599 | 2017-11-14T06:42:44Z | 2017-11-14T06:42:44Z | OWNER | Also requested on Twitter: https://twitter.com/DenubisX/status/930322813864439808 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
344179878 | https://github.com/simonw/datasette/issues/27#issuecomment-344179878 | https://api.github.com/repos/simonw/datasette/issues/27 | MDEyOklzc3VlQ29tbWVudDM0NDE3OTg3OA== | simonw 9599 | 2017-11-14T08:21:22Z | 2017-11-14T08:21:22Z | OWNER | https://github.com/frappe/charts perhaps | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Ability to plot a simple graph 267886330 | |
344180866 | https://github.com/simonw/datasette/issues/43#issuecomment-344180866 | https://api.github.com/repos/simonw/datasette/issues/43 | MDEyOklzc3VlQ29tbWVudDM0NDE4MDg2Ng== | simonw 9599 | 2017-11-14T08:25:37Z | 2017-11-14T08:25:37Z | OWNER | This isn’t necessary - restarting the server is fast and easy, and I’ve not found myself needing this at all during development. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | While running, server should spot new db files added to its directory 268592894 | |
344185817 | https://github.com/simonw/datasette/issues/57#issuecomment-344185817 | https://api.github.com/repos/simonw/datasette/issues/57 | MDEyOklzc3VlQ29tbWVudDM0NDE4NTgxNw== | simonw 9599 | 2017-11-14T08:46:24Z | 2017-11-14T08:46:24Z | OWNER | Thanks for the explanation! Please do start a pull request. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Ship a Docker image of the whole thing 273127694 | |
344352573 | https://github.com/simonw/datasette/issues/30#issuecomment-344352573 | https://api.github.com/repos/simonw/datasette/issues/30 | MDEyOklzc3VlQ29tbWVudDM0NDM1MjU3Mw== | simonw 9599 | 2017-11-14T18:29:01Z | 2017-11-14T18:29:01Z | OWNER | This is a dupe of #85 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Do something neat with foreign keys 268078453 | |
344409906 | https://github.com/simonw/datasette/issues/93#issuecomment-344409906 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDQwOTkwNg== | simonw 9599 | 2017-11-14T21:47:02Z | 2017-11-14T21:47:02Z | OWNER | Even without bundling in the database file itself, I'd love to have a standalone binary version of the core `datasette` CLI utility. I think Sanic may have some complex dependencies, but I've never tried pyinstaller so I don't know how easy or hard it would be to get this working. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Package as standalone binary 273944952 | |
344415756 | https://github.com/simonw/datasette/issues/93#issuecomment-344415756 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDQxNTc1Ng== | simonw 9599 | 2017-11-14T22:09:13Z | 2017-11-14T22:09:13Z | OWNER | Looks like we'd need to use this recipe: https://github.com/pyinstaller/pyinstaller/wiki/Recipe-Setuptools-Entry-Point | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Package as standalone binary 273944952 | |
344426887 | https://github.com/simonw/datasette/issues/93#issuecomment-344426887 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDQyNjg4Nw== | simonw 9599 | 2017-11-14T22:51:46Z | 2017-11-14T22:51:46Z | OWNER | That didn't quite work for me. It built me a `dist/datasette` executable but when I try to run it I get an error: $ pwd /Users/simonw/Dropbox/Development/datasette $ source venv/bin/activate $ pyinstaller -F --add-data datasette/templates:datasette/templates --add-data datasette/static:datasette/static /Users/simonw/Dropbox/Development/datasette/venv/bin/datasette $ dist/datasette --help Traceback (most recent call last): File "datasette", line 11, in <module> File "site-packages/pkg_resources/__init__.py", line 572, in load_entry_point File "site-packages/pkg_resources/__init__.py", line 564, in get_distribution File "site-packages/pkg_resources/__init__.py", line 436, in get_provider File "site-packages/pkg_resources/__init__.py", line 984, in require File "site-packages/pkg_resources/__init__.py", line 870, in resolve pkg_resources.DistributionNotFound: The 'datasette' distribution was not found and is required by the application [99117] Failed to execute script datasette | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Package as standalone binary 273944952 | |
344427448 | https://github.com/simonw/datasette/issues/88#issuecomment-344427448 | https://api.github.com/repos/simonw/datasette/issues/88 | MDEyOklzc3VlQ29tbWVudDM0NDQyNzQ0OA== | simonw 9599 | 2017-11-14T22:54:06Z | 2017-11-14T22:54:06Z | OWNER | Hooray! First dataset that wasn't deployed by me :) https://github.com/simonw/datasette/wiki/Datasettes | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add NHS England Hospitals example to wiki 273775212 | |
344427560 | https://github.com/simonw/datasette/issues/88#issuecomment-344427560 | https://api.github.com/repos/simonw/datasette/issues/88 | MDEyOklzc3VlQ29tbWVudDM0NDQyNzU2MA== | simonw 9599 | 2017-11-14T22:54:33Z | 2017-11-14T22:54:33Z | OWNER | I'm getting an internal server error on http://run.plnkr.co/preview/cj9zlf1qc0003414y90ajkwpk/ at the moment | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add NHS England Hospitals example to wiki 273775212 | |
344438724 | https://github.com/simonw/datasette/issues/14#issuecomment-344438724 | https://api.github.com/repos/simonw/datasette/issues/14 | MDEyOklzc3VlQ29tbWVudDM0NDQzODcyNA== | simonw 9599 | 2017-11-14T23:47:54Z | 2017-11-14T23:47:54Z | OWNER | Plugins should be able to interact with the build step. This would give plugins an opportunity to modify the SQL databases and help prepare them for serving - for example, a full-text search plugin might create additional FTS tables, or a mapping plugin might pre-calculate a bunch of geohashes for tables that have latitude/longitude values. Plugins could really take advantage of the immutable nature of the dataset here. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Datasette Plugins 267707940 | |
344440377 | https://github.com/simonw/datasette/issues/93#issuecomment-344440377 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDQ0MDM3Nw== | simonw 9599 | 2017-11-14T23:56:35Z | 2017-11-14T23:56:35Z | OWNER | It worked! $ pyinstaller -F \ --add-data /usr/local/lib/python3.5/site-packages/datasette/templates:datasette/templates \ --add-data /usr/local/lib/python3.5/site-packages/datasette/static:datasette/static \ /usr/local/bin/datasette $ file dist/datasette dist/datasette: Mach-O 64-bit executable x86_64 $ dist/datasette --help Usage: datasette [OPTIONS] COMMAND [ARGS]... Datasette! Options: --help Show this message and exit. Commands: serve* Serve up specified SQLite database files with... build package Package specified SQLite files into a new... publish Publish specified SQLite database files to... | {"total_count": 3, "+1": 0, "-1": 0, "laugh": 0, "hooray": 3, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Package as standalone binary 273944952 | |
344440658 | https://github.com/simonw/datasette/issues/93#issuecomment-344440658 | https://api.github.com/repos/simonw/datasette/issues/93 | MDEyOklzc3VlQ29tbWVudDM0NDQ0MDY1OA== | simonw 9599 | 2017-11-14T23:58:07Z | 2017-11-14T23:58:07Z | OWNER | It's a shame pyinstaller can't act as a cross-compiler - so I don't think I can get Travis CI to build packages. But it's fantastic that it's possible to turn the tool into a standalone executable! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Package as standalone binary 273944952 | |
344452063 | https://github.com/simonw/datasette/issues/85#issuecomment-344452063 | https://api.github.com/repos/simonw/datasette/issues/85 | MDEyOklzc3VlQ29tbWVudDM0NDQ1MjA2Mw== | simonw 9599 | 2017-11-15T01:03:03Z | 2017-11-15T01:03:03Z | OWNER | This can work in reverse too. If you view the row page for something that has foreign keys against it, we can show you “53 items in TABLE link to this” and provide a link to view them all. That count query could be prohibitively expensive. To counter that, we could run the count query via Ajax and set a strict time limit on it. See #95 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Detect foreign keys and use them to link HTML pages together 273678673 | |
344452326 | https://github.com/simonw/datasette/issues/85#issuecomment-344452326 | https://api.github.com/repos/simonw/datasette/issues/85 | MDEyOklzc3VlQ29tbWVudDM0NDQ1MjMyNg== | simonw 9599 | 2017-11-15T01:04:38Z | 2017-11-15T01:04:38Z | OWNER | This will work well in conjunction with https://github.com/simonw/csvs-to-sqlite/issues/2 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Detect foreign keys and use them to link HTML pages together 273678673 | |
344462277 | https://github.com/simonw/datasette/pull/89#issuecomment-344462277 | https://api.github.com/repos/simonw/datasette/issues/89 | MDEyOklzc3VlQ29tbWVudDM0NDQ2MjI3Nw== | simonw 9599 | 2017-11-15T02:02:52Z | 2017-11-15T02:02:52Z | OWNER | This is exactly what I was after, thanks! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | SQL syntax highlighting with CodeMirror 273816720 | |
344462608 | https://github.com/simonw/datasette/issues/13#issuecomment-344462608 | https://api.github.com/repos/simonw/datasette/issues/13 | MDEyOklzc3VlQ29tbWVudDM0NDQ2MjYwOA== | simonw 9599 | 2017-11-15T02:04:51Z | 2017-11-15T02:04:51Z | OWNER | Fixed in https://github.com/simonw/datasette/commit/8252daa4c14d73b4b69e3f2db4576bb39d73c070 - thanks, @tomdyson! | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add a syntax highlighting SQL editor 267542338 | |
344463436 | https://github.com/simonw/datasette/issues/95#issuecomment-344463436 | https://api.github.com/repos/simonw/datasette/issues/95 | MDEyOklzc3VlQ29tbWVudDM0NDQ2MzQzNg== | simonw 9599 | 2017-11-15T02:10:10Z | 2017-11-15T02:10:10Z | OWNER | This means clients can ask questions but say "don't bother if it takes longer than X" - which is really handy when you're working against unknown databases that might be small or might be enormous. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Allow shorter time limits to be set using a ?_sql_time_limit_ms=20 query string limit 273998513 | |
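The per-query time limit described in the comment above can be sketched with SQLite's progress handler, which Python's `sqlite3` module exposes. This is an illustrative sketch, not Datasette's actual implementation — the function name and the 1000-opcode check interval are my own choices:

```python
import sqlite3
import time

def execute_with_time_limit(conn, sql, time_limit_ms):
    # The progress handler runs every N SQLite opcodes; returning a
    # non-zero value aborts the running query with "interrupted".
    deadline = time.monotonic() + time_limit_ms / 1000
    conn.set_progress_handler(
        lambda: 1 if time.monotonic() > deadline else 0, 1000
    )
    try:
        return conn.execute(sql).fetchall()
    finally:
        # Remove the handler so later queries are not limited.
        conn.set_progress_handler(None, 1000)
```

A query that exceeds the limit raises `sqlite3.OperationalError`, so an endpoint can catch that and report that the client-supplied time limit was hit.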
344472313 | https://github.com/simonw/datasette/pull/94#issuecomment-344472313 | https://api.github.com/repos/simonw/datasette/issues/94 | MDEyOklzc3VlQ29tbWVudDM0NDQ3MjMxMw== | simonw 9599 | 2017-11-15T03:08:00Z | 2017-11-15T03:08:00Z | OWNER | Works for me. I'm going to land this. Just one thing: simonw$ docker run --rm -t -i -p 9001:8001 c408e8cfbe40 datasette publish now The publish command requires "now" to be installed and configured Follow the instructions at https://zeit.co/now#whats-now Maybe we should have the Docker container install the "now" client? Not sure how much size that would add though. I think it's OK without for the moment. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Initial add simple prod ready Dockerfile refs #57 273961179 | |
344487639 | https://github.com/simonw/datasette/issues/25#issuecomment-344487639 | https://api.github.com/repos/simonw/datasette/issues/25 | MDEyOklzc3VlQ29tbWVudDM0NDQ4NzYzOQ== | simonw 9599 | 2017-11-15T05:11:11Z | 2017-11-15T05:11:11Z | OWNER | Since you can already download the database directly, I'm not going to bother with this one. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Endpoint that returns SQL ready to be piped into DB 267857622 | |
344657040 | https://github.com/simonw/datasette/issues/85#issuecomment-344657040 | https://api.github.com/repos/simonw/datasette/issues/85 | MDEyOklzc3VlQ29tbWVudDM0NDY1NzA0MA== | simonw 9599 | 2017-11-15T16:56:48Z | 2017-11-15T16:56:48Z | OWNER | Since detecting foreign keys that point to a specific table is a bit expensive (you have to call a PRAGMA on every other table) I’m going to add this to the build/inspect stage. Idea: if we detect that the foreign key table only has one other column in it (id, name) AND we know that the id is the primary key, we can add an efficient lookup on the table list view and prefetch a dictionary mapping IDs to their value. Then we can feed that dictionary in as extra template context and use it to render labeled hyperlinks in the corresponding column. This means our build step should also cache which columns are indexed, and add a “label_column” property for tables with an obvious label column. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Detect foreign keys and use them to link HTML pages together 273678673 | |
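The per-table PRAGMA cost and the (id, name) heuristic described above can be sketched with plain `sqlite3`. The function names are illustrative, not Datasette's actual inspect code:

```python
import sqlite3

def incoming_foreign_keys(conn, target_table):
    # Finding foreign keys that point AT a table requires running
    # PRAGMA foreign_key_list against every other table - which is
    # why this belongs in a one-off build/inspect step.
    tables = [r[0] for r in conn.execute(
        "select name from sqlite_master where type = 'table'")]
    incoming = []
    for table in tables:
        for fk in conn.execute(f"PRAGMA foreign_key_list([{table}])"):
            # Row layout: id, seq, table, from, to, on_update, ...
            if fk[2] == target_table:
                incoming.append((table, fk[3]))
    return incoming

def detect_label_column(conn, table):
    # Heuristic: in a two-column table whose only non-primary-key
    # column is e.g. (id, name), that column is the label column.
    cols = conn.execute(f"PRAGMA table_info([{table}])").fetchall()
    non_pk = [c for c in cols if not c[5]]  # c[5] is the pk flag
    if len(cols) == 2 and len(non_pk) == 1:
        return non_pk[0][1]  # c[1] is the column name
    return None
```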
344667202 | https://github.com/simonw/datasette/issues/90#issuecomment-344667202 | https://api.github.com/repos/simonw/datasette/issues/90 | MDEyOklzc3VlQ29tbWVudDM0NDY2NzIwMg== | simonw 9599 | 2017-11-15T17:29:38Z | 2017-11-15T17:29:38Z | OWNER | @jacobian points out that a buildpack may be a better fit than a Docker container for implementing this: https://twitter.com/jacobian/status/930849058465255424 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | datasette publish heroku 273846123 | |
344680385 | https://github.com/simonw/datasette/issues/90#issuecomment-344680385 | https://api.github.com/repos/simonw/datasette/issues/90 | MDEyOklzc3VlQ29tbWVudDM0NDY4MDM4NQ== | simonw 9599 | 2017-11-15T18:14:11Z | 2017-11-15T18:14:11Z | OWNER | Maybe we don’t even need a buildpack... we could create a temporary directory, set up a classic heroku app with the datasette serve command in the Procfile and then git push to deploy. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | datasette publish heroku 273846123 | |
344686483 | https://github.com/simonw/datasette/issues/90#issuecomment-344686483 | https://api.github.com/repos/simonw/datasette/issues/90 | MDEyOklzc3VlQ29tbWVudDM0NDY4NjQ4Mw== | simonw 9599 | 2017-11-15T18:36:23Z | 2017-11-15T18:36:23Z | OWNER | The “datasette build” command would need to run in a bin/post_compile script eg https://github.com/simonw/simonwillisonblog/blob/cloudflare-ips/bin/post_compile | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | datasette publish heroku 273846123 | |
344687328 | https://github.com/simonw/datasette/issues/90#issuecomment-344687328 | https://api.github.com/repos/simonw/datasette/issues/90 | MDEyOklzc3VlQ29tbWVudDM0NDY4NzMyOA== | simonw 9599 | 2017-11-15T18:39:14Z | 2017-11-15T18:39:49Z | OWNER | By default the command could use a temporary directory that gets cleaned up after the deploy, but we could allow users to opt in to keeping the generated directory like so: datasette publish heroku mydb.db -d ~/dev/my-heroku-app This would create the my-heroku-app folder so you can later execute further git deploys from there. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | datasette publish heroku 273846123 | |
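The temporary-directory approach from the comments above can be sketched as follows — lay out a minimal classic Heroku app (Procfile plus requirements.txt plus the database) ready for a git push. The function name and the exact Procfile command line are my own assumptions, not the shipped `publish heroku` implementation:

```python
import pathlib
import tempfile

def write_heroku_app(db_filename, keep_dir=None):
    # By default use a throwaway directory; a -d option would simply
    # pass a persistent path as keep_dir so further git deploys can
    # be run from it later.
    app_dir = pathlib.Path(keep_dir or tempfile.mkdtemp())
    # Heroku supplies the port to bind via the $PORT env variable.
    (app_dir / "Procfile").write_text(
        f"web: datasette serve {db_filename} -p $PORT -h 0.0.0.0\n"
    )
    (app_dir / "requirements.txt").write_text("datasette\n")
    return app_dir
```

From there the deploy itself would be `git init`, `heroku create`, and a `git push`, with the database file copied alongside the Procfile.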
344770170 | https://github.com/simonw/datasette/pull/107#issuecomment-344770170 | https://api.github.com/repos/simonw/datasette/issues/107 | MDEyOklzc3VlQ29tbWVudDM0NDc3MDE3MA== | simonw 9599 | 2017-11-16T00:01:00Z | 2017-11-16T00:01:22Z | OWNER | It is - but I think this will break on this line since it expects two format string parameters: https://github.com/simonw/datasette/blob/f45ca30f91b92ac68adaba893bf034f13ec61ced/datasette/utils.py#L61 Needs unit tests too, which live here: https://github.com/simonw/datasette/blob/f45ca30f91b92ac68adaba893bf034f13ec61ced/tests/test_utils.py#L49 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | add support for ?field__isnull=1 274343647 | |
344771130 | https://github.com/simonw/datasette/issues/100#issuecomment-344771130 | https://api.github.com/repos/simonw/datasette/issues/100 | MDEyOklzc3VlQ29tbWVudDM0NDc3MTEzMA== | simonw 9599 | 2017-11-16T00:06:00Z | 2017-11-16T00:06:00Z | OWNER | Aha... it looks like this is a Jinja version problem: https://github.com/ansible/ansible/issues/25381#issuecomment-306492389 Datasette depends on sanic-jinja2 - and that doesn't depend on a particular jinja2 version: https://github.com/lixxu/sanic-jinja2/blob/7e9520850d8c6bb66faf43b7f252593d7efe3452/setup.py#L22 So if you have an older version of Jinja installed, stuff breaks. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | TemplateAssertionError: no filter named 'tojson' 274160723 | |
344786528 | https://github.com/simonw/datasette/issues/96#issuecomment-344786528 | https://api.github.com/repos/simonw/datasette/issues/96 | MDEyOklzc3VlQ29tbWVudDM0NDc4NjUyOA== | simonw 9599 | 2017-11-16T01:32:41Z | 2017-11-16T01:32:41Z | OWNER | <img width="733" alt="australian-dogs" src="https://user-images.githubusercontent.com/9599/32869280-f82fa176-ca2a-11e7-8ad1-ac2a13c85089.png"> | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | UI for editing named parameters 274001453 | |
344788435 | https://github.com/simonw/datasette/issues/96#issuecomment-344788435 | https://api.github.com/repos/simonw/datasette/issues/96 | MDEyOklzc3VlQ29tbWVudDM0NDc4ODQzNQ== | simonw 9599 | 2017-11-16T01:43:52Z | 2017-11-16T01:43:52Z | OWNER | Demo: https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+name%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Animal+name%22%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalName%22%29+as+name+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+AnimalBreed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5BMitcham-dog-registrations-2015%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_NAME%22%29+as+name+from+%5Bburnside-dog-registrations-2015%5D+where+DOG_BREED+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Animal_Name%22%29+as+name+from+%5Bcity-of-playford-2015-dog-registration%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where%22Breed+Description%22+like+%3Abreed%0D%0A%0D%0A%29+group+by+name+order+by+n+desc%3B&breed=chihuahua | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | UI for editing named parameters 274001453 | |
344788763 | https://github.com/simonw/datasette/issues/96#issuecomment-344788763 | https://api.github.com/repos/simonw/datasette/issues/96 | MDEyOklzc3VlQ29tbWVudDM0NDc4ODc2Mw== | simonw 9599 | 2017-11-16T01:45:51Z | 2017-11-16T01:45:51Z | OWNER | Another demo - this time it lets you search by name and see the most popular breeds with that name: https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+breed%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Breed%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+%22Animal+name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalBreed%22%29+as+breed+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+%22AnimalName%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed%22%29+as+breed+from+%5BMitcham-dog-registrations-2015%5D+where+%22Animal+Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_BREED%22%29+as+breed+from+%5Bburnside-dog-registrations-2015%5D+where+%22DOG_NAME%22+like+%3Aname%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Breed_Description%22%29+as+breed+from+%5Bcity-of-playford-2015-dog-registration%5D+where+%22Animal_Name%22+like+%3Aname%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Breed+Description%22%29+as+breed+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where+%22Animal+Name%22+like+%3Aname%0D%0A%0D%0A%29+group+by+breed+order+by+n+desc%3B&name=rex | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | UI for editing named parameters 274001453 | |
344975156 | https://github.com/simonw/datasette/issues/46#issuecomment-344975156 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDk3NTE1Ng== | simonw 9599 | 2017-11-16T16:19:44Z | 2017-11-16T16:19:44Z | OWNER | That's fantastic! Thank you very much for that. Do you know if it's possible to view the Dockerfile used by https://hub.docker.com/r/prolocutor/python3-sqlite-ext/ ? | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
344976104 | https://github.com/simonw/datasette/issues/46#issuecomment-344976104 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDk3NjEwNA== | simonw 9599 | 2017-11-16T16:22:45Z | 2017-11-16T16:22:45Z | OWNER | Found a relevant Dockerfile on Reddit: https://www.reddit.com/r/Python/comments/5unkb3/install_sqlite3_on_python_3/ddzdz2b/ | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
344976882 | https://github.com/simonw/datasette/issues/46#issuecomment-344976882 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDk3Njg4Mg== | simonw 9599 | 2017-11-16T16:25:07Z | 2017-11-16T16:25:07Z | OWNER | Maybe part of the solution here is to add a `--load-extension` argument to `datasette` - so when you run the command you can specify SQLite extensions that should be loaded. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
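The proposed `--load-extension` argument maps naturally onto `sqlite3`'s extension-loading API. A minimal sketch (the function name is illustrative; note that Python's `sqlite3` must have been compiled with loadable-extension support):

```python
import sqlite3

def connect_with_extensions(path, extensions=()):
    # Enable extension loading only for as long as it takes to load
    # each shared library (e.g. /usr/local/lib/mod_spatialite.so),
    # then turn it off again so SQL can't load arbitrary code.
    conn = sqlite3.connect(path)
    if extensions:
        conn.enable_load_extension(True)
        for extension in extensions:
            conn.load_extension(extension)
        conn.enable_load_extension(False)
    return conn
```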
344986423 | https://github.com/simonw/datasette/issues/109#issuecomment-344986423 | https://api.github.com/repos/simonw/datasette/issues/109 | MDEyOklzc3VlQ29tbWVudDM0NDk4NjQyMw== | simonw 9599 | 2017-11-16T16:53:26Z | 2017-11-16T16:53:26Z | OWNER | http://datasette.readthedocs.io/ | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Set up readthedocs 274378301 | |
344988263 | https://github.com/simonw/datasette/issues/110#issuecomment-344988263 | https://api.github.com/repos/simonw/datasette/issues/110 | MDEyOklzc3VlQ29tbWVudDM0NDk4ODI2Mw== | simonw 9599 | 2017-11-16T16:58:48Z | 2017-11-16T16:58:48Z | OWNER | Here's how I tested this. First I downloaded and started a docker container using https://hub.docker.com/r/prolocutor/python3-sqlite-ext - which includes the compiled spatialite extension. This downloads it, then starts a shell in that container. docker run -it -p 8018:8018 prolocutor/python3-sqlite-ext:3.5.1-spatialite /bin/sh Installed a pre-release build of datasette which includes the new `--load-extension` option. pip install https://static.simonwillison.net/static/2017/datasette-0.13-py3-none-any.whl Now grab a sample database from https://www.gaia-gis.it/spatialite-2.3.1/resources.html - and unzip and rename it (datasette doesn't yet like databases with dots in their filename): wget http://www.gaia-gis.it/spatialite-2.3.1/test-2.3.sqlite.gz gunzip test-2.3.sqlite.gz mv test-2.3.sqlite test23.sqlite Now start datasette on port 8018 (the port I exposed earlier) with the extension loaded: datasette test23.sqlite -p 8018 -h 0.0.0.0 --load-extension /usr/local/lib/mod_spatialite.so Now I can confirm that it worked: http://localhost:8018/test23-c88bc35?sql=select+ST_AsText%28Geometry%29+from+HighWays+limit+1 <img width="743" alt="test23" src="https://user-images.githubusercontent.com/9599/32904449-39789a3a-caac-11e7-9531-b49f06051e34.png"> If I run datasette without `--load-extension` I get this: datasette test23.sqlite -p 8018 -h 0.0.0.0 <img width="747" alt="test23_and_turn_on_auto-escaping_in_jinja_ _simonw_datasette_82261a6" src="https://user-images.githubusercontent.com/9599/32904508-54e53d78-caac-11e7-9f80-4d96e9f9fb5f.png"> | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add --load-extension option to datasette for loading extra SQLite extensions 274578142 | |
344988591 | https://github.com/simonw/datasette/issues/46#issuecomment-344988591 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDk4ODU5MQ== | simonw 9599 | 2017-11-16T16:59:51Z | 2017-11-16T16:59:51Z | OWNER | OK, `--load-extension` is now a supported command line option - see #110 which includes my notes on how I manually tested it using the `prolocutor/python3-sqlite-ext` Docker image. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
344989340 | https://github.com/simonw/datasette/issues/46#issuecomment-344989340 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDk4OTM0MA== | simonw 9599 | 2017-11-16T17:02:07Z | 2017-11-16T17:02:07Z | OWNER | The fact that `prolocutor/python3-sqlite-ext` doesn't provide a visible Dockerfile and hasn't been updated in two years makes me hesitant to bake it into datasette itself. I'd rather put together a Dockerfile that enables the necessary extensions and can live in the datasette repository itself. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
344995571 | https://github.com/simonw/datasette/issues/46#issuecomment-344995571 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NDk5NTU3MQ== | simonw 9599 | 2017-11-16T17:22:32Z | 2017-11-16T17:22:32Z | OWNER | The JSON extension would be very worthwhile too: https://www.sqlite.org/json1.html | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
345013127 | https://github.com/simonw/datasette/issues/111#issuecomment-345013127 | https://api.github.com/repos/simonw/datasette/issues/111 | MDEyOklzc3VlQ29tbWVudDM0NTAxMzEyNw== | simonw 9599 | 2017-11-16T18:23:56Z | 2017-11-16T18:23:56Z | OWNER | Having this as a global option may not make sense when publishing multiple databases. We can revisit that when we implement per-database and per-table metadata. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add “updated” to metadata 274615452 | |
345017256 | https://github.com/simonw/datasette/issues/110#issuecomment-345017256 | https://api.github.com/repos/simonw/datasette/issues/110 | MDEyOklzc3VlQ29tbWVudDM0NTAxNzI1Ng== | simonw 9599 | 2017-11-16T18:38:30Z | 2017-11-16T18:38:30Z | OWNER | To finish up, I committed the image I created in the above so I can run it again in the future: docker commit $(docker ps -lq) datasette-sqlite Now I can run it like this: docker run -it -p 8018:8018 datasette-sqlite datasette /tmp/test23.sqlite -p 8018 -h 0.0.0.0 --load-extension /usr/local/lib/mod_spatialite.so | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add --load-extension option to datasette for loading extra SQLite extensions 274578142 | |
345067498 | https://github.com/simonw/datasette/issues/14#issuecomment-345067498 | https://api.github.com/repos/simonw/datasette/issues/14 | MDEyOklzc3VlQ29tbWVudDM0NTA2NzQ5OA== | simonw 9599 | 2017-11-16T21:25:32Z | 2017-11-16T21:26:22Z | OWNER | For visualizations, Google Maps should be made available as a plugin. The default visualizations can use Leaflet and Open Street Map, but there's no reason to not make Google Maps available as a plugin, especially if the plugin can provide a mechanism for configuring the necessary API key. I'm particularly excited about the Google Maps heatmap visualization https://developers.google.com/maps/documentation/javascript/heatmaplayer as seen on http://mochimachine.org/wasteland/ | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Datasette Plugins 267707940 | |
345108644 | https://github.com/simonw/datasette/pull/107#issuecomment-345108644 | https://api.github.com/repos/simonw/datasette/issues/107 | MDEyOklzc3VlQ29tbWVudDM0NTEwODY0NA== | simonw 9599 | 2017-11-17T00:34:46Z | 2017-11-17T00:34:46Z | OWNER | Looks like your tests are failing because of a bug which I fixed in https://github.com/simonw/datasette/commit/9199945a1bcec4852e1cb866eb3642614dd32a48 - if you rebase to master the tests should pass. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | add support for ?field__isnull=1 274343647 | |
345138134 | https://github.com/simonw/datasette/pull/114#issuecomment-345138134 | https://api.github.com/repos/simonw/datasette/issues/114 | MDEyOklzc3VlQ29tbWVudDM0NTEzODEzNA== | simonw 9599 | 2017-11-17T03:50:38Z | 2017-11-17T03:50:38Z | OWNER | Fantastic! Thank you very much. | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Add spatialite, switch to debian and local build 274733145 | |
345138347 | https://github.com/simonw/datasette/issues/46#issuecomment-345138347 | https://api.github.com/repos/simonw/datasette/issues/46 | MDEyOklzc3VlQ29tbWVudDM0NTEzODM0Nw== | simonw 9599 | 2017-11-17T03:52:25Z | 2017-11-17T03:52:25Z | OWNER | We now have a Dockerfile that compiles spatialite! https://github.com/simonw/datasette/pull/114/commits/6c6b63d890529eeefcefb7ab126ea3bd7b2315c1 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Dockerfile should build more recent SQLite with FTS5 and spatialite support 271301468 | |
345150048 | https://github.com/simonw/datasette/issues/85#issuecomment-345150048 | https://api.github.com/repos/simonw/datasette/issues/85 | MDEyOklzc3VlQ29tbWVudDM0NTE1MDA0OA== | simonw 9599 | 2017-11-17T05:35:25Z | 2017-11-17T05:35:25Z | OWNER | `csvs-to-sqlite` is now capable of generating databases with foreign key lookup tables: https://github.com/simonw/csvs-to-sqlite/releases/tag/0.3 | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Detect foreign keys and use them to link HTML pages together 273678673 | |
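The foreign-key lookup tables that `csvs-to-sqlite` 0.3 generates can be sketched roughly as follows. This is a minimal illustration using Python's `sqlite3`, not the tool's actual implementation; the table and column names (`raw`, `articles`, `category`) are hypothetical:

```python
import sqlite3

# Sketch: extract a repeated text column into a lookup table and replace it
# with an integer foreign key, roughly what csvs-to-sqlite does for
# extracted columns (names here are invented for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw (title TEXT, category TEXT)")
conn.executemany("INSERT INTO raw VALUES (?, ?)", [
    ("A", "news"), ("B", "sport"), ("C", "news"),
])

# Build the lookup table of distinct values...
conn.execute("CREATE TABLE category (id INTEGER PRIMARY KEY, value TEXT UNIQUE)")
conn.execute("INSERT INTO category (value) SELECT DISTINCT category FROM raw")

# ...then rewrite the original column as an integer foreign key.
conn.execute("""
    CREATE TABLE articles (
        title TEXT,
        category INTEGER REFERENCES category(id)
    )
""")
conn.execute("""
    INSERT INTO articles
    SELECT raw.title, category.id
    FROM raw JOIN category ON category.value = raw.category
""")

rows = conn.execute("""
    SELECT title, value FROM articles
    JOIN category ON category.id = articles.category
    ORDER BY title
""").fetchall()
```

The `REFERENCES category(id)` column is what lets Datasette detect the relationship and render the label instead of the raw integer.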
345242447 | https://github.com/simonw/datasette/issues/85#issuecomment-345242447 | https://api.github.com/repos/simonw/datasette/issues/85 | MDEyOklzc3VlQ29tbWVudDM0NTI0MjQ0Nw== | simonw 9599 | 2017-11-17T13:22:33Z | 2017-11-17T13:23:14Z | OWNER | I could support explicit label columns using additional arguments to `datasette serve`: datasette serve mydb.db --label-column mydb:table1:name --label-column mydb:table2:title This would mean "in mydb, set the label column for table1 to name, and the label column for table2 to title" | {"total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | Detect foreign keys and use them to link HTML pages together 273678673 |
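The proposed `--label-column db:table:column` syntax from the comment above could be parsed into a per-database mapping along these lines. This is a hypothetical sketch of the proposal, not Datasette's actual implementation:

```python
def parse_label_columns(args):
    """Parse repeated --label-column values of the form db:table:column
    into {db: {table: column}}. Hypothetical sketch of the proposed syntax."""
    labels = {}
    for arg in args:
        # Split on the first two colons only, so column names containing
        # colons would still survive.
        db, table, column = arg.split(":", 2)
        labels.setdefault(db, {})[table] = column
    return labels
```

For example, `parse_label_columns(["mydb:table1:name", "mydb:table2:title"])` yields `{"mydb": {"table1": "name", "table2": "title"}}`, matching the "in mydb, set the label column for table1 to name..." reading described in the comment.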
CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id]),
   [performed_via_github_app] TEXT
);
CREATE INDEX [idx_issue_comments_issue] ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user] ON [issue_comments] ([user]);