issues
1,994 rows where repo = 107914493
id | node_id | number | title | user | state | locked | assignee | milestone | comments | created_at | updated_at | closed_at | author_association | pull_request | body | repo | type | active_lock_reason | performed_via_github_app | reactions | draft | state_reason |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
267513424 | MDU6SXNzdWUyNjc1MTM0MjQ= | 1 | Addressable pages for every row in a table | simonw 9599 | closed | 0 | Ship first public release 2857392 | 6 | 2017-10-23T00:44:16Z | 2017-10-24T14:11:04Z | 2017-10-24T14:11:03Z | OWNER | /database-name-7sha256/table-name/compound-pk /database-name-7sha256/table-name/compound-pk.json Tricky part will be figuring out what the primary key is - especially since it could be a compound primary key and it might involve different data types. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/1/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
267513523 | MDU6SXNzdWUyNjc1MTM1MjM= | 2 | Initial proof-of-concept | simonw 9599 | closed | 0 | Ship first public release 2857392 | 0 | 2017-10-23T00:45:37Z | 2017-10-23T01:26:39Z | 2017-10-23T00:45:53Z | OWNER | Implemented in https://github.com/simonw/stateless-datasets/commit/de04d7a854d71003ffcf98028eab976a936c2dba | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/2/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
267515678 | MDU6SXNzdWUyNjc1MTU2Nzg= | 3 | Make individual column values addressable, with smart content types | simonw 9599 | open | 0 | 1 | 2017-10-23T01:11:32Z | 2017-12-10T03:11:58Z | OWNER | Some SQLite databases embed images in columns. It would be cool if these had URLs. /database-name-7sha256/table-name/compound-pk/column /database-name-7sha256/table-name/compound-pk/column.json /database-name-7sha256/table-name/compound-pk/column.png /database-name-7sha256/table-name/compound-pk/column.gif /database-name-7sha256/table-name/compound-pk/column.txt The one without an explicit file extension auto-detects the correct extension. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/3/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | ||||||||
267515836 | MDU6SXNzdWUyNjc1MTU4MzY= | 4 | Make URLs immutable | simonw 9599 | closed | 0 | Ship first public release 2857392 | 8 | 2017-10-23T01:13:30Z | 2017-10-24T02:38:24Z | 2017-10-24T02:38:24Z | OWNER | Absolutely everything should have a far-future expires header Part of the URL will be the truncated sha1 hash of the database file itself, calculated at build time | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/4/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
267516066 | MDU6SXNzdWUyNjc1MTYwNjY= | 5 | Implement sensible query pagination | simonw 9599 | closed | 0 | Ship first public release 2857392 | 3 | 2017-10-23T01:16:00Z | 2017-11-10T20:41:39Z | 2017-11-10T20:41:39Z | OWNER | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/5/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
267516329 | MDU6SXNzdWUyNjc1MTYzMjk= | 6 | Better JSON response options | simonw 9599 | closed | 0 | Ship first public release 2857392 | 0 | 2017-10-23T01:18:47Z | 2017-10-24T15:07:58Z | 2017-10-24T15:07:58Z | OWNER | Default returns this: { "columns": ["id", "name", "age"], "rows": [ [45, "Simon", 36] ] } .jsono instead returns a list of objects, each duplicating the headers in its keys. They both probably share the same pagination mechanism so it might not be a jsono flat list. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/6/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
267516650 | MDU6SXNzdWUyNjc1MTY2NTA= | 7 | Framework where by every page is JSON plus a template | simonw 9599 | closed | 0 | Ship first public release 2857392 | 1 | 2017-10-23T01:22:03Z | 2017-10-24T02:27:25Z | 2017-10-24T02:27:25Z | OWNER | Every single page of my interface should be implemented as a function that returns JSON. I can then build my jinja templates on top of the exact data that would be returned by the API version. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/7/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
267517314 | MDU6SXNzdWUyNjc1MTczMTQ= | 8 | Attempting an INSERT or UPDATE should return a sane error message | simonw 9599 | closed | 0 | Ship first public release 2857392 | 1 | 2017-10-23T01:28:25Z | 2017-10-23T15:28:12Z | 2017-10-23T15:28:08Z | OWNER | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/8/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
267517348 | MDU6SXNzdWUyNjc1MTczNDg= | 9 | Initial test suite | simonw 9599 | closed | 0 | Ship first public release 2857392 | 2 | 2017-10-23T01:28:46Z | 2017-10-24T05:55:33Z | 2017-10-24T05:55:33Z | OWNER | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/9/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
267517381 | MDU6SXNzdWUyNjc1MTczODE= | 10 | Set up Travis | simonw 9599 | closed | 0 | v1 stretch goals 2859414 | 1 | 2017-10-23T01:29:07Z | 2017-11-04T23:48:57Z | 2017-11-04T23:48:57Z | OWNER | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/10/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
267522549 | MDU6SXNzdWUyNjc1MjI1NDk= | 11 | Code that generates compile-time properties about the database | simonw 9599 | closed | 0 | Ship first public release 2857392 | 1 | 2017-10-23T02:18:24Z | 2017-10-23T16:04:23Z | 2017-10-23T16:04:23Z | OWNER | At a minimum this will include: * sha hash of each database file * list of tables with row counts for each database file | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/11/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
267523511 | MDU6SXNzdWUyNjc1MjM1MTE= | 12 | Make it so you can override templates | simonw 9599 | closed | 0 | Custom templates edition 2949431 | 1 | 2017-10-23T02:25:35Z | 2017-11-30T16:42:46Z | 2017-11-30T16:38:34Z | OWNER | The app will ship with default templates but, just like with the Django admin, you will be able to override them using either explicit configuration settings or just by dropping in templates with certain file names. Template inheritance should work here, both allowing you to override just the base template and allowing you to customize tiny bits of others. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/12/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
267542338 | MDU6SXNzdWUyNjc1NDIzMzg= | 13 | Add a syntax highlighting SQL editor | simonw 9599 | closed | 0 | 1 | 2017-10-23T05:03:33Z | 2017-11-15T02:04:51Z | 2017-11-15T02:04:51Z | OWNER | https://ace.c9.io/#nav=embedding looks like a good option | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/13/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
267707940 | MDU6SXNzdWUyNjc3MDc5NDA= | 14 | Datasette Plugins | simonw 9599 | closed | 0 | 22 | 2017-10-23T15:15:28Z | 2019-05-13T18:58:20Z | 2019-05-13T18:58:19Z | OWNER | It would be neat if additional functionality could be opted-in to the system in the form of easy-to-add plugins, hosted as separate packages. First example: a Google Analytics plugin, which adds GA tracking code with your tracking ID to the web interface for your dataset. This may be an opportunity to experiment with entry points: http://amir.rachum.com/blog/2017/07/28/python-entry-points/ | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/14/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
267713226 | MDU6SXNzdWUyNjc3MTMyMjY= | 15 | Support multiple databases | simonw 9599 | closed | 0 | Ship first public release 2857392 | 0 | 2017-10-23T15:29:51Z | 2017-10-24T02:01:38Z | 2017-10-24T02:01:38Z | OWNER | I'm going to loop through every database file in the app root directory and bundle all of them. Each one will be accessible at /databasename Note this is without the file extension, and we will disallow multiple files with the same name but different extensions. Supported extensions to start with will be `.db` and `.sqlite` and `.sqlite3` | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/15/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
267726219 | MDU6SXNzdWUyNjc3MjYyMTk= | 16 | Default HTML/CSS needs to look reasonable and be responsive | simonw 9599 | closed | 0 | Ship first public release 2857392 | 6 | 2017-10-23T16:05:22Z | 2017-11-11T20:19:07Z | 2017-11-11T20:19:07Z | OWNER | Version one should have the following characteristics: - Looks OK - Works great on mobile - Loads extremely fast - No JavaScript! At least not in v1. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/16/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
267732005 | MDU6SXNzdWUyNjc3MzIwMDU= | 17 | In development mode, should still pick up new .db files | simonw 9599 | closed | 0 | Ship first public release 2857392 | 1 | 2017-10-23T16:22:40Z | 2017-10-24T02:26:48Z | 2017-10-24T02:26:47Z | OWNER | Follow on from #11 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/17/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
267739593 | MDU6SXNzdWUyNjc3Mzk1OTM= | 18 | See if I can get a websockets interface working | simonw 9599 | closed | 0 | 1 | 2017-10-23T16:46:41Z | 2021-01-04T20:05:52Z | 2021-01-04T20:05:48Z | OWNER | Since I am already running on Sanic, how hard would it be to add a websocket endpoint that lets you talk to sqlite interactively? Could this be used to efficiently support streaming in answers to giant queries? | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/18/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
267741262 | MDU6SXNzdWUyNjc3NDEyNjI= | 19 | Efficient URL for downloading the raw database file | simonw 9599 | closed | 0 | Ship first public release 2857392 | 1 | 2017-10-23T16:52:17Z | 2017-10-25T15:21:16Z | 2017-10-25T15:19:37Z | OWNER | Use Sanic support for streaming large files http://sanic.readthedocs.io/en/latest/sanic/response.html#file-streaming | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/19/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
267759136 | MDU6SXNzdWUyNjc3NTkxMzY= | 20 | Config file with support for defining canned queries | simonw 9599 | closed | 0 | simonw 9599 | Custom templates edition 2949431 | 9 | 2017-10-23T17:53:06Z | 2017-12-05T19:05:35Z | 2017-12-05T17:44:09Z | OWNER | Probably using YAML because then we get support for multiline strings: bats: db: bats.sqlite3 name: "Bat sightings" queries: specific_row: | select * from Bats where a = 1; | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/20/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||
267769034 | MDU6SXNzdWUyNjc3NjkwMzQ= | 21 | Use Sanic configuration mechanism | simonw 9599 | closed | 0 | v1 stretch goals 2859414 | 1 | 2017-10-23T18:25:14Z | 2017-11-10T20:45:42Z | 2017-11-10T20:45:42Z | OWNER | http://sanic.readthedocs.io/en/latest/sanic/config.html | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/21/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
267769431 | MDU6SXNzdWUyNjc3Njk0MzE= | 22 | Refactor to use class based views | simonw 9599 | closed | 0 | Ship first public release 2857392 | 0 | 2017-10-23T18:26:22Z | 2019-05-27T20:05:56Z | 2017-10-24T02:25:53Z | OWNER | http://sanic.readthedocs.io/en/latest/sanic/class_based_views.html | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/22/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
267788884 | MDU6SXNzdWUyNjc3ODg4ODQ= | 23 | Support Django-style filters in querystring arguments | simonw 9599 | closed | 0 | Ship first public release 2857392 | 6 | 2017-10-23T19:29:42Z | 2017-10-25T04:23:03Z | 2017-10-25T04:23:02Z | OWNER | e.g /database/table?name__contains=Simon&age__gte=4 Same format as Django: double underscore as the split. If you need to match against a column that happens to contain a double underscore in its official name, do this: /database/table?weird__column__exact=Simon __exact is the default operation if none is supplied. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/23/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
267828746 | MDU6SXNzdWUyNjc4Mjg3NDY= | 24 | Implement full URL design | simonw 9599 | closed | 0 | Ship first public release 2857392 | 2 | 2017-10-23T21:49:05Z | 2017-10-24T14:12:00Z | 2017-10-24T14:12:00Z | OWNER | Full URL design: /database-name /database-name.json /database-name-7sha256 /database-name-7sha256.json /database-name/table-name /database-name/table-name.json /database-name-7sha256/table-name /database-name-7sha256/table-name.json /database-name-7sha256/table-name/compound-pk /database-name-7sha256/table-name/compound-pk.json | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/24/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
267857622 | MDU6SXNzdWUyNjc4NTc2MjI= | 25 | Endpoint that returns SQL ready to be piped into DB | simonw 9599 | closed | 0 | 2 | 2017-10-24T00:19:26Z | 2017-11-15T05:11:12Z | 2017-11-15T05:11:11Z | OWNER | It would be cool if I could figure out a way to generate both the create table statements and the inserts for an individual table or the entire database and then stream them down to the client. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/25/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
267861210 | MDU6SXNzdWUyNjc4NjEyMTA= | 26 | Command line tool for uploading one or more DBs to Now | simonw 9599 | closed | 0 | Ship first public release 2857392 | 3 | 2017-10-24T00:43:10Z | 2017-11-11T07:25:30Z | 2017-11-11T07:25:30Z | OWNER | Uploading files appears to be undocumented, but I found it in their code here: https://github.com/zeit/now-cli/blob/0ca7d1fe44ebdf460b64fdc38ba543b8e295ac40/src/providers/sh/util/index.js#L291 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/26/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
267886330 | MDU6SXNzdWUyNjc4ODYzMzA= | 27 | Ability to plot a simple graph | simonw 9599 | closed | 0 | 3 | 2017-10-24T03:34:59Z | 2018-07-10T17:52:41Z | 2018-07-10T17:52:41Z | OWNER | Might be as simple as: pick the type of chart (bar, line) and then pick the column for the X axis and the column for the Y axis. Maybe also allow a pie chart. It’s up to the user to come up with SQL that gets the right values. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/27/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
267886865 | MDU6SXNzdWUyNjc4ODY4NjU= | 28 | /database?sql= should redirect correctly | simonw 9599 | closed | 0 | Ship first public release 2857392 | 0 | 2017-10-24T03:38:44Z | 2017-10-24T23:54:30Z | 2017-10-24T23:54:30Z | OWNER | Needs to redirect to the location with the hash while retaining the query string. This should also work with the .json extension. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/28/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
268050821 | MDU6SXNzdWUyNjgwNTA4MjE= | 29 | Handle bytestring records encoding to JSON | simonw 9599 | closed | 0 | Ship first public release 2857392 | 1 | 2017-10-24T14:18:45Z | 2017-10-24T14:59:00Z | 2017-10-24T14:58:47Z | OWNER | http://localhost:8006/northwind-40d049b/Categories.json 500s right now The string representation of one of the values looks like this: b"\x15\x1c/\x00\x02\x00 This is a bytestring from the database which cannot be naively converted to a unicode string. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/29/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
268078453 | MDU6SXNzdWUyNjgwNzg0NTM= | 30 | Do something neat with foreign keys | simonw 9599 | closed | 0 | 1 | 2017-10-24T15:29:29Z | 2017-11-14T18:29:08Z | 2017-11-14T18:29:01Z | OWNER | https://www.sqlite.org/pragma.html#pragma_foreign_key_list SQLite has robust support for introspecting foreign keys. I could use that to automatically link to the corresponding record from my tables. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/30/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
268087542 | MDU6SXNzdWUyNjgwODc1NDI= | 31 | Idea: colour scheme based on sha256 of db | simonw 9599 | closed | 0 | v1 stretch goals 2859414 | 1 | 2017-10-24T15:52:38Z | 2018-05-28T18:10:45Z | 2017-11-09T14:14:59Z | OWNER | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/31/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
268106803 | MDU6SXNzdWUyNjgxMDY4MDM= | 32 | Try running SQLite queries in a separate thread | simonw 9599 | closed | 0 | v1 stretch goals 2859414 | 1 | 2017-10-24T16:48:42Z | 2017-11-09T14:05:56Z | 2017-11-09T14:05:56Z | OWNER | https://pymotw.com/3/asyncio/executors.html Would be good to have some actual benchmarks so I can evaluate if this is worth it or not. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/32/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
268110769 | MDU6SXNzdWUyNjgxMTA3Njk= | 33 | Use locust for benchmarking and load tests | simonw 9599 | open | 0 | 0 | 2017-10-24T17:00:09Z | 2017-12-10T03:12:16Z | OWNER | https://github.com/locustio/locust Needed for #32 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/33/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | ||||||||
268176505 | MDU6SXNzdWUyNjgxNzY1MDU= | 34 | Support CSV export with a .csv extension | simonw 9599 | closed | 0 | 1 | 2017-10-24T20:34:43Z | 2021-06-17T18:14:48Z | 2018-05-28T20:45:34Z | OWNER | Maybe do this using streaming with multiple pagination SQL queries so we can support arbitrarily large exports. How would this work against a view which doesn’t have an obvious efficient pagination mechanism? Maybe limit views to up to 1000 exported records? Relates to #5 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/34/reactions", "total_count": 2, "+1": 2, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
268262480 | MDU6SXNzdWUyNjgyNjI0ODA= | 36 | date, year, month and day querystring lookups | simonw 9599 | closed | 0 | 3 | 2017-10-25T04:23:45Z | 2018-05-28T17:30:53Z | 2018-05-28T17:30:53Z | OWNER | - [ ] `?timestamp___date=2017-07-17` - return every item where the timestamp falls on that date - [ ] `?timestamp___year=2017` - return every item where the timestamp falls within 2017 - [ ] `?timestamp___month=1` - return every item where the month component is January - [ ] `?timestamp___day=10` - return every item where the day-of-the-month component is 10 Follow on from #23 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/36/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
268453968 | MDU6SXNzdWUyNjg0NTM5Njg= | 37 | Ability to serialize massive JSON without blocking event loop | simonw 9599 | closed | 0 | 2 | 2017-10-25T15:58:03Z | 2020-05-30T17:29:20Z | 2020-05-30T17:29:20Z | OWNER | We run the risk of someone attempting a select statement that returns thousands of rows and hence takes several seconds just to JSON encode the response, effectively blocking the event loop and pausing all other traffic. The Twisted community have a solution for this, can we adapt that in some way? http://as.ynchrono.us/2010/06/asynchronous-json_18.html?m=1 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/37/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
268462768 | MDU6SXNzdWUyNjg0NjI3Njg= | 38 | Experiment with patterns for concurrent long running queries | simonw 9599 | closed | 0 | 5 | 2017-10-25T16:23:42Z | 2018-05-28T20:47:31Z | 2018-05-28T20:47:31Z | OWNER | I want to understand how the system could perform under load with many concurrent long-running queries. Can we serve these without blocking the event loop? | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/38/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
268469569 | MDU6SXNzdWUyNjg0Njk1Njk= | 39 | Protect against malicious SQL that causes damage even though our DB is immutable | simonw 9599 | closed | 0 | Ship first public release 2857392 | 4 | 2017-10-25T16:44:27Z | 2021-08-17T23:52:07Z | 2017-11-05T02:53:47Z | OWNER | I’m currently operating under the assumption that it’s safe to allow arbitrary SQL statements because we are dealing with an immutable database. But this might not be the case - there are some pretty weird SQLite language extensions (ATTACH, PRAGMA etc) and I’m not certain they cannot be used to break things in a way that would affect future requests to the API. Solution: provide a “safe mode” option which disables the ?sql= mechanism. This still leaves the URL filter lookups, so I need to make sure that those are “safe”. In the future I may also implement a whitelist option where datasets can be configured to only allow specific filters against specific columns. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/39/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
268470572 | MDU6SXNzdWUyNjg0NzA1NzI= | 40 | Implement command-line tool interface | simonw 9599 | closed | 0 | Ship first public release 2857392 | 11 | 2017-10-25T16:47:15Z | 2017-11-11T07:27:33Z | 2017-11-11T07:27:33Z | OWNER | The first version needs to take one or more file names or URLs, then generate and deploy an app to Now. It will assume you already have the now command installed and configured. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/40/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
268590777 | MDU6SXNzdWUyNjg1OTA3Nzc= | 41 | Homepage should show summary of databases | simonw 9599 | closed | 0 | Ship first public release 2857392 | 1 | 2017-10-26T00:18:11Z | 2017-10-27T04:05:35Z | 2017-10-27T04:05:35Z | OWNER | Each database should have a name, optional description, download link and a summary of the tables: Flights.db Flights and suchlike blah. URL? License? 577373 rows across 14 tables airports, routes, airlines... Title of the homepage is derived from the databases or can be manually overridden e.g. “Datasets of Flights, NHS, Blah...” - or if only one database just the title of that. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/41/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
268591332 | MDU6SXNzdWUyNjg1OTEzMzI= | 42 | Homepage UI for editing metadata file | simonw 9599 | closed | 0 | 4 | 2017-10-26T00:22:03Z | 2017-12-10T03:02:14Z | 2017-12-10T03:02:14Z | OWNER | Since we are going to have a metadata file which sets the title/description/etc for each database, why not allow you to run the app in `--dev` mode which makes the homepage into a WYSIWYG editor that can save to that file format. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/42/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
268592894 | MDU6SXNzdWUyNjg1OTI4OTQ= | 43 | While running, server should spot new db files added to its directory | simonw 9599 | closed | 0 | v1 stretch goals 2859414 | 1 | 2017-10-26T00:32:37Z | 2017-11-14T08:25:53Z | 2017-11-14T08:25:37Z | OWNER | Maybe in each request it checks the time and if 5s has elapsed since it last scanned the directory it scans it again. This would allow people with dedicated hosting to run the app there and just upload new datasets whenever they want. It would also be very convenient for development. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/43/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
269731374 | MDU6SXNzdWUyNjk3MzEzNzQ= | 44 | ?_group_count=country - return counts by specific column(s) | simonw 9599 | closed | 0 | 7 | 2017-10-30T19:50:32Z | 2018-04-26T15:09:58Z | 2018-04-26T15:09:58Z | OWNER | Imagine if this: https://stateless-datasets-jykibytogk.now.sh/flights-07d1283/airports.jsono?country__contains=gu&_group_count=country Turned into this: https://stateless-datasets-jykibytogk.now.sh/flights-07d1283?sql=select%20country,%20count(*)%20as%20group_count_country%20from%20airports%20where%20country%20like%20%27%gu%%27%20group%20by%20country%20order%20by%20group_count_country%20desc This would involve introducing a new precedent of query string arguments that start with an _ having special meanings. While we're at it, could try adding _fields=x,y,z Tasks: - [x] Get initial version working - [ ] Refactor code to not just "pretend to be a view" - [ ] Get foreign key relationships expanded | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/44/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
271242824 | MDU6SXNzdWUyNzEyNDI4MjQ= | 45 | Run SQLite operations in a thread pool | simonw 9599 | closed | 0 | Ship first public release 2857392 | 0 | 2017-11-05T02:27:12Z | 2017-11-05T02:27:34Z | 2017-11-05T02:27:33Z | OWNER | Let's run SQLite operations in threads, so we don't end up blocking our core event loop. These articles are helpful: * https://pymotw.com/3/asyncio/executors.html * https://marlinux.wordpress.com/2017/05/19/python-3-6-asyncio-sqlalchemy/ | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/45/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
271301468 | MDU6SXNzdWUyNzEzMDE0Njg= | 46 | Dockerfile should build more recent SQLite with FTS5 and spatialite support | simonw 9599 | closed | 0 | 13 | 2017-11-05T18:16:22Z | 2017-11-17T14:32:12Z | 2017-11-17T14:32:12Z | OWNER | The SQLite bundled with Python 3 doesn't support the FTS5 search extension. It would be nice if the SQLite built by our Dockerfile could support as many modern SQLite features as possible. https://web.archive.org/web/20170212034155/http://charlesleifer.com/blog/using-the-sqlite-json1-and-fts5-extensions-with-python/ has instructions on building a more recent SQLite and the pysqlite package. Our Dockerfile could carry out an updated version of this process. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/46/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
271831408 | MDU6SXNzdWUyNzE4MzE0MDg= | 47 | Create neat example database | simonw 9599 | closed | 0 | 5 | 2017-11-07T13:29:38Z | 2017-11-14T03:08:13Z | 2017-11-14T03:08:13Z | OWNER | How about data from open elections eg https://github.com/openelections/openelections-data-ca?files=1 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/47/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
272391665 | MDU6SXNzdWUyNzIzOTE2NjU= | 48 | Switch to ujson | simonw 9599 | closed | 0 | 4 | 2017-11-08T23:50:29Z | 2019-06-24T06:57:54Z | 2019-06-24T06:57:43Z | OWNER | ujson is already a dependency of Sanic, and should be quite a bit faster. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/48/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
272661336 | MDU6SXNzdWUyNzI2NjEzMzY= | 49 | Pick a name | simonw 9599 | closed | 0 | Ship first public release 2857392 | 4 | 2017-11-09T17:56:17Z | 2017-11-10T18:33:22Z | 2017-11-10T18:33:22Z | OWNER | Options so far: * immutabase * datasite * sqlstatic * dbserve * sqlserve Terms to play with: * immutable * sqlite * dataset * json * static * serve | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/49/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
272694136 | MDU6SXNzdWUyNzI2OTQxMzY= | 50 | Unit tests against application itself | simonw 9599 | closed | 0 | Ship first public release 2857392 | 2 | 2017-11-09T19:31:49Z | 2017-11-11T22:23:22Z | 2017-11-11T22:23:22Z | OWNER | Use Sanic’s testing mechanism. Tests should create a temporary SQLite database file on disk by executing SQL that is stored in the tests themselves. For the moment we can just test the JSON API more thoroughly and just sanity check that the HTML output doesn’t throw any errors. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/50/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
272735257 | MDU6SXNzdWUyNzI3MzUyNTc= | 51 | Make a proper README | simonw 9599 | closed | 0 | Ship first public release 2857392 | 1 | 2017-11-09T21:46:07Z | 2017-11-13T18:44:23Z | 2017-11-13T18:44:23Z | OWNER | Include instructions on building a local Docker container - currently detailed here: https://gist.github.com/simonw/0ea5c960608c2d876e4637a5e48aa95d (those instructions don't work now that we have removed the Dockerfile in favour of a template generated by `datasette publish`) | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/51/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273026602 | MDU6SXNzdWUyNzMwMjY2MDI= | 52 | Solution for temporarily uploading DB so it can be built by docker | simonw 9599 | closed | 0 | 2 | 2017-11-10T18:55:25Z | 2017-12-10T03:02:57Z | 2017-12-10T03:02:57Z | OWNER | For the `datasette publish` command I ideally need a way of uploading the specified DB to somewhere temporary on the internet so that when the Dockerfile is built by the final hosting location it can download that database as part of the build process. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/52/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273054652 | MDU6SXNzdWUyNzMwNTQ2NTI= | 53 | Implement a better database index page | simonw 9599 | closed | 0 | Ship first public release 2857392 | 3 | 2017-11-10T20:47:36Z | 2017-11-12T21:19:33Z | 2017-11-12T01:50:27Z | OWNER | This view isn't great. I should do a better job of separating out tables from views and indexes, showing the count of rows in each table, and maybe move the SQL to the individual table pages. <img width="871" alt="flights" src="https://user-images.githubusercontent.com/9599/32423242-1b4458ce-c25a-11e7-910f-2dc1de909b8f.png"> | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/53/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273121803 | MDU6SXNzdWUyNzMxMjE4MDM= | 54 | Views should not attempt to link to records / use rowids | simonw 9599 | closed | 0 | Ship first public release 2857392 | 1 | 2017-11-11T05:44:54Z | 2017-11-12T21:29:42Z | 2017-11-12T21:29:33Z | OWNER | http://localhost:8001/parlgov-development-25f9855/view_variable <img width="837" alt="parlgov-development__view_variable" src="https://user-images.githubusercontent.com/9599/32686757-5b40f6a8-c660-11e7-88de-5e8dfb12ccf1.png"> | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/54/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273127117 | MDU6SXNzdWUyNzMxMjcxMTc= | 55 | Ship first version to PyPI | simonw 9599 | closed | 0 | Ship first public release 2857392 | 2 | 2017-11-11T07:38:48Z | 2017-11-13T21:19:43Z | 2017-11-13T21:19:43Z | OWNER | Just before doing this, update the Dockerfile template to `pip install datasette` https://github.com/simonw/datasette/blob/65e350ca2a4845c25752a62c16ba58cfe2c14b9b/datasette/utils.py#L125 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/55/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273127443 | MDU6SXNzdWUyNzMxMjc0NDM= | 56 | Easy way to block search engine crawling in robots.txt | simonw 9599 | closed | 0 | 1 | 2017-11-11T07:46:07Z | 2018-05-28T20:50:25Z | 2018-05-28T20:50:24Z | OWNER | For people who don't want their datasets to be crawled by search engines. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/56/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273127694 | MDU6SXNzdWUyNzMxMjc2OTQ= | 57 | Ship a Docker image of the whole thing | simonw 9599 | closed | 0 | 7 | 2017-11-11T07:51:28Z | 2018-06-28T04:01:51Z | 2018-06-28T04:01:38Z | OWNER | The generated Docker images can then just inherit from that. This will speed up deploys as no need to `pip install` anything. - [x] Ship that image to Docker Hub - [ ] Update the generated Dockerfile to use it | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/57/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273128608 | MDU6SXNzdWUyNzMxMjg2MDg= | 58 | publish command should detect if "now" is installed | simonw 9599 | closed | 0 | Ship first public release 2857392 | 0 | 2017-11-11T08:10:17Z | 2017-11-11T16:00:07Z | 2017-11-11T16:00:07Z | OWNER | If now is not installed, it should tell you where to get it. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/58/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273157085 | MDU6SXNzdWUyNzMxNTcwODU= | 59 | datasette publish hyper | simonw 9599 | closed | 0 | 4 | 2017-11-11T16:27:26Z | 2019-05-13T19:01:00Z | 2019-05-13T19:00:44Z | OWNER | This is a bit tricky, because unlike Now there doesn't seem to be a way to tell Hyper to "build this Dockerfile and deploy the resulting image". They expect you to build a container and publish it to a registry instead. https://docs.hyper.sh/Reference/CLI/load.html allows you to publish an image directly from a tarball, but that still leaves the challenge of creating that image. The nice thing about the Now integration is that you don't need to have Docker installed on your local machine. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/59/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273163905 | MDU6SXNzdWUyNzMxNjM5MDU= | 60 | Rethink how metadata is generated and stored | simonw 9599 | closed | 0 | Ship first public release 2857392 | 1 | 2017-11-11T18:01:28Z | 2017-11-11T20:12:17Z | 2017-11-11T20:12:16Z | OWNER | I broke the existing mechanism in 407795b61217205625f2d4e084afbf69f1db781b in order to get unit tests for the sanic app working. I think I should ditch the build-metadata.json cache file entirely and calculate the SHA hashes on startup. Not sure what to do about the table row counts. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/60/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273173116 | MDU6SXNzdWUyNzMxNzMxMTY= | 61 | Common header and footer | simonw 9599 | closed | 0 | Ship first public release 2857392 | 0 | 2017-11-11T20:20:08Z | 2017-11-11T20:37:19Z | 2017-11-11T20:37:19Z | OWNER | Split from #16 - [x] A link to the homepage from some kind of navigation bar in the header - [x] link to github.com/simonw/datasette in the footer - [x] Slightly better titles (maybe ditch the visited link colours for titles only? should keep those for primary key links) | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/61/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273174397 | MDU6SXNzdWUyNzMxNzQzOTc= | 62 | Link to .json and .jsono versions on various pages | simonw 9599 | closed | 0 | Ship first public release 2857392 | 0 | 2017-11-11T20:37:47Z | 2017-11-11T22:41:06Z | 2017-11-11T22:41:06Z | OWNER | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/62/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273174447 | MDU6SXNzdWUyNzMxNzQ0NDc= | 63 | Review design of JSON output | simonw 9599 | closed | 0 | Ship first public release 2857392 | 1 | 2017-11-11T20:38:33Z | 2017-11-11T22:20:17Z | 2017-11-11T22:20:17Z | OWNER | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/63/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273181020 | MDU6SXNzdWUyNzMxODEwMjA= | 64 | Support for ?field__isnull=1 or similar | simonw 9599 | closed | 0 | 1 | 2017-11-11T22:26:52Z | 2017-11-17T14:38:21Z | 2017-11-17T14:38:21Z | OWNER | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/64/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||||
273191608 | MDU6SXNzdWUyNzMxOTE2MDg= | 65 | Re-implement ?sql= mode | simonw 9599 | closed | 0 | Ship first public release 2857392 | 1 | 2017-11-12T01:47:17Z | 2017-11-12T02:36:37Z | 2017-11-12T02:35:42Z | OWNER | Here's the code I removed: async def data(self, request, name, hash): sql = 'select * from sqlite_master' custom_sql = False params = {} if request.args.get('sql'): params = request.raw_args sql = params.pop('sql') validate_sql_select(sql) custom_sql = True rows = await self.execute(name, sql, params) columns = [r[0] for r in rows.description] return { 'database': name, 'rows': rows, 'columns': columns, 'query': { 'sql': sql, 'params': params, } }, { 'database_hash': hash, 'custom_sql': custom_sql, } | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/65/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273191806 | MDU6SXNzdWUyNzMxOTE4MDY= | 66 | Show table SQL on table page | simonw 9599 | closed | 0 | Ship first public release 2857392 | 1 | 2017-11-12T01:51:23Z | 2017-11-12T21:17:29Z | 2017-11-12T21:17:29Z | OWNER | Let's do the SQL for the table you are looking at, plus SQL for any indexes that mention that table. The page for a view should show the SQL for that view. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/66/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273192789 | MDU6SXNzdWUyNzMxOTI3ODk= | 67 | Command that builds a local docker container | simonw 9599 | closed | 0 | Ship first public release 2857392 | 2 | 2017-11-12T02:13:29Z | 2017-11-13T16:17:52Z | 2017-11-13T16:17:52Z | OWNER | Be nice to indicate that this isn't just for Now. Shouldn't be too hard either. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/67/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273247186 | MDU6SXNzdWUyNzMyNDcxODY= | 68 | Support for title/source/license metadata | simonw 9599 | closed | 0 | Ship first public release 2857392 | 4 | 2017-11-12T17:04:21Z | 2017-12-04T04:55:43Z | 2017-11-13T15:26:11Z | OWNER | I've decided this is important for launch: I want to set a precedent for people citing, licensing and documenting their datasets. Not sure how best to go about supporting this. I'd like to allow for the following data to be optionally attached to any given database: - Title - Description, potentially in markdown? - Original source URL - License I'd also like the ability to attach descriptions to individual tables - and maybe even to table columns? The question then becomes: how should this information be stored. A few options: - In the SQLite database itself, in a specially named table. Problem here is that this means having to modify SQLite databases before publishing them. - In a separate SQLite database that can be published alongside the databases we are publishing. - In a JSON file. This is neat, but JSON files are not a great editing experience once you start including multiple lines (e.g. a markdown description). - In a YAML file. This is a better format for multi-line descriptions, but still isn't a great editing experience. Whatever the format, it can be made much more usable by offering a web-based editing UI for populating it (a special mode the server can be run in). | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/68/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273248366 | MDU6SXNzdWUyNzMyNDgzNjY= | 69 | Enforce pagination (or at least limits) for arbitrary custom SQL | simonw 9599 | closed | 0 | Ship first public release 2857392 | 4 | 2017-11-12T17:21:33Z | 2017-11-13T20:32:47Z | 2017-11-13T19:35:47Z | OWNER | It's way too easy to accidentally trigger a page that returns 100,000 rows at the moment. I need to use the LIMIT clause on views and custom SQL - I can support pagination "next" links using offset as well. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/69/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273267081 | MDU6SXNzdWUyNzMyNjcwODE= | 70 | Paginate views using OFFSET/LIMIT | simonw 9599 | closed | 0 | Ship first public release 2857392 | 0 | 2017-11-12T21:30:29Z | 2017-11-13T21:11:01Z | 2017-11-13T21:11:01Z | OWNER | As with #69 these should obey a maximum offset setting, which can be over-ridden. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/70/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273278840 | MDU6SXNzdWUyNzMyNzg4NDA= | 71 | Set up some example datasets on a Cloudflare-backed domain | simonw 9599 | closed | 0 | Ship first public release 2857392 | 10 | 2017-11-13T00:06:30Z | 2017-11-13T02:09:34Z | 2017-11-13T02:09:34Z | OWNER | To better demonstrate the caching and HTTP/2 features, I'd like to go live with some demos that are hosted behind Cloudflare. - [x] Redirect https://datasettes.com/ and https://www.datasettes.com/ to https://github.com/simonw/datasette - [x] Have `now domain add -e datasettes.com` run without errors (hopefully just a matter of waiting for the DNS to update) - [x] Alias an example dataset hosted on Now on a datasettes.com subdomain - [x] Confirm that HTTP caching and HTTP/2 redirect pushing works as expected - this may require another page rule | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/71/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273283166 | MDU6SXNzdWUyNzMyODMxNjY= | 72 | publish command should take an optional --name argument | simonw 9599 | closed | 0 | Ship first public release 2857392 | 0 | 2017-11-13T00:59:35Z | 2017-11-13T02:12:27Z | 2017-11-13T02:12:27Z | OWNER | To set the directory name so that now will inherit it as the name of the app. Defaults to datasette | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/72/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273296178 | MDU6SXNzdWUyNzMyOTYxNzg= | 73 | _nocache=1 query string option for use with sort-by-random | simonw 9599 | closed | 0 | 2 | 2017-11-13T02:57:10Z | 2018-05-28T17:25:15Z | 2018-05-28T17:25:15Z | OWNER | The one place where we wouldn’t want caching is if we have something which uses sort by random to return random items. We can offer a _nocache=1 querystring argument to support this. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/73/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273296684 | MDU6SXNzdWUyNzMyOTY2ODQ= | 74 | Send a 302 redirect to the new hash for hits to old hashes | simonw 9599 | closed | 0 | Ship first public release 2857392 | 1 | 2017-11-13T03:00:59Z | 2017-11-13T18:49:59Z | 2017-11-13T18:49:59Z | OWNER | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/74/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273509159 | MDU6SXNzdWUyNzM1MDkxNTk= | 75 | Add --cors argument to serve | simonw 9599 | closed | 0 | Ship first public release 2857392 | 1 | 2017-11-13T17:16:19Z | 2017-11-13T18:17:52Z | 2017-11-13T18:17:52Z | OWNER | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/75/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273510781 | MDU6SXNzdWUyNzM1MTA3ODE= | 76 | publish should have required argument specifying publisher | simonw 9599 | closed | 0 | Ship first public release 2857392 | 0 | 2017-11-13T17:21:26Z | 2017-11-13T18:41:01Z | 2017-11-13T18:41:01Z | OWNER | Initially the only argument will be “now” - but “hyper” can be added in the future | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/76/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273537940 | MDU6SXNzdWUyNzM1Mzc5NDA= | 77 | Add Travis CI badge to README | simonw 9599 | closed | 0 | Ship first public release 2857392 | 0 | 2017-11-13T18:52:25Z | 2017-11-13T21:24:15Z | 2017-11-13T21:24:15Z | OWNER | Also fix this newline issue: <img width="647" alt="simonw_datasette__instant_json_api_for_your_sqlite_database" src="https://user-images.githubusercontent.com/9599/32743234-ae81b224-c860-11e7-98a9-980b7b448ffc.png"> | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/77/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273554949 | MDU6SXNzdWUyNzM1NTQ5NDk= | 78 | Rename after to next and provide a next_url | simonw 9599 | closed | 0 | Ship first public release 2857392 | 0 | 2017-11-13T19:48:31Z | 2017-11-13T20:35:03Z | 2017-11-13T20:35:03Z | OWNER | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/78/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273569068 | MDU6SXNzdWUyNzM1NjkwNjg= | 79 | Add more detailed API documentation to the README | simonw 9599 | closed | 0 | 3 | 2017-11-13T20:36:21Z | 2018-05-28T17:24:48Z | 2018-05-28T17:24:48Z | OWNER | Need to document: - [ ] The ?column__gt=4 style filter arguments for tables - [ ] The ?sql= API, and how named parameters work - [ ] How API pagination works - [ ] How redirects and cache headers work | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/79/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273569477 | MDU6SXNzdWUyNzM1Njk0Nzc= | 80 | Deploy final versions of fivethirtyeight and parlgov datasets (with view pagination) | simonw 9599 | closed | 0 | Ship first public release 2857392 | 2 | 2017-11-13T20:37:46Z | 2017-11-13T22:09:46Z | 2017-11-13T22:09:46Z | OWNER | Final versions should be deployed using the first released version of datasette. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/80/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273595473 | MDExOlB1bGxSZXF1ZXN0MTUyMzYwNzQw | 81 | :fire: Removes DS_Store | jefftriplett 50527 | closed | 0 | 2 | 2017-11-13T22:07:52Z | 2017-11-14T02:24:54Z | 2017-11-13T22:16:55Z | CONTRIBUTOR | simonw/datasette/pulls/81 | datasette 107914493 | pull | {"url": "https://api.github.com/repos/simonw/datasette/issues/81/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | 0 | ||||||
273596159 | MDU6SXNzdWUyNzM1OTYxNTk= | 82 | Post a blog entry announcing it to the world | simonw 9599 | closed | 0 | Ship first public release 2857392 | 1 | 2017-11-13T22:10:35Z | 2017-11-14T01:46:10Z | 2017-11-14T01:46:10Z | OWNER | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/82/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273626815 | MDU6SXNzdWUyNzM2MjY4MTU= | 83 | Individual row view is broken | simonw 9599 | closed | 0 | 0 | 2017-11-14T00:29:11Z | 2017-11-14T00:45:34Z | 2017-11-14T00:45:34Z | OWNER | https://parlgov.datasettes.com/parlgov-25f9855/viewcalc_parliament_composition/18 <img width="822" alt="cursor_and_localhost_8002_parlgov-25f9855_viewcalc_parliament_composition_18" src="https://user-images.githubusercontent.com/9599/32756593-c439c71c-c88f-11e7-9243-b6e1b778c8fa.png"> | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/83/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273660425 | MDU6SXNzdWUyNzM2NjA0MjU= | 84 | datasette package --metadata does not work with a relative path | simonw 9599 | closed | 0 | 0 | 2017-11-14T04:00:50Z | 2017-11-15T05:18:35Z | 2017-11-15T05:18:35Z | OWNER | $ datasette package ~/parlgov-db/parlgov.db --metadata=~/parlgov-db/parlgov.json Usage: datasette package [OPTIONS] FILES... Error: Invalid value for "-m" / "--metadata": Could not open file: ~/parlgov-db/parlgov.json: No such file or directory simonw-07542:~ simonw$ cd ~/parlgov-db/ simonw-07542:parlgov-db simonw$ datasette package ~/parlgov-db/parlgov.db --metadata=parlgov.json Sending build context to Docker daemon 4.46MB Step 1/7 : FROM python:3 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/84/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273678673 | MDU6SXNzdWUyNzM2Nzg2NzM= | 85 | Detect foreign keys and use them to link HTML pages together | simonw 9599 | closed | 0 | Foreign key edition 2919870 | 6 | 2017-11-14T06:12:05Z | 2017-11-19T06:08:19Z | 2017-11-19T06:08:19Z | OWNER | https://stackoverflow.com/a/44430157/6083 documents the PRAGMA needed to extract foreign key references for a table. At a minimum we can link column values known to be foreign keys to the corresponding row page. We could try to summarize the linked row in some way too - somehow extracting a sensible link title, maybe based on additional configuration in the metadata.json file. Still todo: - [x] Fix it to csvs-to-sqlite refactoring command correctly creates primary key on generated tables - [x] Ship new csvs-to-sqlite with refactoring command - [x] Refactor column logic to be more predictable in our templates (the rowid special case) - [x] Mechanism by which table metadata can specify the "label" column for a table - [x] Automatically set the label column as the first column that isn't a primary key (falling back on primary key) - [x] Code which runs a "select id, label from table where id in (...)" query as part of the tableview and populates a lookup dictionary - [x] Modify templates to use values from that lookup dictionary | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/85/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273703829 | MDU6SXNzdWUyNzM3MDM4Mjk= | 86 | Filter UI on table page | simonw 9599 | closed | 0 | Foreign key edition 2919870 | 10 | 2017-11-14T08:22:43Z | 2017-11-23T20:34:32Z | 2017-11-23T20:34:32Z | OWNER | A UI for building up simple table queries by adding additional filter rules that get executed as query parameters in the URL. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/86/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||
273709194 | MDU6SXNzdWUyNzM3MDkxOTQ= | 87 | Configure Travis to release new tags to PyPI | simonw 9599 | closed | 0 | 1 | 2017-11-14T08:44:08Z | 2018-07-10T17:49:13Z | 2018-07-10T17:49:12Z | OWNER | https://docs.travis-ci.com/user/deployment/pypi/ | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/87/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273944952 | MDU6SXNzdWUyNzM5NDQ5NTI= | 93 | Package as standalone binary | atomotic 67420 | closed | 0 | 18 | 2017-11-14T21:14:07Z | 2021-11-21T07:00:23Z | 2021-11-21T07:00:23Z | NONE | Hint: rather than the Docker image, a standalone multi-platform binary (containing the app and the database) could be simpler to distribute. I would like to investigate the possibility of packaging everything with [pyinstaller](http://www.pyinstaller.org/), adding the database as a [data file](https://pythonhosted.org/PyInstaller/spec-files.html#adding-data-files) | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/93/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273816720 | MDExOlB1bGxSZXF1ZXN0MTUyNTIyNzYy | 89 | SQL syntax highlighting with CodeMirror | tomdyson 15543 | closed | 0 | 1 | 2017-11-14T14:43:33Z | 2017-11-15T02:03:01Z | 2017-11-15T02:03:01Z | CONTRIBUTOR | simonw/datasette/pulls/89 | Addresses #13 Future enhancements could include autocompletion of table and column names, e.g. with ```javascript extraKeys: {"Ctrl-Space": "autocomplete"}, hintOptions: {tables: { users: ["name", "score", "birthDate"], countries: ["name", "population", "size"] }} ``` (see https://codemirror.net/doc/manual.html#addon_sql-hint and source at http://codemirror.net/mode/sql/) | datasette 107914493 | pull | {"url": "https://api.github.com/repos/simonw/datasette/issues/89/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | 0 | |||||
273846123 | MDU6SXNzdWUyNzM4NDYxMjM= | 90 | datasette publish heroku | simonw 9599 | closed | 0 | 8 | 2017-11-14T16:01:39Z | 2017-12-10T03:06:34Z | 2017-12-10T03:05:48Z | OWNER | Heroku has Docker container support so this should not be too hard: https://devcenter.heroku.com/articles/container-registry-and-runtime See also #59 This should work exactly like the existing “datasette publish now....” command except it would be “datasette publish heroku...” | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/90/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273878873 | MDU6SXNzdWUyNzM4Nzg4NzM= | 91 | Option to serve databases from a different prefix, serve regular content elsewhere | simonw 9599 | closed | 0 | 1 | 2017-11-14T17:32:46Z | 2017-12-10T03:07:58Z | 2017-12-10T03:07:53Z | OWNER | It would be useful if the databases themselves could be served from a prefix e.g. datasette serve mydb.db --path-prefix=db Now my database is at `http://localhost:8001/db/mydb-23423` This would free up the rest of the URL namespace for other things. Maybe we could have an option to serve static content from a known folder e.g. datasette serve mydb.db --path-prefix=db --root-content=~/my-project/static Now a hit to `http://localhost:8001/news/` serves content from `~/my-project/static/news/index.html` This would make it trivial to package up entire HTML/CSS/JS apps with one or more underlying SQLite databases. Running without `--cors` would be fine here because any JS apps would be hosted on the same origin. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/91/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273895344 | MDU6SXNzdWUyNzM4OTUzNDQ= | 92 | Add --license --license_url --source --source_url --title arguments to datasette publish | simonw 9599 | closed | 0 | 0 | 2017-11-14T18:27:07Z | 2017-11-15T05:04:41Z | 2017-11-15T05:04:41Z | OWNER | I keep on using the `echo '{"source": "..."}' | datasette publish now --metadata=-` pattern, which suggests it makes sense for us to support these as optional arguments. https://gist.github.com/simonw/9f8bf23b37a42d7628c4dcc4bba10253 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/92/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273944952 | MDU6SXNzdWUyNzM5NDQ5NTI= | 93 | Package as standalone binary | atomotic 67420 | closed | 0 | 18 | 2017-11-14T21:14:07Z | 2021-11-21T07:00:23Z | 2021-11-21T07:00:23Z | NONE | hint: more than the docker image a standalone and multiplatform binary (containing the app and the database) could be simpler to distribute. i would like to investigate the possibility to package everything with [pyinstaller](http://www.pyinstaller.org/) adding the database as a [data file](https://pythonhosted.org/PyInstaller/spec-files.html#adding-data-files) | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/93/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
273961179 | MDExOlB1bGxSZXF1ZXN0MTUyNjMxNTcw | 94 | Initial add simple prod ready Dockerfile refs #57 | macropin 247192 | closed | 0 | 1 | 2017-11-14T22:09:09Z | 2017-11-15T03:08:04Z | 2017-11-15T03:08:04Z | CONTRIBUTOR | simonw/datasette/pulls/94 | Multi-stage build based off official python:3.6-slim Example usage: ``` docker run --rm -t -i -p 9000:8001 -v $(pwd)/db:/db datasette datasette serve /db/chinook.db ``` | datasette 107914493 | pull | {"url": "https://api.github.com/repos/simonw/datasette/issues/94/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | 0 | |||||
273998513 | MDU6SXNzdWUyNzM5OTg1MTM= | 95 | Allow shorter time limits to be set using a ?_sql_time_limit_ms=20 query string parameter | simonw 9599 | closed | 0 | 1 | 2017-11-15T01:02:16Z | 2017-11-15T02:56:13Z | 2017-11-15T02:56:13Z | OWNER | This cannot be greater than the configured time limit. | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/95/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
274001453 | MDU6SXNzdWUyNzQwMDE0NTM= | 96 | UI for editing named parameters | simonw 9599 | closed | 0 | 3 | 2017-11-15T01:19:21Z | 2017-11-16T01:45:51Z | 2017-11-16T01:33:38Z | OWNER | On any page displaying a custom query that includes named parameters, we should show HTML form fields for editing those parameters. Eg the breed parameter on https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+name%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Animal+name%22%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalName%22%29+as+name+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+AnimalBreed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5BMitcham-dog-registrations-2015%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_NAME%22%29+as+name+from+%5Bburnside-dog-registrations-2015%5D+where+DOG_BREED+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Animal_Name%22%29+as+name+from+%5Bcity-of-playford-2015-dog-registration%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where%22Breed+Description%22+like+%3Abreed%0D%0A%0D%0A%29+group+by+name+order+by+n+desc%3B&breed=pug | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/96/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
274022950 | MDU6SXNzdWUyNzQwMjI5NTA= | 97 | Link to JSON for the list of tables | simonw 9599 | closed | 0 | 3 | 2017-11-15T03:29:05Z | 2018-05-29T18:51:35Z | 2018-05-28T20:57:21Z | OWNER | https://twitter.com/yschimke/status/930606210855854080 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/97/reactions", "total_count": 1, "+1": 1, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
274023417 | MDU6SXNzdWUyNzQwMjM0MTc= | 98 | Default to 127.0.0.1 not 0.0.0.0 | simonw 9599 | closed | 0 | 0 | 2017-11-15T03:31:55Z | 2017-11-15T05:08:54Z | 2017-11-15T05:08:54Z | OWNER | https://twitter.com/yschimke/status/930606210855854080 | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/98/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
274023625 | MDU6SXNzdWUyNzQwMjM2MjU= | 99 | Start a change log | simonw 9599 | closed | 0 | 0 | 2017-11-15T03:33:21Z | 2017-11-16T15:12:46Z | 2017-11-16T15:12:45Z | OWNER | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/99/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | |||||||
274160723 | MDU6SXNzdWUyNzQxNjA3MjM= | 100 | TemplateAssertionError: no filter named 'tojson' | coisnepe 13304454 | closed | 0 | 2 | 2017-11-15T13:43:41Z | 2017-11-16T09:25:10Z | 2017-11-16T00:14:13Z | NONE | A 500 error is raised upon clicking on the name of a table on the homepage, say _http://0.0.0.0:8001/_ to _http://0.0.0.0:8001/test_check-c1f4771/users_ The API part seems to function as intended, though... ``` 2017-11-15 14:33:57 - (sanic)[ERROR]: Traceback (most recent call last): File "/usr/local/lib/python3.5/dist-packages/sanic/app.py", line 503, in handle_request response = await response File "/usr/local/lib/python3.5/dist-packages/datasette/app.py", line 155, in get return await self.view_get(request, name, hash, **kwargs) File "/usr/local/lib/python3.5/dist-packages/datasette/app.py", line 219, in view_get **context, File "/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py", line 84, in render return html(self.render_string(template, request, **context)) File "/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py", line 81, in render_string return self.env.get_template(template).render(**context) File "/usr/lib/python3/dist-packages/jinja2/environment.py", line 812, in get_template return self._load_template(name, self.make_globals(globals)) File "/usr/lib/python3/dist-packages/jinja2/environment.py", line 786, in _load_template template = self.loader.load(self, name, globals) File "/usr/lib/python3/dist-packages/jinja2/loaders.py", line 125, in load code = environment.compile(source, name, filename) File "/usr/lib/python3/dist-packages/jinja2/environment.py", line 565, in compile self.handle_exception(exc_info, source_hint=source_hint) File "/usr/lib/python3/dist-packages/jinja2/environment.py", line 754, in handle_exception reraise(exc_type, exc_value, tb) File "/usr/lib/python3/dist-packages/jinja2/_compat.py", line 37, in reraise raise value.with_traceback(tb) File "/usr/local/lib/python3.5/dist-packages/datasette/templates/table.html", line 29, in template <pre>params = {{ query.params|tojson(4) }}</pre> File "/usr/lib/python3/dist-packages/jinja2/environment.py", line 515, i… | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/100/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed | ||||||
274161964 | MDU6SXNzdWUyNzQxNjE5NjQ= | 101 | TemplateAssertionError: no filter named 'tojson' | eaubin 450244 | closed | 0 | 1 | 2017-11-15T13:47:32Z | 2017-11-15T13:48:55Z | 2017-11-15T13:48:55Z | NONE | I get an exception clicking on the table link: ``` 2017-11-15 08:40:10 - (sanic)[ERROR]: Traceback (most recent call last): File "/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic/app.py", line 503, in handle_request response = await response File "/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/app.py", line 155, in get return await self.view_get(request, name, hash, **kwargs) File "/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/app.py", line 219, in view_get **context, File "/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic_jinja2/__init__.py", line 84, in render return html(self.render_string(template, request, **context)) File "/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic_jinja2/__init__.py", line 81, in render_string return self.env.get_template(template).render(**context) File "/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py", line 812, in get_template return self._load_template(name, self.make_globals(globals)) File "/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py", line 786, in _load_template template = self.loader.load(self, name, globals) File "/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/loaders.py", line 125, in load code = environment.compile(source, name, filename) File "/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py", line 565, in compile self.handle_exception(exc_info, source_hint=source_hint) File "/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py", line 754, in handle_exception reraise(exc_type, exc_value, tb) File "/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/_compat.py", line 37, in reraise raise value.with_traceback(tb) File "/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/templates/table.html", line 29, in template <pre>params = {{ query.params|tojson(4) }}</pre> File "/Users/e/… | datasette 107914493 | issue | {"url": "https://api.github.com/repos/simonw/datasette/issues/101/reactions", "total_count": 0, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 0, "rocket": 0, "eyes": 0} | completed |
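The two TemplateAssertionError: no filter named 'tojson' reports above (issues 100 and 101) point at a Jinja2 environment with no `tojson` filter registered; newer Jinja2 releases ship one built in, older installs do not. Below is a minimal sketch of registering such a filter by hand, assuming a bare `Environment` named `env` standing in for the one sanic_jinja2 actually builds — it illustrates the missing piece, not necessarily the fix Datasette shipped.

```python
import json

from jinja2 import Environment
from markupsafe import Markup

# Hypothetical stand-in for the environment sanic_jinja2 constructs for Datasette.
env = Environment()


def tojson(value, indent=None):
    # Serialize to JSON; wrapping in Markup keeps autoescaping from re-escaping it.
    return Markup(json.dumps(value, indent=indent))


# With the filter registered, templates like {{ query.params|tojson(4) }} compile again.
env.filters["tojson"] = tojson

print(env.from_string("<pre>params = {{ params|tojson(4) }}</pre>").render(params={"breed": "pug"}))
```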
CREATE TABLE [issues] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [state] TEXT,
   [locked] INTEGER,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [comments] INTEGER,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [author_association] TEXT,
   [pull_request] TEXT,
   [body] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [type] TEXT,
   [active_lock_reason] TEXT,
   [performed_via_github_app] TEXT,
   [reactions] TEXT,
   [draft] INTEGER,
   [state_reason] TEXT
);
CREATE INDEX [idx_issues_repo] ON [issues] ([repo]);
CREATE INDEX [idx_issues_milestone] ON [issues] ([milestone]);
CREATE INDEX [idx_issues_assignee] ON [issues] ([assignee]);
CREATE INDEX [idx_issues_user] ON [issues] ([user]);
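The `user`, `assignee`, `milestone`, and `repo` columns above are declared as foreign keys, which is what the PRAGMA mentioned in issue 85 reads back at runtime. A short sketch of inspecting those keys and summarising this table with Python's sqlite3 module, assuming the data lives in a local file named `github.db` (the filename is illustrative):

```python
import sqlite3

# "github.db" is an assumed filename for a database created with the schema above.
conn = sqlite3.connect("github.db")
conn.row_factory = sqlite3.Row

# PRAGMA foreign_key_list returns one row per REFERENCES clause on the table,
# naming the referenced table and the local/foreign columns.
for fk in conn.execute("PRAGMA foreign_key_list(issues)"):
    print(f"issues.{fk['from']} -> {fk['table']}.{fk['to']}")

# Mirror the listing above: issue counts per state for repo 107914493.
for row in conn.execute(
    """
    SELECT [state], COUNT(*) AS n
    FROM [issues]
    WHERE [repo] = ?
    GROUP BY [state]
    ORDER BY n DESC
    """,
    (107914493,),
):
    print(row["state"], row["n"])
```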