id,node_id,number,title,user,user_label,state,locked,assignee,assignee_label,milestone,milestone_label,comments,created_at,updated_at,closed_at,author_association,pull_request,body,repo,repo_label,type,active_lock_reason,performed_via_github_app,reactions,draft,state_reason 267513424,MDU6SXNzdWUyNjc1MTM0MjQ=,1,Addressable pages for every row in a table,9599,simonw,closed,0,,,2857392,Ship first public release,6,2017-10-23T00:44:16Z,2017-10-24T14:11:04Z,2017-10-24T14:11:03Z,OWNER,," /database-name-7sha256/table-name/compound-pk /database-name-7sha256/table-name/compound-pk.json Tricky part will be figuring out what the private key is - especially since it could be a compound primary key and it might involve different data types.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267513523,MDU6SXNzdWUyNjc1MTM1MjM=,2,Initial proof-of-concept,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-23T00:45:37Z,2017-10-23T01:26:39Z,2017-10-23T00:45:53Z,OWNER,,Implemented in https://github.com/simonw/stateless-datasets/commit/de04d7a854d71003ffcf98028eab976a936c2dba,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267515678,MDU6SXNzdWUyNjc1MTU2Nzg=,3,"Make individual column valuables addressable, with smart content types",9599,simonw,open,0,,,,,1,2017-10-23T01:11:32Z,2017-12-10T03:11:58Z,,OWNER,,"Some SQLite databases embed images in columns. It would be cool if these had URLs. 
/database-name-7sha256/table-name/compound-pk/column /database-name-7sha256/table-name/compound-pk/column.json /database-name-7sha256/table-name/compound-pk/column.png /database-name-7sha256/table-name/compound-pk/column.gif /database-name-7sha256/table-name/compound-pk/column.txt The one without an explicit file extension auto-detects the correct extension.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 267515836,MDU6SXNzdWUyNjc1MTU4MzY=,4,Make URLs immutable,9599,simonw,closed,0,,,2857392,Ship first public release,8,2017-10-23T01:13:30Z,2017-10-24T02:38:24Z,2017-10-24T02:38:24Z,OWNER,,"Absolutely everything should have a far-future expires header Part of the URL will be the truncated sha1 hash of the database file itself, calculated at build time",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267516066,MDU6SXNzdWUyNjc1MTYwNjY=,5,Implement sensible query pagination,9599,simonw,closed,0,,,2857392,Ship first public release,3,2017-10-23T01:16:00Z,2017-11-10T20:41:39Z,2017-11-10T20:41:39Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267516329,MDU6SXNzdWUyNjc1MTYzMjk=,6,Better JSON response options,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-23T01:18:47Z,2017-10-24T15:07:58Z,2017-10-24T15:07:58Z,OWNER,,"Default returns this: { “Columns”: [“id”, “name”, “age”], “Rows”: [ [45, “Simon”, 36] ] } .jsono instead returns a list of objects each duplicating the headers in its keys. They both probably share the same pagination mechanism so it might not be a jsono flat list.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267516650,MDU6SXNzdWUyNjc1MTY2NTA=,7,Framework where by every page is JSON plus a template,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-23T01:22:03Z,2017-10-24T02:27:25Z,2017-10-24T02:27:25Z,OWNER,,"Every single page of my interface should be implemented as a function that returns JSON. 
I can then build my jinja templates on top of the exact data that would be returned by the API version.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267517314,MDU6SXNzdWUyNjc1MTczMTQ=,8,Attempting an INSERT or UPDATE should return a sane error message,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-23T01:28:25Z,2017-10-23T15:28:12Z,2017-10-23T15:28:08Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267517348,MDU6SXNzdWUyNjc1MTczNDg=,9,Initial test suite,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-10-23T01:28:46Z,2017-10-24T05:55:33Z,2017-10-24T05:55:33Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267517381,MDU6SXNzdWUyNjc1MTczODE=,10,Set up Travis,9599,simonw,closed,0,,,2859414,v1 stretch goals,1,2017-10-23T01:29:07Z,2017-11-04T23:48:57Z,2017-11-04T23:48:57Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267522549,MDU6SXNzdWUyNjc1MjI1NDk=,11,Code that generates compile-time properties about the database ,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-23T02:18:24Z,2017-10-23T16:04:23Z,2017-10-23T16:04:23Z,OWNER,,"At a minimum this will include: * sha hash of each database file * list of tables with row counts for each database file",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267523511,MDU6SXNzdWUyNjc1MjM1MTE=,12,Make it so you can override templates,9599,simonw,closed,0,,,2949431,Custom templates edition,1,2017-10-23T02:25:35Z,2017-11-30T16:42:46Z,2017-11-30T16:38:34Z,OWNER,,"The app will ship with default templates but, just like with the Django admin, you will be able to override them using either explicit configuration settings or just by dropping in templates with certain file names. 
Template inheritance should work here, both allowing you to override just the base template and allowing you to customize tiny bits of others.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267542338,MDU6SXNzdWUyNjc1NDIzMzg=,13,Add a syntax highlighting SQL editor,9599,simonw,closed,0,,,,,1,2017-10-23T05:03:33Z,2017-11-15T02:04:51Z,2017-11-15T02:04:51Z,OWNER,,https://ace.c9.io/#nav=embedding looks like a good option,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267707940,MDU6SXNzdWUyNjc3MDc5NDA=,14,Datasette Plugins,9599,simonw,closed,0,,,,,22,2017-10-23T15:15:28Z,2019-05-13T18:58:20Z,2019-05-13T18:58:19Z,OWNER,,"It would be neat if additional functionality could be opted-in to the system in the form of easy-to-add plugins, hosted as separate packages. First example: a Google Analytics plugin, which adds GA tracking code with your tracking ID to the web interface for your dataset. This may be an opportunity to experiment with entry points: http://amir.rachum.com/blog/2017/07/28/python-entry-points/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267713226,MDU6SXNzdWUyNjc3MTMyMjY=,15,Support multiple databases,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-23T15:29:51Z,2017-10-24T02:01:38Z,2017-10-24T02:01:38Z,OWNER,,"I'm going to loop through every database file in the app root directory and bundle all of them. Each one will be accessible at /databasename Note this is without the file extension, and we will disallow multiple files with the same name but different extensions. Supported extensions to start with will be `.db` and `.sqlite` and `.sqlite3`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267726219,MDU6SXNzdWUyNjc3MjYyMTk=,16,Default HTML/CSS needs to look reasonable and be responsive,9599,simonw,closed,0,,,2857392,Ship first public release,6,2017-10-23T16:05:22Z,2017-11-11T20:19:07Z,2017-11-11T20:19:07Z,OWNER,,"Version one should have the following characteristics: - Looks OK - Works great on mobile - Loads extremely fast - No JavaScript! 
At least not in v1.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267732005,MDU6SXNzdWUyNjc3MzIwMDU=,17,"In development mode, should still pick up new .db files",9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-23T16:22:40Z,2017-10-24T02:26:48Z,2017-10-24T02:26:47Z,OWNER,,Follow on from #11 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267739593,MDU6SXNzdWUyNjc3Mzk1OTM=,18,See if I can get a websockets interface working,9599,simonw,closed,0,,,,,1,2017-10-23T16:46:41Z,2021-01-04T20:05:52Z,2021-01-04T20:05:48Z,OWNER,,"Since I am already running on Sanic, how hard would it be to add a websocket ebdpoint that lets you talk to sqlite interactively? Could this be used to efficiently support streaming in answers to giant queries?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267741262,MDU6SXNzdWUyNjc3NDEyNjI=,19,Efficient url for downloading the raw database file,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-23T16:52:17Z,2017-10-25T15:21:16Z,2017-10-25T15:19:37Z,OWNER,,Use Sanic support for steaming large files http://sanic.readthedocs.io/en/latest/sanic/response.html#file-streaming,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267759136,MDU6SXNzdWUyNjc3NTkxMzY=,20,Config file with support for defining canned queries,9599,simonw,closed,0,9599,simonw,2949431,Custom templates edition,9,2017-10-23T17:53:06Z,2017-12-05T19:05:35Z,2017-12-05T17:44:09Z,OWNER,,"Probably using YAML because then we get support for multiline strings: bats: db: bats.sqlite3 name: ""Bat sightings"" queries: specific_row: | select * from Bats where a = 1; ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267769034,MDU6SXNzdWUyNjc3NjkwMzQ=,21,Use Sanic configuration mechanism ,9599,simonw,closed,0,,,2859414,v1 stretch goals,1,2017-10-23T18:25:14Z,2017-11-10T20:45:42Z,2017-11-10T20:45:42Z,OWNER,,http://sanic.readthedocs.io/en/latest/sanic/config.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267769431,MDU6SXNzdWUyNjc3Njk0MzE=,22,Refactor to use class based views ,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-23T18:26:22Z,2019-05-27T20:05:56Z,2017-10-24T02:25:53Z,OWNER,,http://sanic.readthedocs.io/en/latest/sanic/class_based_views.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/22/reactions"", 
""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267788884,MDU6SXNzdWUyNjc3ODg4ODQ=,23,Support Django-style filters in querystring arguments,9599,simonw,closed,0,,,2857392,Ship first public release,6,2017-10-23T19:29:42Z,2017-10-25T04:23:03Z,2017-10-25T04:23:02Z,OWNER,,"e.g /database/table?name__contains=Simon&age__gte=4 Same format as Django: double underscore as the split. If you need to match against a column that happens to contain a double underscore in its official name, do this: /database/table?weird__column__exact=Simon __exact is the default operation if none is supplied.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267828746,MDU6SXNzdWUyNjc4Mjg3NDY=,24,Implement full URL design,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-10-23T21:49:05Z,2017-10-24T14:12:00Z,2017-10-24T14:12:00Z,OWNER,,"Full URL design: /database-name /database-name.json /database-name-7sha256 /database-name-7sha256.json /database-name/table-name /database-name/table-name.json /database-name-7sha256/table-name /database-name-7sha256/table-name.json /database-name-7sha256/table-name/compound-pk /database-name-7sha256/table-name/compound-pk.json ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267857622,MDU6SXNzdWUyNjc4NTc2MjI=,25,Endpoint that returns SQL ready to be piped into DB,9599,simonw,closed,0,,,,,2,2017-10-24T00:19:26Z,2017-11-15T05:11:12Z,2017-11-15T05:11:11Z,OWNER,,It would be cool if I could figure out a way to generate both the create table statements and the inserts for an individual table or the entire database and then stream them down to the client.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267861210,MDU6SXNzdWUyNjc4NjEyMTA=,26,Command line tool for uploading one or more DBs to Now,9599,simonw,closed,0,,,2857392,Ship first public release,3,2017-10-24T00:43:10Z,2017-11-11T07:25:30Z,2017-11-11T07:25:30Z,OWNER,,"Uploading files appears to be undocumented, but I found it in their code here: https://github.com/zeit/now-cli/blob/0ca7d1fe44ebdf460b64fdc38ba543b8e295ac40/src/providers/sh/util/index.js#L291",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/26/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267886330,MDU6SXNzdWUyNjc4ODYzMzA=,27,Ability to plot a simple graph,9599,simonw,closed,0,,,,,3,2017-10-24T03:34:59Z,2018-07-10T17:52:41Z,2018-07-10T17:52:41Z,OWNER,,"Might be as simple as: pick he type of chart (bar, line) and then pick the column for the X axis and the column for the Y axis. Maybe also allow a pie chart. 
It’s up to the user to come up with SQL that gets the right values.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 267886865,MDU6SXNzdWUyNjc4ODY4NjU=,28,/database?sql= should redirect correctly,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-10-24T03:38:44Z,2017-10-24T23:54:30Z,2017-10-24T23:54:30Z,OWNER,,Needs to redirect to the location with the hash while retaining the query string. This should also work with the .json extension.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268050821,MDU6SXNzdWUyNjgwNTA4MjE=,29,Handle bytestring records encoding to JSON,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-24T14:18:45Z,2017-10-24T14:59:00Z,2017-10-24T14:58:47Z,OWNER,,"http://localhost:8006/northwind-40d049b/Categories.json 500s right now The string representation of one of the values looks like this: b""\x15\x1c/\x00\x02\x00 This is a bytestring from the database which cannot be naively converted to a unicode string.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268078453,MDU6SXNzdWUyNjgwNzg0NTM=,30,Do something neat with foreign keys,9599,simonw,closed,0,,,,,1,2017-10-24T15:29:29Z,2017-11-14T18:29:08Z,2017-11-14T18:29:01Z,OWNER,,"https://www.sqlite.org/pragma.html#pragma_foreign_key_list SQLite has robust support for introspecting foreign keys. 
I could use that to automatically link to the corresponding record from my tables.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268087542,MDU6SXNzdWUyNjgwODc1NDI=,31,Idea: colour scheme based on sha256 of db,9599,simonw,closed,0,,,2859414,v1 stretch goals,1,2017-10-24T15:52:38Z,2018-05-28T18:10:45Z,2017-11-09T14:14:59Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268106803,MDU6SXNzdWUyNjgxMDY4MDM=,32,Try running SQLite queries in a separate thread,9599,simonw,closed,0,,,2859414,v1 stretch goals,1,2017-10-24T16:48:42Z,2017-11-09T14:05:56Z,2017-11-09T14:05:56Z,OWNER,,"https://pymotw.com/3/asyncio/executors.html Would be good to have some actual benchmarks so I can evaluate if this is worth it or not.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268110769,MDU6SXNzdWUyNjgxMTA3Njk=,33,Use locust for benchmarking and load tests,9599,simonw,open,0,,,,,0,2017-10-24T17:00:09Z,2017-12-10T03:12:16Z,,OWNER,,"https://github.com/locustio/locust Needed for #32 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 268176505,MDU6SXNzdWUyNjgxNzY1MDU=,34,Support CSV export with a .csv extension,9599,simonw,closed,0,,,,,1,2017-10-24T20:34:43Z,2021-06-17T18:14:48Z,2018-05-28T20:45:34Z,OWNER,,"Maybe do this using streaming with multiple pagination SQL queries so we can support arbritrarily large exports. How would this work against a view which doesn’t have an obvious efficient pagination mechanism? Maybe limit views to up to 1000 exported records? 
Relates to #5 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/34/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268262480,MDU6SXNzdWUyNjgyNjI0ODA=,36,"date, year, month and day querystring lookups",9599,simonw,closed,0,,,,,3,2017-10-25T04:23:45Z,2018-05-28T17:30:53Z,2018-05-28T17:30:53Z,OWNER,,"- [ ] `?timestamp___date=2017-07-17` - return every item where the timestamp falls on that date - [ ] `?timestamp___year=2017` - return every item where the timestamp falls within 2017 - [ ] `?timestamp___month=1` - return every item where the month component is January - [ ] `?timestamp___day=10` - return every item where the day-of-the-month component is 10 Follow on from #23 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268453968,MDU6SXNzdWUyNjg0NTM5Njg=,37,Ability to serialize massive JSON without blocking event loop,9599,simonw,closed,0,,,,,2,2017-10-25T15:58:03Z,2020-05-30T17:29:20Z,2020-05-30T17:29:20Z,OWNER,,"We run the risk of someone attempting a select statement that returns thousands of rows and hence takes several seconds just to JSON encode the response, effectively blocking the event loop and pausing all other traffic. The Twisted community have a solution for this, can we adapt that in some way? http://as.ynchrono.us/2010/06/asynchronous-json_18.html?m=1",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268462768,MDU6SXNzdWUyNjg0NjI3Njg=,38,Experiment with patterns for concurrent long running queries,9599,simonw,closed,0,,,,,5,2017-10-25T16:23:42Z,2018-05-28T20:47:31Z,2018-05-28T20:47:31Z,OWNER,,I want to understand how the system could perform under load with many concurrent long-running queries. Can we serve these without blocking the event loop?,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268469569,MDU6SXNzdWUyNjg0Njk1Njk=,39,Protect against malicious SQL that causes damage even though our DB is immutable,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-10-25T16:44:27Z,2021-08-17T23:52:07Z,2017-11-05T02:53:47Z,OWNER,,"I’m currently operating under the assumption that it’s safe to allow arbitrary SQL statements because we are dealing with an immutable database. But this might not be the case - there are some pretty weird SQLite language extensions (ATTACH, PRAGMA etc) and I’m not certain they cannot be used to break things in a way that would affect future requests to the API. Solution: provide a “safe mode” option which disables the ?sql= mechanism. This still leaves the URL filter lookups, so I need to make sure that those are “safe”. 
In the future I may also implement a whitelist option where datasets can be configured to only allow specific filters against specific columns.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268470572,MDU6SXNzdWUyNjg0NzA1NzI=,40,Implement command-line tool interface,9599,simonw,closed,0,,,2857392,Ship first public release,11,2017-10-25T16:47:15Z,2017-11-11T07:27:33Z,2017-11-11T07:27:33Z,OWNER,,"The first version needs to take one or more file names or URLs, then generate and deploy an app to Now. It will assume you already have the now command installed and configured.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268590777,MDU6SXNzdWUyNjg1OTA3Nzc=,41,Homepage should show summary of databases,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-10-26T00:18:11Z,2017-10-27T04:05:35Z,2017-10-27T04:05:35Z,OWNER,,"I sch database should have a name, optional description, download link and a summary of the tables Flights.db Flights and suchlike blah. URL? License? 577373 rows across 14 tables airports, routes, airlines... Title of the homepage is derived from the databases or can be manually overridden e. “Datasets of Flights, NHS, Blah...” - or if only one database just the title of that.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/41/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268591332,MDU6SXNzdWUyNjg1OTEzMzI=,42,Homepage UI for editing metadata file,9599,simonw,closed,0,,,,,4,2017-10-26T00:22:03Z,2017-12-10T03:02:14Z,2017-12-10T03:02:14Z,OWNER,,"Since we are going to have a metadata file which sets the title/description/etc for each database, why not allow you to run the app in —dev mode which makes the homepage into a WYSIWYG editor that can save to that file format.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/42/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 268592894,MDU6SXNzdWUyNjg1OTI4OTQ=,43,"While running, server should spot new db files added to its directory ",9599,simonw,closed,0,,,2859414,v1 stretch goals,1,2017-10-26T00:32:37Z,2017-11-14T08:25:53Z,2017-11-14T08:25:37Z,OWNER,,"Maybe in each request it checks the time and if 5s has elapsed since t last scanned the directory it scans it again This would allow people with dedicated hosting to run the app there and just upload new datasets whenever they want. 
It would also be very convenient for development.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/43/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 269731374,MDU6SXNzdWUyNjk3MzEzNzQ=,44,?_group_count=country - return counts by specific column(s),9599,simonw,closed,0,,,,,7,2017-10-30T19:50:32Z,2018-04-26T15:09:58Z,2018-04-26T15:09:58Z,OWNER,,"Imagine if this: https://stateless-datasets-jykibytogk.now.sh/flights-07d1283/airports.jsono?country__contains=gu&_group_count=country Turned into this: https://stateless-datasets-jykibytogk.now.sh/flights-07d1283?sql=select%20country,%20count(*)%20as%20group_count_country%20from%20airports%20where%20country%20like%20%27%gu%%27%20group%20by%20country%20order%20by%20group_count_country%20desc This would involve introducing a new precedent of query string arguments that start with an _ having special meanings. While we're at it, could try adding _fields=x,y,z Tasks: - [x] Get initial version working - [ ] Refactor code to not just ""pretend to be a view"" - [ ] Get foreign key relationships expanded",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/44/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 271242824,MDU6SXNzdWUyNzEyNDI4MjQ=,45,Run SQLite operations in a thread pool,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-05T02:27:12Z,2017-11-05T02:27:34Z,2017-11-05T02:27:33Z,OWNER,,"Let's run SQLite operations in threads, so we don't end up blocking our core event loop. These articles are helpful: * https://pymotw.com/3/asyncio/executors.html * https://marlinux.wordpress.com/2017/05/19/python-3-6-asyncio-sqlalchemy/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/45/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 271301468,MDU6SXNzdWUyNzEzMDE0Njg=,46,Dockerfile should build more recent SQLite with FTS5 and spatialite support,9599,simonw,closed,0,,,,,13,2017-11-05T18:16:22Z,2017-11-17T14:32:12Z,2017-11-17T14:32:12Z,OWNER,,"The SQLite bundled with Python 3 doesn't support the FTS5 search extension. It would be nice if the SQLite built by our Dockerfile could support as many modern SQLite features as possible. https://web.archive.org/web/20170212034155/http://charlesleifer.com/blog/using-the-sqlite-json1-and-fts5-extensions-with-python/ has instructions on building a more recent SQLite and the pysqlite package. 
Our Dockerfile could carry out an updated version of this process.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/46/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 271831408,MDU6SXNzdWUyNzE4MzE0MDg=,47,Create neat example database,9599,simonw,closed,0,,,,,5,2017-11-07T13:29:38Z,2017-11-14T03:08:13Z,2017-11-14T03:08:13Z,OWNER,,How about data from open elections eg https://github.com/openelections/openelections-data-ca?files=1,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/47/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 272391665,MDU6SXNzdWUyNzIzOTE2NjU=,48,Switch to ujson,9599,simonw,closed,0,,,,,4,2017-11-08T23:50:29Z,2019-06-24T06:57:54Z,2019-06-24T06:57:43Z,OWNER,,"ujson is already a dependency of Sanic, and should be quite a bit faster.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/48/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 272661336,MDU6SXNzdWUyNzI2NjEzMzY=,49,Pick a name,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-11-09T17:56:17Z,2017-11-10T18:33:22Z,2017-11-10T18:33:22Z,OWNER,,"Options so far: * immutabase * datasite * sqlstatic * dbserve * sqlserve Terms to play with: * immutable * sqlite * dataset * json * static * serve",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/49/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 272694136,MDU6SXNzdWUyNzI2OTQxMzY=,50,Unit tests against application itself,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-09T19:31:49Z,2017-11-11T22:23:22Z,2017-11-11T22:23:22Z,OWNER,,"Use Sanic’s testing mechanism. Test should create a temporary SQLite database file on disk by executing sql that is stored in the test themselves. 
For the moment we can just test the JSON API more thoroughly and just sanity check that the HTML output doesn’t throw any errors.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/50/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 272735257,MDU6SXNzdWUyNzI3MzUyNTc=,51,Make a proper README,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-09T21:46:07Z,2017-11-13T18:44:23Z,2017-11-13T18:44:23Z,OWNER,,Include instructions on building a local Docker container - currently detailed here: https://gist.github.com/simonw/0ea5c960608c2d876e4637a5e48aa95d (those instructions don't work now that we have removed the Dockerfile in favour of a template generated by `datasette publish`),107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/51/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273026602,MDU6SXNzdWUyNzMwMjY2MDI=,52,Solution for temporarily uploading DB so it can be built by docker,9599,simonw,closed,0,,,,,2,2017-11-10T18:55:25Z,2017-12-10T03:02:57Z,2017-12-10T03:02:57Z,OWNER,,For the `datasette publish` command I ideally need a way of uploading the specified DB to somewhere temporary on the internet so that when the Dockerfile is built by the final hosting location it can download that database as part of the build process.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/52/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273054652,MDU6SXNzdWUyNzMwNTQ2NTI=,53,Implement a better database index page,9599,simonw,closed,0,,,2857392,Ship first public release,3,2017-11-10T20:47:36Z,2017-11-12T21:19:33Z,2017-11-12T01:50:27Z,OWNER,,"This view isn't great. I should do a better job of separating out tables from views and indexes, showing the count of rows in each table, and maybe move the SQL to the individual table pages. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/53/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273121803,MDU6SXNzdWUyNzMxMjE4MDM=,54,Views should not attempt to link to records / use rowids,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-11T05:44:54Z,2017-11-12T21:29:42Z,2017-11-12T21:29:33Z,OWNER,,"http://localhost:8001/parlgov-development-25f9855/view_variable ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/54/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273127117,MDU6SXNzdWUyNzMxMjcxMTc=,55,Ship first version to PyPI,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-11T07:38:48Z,2017-11-13T21:19:43Z,2017-11-13T21:19:43Z,OWNER,,"Just before doing this, update the Dockerfile template to `pip install datasette` https://github.com/simonw/datasette/blob/65e350ca2a4845c25752a62c16ba58cfe2c14b9b/datasette/utils.py#L125",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/55/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273127443,MDU6SXNzdWUyNzMxMjc0NDM=,56,Easy way to block search engine crawling in robots.txt,9599,simonw,closed,0,,,,,1,2017-11-11T07:46:07Z,2018-05-28T20:50:25Z,2018-05-28T20:50:24Z,OWNER,,For people who don't want their datasets to be crawled by search engines.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/56/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273127694,MDU6SXNzdWUyNzMxMjc2OTQ=,57,Ship a Docker image of the whole thing,9599,simonw,closed,0,,,,,7,2017-11-11T07:51:28Z,2018-06-28T04:01:51Z,2018-06-28T04:01:38Z,OWNER,,"The generated Docker images can then just inherit from that. This will speed up deploys as no need to `pip install` anything. - [x] Ship that image to Docker Hub - [ ] Update the generated Dockerfile to use it",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/57/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273128608,MDU6SXNzdWUyNzMxMjg2MDg=,58,"publish command should detect if ""now"" is installed",9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-11T08:10:17Z,2017-11-11T16:00:07Z,2017-11-11T16:00:07Z,OWNER,,"If now is not installed, it should tell you where to get it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/58/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273157085,MDU6SXNzdWUyNzMxNTcwODU=,59,datasette publish hyper,9599,simonw,closed,0,,,,,4,2017-11-11T16:27:26Z,2019-05-13T19:01:00Z,2019-05-13T19:00:44Z,OWNER,,"This is a bit tricky, because unlike Now there doesn't seem to be a way to tell Hyper to ""build this Dockerfile and deploy the resulting image"". They expect you to build a container and publish it to a registry instead. 
https://docs.hyper.sh/Reference/CLI/load.html allows you to publish an image directly from a tarball, but that still leaves the challenge of creating that image. The nice thing about the Now integration is that you don't need to have Docker installed on your local machine.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/59/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273163905,MDU6SXNzdWUyNzMxNjM5MDU=,60,Rethink how metadata is generated and stored,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-11T18:01:28Z,2017-11-11T20:12:17Z,2017-11-11T20:12:16Z,OWNER,,"I broke the existing mechanism in 407795b61217205625f2d4e084afbf69f1db781b In order to get unit tests for the sanic app working. I think i should ditch the build-metadata.json cache file entirely and calculate the SHA hashes on startup. Not sure what to do about the table row counts.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/60/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273173116,MDU6SXNzdWUyNzMxNzMxMTY=,61,Common header and footer,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-11T20:20:08Z,2017-11-11T20:37:19Z,2017-11-11T20:37:19Z,OWNER,,"Split from #16 - [x] A link to the homepage from some kind of navigation bar in the header - [x] link to github.com/simonw/datasette in the footer - [x] Slightly better titles (maybe ditch the visited link colours for titles only? should keep those for primary key links)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/61/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273174397,MDU6SXNzdWUyNzMxNzQzOTc=,62,Link to .json and .jsono versions on various pages,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-11T20:37:47Z,2017-11-11T22:41:06Z,2017-11-11T22:41:06Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/62/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273174447,MDU6SXNzdWUyNzMxNzQ0NDc=,63,Review design of JSON output,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-11T20:38:33Z,2017-11-11T22:20:17Z,2017-11-11T22:20:17Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/63/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273181020,MDU6SXNzdWUyNzMxODEwMjA=,64,Support for ?field__isnull=1 or similar,9599,simonw,closed,0,,,,,1,2017-11-11T22:26:52Z,2017-11-17T14:38:21Z,2017-11-17T14:38:21Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/64/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273191608,MDU6SXNzdWUyNzMxOTE2MDg=,65,Re-implement ?sql= mode,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-12T01:47:17Z,2017-11-12T02:36:37Z,2017-11-12T02:35:42Z,OWNER,,"Here's the code I 
removed: async def data(self, request, name, hash): sql = 'select * from sqlite_master' custom_sql = False params = {} if request.args.get('sql'): params = request.raw_args sql = params.pop('sql') validate_sql_select(sql) custom_sql = True rows = await self.execute(name, sql, params) columns = [r[0] for r in rows.description] return { 'database': name, 'rows': rows, 'columns': columns, 'query': { 'sql': sql, 'params': params, } }, { 'database_hash': hash, 'custom_sql': custom_sql, } ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/65/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273191806,MDU6SXNzdWUyNzMxOTE4MDY=,66,Show table SQL on table page,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-12T01:51:23Z,2017-11-12T21:17:29Z,2017-11-12T21:17:29Z,OWNER,,"Let's do the SQL for the table you are looking at, plus SQL for any indexes that mention that table. The page for a view should show the SQL for that view.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/66/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273192789,MDU6SXNzdWUyNzMxOTI3ODk=,67,Command that builds a local docker container,9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-12T02:13:29Z,2017-11-13T16:17:52Z,2017-11-13T16:17:52Z,OWNER,,Be nice to indicate that this isn't just for Now. Shouldn't be too hard either.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/67/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273247186,MDU6SXNzdWUyNzMyNDcxODY=,68,Support for title/source/license metadata,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-11-12T17:04:21Z,2017-12-04T04:55:43Z,2017-11-13T15:26:11Z,OWNER,,"I've decided this is important for launch: I want to set a precedent for people citing, licensing and documenting their datasets. Not sure how best to go about supporting this. I'd like to allow for the following data to be optionally attached to any given database: - Title - Description, potentially in markdown? - Original source URL - License I'd also like the ability to attach descriptions to individual tables - and maybe even to table columns? The question then becomes: how should this information be stored. A few options: - In the SQLite database itself, in a specially named table. Problem here is that this means having to modify SQLite databases before publishing them. - In a separate SQLite database that can be published alongside the databases we are publishing. - In a JSON file. This is neat, but JSON files are not a great editing experience once you start including multiple lines (e.g. a markdown description). - In a YAML file. This is a better format for multi-line descriptions, but still isn't a great editing experience. 
Whatever the format, it can be made much more usable by offering a web-based editing UI for populating it (a special mode the server can be run in).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/68/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273248366,MDU6SXNzdWUyNzMyNDgzNjY=,69,Enforce pagination (or at least limits) for arbitrary custom SQL,9599,simonw,closed,0,,,2857392,Ship first public release,4,2017-11-12T17:21:33Z,2017-11-13T20:32:47Z,2017-11-13T19:35:47Z,OWNER,,"It's way too easy to accidentally trigger a page that returns 100,000 rows at the moment. I need to use the LIMIT clause on views and custom SQL - I can support pagination ""next"" links using offset as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273267081,MDU6SXNzdWUyNzMyNjcwODE=,70,Paginate views using OFFSET/LIMIT,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-12T21:30:29Z,2017-11-13T21:11:01Z,2017-11-13T21:11:01Z,OWNER,,"As with #69 these should obey a maximum offset setting, which can be over-ridden.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/70/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273278840,MDU6SXNzdWUyNzMyNzg4NDA=,71,Set up some example datasets on a Cloudflare-backed domain,9599,simonw,closed,0,,,2857392,Ship first public release,10,2017-11-13T00:06:30Z,2017-11-13T02:09:34Z,2017-11-13T02:09:34Z,OWNER,,"To better demonstrate the caching and HTTP/2 features, I'd like to go live with some demos that are hosted behind Cloudflare. - [x] Redirect https://datasettes.com/ and https://www.datasettes.com/ to https://github.com/simonw/datasette - [x] Have `now domain add -e datasettes.com` run without errors (hopefully just a matter of waiting for the DNS to update) - [x] Alias an example dataset hosted on Now on a datasettes.com subdomain - [x] Confirm that HTTP caching and HTTP/2 redirect pushing works as expected - this may require another page rule",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/71/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273283166,MDU6SXNzdWUyNzMyODMxNjY=,72,publish command should take an optional --name argument,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-13T00:59:35Z,2017-11-13T02:12:27Z,2017-11-13T02:12:27Z,OWNER,,"To set the directory name so that now will inherit it as the name of the app. 
Defaults to datasette ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/72/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273296178,MDU6SXNzdWUyNzMyOTYxNzg=,73,_nocache=1 query string option for use with sort-by-random,9599,simonw,closed,0,,,,,2,2017-11-13T02:57:10Z,2018-05-28T17:25:15Z,2018-05-28T17:25:15Z,OWNER,,The one place where we wouldn’t want cdching is if we have something which uses sort by random to return random items. We can offer a _nocache=1 querystring argument to support this.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/73/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273296684,MDU6SXNzdWUyNzMyOTY2ODQ=,74,Send a 302 redirect to the new hash for hits to old hashes,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-13T03:00:59Z,2017-11-13T18:49:59Z,2017-11-13T18:49:59Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/74/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273509159,MDU6SXNzdWUyNzM1MDkxNTk=,75,Add --cors argument to serve,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-13T17:16:19Z,2017-11-13T18:17:52Z,2017-11-13T18:17:52Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/75/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273510781,MDU6SXNzdWUyNzM1MTA3ODE=,76,publish should have required argument specifying publisher,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-13T17:21:26Z,2017-11-13T18:41:01Z,2017-11-13T18:41:01Z,OWNER,,Initially the only argument will be “now” - but “hyper” can be added in the future,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/76/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273537940,MDU6SXNzdWUyNzM1Mzc5NDA=,77,Add Travis CI badge to README,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-13T18:52:25Z,2017-11-13T21:24:15Z,2017-11-13T21:24:15Z,OWNER,,"Also fix this newline issue: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/77/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273554949,MDU6SXNzdWUyNzM1NTQ5NDk=,78,Rename after to next and provide a next_url,9599,simonw,closed,0,,,2857392,Ship first public release,0,2017-11-13T19:48:31Z,2017-11-13T20:35:03Z,2017-11-13T20:35:03Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/78/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273569068,MDU6SXNzdWUyNzM1NjkwNjg=,79,Add more detailed API documentation to the README,9599,simonw,closed,0,,,,,3,2017-11-13T20:36:21Z,2018-05-28T17:24:48Z,2018-05-28T17:24:48Z,OWNER,,"Need to document: - [ ] The 
?column__gt=4 style filter arguments for tables - [ ] The ?sql= API, and how named parameters work - [ ] How API pagination works - [ ] How redirects and cache headers work",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/79/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273569477,MDU6SXNzdWUyNzM1Njk0Nzc=,80,Deploy final versions of fivethirtyeight and parlgov datasets (with view pagination),9599,simonw,closed,0,,,2857392,Ship first public release,2,2017-11-13T20:37:46Z,2017-11-13T22:09:46Z,2017-11-13T22:09:46Z,OWNER,,Final versions should be deployed using the first released version of datasette.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/80/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273595473,MDExOlB1bGxSZXF1ZXN0MTUyMzYwNzQw,81,:fire: Removes DS_Store,50527,jefftriplett,closed,0,,,,,2,2017-11-13T22:07:52Z,2017-11-14T02:24:54Z,2017-11-13T22:16:55Z,CONTRIBUTOR,simonw/datasette/pulls/81,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/81/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273596159,MDU6SXNzdWUyNzM1OTYxNTk=,82,Post a blog entry announcing it to the world,9599,simonw,closed,0,,,2857392,Ship first public release,1,2017-11-13T22:10:35Z,2017-11-14T01:46:10Z,2017-11-14T01:46:10Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/82/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273626815,MDU6SXNzdWUyNzM2MjY4MTU=,83,Individual row view is broken,9599,simonw,closed,0,,,,,0,2017-11-14T00:29:11Z,2017-11-14T00:45:34Z,2017-11-14T00:45:34Z,OWNER,,"https://parlgov.datasettes.com/parlgov-25f9855/viewcalc_parliament_composition/18 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/83/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273660425,MDU6SXNzdWUyNzM2NjA0MjU=,84,datasette package --metadata does not work with a relative path,9599,simonw,closed,0,,,,,0,2017-11-14T04:00:50Z,2017-11-15T05:18:35Z,2017-11-15T05:18:35Z,OWNER,," $ datasette package ~/parlgov-db/parlgov.db --metadata=~/parlgov-db/parlgov.json Usage: datasette package [OPTIONS] FILES... 
Error: Invalid value for ""-m"" / ""--metadata"": Could not open file: ~/parlgov-db/parlgov.json: No such file or directory simonw-07542:~ simonw$ cd ~/parlgov-db/ simonw-07542:parlgov-db simonw$ datasette package ~/parlgov-db/parlgov.db --metadata=parlgov.json Sending build context to Docker daemon 4.46MB Step 1/7 : FROM python:3 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/84/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273678673,MDU6SXNzdWUyNzM2Nzg2NzM=,85,Detect foreign keys and use them to link HTML pages together,9599,simonw,closed,0,,,2919870,Foreign key edition,6,2017-11-14T06:12:05Z,2017-11-19T06:08:19Z,2017-11-19T06:08:19Z,OWNER,,"https://stackoverflow.com/a/44430157/6083 documents the PRAGMA needed to extract foreign key references for a table. At a minimum we can link column values known to be foreign keys to the corresponding row page. We could try to summarize the linked row in some way too - somehow extracting a sensible link title, maybe based on additional configuration in the metadata.json file. Still todo: - [x] Fix it to csvs-to-sqlite refactoring command correctly creates primary key on generated tables - [x] Ship new csvs-to-sqlite with refactoring command - [x] Refactor column logic to be more predictable in our templates (the rowid special case) - [x] Mechanism by which table metadata can specify the ""label"" column for a table - [x] Automatically set the label column as the first column that isn't a primary key (falling back on primary key) - [x] Code which runs a ""select id, label from table where id in (...)"" query as part of the tableview and populates a lookup dictionary - [x] Modify templates to use values from that lookup dictionary",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/85/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273703829,MDU6SXNzdWUyNzM3MDM4Mjk=,86,Filter UI on table page,9599,simonw,closed,0,,,2919870,Foreign key edition,10,2017-11-14T08:22:43Z,2017-11-23T20:34:32Z,2017-11-23T20:34:32Z,OWNER,,A UI for building up simple table queries by adding additional filter rules that get executed as query parameters in the URL.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/86/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273709194,MDU6SXNzdWUyNzM3MDkxOTQ=,87,Configure Travis to release new tags to PyPI,9599,simonw,closed,0,,,,,1,2017-11-14T08:44:08Z,2018-07-10T17:49:13Z,2018-07-10T17:49:12Z,OWNER,,https://docs.travis-ci.com/user/deployment/pypi/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/87/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273775212,MDU6SXNzdWUyNzM3NzUyMTI=,88,Add NHS England Hospitals example to wiki,15543,tomdyson,closed,0,,,,,4,2017-11-14T12:29:10Z,2021-03-22T23:46:36Z,2017-11-14T22:54:06Z,CONTRIBUTOR,,"https://nhs-england-hospitals.now.sh and an associated map visualisation: http://run.plnkr.co/preview/cj9zlf1qc0003414y90ajkwpk/ Datasette is wonderful! 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/88/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273816720,MDExOlB1bGxSZXF1ZXN0MTUyNTIyNzYy,89,SQL syntax highlighting with CodeMirror,15543,tomdyson,closed,0,,,,,1,2017-11-14T14:43:33Z,2017-11-15T02:03:01Z,2017-11-15T02:03:01Z,CONTRIBUTOR,simonw/datasette/pulls/89,"Addresses #13 Future enhancements could include autocompletion of table and column names, e.g. with ```javascript extraKeys: {""Ctrl-Space"": ""autocomplete""}, hintOptions: {tables: { users: [""name"", ""score"", ""birthDate""], countries: [""name"", ""population"", ""size""] }} ``` (see https://codemirror.net/doc/manual.html#addon_sql-hint and source at http://codemirror.net/mode/sql/)",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/89/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273846123,MDU6SXNzdWUyNzM4NDYxMjM=,90,datasette publish heroku,9599,simonw,closed,0,,,,,8,2017-11-14T16:01:39Z,2017-12-10T03:06:34Z,2017-12-10T03:05:48Z,OWNER,,"Heroku has Docker container support so this should not be too hard: https://devcenter.heroku.com/articles/container-registry-and-runtime See also #59 This should work exactly like the existing “datasette publish now....” command except it would be “datasette publish heroku...”",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/90/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273878873,MDU6SXNzdWUyNzM4Nzg4NzM=,91,"Option to serve databases from a different prefix, serve regular content elsewhere",9599,simonw,closed,0,,,,,1,2017-11-14T17:32:46Z,2017-12-10T03:07:58Z,2017-12-10T03:07:53Z,OWNER,,"It would be useful if the databases themselves could be served from a prefix e.g. datasette serve mydb.db --path-prefix=db Now my database is at `http://localhost:8001/db/mydb-23423` This would free up the rest of the URL namespace for other things. Maybe we could have an option to serve static content from a known folder e.g. datasette serve mydb.db --path-prefix=db --root-content=~/my-project/static Now a hit to `http://localhost:8001/news/` serves content from `~/my-project/static/news/index.html` This would make it trivial to package up entire HTML/CSS/JS apps with one or more underlying SQLite databases. Running without `--cors` would be fine here because any JS apps would be hosted on the same origin.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/91/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273895344,MDU6SXNzdWUyNzM4OTUzNDQ=,92,Add --license --license_url --source --source_url --title arguments to datasette publish,9599,simonw,closed,0,,,,,0,2017-11-14T18:27:07Z,2017-11-15T05:04:41Z,2017-11-15T05:04:41Z,OWNER,,"I keep on using the `echo '{""source"": ""...""}' | datasette publish now --metadata=-` pattern, which suggests it makes sense for us to support these as optional arguments. 
https://gist.github.com/simonw/9f8bf23b37a42d7628c4dcc4bba10253",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/92/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273944952,MDU6SXNzdWUyNzM5NDQ5NTI=,93,Package as standalone binary,67420,atomotic,closed,0,,,,,18,2017-11-14T21:14:07Z,2021-11-21T07:00:23Z,2021-11-21T07:00:23Z,NONE,,"hint: more than the docker image a standalone and multiplatform binary (containing the app and the database) could be simpler to distribute. i would like to investigate the possibility to package everything with [pyinstaller](http://www.pyinstaller.org/) adding the database as a [data file](https://pythonhosted.org/PyInstaller/spec-files.html#adding-data-files)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/93/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 273961179,MDExOlB1bGxSZXF1ZXN0MTUyNjMxNTcw,94,Initial add simple prod ready Dockerfile refs #57,247192,macropin,closed,0,,,,,1,2017-11-14T22:09:09Z,2017-11-15T03:08:04Z,2017-11-15T03:08:04Z,CONTRIBUTOR,simonw/datasette/pulls/94,"Multi-stage build based off official python:3.6-slim Example usage: ``` docker run --rm -t -i -p 9000:8001 -v $(pwd)/db:/db datasette datasette serve /db/chinook.db ```",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/94/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 273998513,MDU6SXNzdWUyNzM5OTg1MTM=,95,Allow shorter time limits to be set using a ?_sql_time_limit_ms =20 query string limit,9599,simonw,closed,0,,,,,1,2017-11-15T01:02:16Z,2017-11-15T02:56:13Z,2017-11-15T02:56:13Z,OWNER,,This cannot be greater than the configured time limit.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/95/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274001453,MDU6SXNzdWUyNzQwMDE0NTM=,96,UI for editing named parameters,9599,simonw,closed,0,,,,,3,2017-11-15T01:19:21Z,2017-11-16T01:45:51Z,2017-11-16T01:33:38Z,OWNER,,"On any page displaying a custom query that includes named parameters, we should show HTML form fields for editing those parameters. 
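For the named-parameter form fields proposed in #96 above, a rough sketch of extracting `:name` parameters from a SQL string - the regex is an approximation that ignores `::casts` but does not skip string literals:

```python
import re

def named_parameters(sql: str):
    return sorted(set(re.findall(r'(?<!:):([a-zA-Z_][a-zA-Z0-9_]*)', sql)))

# Each extracted name would become an <input> pre-filled from the query string:
assert named_parameters('select * from dogs where Breed like :breed') == ['breed']
```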
Eg the breed parameter on https://australian-dogs.now.sh/australian-dogs-3ba9628?sql=select+name%2C+count%28*%29+as+n+from+%28%0D%0A%0D%0Aselect+upper%28%22Animal+name%22%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2013%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2014%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28Animal_Name%29+as+name+from+%5BAdelaide-City-Council-dog-registrations-2015%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22AnimalName%22%29+as+name+from+%5BCity-of-Port-Adelaide-Enfield-Dog_Registrations_2016%5D+where+AnimalBreed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5BMitcham-dog-registrations-2015%5D+where+Breed+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22DOG_NAME%22%29+as+name+from+%5Bburnside-dog-registrations-2015%5D+where+DOG_BREED+like+%3Abreed%0D%0A%0D%0Aunion+all+%0D%0A%0D%0Aselect+upper%28%22Animal_Name%22%29+as+name+from+%5Bcity-of-playford-2015-dog-registration%5D+where+Breed_Description+like+%3Abreed%0D%0A%0D%0Aunion+all%0D%0A%0D%0Aselect+upper%28%22Animal+Name%22%29+as+name+from+%5Bcity-of-prospect-dog-registration-details-2016%5D+where%22Breed+Description%22+like+%3Abreed%0D%0A%0D%0A%29+group+by+name+order+by+n+desc%3B&breed=pug",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/96/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274022950,MDU6SXNzdWUyNzQwMjI5NTA=,97,Link to JSON for the list of tables ,9599,simonw,closed,0,,,,,3,2017-11-15T03:29:05Z,2018-05-29T18:51:35Z,2018-05-28T20:57:21Z,OWNER,,https://twitter.com/yschimke/status/930606210855854080,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/97/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274023417,MDU6SXNzdWUyNzQwMjM0MTc=,98,Default to 127.0.0.1 not 0.0.0.0,9599,simonw,closed,0,,,,,0,2017-11-15T03:31:55Z,2017-11-15T05:08:54Z,2017-11-15T05:08:54Z,OWNER,,https://twitter.com/yschimke/status/930606210855854080,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/98/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274023625,MDU6SXNzdWUyNzQwMjM2MjU=,99,Start a change log,9599,simonw,closed,0,,,,,0,2017-11-15T03:33:21Z,2017-11-16T15:12:46Z,2017-11-16T15:12:45Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/99/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274160723,MDU6SXNzdWUyNzQxNjA3MjM=,100,TemplateAssertionError: no filter named 'tojson',13304454,coisnepe,closed,0,,,,,2,2017-11-15T13:43:41Z,2017-11-16T09:25:10Z,2017-11-16T00:14:13Z,NONE,,"A 500 error is raised upon clicking on the name of a table on the homepage, say _http://0.0.0.0:8001/_ to _http://0.0.0.0:8001/test_check-c1f4771/users_ The API part seems to function as intended, though... 
``` 2017-11-15 14:33:57 - (sanic)[ERROR]: Traceback (most recent call last): File ""/usr/local/lib/python3.5/dist-packages/sanic/app.py"", line 503, in handle_request response = await response File ""/usr/local/lib/python3.5/dist-packages/datasette/app.py"", line 155, in get return await self.view_get(request, name, hash, **kwargs) File ""/usr/local/lib/python3.5/dist-packages/datasette/app.py"", line 219, in view_get **context, File ""/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py"", line 84, in render return html(self.render_string(template, request, **context)) File ""/usr/local/lib/python3.5/dist-packages/sanic_jinja2/__init__.py"", line 81, in render_string return self.env.get_template(template).render(**context) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 812, in get_template return self._load_template(name, self.make_globals(globals)) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 786, in _load_template template = self.loader.load(self, name, globals) File ""/usr/lib/python3/dist-packages/jinja2/loaders.py"", line 125, in load code = environment.compile(source, name, filename) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 565, in compile self.handle_exception(exc_info, source_hint=source_hint) File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 754, in handle_exception reraise(exc_type, exc_value, tb) File ""/usr/lib/python3/dist-packages/jinja2/_compat.py"", line 37, in reraise raise value.with_traceback(tb) File ""/usr/local/lib/python3.5/dist-packages/datasette/templates/table.html"", line 29, in template
params = {{ query.params|tojson(4) }}
File ""/usr/lib/python3/dist-packages/jinja2/environment.py"", line 515, in _generate return generate(source, self, name, filename, defer_init=defer_init) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 62, in generate generator.visit(node) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 849, in visit_Template self.blockvisit(block.body, block_frame) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1172, in visit_If self.blockvisit(node.body, if_frame) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1353, in visit_Output self.visit(argument, frame) File ""/usr/lib/python3/dist-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 1565, in visit_Filter self.fail('no filter named %r' % node.name, node.lineno) File ""/usr/lib/python3/dist-packages/jinja2/compiler.py"", line 427, in fail raise TemplateAssertionError(msg, lineno, self.name, self.filename) jinja2.exceptions.TemplateAssertionError: no filter named 'tojson' 2017-11-15 14:33:57 - (network)[INFO][127.0.0.1:41316]: GET http://0.0.0.0:8001/test_check-c1f4771/users 500 144 2017-11-15 14:33:57 - (network)[INFO][127.0.0.1:41316]: GET http://0.0.0.0:8001/favicon.ico 200 0 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/100/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274161964,MDU6SXNzdWUyNzQxNjE5NjQ=,101,TemplateAssertionError: no filter named 'tojson',450244,eaubin,closed,0,,,,,1,2017-11-15T13:47:32Z,2017-11-15T13:48:55Z,2017-11-15T13:48:55Z,NONE,,"I get an exception clicking on the table link: ``` 2017-11-15 08:40:10 - (sanic)[ERROR]: Traceback (most recent call last): File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic/app.py"", line 503, in handle_request response = await response File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/app.py"", line 155, in get return await self.view_get(request, name, hash, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/app.py"", line 219, in view_get **context, File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic_jinja2/__init__.py"", line 84, in render return html(self.render_string(template, request, **context)) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/sanic_jinja2/__init__.py"", line 81, in render_string return self.env.get_template(template).render(**context) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 812, in get_template return self._load_template(name, self.make_globals(globals)) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 786, in _load_template template = self.loader.load(self, name, globals) File 
""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/loaders.py"", line 125, in load code = environment.compile(source, name, filename) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 565, in compile self.handle_exception(exc_info, source_hint=source_hint) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 754, in handle_exception reraise(exc_type, exc_value, tb) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/_compat.py"", line 37, in reraise raise value.with_traceback(tb) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/datasette/templates/table.html"", line 29, in template
params = {{ query.params|tojson(4) }}
File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/environment.py"", line 515, in _generate return generate(source, self, name, filename, defer_init=defer_init) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 62, in generate generator.visit(node) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 849, in visit_Template self.blockvisit(block.body, block_frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 1172, in visit_If self.blockvisit(node.body, if_frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 492, in blockvisit self.visit(node, frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 1353, in visit_Output self.visit(argument, frame) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/visitor.py"", line 38, in visit return f(node, *args, **kwargs) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 1565, in visit_Filter self.fail('no filter named %r' % node.name, node.lineno) File ""/Users/e/anaconda3-4.2.0/lib/python3.5/site-packages/jinja2/compiler.py"", line 427, in fail raise TemplateAssertionError(msg, lineno, self.name, self.filename) jinja2.exceptions.TemplateAssertionError: no filter named 'tojson' ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/101/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274264175,MDU6SXNzdWUyNzQyNjQxNzU=,102,datasette publish elasticbeanstalk,9599,simonw,closed,0,,,,,1,2017-11-15T18:48:31Z,2021-01-04T20:13:20Z,2021-01-04T20:13:19Z,OWNER,,"It looks like Elastic Beanstalk is the most convenient way to deploy a docker container to AWS without first deploying a cluster. https://aws.amazon.com/blogs/devops/dockerizing-a-python-web-app/ looks helpful. 
We would need to automate the deployment with Boto: http://boto3.readthedocs.io/en/latest/reference/services/elasticbeanstalk.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/102/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274265878,MDU6SXNzdWUyNzQyNjU4Nzg=,103,datasette publish appengine,9599,simonw,closed,0,,,,,1,2017-11-15T18:54:18Z,2021-01-04T20:05:14Z,2021-01-04T20:05:14Z,OWNER,,"Similar approach to Heroku, discussed in #90 Looks like this could be pretty easy: https://cloud.google.com/appengine/docs/flexible/python/quickstart",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/103/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274284246,MDExOlB1bGxSZXF1ZXN0MTUyODcwMDMw,104,[WIP] Add publish to heroku support,21148,jacobian,closed,0,,,,,6,2017-11-15T19:56:22Z,2017-11-21T20:55:05Z,2017-11-21T20:55:05Z,CONTRIBUTOR,simonw/datasette/pulls/104," Refs #90 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274314940,MDU6SXNzdWUyNzQzMTQ5NDA=,105,Consider data-package as a format for metadata,9599,simonw,closed,0,,,,,4,2017-11-15T21:43:34Z,2017-11-20T19:50:53Z,2017-11-20T19:50:53Z,OWNER,,http://frictionlessdata.io/specs/data-package/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274315193,MDU6SXNzdWUyNzQzMTUxOTM=,106,Document how pagination works,9599,simonw,closed,0,,,,,1,2017-11-15T21:44:32Z,2019-06-24T06:42:33Z,2019-06-24T06:42:33Z,OWNER,,I made a start at that in this comment: https://news.ycombinator.com/item?id=15691926,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274343647,MDExOlB1bGxSZXF1ZXN0MTUyOTE0NDgw,107,add support for ?field__isnull=1,3433657,raynae,closed,0,,,,,4,2017-11-15T23:36:36Z,2017-11-17T15:12:29Z,2017-11-17T13:29:22Z,CONTRIBUTOR,simonw/datasette/pulls/107,Is this what you had in mind for [this issue](https://github.com/simonw/datasette/issues/64)?,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/107/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274374317,MDU6SXNzdWUyNzQzNzQzMTc=,108,"Include version in python code, output in template",9599,simonw,closed,0,,,,,0,2017-11-16T02:32:40Z,2017-11-16T15:30:04Z,2017-11-16T15:30:04Z,OWNER,,It would be useful if I could tell which version of datasette was running on a site. 
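For #108 above, a minimal sketch of exposing a version string to every template; the module layout and context key are assumptions:

```python
__version__ = '0.12'  # single source of truth, e.g. in the package __init__

def base_context():
    # Merged into each page's template context, so the footer link can
    # render something like: Datasette {{ datasette_version }}
    return {'datasette_version': __version__}
```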
Embed version number and include it in maybe a tooltip on the “powered by datasette” link,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/108/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274378301,MDU6SXNzdWUyNzQzNzgzMDE=,109,Set up readthedocs,9599,simonw,closed,0,,,,,1,2017-11-16T02:58:01Z,2017-11-16T16:53:26Z,2017-11-16T16:13:56Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/109/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274578142,MDU6SXNzdWUyNzQ1NzgxNDI=,110,Add --load-extension option to datasette for loading extra SQLite extensions,9599,simonw,closed,0,,,,,2,2017-11-16T16:26:19Z,2017-11-16T18:38:30Z,2017-11-16T16:58:50Z,OWNER,,"This would allow users with extra SQLite extensions installed (like spatialite) to load them at runtime. Inspired by this comment: https://github.com/simonw/datasette/issues/46#issuecomment-344810525",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/110/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274615452,MDU6SXNzdWUyNzQ2MTU0NTI=,111,Add “updated” to metadata,9599,simonw,open,0,,,,,12,2017-11-16T18:22:20Z,2021-09-21T22:48:27Z,,OWNER,,"To give an indication as to when the data was last updated. This should be a field in the metadata that is then shown on the index page and in the footer, if it is set. Also support setting it using an option to “datasette publish” and “datasette package” - which can either be a string or can be the magic string “today” to set it to today’s date: datasette publish file.db --updated=today",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/111/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 274617240,MDU6SXNzdWUyNzQ2MTcyNDA=,112,Allow --load-extension to be set via environment variables,9599,simonw,closed,0,,,,,1,2017-11-16T18:28:31Z,2017-11-17T14:19:23Z,2017-11-17T14:17:27Z,OWNER,,"This will make it easier to package up datasette in a Docker container with a bunch of pre-compiled extensions without the user having to remember to include all of the options every time. 
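A sketch of #112 above using the Click environment-variable mechanism linked on the next line; the variable name `SQLITE_EXTENSIONS` is an assumption:

```python
import click

@click.command()
@click.option(
    '--load-extension',
    'extensions',
    multiple=True,
    envvar='SQLITE_EXTENSIONS',  # Click splits the value on whitespace
)
def serve(extensions):
    for path in extensions:
        click.echo('would call conn.load_extension({!r})'.format(path))

if __name__ == '__main__':
    serve()
```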
Click has a mechanism for this: http://click.pocoo.org/5/options/#multiple-values-from-environment-values",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/112/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274662378,MDU6SXNzdWUyNzQ2NjIzNzg=,113,Fix the   bug on the database custom SQL query view,9599,simonw,closed,0,,,2919870,Foreign key edition,0,2017-11-16T21:01:26Z,2017-11-17T15:40:52Z,2017-11-17T15:40:52Z,OWNER,,"https://sf-film-locations.now.sh/sf-film-locations-57704b7?sql=select+*+from+Film_Locations_in_San_Francisco This is the bug I fixed in 01e0c3fa18cd0dd7970e208790ffd683a420c924 - but I only fixed it in one place.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/113/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274733145,MDExOlB1bGxSZXF1ZXN0MTUzMjAxOTQ1,114,"Add spatialite, switch to debian and local build",54999,ingenieroariel,closed,0,,,,,1,2017-11-17T02:37:09Z,2017-11-17T03:50:52Z,2017-11-17T03:50:52Z,CONTRIBUTOR,simonw/datasette/pulls/114,"Improves the Dockerfile to support spatial datasets, work with the local datasette code (Friendly with git tags and Dockerhub) and moves to slim debian, a small image easy to extend via apt packages for sqlite.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/114/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274877366,MDExOlB1bGxSZXF1ZXN0MTUzMzA2ODgy,115,Add keyboard shortcut to execute SQL query,198537,rgieseke,closed,0,,,,,1,2017-11-17T14:13:33Z,2017-11-17T15:16:34Z,2017-11-17T14:22:56Z,CONTRIBUTOR,simonw/datasette/pulls/115,"Very cool tool, thanks a lot! This PR adds a `Shift-Enter` short cut to execute the SQL query. 
I used CodeMirrors keyboard handling.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/115/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 274884209,MDU6SXNzdWUyNzQ4ODQyMDk=,116,Add documentation section about SQLite extensions,9599,simonw,closed,0,,,,,1,2017-11-17T14:36:30Z,2018-05-28T17:23:42Z,2018-05-28T17:23:41Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/116/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 274900388,MDExOlB1bGxSZXF1ZXN0MTUzMzI0MzAx,117,Don't prevent tabbing to `Run SQL` button,198537,rgieseke,closed,0,,,,,1,2017-11-17T15:27:50Z,2017-11-19T20:30:24Z,2017-11-18T00:53:43Z,CONTRIBUTOR,simonw/datasette/pulls/117,"Mentioned in #115 Here you go!",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/117/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 275048699,MDExOlB1bGxSZXF1ZXN0MTUzNDMyMDQ1,118,Foreign key information on row and table pages,9599,simonw,closed,0,,,,,0,2017-11-18T03:13:27Z,2017-11-18T03:15:57Z,2017-11-18T03:15:50Z,OWNER,simonw/datasette/pulls/118,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/118/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 275082158,MDU6SXNzdWUyNzUwODIxNTg=,119,"Build an ""export this data to google sheets"" plugin",9599,simonw,closed,0,,,,,1,2017-11-18T14:14:51Z,2020-06-04T18:46:40Z,2020-06-04T18:46:39Z,OWNER,,"Inspired by https://github.com/kren1/tosheets It should be a plug-in because I'd like to keep all interactions with proprietary / non-open-source software encapsulated in plugins rather than shipped as part of core.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/119/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275087397,MDU6SXNzdWUyNzUwODczOTc=,120,Plugin that adds an authentication layer of some sort,9599,simonw,closed,0,,,,,4,2017-11-18T15:39:13Z,2020-03-16T18:48:06Z,2020-03-16T18:48:06Z,OWNER,,"Would allow people who want to host private data to do so. .sh ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/120/reactions"", ""total_count"": 7, ""+1"": 5, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 2, ""rocket"": 0, ""eyes"": 0}",,completed 275089535,MDU6SXNzdWUyNzUwODk1MzU=,121,?_json=foo&_json=bar query string argument ,9599,simonw,closed,0,,,,,4,2017-11-18T16:09:55Z,2018-05-31T13:48:12Z,2018-05-28T18:11:51Z,OWNER,,"Causes the specified columns in the output to be treated as JSON, and returned deserialized in the .json or .jsono response. 
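A sketch of the column deserialization described in #121 above; the function shape is illustrative:

```python
import json

def expand_json_columns(columns, rows, json_columns):
    # json_columns comes from the repeated ?_json=colname parameters
    indexes = [i for i, name in enumerate(columns) if name in json_columns]
    expanded = []
    for row in rows:
        row = list(row)
        for i in indexes:
            if isinstance(row[i], str):
                row[i] = json.loads(row[i])
        expanded.append(row)
    return expanded
```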
This will be particularly powerful when combined with https://sqlite.org/json1.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/121/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275092453,MDU6SXNzdWUyNzUwOTI0NTM=,122,"Redesign JSON output, ditch jsono, offer variants controlled by parameter instead",9599,simonw,closed,0,,,,,5,2017-11-18T16:52:28Z,2018-04-08T14:54:09Z,2018-04-08T14:54:09Z,OWNER,,"I want to support three variants for the rows output: * a list of lists, with a columns key saying what they are * a list of dictionaries * a single dictionary where the keys are the primary keys of the rows and the values are the row dictionaries themselves I also want to make the various bits of metadata opt-in - so you don't get the SQL statement unless you ask for it. These output options should be controlled by query string arguments. I will set the .jsono URL to redirect to .json with the corresponding options. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/122/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275125561,MDU6SXNzdWUyNzUxMjU1NjE=,123,Datasette serve should accept paths/URLs to CSVs and other file formats,9599,simonw,open,0,,,,,9,2017-11-19T02:05:48Z,2021-07-19T00:04:32Z,,OWNER,,"This would remove the csvs-to-sqlite step which I end up using for almost everything. I'm hesitant to introduce pandas as a required dependency though since it requires compiling numpy. Could build it so this option is only available if you have pandas installed.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/123/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",, 275125805,MDU6SXNzdWUyNzUxMjU4MDU=,124,Option to open readonly but not immutable,9599,simonw,closed,0,,,,,5,2017-11-19T02:11:03Z,2019-06-24T06:43:46Z,2019-06-24T06:43:46Z,OWNER,,Immutable assumes no other process can modify the file. An option to open readonly instead would enable other processes to update the file in place.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/124/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275135393,MDU6SXNzdWUyNzUxMzUzOTM=,125,Plot rows on a map with Leaflet and Leaflet.markercluster,9599,simonw,closed,0,,,,,2,2017-11-19T06:05:05Z,2018-04-26T15:14:31Z,2018-04-26T15:14:31Z,OWNER,,"https://github.com/Leaflet/Leaflet.markercluster would allow us to paginate-load in an enormous set of rows with latitude/longitude points, e.g. 
https://australian-dunnies.now.sh/ Here's a demo of it loading 50,000 markers: https://leaflet.github.io/Leaflet.markercluster/example/marker-clustering-realworld.50000.html - and it looks like it's easy to support progress bars for if we were iteratively loading 1,000 markers at a time using datasette pagination.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/125/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275135535,MDU6SXNzdWUyNzUxMzU1MzU=,126,Blog entry announcing foreign key support,9599,simonw,closed,0,,,2919870,Foreign key edition,1,2017-11-19T06:09:06Z,2017-11-30T16:49:24Z,2017-11-30T16:49:24Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/126/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275135719,MDU6SXNzdWUyNzUxMzU3MTk=,127,"Filtered tables should show count of all matching rows, if fast enough",9599,simonw,closed,0,,,2919870,Foreign key edition,2,2017-11-19T06:13:29Z,2017-11-24T22:02:01Z,2017-11-24T22:02:01Z,OWNER,,"Relates to #86. If you are viewing a filtered page e.g. https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9/bob-ross%2Felements-by-episode?CLOUDS=1 we should show the count of matching rows. Since this could be an expensive operation, we will run it with a strict time limit (maybe 50ms). If the time limit is exceeded we will display ""many"" instead, perhaps? Maybe even link to a count(*) query that would get the full 1000ms time limit which the user can click on if they like (that could even Ajax-in the result).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/127/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275159710,MDU6SXNzdWUyNzUxNTk3MTA=,128,"Every visualization should have an ""embed"" button",9599,simonw,open,0,,,,,0,2017-11-19T13:38:13Z,2019-05-13T18:33:51Z,,OWNER,,"At least for the first round of visualizations, any time you construct one using the UI the result should include an ""embed this"" button that returns source code to copy and paste These examples should use unpkg.com (or similarl) urls with SRI hashes, eg https://www.srihash.org - and should load data from the datasette JSON API.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/128/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 275164558,MDU6SXNzdWUyNzUxNjQ1NTg=,129,Hide FTS-created tables by default on the database index page,9599,simonw,closed,0,,,,,2,2017-11-19T14:50:42Z,2017-11-22T20:22:02Z,2017-11-22T20:19:04Z,OWNER,,"SQLite databases that use FTS include a number of automatically generated tables, e.g.: https://sf-trees-search.now.sh/sf-trees-search-a899b92 Of these, only the `Street_Tree_List` table is actually relevant to the user. 
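For the strictly time-limited count in #127 above, a sketch using SQLite's progress handler to abandon the query once the deadline passes - the 50ms figure is from the issue, everything else is illustrative:

```python
import sqlite3
import time

def count_with_time_limit(conn, table_sql, ms=50):
    deadline = time.monotonic() + ms / 1000
    # The handler runs every 1000 opcodes; a truthy return aborts the query
    conn.set_progress_handler(lambda: time.monotonic() > deadline, 1000)
    try:
        return conn.execute(
            'select count(*) from ({})'.format(table_sql)
        ).fetchone()[0]
    except sqlite3.OperationalError:
        return None  # interrupted - the template shows "many" instead
    finally:
        conn.set_progress_handler(None, 1000)
```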
We can detect which tables are FTS tables by first finding the virtual tables:

    sqlite> .headers on
    sqlite> select * from sqlite_master where rootpage = 0;
    type|name|tbl_name|rootpage|sql
    table|Search|Search|0|CREATE VIRTUAL TABLE ""Street_Tree_List_fts"" USING FTS4 (""qAddress"", ""qCaretaker"", ""qSpecies"")

Then parsing the above to figure out which ones are USING FTS? - then assume that any table which starts with that `Street_Tree_List_fts` prefix was created to support search:

    sqlite> select * from sqlite_master where type='table' and tbl_name like 'Street_Tree_List_fts%';
    type|name|tbl_name|rootpage|sql
    table|Search_content|Search_content|10355|CREATE TABLE 'Street_Tree_List_fts_content'(docid INTEGER PRIMARY KEY, 'c0qAddress', 'c1qCaretaker', 'c2qSpecies')
    table|Search_segments|Search_segments|10356|CREATE TABLE 'Street_Tree_List_fts_segments'(blockid INTEGER PRIMARY KEY, block BLOB)
    table|Search_segdir|Search_segdir|10357|CREATE TABLE 'Street_Tree_List_fts_segdir'(level INTEGER,idx INTEGER,start_block INTEGER,leaves_end_block INTEGER,end_block INTEGER,root BLOB,PRIMARY KEY(level, idx))
    table|Search_docsize|Search_docsize|10359|CREATE TABLE 'Street_Tree_List_fts_docsize'(docid INTEGER PRIMARY KEY, size BLOB)
    table|Search_stat|Search_stat|10360|CREATE TABLE 'Street_Tree_List_fts_stat'(id INTEGER PRIMARY KEY, value BLOB)

We won't hide these completely - instead, we'll default the database index view to not showing them with a message that says ""5 hidden tables"" and support ?_hidden=1 to display them.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/129/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275166078,MDU6SXNzdWUyNzUxNjYwNzg=,130,"Rename ""datasette build"" to ""datasette inspect""",9599,simonw,closed,0,,,,,0,2017-11-19T15:08:02Z,2017-12-07T16:57:58Z,2017-12-07T16:57:58Z,OWNER,,"This command introspects the databases and writes out a JSON summary. 
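A sketch of the FTS-table detection logic described in #129 above, using the same two `sqlite_master` queries; the helper name and return type are illustrative:

```python
import sqlite3

def hidden_fts_tables(conn: sqlite3.Connection):
    hidden = set()
    # Virtual tables have rootpage = 0 in sqlite_master
    for name, sql in conn.execute(
        'select name, sql from sqlite_master where rootpage = 0'
    ):
        if 'USING FTS' not in (sql or '').upper():
            continue
        # Any real table sharing the virtual table's prefix backs its index
        for (table_name,) in conn.execute(
            'select name from sqlite_master where type = ?', ('table',)
        ):
            if table_name.startswith(name + '_'):
                hidden.add(table_name)
    return hidden
```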
I think I'd like to use `datasette build` for something more interesting, potentially duplicating functionality from https://github.com/simonw/csvs-to-sqlite Since the internal method that does this is called `ds.inspect()` that seems like a reasonable replacement name for the command.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/130/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275166669,MDU6SXNzdWUyNzUxNjY2Njk=,131,UI support for running FTS searches,9599,simonw,closed,0,,,,,3,2017-11-19T15:16:20Z,2017-11-19T17:18:05Z,2017-11-19T17:00:12Z,OWNER,,"Here's an example query that searches all FTS indexed columns in a table: https://sf-trees-search.now.sh/sf-trees-search-a899b92?sql=select+*+from+Street_Tree_List+where+rowid+in+%28select+rowid+from+Street_Tree_List_fts+where+Street_Tree_List_fts+match+%27grove+london+dpw%27%29%0D%0A And here's a query that searches a specific column: https://sf-trees-search.now.sh/sf-trees-search-a899b92?sql=select+*+from+Street_Tree_List+where+rowid+in+%28select+rowid+from+Street_Tree_List_fts+where+qSpecies+match+%27london%27%29%0D%0A If we detect that a table has FTS enabled (which we can do by looking for it as a content table reference in another FTS table's create definition) we should add a search box to the table page which constructs this query - maybe using `?_search=XXX` in the query string? To support search against specified columns, we can do `?_search__ qSpecies=London`. - not necessary, see comment below. - [x] Detect if a table has a FTS index defined against it as a content= parameter - [x] Decide what to do if there is more than one FTS index (maybe just pick the first one?) 
- [x] Add the `?_search=` query string argument - [x] Add the UI",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/131/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275175929,MDU6SXNzdWUyNzUxNzU5Mjk=,132,Row view is not currently expanding foreign keys,9599,simonw,closed,0,,,2919870,Foreign key edition,1,2017-11-19T17:24:25Z,2017-11-23T21:51:51Z,2017-11-23T21:51:30Z,OWNER,,Eg https://sf-trees.now.sh/sf-trees-ebc2ad9/Street_Tree_List/1,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/132/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275176006,MDU6SXNzdWUyNzUxNzYwMDY=,133,"If view is filtered, search should apply within those filtered rows",9599,simonw,closed,0,,,2919870,Foreign key edition,3,2017-11-19T17:25:36Z,2017-11-24T22:30:32Z,2017-11-24T22:30:15Z,OWNER,,Eg on https://sf-trees.now.sh/sf-trees-ebc2ad9/Street_Tree_List?qSpecies=1,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/133/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275176094,MDU6SXNzdWUyNzUxNzYwOTQ=,134,Filtered table view should show a count,9599,simonw,closed,0,,,2919870,Foreign key edition,1,2017-11-19T17:26:53Z,2017-11-19T18:10:49Z,2017-11-19T18:10:49Z,OWNER,,Let's do the thing where we attempt to show an accurate count if it can be done in less than 50ms,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/134/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275179724,MDU6SXNzdWUyNzUxNzk3MjQ=,135,?_search=x should work if used directly against a FTS virtual table,9599,simonw,closed,0,,,2949431,Custom templates edition,3,2017-11-19T18:17:53Z,2017-12-07T04:54:41Z,2017-12-07T04:54:41Z,OWNER,,e.g. https://sf-trees.now.sh/sf-trees-ebc2ad9/Street_Tree_List_fts?_search=grove should work,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/135/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275228834,MDU6SXNzdWUyNzUyMjg4MzQ=,136,"""Reformat SQL"" button next to SQL editor textarea",9599,simonw,closed,0,,,,,0,2017-11-20T03:42:19Z,2019-10-14T03:46:13Z,2019-10-14T03:46:13Z,OWNER,,"Can use this: https://github.com/zeroturnaround/sql-formatter https://zeroturnaround.github.io/sql-formatter/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/136/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275415799,MDU6SXNzdWUyNzU0MTU3OTk=,137,Ability to combine multiple SQL queries on a single graph,9599,simonw,open,0,,,,,1,2017-11-20T16:26:57Z,2019-05-13T18:33:51Z,,OWNER,,This would make visualizations significantly more powerful. The interesting challenge will be around the URL design. 
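For the `?_search=` argument added in #131 above, a sketch of the query construction that the issue's two example URLs demonstrate; the search term itself stays a bound `:search` parameter rather than being interpolated:

```python
def search_sql(table, fts_table, column=None):
    # column=None searches every indexed column; pass a column name to
    # scope the match, as in the issue's qSpecies example
    target = column or fts_table
    return (
        'select * from [{table}] where rowid in '
        '(select rowid from [{fts}] where [{target}] match :search)'
    ).format(table=table, fts=fts_table, target=target)
```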
It would be useful to be able to combine either multiple explicit SQL queries or multiple queries based on the filter string parameters passed to one or more table views.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/137/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 275476839,MDU6SXNzdWUyNzU0NzY4Mzk=,138,"Per-database and per-table metadata, probably using data-package",9599,simonw,closed,0,,,,,1,2017-11-20T19:50:10Z,2017-12-10T03:08:36Z,2017-12-10T03:08:26Z,OWNER,,"Ability to annotate databases and tables with extra metadata describing their purpose, providing source and licensing information and describing individual columns. http://frictionlessdata.io/specs/data-package/ looks like a great format for this, see #105 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/138/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275493851,MDU6SXNzdWUyNzU0OTM4NTE=,139,Build a visualization plugin for Vega,9599,simonw,closed,0,,,,,2,2017-11-20T20:47:41Z,2018-07-10T17:48:18Z,2018-07-10T17:48:18Z,OWNER,,"https://vega.github.io/vega/examples/population-pyramid/ for example looks pretty easy to hook up to Datasette. Depends on #14 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/139/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275755475,MDU6SXNzdWUyNzU3NTU0NzU=,140,Heatmap visualization plugin,9599,simonw,open,0,,,,,2,2017-11-21T15:34:23Z,2019-05-13T18:33:51Z,,OWNER,,Could use https://github.com/scottbedard/svelte-heatmap,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/140/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 275814941,MDU6SXNzdWUyNzU4MTQ5NDE=,141,datasette publish can fail if /tmp is on a different device,21148,jacobian,closed,0,,,2949431,Custom templates edition,5,2017-11-21T18:28:05Z,2020-04-29T03:27:54Z,2017-12-08T16:06:36Z,CONTRIBUTOR,,"`datasette publish` uses hard links to avoid copying the db into a tmp directory. This can fail if `/tmp` is on another device, because hardlinks can't cross devices. You'll see something like this: ``` $ datasette publish heroku whatever.db ... OSError: [Errno 18] Invalid cross-device link: '/mnt/c/Users/jacob/c/datasette/whatever.db' -> '/tmp/tmpvxq2yof6/whatever.db' ``` [In my case this is failing because I'm on a Windows machine, using WSL, so my code's on a different virtual filesystem from the Linux subsystem, Because Reasons.] I'm not sure if it's possible to detect this (can you figure out which device `/tmp` is on?), or what the fallback should be (soft link? 
copy?).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/141/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275917760,MDU6SXNzdWUyNzU5MTc3NjA=,142,Show extra instructions with the interrupted,9599,simonw,closed,0,,,,,3,2017-11-22T01:44:29Z,2018-05-28T21:25:06Z,2018-05-28T21:24:35Z,OWNER,,"When you are using Datasette locally for ad-hoc analysis it can be frustrating to hit the time limit. If you start it with the correct command line arguments you can disable that time limit. So how about we tell you how to do that anytime you hit the interrupted error provided you are accessing it from localhost.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/142/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 275939188,MDU6SXNzdWUyNzU5MzkxODg=,143,"Mechanism for ""suggested visualizations""",9599,simonw,closed,0,,,,,1,2017-11-22T04:10:25Z,2018-07-10T17:48:34Z,2018-07-10T17:48:34Z,OWNER,," Each visualization should have a way of deciding if it might be appropriate for the current view of data. We can then offer a ""suggested visualizations"" prompt which shows previews.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/143/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276091279,MDU6SXNzdWUyNzYwOTEyNzk=,144,apsw as alternative sqlite3 binding (for full text search),649467,mhalle,closed,0,,,,,3,2017-11-22T14:40:39Z,2018-05-28T21:29:42Z,2018-05-28T21:29:42Z,NONE,,"Hey there, Have you considered providing apsw support as an alternative to stock python sqlite3? I use apsw because it keeps up with sqlite3 and is straightforward to bring in extensions like FTS5. FTS really accelerates the kind of searching often done by web clients. I may be able to help (it shouldn't be much code), but there are a couple of stylistic questions that come up when supporting an optional package. Also, apsw is tricky in that it doesn't have a pypi package (author says limitations in providing options to setup.py). 
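For the cross-device failure in #141 above, a sketch of the fallback the issue hints at: try the hard link first and copy when the link cannot cross devices (EXDEV is the errno for that case):

```python
import errno
import os
import shutil

def link_or_copy(src, dst):
    try:
        os.link(src, dst)  # cheap: no data is copied
    except OSError as e:
        if e.errno != errno.EXDEV:
            raise
        shutil.copyfile(src, dst)  # /tmp is on another device - copy instead
```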
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/144/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276192732,MDExOlB1bGxSZXF1ZXN0MTU0MjQ2ODE2,145,Fix pytest version conflict,9599,simonw,closed,0,,,,,0,2017-11-22T20:15:34Z,2017-11-22T20:17:54Z,2017-11-22T20:17:52Z,OWNER,simonw/datasette/pulls/145,"https://travis-ci.org/simonw/datasette/jobs/305929426 pkg_resources.VersionConflict: (pytest 3.2.1 (/home/travis/virtualenv/python3.5.3/lib/python3.5/site-packages), Requirement.parse('pytest==3.2.3'))",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/145/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 276455748,MDU6SXNzdWUyNzY0NTU3NDg=,146,datasette publish gcloud,9599,simonw,closed,0,,,,,2,2017-11-23T18:55:03Z,2019-06-24T06:48:20Z,2019-06-24T06:48:20Z,OWNER,,"See also #103 It looks like you can start a Google Cloud VM with a ""docker container"" option - and the Google Cloud Registry is easy to push containers to. So it would be feasible to have `datasette publish gcloud ...` automatically build a container, push it to GCR, then start a new VM instance with it: https://cloud.google.com/container-registry/docs/pushing-and-pulling ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/146/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276476670,MDU6SXNzdWUyNzY0NzY2NzA=,147,Tidy up design of the header of the table page,9599,simonw,closed,0,,,2919870,Foreign key edition,1,2017-11-23T21:52:58Z,2017-11-24T22:02:46Z,2017-11-24T22:02:46Z,OWNER,,"This is a bit messy: Depends on #127 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/147/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276477888,MDU6SXNzdWUyNzY0Nzc4ODg=,148,Need a != filter,9599,simonw,closed,0,,,2919870,Foreign key edition,0,2017-11-23T22:05:22Z,2017-11-23T22:10:02Z,2017-11-23T22:10:01Z,OWNER,,https://datasette-demos.now.sh/sf-trees-ebc2ad9/Street_Tree_List?qCareAssistant=1 shows trees managed by FUF - but how about trees that are NOT managed by FUF?,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/148/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276704127,MDU6SXNzdWUyNzY3MDQxMjc=,149,Update custom SQL results to match new table view header,9599,simonw,closed,0,,,2919870,Foreign key edition,1,2017-11-24T22:03:59Z,2017-11-24T22:42:10Z,2017-11-24T22:42:09Z,OWNER,,"Follow-on from #147 - the custom SQL results page should more closely match the design of the table view, which now looks like this: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/149/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276704327,MDU6SXNzdWUyNzY3MDQzMjc=,150,_group_count= feature 
improvements,9599,simonw,closed,0,,,,,3,2017-11-24T22:06:18Z,2018-05-28T16:41:28Z,2018-05-28T16:41:28Z,OWNER,,"- [ ] The ""apply filters"" form should keep you on the _group_count= page - [ ] Foreign key references should be expand - [ ] Page title should reflect the view you are on",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/150/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276718605,MDU6SXNzdWUyNzY3MTg2MDU=,151,Set up a pattern portfolio,9599,simonw,closed,0,,,,,2,2017-11-25T02:09:49Z,2020-07-02T00:13:24Z,2020-05-03T03:13:16Z,OWNER,,"https://www.slideshare.net/nataliedowne/practical-maintainable-css/75 This will be a single page that demonstrates all of the different CSS styles and classes available to Datasette.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/151/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276765070,MDU6SXNzdWUyNzY3NjUwNzA=,152,Incorrect display of rows page for tables with a primary key,9599,simonw,closed,0,,,2949431,Custom templates edition,0,2017-11-25T17:29:54Z,2017-12-07T05:23:20Z,2017-12-07T05:23:19Z,OWNER,,"This is a regression. Here's the old version: And here's the new, broken one: https://parlgov-xtxlddmtiz.now.sh/parlgov-25f9855/party_family/1 The JSON output is the same for both - it's only the HTML representation that exhibits the bug.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/152/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276842536,MDU6SXNzdWUyNzY4NDI1MzY=,153,Ability to customize presentation of specific columns in HTML view,20264,ftrain,closed,0,,,2949431,Custom templates edition,14,2017-11-26T17:46:11Z,2017-12-10T02:08:45Z,2017-12-07T06:17:33Z,NONE,,"This ties into https://github.com/simonw/datasette/issues/3 in some ways. It would be great to have some adaptability in the HTML views and to specific some columns as displaying in certain ways. - [x] 1. **Auto-parsing URIs into in-browser links.** Why? Lots of public data around cultural commons stuff links to a specific URL. This would be a great utility to turn on at the command line, just parse everything for URLs. Maybe they need to be underlined or represented in a different way than internal URLs. - [x] 2. **Ability to identify a column as plain/preformatted text.** Why? Was trying to import the Enron emails, the body collapses. Hard to read. These fields also tend to screw up the ability to scan a table view. If you knew it was text the system could set an `overflow` property on the relevant CSS, so you could still scan. - [x] 3. **Ability to identify a column as HTML.** Why? 
I want to spider some stuff and drop sections into SQLite, and just keep them as HTML.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/153/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 276873891,MDU6SXNzdWUyNzY4NzM4OTE=,154,Datasette CSS should include content hash in the URL,9599,simonw,closed,0,,,2949431,Custom templates edition,3,2017-11-27T00:57:36Z,2017-12-09T03:10:23Z,2017-12-09T03:10:22Z,OWNER,,"When I deployed the latest version of datasette to https://fivethirtyeight.datasettes.com/ I noticed I was getting served stale CSS since it had been cached. Including the sha of the contents in its URL should fix that. I can calculate this on server start.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/154/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 277589569,MDU6SXNzdWUyNzc1ODk1Njk=,155,A primary key column that has foreign key restriction associated won't render label column,388154,wsxiaoys,closed,0,,,2949431,Custom templates edition,4,2017-11-29T00:40:02Z,2017-12-07T05:39:53Z,2017-12-07T05:39:53Z,NONE,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/155/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 278189708,MDU6SXNzdWUyNzgxODk3MDg=,156,Document CSS hooks and custom templates,9599,simonw,closed,0,,,2949431,Custom templates edition,1,2017-11-30T16:43:15Z,2017-11-30T17:11:34Z,2017-11-30T17:10:58Z,OWNER,,Documentation currently lives in commit messages on https://github.com/simonw/datasette/commit/8ab3a169d42d096f2c7979c6d3d7746618d30f0b and 3cd06729f457d690603b6060dc552b535517ab09,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/156/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 278190321,MDU6SXNzdWUyNzgxOTAzMjE=,157,"Teach ""datasette publish"" about custom template directories",9599,simonw,closed,0,,,2949431,Custom templates edition,1,2017-11-30T16:44:57Z,2020-01-15T16:05:13Z,2017-12-09T18:28:54Z,OWNER,,"The following command should copy the custom templates into the deployment and ensure `datasette serve` correctly serves them: datasette publish now mydb.db --template-dir=custom-templates/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/157/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 278190981,MDU6SXNzdWUyNzgxOTA5ODE=,158,Ensure default templates are designed to be extended,9599,simonw,closed,0,,,2949431,Custom templates edition,1,2017-11-30T16:46:41Z,2017-12-07T05:41:09Z,2017-12-07T05:41:08Z,OWNER,,"Since custom templates can do `{% extends ""default:table.html"" %}` the default templates should include sensible named `{% block %}` components designed to support common extension patterns. 
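One way the `default:` lookup could be wired up with standard Jinja2 loaders; the directory paths are illustrative:

```python
from jinja2 import ChoiceLoader, Environment, FileSystemLoader, PrefixLoader

env = Environment(loader=ChoiceLoader([
    # Custom templates win when present...
    FileSystemLoader('custom-templates/'),
    # ...but the default: prefix always reaches the originals, so a custom
    # table.html can extend the very template it overrides
    PrefixLoader(
        {'default': FileSystemLoader('datasette/templates/')}, delimiter=':'
    ),
]))
```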
Since we already support `{{ super() }}` we may not have much if anything to add here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/158/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 278191223,MDU6SXNzdWUyNzgxOTEyMjM=,159,Come up with an elegant mechanism for per-row template customization,9599,simonw,closed,0,,,2949431,Custom templates edition,0,2017-11-30T16:47:26Z,2017-12-07T06:12:27Z,2017-12-07T06:12:26Z,OWNER,,It would be nice if customizing the display of an individual row in a custom table template was as simple as possible - refs #153 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/159/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 278208011,MDU6SXNzdWUyNzgyMDgwMTE=,160,Ability to bundle and serve additional static files,9599,simonw,closed,0,,,2949431,Custom templates edition,8,2017-11-30T17:37:51Z,2019-02-02T00:58:20Z,2017-12-09T18:29:11Z,OWNER,,"Since we now have custom templates, we should support including custom static files with them as well. Maybe something like this: datasette mydb.db --template-dir=templates/ --static-dir=static/ This should also be supported by datasette publish - see also #157 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/160/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 278814220,MDU6SXNzdWUyNzg4MTQyMjA=,161,Support WITH query ,388154,wsxiaoys,closed,0,,,,,4,2017-12-03T20:00:40Z,2017-12-08T06:18:12Z,2017-12-04T04:52:41Z,NONE,,"Currently datasette fails with the error message: Statement must begin with SELECT Example query ```sql WITH RECURSIVE cnt(x) AS ( SELECT 1 UNION ALL SELECT x+1 FROM cnt LIMIT 1000000 ) SELECT x FROM cnt; ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/161/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 279199916,MDU6SXNzdWUyNzkxOTk5MTY=,162,Link should not show up in the column selection dropdowns,9599,simonw,closed,0,,,2949431,Custom templates edition,0,2017-12-05T00:19:04Z,2017-12-07T05:05:58Z,2017-12-07T05:05:58Z,OWNER,,"e.g.
on https://san-francisco.datasettes.com/food-trucks-921342f/Applicant ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/162/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 279547886,MDU6SXNzdWUyNzk1NDc4ODY=,163,Document the querystring argument for setting a different time limit,9599,simonw,closed,0,,,,,2,2017-12-05T22:05:08Z,2021-03-23T02:44:33Z,2017-12-06T15:06:57Z,OWNER,,"http://datasette.readthedocs.io/en/latest/sql_queries.html#query-limits Need to explain why this is useful too.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/163/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 280013907,MDU6SXNzdWUyODAwMTM5MDc=,164,datasette skeleton command for kick-starting database and table metadata,9599,simonw,closed,0,,,2949431,Custom templates edition,3,2017-12-07T06:13:28Z,2021-03-23T02:45:12Z,2017-12-07T06:20:45Z,OWNER,,Generates an example `metadata.json` file populated with all of the databases and tables inspected from the specified databases.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/164/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 280014287,MDU6SXNzdWUyODAwMTQyODc=,165,metadata.json support for per-database and per-table information,9599,simonw,closed,0,,,2949431,Custom templates edition,2,2017-12-07T06:15:34Z,2017-12-07T16:48:34Z,2017-12-07T16:47:29Z,OWNER,,"Every database and every table should be able to support the following optional metadata: title description description_html license license_url source source_url If `description_html` is provided it over-rides `description` and will be displayed unescaped.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/165/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 280023225,MDU6SXNzdWUyODAwMjMyMjU=,166,Documentation for metadata.json and datasette skeleton,9599,simonw,closed,0,,,2949431,Custom templates edition,1,2017-12-07T07:02:52Z,2017-12-07T17:20:35Z,2017-12-07T17:20:25Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/166/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 280315352,MDU6SXNzdWUyODAzMTUzNTI=,167,Nasty bug: last column not being correctly displayed,9599,simonw,closed,0,,,2949431,Custom templates edition,6,2017-12-07T23:23:46Z,2017-12-10T01:00:21Z,2017-12-10T01:00:20Z,OWNER,,"e.g. 
https://datasette-bwnojrhmmg.now.sh/dk3-bde9a9a/dk?source__contains=http ![2017-12-07 at 3 22 pm](https://user-images.githubusercontent.com/9599/33743613-7ee97be0-db62-11e7-8e81-9b9ec69d93f0.png) The JSON output shows that the column is there, but is being displayed incorrectly: https://datasette-bwnojrhmmg.now.sh/dk3-bde9a9a/dk.jsono?source__contains=http ![2017-12-07 at 3 23 pm](https://user-images.githubusercontent.com/9599/33743645-9489b302-db62-11e7-898b-72e812e8855d.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/167/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 280662866,MDExOlB1bGxSZXF1ZXN0MTU3MzY1ODEx,168,Upgrade to Sanic 0.7.0,9599,simonw,closed,0,,,,,1,2017-12-09T01:25:08Z,2017-12-09T03:00:34Z,2017-12-09T03:00:34Z,OWNER,simonw/datasette/pulls/168,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/168/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 280744309,MDU6SXNzdWUyODA3NDQzMDk=,169,Release v0.14 with templates and static files features,9599,simonw,closed,0,,,2949431,Custom templates edition,1,2017-12-09T18:52:48Z,2017-12-10T02:04:56Z,2017-12-10T02:04:56Z,OWNER,,"Everything in this milestone https://github.com/simonw/datasette/milestone/6 - plus various other fixes: https://github.com/simonw/datasette/compare/0.13...6bdfcf60760c27e29ff34692d06e62b36aeecc56 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/169/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 280745470,MDU6SXNzdWUyODA3NDU0NzA=,170,Custom template for named canned query,9599,simonw,closed,0,,,2949431,Custom templates edition,3,2017-12-09T19:07:51Z,2017-12-09T21:35:30Z,2017-12-09T21:34:52Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/170/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 280745746,MDU6SXNzdWUyODA3NDU3NDY=,171,HTML comments specifying custom templates for page,9599,simonw,closed,0,,,2949431,Custom templates edition,1,2017-12-09T19:11:13Z,2017-12-09T21:50:50Z,2017-12-09T21:48:03Z,OWNER,," This would make the custom templating system self-documenting, and save people from having to figure out the right template names for customizing specific pages.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/171/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 280896290,MDU6SXNzdWUyODA4OTYyOTA=,172,Show size of .db file next to download link,9599,simonw,closed,0,,,,,1,2017-12-11T05:12:46Z,2019-02-06T05:09:06Z,2019-02-06T05:00:36Z,OWNER,,"Size in bytes should be calculated by datasette inspect. 
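A sketch of the kind of helper the template could use for that display (hypothetical code, not something datasette ships):

```python
def format_bytes(size):
    # Walk up the units until the value fits into three digits.
    for unit in ('bytes', 'KB', 'MB', 'GB', 'TB'):
        if size < 1024 or unit == 'TB':
            return '%d bytes' % size if unit == 'bytes' else '%.1f %s' % (size, unit)
        size = size / 1024

print(format_bytes(3145728))  # -> 3.0 MB
```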
Template should display it in KB or MB or GB",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/172/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 281110295,MDU6SXNzdWUyODExMTAyOTU=,173,I18n and L10n support,50138,janimo,open,0,,,,,2,2017-12-11T17:49:58Z,2021-04-26T12:10:01Z,,NONE,,It would be less geeky and more user friendly if the display strings in the filter menu and possibly other parts could be localized.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/173/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 281197863,MDU6SXNzdWUyODExOTc4NjM=,174,License/Source in footer should inherit from top level,9599,simonw,closed,0,,,,,1,2017-12-11T23:01:35Z,2018-08-11T17:46:51Z,2018-08-11T17:46:51Z,OWNER,,"The footer on https://vice-police-shootings.now.sh/vice-bc7c892/ViceNews_FullOISData does not show license and source information... but that Datasette has that information, it's just defined at the top level: https://vice-police-shootings.now.sh/ The footer for a row/table page should fall back on information for the database, and if there is none for the database it should fall back on the top-level metadata instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/174/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 282971961,MDU6SXNzdWUyODI5NzE5NjE=,175,"Add project topic ""automatic-api""",3179832,dbohdan,closed,0,,,,,1,2017-12-18T18:09:17Z,2017-12-21T18:33:55Z,2017-12-21T18:33:55Z,NONE,,"Hi there! Could you add the ~~tag~~ topic `automatic-api` to your repository? I am [making a list](https://github.com/dbohdan/automatic-api) of all projects that automatically expose APIs to databases. (Your Show HN made me do it. :-) I knew about PostgREST and PostGraphQL, but it took adding Datasette to sell me on the concept.) They will be easier to discover if there is a standard GitHub tag, and `automatic-api` seems as good a candidate as any. Two projects [already use it](https://github.com/topics/automatic-api).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/175/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 285168503,MDU6SXNzdWUyODUxNjg1MDM=,176,Add GraphQL endpoint,173848,yozlet,open,0,,,,,8,2017-12-29T23:21:01Z,2020-04-21T14:16:24Z,,NONE,,Would make it much easier to build React & similar frontends. Maybe with https://github.com/graphql-python/sanic-graphql ?,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/176/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 286938589,MDU6SXNzdWUyODY5Mzg1ODk=,177,Publishing to Heroku - metadata file not uploaded?,82988,psychemedia,closed,0,,,,,0,2018-01-09T01:04:31Z,2018-01-25T16:45:32Z,2018-01-25T16:45:32Z,CONTRIBUTOR,,"Trying to run *datasette* (version 0.14) on Heroku with a `metadata.json` doesn't seem to be picking up the `metadata.json` file? On a Mac with dodgy `tar` support: ``` ▸ Couldn't detect GNU tar. 
Builds could fail due to decompression errors ▸ See ▸ https://devcenter.heroku.com/articles/platform-api-deploying-slugs#create-slug-archive ▸ Please install it, or specify the '--tar' option ▸ Falling back to node's built-in compressor ``` Could that be causing the issue? Also, I'm not seeing custom query links anywhere obvious when I run the metadata file with a local *datasette* server? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/177/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 287240246,MDExOlB1bGxSZXF1ZXN0MTYxOTgyNzEx,178,"If metadata exists, add it to heroku launch command",82988,psychemedia,closed,0,,,,,1,2018-01-09T21:42:21Z,2018-01-15T09:42:46Z,2018-01-14T21:05:16Z,CONTRIBUTOR,simonw/datasette/pulls/178,"The heroku build doesn't seem to make use of any provided `metadata.json` file. Add the `--metadata` switch to the Heroku web launch command if a `metadata.json` file is available. Addresses: https://github.com/simonw/datasette/issues/177",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/178/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 288438570,MDU6SXNzdWUyODg0Mzg1NzA=,179,More metadata options for template authors ,9599,simonw,open,0,,,,,2,2018-01-14T20:51:04Z,2019-05-13T18:33:33Z,,OWNER,,See this thread on Twitter: https://twitter.com/simonw/status/952637152797458432,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/179/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 289375133,MDExOlB1bGxSZXF1ZXN0MTYzNTIzOTc2,180,make html title more readable in query template,56477,ryanpitts,closed,0,,,,,0,2018-01-17T18:56:03Z,2018-04-03T16:03:38Z,2018-04-03T15:24:05Z,CONTRIBUTOR,simonw/datasette/pulls/180,tiny tweak to make this easier to visually parse—I think it matches your style in other templates,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/180/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 289425975,MDExOlB1bGxSZXF1ZXN0MTYzNTYxODMw,181,"add ""format sql"" button to query page, uses sql-formatter",1957344,bsmithgall,closed,0,,,,,7,2018-01-17T21:50:04Z,2019-11-11T03:08:25Z,2019-11-11T03:08:25Z,NONE,simonw/datasette/pulls/181,"Cool project! This fixes #136 using the suggested [sql formatter](https://github.com/zeroturnaround/sql-formatter) library. I included the minified version in the bundle and added the relevant scripts to the codemirror includes instead of adding new files, though I could also add new files.
I wanted to keep it all together, since the result of the format needs access to the editor in order to properly update the codemirror instance.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/181/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 291451116,MDExOlB1bGxSZXF1ZXN0MTY1MDI5ODA3,182,Add db filesize next to download link,3433657,raynae,closed,0,,,,,0,2018-01-25T04:58:56Z,2019-03-22T13:50:57Z,2019-02-06T04:59:38Z,CONTRIBUTOR,simonw/datasette/pulls/182,"Took a stab at #172, will this do the trick?",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/182/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 291639118,MDU6SXNzdWUyOTE2MzkxMTg=,183,Custom Queries - escaping strings,82988,psychemedia,closed,0,,,,,2,2018-01-25T16:49:13Z,2019-06-24T06:45:07Z,2019-06-24T06:45:07Z,CONTRIBUTOR,,"If a SQLite table column name contains spaces, they are usually referred to in double quotes: `SELECT * FROM mytable WHERE ""gappy column name""=""my value"";` In the JSON metadata file, this is passed by escaping the double quotes: `""queries"": {""my query"": ""SELECT * FROM mytable WHERE \""gappy column name\""=\""my value\"";""}` When specifying a custom query in `metadata.json` using double quotes, these are then rendered in the *datasette* query box using single quotes: `SELECT * FROM mytable WHERE 'gappy column name'='my value';` which does not work. Alternatively, a valid custom query can be passed using backticks (\`) to quote the column name and single (unescaped) quotes for the matched value: ``""queries"": {""my query"": ""SELECT * FROM mytable WHERE `gappy column name`='my value';""}`` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/183/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 292011379,MDU6SXNzdWUyOTIwMTEzNzk=,184,500 from missing table name,222245,carlmjohnson,closed,0,,,,,4,2018-01-26T19:46:45Z,2019-05-21T16:17:29Z,2018-04-13T18:18:59Z,NONE,,"https://github.com/simonw/datasette/blob/56623e48da5412b25fb39cc26b9c743b684dd968/datasette/app.py#L517-L519 throws an error if it gets an empty list back. Simplest solution is to write a helper func that just says ```python result = list(await self.execute(name, sql, params)) if result: return result[0][0] ``` and use it anywhere `[0][0]` is now.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/184/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 299760684,MDU6SXNzdWUyOTk3NjA2ODQ=,185,Metadata should be a nested arbitrary KV store,222245,carlmjohnson,open,0,,,,,12,2018-02-23T16:02:07Z,2019-05-13T18:33:33Z,,NONE,,"I started using the metadata feature and was surprised to find that values are not inherited from the root object down to specific databases and tables. This makes metadata much less useful and requires a lot of pointless duplication. Ideally, metadata should allow arbitrary key-value pairs, and there should be a way of accessing metadata either in an inherited or non-inherited manner. Something like `metadata.page.key` vs.
`metadata.this.key` might work as an interface.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/185/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 306811513,MDU6SXNzdWUzMDY4MTE1MTM=,186,Proposal: new option to disable user agent caching,47107,stefanocudini,closed,0,,,,,3,2018-03-20T10:42:20Z,2018-03-21T09:07:22Z,2018-03-21T01:28:31Z,NONE,,"I think it would be very useful for debugging to have an option to add headers to HTTP replies ``` Cache-Control: no-cache ``` especially in the HTML output",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/186/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 309033998,MDU6SXNzdWUzMDkwMzM5OTg=,187,Windows installation error,11855322,robmarkcole,closed,0,,,,,7,2018-03-27T16:04:37Z,2019-06-15T21:44:23Z,2019-06-15T21:44:23Z,NONE,,"On attempting install on a Win 7 PC with py 3.6.2 (Anaconda dist) I get the error: ``` Collecting uvloop>=0.5.3 (from Sanic==0.7.0->datasette) Downloading uvloop-0.9.1.tar.gz (1.8MB) 100% |████████████████████████████████| 1.8MB 12.8MB/s Complete output from command python setup.py egg_info: Traceback (most recent call last): File """", line 1, in File ""C:\Users\RCole\AppData\Local\Temp\pip-build-juakfqt8\uvloop\setup.py "", line 10, in raise RuntimeError('uvloop does not support Windows at the moment') RuntimeError: uvloop does not support Windows at the moment ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/187/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 309047460,MDU6SXNzdWUzMDkwNDc0NjA=,188,Ability to bundle metadata and templates inside the SQLite file,9599,simonw,open,0,,,,,4,2018-03-27T16:42:07Z,2020-12-04T17:18:34Z,,OWNER,,"One of the nicest qualities of SQLite as a data format is that you get a single file which you can then back up or share with other people. Datasette breaks this a little once you start including custom metadata.json or template files and CSS. It would be cool if there was an optional mechanism for baking that extra configuration into the SQLite file itself. That way entire datasette mini-applications (including canned queries and custom HTML and CSS) could be constructed as single .db files. Since datasette configuration is all file-based, one way to achieve that would be to support a ""datasette_files"" table which, if present, is used to search for file contents by path. This is in line with the philosophy described by https://www.sqlite.org/appfileformat.html ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/188/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 309471814,MDU6SXNzdWUzMDk0NzE4MTQ=,189,Ability to sort (and paginate) by column,9599,simonw,closed,0,9599,simonw,,,31,2018-03-28T18:04:51Z,2018-04-15T18:54:22Z,2018-04-09T05:16:02Z,OWNER,,"As requested in https://github.com/simonw/datasette/issues/185#issuecomment-376614973 I've previously avoided this for performance reasons: sort-by-column on a column without an index is likely to perform badly for hundreds of thousands of rows.
That's not a good enough reason to avoid the feature entirely though. A few options: * Allow sort-by-column by default, give users the option to disable it for specific tables/columns * Disallow sort-by-column by default, give users option (probably in `metadata.json`) to enable it for specific tables/columns * Automatically detect if a column either has an index on it OR a table has less than X rows in it We already have the mechanism in place to cut off SQL queries that take more than X seconds, so if someone DOES try to sort by a column that's too expensive it won't actually hurt anything - but it would be nice to not show people a ""sort"" option which is guaranteed to throw a timeout error. The vast majority of datasette usage that I've seen so far is on smaller datasets where the performance penalties of sort-by-column are extremely unlikely to show up. ---- Still left to do: - [x] UI that shows which sort order is currently being applied (in HTML and in JSON) - [x] UI for applying a sort order (with rel=nofollow to avoid Google crawling it) - [x] Sort column names should be escaped correctly in generated SQL - [x] Validation that the selected sort order is a valid column - [x] Throw error if user attempts to apply _sort AND _sort_desc at the same time - [x] Ability to disable sorting (or sort only for specific columns) in metadata.json - [x] Fix ""201 rows where sorted by sortable_with_nulls "" bug ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/189/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 309558826,MDU6SXNzdWUzMDk1NTg4MjY=,190,Keyset pagination doesn't work correctly for compound primary keys,9599,simonw,closed,0,,,,,7,2018-03-28T22:45:06Z,2018-03-30T06:31:15Z,2018-03-30T06:26:28Z,OWNER,,"Consider https://datasette-issue-190-compound-pks.now.sh/compound-pks-9aafe8f/compound_primary_key ![2018-03-28 at 3 47 pm](https://user-images.githubusercontent.com/9599/38060388-56da86dc-329f-11e8-9f20-5576153ad55c.png) The next= link is to `d,v`: https://datasette-issue-190-compound-pks.now.sh/compound-pks-9aafe8f/compound_primary_key?_next=d%2Cv But that page starts with: ![2018-03-28 at 3 48 pm](https://user-images.githubusercontent.com/9599/38060402-6b0f5984-329f-11e8-85b8-44a666c4ee71.png) The next key in the sequence should be `d,w`. Also we should return the full a-z of the ones that start with the letter e - in this example we only return `e-w`, `e-x`, `e-y` and `e-z`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/190/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 310533258,MDU6SXNzdWUzMTA1MzMyNTg=,191,Figure out how to bundle a more up-to-date SQLite,9599,simonw,closed,0,,,,,6,2018-04-02T16:33:25Z,2018-07-10T17:46:13Z,2018-07-10T17:46:13Z,OWNER,,"The version of SQLite that ships with Python 3 is a bit limited - it doesn't support row values for example https://www.sqlite.org/rowvalue.html Figure out how to bundle a more recent SQLite engine with datasette. We need to figure out two cases: * Bundling a recent version in a Dockerfile build. I expect this to be quite easy. * Making a more recent version available to people hacking around in Mac OS X. I have no idea how to start on this. 
I want it working on Mac OS X too because I don't want to force Docker as a dependency for anyone who just want to hack around with Datasette a little and run the test suite.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/191/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 310850458,MDExOlB1bGxSZXF1ZXN0MTc5MTA4OTYx,192,New ?_shape=objects/object/lists param for JSON API,9599,simonw,closed,0,,,,,0,2018-04-03T14:02:58Z,2018-04-03T14:53:00Z,2018-04-03T14:52:55Z,OWNER,simonw/datasette/pulls/192,Refs #122,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/192/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 310882100,MDU6SXNzdWUzMTA4ODIxMDA=,193,Cleaner mechanism for handling custom errors,9599,simonw,closed,0,,,,,3,2018-04-03T15:19:13Z,2018-04-13T18:18:59Z,2018-04-13T18:18:59Z,OWNER,,"This code is pretty messy: https://github.com/simonw/datasette/blob/0abd3abacb309a2bd5913a7a2df4e9256585b1bb/datasette/app.py#L245-L265 Instead, it would be nice if I could raise an exception that would be converted into the appropriate JSON or HTML error message, with a corresponding HTTP code.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/193/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 312312125,MDU6SXNzdWUzMTIzMTIxMjU=,194,Rename table_rows and filtered_table_rows to have _count suffix,9599,simonw,closed,0,,,,,2,2018-04-08T14:53:37Z,2018-04-09T05:25:22Z,2018-04-09T05:25:22Z,OWNER,,"These fields represent counts of items: ""table_rows"": 131, ""filtered_table_rows"": 8, But the names make it sound like they might be arrays full of rows. Adding a `_count` suffix would make this more clear: ""table_rows_count"": 131, ""filtered_table_rows_count"": 8, ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/194/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 312313496,MDU6SXNzdWUzMTIzMTM0OTY=,195,"Run pks_for_table in inspect, executing once at build time rather than constantly",9599,simonw,closed,0,,,,,3,2018-04-08T15:12:40Z,2018-04-10T00:54:43Z,2018-04-10T00:54:43Z,OWNER,,"Right now several Datasette views call the `await self.pks_for_table(...)` method to figure out what primary keys are set for a specific table. This executes a `PRAGMA table_info` SQL query. 
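For reference, here is roughly what that lookup boils down to in plain sqlite3 (a sketch, not the actual datasette implementation):

```python
import sqlite3

def pks_for_table(conn, table):
    # PRAGMA table_info returns (cid, name, type, notnull, dflt_value, pk);
    # pk is 0 for ordinary columns, or the 1-based position within the primary key.
    rows = conn.execute('PRAGMA table_info([{}])'.format(table)).fetchall()
    pk_rows = sorted((row for row in rows if row[5]), key=lambda row: row[5])
    return [row[1] for row in pk_rows]
```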
It would be faster and more efficient to execute this query for each table as part of the `inspect()` method.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/195/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 312355154,MDExOlB1bGxSZXF1ZXN0MTgwMTg4Mzk3,196,_sort= and _sort_desc= parameters to table view,9599,simonw,closed,0,,,,,0,2018-04-09T00:07:21Z,2018-04-09T05:10:29Z,2018-04-09T05:10:23Z,OWNER,simonw/datasette/pulls/196,See #189 ,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/196/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 312395790,MDU6SXNzdWUzMTIzOTU3OTA=,197,Ability to sort by more than one column,9599,simonw,open,0,,,,,0,2018-04-09T05:13:30Z,2018-07-10T17:45:37Z,,OWNER,,"Split off from #189. I'd like to support ""sort by X descending, then by Y ascending if there are dupes for X"" as well. Suggested syntax for that: ?_sort_desc=X&_sort=Y we currently only allow one argument to be sent. We should allow as many arguments as there are columns, for example: ?_sort=department&_sort_desc=precinct&_sort=age&_sort_desc=size",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/197/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 312396095,MDU6SXNzdWUzMTIzOTYwOTU=,198,Ability to sort with nulls last,9599,simonw,open,0,,,,,0,2018-04-09T05:15:40Z,2018-07-10T17:45:37Z,,OWNER,,"Split off from #189 Here's how to do that in SQL: https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=select+rowid%2C+*+from+%5Bnfl-wide-receivers%2Fadvanced-historical%5D%0D%0Aorder+by+case+when+career_ranypa+is+null+then+1+else+0+end%2C+career_ranypa%2C+rowid order by case when career_ranypa is null then 1 else 0 end, career_ranypa",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/198/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 312620566,MDU6SXNzdWUzMTI2MjA1NjY=,199,Ability to apply sort on mobile in portrait mode,9599,simonw,closed,0,,,,,4,2018-04-09T17:35:04Z,2018-04-10T00:37:53Z,2018-04-10T00:34:38Z,OWNER,,"Missed this in #189... on mobile in portrait mode we hide the column headers, which means you can't click them to sort! You can sort in landscape mode at least. 
Need to come up with an alternative sort UI for portrait on mobile.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/199/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 313494458,MDExOlB1bGxSZXF1ZXN0MTgxMDMzMDI0,200,Hide Spatialite system tables,45057,russss,closed,0,,,,,3,2018-04-11T21:26:58Z,2018-04-12T21:34:48Z,2018-04-12T21:34:48Z,CONTRIBUTOR,simonw/datasette/pulls/200,They were getting on my nerves.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/200/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 313512748,MDU6SXNzdWUzMTM1MTI3NDg=,201,Support explain select / explain query plan select,9599,simonw,closed,0,,,,,1,2018-04-11T22:41:26Z,2018-04-13T21:17:14Z,2018-04-12T21:32:52Z,OWNER,,See https://www.sqlite.org/eqp.html and https://www.sqlite.org/lang_explain.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/201/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 313785206,MDExOlB1bGxSZXF1ZXN0MTgxMjQ3NTY4,202,Raise 404 on nonexistent table URLs,45057,russss,closed,0,,,,,2,2018-04-12T15:47:06Z,2018-04-13T19:22:56Z,2018-04-13T18:19:15Z,CONTRIBUTOR,simonw/datasette/pulls/202,"Currently they just 500. Also cleaned the logic up a bit, I hope I didn't miss anything. This is issue #184.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/202/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 313837303,MDU6SXNzdWUzMTM4MzczMDM=,203,Support for units,45057,russss,closed,0,,,,,10,2018-04-12T18:24:28Z,2018-04-16T21:59:17Z,2018-04-16T21:59:17Z,CONTRIBUTOR,,"It would be nice to be able to attach a unit to a column in the metadata, and have it rendered with that unit (and SI prefix) when it's displayed. It would also be nice to support entering the prefixes in variables when querying. With my radio licensing app I've put all frequencies in Hz. It's easy enough to special-case the row rendering to add the SI prefixes, but it's pretty unusable when querying by that field.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/203/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314256802,MDExOlB1bGxSZXF1ZXN0MTgxNjAwOTI2,204,Initial units support,45057,russss,closed,0,,,,,0,2018-04-13T21:32:49Z,2018-04-14T09:44:33Z,2018-04-14T03:32:54Z,CONTRIBUTOR,simonw/datasette/pulls/204,"Add support for specifying units for a column in metadata.json and rendering them on display using [pint](https://pint.readthedocs.io/en/latest/). Example table metadata: ```json ""license_frequency"": { ""units"": { ""frequency"": ""Hz"", ""channel_width"": ""Hz"", ""height"": ""m"", ""antenna_height"": ""m"", ""azimuth"": ""degrees"" } } ``` [Example result](https://wtr-api.herokuapp.com/wtr-663ea99/license_frequency/1) This works surprisingly well! I'd like to add support for using units when querying but this PR is pretty usable as-is.
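For anyone unfamiliar with pint, this is roughly the rendering trick involved (`to_compact()` picks a sensible SI prefix):

```python
import pint

ureg = pint.UnitRegistry()
value = 145800000 * ureg.Hz
# '~P' formats with abbreviated, pretty unit names.
print('{:~P}'.format(value.to_compact()))  # -> 145.8 MHz
```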
(Pint doesn't seem to support decibels though - it thinks they're decibytes - which is an annoying omission.) (ref ticket #203)",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/204/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314319372,MDExOlB1bGxSZXF1ZXN0MTgxNjQyMTE0,205,Support filtering with units and more,45057,russss,closed,0,,,,,3,2018-04-14T10:47:51Z,2018-04-14T15:24:04Z,2018-04-14T15:24:04Z,CONTRIBUTOR,simonw/datasette/pulls/205,"The first commit: * Adds units to exported JSON * Adds units key to metadata skeleton * Adds some docs for units The second commit adds filtering by units by the first method I mentioned in #203: ![image](https://user-images.githubusercontent.com/45057/38767463-7193be16-3fd9-11e8-8a5f-ac4159415c6d.png) [Try it here](https://wtr-api.herokuapp.com/wtr-663ea99/license_frequency?frequency__gt=50GHz&height__lt=50ft). I think it integrates pretty neatly. The third commit adds support for registering custom units with Pint from metadata.json. Probably pretty niche, but I need decibels!",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/205/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314323977,MDExOlB1bGxSZXF1ZXN0MTgxNjQ0ODA1,206,Fix sqlite error when loading rows with no incoming FKs,45057,russss,closed,0,,,,,0,2018-04-14T12:08:17Z,2018-04-14T14:32:42Z,2018-04-14T14:24:25Z,CONTRIBUTOR,simonw/datasette/pulls/206,"This fixes `ERROR: conn=, sql = 'select ', params = {'id': '1'}` caused by an invalid query loading incoming FKs when none exist. The error was ignored due to async but it still got printed to the console.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/206/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314329002,MDExOlB1bGxSZXF1ZXN0MTgxNjQ3NzE3,207,Link foreign keys which don't have labels,45057,russss,closed,0,,,,,1,2018-04-14T13:27:14Z,2018-04-14T15:00:00Z,2018-04-14T15:00:00Z,CONTRIBUTOR,simonw/datasette/pulls/207,"This renders unlabeled FKs as simple links. I can't see why this would cause any major problems. ![image](https://user-images.githubusercontent.com/45057/38768722-ea15a000-3fef-11e8-8664-ffd7aa4894ea.png) Also includes bonus fixes for two minor issues: * In foreign key link hrefs the primary key was escaped using HTML escaping rather than URL escaping. This broke some non-integer PKs. * Print tracebacks to console when handling 500 errors.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/207/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314340944,MDExOlB1bGxSZXF1ZXN0MTgxNjU0ODM5,208,Return HTTP 405 on InvalidUsage rather than 500,45057,russss,closed,0,,,,,0,2018-04-14T16:12:50Z,2018-04-14T18:00:39Z,2018-04-14T18:00:39Z,CONTRIBUTOR,simonw/datasette/pulls/208,"This also stops it filling up the logs. 
This happens for HEAD requests at the moment - which perhaps should be handled better, but that's a different issue.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/208/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314455877,MDExOlB1bGxSZXF1ZXN0MTgxNzIzMzAz,209, Don't duplicate simple primary keys in the link column,45057,russss,closed,0,,,,,6,2018-04-15T21:56:15Z,2018-04-18T08:40:37Z,2018-04-18T01:13:04Z,CONTRIBUTOR,simonw/datasette/pulls/209,"When there's a simple (single-column) primary key, it looks weird to duplicate it in the link column. This change removes the second PK column and treats the link column as if it were the PK column from a header/sorting perspective. This might make it a bit more difficult to tell what the link for the row is, I'm not sure yet. I feel like the alternative is to change the link column to just have the text ""view"" or something, instead of repeating the PK. (I doubt it makes much more sense with compound PKs.) Bonus change in this PR: fix urlencoding of links in the displayed HTML. Before: ![image](https://user-images.githubusercontent.com/45057/38783830-e2ababb4-40ff-11e8-97fb-25e286a8c920.png) After: ![image](https://user-images.githubusercontent.com/45057/38783835-ebf6b48e-40ff-11e8-8c47-6a864cf21ccc.png)",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/209/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314469126,MDExOlB1bGxSZXF1ZXN0MTgxNzMxOTU2,210,"Start of the plugin system, based on pluggy",9599,simonw,closed,0,,,,,0,2018-04-16T00:51:30Z,2018-04-16T00:56:16Z,2018-04-16T00:56:16Z,OWNER,simonw/datasette/pulls/210,Refs #14,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/210/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314471743,MDU6SXNzdWUzMTQ0NzE3NDM=,211,Load plugins from a `--plugins-dir=plugins/` directory,9599,simonw,closed,0,,,,,6,2018-04-16T01:17:43Z,2018-04-16T05:22:02Z,2018-04-16T05:22:02Z,OWNER,,"In #14 and 33c7c53ff87c2 I've added working support for setuptools entry_points plugins. These can be installed from PyPI using `pip install ...`. I imagine some projects will benefit from being able to add plugins without first publishing them to PyPI. Datasette already supports [loading custom templates](http://datasette.readthedocs.io/en/latest/custom_templates.html#custom-templates) like so: datasette serve --template-dir=mytemplates/ mydb.db I propose an additional option, `--plugins-dir=` which specifies a directory full of `blah.py` files which will be loaded into Datasette when the application server starts. 
datasette serve --plugins-dir=myplugins/ mydb.db This will also need to be supported by `datasette publish` as those Python files should be copied up as part of the deployment.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/211/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314504812,MDExOlB1bGxSZXF1ZXN0MTgxNzU1MjIw,212,New --plugins-dir=plugins/ option,9599,simonw,closed,0,,,,,0,2018-04-16T05:19:28Z,2018-04-16T05:22:18Z,2018-04-16T05:22:01Z,OWNER,simonw/datasette/pulls/212,Refs #211,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/212/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 314506033,MDU6SXNzdWUzMTQ1MDYwMzM=,213,Documentation for plugins system,9599,simonw,closed,0,,,,,0,2018-04-16T05:27:07Z,2018-04-16T15:12:48Z,2018-04-16T15:12:48Z,OWNER,,"Documentation for #14 - how to write plugins, how to ship plugins to PyPI and how to use the `--plugins-dir` option added in #211 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/213/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314506446,MDU6SXNzdWUzMTQ1MDY0NDY=,214,Ability for plugins to define extra JavaScript and CSS,9599,simonw,closed,0,,,,,6,2018-04-16T05:29:34Z,2020-09-30T20:36:11Z,2018-04-18T03:13:03Z,OWNER,,"This can hook in to the existing `extra_css_urls` and `extra_js_urls` mechanism: https://github.com/simonw/datasette/blob/b2955d9065ea019500c7d072bcd9d49d1967f051/datasette/app.py#L304-L305 The plugins should be able to bundle their own assets though, so it will also have to integrate with the `/static/` static mounts mechanism somehow: https://github.com/simonw/datasette/blob/b2955d9065ea019500c7d072bcd9d49d1967f051/datasette/app.py#L1255-L1257 Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/214/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314506669,MDU6SXNzdWUzMTQ1MDY2Njk=,215,Allow plugins to define additional URL routes and views,9599,simonw,closed,0,,,5512395,Datasette 0.44,14,2018-04-16T05:31:09Z,2020-06-09T03:14:32Z,2020-06-09T03:12:08Z,OWNER,,"Might be as simple as having plugins get passed the `app` after the other routes have been defined: https://github.com/simonw/datasette/blob/b2955d9065ea019500c7d072bcd9d49d1967f051/datasette/app.py#L1270-L1274 Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/215/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314665147,MDU6SXNzdWUzMTQ2NjUxNDc=,216,Bug: Sort by column with NULL in next_page URL,222245,carlmjohnson,closed,0,,,,,15,2018-04-16T14:03:18Z,2018-04-17T01:45:24Z,2018-04-17T01:45:24Z,NONE,,"Copy-pasting from https://github.com/simonw/datasette/issues/189#issuecomment-381429213, since that issue is closed: I think I found a bug. I tried to sort by middle initial in my salaries set, and many middle initials are null. 
The `next_url` gets set by Datasette to: http://localhost:8001/salaries-d3a5631/2017+Maryland+state+salaries?_next=None%2C391&_sort=middle_initial But then None is interpreted literally and it tries to find a name with the middle initial ""None"" and ends up skipping ahead to O on page 2. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/216/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314725342,MDU6SXNzdWUzMTQ3MjUzNDI=,217,Plugin support for datasette publish,9599,simonw,closed,0,,,,,1,2018-04-16T16:17:14Z,2018-07-26T05:33:39Z,2018-07-26T05:16:00Z,OWNER,,"It should be possible to support additional deployment options by writing a plugin (see #59). As part of this, rewrite the Heroku and Now publishers to be implemented as plugins (they will still ship with datasette by default). Maybe `datasette package` should be changed to being part of publish instead, `datasette publish docker` perhaps? Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/217/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 314771615,MDU6SXNzdWUzMTQ3NzE2MTU=,218,"Support custom unit display in order to handle ""$10,000""",9599,simonw,open,0,,,,,0,2018-04-16T18:39:31Z,2018-07-10T17:45:38Z,,OWNER,,"I tried to get Datasette to display `$10,000` using the new units support but we currently only display units as a suffix: https://github.com/simonw/datasette/blob/10a34f995c70daa37a8a2aa02c3135a4b023a24c/datasette/app.py#L563-L572 It would be neat if there was a mechanism for specifying a custom unit display - maybe something like this: ``` { ""custom_units"": { ""us_dollar"": { ""unit"": ""us_dollar = [] = $"", ""format"": ""${:,}"" } } } ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/218/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 314834783,MDU6SXNzdWUzMTQ4MzQ3ODM=,219,Expose units in the JSON API?,45057,russss,open,0,,,,,0,2018-04-16T22:04:25Z,2018-04-16T22:04:25Z,,CONTRIBUTOR,,"From #203: it would be nice for the JSON API to (optionally) return columns rendered with units in them - if, for example, you're consuming the JSON to render the rows on a map. I'm not entirely sure how useful this will be though - at the moment my map queries are custom SQL queries (a few have joins in, the rest might be fetching large amounts of data so it makes sense to limit columns fetched). Perhaps the SQL function is a better approach in general.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/219/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 314847571,MDU6SXNzdWUzMTQ4NDc1NzE=,220,Investigate syntactic sugar for plugins,9599,simonw,closed,0,,,,,2,2018-04-16T23:01:39Z,2020-06-11T21:50:06Z,2020-06-11T21:49:55Z,OWNER,,"Suggested by @andrewhayward on Twitter: https://twitter.com/arhayward/status/986015118965268480?s=21 > Have you considered a basic abstraction on top of that, for standard hook features? 
``` @sql_function random_integer(a,b): return random.randint(a,b) @template_filter uppercase(str): return str.upper() ``` Maybe `from datasette.plugins import template_filter`? Would have to work out how to get this to play well with pluggy",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/220/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 315142414,MDU6SXNzdWUzMTUxNDI0MTQ=,221,Allow plugins to add new cli sub commands ,9599,simonw,closed,0,,,,,3,2018-04-17T16:40:13Z,2021-01-04T20:12:14Z,2021-01-04T20:12:14Z,OWNER,,I could then test this out by having https://github.com/simonw/csvs-to-sqlite register itself as a plugin,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/221/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 315316214,MDExOlB1bGxSZXF1ZXN0MTgyMzU3NjEz,222,Fix for plugins in Python 3.5,9599,simonw,closed,0,,,,,0,2018-04-18T03:21:01Z,2018-04-18T04:26:50Z,2018-04-18T03:24:21Z,OWNER,simonw/datasette/pulls/222,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/222/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 315327860,MDU6SXNzdWUzMTUzMjc4NjA=,223,datasette publish --install=name-of-plugin,9599,simonw,closed,0,,,,,3,2018-04-18T04:33:59Z,2018-04-18T14:56:17Z,2018-04-18T14:56:17Z,OWNER,,Mechanism for causing datasette publish and datasette package to install one or more additional plugins using `pip install` - refs #14 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/223/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 315517578,MDU6SXNzdWUzMTU1MTc1Nzg=,224,Ability for plugins to bundle templates,9599,simonw,closed,0,,,,,1,2018-04-18T14:57:53Z,2018-04-19T05:50:36Z,2018-04-19T05:50:36Z,OWNER,,"Plugins should be able to bundle templates. The Datasette template loader should then consult those plugins first when loading a template. Jinja2 has a `PackageLoader` class that can help with this: http://jinja.pocoo.org/docs/2.10/api/#jinja2.PackageLoader Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/224/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 315548495,MDU6SXNzdWUzMTU1NDg0OTU=,225,/-/(inspect|metadata|plugins)(.json)? 
introspection,9599,simonw,closed,0,,,,,0,2018-04-18T16:14:58Z,2018-04-19T05:25:33Z,2018-04-19T05:25:33Z,OWNER,,"3 pages (and accompanying .json endpoints) for viewing: * the metadata.json that datasette was loaded with * the output of ds.inspect() * a list of installed plugins, detected by pluggy",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/225/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 315738696,MDU6SXNzdWUzMTU3Mzg2OTY=,226,Unit tests for installable plugins,9599,simonw,closed,0,,,,,2,2018-04-19T06:05:32Z,2020-11-24T19:52:51Z,2020-11-24T19:52:46Z,OWNER,,"I'd like more thorough unit test coverage of the plugins mechanism - in particular for installable plugins. I think I can do this while still having the code live in the same repo, by creating a subdirectory in tests/example_plugin with its own setup.py and then running `python setup.py install` as part of the test runner. I imagine I will need to bump the version number every time I change the plugin in case someone runs the test again in the same virtual environment. If that doesn't work I can instead ship a datasette-plugins-tests package to PyPI and add that as a tests_require dependency. Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/226/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 315960272,MDU6SXNzdWUzMTU5NjAyNzI=,227,prepare_context() plugin hook,9599,simonw,closed,0,,,,,8,2018-04-19T16:55:26Z,2020-03-24T22:19:54Z,2020-03-24T22:19:54Z,OWNER,,This would be called with the context dictionary before each template is rendered. It would have the opportunity to modify that context.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/227/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316031566,MDU6SXNzdWUzMTYwMzE1NjY=,228,"If spatialite detected, mark idx_XXX_Geometry tables as hidden",9599,simonw,closed,0,,,,,1,2018-04-19T20:37:24Z,2018-04-26T03:25:39Z,2018-04-26T03:25:39Z,OWNER,,"https://timezones-api.now.sh/timezones-faf26d0 ![2018-04-19 at 1 36 pm](https://user-images.githubusercontent.com/9599/39016906-a5acbb3e-43d6-11e8-9a31-814ff1d0022e.png) Need to update this logic: https://github.com/simonw/datasette/blob/e2750c7cc0585adaa8c866be611089e62961ee35/datasette/app.py#L1276-L1288",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/228/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316123256,MDU6SXNzdWUzMTYxMjMyNTY=,229,Table view should support ?_size=400 parameter,9599,simonw,closed,0,,,,,1,2018-04-20T04:23:18Z,2018-04-26T04:49:46Z,2018-04-26T04:48:32Z,OWNER,,Allows callers to request more rows at once.
The limit will still be `max_returned_rows` (defaults to 1000).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/229/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316128955,MDU6SXNzdWUzMTYxMjg5NTU=,230,Setting page size AND max returned rows to 1000 doesn't seem to work,9599,simonw,closed,0,,,,,1,2018-04-20T05:05:11Z,2018-04-26T04:04:25Z,2018-04-26T04:04:25Z,OWNER,,"It appears that if the two settings are the same Datasette fails to return any results, probably because of the trick where we try to fetch 1001 rows so we know if there's a next page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/230/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316323336,MDU6SXNzdWUzMTYzMjMzMzY=,231,metadata.json support for plugin configuration options,9599,simonw,closed,0,,,,,4,2018-04-20T15:58:47Z,2019-05-13T18:56:21Z,2019-05-13T18:56:21Z,OWNER,,"My [datasette-cluster-map](https://github.com/simonw/datasette-cluster-map) plugin currently works by detecting `latitude` and `longitude` columns. I'd like to be able to configure it to look for different column names. One way to do this could be to support optional plugin configuration as part of `metadata.json`. Something like this: { ""title"": ""Polar Bear Ear Tags, 2009-2011"", ""source"": ""USGS Alaska Science Center, Polar Bear Research Program"", ""source_url"": ""https://alaska.usgs.gov/products/data.php?dataid=130"", ""plugins"": { ""datasette_cluster_map"": { ""latitude_columns"": [ ""latitude"", ""Capture Latitude"" ], ""longitude_columns"": [ ""longitude"", ""Capture Longitude"" ] } } } These settings should be supported at the root level or at the individual database or table level. They could also be exposed in the https://datasette-cluster-map-demo.now.sh/-/plugins debug tool. 
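A sketch of how a plugin might resolve its own settings from that structure, with table-level config overriding database-level, which in turn overrides the root (illustrative only, not a committed API):

```python
def plugin_config(metadata, plugin, database=None, table=None):
    # Start with root-level config, then layer on the more specific levels.
    config = dict(metadata.get('plugins', {}).get(plugin, {}))
    db = metadata.get('databases', {}).get(database, {}) if database else {}
    config.update(db.get('plugins', {}).get(plugin, {}))
    tbl = db.get('tables', {}).get(table, {}) if table else {}
    config.update(tbl.get('plugins', {}).get(plugin, {}))
    return config
```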
Refs #14",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/231/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316365426,MDExOlB1bGxSZXF1ZXN0MTgzMTM1NjA0,232,Fix a typo,45281,lsb,closed,0,,,,,1,2018-04-20T18:20:04Z,2018-04-21T00:19:08Z,2018-04-21T00:19:08Z,CONTRIBUTOR,simonw/datasette/pulls/232,It looks like this was the only instance of it: https://github.com/simonw/datasette/search?utf8=%E2%9C%93&q=SOLite&type=,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/232/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 316444720,MDU6SXNzdWUzMTY0NDQ3MjA=,233,Option to expose expanded foreign keys in JSON/CSV,9599,simonw,closed,0,,,,,11,2018-04-21T00:18:25Z,2018-06-16T22:26:21Z,2018-06-16T22:20:14Z,OWNER,,"https://datasette-cluster-map-demo.datasettes.com/sf-trees-02c8ef1/Street_Tree_List?qCareAssistant=1 ![f36b87c0-478e-4d55-9a5f-ad37df0b47cb](https://user-images.githubusercontent.com/9599/39078411-bb3e4f88-44be-11e8-9d0c-d22324793c77.png) It would be nice if the info bubbles there could expose more than just the IDs, and if the title showed the expanded name of the selected qCareAssistant.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/233/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316526433,MDU6SXNzdWUzMTY1MjY0MzM=,234,label_column option in metadata.json,9599,simonw,closed,0,,,,,3,2018-04-21T21:19:08Z,2018-04-22T20:47:12Z,2018-04-22T20:47:12Z,OWNER,,"Currently the column used for displaying a foreign key relationship is automatically detected by `inspect()` by looking for tables that have a primary key column and one other column. This doesn't work for tables with more than two columns. Let's allow the table section in `metadata.json` to optionally define a `label_column` which, if present, will be used for those displays.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/234/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 316621102,MDU6SXNzdWUzMTY2MjExMDI=,235,Add limit on the size in KB of data returned from a single query,9599,simonw,open,0,,,,,2,2018-04-22T23:01:15Z,2018-04-24T00:30:02Z,,OWNER,,"Datasette limits the number of rows returned to 1,000 and limits the time spent executing a SQL query to 1000ms - and both of these limits can be customized. It does not have a limit on the size of the response returned. It's possible to compose maliciously large SQL responses in a small number of rows using mechanisms like the `group_concat()` aggregate function. It would be good to avoid malicious SQL creating 100MB+ responses and potentially crashing the server. I think the easiest place to implement that is here: https://github.com/simonw/datasette/blob/f3f42957128c1e7ece584d45d9167f2ac003a3b8/datasette/app.py#L175-L190 Currently we use `cursor.fetchmany()` to fetch up to 1,001 rows at once. 
Instead, we could switch to iterating through `cursor.fetchone()` (or just using `for row in cursor`) and keeping a running tally of the size of the response as we go - maybe just using `rough_response_size += len(str(row))`. If that goes above a certain threshold we can terminate the response with an error, like we do with time limits. The bigger challenge here is understanding how well this approach works and what impact it will have on overall Datasette performance. I think I need #33 for this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/235/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 317001500,MDU6SXNzdWUzMTcwMDE1MDA=,236,datasette publish lambda plugin,9599,simonw,open,0,,,,,11,2018-04-23T22:10:30Z,2023-03-12T14:04:15Z,,OWNER,,"Refs #217 - create a publish plugin that can deploy to AWS Lambda. https://docs.aws.amazon.com/lambda/latest/dg/limits.html says lambda packages can be up to 50 MB, so this would only work with smaller databases (the command can check the filesize before attempting to package and deploy it). Lambdas do get a 512 MB `/tmp` directory too, so for larger databases the function could start and then download up to 512MB from an S3 bucket - so the plugin could take an optional S3 bucket to write to and know how to upload the `.db` file there and then have the lambda download it on startup.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/236/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 317475156,MDU6SXNzdWUzMTc0NzUxNTY=,237,Support for ?_search_colname=blah searches,9599,simonw,closed,0,,,,,2,2018-04-25T04:29:53Z,2018-05-05T22:56:42Z,2018-05-05T22:33:23Z,OWNER,,"Right now the `_search=` argument searches across all fields in a full-text index, for example: https://san-francisco.datasettes.com/sf-film-locations-84594a7/Film_Locations_in_San_Francisco?_search=justin SQLite FTS also supports searches within a specified field, for example: https://san-francisco.datasettes.com/sf-film-locations-84594a7?sql=select+rowid%2C+*+from+Film_Locations_in_San_Francisco+where+rowid+in+%28select+rowid+from+%5BFilm_Locations_in_San_Francisco_fts%5D+where+%5BLocations%5D+match+%3Asearch%29+order+by+rowid+limit+101&search=justin ``` select rowid, * from Film_Locations_in_San_Francisco where rowid in ( select rowid from [Film_Locations_in_San_Francisco_fts] where [Locations] match :search ) order by rowid limit 101 ``` The `_search=` parameter could be extended to support this using `_search_colname=`. 
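A sketch of how those per-column parameters might be translated into SQL - hypothetical code, with the caveat that column names taken from the querystring would need validating against the FTS table's actual columns before being interpolated:

```python
def column_search_wheres(fts_table, args):
    # args: decoded querystring pairs, e.g. {'_search_Locations': 'justin'}
    wheres, params = [], {}
    for i, (key, value) in enumerate(sorted(args.items())):
        if not key.startswith('_search_'):
            continue
        column = key[len('_search_'):]  # already URL-decoded at this point
        wheres.append(
            'rowid in (select rowid from [{}] where [{}] match :search_{})'.format(
                fts_table, column, i
            )
        )
        params['search_{}'.format(i)] = value
    return wheres, params
```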
This should also be able to support columns with spaces and special characters in their names, something like this: `_search_Column%20With%20Spaces=foo` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/237/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 317714268,MDU6SXNzdWUzMTc3MTQyNjg=,238,External metadata.json,9599,simonw,closed,0,,,,,3,2018-04-25T17:02:30Z,2019-06-24T06:52:55Z,2019-06-24T06:52:45Z,OWNER,,"A frustration I'm having with https://register-of-members-interests.datasettes.com/ is that I keep coming up with new canned queries but I don't want to redeploy the whole thing just to add them to `metadata.json` Maybe Datasette could optionally take a `--metadata-url` option which causes it to load from a URL instead and occasionally check for updates.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/238/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 317760361,MDU6SXNzdWUzMTc3NjAzNjE=,239,Support for hidden tables in metadata.json,9599,simonw,closed,0,,,,,2,2018-04-25T19:21:17Z,2018-04-26T03:45:12Z,2018-04-26T03:43:10Z,OWNER,,"Since we already have a hidden feature, let's expose it more to our users ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/239/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 317900587,MDU6SXNzdWUzMTc5MDA1ODc=,240,FTS table detection should be part of .inspect(),9599,simonw,closed,0,,,,,0,2018-04-26T06:58:10Z,2018-04-29T00:04:44Z,2018-04-29T00:04:44Z,OWNER,,"The code that detects if specific tables have a corresponding FTS column is currently called from TableView - it should instead be handled as part of `.inspect()`. This will make it easier to build other features that need to behave differently depending on whether a table can be searched, e.g. an autocomplete widget for selecting filters from foreign key tables. Current code: https://github.com/simonw/datasette/blob/f188ceaa2a3a5b2eab83425ad0f00cb0d364e24a/datasette/app.py#L728-L733",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/240/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 318490133,MDU6SXNzdWUzMTg0OTAxMzM=,241,Default datasette logging format should be JSON,9599,simonw,open,0,,,,,0,2018-04-27T17:32:48Z,2018-07-10T17:45:40Z,,OWNER,,"Structured logs are better. Datasette should default to outputting its HTTP access log lines as newline delimited JSON instead of the Sanic default format it uses at the moment. For improved greppability these logs should have keys ordered in a consistent way. 
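As a rough illustration (not Datasette's actual logging code), each access log line could be a single JSON object whose keys are always emitted in the same order:

```python
import json
import time

def log_request(method, path, status, duration_ms):
    # Regular dicts preserve insertion order in Python 3.7+;
    # collections.OrderedDict gives the same guarantee earlier
    print(json.dumps({
        'ts': round(time.time(), 3),
        'method': method,
        'path': path,
        'status': status,
        'duration_ms': duration_ms,
    }))

# log_request('GET', '/fixtures/facetable.json', 200, 12.3)
```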
Python's JSON module can do this with ordered dictionaries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/241/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 318692953,MDU6SXNzdWUzMTg2OTI5NTM=,242,Rename ?_sql_time_limit_ms= to ?_timelimit=,9599,simonw,closed,0,,,,,0,2018-04-29T06:11:35Z,2018-05-02T00:20:42Z,2018-05-02T00:20:42Z,OWNER,,It's a bit of a mouthful at the moment.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/242/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 318737808,MDU6SXNzdWUzMTg3Mzc4MDg=,243,--spatialite option for datasette publish commands,9599,simonw,closed,0,,,,,2,2018-04-29T18:19:32Z,2018-05-31T14:17:53Z,2018-05-31T14:17:53Z,OWNER,,Performs the necessary incantations to install Spatialite on Zeit Now or Heroku and sets the corresponding environment variable to ensure the module is correctly loaded by datasette serve.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/243/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 318738000,MDU6SXNzdWUzMTg3MzgwMDA=,244,/-/versions page,9599,simonw,closed,0,,,,,1,2018-04-29T18:22:15Z,2018-05-03T14:13:49Z,2018-05-03T14:09:53Z,OWNER,,"Displays the current version of: * datasette * Python * SQLite * Spatialite (if available) Installed plugin versions should be shown on /-/plugins",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/244/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 319358200,MDU6SXNzdWUzMTkzNTgyMDA=,245,?_shape=array option,9599,simonw,closed,0,,,,,1,2018-05-01T23:11:07Z,2018-05-03T14:14:33Z,2018-05-02T00:12:20Z,OWNER,,"Some tools (`pandas.DataFrame(...)` for example) are happiest when you give them a raw array of JSON objects. 
`?_shape=array` should do just that. While I'm at it, rename the default `?_shape=lists` to instead be called `?_shape=arrays`. And validate that `_shape` is a valid option. And have `?_shape=object` return the object at the root level rather than nested in `.rows` to better match the behavior of `?_shape=array`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/245/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 319371036,MDExOlB1bGxSZXF1ZXN0MTg1MzA3NDA3,246,?_shape=array and _timelimit=,9599,simonw,closed,0,,,,,0,2018-05-02T00:18:54Z,2018-05-02T00:20:41Z,2018-05-02T00:20:40Z,OWNER,simonw/datasette/pulls/246,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/246/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 319449852,MDU6SXNzdWUzMTk0NDk4NTI=,247,SQLite code decoupled from Datasette,11912854,jsancho-gpl,open,0,,,,,1,2018-05-02T08:03:28Z,2018-05-21T15:29:31Z,,NONE,,"I'm working on the possibility of using Datasette with other file formats that aren't SQLite, like files with [PyTables](https://github.com/PyTables/PyTables) format. In order to accomplish that, I've started [a fork for decoupling the code related with SQLite](https://github.com/jsancho-gpl/datasette/tree/feature/db-type-plugin) and putting it in an external connector to allow future connectors for a lot of file formats. It'd be nice if you could look at it and suggest improvements for a possible PR.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/247/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 319954545,MDU6SXNzdWUzMTk5NTQ1NDU=,248,/-/plugins should show version of each installed plugin,9599,simonw,closed,0,,,,,2,2018-05-03T14:50:45Z,2018-05-04T18:25:40Z,2018-05-04T18:05:04Z,OWNER,,"Refs #244 https://stackoverflow.com/questions/20180543/how-to-check-version-of-python-modules ``` >>> import pkg_resources >>> pkg_resources.get_distribution('datasette_cluster_map').version '0.4' ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/248/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 320090329,MDU6SXNzdWUzMjAwOTAzMjk=,249,?_size=max argument ,9599,simonw,closed,0,,,,,1,2018-05-03T21:42:04Z,2018-05-04T18:26:30Z,2018-05-04T18:05:04Z,OWNER,,"For plugins that want to load the most data allowable, having `?_size=max` would be useful.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/249/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 320132682,MDU6SXNzdWUzMjAxMzI2ODI=,250,Setup some issue templates,9599,simonw,open,0,,,,,0,2018-05-04T01:49:07Z,2018-05-04T01:49:07Z,,OWNER,,"https://twitter.com/left_pad/status/99216385740464537 I like the idea of using these to help people understand some of the ways I want to use issues.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/250/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, 
""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 320592643,MDU6SXNzdWUzMjA1OTI2NDM=,251,"Explore ""distinct values for column"" in inspect()",9599,simonw,closed,0,,,,,4,2018-05-06T13:27:24Z,2018-05-14T22:47:55Z,2018-05-14T22:47:55Z,OWNER,,"A lot of datasets have columns which have a small number of possible values in them - this one for example: https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=select+distinct+category+from+%5Binconvenient-sequel%2Fratings%5D%3B Detecting these could be interesting as part of `.inspect()`, since it would allow for various UI enhancements like autocomplete / select box filters for those columns. The problem is detecting them efficiently. `.inspect()` shouldn't spend 5 minutes churning through columns on giant tables trying to determine if they have a small collection of unique values.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/251/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 321624016,MDU6SXNzdWUzMjE2MjQwMTY=,252,/-/versions should report the FTS version supported by SQLite,9599,simonw,closed,0,,,,,0,2018-05-09T15:43:47Z,2018-05-11T13:19:52Z,2018-05-11T13:19:52Z,OWNER,,I can copy this function from `csvs-to-sqlite`: https://github.com/simonw/csvs-to-sqlite/blob/dccbf65b37bc9eed50e9edb80a42f257e93edb1f/csvs_to_sqlite/utils.py#L283-L293,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/252/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 321631020,MDU6SXNzdWUzMjE2MzEwMjA=,253,Documentation explaining how to use SQLite FTS with Datasette,9599,simonw,closed,0,,,,,1,2018-05-09T16:02:08Z,2018-05-12T12:09:02Z,2018-05-12T12:06:51Z,OWNER,,"In particular how to work with https://www.sqlite.org/fts3.html#_external_content_fts4_tables_ - which Datasette can automatically detect and use to add a search UI to your page. Examples of basic search setup like this: ``` CREATE VIRTUAL TABLE ""interests_fts"" USING FTS4 (name, content=""interests""); INSERT INTO ""interests_fts"" (rowid, name) SELECT rowid, name FROM interests; ``` And complex join-based search setup like this: ``` CREATE VIRTUAL TABLE ""interests_fts"" USING FTS4 (name, category, member, content=""interests""); INSERT INTO ""interests_fts"" (rowid, name, category, member) SELECT interests.rowid, interests.name, interest_categories.name, members.name FROM interests JOIN interest_categories ON interests.category_id = interest_categories.id JOIN members ON interests.member_id = members.id; ``` Also mention how `csvs-to-sqlite` can be used to do this easily. This will benefit from #252 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/253/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 322283067,MDU6SXNzdWUzMjIyODMwNjc=,254,Escaping named parameters in canned queries,247131,philroche,closed,0,,,,,4,2018-05-11T12:43:30Z,2020-05-10T14:54:14Z,2020-05-10T14:54:13Z,NONE,,"Thank you very much for this project. I have created some canned queries but some of the filters include a colon eg. ""com.ubuntu.cloud:server:18.04:amd64"". When saved these colons are parsed as named parameters. 
Is there a way to escape colons in a canned query?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/254/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 322477187,MDU6SXNzdWUzMjI0NzcxODc=,255,Facets,9599,simonw,closed,0,,,,,16,2018-05-12T03:00:07Z,2019-05-29T21:39:12Z,2018-05-16T15:32:12Z,OWNER,,"Ability to display facets and facet counts on the table view. Facets can be specified in the URL with `?_facet=column&_facet=othercolumn` or the default facets for a table can be set using a new `""facets"": [...]` property in `metadata.json` - [x] Implement `?_facet=` - [x] Implement `metadata.json` `facets` key - [x] Design for how facets should be presented - [x] Facets should be able to toggle off as well as on - [x] Expand labels for facets that are foreign keys - [x] Suggest potential facets (if we can do so within a tight time limit) - [x] Documentation",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/255/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 322551723,MDU6SXNzdWUzMjI1NTE3MjM=,256,Break up app.py into separate view modules,9599,simonw,closed,0,,,,,1,2018-05-12T23:56:33Z,2018-05-14T03:05:37Z,2018-05-14T03:05:37Z,OWNER,,"`views/table.py` and `views/database.py` and `views/utils.py` as a starting point. Likewise, create `tests/test_views_table.py` and `tests/test_views_database.py` - these will contain both HTML and API test for those views.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/256/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 322591993,MDExOlB1bGxSZXF1ZXN0MTg3NjY4ODkw,257,Refactor views,9599,simonw,closed,0,,,,,5,2018-05-13T13:00:50Z,2018-05-14T03:04:25Z,2018-05-14T03:04:24Z,OWNER,simonw/datasette/pulls/257,"* Split out view classes from main `app.py` * Run [black](https://github.com/ambv/black) against resulting code to apply opinionated source code formatting * Run [isort](https://github.com/timothycrosley/isort) to re-order my imports Refs #256 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/257/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 322741659,MDExOlB1bGxSZXF1ZXN0MTg3NzcwMzQ1,258,Add new metadata key persistent_urls which removes the hash from all database urls,247131,philroche,closed,0,,,,,3,2018-05-14T09:39:18Z,2018-05-21T07:38:15Z,2018-05-21T07:38:15Z,NONE,simonw/datasette/pulls/258,"Add new metadata key ""persistent_urls"" which removes the hash from all database urls when set to ""true"" This PR is just to gauge if this, or something like it, is something you would consider merging? I understand the reason why the substring of the hash is included in the url but there are some use cases where the urls should persist across deployments. For bookmarks for example or for scripts that use the JSON API. This is the initial commit for this feature. 
Tests and documentation updates to follow.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/258/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 322787470,MDU6SXNzdWUzMjI3ODc0NzA=,259,inspect() should detect many-to-many relationships,9599,simonw,closed,0,,,,,6,2018-05-14T12:03:58Z,2019-05-23T03:55:37Z,2019-05-23T03:55:37Z,OWNER,,"Relates to #255 - in particular supporting facets across M2M relationships. It should be possible for `.inspect()` to notice when a table has two foreign keys to two different tables, and assume that this means there is a M2M relationship between those tables. When rendering a table with a m2m relationship we could display the first X associated records as a comma separated list of hyperlinks in a new column on the table view, with a column name derived from the table on the other side. Since SQLite doesn't have RANK or an equivalent of https://www.xaprb.com/blog/2006/12/02/how-to-number-rows-in-mysql/ this would be implemented as N+1 queries (one query per cell that we want to display an m2m summary). This should be OK in SQLite: https://sqlite.org/np1queryprob.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/259/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323223872,MDU6SXNzdWUzMjMyMjM4NzI=,260,Validate metadata.json on startup,9599,simonw,open,0,,,,,7,2018-05-15T13:42:56Z,2023-06-21T12:51:22Z,,OWNER,,"It's easy to misspell the name of a database or table and then be puzzled when the metadata settings silently fail. To avoid this, let's sanity check the provided metadata.json on startup and quit with a useful error message if we find any obvious mistakes.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/260/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 323459939,MDExOlB1bGxSZXF1ZXN0MTg4MzEyNDEx,261,Facets improvements plus suggested facets,9599,simonw,closed,0,,,,,0,2018-05-16T03:52:39Z,2018-05-16T15:27:26Z,2018-05-16T15:27:25Z,OWNER,simonw/datasette/pulls/261,Refs #255,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/261/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 323658641,MDU6SXNzdWUzMjM2NTg2NDE=,262,Add ?_extra= mechanism for requesting extra properties in JSON,9599,simonw,open,0,,,3268330,Datasette 1.0,27,2018-05-16T14:55:42Z,2023-03-29T06:22:22Z,,OWNER,,"Datasette views currently work by creating a set of data that should be returned as JSON, then defining an additional, optional `template_data()` function which is called if the view is being rendered as HTML. This `template_data()` function calculates extra template context variables which are necessary for the HTML view but should not be included in the JSON. Example of how that is used today: https://github.com/simonw/datasette/blob/2b79f2bdeb1efa86e0756e741292d625f91cb93d/datasette/views/table.py#L672-L704 With features like Facets in #255 I'm beginning to want to move more items into the `template_data()` - in the case of facets it's the `suggested_facets` array. 
This saves that feature from being calculated (involving several SQL queries) for the JSON case where it is unlikely to be used. But... as an API user, I want to still optionally be able to access that information. Solution: Add a `?_extra=suggested_facets&_extra=table_metadata` argument which can be used to optionally request additional blocks to be added to the JSON API. Then redefine as many of the current `template_data()` features as extra arguments instead, and teach Datasette to return certain extras by default when rendering templates. This could allow the JSON representation to be slimmed down further (removing e.g. the `table_definition` and `view_definition` keys) while still making that information available to API users who need it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/262/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 323671577,MDU6SXNzdWUzMjM2NzE1Nzc=,263,Facets should not execute for ?shape=array|object,9599,simonw,closed,0,,,,,3,2018-05-16T15:26:13Z,2021-06-02T02:54:34Z,2021-06-02T02:54:34Z,OWNER,,Split off from #255 - there's no point executing the facet SQL for the `?_shape=array` and `?_shape=object` API responses.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/263/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323673899,MDU6SXNzdWUzMjM2NzM4OTk=,264,Make it possible to customize various facet settings,9599,simonw,closed,0,,,,,1,2018-05-16T15:31:34Z,2018-05-18T06:18:00Z,2018-05-18T05:11:52Z,OWNER,,"The new Facets implementation from #255 includes several hard-coded settings which should be made configurable somehow: Number of rows to return in a facet (maybe this should also be an option that can be set via querystring argument, e.g. 
`?_facet=qSpecies:40`): https://github.com/simonw/datasette/blob/9959a9e4deec8e3e178f919e8b494214d5faa7fd/datasette/views/table.py#L539 Time limit for executing a facet: https://github.com/simonw/datasette/blob/9959a9e4deec8e3e178f919e8b494214d5faa7fd/datasette/views/table.py#L559-L562 Maximum unique values returned in order for a column to be suggested as a facet: https://github.com/simonw/datasette/blob/9959a9e4deec8e3e178f919e8b494214d5faa7fd/datasette/views/table.py#L646-L647 Time limit for calculating if a column should be a suggested facet: https://github.com/simonw/datasette/blob/9959a9e4deec8e3e178f919e8b494214d5faa7fd/datasette/views/table.py#L664-L667 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/264/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323677499,MDU6SXNzdWUzMjM2Nzc0OTk=,265,Add links to example Datasette instances to appropiate places in docs,9599,simonw,closed,0,,,,,5,2018-05-16T15:40:20Z,2018-06-18T15:52:15Z,2018-06-18T15:52:15Z,OWNER,,"Links to working examples would really help, especially on these pages: * http://datasette.readthedocs.io/en/latest/json_api.html * http://datasette.readthedocs.io/en/latest/sql_queries.html * http://datasette.readthedocs.io/en/latest/facets.html * http://datasette.readthedocs.io/en/latest/full_text_search.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/265/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323681589,MDU6SXNzdWUzMjM2ODE1ODk=,266,Export to CSV,9599,simonw,closed,0,,,,,27,2018-05-16T15:50:24Z,2021-06-17T18:14:24Z,2018-06-18T06:05:25Z,OWNER,,Datasette needs to be able to export data to CSV.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/266/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323716411,MDU6SXNzdWUzMjM3MTY0MTE=,267,"Documentation for URL hashing, redirects and cache policy",9599,simonw,closed,0,,,,,3,2018-05-16T17:29:01Z,2019-06-24T06:41:02Z,2019-06-24T06:41:02Z,OWNER,,See my comments on #258 for a starting point,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/267/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323718842,MDU6SXNzdWUzMjM3MTg4NDI=,268,Mechanism for ranking results from SQLite full-text search,9599,simonw,open,0,,,,,12,2018-05-16T17:36:40Z,2022-01-13T22:21:28Z,,OWNER,,This isn't particularly straight-forward - all the more reason for Datasette to implement it for you. 
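For FTS4 the usual approach (the one covered in the article linked below) is to register a custom Python ranking function that decodes `matchinfo()`; FTS5 ships a built-in bm25 `rank` column. A minimal FTS5 sketch, assuming an SQLite build with FTS5 available:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.executescript('''
    create virtual table docs using fts5(title, body);
    insert into docs (title, body) values
        ('one', 'sqlite full text search'),
        ('two', 'ranking search results in sqlite');
''')
# FTS5's hidden ""rank"" column is a bm25 score; more negative
# means a better match, so ascending order puts best first
for title, rank in conn.execute(
    'select title, rank from docs where docs match ? order by rank',
    ['search'],
):
    print(title, rank)
```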
This article is helpful: http://charlesleifer.com/blog/using-sqlite-full-text-search-with-python/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/268/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 323726888,MDU6SXNzdWUzMjM3MjY4ODg=,269,"If a facet fails due to timing out, let the user know somehow",9599,simonw,closed,0,,,,,0,2018-05-16T18:01:47Z,2018-05-18T06:11:46Z,2018-05-18T06:11:46Z,OWNER,,Refs #255 - right now facets fail silently if the user requested them but they take longer than 200ms to calculate - see also #264,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/269/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 323830051,MDU6SXNzdWUzMjM4MzAwNTE=,270,--limit= CLI option for setting limits,9599,simonw,closed,0,,,,,1,2018-05-17T00:14:24Z,2018-05-18T06:19:31Z,2018-05-18T06:16:39Z,OWNER,,"#264 calls for four new datasette limit options, on top of the two existing ones: * `--max_returned_rows` * `--sql_time_limit_ms` These are already clogging up `datasette serve --help` a bit. How about this syntax instead? datasette --limit max_returned_rows:100 \ --limit facet_timeout_ms:500 demo.db Then we can add as many new user over-rideable limits as we like without clogging up `--help` too much - though it would be good to have a way of optionally listing their documentation as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/270/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324162476,MDU6SXNzdWUzMjQxNjI0NzY=,271,Mechanism for automatically picking up changes when on-disk .db file changes,9599,simonw,closed,0,,,,,4,2018-05-17T19:53:15Z,2019-01-10T21:35:18Z,2019-01-10T21:35:18Z,OWNER,,"It would be useful if Datasette could spot when a SQLite database file changes on disk and restart itself (hence re-running .inspect() and picking up the new content hash). Ideally this could happen in an atomic way so no requests get dropped during the switch-over. This may not play well with SQLite opening databases in immutable mode. Research required.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/271/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324188953,MDU6SXNzdWUzMjQxODg5NTM=,272,Port Datasette to ASGI,9599,simonw,closed,0,9599,simonw,3268330,Datasette 1.0,42,2018-05-17T21:16:32Z,2019-06-24T04:54:15Z,2019-06-24T03:33:06Z,OWNER,,"Datasette doesn't take much advantage of Sanic, and I'm increasingly having to work around parts of it because of idiosyncrasies that are specific to Datasette - caring about the exact order of querystring arguments for example. Since Datasette is GET-only our needs from a web framework are actually pretty slim. This becomes more important as I expand the plugins #14 framework. Am I sure I want the plugin ecosystem to depend on a Sanic if I might move away from it in the future? If Datasette wasn't all about async/await I would use WSGI, but today it makes more sense to use ASGI. 
I'd like to be confident that switching to ASGI would still give me the excellent performance that Sanic provides. https://github.com/django/asgiref/blob/master/specs/asgi.rst",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/272/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324451322,MDU6SXNzdWUzMjQ0NTEzMjI=,273,Figure out a way to have /-/version return current git commit hash,9599,simonw,closed,0,,,,,2,2018-05-18T15:16:56Z,2018-05-22T19:35:22Z,2018-05-22T19:35:22Z,OWNER,,"https://fivethirtyeight.datasettes.com/-/versions reports Datasette version `0.21` This isn't actually correct. The deploy script for that site actually deploys current master using `https://github.com/simonw/datasette/archive/master.zip`: https://github.com/simonw/fivethirtyeight-datasette/blob/66b4b0dfedd7237bc8c02d3e26d905bca7b84069/Dockerfile#L9 Ideally this would show the current commit hash, but I'm not at all sure if it's possible to derive that from `pip install https://github.com/simonw/datasette/archive/master.zip`. Is there another mechanism that could be used to reliably `pip install` current master but still provide access to the most recent commit hash?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/273/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324652142,MDU6SXNzdWUzMjQ2NTIxNDI=,274,"Rename --limit to --config, add --help-config",9599,simonw,closed,0,,,,,2,2018-05-19T18:57:42Z,2018-05-20T17:04:55Z,2018-05-20T17:04:11Z,OWNER,,"#270 introduced `--limit` but on further thought it should be called `--config` instead. `--page_size` should become `--config default_page_size:1000` Add `--help-config` to show full help showing all config settings.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/274/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324720095,MDU6SXNzdWUzMjQ3MjAwOTU=,275,"""config"" section in metadata.json (root, database and table level)",9599,simonw,open,0,,,,,2,2018-05-20T16:02:28Z,2019-05-13T18:33:31Z,,OWNER,,"Split off from #274 Metadata should have an optional `""config""` section at root, table or database level. The TableView and RowView and DatabaseView and BaseView classes could all have a `.config(""key"")` method which knows how to resolve the hierarchy of configs. This will allow individual tables (or databases) to set their own config settings for things like `sql_time_limit_ms`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/275/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 324835838,MDU6SXNzdWUzMjQ4MzU4Mzg=,276,Handle spatialite geometry columns better,45057,russss,closed,0,,,,,21,2018-05-21T08:46:55Z,2022-03-21T22:22:20Z,2022-03-21T22:22:20Z,CONTRIBUTOR,,"I'd like to see spatialite geometry columns rendered more sensibly - at the moment they come through as well-known-binary unless you use custom SQL, and WKB isn't of much use to anyone on the web. 
In HTML: they should be shown either as simple lat/long (if it's just a point, for example), or as a sensible placeholder if they're more complex geometries. In JSON: they should be GeoJSON geometries, (which means they can be automatically fed into a leaflet map with no further messing around). In CSV: they should be WKT. I briefly wondered if this should go into a plugin, but I suspect it needs hooking in at a deeper level than the plugin architecture will support any time soon.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/276/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 324836533,MDExOlB1bGxSZXF1ZXN0MTg5MzE4NDUz,277,Refactor inspect logic,45057,russss,closed,0,,,,,2,2018-05-21T08:49:31Z,2018-05-22T16:07:24Z,2018-05-22T14:03:07Z,CONTRIBUTOR,simonw/datasette/pulls/277,"This pulls the logic for inspect out into a new file which makes it a bit easier to understand. This was going to be the first part of an implementation for #276, but it seems like that might take a while so I'm going to PR a few bits of refactoring individually.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/277/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 325294102,MDU6SXNzdWUzMjUyOTQxMDI=,278,Build smallest possible Docker image with Datasette plus recent SQLite (with json1) plus Spatialite 4.4.0,9599,simonw,closed,0,,,,,3,2018-05-22T13:28:40Z,2018-05-23T17:43:36Z,2018-05-23T17:43:36Z,OWNER,,"A Dockerfile that does the following: * Bundles Datasette master * Python 3.6 most recent version (or 3.7 if it has been released) * SQLite 3.23.1 (or most recent release) such that ""import sqlite3"" in Python gets that version. Ideally with the json1 module baked in by default, but having it loadable as an optional module is fine too * SpatiaLite 4.4.0-RC0 (or most recent version) such that it can be loaded as an optional module * Uses multi-stage builds to stay as small as possible Note that the current ""release"" of SpatiaLite is 4.3.0 which is missing key features like https://www.gaia-gis.it/fossil/libspatialite/wiki?name=KNN - 4.4.0 probably needs to be compiled from source. I don't know the best way to get a current SQLite version bundled for Python 3. Maybe https://github.com/coleifer/pysqlite3 ?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/278/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 325352370,MDExOlB1bGxSZXF1ZXN0MTg5NzA3Mzc0,279,Add version number support with Versioneer,198537,rgieseke,closed,0,,,,,4,2018-05-22T15:39:45Z,2018-05-22T19:35:23Z,2018-05-22T19:35:22Z,CONTRIBUTOR,simonw/datasette/pulls/279,"I think that's all for getting Versioneer support, I've been happily using it in a couple of projects ... 
``` In [2]: datasette.__version__ Out[2]: '0.22+3.g6e12445' ``` Repo: https://github.com/warner/python-versioneer Versioneer Licence: Public Domain (CC0-1.0) Closes #273 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/279/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 325373747,MDExOlB1bGxSZXF1ZXN0MTg5NzIzNzE2,280,Build Dockerfile with recent Sqlite + Spatialite,565628,r4vi,closed,0,,,,,10,2018-05-22T16:33:50Z,2018-06-28T11:26:23Z,2018-05-23T17:43:35Z,CONTRIBUTOR,simonw/datasette/pulls/280,"This solves #278 without bloating the Dockerfile too much, the image size is now 495MB (original was ~240MB) but it could be reduced significantly if we only copied the output of the compilation of spatialite and friends to /usr/local/lib, instead of the entirety of it however that will take more time. In the python code change references to `import sqlite3` to `import pysqlite3` and it should use the compiled version of sqlite3.23.1. You don't need to try/except because pysqlite3 falls back to builtin sqlite3 if there is no compiled version. ```bash $ docker run --rm -it datasette spatialite SpatiaLite version ..: 4.4.0-RC0 Supported Extensions: - 'VirtualShape' [direct Shapefile access] - 'VirtualDbf' [direct DBF access] - 'VirtualXL' [direct XLS access] - 'VirtualText' [direct CSV/TXT access] - 'VirtualNetwork' [Dijkstra shortest path] - 'RTree' [Spatial Index - R*Tree] - 'MbrCache' [Spatial Index - MBR cache] - 'VirtualSpatialIndex' [R*Tree metahandler] - 'VirtualElementary' [ElemGeoms metahandler] - 'VirtualKNN' [K-Nearest Neighbors metahandler] - 'VirtualXPath' [XML Path Language - XPath] - 'VirtualFDO' [FDO-OGR interoperability] - 'VirtualGPKG' [OGC GeoPackage interoperability] - 'VirtualBBox' [BoundingBox tables] - 'SpatiaLite' [Spatial SQL - OGC] PROJ.4 version ......: Rel. 4.9.3, 15 August 2016 GEOS version ........: 3.5.1-CAPI-1.9.1 r4246 TARGET CPU ..........: x86_64-linux-gnu the SPATIAL_REF_SYS table already contains some row(s) SQLite version ......: 3.23.1 Enter "".help"" for instructions SQLite version 3.23.1 2018-04-10 17:39:29 Enter "".help"" for instructions Enter SQL statements terminated with a "";"" spatialite> ``` ```bash $ docker run --rm -it datasette python -c ""import pysqlite3; print(pysqlite3.sqlite_version)"" 3.23.1 ```",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/280/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 325553991,MDExOlB1bGxSZXF1ZXN0MTg5ODYwMDUy,281,Reduces image size using Alpine + Multistage (re: #278),487897,iMerica,closed,0,,,,,1,2018-05-23T05:27:05Z,2018-05-26T02:10:38Z,2018-05-26T02:10:38Z,NONE,simonw/datasette/pulls/281,"Hey Simon! I got the image size down from 256MB to 110MB. Seems to be working okay, but you might want to test it a bit more. Example output of `docker run --rm -it datasette` ``` Serve! 
files=() on port 8001 [2018-05-23 05:23:08 +0000] [1] [INFO] Goin' Fast @ http://127.0.0.1:8001 [2018-05-23 05:23:08 +0000] [1] [INFO] Starting worker [1] ``` Related: https://github.com/simonw/datasette/issues/278 ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/281/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 325705981,MDU6SXNzdWUzMjU3MDU5ODE=,282,Faceting breaks pagination,9599,simonw,closed,0,,,,,1,2018-05-23T13:29:47Z,2018-05-23T13:53:39Z,2018-05-23T13:42:07Z,OWNER,,"e.g. on https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3/nba-elo%2Fnbaallelo?_facet=lg_id#facet-lg_id - click the ""next page"" link: https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3/nba-elo%2Fnbaallelo?_facet=lg_id&_next=100 Invalid SQL: near ""and"": syntax error",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/282/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 325958506,MDU6SXNzdWUzMjU5NTg1MDY=,283,Support cross-database joins,9599,simonw,closed,0,,,,,26,2018-05-24T04:18:39Z,2021-06-06T09:40:18Z,2021-02-18T22:16:46Z,OWNER,,"SQLite has the ability to attach multiple databases to a single connection and then run joins across multiple databases. Since Datasette supports more than one database, this would make a pretty neat feature.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/283/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326182814,MDU6SXNzdWUzMjYxODI4MTQ=,284,Ability to enable/disable specific features via --config,9599,simonw,closed,0,,,,,5,2018-05-24T15:47:56Z,2018-05-25T06:05:02Z,2018-05-25T05:51:09Z,OWNER,,"`--config` settings from #274 can currently only be integers. I'd like them to be available as booleans too. Then we can use them to have features that are turned on by default but can be turned off. First features to get this treatment: - [x] `allow_sql` - whether or not the `?sql=` parameter is allowed and form is displayed - [X] `allow_facet` - is `?_facet=` allowed or do we only run facets defined in `metadata.json` - [X] `allow_download` - do we let users download the full SQLite database file? - [X] `suggest_facets` - do we attempt to calculate suggested facets? 
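Following the `--config name:value` syntax established in #270/#274, turning these off from the command line would presumably look something like this (exact boolean tokens still to be decided):

    datasette demo.db --config allow_sql:off --config suggest_facets:off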
Refs #275 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/284/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326189744,MDU6SXNzdWUzMjYxODk3NDQ=,285,num_threads and cache_max_age should be --config options,9599,simonw,closed,0,,,,,2,2018-05-24T16:04:51Z,2018-05-27T00:53:35Z,2018-05-27T00:43:33Z,OWNER,,"https://github.com/simonw/datasette/blob/58b5a37dbbf13868a46bcbb284509434e66eca25/datasette/app.py#L106 And https://github.com/simonw/datasette/blob/58b5a37dbbf13868a46bcbb284509434e66eca25/datasette/views/base.py#L325 Refs #275 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/285/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326599525,MDU6SXNzdWUzMjY1OTk1MjU=,286,Database hash should include current datasette version,9599,simonw,open,0,,,,,2,2018-05-25T17:03:42Z,2018-05-25T17:07:36Z,,OWNER,,"Right now deploying a new version of datasette doesn't invalidate existing URLs, so users may still see a cached copy of the old templates. We can fix this by including the current datasette version in the input to the hash function (which is currently just the database file contents).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/286/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 326617744,MDU6SXNzdWUzMjY2MTc3NDQ=,287,?_shape=arrayfirst,9599,simonw,closed,0,,,,,1,2018-05-25T18:11:03Z,2018-05-27T00:32:53Z,2018-05-27T00:32:29Z,OWNER,,Return an array of single items (the first item in each row returned from the SQL query).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/287/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326767626,MDU6SXNzdWUzMjY3Njc2MjY=,288,Support multiple filters of the same type,9599,simonw,closed,0,,,,,3,2018-05-26T21:13:12Z,2019-04-15T23:45:04Z,2019-04-15T23:44:26Z,OWNER,,"This should work for example: https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3/biopics%2Fbiopics?year_release__not=2014&year_release__not=2015",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/288/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326768188,MDU6SXNzdWUzMjY3NjgxODg=,289,?_ttl= parameter to control caching,9599,simonw,closed,0,,,,,3,2018-05-26T21:22:55Z,2018-05-26T22:22:47Z,2018-05-26T22:17:48Z,OWNER,,"This would allow clients to specify the max-age caching header that should be returned with the query. Most importantly, this will allow caching to be completely turned off for specific queries using `?_ttl=0`. 
Sending 0 should cause a `Cache-Control: no-cache` header to be returned.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/289/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326778161,MDU6SXNzdWUzMjY3NzgxNjE=,290,Consider increasing the default for num_sql_threads (currently 3),9599,simonw,open,0,,,,,0,2018-05-27T00:52:41Z,2018-05-27T00:52:41Z,,OWNER,,"I ran a very rough micro-benchmark on the new `num_sql_threads` config option (added in #285) datasette --config num_sql_threads:1 fivethirtyeight.db Then ab -n 100 -c 10 'http://127.0.0.1:8011/fivethirtyeight-2628db9/twitter-ratio%2Fsenators' | Number of threads | Requests/second | |---|---| | 1 | 4.57 | | 3 | 9.77 | | 10 | 13.53 | | 20 | 15.24 | | 50 | 8.21 | This was on my early 2018 OS X laptop. Need to benchmark in other common environments before making a decision on changing the default. That said, the default of 3 was a number I plucked out of thin air.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/290/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 326783670,MDU6SXNzdWUzMjY3ODM2NzA=,291,Avoid plugins accidentally loading dependencies twice,9599,simonw,closed,0,,,,,3,2018-05-27T03:15:21Z,2020-09-30T20:36:12Z,2018-05-28T20:42:02Z,OWNER,,Plugins that include JavaScript files risk loading the same code twice. In particular: I want to build a second plugin that uses the Leaflet mapping library (the first was [datasette-cluster-map](https://pypi.org/project/datasette-cluster-map/)). But I don't want the two plugins to load duplicate copies of Leaflet.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/291/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326800219,MDU6SXNzdWUzMjY4MDAyMTk=,292,Mechanism for customizing the SQL used to select specific columns in the table view,9599,simonw,closed,0,,,,,15,2018-05-27T09:05:52Z,2021-05-27T04:25:01Z,2021-05-27T04:25:01Z,OWNER,,"Some columns don't make a lot of sense in their default representation - binary blobs such as SpatiaLite geometries for example, or lengthy columns that really should be truncated somehow. We may also find that there are tables where we don't want to show all of the columns - so a mechanism to select a subset of columns would be nice. I think there are two features here: * the ability to request a subset of columns on the table view * the ability to override the SQL for a specific column and/or add extra columns - `AsGeoJSON(Geometry)` for example Both features should be available via both querystring arguments and in `metadata.json` The querystring argument for custom SQL should only work if `allow_sql` config is turned on. 
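As an illustrative sketch of the second feature (a hypothetical helper, not a committed design), a per-column SQL override could amount to rewriting the select clause:

```python
def select_clause(table_columns, overrides):
    # overrides: column name -> replacement SQL expression taken
    # from metadata.json, e.g. {'Geometry': 'AsGeoJSON(Geometry)'}
    parts = []
    for name in table_columns:
        expr = overrides.get(name)
        parts.append(
            '{} as [{}]'.format(expr, name) if expr else '[{}]'.format(name)
        )
    return ', '.join(parts)

# select_clause(['TreeID', 'Geometry'], {'Geometry': 'AsGeoJSON(Geometry)'})
# returns: [TreeID], AsGeoJSON(Geometry) as [Geometry]
```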
Refs #276",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/292/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 326987229,MDExOlB1bGxSZXF1ZXN0MTkwOTAxNDI5,293,Support for external database connectors,11912854,jsancho-gpl,closed,0,,,,,1,2018-05-28T11:02:45Z,2018-09-11T14:32:45Z,2018-09-11T14:32:45Z,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/293,"I think it would be nice that Datasette could work with other file formats that aren't SQLite, like files with PyTables format. I've tried to accomplish that using external connectors published with entry points. These external connectors must have a structure similar to the structure [PyTables Datasette connector](https://github.com/PyTables/datasette-pytables) has.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/293/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 327365110,MDU6SXNzdWUzMjczNjUxMTA=,294,inspect should record column types,9599,simonw,open,0,,,,,7,2018-05-29T15:10:41Z,2019-06-28T16:45:28Z,,OWNER,,"For each table we want to know the columns, their order and what type they are. I'm going to break with SQLite defaults a little on this one and allow datasette to define additional types - to start with just a `geometry` type for columns that are detected as SpatiaLite geometries. Possible JSON design: ""columns"": [{ ""name"": ""title"", ""type"": ""text"" }, ...] Refs #276",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/294/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 327383759,MDU6SXNzdWUzMjczODM3NTk=,295,Extract unit tests for inspect out to test_inspect.py,9599,simonw,closed,0,,,,,2,2018-05-29T15:55:04Z,2019-05-11T21:40:32Z,2019-05-11T21:40:32Z,OWNER,,"Right now they are bundled up as API unit tests for a relatively unimportant endpoint. They should be their own thing. 
Blocks #294",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/295/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 327395270,MDU6SXNzdWUzMjczOTUyNzA=,296,Per-database and per-table /-/ URL namespace,9599,simonw,open,0,,,,,3,2018-05-29T16:23:13Z,2019-06-28T16:46:34Z,,OWNER,,"Initially this will be for subsets of `/-/inspect` and `/-/metadata` but it will also give us a URL namespace for future features like `/-/facet` (expanded list of a specific facet, linked to from `...`) and `/-/graph` To start: * `/dbname/-/inspect` * `/dbname/-/metadata` * `/dbname/tablename/-/inspect` * `/dbname/tablename/-/metadata` This means we will no longer allow databases or tables to have the name `""-""` - I think that's OK We will continue to support rows with a primary key of `""-""` at the following URL: * `/dbname/tablename/-`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/296/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 327420945,MDU6SXNzdWUzMjc0MjA5NDU=,297,datasette publish Dockerfile should use python:3.6-slim-stretch,9599,simonw,closed,0,,,,,1,2018-05-29T17:40:08Z,2018-05-31T14:44:37Z,2018-05-31T14:44:37Z,OWNER,,"Right now the Dockerfile generated by `datasette package` and `datasette publish` uses this: https://github.com/simonw/datasette/blob/b0a95da96386ddf99816911e08df86178ffa9a89/datasette/utils.py#L269 This appears to result in a SQLite version of `3.8.7.1` - https://parlgov.datasettes.com/-/versions ``` ""sqlite"": { ""extensions"": {}, ""fts_versions"": [ ""FTS4"", ""FTS3"" ], ""version"": ""3.8.7.1"" } ``` Meanwhile, https://fivethirtyeight.datasettes.com/-/versions is deployed with this Dockerfile https://github.com/simonw/fivethirtyeight-datasette/blob/0849901cae06e957fe04892cd4033bdcd1fcf966/Dockerfile which uses `FROM python:3.6-slim-stretch` and results in the following version report: ``` ""sqlite"": { ""extensions"": { ""json1"": null }, ""fts_versions"": [ ""FTS5"", ""FTS4"", ""FTS3"" ], ""version"": ""3.16.2"" } ``` So not only do we get a more recent SQLite (including https://www.sqlite.org/rowvalue.html added in 3.15) but we also get `FTS5` and `json1` as well. 
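A standalone way to check which of these SQLite features a given Python build actually provides - a probe in the same spirit as the `/-/versions` output, not the code behind it:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
print('SQLite', conn.execute('select sqlite_version()').fetchone()[0])

def has_fts(flavor):
    # Creating a virtual table is the simplest capability test
    try:
        conn.execute('create virtual table probe using {}(c)'.format(flavor))
        conn.execute('drop table probe')
        return True
    except sqlite3.OperationalError:
        return False

print('FTS:', [f for f in ('FTS5', 'FTS4', 'FTS3') if has_fts(f)])
try:
    conn.execute(""select json('[1]')"")
    print('json1: available')
except sqlite3.OperationalError:
    print('json1: missing')
```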
Refs #191 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/297/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 327459829,MDU6SXNzdWUzMjc0NTk4Mjk=,298,URLify URLs in results from custom SQL statements / views,9599,simonw,closed,0,,,,,2,2018-05-29T19:41:07Z,2018-07-24T04:53:20Z,2018-07-24T03:56:50Z,OWNER,,"Consider this custom query: https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3?sql=select+user%2C+%28%27https%3A%2F%2Ftwitter.com%2F%27+%7C%7C+user%29+as+user_url%2C+created_at%2C+text%2C+url+from+%5Btwitter-ratio%2Fsenators%5D+limit+10%3B ```select user, ('https://twitter.com/' || user) as user_url, created_at, text, url from [twitter-ratio/senators] limit 10;``` ![2018-05-29 at 12 38 pm](https://user-images.githubusercontent.com/9599/40681177-44a36d5c-633d-11e8-935b-c49dad7ac682.png) It would be nice if these URLs were turned into links, as happens on the table view page: https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3/twitter-ratio%2Fsenators ![2018-05-29 at 12 39 pm](https://user-images.githubusercontent.com/9599/40681206-5c69c47c-633d-11e8-9f3a-08899f8659b8.png) This currently does not happen because the table view render logic takes a different path through `display_columns_and_rows()` which includes this bit: https://github.com/simonw/datasette/blob/b0a95da96386ddf99816911e08df86178ffa9a89/datasette/views/table.py#L195-L202",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/298/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 327461381,MDU6SXNzdWUzMjc0NjEzODE=,299,Documentation covering ALL datasette URLs,9599,simonw,closed,0,,,,,1,2018-05-29T19:46:15Z,2018-07-28T04:24:05Z,2018-07-28T04:22:30Z,OWNER,,"Relates to #296. We need a single page of the docs listing all of the URL patterns Datasette responds to, also detailing which templates are used to render them and linking to examples of the JSON they output when called with `.json`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/299/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 327541975,MDU6SXNzdWUzMjc1NDE5NzU=,300,Hide sort select box on larger screens,9599,simonw,closed,0,,,,,0,2018-05-30T01:34:59Z,2018-05-31T14:43:13Z,2018-05-31T14:43:13Z,OWNER,,"On larger screens you can sort by clicking column headers, so no need to show the select box (which was added for the small screen layout that doesn't show headers)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/300/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 328155946,MDU6SXNzdWUzMjgxNTU5NDY=,301,"--spatialite option for ""datasette publish heroku""",9599,simonw,open,0,,,,,1,2018-05-31T14:13:09Z,2022-01-20T21:28:50Z,,OWNER,,Split off from #243. 
Need to figure out how to install and configure SpatiaLite on Heroku.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/301/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 328171513,MDU6SXNzdWUzMjgxNzE1MTM=,302,test-2.3.sqlite database filename throws a 404,9599,simonw,closed,0,,,3439337,0.23.1,2,2018-05-31T14:50:58Z,2018-06-21T15:21:17Z,2018-06-21T15:21:16Z,OWNER,,"The following almost works: datasette test-2.3.sqlite http://127.0.0.1:8001test-2.3-c88bc35/HighWays loads OK, but http://127.0.0.1:8001test-2.3-c88bc35 throws a 404: ![2018-05-31 at 7 50 am](https://user-images.githubusercontent.com/9599/40789434-447ae934-64a7-11e8-9a07-4eeba87147d5.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/302/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 328172521,MDU6SXNzdWUzMjgxNzI1MjE=,303,Support table names ending with .json or .csv,9599,simonw,closed,0,,,,,4,2018-05-31T14:53:23Z,2018-06-15T06:55:50Z,2018-06-15T06:55:50Z,OWNER,,"This is needed for #266 - if a table name ends with `.json` or `.csv` right now our URL pattern matching will do the wrong thing. We should be smarter about this. This does mean we will have some URLs that look like this: http://localhost:8001/dbname/weird.json - returning HTML, not JSON http://localhost:8001/dbname/weird.json.json - returning JSON http://localhost:8001/dbname/weird.json.csv - returning CSV ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/303/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 328229224,MDU6SXNzdWUzMjgyMjkyMjQ=,304,Ability to configure SQLite cache_size,9599,simonw,closed,0,,,,,3,2018-05-31T17:28:07Z,2018-06-04T16:13:32Z,2018-06-04T16:03:19Z,OWNER,,"See https://www.sqlite.org/pragma.html#pragma_cache_size Let's call the config setting `cache_size_kb` to emphasize that we're using the negative option. Note this warning: perhaps we should raise an error if you try to use this setting against a SQLite version prior to 3.7.10 > If the argument N is positive then the suggested cache size is set to N. If the argument N is negative, then the number of cache pages is adjusted to use approximately abs(N*1024) bytes of memory. Backwards compatibility note: The behavior of cache_size with a negative N was different in prior to version 3.7.10 (2012-01-16). 
In version 3.7.9 and earlier, the number of pages in the cache was set to the absolute value of N.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/304/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 329147284,MDU6SXNzdWUzMjkxNDcyODQ=,305,Add contributor guidelines to docs,9599,simonw,closed,0,,,,,2,2018-06-04T17:25:30Z,2019-06-24T06:40:19Z,2019-06-24T06:40:19Z,OWNER,,https://channels.readthedocs.io/en/latest/contributing.html is a nice example of this done well.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/305/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 329661905,MDU6SXNzdWUzMjk2NjE5MDU=,306,Custom URL routing with independent tests,9599,simonw,closed,0,,,,,5,2018-06-05T23:40:08Z,2018-06-07T15:29:28Z,2018-06-07T15:29:28Z,OWNER,,"The more I think about #303 the more I feel that Datasette's URL routing needs to go beyond Django-style regex matching. If we go custom, tests should live in `test_routing.py`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/306/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 330323860,MDExOlB1bGxSZXF1ZXN0MTkzMzYxMzQx,307,"Initial sketch of custom URL routing, refs #306",9599,simonw,closed,0,,,,,1,2018-06-07T15:26:48Z,2018-06-07T15:29:54Z,2018-06-07T15:29:41Z,OWNER,simonw/datasette/pulls/307,See #306 for background on this.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/307/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 330826972,MDU6SXNzdWUzMzA4MjY5NzI=,308,"Support extra Heroku apps:create options - region, space, team",78156,annapowellsmith,open,0,,,,,2,2018-06-08T23:08:33Z,2018-09-21T14:09:28Z,,NONE,,"It would be useful to document how to pass Heroku CLI options on `datasette publish`, e.g. 
`--region eu`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/308/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 331343824,MDU6SXNzdWUzMzEzNDM4MjQ=,309,On 404s with a trailing slash redirect to that page without a trailing slash,9599,simonw,closed,0,,,3439337,0.23.1,2,2018-06-11T20:46:49Z,2018-06-21T15:22:02Z,2018-06-21T15:13:15Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/309/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 332830309,MDU6SXNzdWUzMzI4MzAzMDk=,310,datasette publish now is broken in master,9599,simonw,closed,0,,,,,0,2018-06-15T16:01:14Z,2018-06-16T16:29:50Z,2018-06-16T16:29:50Z,OWNER,,"``` > gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -fPIC -I/usr/local/include/python3.6m -c httptools/parser/parser.c -o build/temp.linux-x86_64-3.6/httptools/parser/parser.o -O2 > unable to execute 'gcc': No such file or directory > error: command 'gcc' failed with exit status 1 > > ---------------------------------------- > Command ""/usr/local/bin/python -u -c ""import setuptools, tokenize;__file__='/tmp/pip-install-s73273rj/httptools/setup.py';f=getattr(tokenize, 'open', open)(__file__);code=f.read().replace('\r\n', '\n');f.close();exec(compile(code, __file__, 'exec'))"" install --record /tmp/pip-record-yha7dxqq/install-record.txt --single-version-externally-managed --compile"" failed with error code 1 in /tmp/pip-install-s73273rj/httptools/ ``` Turns out the `python-slim` base image I introduced in b18e4515855c3f1eeca3dfcccdbb6df05869084a doesn't include gcc: https://github.com/docker-library/python/issues/60#issuecomment-134322383",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/310/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 332998752,MDExOlB1bGxSZXF1ZXN0MTk1MzM5MTEx,311,"?_labels=1 to expand foreign keys (in csv and json), refs #233",9599,simonw,closed,0,,,,,2,2018-06-16T16:31:12Z,2018-06-16T22:20:31Z,2018-06-16T22:20:31Z,OWNER,simonw/datasette/pulls/311,"Output looks something like this: { ""rowid"": 233, ""TreeID"": 121240, ""qLegalStatus"": { ""value"": 2, ""label"": ""Private"" }, ""qSpecies"": { ""value"": 16, ""label"": ""Sycamore"" }, ""qAddress"": ""91 Commonwealth Ave"", ... 
}",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/311/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 333000163,MDU6SXNzdWUzMzMwMDAxNjM=,312,"HTML, CSV and JSON views should support ?_col=&_col=",9599,simonw,closed,0,,,,,1,2018-06-16T16:53:35Z,2021-06-17T18:14:24Z,2018-06-16T17:00:12Z,OWNER,,To support whitelisting columns to display.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/312/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 333086005,MDU6SXNzdWUzMzMwODYwMDU=,313,Deploy demo of Datasette on every commit that passes tests,9599,simonw,closed,0,,,,,6,2018-06-17T19:19:12Z,2018-06-17T21:52:58Z,2018-06-17T21:52:58Z,OWNER,,We can use Travis CI and Zeit Now to ensure there is always a live demo of current master. We can ship archived demos for releases as well.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/313/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 333096176,MDU6SXNzdWUzMzMwOTYxNzY=,314,HTML table does not correctly display entirely blank rows,9599,simonw,closed,0,,,3439337,0.23.1,1,2018-06-17T21:58:06Z,2018-06-21T16:04:59Z,2018-06-21T15:26:26Z,OWNER,,"https://958b75c.datasette.io/fixtures-35b6eb6/simple_view ![2018-06-17 at 2 56 pm](https://user-images.githubusercontent.com/9599/41512541-b52e90be-723e-11e8-95c9-7d091738d5cc.png) https://958b75c.datasette.io/fixtures-35b6eb6/simple_view.json shows the underlying data: ``` ""rows"": [ [ ""hello"", ""HELLO"" ], [ ""world"", ""WORLD"" ], [ """", """" ] ] ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/314/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 333120982,MDExOlB1bGxSZXF1ZXN0MTk1NDEzMjQx,315,Streaming mode for downloading all rows as a CSV,9599,simonw,closed,0,,,,,0,2018-06-18T03:06:59Z,2018-06-18T03:29:13Z,2018-06-18T03:21:02Z,OWNER,simonw/datasette/pulls/315,Refs #266,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/315/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 333238932,MDU6SXNzdWUzMzMyMzg5MzI=,316,datasette inspect takes a very long time on large dbs,132230,gavinband,closed,0,,,,,5,2018-06-18T11:56:27Z,2019-05-11T18:26:25Z,2019-05-11T18:26:25Z,NONE,,"Hi, I want to expose data in a very large sqlite database (~600Gb) to the web. I have used datasette with success on smaller test databases with the same schema - it works very well (thanks!). However, using the full db, both `datasette inspect` and `datasette serve` seem to hang or pause for a very long time (tens of minutes) on startup. Is this expected behaviour? (I noticed that the output of `datasette inspect` includes row counts for each table. Simply counting the rows in this db will take a long time (tens of millions of rows across each of ~10 tables), so I wondered if this is the source of the problem.) Any help on a workaround would be appreciated. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/316/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 333326107,MDU6SXNzdWUzMzMzMjYxMDc=,317,Travis CI fails to upload new releases to PyPI,9599,simonw,closed,0,,,3439337,0.23.1,2,2018-06-18T15:44:26Z,2018-06-21T15:45:47Z,2018-06-21T15:45:47Z,OWNER,,"https://travis-ci.org/simonw/datasette/jobs/393684139 ``` ... removing build/bdist.linux-x86_64/wheel Uploading distributions to https://upload.pypi.org/legacy/ Uploading datasette-0.23-py3-none-any.whl 100%|██████████| 201k/201k [00:00<00:00, 1.02MB/s] HTTPError: 403 Client Error: Invalid or non-existent authentication information. for url: https://upload.pypi.org/legacy/ ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/317/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 334148669,MDU6SXNzdWUzMzQxNDg2Njk=,318,Facets with value of 0 displayed incorrectly,9599,simonw,closed,0,,,3439337,0.23.1,1,2018-06-20T16:06:46Z,2019-05-29T21:39:12Z,2018-06-21T04:30:45Z,OWNER,,"https://registry.datasette.io/registry-7d4f81f/tables?_facet=is_hidden#facet-is_hidden ![2018-06-20 at 9 05 am](https://user-images.githubusercontent.com/9599/41670448-2c06e642-7469-11e8-86be-4664269582b1.png) Displays correctly if you select it: https://registry.datasette.io/registry-7d4f81f/tables?_facet=is_hidden&is_hidden=0 ![2018-06-20 at 9 06 am](https://user-images.githubusercontent.com/9599/41670471-3e61e486-7469-11e8-8710-5da90ef65787.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/318/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 334149717,MDU6SXNzdWUzMzQxNDk3MTc=,319,Incorrect display of compound primary keys with foreign key relationships,9599,simonw,closed,0,,,3439337,0.23.1,2,2018-06-20T16:09:36Z,2018-06-21T15:58:15Z,2018-06-21T14:56:41Z,OWNER,,"https://registry.datasette.io/registry-7d4f81f/datasette_tags ![2018-06-20 at 9 07 am](https://user-images.githubusercontent.com/9599/41670542-68cc4dec-7469-11e8-9521-3bbc6465eccb.png) Underlying JSON looks [like this](https://registry.datasette.io/registry-7d4f81f/datasette_tags.json?_labels=on): ``` { ""database"": ""registry"", ""table"": ""datasette_tags"", ""is_view"": false, ""human_description_en"": """", ""rows"": [ { ""datasette_id"": { ""value"": 1, ""label"": ""Global Power Plant Database"" }, ""tag"": { ""value"": ""geospatial"", ""label"": ""geospatial"" } }, ```` Bug is likely somewhere in here: https://github.com/simonw/datasette/blob/e04f5b0d348ef7275a0a5ab9eb53527105132885/datasette/views/table.py#L143-L207",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/319/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 334169932,MDU6SXNzdWUzMzQxNjk5MzI=,320,Need unit tests covering the different states for the advanced export box,9599,simonw,closed,0,,,,,1,2018-06-20T17:03:40Z,2018-07-24T04:53:20Z,2018-07-24T03:38:40Z,OWNER,,"There are quite a few variants of this box: ![2018-06-20 at 10 02 
am](https://user-images.githubusercontent.com/9599/41673229-1d423adc-7471-11e8-99d4-4251f7d03aa5.png) Test coverage should exercise all of them, since the logic is a little unclear. https://github.com/simonw/datasette/blob/fdfbbbb9ee0d02fd4d43dfc42382252fa2287d6d/datasette/templates/table.html#L140-L159",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/320/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 334190959,MDU6SXNzdWUzMzQxOTA5NTk=,321,Wildcard support in query parameters,12617395,bsilverm,closed,0,,,3439337,0.23.1,8,2018-06-20T18:03:56Z,2018-06-21T17:00:10Z,2018-06-21T04:55:26Z,NONE,,"I haven't found a way to get the wildcard (%) inserted automatically into a query parameter. This would be useful for cases where the query parameter is followed by a LIKE clause. Wrapping the parameter name using the wildcard character within the metadata file (i.e. ...where xyz like %:querystring%) does not seem to work. Can this be made possible? Or if not, can the template be extended to provide a tip to the user that they need to insert the wildcard characters themselves?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/321/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 334592281,MDExOlB1bGxSZXF1ZXN0MTk2NTI2ODYx,322,Feature/in operator,2691848,groundf,closed,0,,,,,0,2018-06-21T17:41:51Z,2018-06-21T17:45:25Z,2018-06-21T17:45:25Z,NONE,simonw/datasette/pulls/322,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/322/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 334698969,MDU6SXNzdWUzMzQ2OTg5Njk=,323,Speed up Travis CI builds,9599,simonw,closed,0,,,,,1,2018-06-21T23:55:27Z,2018-07-10T15:03:37Z,2018-07-10T15:03:36Z,OWNER,,"They've got a bit slow. Part of this is the Zeit Now deploy, but the build-and-test cycle is taking at least a couple of minutes. 
![2018-06-21 at 4 54 pm](https://user-images.githubusercontent.com/9599/41751010-e48c823e-7573-11e8-88f3-7aa8a7e53917.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/323/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 334731076,MDExOlB1bGxSZXF1ZXN0MTk2NjI4MzA0,324,Speed up Travis by reusing pip wheel cache across builds,9599,simonw,closed,0,,,,,0,2018-06-22T03:20:08Z,2018-06-24T01:03:47Z,2018-06-24T01:03:47Z,OWNER,simonw/datasette/pulls/324,From https://atchai.com/blog/faster-ci/ - refs #323 ,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/324/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 335064777,MDU6SXNzdWUzMzUwNjQ3Nzc=,325,Error on row page if table has slashes in the name and ends in .csv,9599,simonw,closed,0,,,,,1,2018-06-23T03:43:42Z,2018-07-09T17:28:27Z,2018-07-08T05:21:59Z,OWNER,,"https://v0-23-1.datasette.io/fixtures-e14e080/table%252Fwith%252Fslashes.csv/3 > no such table: table%252Fwith%252Fslashes.csv From clicking the row link on https://v0-23-1.datasette.io/fixtures-e14e080/table%2Fwith%2Fslashes.csv",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/325/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 335141434,MDU6SXNzdWUzMzUxNDE0MzQ=,326,CSV should respect --cors and return cors headers,9599,simonw,closed,0,,,,,1,2018-06-24T00:44:07Z,2021-06-17T18:14:24Z,2018-06-24T00:59:45Z,OWNER,,Otherwise tools like Vega can't load data via CSV.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/326/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 335200136,MDU6SXNzdWUzMzUyMDAxMzY=,327,Explore if SquashFS can be used to shrink size of packaged Docker containers,9599,simonw,open,0,,,,,4,2018-06-24T18:15:16Z,2022-02-17T23:37:24Z,,OWNER,,"Inspired by this article: https://cldellow.com/2018/06/22/sqlite-parquet-vtable.html#sqlite-database-indexed--squashed https://en.wikipedia.org/wiki/SquashFS is ""a compressed read-only file system for Linux"" - which means it could be a really nice fit for Datasette and its read-only SQLite databases. 
It would be interesting to explore a Dockerfile recipe that used SquashFS to compress the SQLite database file that was bundled up by `datasette package` and friends.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/327/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 336464733,MDU6SXNzdWUzMzY0NjQ3MzM=,328,"Installation instructions, including how to use the docker image",9599,simonw,closed,0,,,,,3,2018-06-28T03:59:33Z,2018-10-05T06:37:07Z,2018-06-28T04:02:10Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/328/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 336465018,MDU6SXNzdWUzMzY0NjUwMTg=,329,Travis should push tagged images to Docker Hub for each release,9599,simonw,closed,0,,,,,7,2018-06-28T04:01:31Z,2018-11-05T06:54:10Z,2018-11-05T06:53:28Z,OWNER,,https://sebest.github.io/post/using-travis-ci-to-build-docker-images/,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/329/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 336924199,MDU6SXNzdWUzMzY5MjQxOTk=,330,Limit text display in cells containing large amounts of text,82988,psychemedia,closed,0,,,,,4,2018-06-29T09:15:22Z,2018-07-24T04:53:20Z,2018-07-10T16:20:48Z,CONTRIBUTOR,,"The default preview of a database shows all columns (is the row count limited?) which is fine in many cases but can take a long time to load / offer a large overhead if the table is a SpatiaLite table containing geometry columns that include large shapefiles. Would it make sense to have a setting that can limit the amount of text displayed in any given cell in the table preview, or (less useful?) suppress (with notification) the display of overlong columns unless enabled by the user? An issue then arises if a user does want to see all the text in a cell: 1) for a particular cell; 2) for every cell in the table; 3) for all cells in a particular column or columns (I haven't checked but what if a column contains e.g. raw image data? Does this display as raw data? Or can this be rendered in a context aware way as an image preview? I guess a custom template would be one way to do that?)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/330/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 336936010,MDU6SXNzdWUzMzY5MzYwMTA=,331,Datasette throws error when loading spatialite db without extension loaded,82988,psychemedia,closed,0,,,,,2,2018-06-29T09:51:14Z,2022-01-20T21:29:40Z,2018-07-10T15:13:36Z,CONTRIBUTOR,,"When starting datasette on a SpatiaLite database *without* loading the SpatiaLite extension (using eg `--load-extension=/usr/local/lib/mod_spatialite.dylib`) an error is thrown and the server fails to start: ``` datasette -p 8003 adminboundaries.db Serve! 
files=('adminboundaries.db',) on port 8003 Traceback (most recent call last): File ""/Users/ajh59/anaconda3/bin/datasette"", line 11, in <module> sys.exit(cli()) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 722, in __call__ return self.main(*args, **kwargs) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 697, in main rv = self.invoke(ctx) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 1066, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 895, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/click/core.py"", line 535, in invoke return callback(*args, **kwargs) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/cli.py"", line 552, in serve ds.inspect() File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/app.py"", line 273, in inspect ""tables"": inspect_tables(conn, self.metadata.get(""databases"", {}).get(name, {})) File ""/Users/ajh59/anaconda3/lib/python3.6/site-packages/datasette/inspect.py"", line 79, in inspect_tables ""PRAGMA table_info({});"".format(escape_sqlite(table)) sqlite3.OperationalError: no such module: VirtualSpatialIndex ``` It would be nice to trap this and return a message saying something like: ``` It looks like you're trying to load a SpatiaLite database? Make sure you load in the SpatiaLite extension when starting datasette. Read more: https://datasette.readthedocs.io/en/latest/spatialite.html ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/331/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 337141108,MDU6SXNzdWUzMzcxNDExMDg=,332,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1,9599,simonw,closed,0,,,,,12,2018-06-29T21:21:27Z,2018-07-24T04:53:20Z,2018-07-24T03:08:30Z,OWNER,,"It turns out if you load this CSV using `csvs-to-sqlite` you get an Infinity value in SQLite: ``` name,num sasha,10 terry,Inf cathy,0.5 ``` `csvs-to-sqlite infinity-bug.csv infinity-bug.db` I deployed this using: ``` datasette publish now infinity-bug.db --name=datasette-infinity-bug --install=datasette-vega ``` Datasette outputs that as `Infinity` in the JSON format, which causes JavaScript errors. Demo * https://datasette-infinity-bug.now.sh/infinity-bug-0d0224e/infinity-bug - HTML view works * https://datasette-infinity-bug.now.sh/infinity-bug-0d0224e/infinity-bug.json?_shape=array - this outputs the following: ``` [ { ""rowid"": 1, ""name"": ""sasha"", ""num"": 10.0 }, { ""rowid"": 2, ""name"": ""terry"", ""num"": Infinity }, { ""rowid"": 3, ""name"": ""cathy"", ""num"": 0.5 } ] ``` But... 
in Firefox that gets rendered like this: ![2018-06-29 at 4 20 pm](https://user-images.githubusercontent.com/9599/42115408-5d30f630-7bb8-11e8-8370-c8484801c49b.png) And if you click the ""Show charting options"" button you get this error in the console: ``` SyntaxError: JSON.parse: unexpected character at line 1 column 83 of the JSON data ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/332/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 338768551,MDU6SXNzdWUzMzg3Njg1NTE=,333,Datasette on Zeit Now returns http URLs for facet and next links,9599,simonw,closed,0,,,,,4,2018-07-06T00:40:49Z,2018-07-24T04:53:20Z,2018-07-24T01:51:53Z,OWNER,,"e.g. on https://fivethirtyeight.datasettes.com/fivethirtyeight-ac35616/nba-elo%2Fnbaallelo.json?_facet=lg_id&_size=0 ``` { ""facet_results"": { ""lg_id"": { ""name"": ""lg_id"", ""results"": [ { ""value"": ""NBA"", ""label"": ""NBA"", ""count"": 118016, ""toggle_url"": ""http://fivethirtyeight.datasettes.com/fivethirtyeight-ac35616/nba-elo%2Fnbaallelo.json?_facet=lg_id&_size=1&lg_id=NBA"", ""selected"": false }, { ""value"": ""ABA"", ""label"": ""ABA"", ""count"": 8298, ""toggle_url"": ""http://fivethirtyeight.datasettes.com/fivethirtyeight-ac35616/nba-elo%2Fnbaallelo.json?_facet=lg_id&_size=1&lg_id=ABA"", ""selected"": false } ], ""truncated"": false } }, ""suggested_facets"": [ { ""name"": ""_iscopy"", ""toggle_url"": ""/fivethirtyeight-ac35616/nba-elo%2Fnbaallelo.json?_facet=lg_id&_size=1&_facet=_iscopy"" } ], ""next_url"": ""http://fivethirtyeight.datasettes.com/fivethirtyeight-ac35616/nba-elo%2Fnbaallelo.json?_facet=lg_id&_size=1&_next=1"", } ``` `next_url` and `facet_results` both link to `http://` when they should link to `https://`. Note that suggested facets doesn't include the full URL at all, which is a consistency bug.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/333/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 339095976,MDU6SXNzdWUzMzkwOTU5NzY=,334,extra_options not passed to heroku publisher,719357,kamicut,closed,0,,,,,2,2018-07-06T23:26:12Z,2018-07-24T04:53:21Z,2018-07-10T01:46:04Z,NONE,,"I might be wrong but I was not able to publish to `heroku` with `--extra-options`, I think `extra_options` is not being used in this function [here](https://github.com/simonw/datasette/blob/master/datasette/utils.py#L369). Any help appreciated! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/334/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 339505204,MDU6SXNzdWUzMzk1MDUyMDQ=,335,Package datasette for installation using homebrew,9599,simonw,closed,0,,,,,12,2018-07-09T15:45:03Z,2020-08-11T16:54:06Z,2020-08-11T16:54:06Z,OWNER,,"https://docs.brew.sh/Python-for-Formula-Authors describes how. > Applications should be installed into a Python virtualenv environment rooted in libexec. This prevents the app’s Python modules from contaminating the system site-packages and vice versa. 
It recommends using https://github.com/tdsmith/homebrew-pypi-poet",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/335/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 340039409,MDU6SXNzdWUzNDAwMzk0MDk=,336,Ensure --help examples in docs are always up to date,9599,simonw,closed,0,,,,,3,2018-07-10T23:20:01Z,2018-07-24T16:01:29Z,2018-07-24T16:01:29Z,OWNER,,"Ideally I would automatically generate the --help output shown in our docs, but I don't think I can get that working with readthedocs. Instead, I'm going to add a unit test that checks that those extracts in the documentation match the current output of the --help command.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/336/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 340065374,MDU6SXNzdWUzNDAwNjUzNzQ=,337,Documentation for datasette publish and datasette package,9599,simonw,closed,0,,,,,1,2018-07-11T02:04:06Z,2018-07-11T02:07:32Z,2018-07-11T02:05:56Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/337/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 340282796,MDU6SXNzdWUzNDAyODI3OTY=,338,Only load vegaEmbed if charting tools are enabled,9599,simonw,closed,0,,,,,1,2018-07-11T15:02:14Z,2018-07-11T15:21:47Z,2018-07-11T15:21:47Z,OWNER,,"vegaEmbed is a LOT of code (it bundles d3) Inspired by this tweet: https://twitter.com/thelarkinn/status/1017053567641948162 - it would be great if we loaded that code on demand the first time the ""Show chart options"" button was clicked, or when the page loads with #g. options in the URL. Even better: avoid the overhead of loading React unless the chart options need to be displayed. This would be a pretty major refactoring though.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/338/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 340396247,MDU6SXNzdWUzNDAzOTYyNDc=,339,Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way,12617395,bsilverm,closed,0,,,,,4,2018-07-11T20:38:06Z,2022-03-21T22:22:40Z,2022-03-21T22:22:34Z,NONE,,"Is it possible to configure the sql_time_limit_ms beyond 60 seconds? It seems queries are still timing out at 60 seconds when sql_time_limit_ms is set to 180000. We have a very large data set and often encounter timeouts when testing new queries from the datasette UI. We are optimizing our database as much as we can, but still may require more than 60 seconds for complex queries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/339/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 340730961,MDU6SXNzdWUzNDA3MzA5NjE=,340,Embrace black,9599,simonw,closed,0,,,,,1,2018-07-12T17:32:29Z,2019-06-24T06:50:27Z,2019-06-24T06:50:26Z,OWNER,,"Run [black](https://github.com/ambv/black) against everything. Then set up CI to fail if code doesn't conform to black's style. 
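One way to wire that in would be a unit test that shells out to black and fails on any unformatted file (a sketch on my part, assuming black is installed in the test environment):

```python
import subprocess

def test_code_is_formatted_with_black():
    # --check exits non-zero if black would reformat anything,
    # without actually rewriting the files
    result = subprocess.run(
        ['black', 'datasette', 'tests', '--check'],
        stdout=subprocess.PIPE, stderr=subprocess.PIPE,
    )
    assert result.returncode == 0, result.stderr.decode()
```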
Here's how Starlette does this: * https://github.com/encode/starlette/blob/e3d090b3597167f7b3a4f76e4bb3c0d3e94be61a/.travis.yml#L14 * https://github.com/encode/starlette/blob/e3d090b3597167f7b3a4f76e4bb3c0d3e94be61a/scripts/lint - essentially runs `black starlette tests --check` And here's an example of a test run that failed: https://travis-ci.org/encode/starlette/jobs/403172478",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/340/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 340733753,MDExOlB1bGxSZXF1ZXN0MjAxMDc1NTMy,341,Bump aiohttp to fix compatibility with Python 3.7,9599,simonw,closed,0,,,,,0,2018-07-12T17:41:24Z,2018-07-12T18:07:38Z,2018-07-12T18:07:38Z,OWNER,simonw/datasette/pulls/341,Tests failed here: https://travis-ci.org/simonw/datasette/jobs/403223333,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/341/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 341123355,MDU6SXNzdWUzNDExMjMzNTU=,342,Requesting support for query description,12617395,bsilverm,closed,0,,,,,4,2018-07-13T18:50:16Z,2018-07-24T04:53:21Z,2018-07-16T02:33:54Z,NONE,,"It would be great if the metadata file allowed you to enter a description for the query. We have a lot of pre-defined queries that can only be so descriptive by their name. It would be nice if an optional description could be included underneath the name within the UI, or on hover where it currently shows the SQL.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/342/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 341228846,MDU6SXNzdWUzNDEyMjg4NDY=,343,Render boolean fields better by default,45057,russss,open,0,,,,,1,2018-07-14T11:10:29Z,2018-07-14T14:17:14Z,,CONTRIBUTOR,,These show up as 0 or 1 because sqlite. I think Yes/No would be fine in most cases?,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/343/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 341229113,MDU6SXNzdWUzNDEyMjkxMTM=,344,datasette publish heroku fails without name provided,45057,russss,closed,0,,,,,1,2018-07-14T11:15:56Z,2018-07-14T13:00:48Z,2018-07-14T13:00:48Z,CONTRIBUTOR,,"It fails with the following JSON traceback if the `-n` option isn't provided, despite the fact that the command line help says that's not needed for heroku publishes.
``` Traceback (most recent call last): File ""/usr/local/bin/datasette"", line 11, in <module> sys.exit(cli()) File ""/usr/local/lib/python3.6/site-packages/click/core.py"", line 722, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.6/site-packages/click/core.py"", line 697, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.6/site-packages/click/core.py"", line 1066, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.6/site-packages/click/core.py"", line 895, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.6/site-packages/click/core.py"", line 535, in invoke return callback(*args, **kwargs) File ""/usr/local/lib/python3.6/site-packages/datasette/cli.py"", line 265, in publish app_name = json.loads(create_output)[""name""] File ""/usr/local/Cellar/python/3.6.5/Frameworks/Python.framework/Versions/3.6/lib/python3.6/json/__init__.py"", line 354, in loads return _default_decoder.decode(s) File ""/usr/local/Cellar/python/3.6.5/Frameworks/Python.framework/Versions/3.6/lib/python3.6/json/decoder.py"", line 339, in decode obj, end = self.raw_decode(s, idx=_w(s, 0).end()) File ""/usr/local/Cellar/python/3.6.5/Frameworks/Python.framework/Versions/3.6/lib/python3.6/json/decoder.py"", line 357, in raw_decode raise JSONDecodeError(""Expecting value"", s, err.value) from None json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0) ```
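A guard along these lines around the `json.loads()` call would at least surface Heroku's actual output instead of a JSON traceback (just a sketch of what I mean, untested):

```python
import json

import click

def parse_created_app_name(create_output):
    # heroku apps:create only emits JSON when it succeeds here; usage
    # errors come back as plain text, which should be shown to the user
    try:
        return json.loads(create_output)['name']
    except ValueError:
        raise click.ClickException(
            'Unexpected output from heroku apps:create: ' + repr(create_output)
        )
```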
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/344/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 341235633,MDExOlB1bGxSZXF1ZXN0MjAxNDUxMzMy,345,Allow app names for `datasette publish heroku`,45057,russss,closed,0,,,,,1,2018-07-14T13:12:34Z,2018-07-14T14:09:54Z,2018-07-14T14:04:44Z,CONTRIBUTOR,simonw/datasette/pulls/345,"Lets you supply the `-n` parameter for Heroku deploys, which also lets you update existing Heroku deployments.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/345/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 343728754,MDU6SXNzdWUzNDM3Mjg3NTQ=,346,Logo design for DATASETTE,35750428,ggabogarcia,closed,0,,,,,0,2018-07-23T17:40:17Z,2018-08-02T02:31:59Z,2018-08-02T02:31:59Z,NONE,,"Hello :) , I'm a graphic designer, I'm interested in collaborating with open source projects, besides this helps me expand my portfolio. I would like to design a logo for your project. I will be happy to collaborate with you :). ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/346/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 344654623,MDU6SXNzdWUzNDQ2NTQ2MjM=,347,"Rename ""datasette package"" to ""datasette publish docker""",9599,simonw,open,0,,,,,0,2018-07-26T00:42:46Z,2018-07-26T00:42:46Z,,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/347/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 344656114,MDU6SXNzdWUzNDQ2NTYxMTQ=,348,"Unit tests for ""datasette publish""",9599,simonw,closed,0,,,,,1,2018-07-26T00:52:23Z,2018-07-26T05:46:10Z,2018-07-26T05:46:10Z,OWNER,,"The datasette publish family of commands all work by shelling out to heroku/now/docker from subprocess import call, check_output So in tests I should be able to mock those calls: @mock.patch('subprocess.call') ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/348/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 344695978,MDExOlB1bGxSZXF1ZXN0MjA0MDI5MTQy,349,"publish_subcommand hook + default plugins mechanism, used for publish heroku/now",9599,simonw,closed,0,,,,,1,2018-07-26T05:03:22Z,2018-07-26T05:28:54Z,2018-07-26T05:16:00Z,OWNER,simonw/datasette/pulls/349,"This change introduces a new plugin hook, publish_subcommand, which can be used to implement new subcommands for the ""datasette publish"" command family. I've used this new hook to refactor out the ""publish now"" and ""publish heroku"" implementations into separate modules. I've also added unit tests for these two publishers, mocking the subprocess.call and subprocess.check_output functions. As part of this, I introduced a mechanism for loading default plugins. 
The default plugins are defined in the new ""default_plugins"" list inside datasette/app.py Closes #217 (Plugin support for ""datasette publish"") Closes #348 (Unit tests for ""datasette publish"") Refs #14, #59, #102, #103, #146, #236, #347",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/349/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 344701755,MDU6SXNzdWUzNDQ3MDE3NTU=,350,Don't list default plugins on /-/plugins,9599,simonw,closed,0,,,,,2,2018-07-26T05:38:00Z,2018-08-28T17:13:50Z,2018-08-28T16:48:19Z,OWNER,,"https://dbbe707.datasette.io/-/plugins is showing ""datasette.publish.now"" and ""datasette.publish.heroku""",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/350/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 345469355,MDU6SXNzdWUzNDU0NjkzNTU=,351,Automatically create a GitHub release linking to release notes for every tagged release,9599,simonw,closed,0,,,,,1,2018-07-28T18:31:12Z,2020-05-28T18:56:16Z,2020-05-28T18:56:15Z,OWNER,,"Can use this API called from Travis: https://developer.github.com/v3/repos/releases/#create-a-release The release it generates should look like this one: https://github.com/simonw/datasette/releases/tag/0.24",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/351/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 345821500,MDU6SXNzdWUzNDU4MjE1MDA=,352,render_cell(value) plugin hook,9599,simonw,closed,0,,,,,4,2018-07-30T15:56:20Z,2020-02-10T16:18:58Z,2018-08-05T00:14:57Z,OWNER,,To allow plugins to customize how values matching a specific pattern are displayed in the HTML table view.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/352/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 345821778,MDExOlB1bGxSZXF1ZXN0MjA0ODUxNTEx,353,render_cell(value) plugin hook,9599,simonw,closed,0,,,,,0,2018-07-30T15:57:08Z,2018-08-05T00:14:57Z,2018-08-05T00:14:57Z,OWNER,simonw/datasette/pulls/353,Closes #352.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/353/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 346026869,MDU6SXNzdWUzNDYwMjY4Njk=,354,Handle many-to-many relationships,9599,simonw,open,0,,,,,0,2018-07-31T04:03:13Z,2020-11-24T19:51:18Z,,OWNER,,This is a master tracking ticket for various many-2-many features.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/354/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 346027040,MDU6SXNzdWUzNDYwMjcwNDA=,355,Table view should support filtering via many-to-many relationships,9599,simonw,open,0,,,,,10,2018-07-31T04:04:16Z,2019-05-23T06:04:03Z,,OWNER,,Parent: #354 ,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/355/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, 
""rocket"": 0, ""eyes"": 0}",, 346028655,MDU6SXNzdWUzNDYwMjg2NTU=,356,Ability to display facet counts for many-to-many relationships,9599,simonw,closed,0,,,,,4,2018-07-31T04:14:26Z,2019-05-29T21:39:12Z,2019-05-25T16:30:09Z,OWNER,,Parent: #354,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/356/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 347058326,MDExOlB1bGxSZXF1ZXN0MjA1NzcwOTk2,1,Make .indexes compatible with older SQLite versions,9599,simonw,closed,0,,,,,0,2018-08-02T15:17:05Z,2018-08-02T15:17:30Z,2018-08-02T15:17:30Z,OWNER,simonw/sqlite-utils/pulls/1,Older SQLite versions return a different set of columns from the PRAGMA we are using.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 348043884,MDU6SXNzdWUzNDgwNDM4ODQ=,357,Plugin hook for loading metadata.json,9599,simonw,open,0,,,,,6,2018-08-06T19:00:01Z,2020-06-21T22:19:58Z,,OWNER,,"For https://github.com/simonw/russian-ira-facebook-ads-datasette/tree/af6d956995e14afd585c35a6a06bb01da32043ba I wrote a script to convert YAML to JSON because YAML is a better format for embedding multi-line HTML descriptions and canned SQL statements. Example yaml metadata file: https://github.com/simonw/russian-ira-facebook-ads-datasette/blob/af6d956995e14afd585c35a6a06bb01da32043ba/russian-ads-metadata.yaml It would be useful if Datasette could be fed a YAML file directly: datasette -m metadata.yaml Question is... should this be a native feature (hence adding a YAML dependency) or should it be handled by a `datasette-metadata-yaml` plugin, using a new plugin hook for loading metadata? If so, what would other use-cases for that plugin hook be?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/357/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 348534997,MDExOlB1bGxSZXF1ZXN0MjA2ODYzODAz,358,"Bump versions of pytest, pluggy and beautifulsoup4",9599,simonw,closed,0,,,,,0,2018-08-08T00:44:38Z,2018-08-08T01:11:13Z,2018-08-08T01:11:13Z,OWNER,simonw/datasette/pulls/358,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/358/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 349827640,MDU6SXNzdWUzNDk4Mjc2NDA=,359,Faceted browse against a JSON list of tags,9599,simonw,closed,0,,,,,6,2018-08-12T17:01:14Z,2019-05-29T21:39:12Z,2019-05-03T00:21:44Z,OWNER,,"If a table has a `[""foo"", ""bar"", ""baz""]` JSON column allow that to be faceted against. 
- [x] Support `?column__arraycontains=x` filter queries - [x] Support `?_facet_array=column` faceting",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/359/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 349850687,MDU6SXNzdWUzNDk4NTA2ODc=,2,Mechanism for adding foreign keys to an existing table,9599,simonw,closed,0,,,,,1,2018-08-12T22:50:56Z,2019-02-24T21:34:41Z,2019-02-24T21:34:41Z,OWNER,,"SQLite does not have ALTER TABLE support for adding new foreign keys... but it turns out it's possible to make these changes without having to duplicate the entire table by carefully running `UPDATE sqlite_master SET sql=... WHERE type='table' AND name='X';` Here's how Django does it: https://github.com/django/django/blob/d3449faaa915a08c275b35de01e66a7ef6bdb2dc/django/db/backends/sqlite3/schema.py#L103-L125 And here's the official documentation about this: https://sqlite.org/lang_altertable.html#otheralter (scroll to the very bottom of the page)",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 351017129,MDU6SXNzdWUzNTEwMTcxMjk=,360,Use pysqlite3 if available,9599,simonw,closed,0,,,,,3,2018-08-16T00:50:45Z,2018-08-16T01:50:42Z,2018-08-16T00:58:58Z,OWNER,,[pysqlite3](https://github.com/coleifer/pysqlite3) is a way to provide access to a more recent version of SQLite than the standard library `sqlite3` module (which tends to use the version provided with the operating system - which on e.g. the Travis CI Ubuntu build environment can be as old as 3.8.0).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/360/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 351017365,MDExOlB1bGxSZXF1ZXN0MjA4NzE5MDQz,361," Import pysqlite3 if available, closes #360 ",9599,simonw,closed,0,,,,,0,2018-08-16T00:52:21Z,2018-08-16T00:58:57Z,2018-08-16T00:58:57Z,OWNER,simonw/datasette/pulls/361,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/361/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 351845423,MDU6SXNzdWUzNTE4NDU0MjM=,3,Experiment with contentless FTS tables,9599,simonw,closed,0,,,,,1,2018-08-18T19:31:01Z,2019-07-22T20:58:55Z,2019-07-22T20:58:55Z,OWNER,,Could greatly reduce size of resulting database for large datasets: http://cocoamine.net/blog/2015/09/07/contentless-fts4-for-large-immutable-documents/,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 352768017,MDU6SXNzdWUzNTI3NjgwMTc=,362,Add option to include/exclude columns in search filters,78156,annapowellsmith,open,0,,,,,1,2018-08-22T01:32:08Z,2020-11-03T19:01:59Z,,NONE,,"I have a dataset with many columns, of which only some are likely to be of interest for searching. It would be great for usability if the search filters in the UI could be configured to include/exclude columns. 
See also: https://github.com/simonw/datasette/issues/292",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/362/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 355299310,MDExOlB1bGxSZXF1ZXN0MjExODYwNzA2,363,Search all apps during heroku publish,436032,kevboh,open,0,,,,,1,2018-08-29T19:25:10Z,2018-08-31T14:39:45Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/363,Adds the `-A` option to include apps from all organizations when searching app names for publish.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/363/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 359075028,MDExOlB1bGxSZXF1ZXN0MjE0NjUzNjQx,364,Support for other types of databases using external connectors,11912854,jsancho-gpl,open,0,,,,,0,2018-09-11T14:31:47Z,2018-09-11T14:31:47Z,,FIRST_TIME_CONTRIBUTOR,simonw/datasette/pulls/364,"This PR is related to #293, but now all commits have been merged. The purpose is to support other file formats that aren't SQLite, like files with PyTables format. I've tried to accomplish that using external connectors published with entry points. The modifications in the original datasette code are minimal and many are in a separate file.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/364/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 361764460,MDExOlB1bGxSZXF1ZXN0MjE2NjUxMzE3,365,fix small doc typo,418191,jaywgraves,closed,0,,,,,2,2018-09-19T14:02:02Z,2019-12-19T02:30:33Z,2018-09-19T17:15:43Z,CONTRIBUTOR,simonw/datasette/pulls/365,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/365/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 369716228,MDU6SXNzdWUzNjk3MTYyMjg=,366,Default built image size over Zeit Now 100MiB limit,416374,gfrmin,closed,0,,,,,2,2018-10-12T21:27:17Z,2018-11-05T06:23:32Z,2018-11-05T06:23:32Z,CONTRIBUTOR,,"Using `datasette publish now` with no other custom options on a small (43KB) sqlite database leads to the error ""The built image size (373.5M) exceeds the 100MiB limit"". I think this is because of a recent Zeit change: https://github.com/zeit/now-cli/issues/1523",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/366/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 374675798,MDExOlB1bGxSZXF1ZXN0MjI2MzE0ODYy,367,Mark codemirror files as vendored,48517,jaap3,closed,0,,,,,2,2018-10-27T18:41:25Z,2019-05-03T21:12:09Z,2019-05-03T21:11:20Z,CONTRIBUTOR,simonw/datasette/pulls/367,"GitHub lists datasette as a Javascript project, primarily because of the vendored codemirror files. This is somewhat confusing when you're looking for datasette, knowing it's written in Python. 
Luckily it's possible to exclude certain files from GitHub's code statistics: https://github.com/github/linguist#using-gitattributes",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/367/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 374676773,MDExOlB1bGxSZXF1ZXN0MjI2MzE1NTEz,368,Update installation instructions,48517,jaap3,closed,0,,,,,0,2018-10-27T18:52:31Z,2019-05-03T18:18:43Z,2019-05-03T18:18:42Z,CONTRIBUTOR,simonw/datasette/pulls/368,"I was writing this as a response to your tweet, but decided I might just make it a pull request. I feel like it might be confusing to those unfamiliar with Python's `-m` flag and the built-in `venv` module to omit the space between the flag and its argument. By adding a space and prefixing the second occurrence of `venv` with a `./` it's maybe a bit clearer what the arguments are and what they do. By also using `python3 -m pip` it becomes even clearer that `-m` is a special flag that makes the python executable do neat things.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/368/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 374953006,MDU6SXNzdWUzNzQ5NTMwMDY=,369,Interface should show same JSON shape options for custom SQL queries,416374,gfrmin,open,0,,,3268330,Datasette 1.0,2,2018-10-29T10:39:15Z,2020-05-30T17:24:06Z,,CONTRIBUTOR,,"At the moment the page returning a custom SQL query shows the JSON and CSV APIs, but not the multiple JSON shapes. However, adding the `_shape` parameter to the JSON API URL manually still works, so perhaps there should be consistency in the interface by having the same ""Advanced Export"" box for custom SQL queries.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/369/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 377155320,MDU6SXNzdWUzNzcxNTUzMjA=,370,Integration with JupyterLab,82988,psychemedia,open,0,,,,,4,2018-11-04T13:57:13Z,2022-09-29T08:17:47Z,,CONTRIBUTOR,,"I just watched a demo video for the [JupyterLab Chart Editor](https://www.crowdcast.io/e/introducing-JupyterLab-Chart-Editor/) which wraps the plotly chart editor app in a JupyterLab panel and lets you open a plotly chart JSON file in that editor. Essentially, it pops an HTML app into a panel in JupyterLab, and I think registers the app as a file viewer for a particular file type. (I'm not completely taken by it, tbh, because it means you can do irreproducible things to the chart definition file, but that's another issue). JupyterLab extensions can also open files from a dialogue as the iframe/html previewer shows: https://github.com/timkpaine/jupyterlab_iframe. This made me wonder about what `datasette` integration with JupyterLab might do. For example, by right-clicking on a CSV file (for which there is already a CSV table view) in the file browser, offer a *View / Run as datasette* file viewer option that will: - run the CSV file through `csvs-to-sqlite`; - launch the `datasette` server and display the `datasette` view in a JupyterLab panel. (? Create a new SQLite db for each CSV file and launch each datasette view on a new port? Or have a JupyterLab (session?) SQLite db that stores all `datasette` viewed CSVs and runs on a single port?) 
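The in-notebook query step might then look something like this (a sketch on my part; the session database name and table name are invented):

```python
import sqlite3

import pandas as pd

# hypothetical SQLite file built by csvs-to-sqlite from the viewed CSVs
conn = sqlite3.connect('jupyterlab-session.db')
df = pd.read_sql('select * from my_csv_table limit 10', conn)
```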
As a freebie, the `datasette` API would allow you to run efficient SQL queries against the file eg using `pandas.read_sql()` queries (as sketched above) in a notebook in the same space. Related: - [JupyterLab extensions docs](https://jupyterlab.readthedocs.io/en/stable/user/extensions.html) - a [cookiecutter for writing JupyterLab extensions using Javascript](https://github.com/jupyterlab/extension-cookiecutter-js) - a [cookiecutter for writing JupyterLab extensions using Typescript](https://github.com/jupyterlab/extension-cookiecutter-ts) - tutorial: [Let’s Make an xkcd JupyterLab Extension](https://jupyterlab.readthedocs.io/en/stable/developer/xkcd_extension_tutorial.html)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/370/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 377156339,MDU6SXNzdWUzNzcxNTYzMzk=,371,datasette publish digitalocean plugin,82988,psychemedia,closed,0,,,,,3,2018-11-04T14:07:41Z,2021-01-04T20:14:28Z,2021-01-04T20:14:28Z,CONTRIBUTOR,,"Provide support for launching `datasette` on Digital Ocean. Example: [Deploy Docker containers into Digital Ocean](https://blog.machinebox.io/deploy-machine-box-in-digital-ocean-385265fbeafd). Digital Ocean also has a preconfigured VM running Docker that can be launched from the command line via the Digital Ocean API: [Docker One-Click Application](https://www.digitalocean.com/docs/one-clicks/docker/). Related: - Launching containers in Digital Ocean servers running docker: [How To Provision and Manage Remote Docker Hosts with Docker Machine on Ubuntu 16.04](https://www.digitalocean.com/community/tutorials/how-to-provision-and-manage-remote-docker-hosts-with-docker-machine-on-ubuntu-16-04) - [How To Use Doctl, the Official DigitalOcean Command-Line Client](https://www.digitalocean.com/community/tutorials/how-to-use-doctl-the-official-digitalocean-command-line-client)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/371/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 377166793,MDU6SXNzdWUzNzcxNjY3OTM=,372,Docker build tools,82988,psychemedia,open,0,,,,,0,2018-11-04T16:02:35Z,2018-11-04T16:02:35Z,,CONTRIBUTOR,,"In terms of small pieces lightly joined, I note that there are several tools starting to appear for generating Dockerfiles and building Docker containers from simpler components such as `requirements.txt` files. If plugin/extensions builders want to include additional packages, then things like incremental or composable builds that add additional items into a base `datasette` container may be required. 
Examples of Dockerfile generators / container builders: - [openshift/source-to-image (s2i)](https://github.com/openshift/source-to-image) - [jupyter/repo2docker](https://github.com/jupyter/repo2docker) - [stencila/dockter](https://github.com/stencila/dockter) Discussions / threads (via Binderhub gitter) on: - [why `repo2docker` not `s2i`](http://words.yuvi.in/post/why-not-s2i/) - [why `dockter` not `repo2docker`](https://twitter.com/choldgraf/status/1058499607309647872) - [composability in `s2i`](https://trello.com/c/AexIVZNf/1008-8-composable-builds-builds-evg) Relates to things like: - https://github.com/simonw/datasette/pull/280",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/372/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 2, ""rocket"": 0, ""eyes"": 0}",, 377266351,MDU6SXNzdWUzNzcyNjYzNTE=,373,Views should be shown on root/index page along with tables,416374,gfrmin,closed,0,,,4305096,0.28,1,2018-11-05T06:28:41Z,2019-05-16T00:29:22Z,2019-05-16T00:29:22Z,CONTRIBUTOR,,"At the moment the number of views is given on a datasette ""homepage"", but there are no links to the views themselves",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/373/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 377518499,MDU6SXNzdWUzNzc1MTg0OTk=,374,Get Datasette working with Zeit Now v2's 100MB image size limit,9599,simonw,closed,0,,,,,5,2018-11-05T18:08:29Z,2018-12-19T01:35:59Z,2018-12-19T01:35:59Z,OWNER,,"Follow-on from #366 Zeit Now's v2 cloud has a 100MB size limit on Docker images, in order to support much faster wake-ups of new instances. Fitting Datasette AND the SQLite database it is hosting in here is going to be a challenge.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/374/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 382471625,MDExOlB1bGxSZXF1ZXN0MjMyMTcyMTA2,389,Bump dependency versions,9599,simonw,closed,0,,,,,2,2018-11-20T02:23:12Z,2019-11-13T19:13:41Z,2019-11-13T19:13:41Z,OWNER,simonw/datasette/pulls/389,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/389/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 386459810,MDExOlB1bGxSZXF1ZXN0MjM1MTk0Mjg2,390,tiny typo in customization docs,418191,jaywgraves,closed,0,,,,,1,2018-12-01T13:44:42Z,2019-12-19T02:30:35Z,2018-12-16T21:32:56Z,CONTRIBUTOR,simonw/datasette/pulls/390,was looking to add some custom templates to my use of datasette and saw this small typo.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/390/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 392610803,MDU6SXNzdWUzOTI2MTA4MDM=,391,Google Trends example doesn’t work,229881,styfle,closed,0,,,,,1,2018-12-19T13:51:38Z,2019-01-02T19:45:13Z,2019-01-02T19:45:12Z,NONE,,"https://google-trends.datasettes.com/ I see a Cloudflare error. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/391/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 394751072,MDExOlB1bGxSZXF1ZXN0MjQxNDE4NDQz,392,Fix some regex DeprecationWarnings,9599,simonw,closed,0,,,,,0,2018-12-29T02:10:28Z,2018-12-29T02:22:28Z,2018-12-29T02:22:28Z,OWNER,simonw/datasette/pulls/392,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/392/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 395236066,MDU6SXNzdWUzOTUyMzYwNjY=,393,"CSV export in ""Advanced export"" pane doesn't respect query",1727065,ltrgoddard,closed,0,,,,,6,2019-01-02T12:39:41Z,2021-06-17T18:14:24Z,2019-01-03T02:44:10Z,NONE,,"It looks like there's an inconsistency when exporting to CSV via the the web interface. Say I'm looking at [songs released in 1989](https://fivethirtyeight.datasettes.com/fivethirtyeight-c300360/classic-rock%2Fclassic-rock-song-list?Release+Year__exact=1989) in the `classic-rock/classic-rock-song-list` table from the Five Thirty Eight data. The JSON and CSV export links at the top of the page both give me filtered data using `Release+Year__exact=1989` in the URL. In the `Advanced export` tab, though, the CSV option gives me the whole data set, while the JSON options preserve the query. It may be that this is intended behaviour related to the streaming CSV stuff [discussed here](https://github.com/simonw/datasette/issues/266), but if that's the case then I think it should be a little clearer.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/393/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 396212021,MDU6SXNzdWUzOTYyMTIwMjE=,394,base_url configuration setting,9599,simonw,closed,0,,,5234079,Datasette 0.39,27,2019-01-05T23:48:48Z,2020-06-11T09:15:20Z,2020-03-25T00:18:45Z,OWNER,,"I've identified a couple of use-cases for running Datasette in a way that over-rides the default way that internal URLs are generated. 1. Running behind a reverse proxy. I tried running Datasette behind a proxy and found that some of the generated internal links incorrectly referenced `http://127.0.0.1:8001/fixtures/...` - when they should have been referencing `http://my-host.my-domain.com/fixtures/...` - this is a problem both for links within the HTML interface but also for the `toggle_url` keys returned in the JSON as part of the facets datastructure. 2. I would like it to be possible to host a Datasette instance at e.g. `https://www.mynewspaper.com/interactives/2018/election-results/` - either through careful HTTP proxying or, once Datasette has been ported to ASGI, by mounting a Datasette ASGI instance deep within an existing set of URL routes. I'm going to add a `url_prefix` configuration option. This will default to `""""`, which means Datasette will behave as it does at the moment - it will use `/` for most URL prefixes in the HTML version, and an absolute URL derived from the incoming `Host` header for URLs that are returned as part of the JSON output. 
If `url_prefix` is set to another value (either a full URL or a path) then this path will be prepended to all generated URLs.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/394/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 396215043,MDU6SXNzdWUzOTYyMTUwNDM=,395,Find a cleaner pattern for fixtures with arguments,9599,simonw,closed,0,,,,,1,2019-01-06T00:31:22Z,2020-06-07T21:23:22Z,2020-06-07T21:23:22Z,OWNER,,"A lot of Datasette tests look like this: https://github.com/simonw/datasette/blob/b65d97792a53f78cb14b226231063209d22c4602/tests/test_api.py#L438-L444 The loop here isn't actually expected to loop - it's there because the `make_app_client` function yields a value and then cleans it up afterwards. This pattern works, but it is a little confusing. It would be nice to replace it with something less strange-looking. The answer may be to switch to the ""factories as fixtures"" pattern described here: https://docs.pytest.org/en/latest/fixture.html#factories-as-fixtures In particular some variant of this example: ``` @pytest.fixture def make_customer_record(): created_records = [] def _make_customer_record(name): record = models.Customer(name=name, orders=[]) created_records.append(record) return record yield _make_customer_record for record in created_records: record.destroy() def test_customer_records(make_customer_record): customer_1 = make_customer_record(""Lisa"") customer_2 = make_customer_record(""Mike"") customer_3 = make_customer_record(""Meredith"") ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/395/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 397098882,MDU6SXNzdWUzOTcwOTg4ODI=,396,Add pragma compile_options output to /-/versions,9599,simonw,closed,0,,,,,1,2019-01-08T21:43:54Z,2019-01-11T00:55:22Z,2019-01-11T00:44:56Z,OWNER,,"``` sqlite> pragma compile_options ...> ; BUG_COMPATIBLE_20160819 COMPILER=clang-9.0.0 DEFAULT_CACHE_SIZE=2000 DEFAULT_CKPTFULLFSYNC DEFAULT_JOURNAL_SIZE_LIMIT=32768 DEFAULT_PAGE_SIZE=4096 DEFAULT_SYNCHRONOUS=2 DEFAULT_WAL_SYNCHRONOUS=1 ENABLE_API_ARMOR ENABLE_COLUMN_METADATA ENABLE_DBSTAT_VTAB ENABLE_FTS3 ENABLE_FTS3_PARENTHESIS ENABLE_FTS3_TOKENIZER ENABLE_FTS4 ENABLE_FTS5 ENABLE_JSON1 ENABLE_LOCKING_STYLE=1 ENABLE_PREUPDATE_HOOK ENABLE_RTREE ENABLE_SESSION ENABLE_SNAPSHOT ENABLE_SQLLOG ENABLE_UNKNOWN_SQL_FUNCTION ENABLE_UPDATE_DELETE_LIMIT HAVE_ISNAN MAX_LENGTH=2147483645 MAX_MMAP_SIZE=1073741824 MAX_VARIABLE_NUMBER=500000 OMIT_AUTORESET OMIT_LOAD_EXTENSION STMTJRNL_SPILL=131072 THREADSAFE=2 USE_URI sqlite> ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/396/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 397129564,MDU6SXNzdWUzOTcxMjk1NjQ=,397,Update official datasetteproject/datasette Docker container to SQLite 3.26.0,43564,claes,closed,0,,,,,3,2019-01-08T22:51:50Z,2019-01-11T01:25:33Z,2019-01-11T00:56:18Z,NONE,,"I try to start datasette on a database that contains the view below. It fails in a way that makes me think it does not support the window functions SQL syntax. 
``` create view general_ledger as select transactions.account_number, strftime(""%Y-%m-%d"", verifications.verification_date) as verification_date, verifications.verification_number, verifications.verification_text, case when transactions.centi_amount >= 0 and verifications.verification_number > 0 then printf(""%.2f"", (transactions.centi_amount/100.0)) end as debit, case when transactions.centi_amount <= 0 and verifications.verification_number > 0 then printf(""%.2f"", (transactions.centi_amount/100.0)) end as credit, printf(""%.2f"", sum(transactions.centi_amount) over (partition by transactions.account_number order by verifications.verification_number range between unbounded preceding and current row)/100.0) from verifications inner join transactions on transactions.verification_id = verifications.id order by transactions.account_number, verifications.verification_number; ``` ``` docker run -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/ledger.db Serve! files=('/mnt/ledger.db',) on port 8001 Traceback (most recent call last): File ""/usr/local/bin/datasette"", line 11, in sys.exit(cli()) File ""/usr/local/lib/python3.6/site-packages/click/core.py"", line 722, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.6/site-packages/click/core.py"", line 697, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.6/site-packages/click/core.py"", line 1066, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.6/site-packages/click/core.py"", line 895, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.6/site-packages/click/core.py"", line 535, in invoke return callback(*args, **kwargs) File ""/usr/local/lib/python3.6/site-packages/datasette/cli.py"", line 375, in serve ds.inspect() File ""/usr/local/lib/python3.6/site-packages/datasette/app.py"", line 308, in inspect ""views"": inspect_views(conn), File ""/usr/local/lib/python3.6/site-packages/datasette/inspect.py"", line 30, in inspect_views return [v[0] for v in conn.execute('select name from sqlite_master where type = ""view""')] sqlite3.DatabaseError: malformed database schema (general_ledger) - near ""over"": syntax error ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/397/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 398011658,MDU6SXNzdWUzOTgwMTE2NTg=,398,Ensure downloading a 100+MB SQLite database file works,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2019-01-10T20:57:52Z,2020-12-05T19:36:27Z,2020-12-05T19:36:27Z,OWNER,,I've seen attempted downloads of large files fail after about ten seconds.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/398/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 398089089,MDU6SXNzdWUzOTgwODkwODk=,399,/-/versions for official Docker image returns wrong Datasette version,9599,simonw,closed,0,,,,,2,2019-01-11T01:19:58Z,2019-01-13T23:31:59Z,2019-01-13T23:10:45Z,OWNER,,"``` docker run -p 8001:8001 datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 ``` http://0.0.0.0:8001/-/versions returns this: ``` { ""datasette"": { ""version"": ""0+unknown"" }, ... 
``` This is because the Docker image is built by copying in the Datasette source code, which confuses versioneer. Maybe the Docker image should install the code using a wheel or similar? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/399/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 398559195,MDU6SXNzdWUzOTg1NTkxOTU=,400,datasette publish cloudrun plugin,10352819,rprimet,closed,0,,,,,1,2019-01-12T14:35:11Z,2019-05-03T16:57:35Z,2019-05-03T16:57:35Z,CONTRIBUTOR,,"Google announced that they may launch a simple service for running Docker containers (previously serverless containers, now called ""cloud run"" -- link to alpha [here](https://services.google.com/fb/forms/serverlesscontainers/)). If/when this happens, it might be a good fit for publishing datasettes? (at least using the current version, manually publishing a datasette seems relatively painless).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/400/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 400229984,MDU6SXNzdWU0MDAyMjk5ODQ=,401,How to pass configuration to plugins?,1055831,dazzag24,closed,0,,,,,3,2019-01-17T11:20:41Z,2019-01-18T11:48:13Z,2019-01-18T06:49:07Z,NONE,,"Hi, Firstly, thanks for your work on datasette, it is a hugely useful tool! I've been working on a fork [https://github.com/dazzag24/datasette-cluster-map] of datasette-cluster-map to allow the tileserver to be easily switched. Primarily because the tiles being served in the current version use localised text for labels and I'd like to have English used for these names instead. It uses http://leaflet-extras.github.io/leaflet-providers/preview/ to allow you to simply set the tile provider using a call like so: ``` let tiles = L.tileLayer.provider('Esri.WorldTopoMap'); ``` instead of the current: ``` let tiles = L.tileLayer('https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', { maxZoom: 19, detectRetina: true, attribution: '© OpenStreetMap contributors' }), ``` However I've got stuck in trying to work out how to pass the provider string to the plugin. In the documentation: https://datasette.readthedocs.io/en/stable/plugins.html you discuss configuration of plugins and use an example of passing in which latitude and longitude columns should be used. However I cannot seem to see anywhere in the current datasette-cluster-map code where these config params are passed in or used. Can you please point me to an example of how to pass configuration from the metadata.json down into a plugin. Once I've overcome this issue I was wondering if you would be interested in taking this change into your version? Many thanks Darren",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/401/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 400340905,MDU6SXNzdWU0MDAzNDA5MDU=,402,Use SQLITE_DBCONFIG_DEFENSIVE plus other recommendations from SQLite security docs,9599,simonw,open,0,,,,,3,2019-01-17T15:52:28Z,2019-01-17T16:15:21Z,,OWNER,,"> Was just having a skim through the datasette source. Given that the vuln impacts shadow tables, wasn't sure whether these are also covered by the immutable flag. 
Latest release introduced a SQLITE_DBCONFIG_DEFENSIVE flag that they recommend setting: https://sqlite.org/security.html https://twitter.com/ignoredambience/status/1085926961413869568",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/402/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 400511206,MDU6SXNzdWU0MDA1MTEyMDY=,403,How does persistence work?,1794527,ccorcos,closed,0,,,,,2,2019-01-17T23:41:57Z,2019-01-19T05:47:55Z,2019-01-18T06:51:14Z,NONE,,I was under the impression that now.sh is for stateless microservices. So where are these SQLite databases stored and when do they get created and destroyed?,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/403/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 403028630,MDExOlB1bGxSZXF1ZXN0MjQ3NTc2OTQy,4,Fts5,9599,simonw,closed,0,,,,,0,2019-01-25T06:54:05Z,2019-01-25T06:54:33Z,2019-01-25T06:54:33Z,OWNER,simonw/sqlite-utils/pulls/4,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 403396009,MDExOlB1bGxSZXF1ZXN0MjQ3ODYxNDE5,5,Run Travis tests against Python 3.8-dev,9599,simonw,closed,0,,,,,0,2019-01-26T02:30:55Z,2019-01-26T02:37:54Z,2019-01-26T02:37:54Z,OWNER,simonw/sqlite-utils/pulls/5,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 403499298,MDExOlB1bGxSZXF1ZXN0MjQ3OTIzMzQ3,404,Experiment: run Jinja in async mode,9599,simonw,closed,0,,,,,3,2019-01-27T00:28:44Z,2019-11-12T05:02:18Z,2019-11-12T05:02:13Z,OWNER,simonw/datasette/pulls/404,"See http://jinja.pocoo.org/docs/2.10/api/#async-support Tests all pass. Have not checked performance difference yet. Creating pull request to run tests in Travis. This is not ready to merge - I'm not yet sure if this is a good idea.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/404/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 403617881,MDU6SXNzdWU0MDM2MTc4ODE=,405,.json?_nl=on option for exporting newline-delimited JSON,9599,simonw,closed,0,,,,,2,2019-01-28T01:10:45Z,2019-01-28T01:49:00Z,2019-01-28T01:48:37Z,OWNER,,"The neat thing about newline-delimited JSON is that you don't have to read an entire array (of potentially thousands of objects) into memory in order to parse it - you can parse things a line at a time instead. 
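For example, a minimal consuming sketch (standard library only; the file name is hypothetical):

```python
import json

# Each line holds one complete JSON object, so rows can be processed
# one at a time instead of loading the whole export into memory.
def iter_ndjson(path):
    with open(path) as f:
        for line in f:
            if line.strip():
                yield json.loads(line)

for row in iter_ndjson('facetable.ndjson'):  # hypothetical file name
    print(row['pk'])
```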
It will look like this: `https://latest.datasette.io/fixtures/facetable.json?_shape=array&_nl=on` ``` {""pk"": 1, ""planet_int"": 1, ""on_earth"": 1, ""state"": ""CA"", ""city_id"": 1, ""neighborhood"": ""Mission""} {""pk"": 2, ""planet_int"": 1, ""on_earth"": 1, ""state"": ""CA"", ""city_id"": 1, ""neighborhood"": ""Dogpatch""} {""pk"": 3, ""planet_int"": 1, ""on_earth"": 1, ""state"": ""CA"", ""city_id"": 1, ""neighborhood"": ""SOMA""} {""pk"": 4, ""planet_int"": 1, ""on_earth"": 1, ""state"": ""CA"", ""city_id"": 1, ""neighborhood"": ""Tenderloin""} {""pk"": 5, ""planet_int"": 1, ""on_earth"": 1, ""state"": ""CA"", ""city_id"": 1, ""neighborhood"": ""Bernal Heights""} ``` I added this as part of the `sqlite-utils json` CLI command is this commit - I think Datasette should offer it as well: https://github.com/simonw/sqlite-utils/commit/5466c9745dfef858286146ea158ffd5a71391d10 It can be offered alongside `_stream=on` (which currently only works for CSV, but it could work for JSON as well thanks to this trick).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/405/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 403624090,MDU6SXNzdWU0MDM2MjQwOTA=,6,"""sqlite-utils insert"" should support newline-delimited JSON",9599,simonw,closed,0,,,,,1,2019-01-28T02:00:02Z,2019-01-28T02:17:45Z,2019-01-28T02:17:45Z,OWNER,,"We can already export newline delimited JSON. We should learn to import it as well. The neat thing about importing it is that you can import GBs of data without having to read the whole lot into memory in order to decode the wrapping JSON array. Datasette can export it now: https://github.com/simonw/datasette/issues/405 Demo: https://latest.datasette.io/fixtures/facetable.json?_shape=array&_nl=on It should be possible to do this: $ curl ""https://latest.datasette.io/fixtures/facetable.json?_shape=array&_nl=on"" \ | sqlite-utils insert data.db facetable - --nl ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 403625674,MDU6SXNzdWU0MDM2MjU2NzQ=,7,.insert_all() should accept a generator and process it efficiently,9599,simonw,closed,0,,,,,3,2019-01-28T02:11:58Z,2019-01-28T06:26:53Z,2019-01-28T06:26:53Z,OWNER,,"Right now you have to load every record into memory before passing the list to `.insert_all()` and friends. If you want to process millions of rows, this is inefficient. Python has generators - we should use them! The only catch here is that part of the magic of `sqlite-utils` is that it guesses the column types and creates the table for you. This code will need to be updated to notice if the table needs creating and, if it does, create it using the first X (where x=1,000 but can be customized) records. If a record outside of those first 1,000 has a rogue column, we can crash with an error. 
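A minimal sketch of that first-batch trick (illustrative names, not the actual sqlite-utils internals):

```python
import itertools

# Materialise only the first batch to decide the columns, then chain it
# back in front of the remaining records so they are still streamed.
def peek_columns(records, batch_size=1000):
    records = iter(records)
    first_batch = list(itertools.islice(records, batch_size))
    columns = sorted({key for record in first_batch for key in record})
    return columns, itertools.chain(first_batch, records)

columns, rows = peek_columns({'id': i, 'value': i * 2} for i in range(5000))
print(columns)  # ['id', 'value'] - rows remains a lazy iterator
```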
This will free us up to make the `--nl` option added in #6 much more efficient.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 403922644,MDU6SXNzdWU0MDM5MjI2NDQ=,8,Problems handling column names containing spaces or - ,82988,psychemedia,closed,0,,,,,3,2019-01-28T17:23:28Z,2019-04-14T15:29:33Z,2019-02-23T21:09:03Z,NONE,,"Irrespective of whether using column names containing a space or - character is good practice, SQLite does allow it, but `sqlite-utils` throws an error in the following cases: ```python import sqlite3 from sqlite_utils import Database dbname = 'test.db' DB = Database(sqlite3.connect(dbname)) import pandas as pd df = pd.DataFrame({'col1':range(3), 'col2':range(3)}) #Convert pandas dataframe to appropriate list/dict format DB['test1'].insert_all( df.to_dict(orient='records') ) #Works fine ``` However: ```python df = pd.DataFrame({'col 1':range(3), 'col2':range(3)}) DB['test1'].insert_all(df.to_dict(orient='records')) ``` throws: ``` --------------------------------------------------------------------------- OperationalError Traceback (most recent call last) in () 1 import pandas as pd 2 df = pd.DataFrame({'col 1':range(3), 'col2':range(3)}) ----> 3 DB['test1'].insert_all(df.to_dict(orient='records')) /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order) 327 jsonify_if_needed(record.get(key, None)) for key in all_columns 328 ) --> 329 result = self.db.conn.execute(sql, values) 330 self.db.conn.commit() 331 self.last_id = result.lastrowid OperationalError: near ""1"": syntax error ``` and: ```python df = pd.DataFrame({'col-1':range(3), 'col2':range(3)}) DB['test1'].upsert_all(df.to_dict(orient='records')) ``` results in: ``` --------------------------------------------------------------------------- OperationalError Traceback (most recent call last) in () 1 import pandas as pd 2 df = pd.DataFrame({'col-1':range(3), 'col2':range(3)}) ----> 3 DB['test1'].insert_all(df.to_dict(orient='records')) /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order) 327 jsonify_if_needed(record.get(key, None)) for key in all_columns 328 ) --> 329 result = self.db.conn.execute(sql, values) 330 self.db.conn.commit() 331 self.last_id = result.lastrowid OperationalError: near ""-"": syntax error ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 405801771,MDExOlB1bGxSZXF1ZXN0MjQ5NjgwOTQ0,9,:pencil: Updates my_database.py to my_database.db,50527,jefftriplett,closed,0,,,,,0,2019-02-01T17:35:43Z,2019-02-24T03:55:04Z,2019-02-24T03:55:04Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/9,I noticed that both `.py` and `.db` were used in the docs and assumed you'd prefer `.db`. 
,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/9/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 406055201,MDU6SXNzdWU0MDYwNTUyMDE=,406,Support nullable foreign keys in _labels mode,9599,simonw,closed,0,9599,simonw,,,2,2019-02-03T05:34:20Z,2019-11-02T22:39:28Z,2019-11-02T22:30:27Z,OWNER,,"Currently if there's a null in a foreign key we get ""None"" displayed in the inflated view: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/406/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 407073223,MDExOlB1bGxSZXF1ZXN0MjUwNjI4Mjc1,407,Heroku --include-vcs-ignore,9599,simonw,closed,0,,,,,1,2019-02-06T04:06:20Z,2019-02-06T04:31:30Z,2019-02-06T04:15:47Z,OWNER,simonw/datasette/pulls/407,"Should mean `datasette publish heroku` can work under Travis, unlike this failure: https://travis-ci.org/simonw/fivethirtyeight-datasette/builds/488047550 ``` 2.25s$ datasette publish heroku fivethirtyeight.db -m metadata.json -n fivethirtyeight-datasette tar: unrecognized option '--exclude-vcs-ignores' Try 'tar --help' or 'tar --usage' for more information. ▸ Command failed: tar cz -C /tmp/tmpuaxm7i8f --exclude-vcs-ignores --exclude ▸ .git --exclude .gitmodules . > ▸ /tmp/f49440e0-1bf3-4d3f-9eb0-fbc2967d1fd4.tar.gz ▸ tar: unrecognized option '--exclude-vcs-ignores' ▸ Try 'tar --help' or 'tar --usage' for more information. ▸ The command ""datasette publish heroku fivethirtyeight.db -m metadata.json -n fivethirtyeight-datasette"" exited with 0. ``` The fix for that issue is to call the heroku command like this: heroku builds:create -a app_name --include-vcs-ignore ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/407/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 407174173,MDU6SXNzdWU0MDcxNzQxNzM=,408,"Show metadata info (e.g. license, source) on custom SQL query pages",78356,stefanw,closed,0,,,,,0,2019-02-06T10:43:34Z,2019-10-14T03:53:22Z,2019-10-14T03:53:22Z,NONE,,"Currently metadata info is not displayed on custom SQL pages. E.g. compare the footer of [this normal table page](https://register-of-members-interests.datasettes.com/regmem-98dc8b7/categories) with the footer [this custom SQL page](https://register-of-members-interests.datasettes.com/regmem-98dc8b7?sql=select+*+from+categories). This is important in order to adhere to attribution license requirements.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/408/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 408376825,MDU6SXNzdWU0MDgzNzY4MjU=,409,Zeit API v1 does not work for new users - need to migrate to v2,209967,michaelmcandrew,closed,0,,,,,3,2019-02-09T00:50:33Z,2020-04-06T15:44:46Z,2020-04-06T15:44:46Z,NONE,,"Hello there, This looks like a great tool. Thanks. Unfortunately, I hit the following error: ``` michael@hazel ~/src/cc-datasette/data/out datasette publish now cc-datasette.db > WARN! You are using an old version of the Now Platform. 
More: https://zeit.co/docs/v1-upgrade > Deploying /tmp/tmpjtrxwsyf/datasette under michaelmcandrew > Using project datasette > Error! You tried to create a Now 1.0 deployment. Please use Now 2.0 instead: https://zeit.co/upgrade ``` I'm guessing you might not hit this because you are not a 'new user' of Zeit (https://github.com/zeit/now-cli/issues/1805#issuecomment-452470953). Would it be a lot of work to upgrade to the new Zeit API, do you think?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/409/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 408518024,MDU6SXNzdWU0MDg1MTgwMjQ=,410,How to setup a multi database environment?,30607,aborruso,closed,0,,,,,1,2019-02-10T09:39:24Z,2019-04-12T04:42:28Z,2019-04-12T04:42:27Z,NONE,,"Hi, first of all I need to write that Simon Willison and datasette are really great. I have probably a stupid question, but it seems to me that I do not have the reply in the documentation. I have installed datasette and run it with `datasette mydb.db`, and I can reach it on `http://127.0.0.1:8001`. But how to work with more than one db? Imagine I have ten sqlite databases, and that I need to explore/query these via datasette, how to run datasette? Is it possible to create a sort of db index and then run `datasette serve myindex`? Thank you",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/410/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 410384988,MDU6SXNzdWU0MTAzODQ5ODg=,411,How to pass named parameter into spatialite MakePoint() function,1055831,dazzag24,closed,0,,,,,2,2019-02-14T16:30:22Z,2022-01-20T21:29:41Z,2019-05-05T12:25:04Z,NONE,,"Hi, datasette version: ""0.26.2"" extensions: spatialite: ""4.4.0-RC0"" sqlite version: ""3.22.0"" I have a table of airports with latitude and longitude columns. I've added spatialite (with KNN support). After creating the db using csvs-to-sqlite, I run these commands to set up the spatialite tables: ``` conn.execute('SELECT InitSpatialMetadata(1)') conn.execute(""SELECT AddGeometryColumn('airports', 'point_geom', 4326, 'POINT', 2);"") conn.execute('''UPDATE airports SET point_geom = GeomFromText('POINT('||""longitude""||' '||""latitude""||')',4326);''') conn.execute(""SELECT CreateSpatialIndex('airports', 'point_geom');"") ``` I'm attempting to create a canned query and have this in my metadata.json file: ``` ""find_airports_nearest_to_point"":{ ""sql"":""SELECT a.pos AS rank, b.id, b.name, b.country, b.latitude AS latitude, b.longitude AS longitude, a.distance / 1000.0 AS dist_km FROM KNN AS a JOIN airports AS b ON (b.rowid = a.fid) WHERE f_table_name = \""airports\"" AND ref_geometry = MakePoint( :Long , :Lat ) AND max_items = 10;""} ``` which doesn't seem to perform the templating of the named parameters correctly and I get no results. Have also tried: ``` MakePoint( || :Long || , || :Lat || ) ``` which returns this error: ``` near ""||"": syntax error ``` However I cannot seem to find the correct combination of named parameter syntax (:Lat) or sqlite concatenation operator to make it work. Any ideas if using named parameters inside functions is supported? 
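For reference, plain `sqlite3` does bind named parameters that appear inside function calls - a minimal check, independent of SpatiaLite:

```python
import sqlite3

# Named parameters are substituted by the driver before SQLite sees the
# statement, so they work inside function calls such as abs() and max().
conn = sqlite3.connect(':memory:')
row = conn.execute(
    'select abs(:x), max(:a, :b)', {'x': -5, 'a': 1, 'b': 2}
).fetchone()
print(row)  # (5, 2)
```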
Thanks Darren",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/411/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 411066700,MDU6SXNzdWU0MTEwNjY3MDA=,10,Error in upsert if column named 'order',82988,psychemedia,closed,0,,,,,1,2019-02-16T12:05:18Z,2019-02-24T16:55:38Z,2019-02-24T16:55:37Z,NONE,,"The following works fine: ``` connX = sqlite3.connect('DELME.db', timeout=10) dfX=pd.DataFrame({'col1':range(3),'col2':range(3)}) DBX = Database(connX) DBX['test'].upsert_all(dfX.to_dict(orient='records')) ``` But if a column is named `order`: ``` connX = sqlite3.connect('DELME.db', timeout=10) dfX=pd.DataFrame({'order':range(3),'col2':range(3)}) DBX = Database(connX) DBX['test'].upsert_all(dfX.to_dict(orient='records')) ``` it throws an error: ``` --------------------------------------------------------------------------- OperationalError Traceback (most recent call last) in 3 dfX=pd.DataFrame({'order':range(3),'col2':range(3)}) 4 DBX = Database(connX) ----> 5 DBX['test'].upsert_all(dfX.to_dict(orient='records')) /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in upsert_all(self, records, pk, foreign_keys, column_order) 347 foreign_keys=foreign_keys, 348 upsert=True, --> 349 column_order=column_order, 350 ) 351 /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order) 327 jsonify_if_needed(record.get(key, None)) for key in all_columns 328 ) --> 329 result = self.db.conn.execute(sql, values) 330 self.db.conn.commit() 331 self.last_id = result.lastrowid OperationalError: near ""order"": syntax error ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 411257981,MDU6SXNzdWU0MTEyNTc5ODE=,412,Linked Data(sette),43340,sfkeller,open,0,,,,,2,2019-02-18T00:38:14Z,2019-03-19T10:09:46Z,,NONE,,"I've a radical feature idea (possible first as an extension in order to experiment?): I'd like to link to a remote table from a remote database, e.g. with a function ""linked_datasette()"". So one could do following query: ``` SELECT foo.id, foo.a, remote_party.b FROM foo JOIN linked_datasette(""https://parlgov.datasettes.com/parlgov-b42a2f2"") AS remote_party ON foo.id=remote_party.id ``` This is inspired by SPARQL's SERVICE keyword for remote RDF ""endpoints"". There's a foundation in the SQL Standard called SQL/MED (https://rhaas.blogspot.com/2011/01/why-sqlmed-is-cool.html ). 
And here's an implementation from me in Postgres FDW to connect to another Postgres ""endpoint"": https://pastebin.com/Fz2v64Cz .",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/412/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 413740684,MDU6SXNzdWU0MTM3NDA2ODQ=,11,Detect numpy types when creating tables,9599,simonw,closed,0,,,,,2,2019-02-23T21:09:35Z,2019-02-24T04:02:20Z,2019-02-24T04:02:20Z,OWNER,,Inspired by #8,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 413778585,MDExOlB1bGxSZXF1ZXN0MjU1NjU4MTEy,12,"Support for numpy types, closes #11",9599,simonw,closed,0,,,,,0,2019-02-24T03:57:32Z,2019-02-24T04:02:20Z,2019-02-24T04:02:20Z,OWNER,simonw/sqlite-utils/pulls/12,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 413779210,MDU6SXNzdWU0MTM3NzkyMTA=,13,Ability to automatically create IDs from content hash of row,9599,simonw,closed,0,,,,,1,2019-02-24T04:07:08Z,2019-02-24T04:36:48Z,2019-02-24T04:36:48Z,OWNER,,"Sometimes when you are importing data the underlying source provides records without IDs that can nonetheless be uniquely identified by their contents. A utility mechanism for calculating a sha1 hash of the contents and using that as a unique ID would be useful.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 413842611,MDU6SXNzdWU0MTM4NDI2MTE=,14,Utilities for adding indexes,9599,simonw,closed,0,,,,,3,2019-02-24T16:57:28Z,2019-02-24T19:11:28Z,2019-02-24T19:11:28Z,OWNER,,"Both in the Python API and the CLI tool. For the CLI tool this should work: $ sqlite-utils create-index mydb.db mytable col1 col2 This will create a compound index across col1 and col2. The name of the index will be automatically chosen unless you use the `--name=...` option. 
Support a `--unique` option too.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 413857257,MDU6SXNzdWU0MTM4NTcyNTc=,15,Ability to add columns to tables,9599,simonw,closed,0,,,,,0,2019-02-24T19:20:51Z,2019-02-24T20:04:40Z,2019-02-24T20:04:40Z,OWNER,,"Makes sense to do this before foreign keys in #2 Python: db[""table""].add_column(""new_column"", int) CLI: $ sqlite-utils add-column table new_column INTEGER ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 413867537,MDU6SXNzdWU0MTM4Njc1Mzc=,16,add_column() should support REFERENCES {other_table}({other_column}),9599,simonw,closed,0,,,,,4,2019-02-24T21:00:45Z,2019-05-29T05:17:59Z,2019-05-29T04:56:18Z,OWNER,,Related to #2 ,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 413868452,MDU6SXNzdWU0MTM4Njg0NTI=,17,Improve and document foreign_keys=... argument to insert/create/etc,9599,simonw,closed,0,,,,,7,2019-02-24T21:09:11Z,2019-02-24T23:45:48Z,2019-02-24T23:45:48Z,OWNER,,"The `foreign_keys=` argument to `table.insert_all()` and friends can be used to specify foreign key relationships that should be created. It is not yet documented. It also requires you to specify the SQLite type of each column, even though this can be detected by introspecting the referenced table: cols = [c for c in self.db[other_table].columns if c.name == other_column] cols[0].type Relates to #2 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 413871266,MDU6SXNzdWU0MTM4NzEyNjY=,18,.insert/.upsert/.insert_all/.upsert_all should add missing columns,9599,simonw,closed,0,,,4348046,1.0,2,2019-02-24T21:36:11Z,2019-05-25T00:42:11Z,2019-05-25T00:42:11Z,OWNER,,"This is a larger change, but it would be incredibly useful: if you attempt to insert or update a document with a field that does not currently exist in the underlying table, sqlite-utils should add the appropriate column for you.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 413887019,MDExOlB1bGxSZXF1ZXN0MjU1NzI1MDU3,413,Update spatialite.rst,28597217,joelondon,closed,0,,,,,1,2019-02-25T00:08:35Z,2019-03-15T05:06:45Z,2019-03-15T05:06:45Z,CONTRIBUTOR,simonw/datasette/pulls/413,a line of sql added to create the idx_ in the python recipe,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/413/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 415575624,MDU6SXNzdWU0MTU1NzU2MjQ=,414,datasette requires specific version of 
Click,82988,psychemedia,closed,0,,,,,1,2019-02-28T11:24:59Z,2019-03-15T04:42:13Z,2019-03-15T04:42:13Z,CONTRIBUTOR,,"Is `datasette` beholden to version `click==6.7`? Current release is at 7.0. Can the requirement be liberalised, eg to `>=6.7`?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/414/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 418329842,MDU6SXNzdWU0MTgzMjk4NDI=,415,Add query parameter to hide SQL textarea,36796532,ad-si,closed,0,,,,,3,2019-03-07T14:11:30Z,2019-03-15T09:30:57Z,2019-03-15T05:22:43Z,NONE,,It would be cool if there was a query parameter to hide / remove the SQL textarea. Then I could simply save a bookmark for a certain query and open it to see the data without having to scroll below the (long) SQL query first.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/415/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 421348146,MDExOlB1bGxSZXF1ZXN0MjYxNDE4Mjg1,416,URL hashing now optional: turn on with --config hash_urls:1 (#418),9599,simonw,closed,0,,,,,8,2019-03-15T04:26:06Z,2019-03-17T22:55:04Z,2019-03-17T22:55:04Z,OWNER,simonw/datasette/pulls/416,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/416/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 421546944,MDU6SXNzdWU0MjE1NDY5NDQ=,417,Datasette Library,9599,simonw,open,0,,,,,12,2019-03-15T14:30:22Z,2020-12-29T14:34:50Z,,OWNER,,"The ability to run Datasette in a mode where it automatically picks up new (or modified) files in a directory tree without needing to restart the server. Suggested command: datasette library /path/to/mydbs/",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/417/reactions"", ""total_count"": 8, ""+1"": 8, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 421548881,MDU6SXNzdWU0MjE1NDg4ODE=,418,Hashed URLs should be optional,9599,simonw,closed,0,,,4305096,0.28,5,2019-03-15T14:34:12Z,2019-05-16T15:12:26Z,2019-05-16T15:12:26Z,OWNER,,"The cute performance hack where a hash of the DB contents is included in the URL makes a lot less sense when serving files that frequently change. It's also difficult to explain to people. It should be optional and default to ""off"". Needed for #417 and #419",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/418/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 421551434,MDU6SXNzdWU0MjE1NTE0MzQ=,419,"Default to opening files in mutable mode, special option for immutable files",9599,simonw,closed,0,,,4305096,0.28,10,2019-03-15T14:39:27Z,2019-05-16T15:14:32Z,2019-05-16T15:14:31Z,OWNER,,"One of the original ideas behind Datasette was that serving immutable data makes everything way easier. Two examples: You don't have to worry about SQLite concurrency and you can bundle the database inside a Docker container and deploy it to immutable hosting. See [The interesting ideas in Datasette](https://simonwillison.net/2018/Oct/4/datasette-ideas/) for more on this. 
I'm beginning to see a much stronger case for being able to serve mutable data as well. SQLite is actually perfectly capable of handling reads against a database that is also being written to, even if the writes are coming from another process. https://www.sqlite.org/wal.html There are all kinds of interesting use-cases which Datasette is currently unsuitable for due to its insistence on immutable databases. Some examples: * Continually run Datasette against a SQLite database updated by another process, e.g. Firefox bookmarks * Projects where a cron runs every X minutes and writes new entries gathered from other sources to SQLite * Tail a log file, write those log updates to a SQLite file, view recent log entries in Datasette This is also relevant to #417, Datasette Library.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/419/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 421971339,MDU6SXNzdWU0MjE5NzEzMzk=,420,Fix all the places that currently use .inspect() data,9599,simonw,closed,0,,,4305096,0.28,13,2019-03-17T20:54:37Z,2019-05-19T19:58:31Z,2019-05-02T01:13:46Z,OWNER,,"See #419: if Datasette is going to work against mutable SQLite files it can no longer assume that the `.inspect()` method will have cached the correct schema for all tables in all attached databases. So everywhere in the code at the moment that relies on `.inspect()` data needs to be modified to use live introspection of the schema instead. From [a comment later on](https://github.com/simonw/datasette/issues/420#issuecomment-474398127): here are the uses I need to fix as a checklist: - [x] `table_exists()` - [x] `info[""file""]` in `.execute()` - [x] `resolve_db_name()` - [x] `.database_url(database)` - [x] `DatabaseDownload` file path - [x] `sortable_columns_for_table()` uses it to find the columns in a table - [x] `expandable_columns()` uses it to find foreign keys - [x] `expand_foreign_keys()` uses it to find foreign keys - [x] `display_columns_and_rows()` uses it to find primary keys and foreign keys... but also has access to a cursor.description which it uses to list the columns - [x] `TableView.data` uses it to look up columns and primary keys and the table_rows_count (used if the thing isn't a view) and probably a few more things, this method is huge! 
- [x] `RowView.data` uses it for primary keys - [x] `foreign_key_tables()` uses it for foreign keys - [x] `DatabaseView` list of tables - [x] `IndexView` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/420/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 421985685,MDU6SXNzdWU0MjE5ODU2ODU=,421,Documentation for ?_hash=1 and Datasette's hashed URL caching,9599,simonw,closed,0,,,4305096,0.28,2,2019-03-17T23:08:36Z,2019-05-19T05:32:37Z,2019-05-19T05:31:27Z,OWNER,,Follow on from #418 - the Datasette documentation needs an entire section (probably a new page) describing exactly how the hash-in-URL caching mechanism works.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/421/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 423316403,MDU6SXNzdWU0MjMzMTY0MDM=,422,Figure out what to do about table counts in a mutable world,9599,simonw,closed,0,,,,,4,2019-03-20T15:27:15Z,2019-05-02T05:43:11Z,2019-05-02T05:43:11Z,OWNER,,"In moving away from the existing static inspect method (see #420 and #419) the biggest thing lost is full table row counts. These can be expensive against large tables, but currently Datasette runs the `count (*) from x` query once at inspection time and then reuses it for every page. We can run those counts with a timelimit, but this means that for larger tables we won't be able to show a count at all, which is disappointing. Is there a way we can find an approximate or lower bound count for a table?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/422/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 426722204,MDU6SXNzdWU0MjY3MjIyMDQ=,423,?_search_col=X not reflected correctly in the UI,9599,simonw,open,0,,,,,0,2019-03-28T21:48:19Z,2020-11-03T19:01:59Z,,OWNER,,"e.g. https://latest.datasette.io/fixtures/searchable?_search_text1=barry ![2019-03-28 at 2 47 PM](https://user-images.githubusercontent.com/9599/55195035-84ebb800-5168-11e9-910b-fc9868bcd93e.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/423/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 427429265,MDExOlB1bGxSZXF1ZXN0MjY2MDM1Mzgy,424,Column types in inspected metadata,45057,russss,closed,0,,,,,2,2019-03-31T18:46:33Z,2019-04-29T18:30:50Z,2019-04-29T18:30:46Z,CONTRIBUTOR,simonw/datasette/pulls/424,"This PR does two things: * Adds the sqlite column type for each column to the inspected table info. * Stops binary columns from being rendered to HTML, unless a plugin handles it. There's a bit more detail in the changeset descriptions. 
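The column-type part boils down to standard SQLite introspection - a rough sketch with plain `sqlite3` (table name invented for illustration):

```python
import sqlite3

# PRAGMA table_info reports the declared type of every column.
conn = sqlite3.connect(':memory:')
conn.execute('create table demo (id INTEGER PRIMARY KEY, name TEXT, geom BLOB)')
for cid, name, coltype, notnull, dflt, pk in conn.execute('pragma table_info(demo)'):
    print(name, coltype)  # id INTEGER / name TEXT / geom BLOB
```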
These changes are intended as a precursor to a plugin which adds first-class support for Spatialite geographic primitives, and perhaps more useful geo-stuff.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/424/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 430103450,MDU6SXNzdWU0MzAxMDM0NTA=,425,Submitting SQL on hide page is broken,9599,simonw,closed,0,,,,,2,2019-04-07T04:21:31Z,2019-04-12T05:12:13Z,2019-04-12T05:00:53Z,OWNER,,Clicking the submit button here doesn't work correctly: https://3a208a4.datasette.io/fixtures?sql=select+%2A+from+compound_three_primary_keys+order+by+pk1%2C+pk2%2C+pk3+limit+101&_hide_sql=1,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/425/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 431756352,MDExOlB1bGxSZXF1ZXN0MjY5MzY0OTI0,426,Upgrade to Jinja2==2.10.1,9599,simonw,closed,0,,,,,1,2019-04-10T23:03:08Z,2019-04-22T21:23:22Z,2019-04-10T23:13:31Z,OWNER,simonw/datasette/pulls/426,"https://nvd.nist.gov/vuln/detail/CVE-2019-10906 This is only a security issue of concern if evaluating templates from untrusted sources, which isn't something I would ever expect a Datasette user to do.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/426/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 431800286,MDU6SXNzdWU0MzE4MDAyODY=,427,"New design for facet abstraction, including querystring and metadata.json",9599,simonw,closed,0,,,,,10,2019-04-11T02:24:15Z,2019-05-29T21:39:12Z,2019-05-03T00:11:29Z,OWNER,,"I need a better design for query strings for facets (and for how facets are enabled in `metadata.json`). Think of all of the potential kinds of facets: * `?_facet_array=tags` where tags is a JSON array of values * `_facet_date=datetimecol` - faceted by date part of a datetime * `_facet_bins=numeric_column` - can I do some kind of fancy binning here? Might need to take an argument * `?_facet_bins=numeric_column:5` - could be a way to take an argument. We’ll ignore columns with a : in their name. * `?_facet_json=jsoncol:jsonpath` - could use a JSON path to extract out something to facet on? * `?_facet_percentile=numericcolumn` - could this work? * `?_facet_function=column:sqlfunctionname` - maybe this could be interesting? Would allow for e.g. facet by soundex * `?_facet_prefix=column:prefix` - facet by terms but only if they start with a specific prefix * `?_facet_substring=column:3,6` - facet by a substr(column, 3, 6) Maybe bundling JSON in querystrings is a way to do options? `?_facet_distance={""latitude_column"":""x"",...}` Could detect values starting with `{` - and if for some weird reason you have a column starting with that character you can pass this instead: `?_facet_percentile={""column"": ""{value}""}` This could even be the mechanism that allows us to extend regular facets to support additional options like adding a sum or max to each one. Problem: it’s not obvious what the name associated with these facets should be. What if one column is faceted multiple times using multiple facet variants? Maybe just number them? name1=… name2=… etc? Other option is to use Solr style querystring syntax for notation. 
Solr does this: `?f.price.facet.range.gap=100&f.age.facet.range.gap=10` So how about this: `?_facet_range=age&_facet_range.span=5` Related: #359",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/427/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 432217625,MDU6SXNzdWU0MzIyMTc2MjU=,19,Incorrect help text for enable-fts command,9599,simonw,closed,0,,,4348046,1.0,0,2019-04-11T19:46:44Z,2019-05-25T00:44:31Z,2019-05-25T00:44:31Z,OWNER,,"I clearly copied-and-pasted this from the `tables` command without updating it: https://github.com/simonw/sqlite-utils/blob/0b1af42ead3b3902347951180b3364ce1942da6e/sqlite_utils/cli.py#L216-L222",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 432371762,MDU6SXNzdWU0MzIzNzE3NjI=,428,Make ?_fts_table=x and ?_fts_pk=y available as URL parameters on table view,9599,simonw,closed,0,,,,,2,2019-04-12T03:30:55Z,2019-04-12T04:30:29Z,2019-04-12T04:21:25Z,OWNER,,"These can currently only be set using `metadata.json`: https://datasette.readthedocs.io/en/0.27/full_text_search.html#configuring-full-text-search-for-a-table-or-view There's no reason not to support these as URL parameters as well. That way it would be easy to use FTS search against a view without having to use `metadata.json`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/428/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 432636432,MDU6SXNzdWU0MzI2MzY0MzI=,429,?_where=sql-fragment parameter for table views,9599,simonw,closed,0,,,,,7,2019-04-12T15:58:51Z,2019-04-15T10:48:01Z,2019-04-13T01:37:25Z,OWNER,,"Only available if arbitrary SQL is enabled (the default). `?_where=id in (1,2,3)&_where=id in (select tag_id from tags)` Allows any table (or view) page to have arbitrary additional `extra_where` clauses defined using the URL! This would be extremely useful for building JavaScript applications against the Datasette API that only need one extra tiny bit of SQL but still want to benefit from other table view features like faceting. Would be nice if this could take `:named` parameters and have them filled in via querystring as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/429/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 432727685,MDU6SXNzdWU0MzI3Mjc2ODU=,20,JSON column values get extraneously quoted ,649467,mhalle,closed,0,,,4348046,1.0,1,2019-04-12T20:15:30Z,2019-05-25T00:57:19Z,2019-05-25T00:57:19Z,NONE,,"If the input to `sqlite-utils insert` includes a column that is a JSON array or object, `sqlite-utils query` will introduce an extra level of quoting on output: ``` # echo '[{""key"": [""one"", ""two"", ""three""]}]' | sqlite-utils insert t.db t - # sqlite-utils t.db 'select * from t' [{""key"": ""[\""one\"", \""two\"", \""three\""]""}] # sqlite3 t.db 'select * from t' [""one"", ""two"", ""three""] ``` This might require an imperfect solution, since sqlite3 doesn't have a JSON type. 
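A minimal reproduction of the double-encoding in plain Python (assuming the column value was stored as already-serialised JSON text):

```python
import json

# The column ends up holding serialised JSON text, so encoding the row
# to JSON a second time escapes the quotes inside the stored string.
stored = json.dumps(['one', 'two', 'three'])  # what ends up in the column
double_encoded = json.dumps({'key': stored})
print(stored)          # the list, encoded once
print(double_encoded)  # the same value, wrapped and escaped a second time
```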
Perhaps fields that start with `[""` or `{""` and end with `""]` or `""}` could be detected, with a flag to turn off that behavior for weird text fields (or vice versa).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 432792459,MDExOlB1bGxSZXF1ZXN0MjcwMTkxMDg0,430,"?_where= parameter on table views, closes #429",9599,simonw,closed,0,,,,,0,2019-04-13T01:15:09Z,2019-04-13T01:37:23Z,2019-04-13T01:37:23Z,OWNER,simonw/datasette/pulls/430,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/430/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 432870248,MDU6SXNzdWU0MzI4NzAyNDg=,431,Datasette doesn't reload when database file changes,82988,psychemedia,closed,0,,,,,3,2019-04-13T16:50:43Z,2019-05-02T05:13:55Z,2019-05-02T05:13:54Z,CONTRIBUTOR,,"My understanding of the `--reload` option was that if the database file changed `datasette` would automatically reload. I'm running on a Mac and from the `datasette` UI queries don't seem to be picking up data in a newly changed db (I checked the db timestamp - it certainly updated). I was also expecting to see some sort of log statement in the datasette logging to say that it had detected a file change and restarted, but don't see anything there? Will try to check on an Ubuntu box when I get a chance to see if this is a Mac thing.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/431/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 432893491,MDExOlB1bGxSZXF1ZXN0MjcwMjUxMDIx,432,"Refactor facets to a class and new plugin, refs #427",9599,simonw,closed,0,,,,,4,2019-04-13T20:04:45Z,2019-05-03T00:04:24Z,2019-05-03T00:04:24Z,OWNER,simonw/datasette/pulls/432,WIP for #427,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/432/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 433297989,MDU6SXNzdWU0MzMyOTc5ODk=,433,"?column__in=value1,value2,value3 filter",9599,simonw,closed,0,,,,,0,2019-04-15T13:58:24Z,2019-04-15T23:00:20Z,2019-04-15T23:00:20Z,OWNER,,"Support for the SQL `where column in (...)` construct, inspired by the new design for facet configuration in #427 `?column__in=value1,value2,value3` will map to `where column in (""value1"", ""value2"", ""value3"")` If comma separation won't work (because the values themselves contain commas) you can do this instead: `?column__in=[""value1"",""value2"",""value3,with-comma""]` See also #288",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/433/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 434321685,MDExOlB1bGxSZXF1ZXN0MjcxMzM4NDA1,434,"""datasette publish cloudrun"" command to publish to Google Cloud Run",10352819,rprimet,closed,0,,,,,8,2019-04-17T14:41:18Z,2019-05-03T21:50:44Z,2019-05-03T13:59:02Z,CONTRIBUTOR,simonw/datasette/pulls/434,"This is a very rough draft to start a discussion on a possible datasette cloud run publish plugin (see 
issue #400). The main change was to dynamically set the listening port in `make_dockerfile` to satisfy cloud run's [requirements](https://cloud.google.com/run/docs/reference/container-contract). This was done by running `datasette` through `sh` to get environment variable substitution. Not sure if that's the right approach? ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/434/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 435531034,MDU6SXNzdWU0MzU1MzEwMzQ=,435,Tracing support for seeing what SQL queries were executed,9599,simonw,closed,0,,,4305096,0.28,4,2019-04-21T17:37:37Z,2019-05-11T20:32:21Z,2019-05-11T19:07:42Z,OWNER,,"Features like faceting, foreign key expansions and now the inspect-less index view mean Datasette can end up executing a surprisingly large number of SQL queries to render a single page. Past experience with projects like [tikbar](https://github.com/simonw/tikibar) have shown that being able to see what actually went into rendering a page can be critical for optimizing performance and generally understanding how everything works. Support a tracing mode (probably via a `?_trace=1` querystring) which adds information about what is actually going on to both the HTML and the JSON.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/435/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 435819321,MDU6SXNzdWU0MzU4MTkzMjE=,436,400 Error when trying to register new user via https://publish.datasettes.com/,317694,nniiicc,closed,0,,,,,1,2019-04-22T17:55:00Z,2021-01-04T20:15:42Z,2021-01-04T20:15:41Z,NONE,,"Behavior: When registering a new user via Zeit - confirmation is sent and screen acknowledges registered user... When clicking grant access the next screen is a white 400 error message. Replicated: Chrome and Firefox; 2 different email accounts",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/436/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 438048318,MDExOlB1bGxSZXF1ZXN0Mjc0MTc0NjE0,437,Add inspect and prepare_sanic hooks,45057,russss,closed,0,,,,,2,2019-04-28T11:53:34Z,2019-06-24T16:38:57Z,2019-06-24T16:38:56Z,CONTRIBUTOR,simonw/datasette/pulls/437,"This adds two new plugin hooks: The `inspect` hook allows plugins to add data to the inspect dictionary. The `prepare_sanic` hook allows plugins to hook into the web router. I've attached a warning to this hook in the docs in light of #272 but I want this hook now... On quick inspection, I don't think it's worthwhile to try and make this hook independent of the web framework (but it looks like Starlette would make the hook implementation a bit nicer). Ref #14",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/437/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 438200529,MDU6SXNzdWU0MzgyMDA1Mjk=,438,Plugins are loaded when running pytest,45057,russss,closed,0,,,,,2,2019-04-29T08:25:58Z,2019-05-02T05:09:18Z,2019-05-02T05:09:11Z,CONTRIBUTOR,,"If I have a datasette plugin installed on my system, its hooks are called when running the main datasette tests. 
This is probably undesirable, especially with the inspect hook in #437, as the plugin may rely on inspected state that the tests don't know about.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/438/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 438240541,MDExOlB1bGxSZXF1ZXN0Mjc0MzEzNjI1,439,[WIP] Add primary key to the extra_body_script hook arguments,45057,russss,closed,0,,,,,2,2019-04-29T10:08:23Z,2019-05-01T09:58:32Z,2019-05-01T09:58:30Z,CONTRIBUTOR,simonw/datasette/pulls/439,"This allows the row to be identified on row pages. The context here is that I want to access the row's data to plot it on a map. I considered passing the entire template context through to the hook function. This would expose the actual row data and potentially avoid a further fetch request in JS, but it does make the plugin API a lot more leaky. (At any rate, using the selected row data is tricky in my case because of Spatialite's infuriating custom binary representation...)",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/439/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 438259941,MDU6SXNzdWU0MzgyNTk5NDE=,440,Plugin hook for additional data export formats,45057,russss,closed,0,,,,,0,2019-04-29T11:01:39Z,2019-05-01T23:01:57Z,2019-05-01T23:01:57Z,CONTRIBUTOR,,"It would be nice to have a simple way for plugins to provide additional data export formats. Might require a bit of work on the internals. I can work around this at a lower level with the `prepare_sanic` hook from #437 in the mean time. I guess plugins should be able to register a function which takes a row or list of rows and returns the rendered data. They'll also need to provide a file extension and probably a Content-Type. Datasette could then automatically include this format in the list of export formats on each page. Looks like this is related to #119.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/440/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 438437973,MDExOlB1bGxSZXF1ZXN0Mjc0NDY4ODM2,441,Add register_output_renderer hook,45057,russss,closed,0,,,,,8,2019-04-29T18:03:21Z,2019-05-01T23:01:57Z,2019-05-01T23:01:57Z,CONTRIBUTOR,simonw/datasette/pulls/441,"This changeset refactors out the JSON renderer and then adds a hook and dispatcher system to allow custom output renderers to be registered. The CSV output renderer is untouched because supporting streaming renderers through this system would be significantly more complex, and probably not worthwhile. We can't simply allow hooks to be called at request time because we need a list of supported file extensions when the request is being routed in order to resolve ambiguous database/table names. So, renderers need to be registered at startup. I've tried to make this API independent of Sanic's request/response objects so that this can remain stable during the switch to ASGI. I'm using dictionaries to keep it simple and to make adding additional options in the future easy. 
Fixes #440",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/441/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 438450757,MDExOlB1bGxSZXF1ZXN0Mjc0NDc4NzYx,442,Suppress rendering of binary data,45057,russss,closed,0,,,,,2,2019-04-29T18:36:41Z,2019-05-03T18:26:48Z,2019-05-03T16:44:49Z,CONTRIBUTOR,simonw/datasette/pulls/442,"Binary columns (including spatialite geographies) get shown as ugly binary strings in the HTML by default. Nobody wants to see that mess. Show the size of the column in bytes instead. If you want to decode the binary data, you can use a plugin to do it.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/442/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 439480260,MDExOlB1bGxSZXF1ZXN0Mjc1Mjc1NjEw,443,Pass view_name to extra_body_script hook,45057,russss,closed,0,,,,,0,2019-05-02T08:38:36Z,2019-05-03T13:12:20Z,2019-05-03T13:12:20Z,CONTRIBUTOR,simonw/datasette/pulls/443,"At the moment it's not easy to tell whether the hook is being called in (for example) the row or table view, as in both cases the `database` and `table` parameters are provided. This passes the `view_name` added in #441 to the `extra_body_script` hook.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/443/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 439487648,MDExOlB1bGxSZXF1ZXN0Mjc1MjgxMzA3,444,Add a max-line-length setting for flake8,45057,russss,closed,0,,,,,0,2019-05-02T08:58:57Z,2019-05-04T09:44:48Z,2019-05-03T13:11:28Z,CONTRIBUTOR,simonw/datasette/pulls/444,"This stops my automatic editor linting from flagging lines which are too long. It's been lingering in my checkout for ages. 160 is an arbitrary large number - we could alter it if we have any opinions (but I find the line length limit to be my least favourite part of PEP8).",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/444/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 439836586,MDExOlB1bGxSZXF1ZXN0Mjc1NTU4NjEy,445,"Extract facet code out into a new plugin hook, closes #427",9599,simonw,closed,0,,,,,0,2019-05-03T00:02:41Z,2019-05-03T18:17:18Z,2019-05-03T00:11:27Z,OWNER,simonw/datasette/pulls/445,"Datasette previously only supported one type of faceting: exact column value counting. With this change, faceting logic is extracted out into one or more separate classes which can implement other patterns of faceting - this is discussed in #427, but potential upcoming facet types include facet-by-date, facet-by-JSON-array, facet-by-many-2-many and more. A new plugin hook, register_facet_classes, can be used by plugins to add in additional facet classes. 
Each class must implement two methods: suggest(), which scans columns in the table to decide if they might be worth suggesting for faceting, and facet_results(), which executes the facet operation and returns results ready to be displayed in the UI.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/445/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 440134714,MDU6SXNzdWU0NDAxMzQ3MTQ=,446,Define mechanism for plugins to return structured data,9599,simonw,closed,0,,,3268330,Datasette 1.0,7,2019-05-03T17:00:16Z,2020-10-02T00:08:54Z,2020-10-02T00:08:47Z,OWNER,,"Several plugin hooks now expect plugins to return data in a specific shape - notably the new output format hook and the custom facet hook. These use Python dictionaries right now but that's quite error prone: it would be good to have a mechanism that supported a more structured format. Full list of current hooks is here: https://datasette.readthedocs.io/en/latest/plugins.html#plugin-hooks",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/446/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 440159137,MDExOlB1bGxSZXF1ZXN0Mjc1ODAxNDYz,447,Use dist: xenial and python: 3.7 on Travis,9599,simonw,closed,0,,,,,1,2019-05-03T18:07:07Z,2019-05-03T18:17:05Z,2019-05-03T18:16:53Z,OWNER,simonw/datasette/pulls/447,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/447/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 440222719,MDU6SXNzdWU0NDAyMjI3MTk=,448,_facet_array should work against views,9599,simonw,closed,0,,,3268330,Datasette 1.0,12,2019-05-03T21:08:04Z,2021-11-16T01:32:05Z,2021-11-16T01:19:40Z,OWNER,,"I created this view: https://json-view-facet-bug-demo-j7hipcg4aq-uc.a.run.app/russian-ads-8dbda00/ads_with_targets ``` CREATE VIEW ads_with_targets as select ads.*, json_group_array(targets.name) as target_names from ads join ad_targets on ad_targets.ad_id = ads.id join targets on ad_targets.target_id = targets.id group by ad_targets.ad_id ``` When I try to apply faceting by array it appears to work at first: https://json-view-facet-bug-demo-j7hipcg4aq-uc.a.run.app/russian-ads/ads_with_targets?_facet_array=target_names But actually it's doing the wrong thing - the SQL for the facets uses rowid, but rowid is not present on views at all! These results are incorrect, and clicking to select a facet will fail to produce any rows: https://json-view-facet-bug-demo-j7hipcg4aq-uc.a.run.app/russian-ads/ads_with_targets?_facet_array=target_names&target_names__arraycontains=people_who_match%3Ainterests%3AAfrican-American+Civil+Rights+Movement+%281954%E2%80%9468%29 Here's the SQL it should be using when you select a facet (note that it does not use a rowid): https://json-view-facet-bug-demo-j7hipcg4aq-uc.a.run.app/russian-ads?sql=select+*+from+ads_with_targets+where+id+in+%28%0D%0A++++++++++++select+ads_with_targets.id+from+ads_with_targets%2C+json_each%28ads_with_targets.target_names%29+j%0D%0A++++++++++++where+j.value+%3D+%3Ap0%0D%0A++++++++%29+limit+101&p0=people_who_match%3Ainterests%3ABlack+%28Color%29 So we need to do something a lot smarter here. 
I'm not sure what the fix will look like, or even if it's feasible given that views don't have a rowid to hook into so the JSON faceting SQL may have to be completely rewritten. ``` datasette publish cloudrun \ russian-ads.db \ --name json-view-facet-bug-demo \ --branch master \ --extra-options ""--config sql_time_limit_ms:5000 --config facet_time_limit_ms:5000"" ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/448/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 440237422,MDExOlB1bGxSZXF1ZXN0Mjc1ODYxNTU5,449,Apply black to everything,9599,simonw,closed,0,,,,,0,2019-05-03T21:57:26Z,2019-05-04T02:17:14Z,2019-05-04T02:15:15Z,OWNER,simonw/datasette/pulls/449,"I've been hesitating on this for literally months, because I'm not at all excited about the giant diff that will result. But I've been using black on many of my other projects (most actively [sqlite-utils](https://github.com/simonw/sqlite-utils)) and the productivity boost is undeniable: I don't have to spend a single second thinking about code formatting any more! So it's worth swallowing the one-off pain and moving on in a new, black-enabled world.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/449/reactions"", ""total_count"": 4, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 4, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 440304714,MDExOlB1bGxSZXF1ZXN0Mjc1OTA5MTk3,450,Coalesce hidden table count to 0,45057,russss,closed,0,,,,,2,2019-05-04T09:37:10Z,2019-05-11T18:10:09Z,2019-05-11T18:10:09Z,CONTRIBUTOR,simonw/datasette/pulls/450,"For some reason I'm hitting a `None` here with a FTS table. I'm not entirely sure why but this makes the logic work the same as with non-hidden tables.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/450/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 440313209,MDU6SXNzdWU0NDAzMTMyMDk=,451,Update README,9599,simonw,closed,0,,,4305096,0.28,1,2019-05-04T11:26:07Z,2019-05-19T22:23:43Z,2019-05-19T22:23:43Z,OWNER,,"The README is quite out of date now. It includes out-dated copies of help files, promotes the old Zeit Now integration and duplicates a lot of material from the docs.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/451/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 440325850,MDExOlB1bGxSZXF1ZXN0Mjc1OTIzMDY2,452,SQL builder utility classes,45057,russss,open,0,,,,,0,2019-05-04T13:57:47Z,2019-05-04T14:03:04Z,,CONTRIBUTOR,simonw/datasette/pulls/452,"This adds a straightforward set of classes to aid in the construction of SQL queries. My plan for this was to allow plugins to manipulate the Datasette-generated SQL in a more structured way. I'm not sure that's going to work, but I feel like this is still a step forward - it reduces the number of intermediate variables in `TableView.data` which aids readability, and also factors out a lot of the boring string concatenation. There are a fair number of minor structure changes in here too as I've tried to make the ordering of `TableView.data` a bit more logical. 
As far as I can tell, I haven't broken anything...",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/452/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 440332621,MDU6SXNzdWU0NDAzMzI2MjE=,453,Error pages do not return CORS header with --cors,9599,simonw,closed,0,,,,,1,2019-05-04T15:07:44Z,2019-05-05T12:24:24Z,2019-05-05T12:11:33Z,OWNER,,"This is very confusing. It means that if you send invalid SQL you will get back a CORS error, because the resulting 400 page cannot be accessed via JavaScript.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/453/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 440437037,MDU6SXNzdWU0NDA0MzcwMzc=,454,Plugin for allowing CORS from specified hosts,9599,simonw,closed,0,9599,simonw,,,5,2019-05-05T12:05:02Z,2019-10-03T23:59:57Z,2019-10-03T23:59:56Z,OWNER,,"It would be useful if Datasette could be configured to allow CORS requests from one or more origins, as opposed to only allowing either none or `""*""`. This is slightly tricky because the `Access-Control-Allow-Origin: https://foo.example` header is only allowed to return one value per request - and according to https://developer.mozilla.org/en-US/docs/Web/HTTP/CORS ""The Access-Control-Allow-Origin header should contain the value that was sent in the request's Origin header."" This means the application code needs to have a whitelist of allowed hosts and code that dynamically changes the outgoing `Access-Control-Allow-Origin` header based on the `Origin` header from the incoming request.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/454/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 441858747,MDU6SXNzdWU0NDE4NTg3NDc=,455,Hidden tables shown on the index page,9599,simonw,closed,0,,,4305096,0.28,1,2019-05-08T18:02:13Z,2019-05-14T15:49:29Z,2019-05-14T15:48:08Z,OWNER,,"Minor bug in master right now. https://csvconf.now.sh/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/455/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 442327592,MDU6SXNzdWU0NDIzMjc1OTI=,456,Installing installs the tests package,7725188,hellerve,closed,0,,,,,3,2019-05-09T16:35:16Z,2020-07-24T20:39:54Z,2020-07-24T20:39:54Z,CONTRIBUTOR,,"Because `setup.py` uses `find_packages` and `tests` is on the top-level, `pip install datasette` will install a top-level package called `tests`, which is probably not desired behavior. The offending line is here: https://github.com/simonw/datasette/blob/bfa2ae0d16d39bb82dbe4da4f3fdc3c7f6257418/setup.py#L40 And only `pip uninstall datasette` with a conflicting package would warn you by default; apparently another package had the same problem, which is why I get this message when uninstalling: ``` $ pip uninstall datasette Uninstalling datasette-0.27: Would remove: /usr/local/bin/datasette /usr/local/lib/python3.7/site-packages/datasette-0.27.dist-info/* /usr/local/lib/python3.7/site-packages/datasette/* /usr/local/lib/python3.7/site-packages/tests/* Would not remove (might be manually added): [ .. snip .. 
] Proceed (y/n)? ``` This should be a relatively simple fix, and I could drop a PR if desired! Cheers",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/456/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 442330564,MDU6SXNzdWU0NDIzMzA1NjQ=,457,"Ability to ""publish cloudrun"" with no user input",9599,simonw,closed,0,,,,,2,2019-05-09T16:42:51Z,2019-05-09T19:41:31Z,2019-05-09T16:45:08Z,OWNER,,"If you attempt to deploy a new version of a cloudrun deployment, the script currently pauses and asks for user input for the service name like this: ```77d4d7de-3dfc-4acc-9a23-efe16230f318 2019-05-09T15:01:48+00:00 52S gs://datasette-222320_cloudbuild/source/1557414063.1-3a82df8096e9434b93511b0588d8d155.tgz gcr.io/datasette-222320/sf-trees (+1 more) SUCCESS Service name: (sf-trees): USER INPUT REQUIRED HERE Deploying container to Cloud Run service [sf-trees] in project [datasette-222320] region [us-central1] ✓ Deploying... Done. ✓ Creating Revision... ✓ Routing traffic... ✓ Setting IAM Policy... ``` This is incompatible with running under CI.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/457/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 442402832,MDExOlB1bGxSZXF1ZXN0Mjc3NTI0MDcy,458,setup: add tests to package exclusion,7725188,hellerve,closed,0,,,,,1,2019-05-09T19:47:21Z,2020-07-21T01:14:42Z,2019-05-10T01:54:51Z,CONTRIBUTOR,simonw/datasette/pulls/458,"This PR fixes #456 by adding `tests` to the package exclusion list. Cheers",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/458/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 443020048,MDU6SXNzdWU0NDMwMjAwNDg=,459,"Fix the ""datasette now publish ... --alias=x"" option",9599,simonw,closed,0,,,4305096,0.28,3,2019-05-11T17:48:40Z,2019-05-11T20:22:08Z,2019-05-11T20:22:08Z,OWNER,,"Now have deprecated the mechanism we were using for this - running `now alias` without any parameters - in favour of something new: https://zeit.co/blog/automatic-aliasing",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/459/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 443020810,MDU6SXNzdWU0NDMwMjA4MTA=,460,Design changes to homepage to support mutable files,9599,simonw,closed,0,,,4305096,0.28,5,2019-05-11T17:58:05Z,2019-05-16T03:34:09Z,2019-05-16T03:24:16Z,OWNER,,"Needed for #419 - since we can now start up Datasette with a whole bunch of large connected databases that are mutable we can no longer guarantee a quick count of rows across all of the tables. 
New proposed homepage tweaks: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/460/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 443021509,MDU6SXNzdWU0NDMwMjE1MDk=,461,Paginate + search for databases/tables on the homepage,9599,simonw,open,0,,,3268330,Datasette 1.0,4,2019-05-11T18:05:34Z,2020-12-17T22:14:46Z,,OWNER,,Split out from #460 - in order to support large numbers of connected databases the homepage needs to be paginated.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/461/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 443023308,MDU6SXNzdWU0NDMwMjMzMDg=,462,Replace most of `.inspect()` (and `datasette inspect`) with table counting,9599,simonw,closed,0,,,4305096,0.28,4,2019-05-11T18:26:06Z,2019-05-16T14:31:05Z,2019-05-16T14:31:05Z,OWNER,,"This is the last part of #419 - with the move to supporting mutable databases by default, the inspect-data mechanism currently in use no-longer makes much sense. The one optimization I think it's worth keeping for databases opened in immutable mode is the cached table counts. I think `datasette inspect` should cut down to only counting the rows in the tables - the other things done by inspect (figuring out columns, foreign key relationships, FTS etc) should all be fast enough that they can be reliably performed at runtime even against large databases. If performing them at run-time has performance issues, I would rather cache those results internally within Datasette after they are first calculated than continue to support them in the `datasette inspect` command - to keep things simpler.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/462/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 443034003,MDU6SXNzdWU0NDMwMzQwMDM=,463,Write release notes for 0.28,9599,simonw,closed,0,,,4305096,0.28,1,2019-05-11T20:36:56Z,2019-05-19T21:24:44Z,2019-05-19T21:24:20Z,OWNER,,"So much new stuff! https://github.com/simonw/datasette/compare/0.27...master",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/463/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 443034218,MDU6SXNzdWU0NDMwMzQyMTg=,464,Add Glitch to Getting Started docs section,9599,simonw,closed,0,,,4305096,0.28,1,2019-05-11T20:39:39Z,2019-05-16T05:04:35Z,2019-05-16T05:03:46Z,OWNER,,Glitch is by far the easiest way to start trying out Datasette. 
Add a section to https://datasette.readthedocs.io/en/latest/getting_started.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/464/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 443038584,MDU6SXNzdWU0NDMwMzg1ODQ=,465,Decide what to do about /-/inspect,9599,simonw,closed,0,,,,,4,2019-05-11T21:39:46Z,2019-06-28T16:34:33Z,2019-06-28T16:34:33Z,OWNER,,"It's not clear to me what this endpoint should do now as a result of #419 - it's still useful to be able to introspect databases for tools like datasette-registry, but since we aren't pre-calculating introspection data any more I need to rethink the approach. For one thing, this endpoint may need to be paginated. Or maybe it should be split up into separate endpoints for each connected database? Those should probably be paginated too seeing as fivethirtyeight has 400+ tables.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/465/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 443040665,MDU6SXNzdWU0NDMwNDA2NjU=,466,"Move ""no such module: VirtualSpatialIndex"" code elsewhere",9599,simonw,closed,0,,,4305096,0.28,2,2019-05-11T22:09:00Z,2022-01-20T21:29:41Z,2019-05-11T22:57:22Z,OWNER,,"We currently show a useful warning (from #331) when the user tries to open a spatialite database without first loading the module: https://github.com/simonw/datasette/blob/c692cd291111050483a32bea1ee08e994a0b781b/datasette/app.py#L547-L554 This code is part of `.inspect()` which is going away - see #462 - so I need to find somewhere else for it to live.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/466/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 444711254,MDU6SXNzdWU0NDQ3MTEyNTQ=,467,Index page row counts only for DBs with < 30 tables (10ms count limit per table),9599,simonw,closed,0,,,4305096,0.28,2,2019-05-16T01:21:36Z,2019-05-16T03:03:45Z,2019-05-16T03:03:45Z,OWNER,,"Split out from #460. If a database is mutable, calculating row counts gets expensive. I'm only going to calculate row counts for the index page if it has less than X tables (both hidden and non-hidden) AND each table can be counted in less than 10ms. If any count takes longer than 10ms I'll cancel the counting entirely. We currently show an inaccurate count if this happens, which is just confusing.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/467/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 444746021,MDU6SXNzdWU0NDQ3NDYwMjE=,468,Pagination for the database index page,9599,simonw,closed,0,,,3268330,Datasette 1.0,1,2019-05-16T04:13:56Z,2020-10-16T23:20:26Z,2020-10-16T23:20:22Z,OWNER,,"Some databases have a LOT of tables. Now that we often calculate table row counts dynamically we could really speed things up by paginating the database index page, e.g. http://fivethirtyeight-datasette.herokuapp.com/fivethirtyeight If we're paginating, having a filter-search-for-table widget (similar to the search-for-database widget I'm planning for the homepage) would make sense. 
Related: pagination for homepage #461 and Datasette Library #417",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/468/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 444749373,MDU6SXNzdWU0NDQ3NDkzNzM=,469,publish commands should use new -i option,9599,simonw,closed,0,,,,,1,2019-05-16T04:31:40Z,2019-05-19T22:53:41Z,2019-05-19T22:53:41Z,OWNER,,"I can make this change only after releasing 0.28 - if I make the change earlier than that `publish heroku` etc will break because they will install the latest release of Datasette which will not understand the `-i` option. This is a one-line fix: replace this: https://github.com/simonw/datasette/blob/2ad9d15cd6901654e6801e2faa29e6fc08bae5fa/datasette/utils.py#L489 With this: (need to do it for other publishers too though) ``` quoted_files = "" "".join( [""-i {}"".format(shlex.quote(file_name)) for file_name in file_names] ) ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/469/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 444997937,MDU6SXNzdWU0NDQ5OTc5Mzc=,470,/-/databases showing currently attached database details,9599,simonw,closed,0,,,4305096,0.28,1,2019-05-16T14:45:18Z,2019-05-19T19:28:44Z,2019-05-16T14:50:26Z,OWNER,,"Split from #419. Mainly useful to see what is connected as mutable v.s. immutable. Also helps fill the gap left by `/-/inspect` until #465 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/470/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 445003029,MDU6SXNzdWU0NDUwMDMwMjk=,471,?_hash=1 and --config hash_urls:1 should only work for immutable databases,9599,simonw,closed,0,,,4305096,0.28,1,2019-05-16T14:54:25Z,2019-05-16T15:11:03Z,2019-05-16T15:11:03Z,OWNER,,Split from #419.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/471/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 445230077,MDU6SXNzdWU0NDUyMzAwNzc=,472,"Rename ""publish now"" to ""publish nowv1""",9599,simonw,closed,0,,,4305096,0.28,1,2019-05-17T01:58:52Z,2019-05-19T18:07:39Z,2019-05-19T18:07:39Z,OWNER,,This will help clarify that you need a nowv1 account use it.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/472/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 445850934,MDU6SXNzdWU0NDU4NTA5MzQ=,473,Plugin hook: filters_from_request,9599,simonw,closed,0,,,,,13,2019-05-19T18:44:33Z,2021-12-17T23:11:30Z,2021-12-17T19:02:17Z,OWNER,,"I meant to add this as part of the facets plugin mechanism but didn't quite get to it. 
Original idea was to allow plugins to register extra filters, as seen in `datasette/filters.py`: https://github.com/simonw/datasette/blob/260085838887ee343f4d3b177c422e7aef5ade9d/datasette/filters.py#L83-L98",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/473/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 445855789,MDU6SXNzdWU0NDU4NTU3ODk=,474,Do not allow downloads of mutable databases,9599,simonw,closed,0,,,4305096,0.28,1,2019-05-19T19:35:32Z,2019-05-19T20:41:17Z,2019-05-19T20:41:16Z,OWNER,,If the file changes during download it will probably result in a corrupt download. Safer not to allow downloads at all of mutable databases.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/474/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 445855910,MDU6SXNzdWU0NDU4NTU5MTA=,475,Documentation for about and about_url metadata,9599,simonw,closed,0,,,4305096,0.28,0,2019-05-19T19:36:59Z,2019-05-19T20:13:36Z,2019-05-19T20:13:36Z,OWNER,,Added in https://github.com/simonw/datasette/commit/bf6b0f918de4aeee7c1036ac975ce2fb23237da7 without docs.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/475/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 445858491,MDU6SXNzdWU0NDU4NTg0OTE=,476,"Remove ""datasette skeleton""",9599,simonw,closed,0,,,4305096,0.28,0,2019-05-19T20:04:11Z,2019-05-19T20:06:06Z,2019-05-19T20:06:06Z,OWNER,,"It doesn't work any more, and it's not a particularly useful feature - I've hardly used it since I added it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/476/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 445862501,MDU6SXNzdWU0NDU4NjI1MDE=,477,Documentation for ArrayFacet (facet by JSON array),9599,simonw,closed,0,,,4305096,0.28,0,2019-05-19T20:47:27Z,2019-05-29T21:39:12Z,2019-05-19T21:19:43Z,OWNER,,This is missing from https://datasette.readthedocs.io/en/0.27.1/facets.html right now,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/477/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 445868234,MDU6SXNzdWU0NDU4NjgyMzQ=,478,Make it so Docker build doesn't delay PyPI release,9599,simonw,closed,0,,,4471010,Datasette 0.29,3,2019-05-19T21:52:10Z,2019-07-08T03:30:41Z,2019-07-07T20:03:20Z,OWNER,,"Datasette automated releases currently include building a Docker image that has a full custom-compiled version of SQLite and SpatiaLite. This takes ages! I still want to publish this Docker image (to https://hub.docker.com/r/datasetteproject/datasette/tags ) but I'd like it if this wasn't a blocker on pushing the new package to PyPI. 
Ideally PyPI publish would happen first.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/478/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 445873563,MDExOlB1bGxSZXF1ZXN0MjgwMjA0Mjc2,479,doc typo fix,98555,IgnoredAmbience,closed,0,,,,,1,2019-05-19T22:54:25Z,2019-05-20T16:42:29Z,2019-05-20T16:42:29Z,CONTRIBUTOR,simonw/datasette/pulls/479,Fix typo in performance doc page,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/479/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 445875242,MDExOlB1bGxSZXF1ZXN0MjgwMjA1NTAy,480,Split pypi and docker travis tasks,813732,glasnt,closed,0,,,4471010,Datasette 0.29,1,2019-05-19T23:14:37Z,2019-07-07T20:03:20Z,2019-07-07T20:03:20Z,CONTRIBUTOR,simonw/datasette/pulls/480,"Resolves #478 This *should* work, but because this is a change that'll only really be testable on a) this repo, b) master branch, this might fail fast if I didn't get the configurations right. Looking at #478 it should just be as simple as splitting out the docker and pypi processes into separate jobs, but it might end up being more complicated than that, depending on what pre-processes the pypi deployment needs, and how travisci treats deployment steps without scripts in general. ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/480/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 446429421,MDU6SXNzdWU0NDY0Mjk0MjE=,481,Facet by date,9599,simonw,closed,0,,,,,1,2019-05-21T05:55:54Z,2019-05-29T21:39:12Z,2019-05-21T06:09:49Z,OWNER,,Ability to facet on datetime fields by their date.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/481/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 446433735,MDU6SXNzdWU0NDY0MzM3MzU=,482,Example of a custom facet plugin is incorrect,9599,simonw,closed,0,,,4471010,Datasette 0.29,0,2019-05-21T06:12:47Z,2019-07-07T23:19:10Z,2019-07-07T23:19:10Z,OWNER,,"The function signatures are wrong on https://datasette.readthedocs.io/en/0.28/plugins.html#register-facet-classes The new signatures are: `async def suggest(self)` and `async def facet_results(self)` - the `sql` and `params` are now passed to the class constructor.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/482/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 447408527,MDU6SXNzdWU0NDc0MDg1Mjc=,483,Option to facet by date using month or year,9599,simonw,open,0,,,,,5,2019-05-23T01:25:29Z,2019-05-29T21:38:27Z,,OWNER,,"Facet by date (from #481) can take datetimes and facet them by the day component. https://latest.datasette.io/fixtures/facetable?_facet_date=created I'd like to also be able to facet by month or year. I'm not sure what the best way to achieve this is. Could be two more Facet classes (YearFacet and MonthFacet) but I think it might be nicer if the existing DateFacet could take an optional argument that changed its behaviour. But... 
if I do that, do I expose it in the UI somewhere or is it only available to URL-hackers?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/483/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 447451492,MDU6SXNzdWU0NDc0NTE0OTI=,484,Mechanism for displaying summary of m2m relationships in rows on table view,9599,simonw,open,0,,,,,1,2019-05-23T05:02:41Z,2019-05-23T06:34:05Z,,OWNER,,"Part of #354 (m2m support) It would be fantastic if rows that are part of a m2m relationship could display it in an additional column in the table view. It might look something like this: https://russian-ira-facebook-ads.datasettes.com/russian-ads-919cbfd/display_ads?_search=black+lives+matter That example [was achieved](https://github.com/simonw/russian-ira-facebook-ads-datasette/blob/daf51a8c50a78e8bc7971c211005fd85e66ccf64/russian-ads-metadata.yaml#L72-L77) using a custom SQL query and [datasette-json-html](https://github.com/simonw/datasette-json-html) - but I'd like this to be a built-in feature instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/484/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 447469253,MDU6SXNzdWU0NDc0NjkyNTM=,485,Improvements to table label detection ,9599,simonw,open,0,9599,simonw,,,10,2019-05-23T06:19:49Z,2022-10-03T00:04:42Z,,OWNER,,"Label detection doesn't work if the primary key is called pk rather than id, so this page doesn't work: https://latest.datasette.io/fixtures/roadside_attraction_characteristics Code is here: https://github.com/simonw/datasette/blob/cccea85be6aaaeadb31f3b588ec7f732628815f5/datasette/app.py#L644-L653",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/485/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 448189298,MDU6SXNzdWU0NDgxODkyOTg=,486,Ability to add extra routes and related templates,2181410,clausjuhl,closed,0,,,,,2,2019-05-24T14:04:25Z,2019-05-24T14:43:28Z,2019-05-24T14:43:09Z,NONE,,"Hi Simon Thanks for an excellent job! Datasette is such an obviously good idea (once you have that idea!) and so well done. The only thing that I miss is the ability to add extra routes (with associated jinja2-templates). For most of the datasets that I would like to publish, I would also like at least a page that describes the data (semantics, provenance, biases...) and a page explaining our cookie- and privacy-policies (which would allow us to use something like Google Analytics). ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/486/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 448391492,MDU6SXNzdWU0NDgzOTE0OTI=,21,Option to ignore inserts if primary key exists already,9599,simonw,closed,0,,,,,3,2019-05-25T00:17:12Z,2019-05-29T05:09:01Z,2019-05-29T04:18:26Z,OWNER,,"> I've just noticed that SQLite lets you IGNORE inserts that collide with a pre-existing key. This can be quite handy if you have a dataset that keeps changing in part, and you don't want to upsert and replace pre-existing PK rows but you do want to ignore collisions to existing PK rows. 
> > Do `sqlite_utils` support such (cavalier!) behaviour? _Originally posted by @psychemedia in https://github.com/simonw/sqlite-utils/issues/18#issuecomment-480621924_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 448395665,MDU6SXNzdWU0NDgzOTU2NjU=,22,Release notes for 1.0,9599,simonw,closed,0,,,4348046,1.0,2,2019-05-25T00:58:03Z,2019-05-25T01:18:27Z,2019-05-25T01:06:52Z,OWNER,,https://github.com/simonw/sqlite-utils/compare/0.14...251e473,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 448664792,MDU6SXNzdWU0NDg2NjQ3OTI=,487,Refactor database methods off Datasette class,9599,simonw,closed,0,,,,,1,2019-05-27T04:52:41Z,2019-05-27T20:05:34Z,2019-05-27T05:08:01Z,OWNER,,"Methods like this one: https://github.com/simonw/datasette/blob/182a3017c24e3fa3af60e4ac0c91c7e48f8736fd/datasette/app.py#L497-L503 Should live on the `ConnectedDatabase` class instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/487/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 448668204,MDU6SXNzdWU0NDg2NjgyMDQ=,488,Move detect_primary_keys to Database class method,9599,simonw,closed,0,,,,,0,2019-05-27T05:11:51Z,2019-05-27T20:05:34Z,2019-05-27T18:29:02Z,OWNER,,"e.g. https://github.com/simonw/datasette/blob/026c84db30bd0a75ecde146a80a5d142078dc299/datasette/views/table.py#L73-L75 Should be ``` pks = await db.primary_keys(table) ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/488/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 448977444,MDU6SXNzdWU0NDg5Nzc0NDQ=,489,Pagination breaks when combined with expanded foreign keys,9599,simonw,closed,0,,,,,1,2019-05-27T19:56:56Z,2019-05-28T02:48:57Z,2019-05-28T02:23:27Z,OWNER,,"Consider https://edb3662.datasette.io/fixtures/roadside_attraction_characteristics?_sort=attraction_id&_size=2 The ""Next page"" link goes here, which returns 0 rows: https://edb3662.datasette.io/fixtures/roadside_attraction_characteristics?_size=2&_next=%257B%2527value%2527%253A%2B2%252C%2B%2527label%2527%253A%2B%2527Winchester%2BMystery%2BHouse%2527%257D%2C2&_sort=attraction_id That's because if you double-url-decode that `_next` link you get this: `_next={'value': 2, 'label': 'Winchester Mystery House'},2` It should be `_next=2,2`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/489/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 448978907,MDU6SXNzdWU0NDg5Nzg5MDc=,490,Rename InterruptedError exception class,9599,simonw,closed,0,,,,,1,2019-05-27T20:04:25Z,2019-05-28T00:16:45Z,2019-05-28T00:16:45Z,OWNER,,"https://github.com/simonw/datasette/blob/edb36629e7356f70f42b9d37fea5dfe9cc3c364a/datasette/utils.py#L49-L50 Python has a built-in exception called this, so we should call ours something else.",107914493,datasette,issue,,,"{""url"": 
""https://api.github.com/repos/simonw/datasette/issues/490/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 449445715,MDU6SXNzdWU0NDk0NDU3MTU=,491,Figure out how to use Firebase with cloudrun to enable vanity URLs and CDN caching,9599,simonw,open,0,,,,,0,2019-05-28T19:48:06Z,2019-05-28T19:48:35Z,,OWNER,,"It looks like Firebase can solve a couple of problems with the existing `datasette publish cloudrun` hosting mechanism: * The URLs it produces aren't pretty enough. Firebase offers more control over vanity URLs. * CDN caching (as seen in `datasette publish now`) is great for improving performance and saving money on Cloud Run execution time. https://firebase.google.com/docs/hosting/cloud-run looks like it can help with both of these. Lots of interesting questions: * Should this be a new `datasette publish firebase` command or should it instead be implemented as additional custom options to `datasette publish cloudrun`? * How much harder does it become to do account setup? * How much will this option cost users?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/491/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 449565204,MDU6SXNzdWU0NDk1NjUyMDQ=,23,Syntactic sugar for creating m2m records,9599,simonw,closed,0,,,,,10,2019-05-29T02:17:48Z,2019-08-04T03:54:58Z,2019-08-04T03:37:34Z,OWNER,,Python library only. What would be a syntactically pleasant way of creating a m2m record?,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 449818897,MDU6SXNzdWU0NDk4MTg4OTc=,24,Additional Column Constraints?,98555,IgnoredAmbience,closed,0,,,,,6,2019-05-29T13:47:03Z,2019-06-13T06:47:17Z,2019-06-13T06:30:26Z,NONE,,"I'm looking to import data from XML with a pre-defined schema that maps fairly closely to a relational database. In particular, it has explicit annotations for when fields are required, optional, or when a default value should be inferred. 
Would there be value in adding the ability to define `NOT NULL` and `DEFAULT` column constraints to sqlite-utils?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 449848803,MDU6SXNzdWU0NDk4NDg4MDM=,25,"Allow .insert(..., foreign_keys=()) to auto-detect table and primary key",9599,simonw,closed,0,,,,,4,2019-05-29T14:39:22Z,2019-06-13T05:32:32Z,2019-06-13T05:32:32Z,OWNER,,"The `foreign_keys=` argument currently takes a list of triples: ```python db[""usages""].insert_all( usages_to_insert, foreign_keys=( (""line_id"", ""lines"", ""id""), (""definition_id"", ""definitions"", ""id""), ), ) ``` As of #16 we have a mechanism for detecting the primary key column (the third item in this triple) - we should use that here too, so foreign keys can be optionally defined as a list of pairs.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 449854604,MDU6SXNzdWU0NDk4NTQ2MDQ=,492,Facets not correctly persisted in hidden form fields,9599,simonw,closed,0,,,3268330,Datasette 1.0,4,2019-05-29T14:49:39Z,2020-09-15T20:12:29Z,2020-09-15T20:12:29Z,OWNER,,"Steps to reproduce: visit https://2a4b892.datasette.io/fixtures/roadside_attractions?_facet_m2m=attraction_characteristic and click ""Apply"" Result is a 500: `no such column: attraction_characteristic` The error occurs because of this hidden HTML input: This should be: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/492/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 449886319,MDU6SXNzdWU0NDk4ODYzMTk=,493,Rename metadata.json to config.json,9599,simonw,open,0,,,3268330,Datasette 1.0,6,2019-05-29T15:48:03Z,2020-12-18T20:34:39Z,,OWNER,,"It is increasingly being used for configuration options, when it started out as purely metadata. 
Could cause confusion with the `--config` mechanism though - maybe that should be called ""settings"" instead?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/493/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 449931899,MDU6SXNzdWU0NDk5MzE4OTk=,494,--reload should only trigger for -i databases,9599,simonw,closed,0,9599,simonw,,,1,2019-05-29T17:28:43Z,2020-02-24T19:45:05Z,2020-02-24T19:45:05Z,OWNER,,Right now it's triggering any time a mutable database changes.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/494/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 450032134,MDU6SXNzdWU0NTAwMzIxMzQ=,495,facet_m2m gets confused by multiple relationships,9599,simonw,open,0,,,,,2,2019-05-29T21:37:28Z,2020-12-17T05:08:22Z,,OWNER,,"I got this for a database I was playing with: I think this is because of these three tables: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/495/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 450862577,MDU6SXNzdWU0NTA4NjI1Nzc=,496,Additional options to gcloud build command in cloudrun - timeout,1740337,costrouc,closed,0,,,,,1,2019-05-31T15:43:55Z,2019-05-31T23:05:05Z,2019-05-31T23:05:05Z,NONE,,"I am trying to deploy a 3.1 GB dataset to cloudrun with datasette. Currrently the docker build times out. Would be nice to have a timeout flag or additional gcloud commands that could be specified. Here is the line https://github.com/simonw/datasette/blob/f825e2012109247fa246e2b938f8174069e574f1/datasette/publish/cloudrun.py#L78 I would be happy to submit a PR to allow for a timeout option. What are your ideas of allowing the user additional build publishing flag options?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/496/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 451261628,MDExOlB1bGxSZXF1ZXN0Mjg0MzkwMTk3,497,Upgrade pytest to 4.6.1,9599,simonw,closed,0,,,,,0,2019-06-03T01:45:34Z,2019-06-03T02:06:32Z,2019-06-03T02:06:27Z,OWNER,simonw/datasette/pulls/497,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/497/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 451513541,MDU6SXNzdWU0NTE1MTM1NDE=,498,Full text search of all tables at once?,7936571,chrismp,closed,0,,,,,12,2019-06-03T14:24:43Z,2020-05-30T17:26:02Z,2020-05-30T17:26:02Z,NONE,,"Does datasette have a built-in way, in a browser, to do a full-text search of all columns, in all databases and tables, that have full-text search enabled? Is there a plugin that does this?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/498/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 451585764,MDU6SXNzdWU0NTE1ODU3NjQ=,499,Accessibility for non-techie newsies? 
,7936571,chrismp,open,0,,,,,3,2019-06-03T16:49:37Z,2019-06-05T21:22:55Z,,NONE,,"Hi again, I'm having fun uploading datasets to Heroku via datasette. I'd like to set up datasette so that it's easy for other newsroom workers, who don't use Linux and aren't programmers, to upload datasets. Does datsette provide this out-of-the-box, or as a plugin? ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/499/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 451705509,MDExOlB1bGxSZXF1ZXN0Mjg0NzQzNzk0,500,Fix typo in install step: should be install -e,32314,tmcw,closed,0,,,,,1,2019-06-03T21:50:51Z,2019-06-11T18:48:43Z,2019-06-11T18:48:40Z,CONTRIBUTOR,simonw/datasette/pulls/500,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/500/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 452901999,MDExOlB1bGxSZXF1ZXN0Mjg1Njk4MzEw,501,Test against Python 3.8-dev using Travis,9599,simonw,closed,0,,,,,3,2019-06-06T08:37:53Z,2019-11-11T03:23:29Z,2019-11-11T03:23:29Z,OWNER,simonw/datasette/pulls/501,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/501/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 453131917,MDU6SXNzdWU0NTMxMzE5MTc=,502,Exporting sqlite database(s)?,7936571,chrismp,closed,0,,,,,3,2019-06-06T16:39:53Z,2021-04-03T05:16:54Z,2019-06-11T18:50:42Z,NONE,,"I'm working on datasette from one computer. But if I want to work on it from another computer and want to copy the SQLite database(s) already on the Heroku datasette instance, how to I copy the database(s) to the second computer so that I can then update it and push to online via datasette's command line code that pushes code to Heroku?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/502/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 453243459,MDU6SXNzdWU0NTMyNDM0NTk=,503,Handle SQLite databases with spaces in their names?,7936571,chrismp,closed,0,9599,simonw,,,1,2019-06-06T21:20:59Z,2019-11-04T23:16:30Z,2019-11-04T23:16:30Z,NONE,,"I named my SQLite database ""Government workers"" and published it to Heroku. When I clicked the ""Government workers"" database online it lead to a 404 page: `Database not found: Government%20workers`. I believe this is because the database name has a space.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/503/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 453639196,MDU6SXNzdWU0NTM2MzkxOTY=,504,Remove TableView ?_group_count= feature,9599,simonw,closed,0,9599,simonw,,,0,2019-06-07T18:25:18Z,2019-11-06T05:13:10Z,2019-11-06T05:13:10Z,OWNER,,"This feature really doesn't warrant continuing to exist. 
For reference: #150 and #44 Don't forget to remove it from the docs: https://github.com/simonw/datasette/blob/172da009d890aa029cff7138b4dcfd4f60948525/docs/json_api.rst#L322-L324",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/504/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 453829910,MDU6SXNzdWU0NTM4Mjk5MTA=,505,Add white-space: pre-wrap to SQL create statement,9599,simonw,closed,0,9599,simonw,4471010,Datasette 0.29,0,2019-06-08T19:59:56Z,2019-07-07T20:26:55Z,2019-07-07T20:26:55Z,OWNER,,"Right now a super-long CREATE TABLE statement causes the table page to be even wider than the table itself. Adding `white-space: pre-wrap` to that element is an easy fix.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/505/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
453846217,MDU6SXNzdWU0NTM4NDYyMTc=,506,Option to display binary data,9599,simonw,closed,0,,,,,10,2019-06-08T23:44:12Z,2019-06-11T15:48:27Z,2019-06-09T16:07:39Z,OWNER,,"In #442 we suppressed rendering of binary data:

It turns out there is one use-case where displaying binary data is useful: when you're poking around looking at random SQLite databases you find in `~/Library` trying to figure out what they are for.

So, a mechanism for opting in to ugly display of binary data again would be useful.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/506/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
455486286,MDU6SXNzdWU0NTU0ODYyODY=,26,Mechanism for turning nested JSON into foreign keys / many-to-many,9599,simonw,open,0,,,,,14,2019-06-13T00:52:06Z,2022-06-29T23:35:29Z,,OWNER,,"The GitHub JSON APIs have a really interesting convention with respect to related objects.

Consider https://api.github.com/repos/simonw/sqlite-utils/issues - here's a truncated subset:
```json
  {
    ""id"": 449818897,
    ""node_id"": ""MDU6SXNzdWU0NDk4MTg4OTc="",
    ""number"": 24,
    ""title"": ""Additional Column Constraints?"",
    ""user"": {
      ""login"": ""IgnoredAmbience"",
      ""id"": 98555,
      ""node_id"": ""MDQ6VXNlcjk4NTU1"",
      ""avatar_url"": ""https://avatars0.githubusercontent.com/u/98555?v=4"",
      ""gravatar_id"": """"
    },
    ""labels"": [
      {
        ""id"": 993377884,
        ""node_id"": ""MDU6TGFiZWw5OTMzNzc4ODQ="",
        ""url"": ""https://api.github.com/repos/simonw/sqlite-utils/labels/enhancement"",
        ""name"": ""enhancement"",
        ""color"": ""a2eeef"",
        ""default"": true
      }
    ],
    ""state"": ""open""
  }
```
The `user` column lists a complete user. The `labels` column has a list of labels.

Since both user and label have a populated `id` field, this is actually enough information for us to create records for them AND set up the corresponding foreign key (for user) and m2m relationships (for labels).

It would be really neat if `sqlite-utils` had some kind of mechanism for correctly processing these kinds of patterns.
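A minimal sketch of how such a mechanism might behave, written against the Python API - the table names and the `insert_issue` helper are illustrative assumptions, not a shipped feature:
```python
# Hedged sketch: ingest one GitHub issue dict, turning the nested user
# into a foreign key and the labels list into an m2m relationship.
import sqlite_utils

db = sqlite_utils.Database('github.db')

def insert_issue(issue):
    user = issue.pop('user')
    labels = issue.pop('labels')
    db['users'].upsert(user, pk='id')
    issue['user'] = user['id']  # becomes a foreign key column
    db['issues'].upsert(issue, pk='id', foreign_keys=[('user', 'users', 'id')])
    for label in labels:
        db['labels'].upsert(label, pk='id')
        # join table keyed on the compound primary key
        db['issues_labels'].upsert(
            {'issues_id': issue['id'], 'labels_id': label['id']},
            pk=('issues_id', 'labels_id'),
        )
```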

Thanks to `jq` there's not much need for extra customization of the shape here - if we support a narrowly defined structure users can use `jq` to reshape arbitrary JSON to match.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/26/reactions"", ""total_count"": 4, ""+1"": 4, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
455496504,MDU6SXNzdWU0NTU0OTY1MDQ=,27,sqlite-utils create-table command,9599,simonw,closed,0,,,,,8,2019-06-13T01:43:30Z,2020-05-03T15:26:15Z,2020-05-03T15:26:15Z,OWNER,,"Spun off from #24 - it would be useful if CLI users could create new tables (with explicit column types, not null rules and defaults) without having to insert an example record.
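For reference, the CLI syntax that eventually shipped takes column name/type pairs - this example is an approximation, so check `sqlite-utils create-table --help` for the exact options:

    $ sqlite-utils create-table mydb.db mytable id integer name text --pk=id --not-null name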

- [x] Get it working
- [x] Support `--pk`
- [x] Support `--not-null`
- [x] Support `--default`
- [x] Support `--fk colname othertable othercol`
- [x] Support `--replace` and `--ignore`
- [x] Documentation",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
455852801,MDU6SXNzdWU0NTU4NTI4MDE=,507,Every datasette plugin on the ecosystem page should have a screenshot,9599,simonw,open,0,,,,,4,2019-06-13T17:02:51Z,2020-09-17T02:47:35Z,,OWNER,,https://github.com/simonw/datasette/blob/master/docs/ecosystem.rst,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/507/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
455965174,MDU6SXNzdWU0NTU5NjUxNzQ=,508,Ability to set default sort order for a table or view in metadata.json,9599,simonw,closed,0,9599,simonw,,,1,2019-06-13T21:40:51Z,2020-05-28T18:53:03Z,2020-05-28T18:53:02Z,OWNER,,"It can go here in the documentation: https://datasette.readthedocs.io/en/stable/metadata.html#setting-which-columns-can-be-used-for-sorting
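For reference, a sketch of what the resulting metadata might look like, generated here with Python - the `sort` key and all table/column names are assumptions for illustration:
```python
# Hedged sketch: write a metadata.json that sets a default sort column
import json

metadata = {
    'databases': {
        'mydatabase': {
            'tables': {
                'example_table': {
                    'sort': 'created'  # assumed key for the default sort order
                }
            }
        }
    }
}

with open('metadata.json', 'w') as fp:
    json.dump(metadata, fp, indent=4)
```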

Also need to fix this sentence which is no longer true:

> By default, database views in Datasette do not support sorting",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/508/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
455996809,MDU6SXNzdWU0NTU5OTY4MDk=,28,"Rearrange the docs by area, not CLI vs Python",9599,simonw,closed,0,,,,,1,2019-06-13T23:33:35Z,2019-07-15T02:37:20Z,2019-07-15T02:37:20Z,OWNER,,"The docs for eg inserting data should live on the same page, rather than being split across the API and CLI pages.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
456568880,MDU6SXNzdWU0NTY1Njg4ODA=,509,Support opening multiple databases with the same stem,9599,simonw,closed,0,9599,simonw,3268330,Datasette 1.0,4,2019-06-15T19:32:00Z,2020-12-22T20:04:35Z,2020-12-22T20:04:35Z,OWNER,,"e.g. I should be able to do this:

    datasette App/data.db Other_App/data.db

This currently errors because you can't have two databases taking the `/data` URL path.

Instead, how about in this particular case assigning the second database `/data-1`?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/509/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
456569067,MDU6SXNzdWU0NTY1NjkwNjc=,510,Ability to facet by delimiter (e.g. comma separated fields),9599,simonw,open,0,9599,simonw,,,1,2019-06-15T19:34:41Z,2019-07-08T15:44:51Z,,OWNER,,"E.g. if a field contains ""Tags,With,Commas"" be able to facet them in the same way as `_facet_array=` lets you facet `[""Tags"", ""With"", ""Commas""]`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/510/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
456578474,MDU6SXNzdWU0NTY1Nzg0NzQ=,511,Get Datasette tests passing on Windows in GitHub Actions,9599,simonw,open,0,,,,,13,2019-06-15T21:41:58Z,2021-07-11T17:23:05Z,,OWNER,,"This should almost happen as a side-effect of moving from Sanic to Uvicorn during the port to ASGI: #272 

Additional steps:

- test it manually
- update documentation
- set up some form of Windows CI
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/511/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
457147936,MDU6SXNzdWU0NTcxNDc5MzY=,512,"""about"" parameter in metadata does not appear when alone",7936571,chrismp,open,0,,,,,3,2019-06-17T21:04:20Z,2019-10-11T15:49:13Z,,NONE,,"Here's an example of metadata I have for one database on datasette.

```
""Records-requests"": {
	""tables"": {
		""Some table"": {
			""about"": ""This table has data.""
		}
	}
}
```

The text in `about` does not show up when I publish the data. But it shows up after I add a `""source""` parameter in the metadata.
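For reference, the workaround shape described above - the `source` value here is just a placeholder:
```
""Records-requests"": {
	""tables"": {
		""Some table"": {
			""about"": ""This table has data."",
			""source"": ""Example source""
		}
	}
}
```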

Is this intended?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/512/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
457201907,MDU6SXNzdWU0NTcyMDE5MDc=,513,Is it possible to publish to Heroku despite slug size being  too large?,7936571,chrismp,closed,0,,,,,2,2019-06-18T00:12:02Z,2019-06-21T22:35:54Z,2019-06-21T22:35:54Z,NONE,,"I'm trying to push more than 1.5GB worth of SQLite databases -- 535MB compressed -- to Heroku but I get this error when I run the `datasette publish heroku` command.

    Compiled slug size: 535.5M is too large (max is 500M).

Can I publish the databases and make datasette work on Heroku despite the large slug size?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/513/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
458941203,MDU6SXNzdWU0NTg5NDEyMDM=,29,Prevent accidental add-foreign-key with invalid column,9599,simonw,closed,0,,,,,0,2019-06-20T23:57:24Z,2019-06-20T23:58:26Z,2019-06-20T23:58:26Z,OWNER,,"You can corrupt your database by running:

    $ sqlite-utils add-foreign-key my.db table non_existent_column other_table other_column
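A validation step along these lines would catch that before the schema is altered (a sketch, not necessarily how the fix shipped):
```python
# Sketch: refuse to add a foreign key unless both columns actually exist
import sqlite_utils

db = sqlite_utils.Database('my.db')

def safe_add_foreign_key(table, column, other_table, other_column):
    # columns_dict maps column names to their types for an existing table
    if column not in db[table].columns_dict:
        raise ValueError('No such column: {}.{}'.format(table, column))
    if other_column not in db[other_table].columns_dict:
        raise ValueError(
            'No such column: {}.{}'.format(other_table, other_column))
    db[table].add_foreign_key(column, other_table, other_column)
```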
",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
459397625,MDU6SXNzdWU0NTkzOTc2MjU=,514,Documentation with recommendations on running Datasette in production without using Docker,7936571,chrismp,closed,0,,,5971510,Datasette 0.50,27,2019-06-21T22:48:12Z,2020-10-08T23:55:53Z,2020-10-08T23:33:05Z,NONE,,"I've got some SQLite databases too big to push to Heroku or the other services with built-in support in datasette. 

So instead I moved my datasette code and databases to a remote server on Kimsufi. In the folder containing the SQLite databases I run the following code.

`nohup datasette serve -h 0.0.0.0 *.db --cors --port 8000 --metadata metadata.json > output.log 2>&1 &`.

When I go to `http://my-remote-server.com:8000`, the site loads. But I know this is not a good long-term solution to running datasette on this server. 

What is the ""correct"" way to have this site run, preferably on server port 80?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/514/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
459469278,MDU6SXNzdWU0NTk0NjkyNzg=,515,Try shrinking official image with docker-slim,9599,simonw,open,0,,,,,0,2019-06-22T12:25:37Z,2019-06-22T12:25:37Z,,OWNER,,"This looks really promising: https://github.com/docker-slim/docker-slim

If it can shave substantial size from our official container reliably we could add it to the automated build process.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/515/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
459509126,MDU6SXNzdWU0NTk1MDkxMjY=,516,Enforce import sort order with isort,9599,simonw,open,0,,,,,4,2019-06-22T20:35:50Z,2019-06-24T05:06:59Z,,OWNER,,"I want to use isort to order imports. A few steps here:

- [x] Add a .isort.cfg file (see below)
- [x] Use `isort -rc` to reformat existing code
- [ ] Commit this change
- [x] Add a unit test that ensures future changes remain isort compatible",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/516/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,
459537047,MDU6SXNzdWU0NTk1MzcwNDc=,517,"Add unit test for ""static"" mechanism in plugins",9599,simonw,closed,0,,,,,1,2019-06-23T05:03:31Z,2021-01-04T20:15:19Z,2021-01-04T20:15:19Z,OWNER,,"Split out from #272 - this is actually quite tricky. Here's the relevant code:

https://github.com/simonw/datasette/blob/35429f90894321eda7f2db31b9ea7976f31f73ac/datasette/utils.py#L602-L614",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/517/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
459587155,MDExOlB1bGxSZXF1ZXN0MjkwODk3MTA0,518,Port Datasette from Sanic to ASGI + Uvicorn,9599,simonw,closed,0,9599,simonw,3268330,Datasette 1.0,12,2019-06-23T15:18:42Z,2019-06-24T13:42:50Z,2019-06-24T03:13:09Z,OWNER,simonw/datasette/pulls/518,"Most of the code here was fleshed out in comments on #272 (Port Datasette to ASGI) - this pull request will track the final pieces:

- [x] Update test harness to more correctly simulate the `raw_path` issue
- [x] Use `raw_path` so table names containing `/` can work correctly
- [x] Bug: JSON not served with correct content-type
- [x] Get ?_trace=1 working again
- [x] Replacement for `@app.listener(""before_server_start"")`
- [x] Bug: `/fixtures/table%2Fwith%2Fslashes.csv?_format=json` downloads as CSV
- [x] Replace Sanic request and response objects with my own classes, so I can remove Sanic dependency
- [x] Final code tidy-up before merging to master",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/518/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0,
459590021,MDU6SXNzdWU0NTk1OTAwMjE=,519,Decide what goes into Datasette 1.0,9599,simonw,closed,0,,,3268330,Datasette 1.0,4,2019-06-23T15:47:41Z,2021-11-15T23:26:11Z,2021-11-15T23:26:11Z,OWNER,,Datasette ASGI #272 is a big part of it... but 1.0 will generally be an indicator that Datasette is a stable platform for developers to write plugins and custom templates against. So lots to think about.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/519/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
459598080,MDU6SXNzdWU0NTk1OTgwODA=,520,asgi_wrapper plugin hook,9599,simonw,closed,0,9599,simonw,,,3,2019-06-23T17:16:45Z,2019-07-03T04:40:34Z,2019-07-03T04:06:28Z,OWNER,,"After #272 we can finally add this hook. It will allow plugins to wrap their own ASGI middleware around Datasette. Potential use-cases include:

* adding authentication
* custom CORS headers (see #454)
* maybe gzip support?
* possibly defining entirely new routes, though that may be better handled by a separate hook",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/520/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed
459621683,MDU6SXNzdWU0NTk2MjE2ODM=,521,Easier way of creating custom row templates,9599,simonw,closed,0,,,,,6,2019-06-23T21:49:27Z,2019-07-03T03:23:56Z,2019-07-03T03:23:56Z,OWNER,,"I was messing around with a custom `_rows_and_columns.html` template and ended up with this:
```html
{% for row in display_rows %}
  {% for cell in row %}
    {% if cell.column == ""First_Name"" %}
      {{ cell.value }}
    {% elif cell.column == ""Last_Name"" %}
      {{ cell.value }}
    {% elif cell.column == ""Short_Description"" %}
      {{ cell.column }}: {{ cell.value }}
    {% else %}
      {{ cell.column }}: {{ cell.value }}
    {% endif %}
  {% endfor %}
{% endfor %}
```
This is nasty. I'd like to be able to do something like this instead:
```
{% for row in display_rows %}
  {{ row[""First_Name""] }} {{ row[""Last_Name""] }}
... ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/521/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459622390,MDU6SXNzdWU0NTk2MjIzOTA=,522,Handle case-insensitive headers in a nicer way,9599,simonw,open,0,,,,,1,2019-06-23T21:56:34Z,2019-06-26T18:48:53Z,,OWNER,,Spun out from https://github.com/simonw/datasette/pull/518#discussion_r296486289,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/522/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 459627549,MDU6SXNzdWU0NTk2Mjc1NDk=,523,Show total/unfiltered row count when filtering,2657547,rixx,closed,0,,,,,2,2019-06-23T22:56:48Z,2019-06-24T01:38:14Z,2019-06-24T01:38:14Z,CONTRIBUTOR,,"When I'm seeing a filtered view of a table, I'd like to be able to see something like '2 rows where status != ""closed"" (of 1000 total)' to have a context for the data I'm seeing – e.g. currently my database is being filled by an importer, so this information would be super helpful. Since this information would be a performance hit, maybe something like '12 rows where status != ""closed"" (of ??? total)' with lazy-loading on-click(?) could be applied (Or via a ""How many total?"" tooltip, or …)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/523/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459689615,MDExOlB1bGxSZXF1ZXN0MjkwOTcxMjk1,524,"Sort commits using isort, refs #516",9599,simonw,open,0,,,,,0,2019-06-24T05:04:48Z,2019-06-24T05:45:00Z,,OWNER,simonw/datasette/pulls/524,Also added a lint unit test to ensure they stay sorted. #516,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/524/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 459714943,MDU6SXNzdWU0NTk3MTQ5NDM=,525,Add section on sqite-utils enable-fts to the search documentation,9599,simonw,closed,0,9599,simonw,,,2,2019-06-24T06:39:16Z,2019-06-24T16:36:35Z,2019-06-24T16:29:43Z,OWNER,,"https://datasette.readthedocs.io/en/stable/full_text_search.html already has a section about csvs-to-sqlite, sqlite-utils is even more relevant.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/525/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 459882902,MDU6SXNzdWU0NTk4ODI5MDI=,526,Stream all results for arbitrary SQL and canned queries,50578294,matej-fr,open,0,,,,,23,2019-06-24T13:09:45Z,2022-09-28T04:01:25Z,,NONE,,"I think that there is a difficulty with canned queries. When I want to stream all results of a canned query TwoDays I get only first 1.000 records. Example: `http://myserver/history_sample/two_days.csv?_stream=on` returns only first 1.000 records. If I do the same with the whole database i.e. `http://myserver/history_sample/database.csv?_stream=on` I get correctly all records. 
Any ideas?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/526/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 459936585,MDU6SXNzdWU0NTk5MzY1ODU=,527,Unable to use rank when fts-table generated with csvs-to-sqlite,2181410,clausjuhl,closed,0,,,,,3,2019-06-24T14:49:48Z,2019-06-24T15:21:18Z,2019-06-24T15:09:10Z,NONE,,"Hi Simon. If I generate a fts-table with the csvs-to-sqlite f-option, I'm unable to use (in datasette's GUI) the internal ranking of the table for sorting or viewing, but if I generate the fts-table with the enable-fts argument from sqlite-utils, everything works OK. Eg.: datasette, version 0.28 sqlite-utils, version 1.2.1 csvs-to-sqlite, version 0.9 No column named rank with these commands: $ csvs-to-sqlite minutes.csv minutes.db -f text_data $ datasette -i minutes.db select rank, * from minutes_fts where minutes_fts match 'dog' Everything ok with these commands: $ csvs-to-sqlite minutes.csv minutes.db $ sqlite-utils enable-fts minutes.db text_data $ datasette -i minutes.db select rank, * from minutes_fts where minutes_fts match 'dog' Am I doing something wrong? Thank you for a great application!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/527/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 460095928,MDU6SXNzdWU0NjAwOTU5Mjg=,528,Establish a pattern for Datasette plugins built on top of Pandas,9599,simonw,open,0,,,,,0,2019-06-24T21:05:52Z,2019-06-24T21:05:52Z,,OWNER,,"The Pandas ecosystem is huge, varied and full of tools that are really good at doing interesting analysis on top of tabular data. Pandas should not be a dependency of Datasette core, but I think there is a lot of potential in having plugins which use Pandas to apply interesting analysis to data sucked out of Datasette's SQLite tables. One example ([thanks, Tony](https://twitter.com/psychemedia/status/1143259809715752962)): https://github.com/ResidentMario/missingno could form the basis of a fantastic plugin for getting a high-level overview of how complete each column in a table is. Some thought is needed here about what shape these kinds of plugins might take, and what plugin hooks they would use.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/528/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 460396952,MDExOlB1bGxSZXF1ZXN0MjkxNTM0NTk2,529,Use keyed rows - fixes #521,1383872,nathancahill,closed,0,,,,,1,2019-06-25T12:33:48Z,2019-06-25T12:35:07Z,2019-06-25T12:35:07Z,NONE,simonw/datasette/pulls/529,"Supports template syntax like this:
```
{% for row in display_rows %}
  {{ row[""First_Name""] }} {{ row[""Last_Name""] }}
... ```",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/529/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 460540321,MDU6SXNzdWU0NjA1NDAzMjE=,530,Extract codemirror SQL editor out into a plugin,9599,simonw,closed,0,,,,,1,2019-06-25T17:07:51Z,2020-10-01T00:42:08Z,2020-10-01T00:42:08Z,OWNER,,"Right now codemirror (used for the SQL editor on https://latest.datasette.io/fixtures?sql=select+*+from+%5B123_starts_with_digits%5D ) is the only JavaScript in Datasette. It's also the only vendored dependency. I'd like to move it out to a plugin. But... ideally I would like that plugin to be part of the default ""pip install datasette"" experience. I don't know what the best pattern for optional dependencies is. I don't want to have to tell people to run `pip install datasette[full]`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/530/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 461215118,MDU6SXNzdWU0NjEyMTUxMTg=,30,Option to open database in read-only mode,9599,simonw,closed,0,,,,,1,2019-06-26T22:50:38Z,2020-05-11T19:17:17Z,2020-05-11T19:17:17Z,OWNER,,Would this make it 100% safe to run reads against a database file that is being written to by another process?,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 461237618,MDU6SXNzdWU0NjEyMzc2MTg=,31,Mechanism for adding multiple foreign key constraints at once,9599,simonw,closed,0,,,,,0,2019-06-27T00:04:30Z,2019-06-29T06:27:40Z,2019-06-29T06:27:40Z,OWNER,,"Needed by [db-to-sqlite](https://github.com/simonw/db-to-sqlite). It currently works by collecting all of the foreign key relationships it can find and then applying them at the end of the process. The problem is, the `add_foreign_key()` method looks like this: https://github.com/simonw/sqlite-utils/blob/86bd2bba689e25f09551d611ccfbee1e069e5b66/sqlite_utils/db.py#L498-L516 That means it's doing a full `VACUUM` for every single relationship it sets up - and if you have hundreds of foreign key relationships in your database this can take hours. I think the right solution is to have a `.add_foreign_keys(list_of_args)` method which does the bulk operation and then a single `VACUUM`. `.add_foreign_key(...)` can then call the bulk action with a single list item.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 462094937,MDExOlB1bGxSZXF1ZXN0MjkyODc5MjA0,32,db.add_foreign_keys() method,9599,simonw,closed,0,,,,,1,2019-06-28T15:40:33Z,2019-06-29T06:27:39Z,2019-06-29T06:27:39Z,OWNER,simonw/sqlite-utils/pulls/32,"Refs #31. 
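A usage sketch of the bulk API shape from #31 - the table and column names are illustrative:
```python
# Sketch: set up several foreign keys, paying the VACUUM cost only once
import sqlite_utils

db = sqlite_utils.Database('my.db')
db.add_foreign_keys([
    # (table, column, other_table, other_column)
    ('usages', 'line_id', 'lines', 'id'),
    ('usages', 'definition_id', 'definitions', 'id'),
])
```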
Still TODO: - [x] Unit tests - [x] Documentation",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 462117311,MDU6SXNzdWU0NjIxMTczMTE=,531,/database/-/inspect,9599,simonw,open,0,,,,,1,2019-06-28T16:33:41Z,2019-07-08T15:43:57Z,,OWNER,,"Build `/database/-/inspect` which shows tables, columns, column types and foreign keys It won't show table counts. Or maybe it will include them optionally but only for `-i` databases, in a special area of the JSON reserved for immutable-only inspect details. _Originally posted by @simonw in https://github.com/simonw/datasette/issues/465#issuecomment-506797086_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/531/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 462423839,MDU6SXNzdWU0NjI0MjM4Mzk=,33,index_foreign_keys / index-foreign-keys utilities,9599,simonw,closed,0,,,,,2,2019-06-30T16:42:03Z,2019-06-30T23:54:11Z,2019-06-30T23:50:55Z,OWNER,,"Sometimes it's good to have indices on all columns that are foreign keys, to allow for efficient reverse lookups. This would be a useful utility: $ sqlite-utils index-foreign-keys database.db ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 462423972,MDExOlB1bGxSZXF1ZXN0MjkzMTE3MTgz,34,sqlite-utils index-foreign-keys / db.index_foreign_keys(),9599,simonw,closed,0,,,,,0,2019-06-30T16:43:40Z,2019-06-30T23:50:55Z,2019-06-30T23:50:55Z,OWNER,simonw/sqlite-utils/pulls/34,"Refs #33 - [x] `sqlite-utils index-foreign-keys` command - [x] `db.index_foreign_keys()` method - [x] unit tests - [x] documentation",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 462430920,MDU6SXNzdWU0NjI0MzA5MjA=,35,table.update(...) method,9599,simonw,closed,0,,,,,2,2019-06-30T18:06:15Z,2019-07-28T15:43:52Z,2019-07-28T15:43:52Z,OWNER,,"Spun off from #23 - this method will allow a user to update a specific row. Currently the only way to do that it is to call `.upsert({full record})` with the primary key field matching an existing record - but this does not support partial updates. ```python db[""events""].update(3, {""name"": ""Renamed""}) ``` This method only works on an existing table, so there's no need for a `pk=""id""` specifier - it can detect the primary key by looking at the table. If the primary key is compound the first argument can be a tuple: ```python db[""events_venues""].update((3, 2), {""custom_label"": ""Label""}) ``` The method can be called without the second dictionary argument. 
Doing this selects the row specified by the primary key (throwing an error if it does not exist) and remembers it so that chained operations can be carried out - see proposal in https://github.com/simonw/sqlite-utils/issues/23#issuecomment-507055345 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/35/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 462817589,MDU6SXNzdWU0NjI4MTc1ODk=,36,Support compound primary keys,9599,simonw,closed,0,,,,,0,2019-07-01T17:00:07Z,2019-07-15T04:28:52Z,2019-07-15T04:28:52Z,OWNER,,"This should work: ```python table = db[""dog_breeds""].insert({ ""dog_id"": 1, ""breed_id"": 2 }, pk=(""dog_id"", ""breed_id"")) ``` Needed for m2m work in #23",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 462928038,MDU6SXNzdWU0NjI5MjgwMzg=,532,Switch setup.py to using ~= for dependencies,9599,simonw,closed,0,,,,,0,2019-07-01T21:53:48Z,2019-07-03T04:32:58Z,2019-07-03T04:32:58Z,OWNER,,"`~=` means ""compatible release"" https://www.python.org/dev/peps/pep-0440/#compatible-release See also https://stackoverflow.com/questions/39590187/in-requirements-txt-what-does-tilde-equals-mean",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/532/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 463492395,MDExOlB1bGxSZXF1ZXN0MjkzOTYyNDA1,533,"Support cleaner custom templates for rows and tables, closes #521",9599,simonw,closed,0,,,,,1,2019-07-03T00:40:18Z,2019-07-03T03:23:06Z,2019-07-03T03:23:06Z,OWNER,simonw/datasette/pulls/533,"- [x] Rename `_rows_and_columns.html` to `_table.html` - [x] Unit test - [x] Documentation",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/533/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 463492815,MDU6SXNzdWU0NjM0OTI4MTU=,534,500 error on m2m facet detection,9599,simonw,open,0,,,,,1,2019-07-03T00:42:42Z,2020-12-17T05:08:22Z,,OWNER,,"This may help debug: ``` diff --git a/datasette/facets.py b/datasette/facets.py index 76d73e5..07a4034 100644 --- a/datasette/facets.py +++ b/datasette/facets.py @@ -499,11 +499,14 @@ class ManyToManyFacet(Facet): ""outgoing"" ] if len(other_table_outgoing_foreign_keys) == 2: - destination_table = [ - t - for t in other_table_outgoing_foreign_keys - if t[""other_table""] != self.table - ][0][""other_table""] + try: + destination_table = [ + t + for t in other_table_outgoing_foreign_keys + if t[""other_table""] != self.table + ][0][""other_table""] + except IndexError: + import pdb; pdb.pm() # Only suggest if it's not selected already if (""_facet_m2m"", destination_table) in args: continue ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/534/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 463531894,MDExOlB1bGxSZXF1ZXN0MjkzOTkyMzgy,535,"Added asgi_wrapper plugin hook, closes 
#520",9599,simonw,closed,0,,,,,0,2019-07-03T03:58:00Z,2019-07-03T04:06:26Z,2019-07-03T04:06:26Z,OWNER,simonw/datasette/pulls/535,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/535/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 463534974,MDExOlB1bGxSZXF1ZXN0MjkzOTk0NDQz,536,"Switch to ~= dependencies, closes #532",9599,simonw,closed,0,,,,,0,2019-07-03T04:12:16Z,2019-07-03T04:32:55Z,2019-07-03T04:32:55Z,OWNER,simonw/datasette/pulls/536,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/536/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 463544206,MDU6SXNzdWU0NjM1NDQyMDY=,537,"Populate ""endpoint"" key in ASGI scope",9599,simonw,open,0,,,,,12,2019-07-03T04:54:47Z,2019-07-22T06:03:18Z,,OWNER,,"This is a trick used by Starlette so that other layers of ASGI middleware can see which route was selected. They added it here: https://github.com/encode/starlette/commit/34d0097feb6f057bd050d5057df5a2f96b97384e If Datasette supports it as well we can benefit from it if we integrate this sentry_asgi middleware (probably as a `datasette-sentry` plugin): https://github.com/encode/sentry-asgi/blob/c6a42d44d31f85885b79e4ee898683ecf8104971/sentry_asgi/middleware.py#L34-L35",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/537/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 463915863,MDU6SXNzdWU0NjM5MTU4NjM=,538,Mechanism for secrets in plugin configuration,9599,simonw,closed,0,,,,,3,2019-07-03T19:23:34Z,2019-07-04T05:47:54Z,2019-07-04T05:47:54Z,OWNER,,"See https://github.com/simonw/datasette-auth-github/issues/1 We need a mechanism where by plugins can tap into ""secret"" config options without exposing them in the visible metadata.json (where plugin configs currently live, see https://datasette.readthedocs.io/en/stable/plugins.html#plugin-configuration )",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/538/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 464040911,MDExOlB1bGxSZXF1ZXN0Mjk0NDAwNDQ2,539,Secret plugin configuration options,9599,simonw,closed,0,,,,,2,2019-07-04T03:21:20Z,2019-07-04T05:36:45Z,2019-07-04T05:36:45Z,OWNER,simonw/datasette/pulls/539,Refs #538 ,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/539/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 464449570,MDU6SXNzdWU0NjQ0NDk1NzA=,540,Add a universal navigation bar which can be modified by plugins,9599,simonw,closed,0,,,,,8,2019-07-05T03:50:33Z,2019-07-06T23:13:29Z,2019-07-06T23:11:35Z,OWNER,,"Needed by https://github.com/simonw/datasette-auth-github/issues/5 We already have a navigation breadcrumbs header on some pages, I can extend that to be present on every page and make it easy to modify with custom templates. 
",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/540/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 464779810,MDU6SXNzdWU0NjQ3Nzk4MTA=,541,Plugin hook for adding extra template context variables,9599,simonw,closed,0,,,,,2,2019-07-05T21:37:05Z,2019-07-06T00:05:59Z,2019-07-06T00:05:59Z,OWNER,,"It turns out I need this for https://github.com/simonw/datasette-auth-github/issues/5 It can be modelled on the `extra_body_script` hook: https://datasette.readthedocs.io/en/stable/plugins.html#extra-body-script-template-database-table-view-name-datasette",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/541/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 464786717,MDExOlB1bGxSZXF1ZXN0Mjk0OTkyNTc4,542,extra_template_vars plugin hook,9599,simonw,closed,0,,,,,5,2019-07-05T22:19:17Z,2019-07-06T00:05:57Z,2019-07-06T00:05:56Z,OWNER,simonw/datasette/pulls/542,Refs #541,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/542/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 464868844,MDU6SXNzdWU0NjQ4Njg4NDQ=,543,datasette publish option for setting plugin configuration secrets,9599,simonw,closed,0,,,4471010,Datasette 0.29,3,2019-07-06T16:21:23Z,2019-07-08T02:06:34Z,2019-07-08T02:06:34Z,OWNER,,Follow-on from #538 - the `datasette publish` command needs a way of passing secrets which will be made available to plugin configuration but will not be exposed in `/-/metadata.json`.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/543/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 464894812,MDExOlB1bGxSZXF1ZXN0Mjk1MDY1Nzk2,544,--plugin-secret option,9599,simonw,closed,0,,,4471010,Datasette 0.29,1,2019-07-06T22:18:20Z,2019-07-08T02:06:31Z,2019-07-08T02:06:31Z,OWNER,simonw/datasette/pulls/544,"Refs #543 - [x] Zeit Now v1 support - [x] Solve escaping of ENV in Dockerfile - [x] Heroku support - [x] Unit tests - [x] Cloud Run support - [x] Documentation ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/544/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 464905894,MDU6SXNzdWU0NjQ5MDU4OTQ=,545,Fix header on 404 page,9599,simonw,closed,0,,,4471010,Datasette 0.29,1,2019-07-07T01:47:40Z,2019-07-07T20:26:55Z,2019-07-07T20:26:55Z,OWNER,," ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/545/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 464987783,MDExOlB1bGxSZXF1ZXN0Mjk1MTI3MjEz,546,Facet by delimiter,9599,simonw,open,0,,,,,2,2019-07-07T20:06:05Z,2019-11-18T23:46:01Z,,OWNER,simonw/datasette/pulls/546,Refs #510,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/546/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 
464990184,MDU6SXNzdWU0NjQ5OTAxODQ=,547,Release notes for 0.29,9599,simonw,closed,0,,,4471010,Datasette 0.29,2,2019-07-07T20:30:28Z,2019-07-08T03:31:59Z,2019-07-08T03:31:59Z,OWNER,,There's a lot of stuff... https://github.com/simonw/datasette/compare/0.28...master,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/547/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 464994105,MDU6SXNzdWU0NjQ5OTQxMDU=,548,Add datasette-cors and datasette-auth-github plugins to Ecosystem page,9599,simonw,closed,0,,,4471010,Datasette 0.29,0,2019-07-07T21:14:14Z,2019-07-08T02:02:36Z,2019-07-08T02:02:36Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/548/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 465001185,MDU6SXNzdWU0NjUwMDExODU=,549,Send pull request to the repo that the _table.html template will break,9599,simonw,closed,0,,,4471010,Datasette 0.29,1,2019-07-07T22:45:17Z,2019-07-08T03:36:46Z,2019-07-08T03:36:45Z,OWNER,,"Bump this to 0.29 https://github.com/simonw/salaries-datasette/blob/master/requirements/base.txt And rename https://github.com/simonw/salaries-datasette/blob/master/templates/_rows_and_columns.html to _table.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/549/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 465002978,MDU6SXNzdWU0NjUwMDI5Nzg=,550,Pull m2m faceting out of master so we can ship a release without it,9599,simonw,closed,0,,,4471010,Datasette 0.29,1,2019-07-07T23:10:48Z,2019-07-07T23:21:22Z,2019-07-07T23:21:22Z,OWNER,,After spending some time with #495 I believe I need to make some pretty major changes to how m2m faceting works. I don't want it to block the release of ASGI Datasette so I'm going to revert it back out of master for the moment and merge it back in after the release has gone out.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/550/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 465003070,MDU6SXNzdWU0NjUwMDMwNzA=,551,Ship many-to-many faceting support (and facet-by-delimiter),9599,simonw,open,0,,,,,2,2019-07-07T23:11:45Z,2019-07-08T15:45:23Z,,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/551/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 465019882,MDU6SXNzdWU0NjUwMTk4ODI=,552,"Add --plugin-secret support to ""datasette package""",9599,simonw,open,0,,,,,1,2019-07-08T01:46:47Z,2019-07-08T01:47:30Z,,OWNER,,"Split out from #544. 
I think I should combine this with #347 (renaming `datasette package` to `datasette publish docker`).",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/552/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 465327844,MDU6SXNzdWU0NjUzMjc4NDQ=,553,Potential improvements to facet-by-date,9599,simonw,open,0,,,,,3,2019-07-08T15:37:53Z,2019-07-08T15:41:55Z,,OWNER,,"In addition to #483 Tobias had some useful suggestions on Twitter: https://twitter.com/rixxtr/status/1148253926476701696 > I think for date facets, it might be more meaningful to order them by date, rather than by size? Or offer both? I'm *definitely* often interested in size-over-time, so https://data.rixx.de/django_tickets/tickets?_facet_date=created#facet-created … isn't all that helpful! Screenshot of that link: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/553/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 465728430,MDExOlB1bGxSZXF1ZXN0Mjk1NzExNTA0,554,Fix static mounts using relative paths and prevent traversal exploits,3243482,abdusco,closed,0,,,,,4,2019-07-09T11:32:02Z,2019-07-11T16:29:26Z,2019-07-11T16:13:19Z,CONTRIBUTOR,simonw/datasette/pulls/554,"While debugging why my static mounts using a relative path (`--static mystatic:rel/path/to/dir`) not working, I noticed that the requests fail no matter what, returning 404 errors. The reason is that datasette tries to prevent traversal exploits by checking if the path is relative to its registered directory. This check fails when the mount is a relative directory, because `/abs/dir/file` obviously not under `dir/file`. https://github.com/simonw/datasette/blob/81fa8b6cdc5457b42a224779e5291952314e8d20/datasette/utils/asgi.py#L303-L306 This also has the consequence of returning any requested file, because when `/abs/dir/../../evil.file` resolves `aiofiles` happily returns it to the client after it resolves the path itself. The solution is to make sure we're checking relativity of paths after they're fully resolved. I've implemented the mentioned changes and also updated the tests.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/554/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 465731062,MDU6SXNzdWU0NjU3MzEwNjI=,555,Static mounts with relative paths not working,3243482,abdusco,closed,0,,,,,0,2019-07-09T11:38:35Z,2019-07-11T16:13:22Z,2019-07-11T16:13:22Z,CONTRIBUTOR,,"Datasette fails to serve files from static mounts that are created using relative paths `datasette --static mystatic:rel/path/to/static/dir`. 
I've explained the problem and the solution in the pull request: https://github.com/simonw/datasette/pull/554",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/555/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 465773546,MDExOlB1bGxSZXF1ZXN0Mjk1NzQ4MjY4,556,Add support for running datasette as a module,3243482,abdusco,closed,0,,,,,1,2019-07-09T13:13:30Z,2019-07-11T16:07:45Z,2019-07-11T16:07:44Z,CONTRIBUTOR,simonw/datasette/pulls/556,"This PR allows running datasette using `python -m datasette` command in addition to just running the executable. This function is quite useful when debugging a plugin in a project because IDEs like PyCharm can easily start a debug session when datasette is run as a module in contrast to trying to attach a debugger to a running process. ![image](https://user-images.githubusercontent.com/3243482/60890448-fc4ede80-a263-11e9-8b42-d2a3db8d1a59.png) ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/556/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 465815372,MDU6SXNzdWU0NjU4MTUzNzI=,37,Experiment with type hints,9599,simonw,closed,0,,,,,6,2019-07-09T14:30:34Z,2021-08-18T21:48:57Z,2021-08-18T21:48:57Z,OWNER,,"Since it's designed to be used in Jupyter or for rapid prototyping in an IDE (and it's still pretty small) `sqlite-utils` feels like a great candidate for me to finally try out Python type hints. https://veekaybee.github.io/2019/07/08/python-type-hints/ is good. It suggests the mypy docs for getting started: https://mypy.readthedocs.io/en/latest/existing_code.html plus this tutorial: https://pymbook.readthedocs.io/en/latest/typehinting.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/37/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 466996584,MDExOlB1bGxSZXF1ZXN0Mjk2NzM1MzIw,557,Get tests running on Windows using Travis CI,9599,simonw,closed,0,,,,,4,2019-07-11T16:36:57Z,2021-07-10T23:39:48Z,2021-07-10T23:39:48Z,OWNER,simonw/datasette/pulls/557,Refs #511,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/557/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 467218270,MDU6SXNzdWU0NjcyMTgyNzA=,558,Support unicode in url,380586,0x1997,closed,0,,,,,4,2019-07-12T04:43:24Z,2019-07-15T01:29:30Z,2019-07-14T02:49:33Z,NONE,,"Hi, I defined some custom queries in my `metadata.json`. There are Chinese characters in the names of the queries. So the urls are like `http://127.0.0.1:8001/mydb/测试查询`. When opening such urls, datasette will throw an exception. 
``` Traceback (most recent call last): File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 100, in __call__ return await view(new_scope, receive, send) File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 172, in view request, **scope[""url_route""][""kwargs""] File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/views/base.py"", line 267, in get request, database, hash, correct_hash_provided, **kwargs File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/views/base.py"", line 471, in view_get for key in self.ds.renderers.keys() File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/views/base.py"", line 471, in for key in self.ds.renderers.keys() File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/utils/__init__.py"", line 655, in path_with_format path = request.path File ""/home/zhe/miniconda3/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 49, in path self.scope.get(""raw_path"", self.scope[""path""].encode(""latin-1"")) UnicodeEncodeError: 'latin-1' codec can't encode characters in position 9-11: ordinal not in range(256) ``` This used to work when datasette was based on sanic. Btw, thanks for the great work!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/558/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 467623820,MDExOlB1bGxSZXF1ZXN0Mjk3MjQzMDcz,559,Bump to uvicorn 0.8.4,9599,simonw,closed,0,,,,,0,2019-07-12T22:30:29Z,2019-07-13T22:34:58Z,2019-07-13T22:34:58Z,OWNER,simonw/datasette/pulls/559,"https://github.com/encode/uvicorn/commits/0.8.4 Query strings will now be included in log files: https://github.com/encode/uvicorn/pull/384",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/559/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 467790646,MDU6SXNzdWU0Njc3OTA2NDY=,560,CodeMirror fails to load on database page,9599,simonw,closed,0,,,,,3,2019-07-14T03:31:00Z,2019-09-03T01:03:02Z,2019-07-14T03:38:59Z,OWNER,,"It's not loading on https://latest.datasette.io/fixtures But it does load on https://latest.datasette.io/fixtures?sql=select+*+from+facetable",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/560/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 467862459,MDExOlB1bGxSZXF1ZXN0Mjk3NDEyNDY0,38,table.update() method,9599,simonw,closed,0,,,,,2,2019-07-14T17:03:49Z,2019-07-28T15:43:51Z,2019-07-28T15:43:51Z,OWNER,simonw/sqlite-utils/pulls/38,"Refs #35 Still to do: - [x] Unit tests - [x] Switch to using `.get()` - [x] Better exceptions, plus unit tests for what happens if pk does not exist - [x] Documentation - [x] Ensure compound primary keys work properly - [x] `alter=True` support",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 467864071,MDU6SXNzdWU0Njc4NjQwNzE=,39,table.get(...) 
method,9599,simonw,closed,0,,,,,0,2019-07-14T17:20:51Z,2019-07-15T04:28:53Z,2019-07-15T04:28:53Z,OWNER,,"Utility method for fetching a record by its primary key. Accepts a single value (for primary key / rowid tables) or a list/tuple of values (for compound primary keys, refs #36). Raises a `NotFoundError` if the record cannot be found.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 467928674,MDExOlB1bGxSZXF1ZXN0Mjk3NDU5Nzk3,40,.get() method plus support for compound primary keys,9599,simonw,closed,0,,,,,1,2019-07-15T03:43:13Z,2019-07-15T04:28:57Z,2019-07-15T04:28:52Z,OWNER,simonw/sqlite-utils/pulls/40,"- [x] Tests for the `NotFoundError` exception - [x] Documentation for `.get()` method - [x] Support `--pk` multiple times to define CLI compound primary keys - [x] Documentation for compound primary keys",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 469828961,MDExOlB1bGxSZXF1ZXN0Mjk4OTYyNTUx,561,Fix typos,15278512,minho42,closed,0,,,,,0,2019-07-18T15:13:35Z,2019-07-26T10:25:45Z,2019-07-26T10:25:45Z,CONTRIBUTOR,simonw/datasette/pulls/561,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/561/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 470131537,MDU6SXNzdWU0NzAxMzE1Mzc=,41,sqlite-utils insert --tsv option,9599,simonw,closed,0,,,,,0,2019-07-19T04:27:21Z,2019-07-19T04:50:47Z,2019-07-19T04:50:47Z,OWNER,,"Right now we only support ingesting CSV, but sometimes interesting data is released as TSV. https://www.washingtonpost.com/national/2019/07/18/how-download-use-dea-pain-pills-database/ for example.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/41/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470345929,MDU6SXNzdWU0NzAzNDU5Mjk=,42,"table.extract(...) method and ""sqlite-utils extract"" command",9599,simonw,closed,0,,,5897911,2.20,21,2019-07-19T14:09:36Z,2020-09-22T23:39:31Z,2020-09-22T23:37:49Z,OWNER,,"One of my favourite features of [csvs-to-sqlite](https://github.com/simonw/csvs-to-sqlite) is that it can ""extract"" columns into a separate lookup table - for example: csvs-to-sqlite big_csv_file.csv -c country output.db This will turn the `country` column in the resulting table into a integer foreign key against a new `country` table. 
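A sketch of the sqlite-utils equivalent being proposed - this later shipped in the 2.20 milestone, though treat the exact call shape here as an approximation:
```python
# Sketch: replace a text column with an integer foreign key to a lookup table
import sqlite_utils

db = sqlite_utils.Database('output.db')

# Mirrors the csvs-to-sqlite behaviour: country values move into a new
# country lookup table and the original column becomes a foreign key
db['big_csv_file'].extract('country')
```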
You can see an example of what that looks like here: https://san-francisco.datasettes.com/registered-business-locations-3d50679/Business+Corridor was extracted from https://san-francisco.datasettes.com/registered-business-locations-3d50679/Registered_Business_Locations_-_San_Francisco?Business%20Corridor=1 I'd like to have the same capability in `sqlite-utils` - but with the ability to run it against an existing SQLite table rather than just against a CSV.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/42/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470542938,MDU6SXNzdWU0NzA1NDI5Mzg=,562,Facet by array shouldn't suggest for arrays that are not arrays-of-strings,9599,simonw,closed,0,,,,,2,2019-07-19T20:51:29Z,2019-11-01T19:42:10Z,2019-11-01T19:37:55Z,OWNER,,"It's triggering for arrays that look like this at the moment: ```json [ { ""type"": ""HKWorkoutEventTypeSegment"", ""date"": ""2019-05-21 09:43:50 -0700"", ""duration"": ""12.2780519704024"", ""durationUnit"": ""min"" }, { ""type"": ""HKWorkoutEventTypeSegment"", ""date"": ""2019-05-21 09:43:50 -0700"", ""duration"": ""19.467273102204"", ""durationUnit"": ""min"" } ] ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/562/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470637068,MDU6SXNzdWU0NzA2MzcwNjg=,1,Use XML Analyser to figure out the structure of the export XML,9599,simonw,closed,0,,,,,1,2019-07-20T05:19:02Z,2019-07-20T05:20:09Z,2019-07-20T05:20:09Z,MEMBER,,https://github.com/simonw/xml_analyser,197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470637152,MDU6SXNzdWU0NzA2MzcxNTI=,2,Import workouts,9599,simonw,closed,0,,,,,1,2019-07-20T05:20:21Z,2019-07-20T06:21:41Z,2019-07-20T06:21:41Z,MEMBER,,From #1,197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470637206,MDU6SXNzdWU0NzA2MzcyMDY=,3,Import ActivitySummary,9599,simonw,closed,0,,,,,0,2019-07-20T05:21:00Z,2019-07-20T05:58:07Z,2019-07-20T05:58:07Z,MEMBER,,"From #1 ```python 'ActivitySummary': {'attr_counts': {'activeEnergyBurned': 980, 'activeEnergyBurnedGoal': 980, 'activeEnergyBurnedUnit': 980, 'appleExerciseTime': 980, 'appleExerciseTimeGoal': 980, 'appleStandHours': 980, 'appleStandHoursGoal': 980, 'dateComponents': 980}, 'child_counts': {}, 'count': 980, 'parent_counts': {'HealthData': 980}}, ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470640505,MDU6SXNzdWU0NzA2NDA1MDU=,4,Import Records,9599,simonw,closed,0,,,,,1,2019-07-20T06:11:20Z,2019-07-20T06:21:41Z,2019-07-20T06:21:41Z,MEMBER,,"From #1: ```python 'Record': {'attr_counts': {'creationDate': 2672233, 'device': 2665111, 'endDate': 2672233, 
'sourceName': 2672233, 'sourceVersion': 2671779, 'startDate': 2672233, 'type': 2672233, 'unit': 2650012, 'value': 2672232}, 'child_counts': {'HeartRateVariabilityMetadataList': 2318, 'MetadataEntry': 287974}, 'count': 2672233, 'parent_counts': {'Correlation': 2, 'HealthData': 2672231}}, ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470691622,MDU6SXNzdWU0NzA2OTE2MjI=,5,Add progress bar,9599,simonw,closed,0,,,,,2,2019-07-20T16:29:07Z,2019-07-22T03:30:13Z,2019-07-22T02:49:22Z,MEMBER,,"Showing a progress bar would be nice, using Click. The easiest way to do this would probably be to hook it up to the length of the compressed content, and update it as this code pushes more XML bytes through the parser: https://github.com/dogsheep/healthkit-to-sqlite/blob/d64299765064501f4efdd9a0b21dbdba9ec4287f/healthkit_to_sqlite/utils.py#L6-L10",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470691999,MDU6SXNzdWU0NzA2OTE5OTk=,43,.add_column() doesn't match indentation of initial creation,9599,simonw,closed,0,,,,,3,2019-07-20T16:33:10Z,2019-07-23T13:09:11Z,2019-07-23T13:09:05Z,OWNER,,"I spotted a table which was created once and then had columns added to it and the formatted SQL looks like this: ```sql CREATE TABLE [records] ( [type] TEXT, [sourceName] TEXT, [sourceVersion] TEXT, [unit] TEXT, [creationDate] TEXT, [startDate] TEXT, [endDate] TEXT, [value] TEXT, [metadata_Health Mate App Version] TEXT, [metadata_Withings User Identifier] TEXT, [metadata_Modified Date] TEXT, [metadata_Withings Link] TEXT, [metadata_HKWasUserEntered] TEXT , [device] TEXT, [metadata_HKMetadataKeyHeartRateMotionContext] TEXT, [metadata_HKDeviceManufacturerName] TEXT, [metadata_HKMetadataKeySyncVersion] TEXT, [metadata_HKMetadataKeySyncIdentifier] TEXT, [metadata_HKSwimmingStrokeStyle] TEXT, [metadata_HKVO2MaxTestType] TEXT, [metadata_HKTimeZone] TEXT, [metadata_Average HR] TEXT, [metadata_Recharge] TEXT, [metadata_Lights] TEXT, [metadata_Asleep] TEXT, [metadata_Rating] TEXT, [metadata_Energy Threshold] TEXT, [metadata_Deep Sleep] TEXT, [metadata_Nap] TEXT, [metadata_Edit Slots] TEXT, [metadata_Tags] TEXT, [metadata_Daytime HR] TEXT) ``` It would be nice if the columns that were added later matched the indentation of the initial columns.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/43/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 470856782,MDU6SXNzdWU0NzA4NTY3ODI=,6,Break up records into different tables for each type,9599,simonw,closed,0,,,,,1,2019-07-22T01:54:59Z,2019-07-22T03:28:55Z,2019-07-22T03:28:50Z,MEMBER,,"I don't think there's much benefit to having all of the different record types stored in the same enormous table. 
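A sketch of the Click progress-bar idea from healthkit-to-sqlite #5 above, driven by the compressed byte count. `process_chunk` and the gzip file layout are illustrative assumptions, not the tool's actual internals:

```python
import gzip
import os
import click

def process_chunk(chunk):
    # Hypothetical stand-in for pushing XML bytes through the pull parser.
    pass

def import_with_progress(path):
    total = os.path.getsize(path)  # compressed size on disk is known up front
    done = 0
    with click.progressbar(length=total, label="Importing") as bar:
        with open(path, "rb") as raw, gzip.GzipFile(fileobj=raw) as fp:
            for chunk in iter(lambda: fp.read(1024 * 1024), b""):
                process_chunk(chunk)
                bar.update(raw.tell() - done)  # advance by compressed bytes consumed
                done = raw.tell()
```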
Here's what I get when I use `_facet=type`: I'm going to try splitting these up into separate tables - so `HKQuantityTypeIdentifierBodyMassIndex` becomes a table called `rBodyMassIndex` - and see if that's nicer to work with.",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 471292050,MDU6SXNzdWU0NzEyOTIwNTA=,563,incorrect json url for row-level data?,10352819,rprimet,closed,0,,,,,0,2019-07-22T19:59:38Z,2019-10-21T02:03:09Z,2019-10-21T02:03:09Z,CONTRIBUTOR,,"While visiting [this example page](https://register-of-members-interests.datasettes.com/regmem-98dc8b7/people/uk.org.publicwhip%2Fperson%2F10001) (linked from Datasette documentation), manually clicking on [the link](https://register-of-members-interests.datasettes.com/regmem-98dc8b7/people/uk.org.publicwhip%2Fperson%2F10001?_format=json) (""This data as .json"") to the json data results in an error 500 `data() got an unexpected keyword argument 'as_format'` The [JSON page linked to from the documentation](https://register-of-members-interests.datasettes.com/regmem-d22c12c/people/uk.org.publicwhip%2Fperson%2F10001.json) however is correct (the page address ends in `.json` rather than using a query string `?format=json`) This particular datasette demo page is now a few versions behind, but I was able to reproduce the issue using v0.29.2 and a downloaded copy of the demo database (and also with the current HEAD). Here is a stack trace: ``` Traceback (most recent call last): File ""/home/romain/miniconda3/envs/dsbug/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 101, in __call__ return await view(new_scope, receive, send) File ""/home/romain/miniconda3/envs/dsbug/lib/python3.7/site-packages/datasette/utils/asgi.py"", line 173, in view request, **scope[""url_route""][""kwargs""] File ""/home/romain/miniconda3/envs/dsbug/lib/python3.7/site-packages/datasette/views/base.py"", line 267, in get request, database, hash, correct_hash_provided, **kwargs File ""/home/romain/miniconda3/envs/dsbug/lib/python3.7/site-packages/datasette/views/base.py"", line 399, in view_get request, database, hash, **kwargs TypeError: data() got an unexpected keyword argument 'as_format' ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/563/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 471628483,MDU6SXNzdWU0NzE2Mjg0ODM=,44,Utilities for building lookup tables,9599,simonw,closed,0,,,,,2,2019-07-23T10:59:58Z,2019-07-23T13:07:01Z,2019-07-23T13:07:01Z,OWNER,,"While building https://github.com/dogsheep/healthkit-to-sqlite I found a need for a neat mechanism for easily building lookup tables - tables where each unique value in a column is replaced by a foreign key to a separate table. csvs-to-sqlite currently creates those with its ""extract"" mechanism - but that's written as custom code against Pandas. I'd like to eventually replace Pandas with sqlite-utils there. 
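For reference, the mechanism sqlite-utils #44 above asks for shipped as `table.lookup()` in PR #45 just below. A minimal sketch, with made-up table and column names:

```python
import sqlite_utils

db = sqlite_utils.Database(memory=True)  # hypothetical example data

# lookup() creates the "breeds" table on first use with a unique
# constraint on "name", and returns the primary key of the matching
# row - inserting the value first if necessary:
breed_id = db["breeds"].lookup({"name": "Corgi"})

db["dogs"].insert({"name": "Cleo", "breed_id": breed_id})

# Repeated lookups of the same value return the same key:
assert db["breeds"].lookup({"name": "Corgi"}) == breed_id
```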
See also #42 ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/44/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 471684708,MDExOlB1bGxSZXF1ZXN0MzAwMjg2NTM1,45,"Implemented table.lookup(...), closes #44",9599,simonw,closed,0,,,,,0,2019-07-23T13:03:30Z,2019-07-23T13:07:00Z,2019-07-23T13:07:00Z,OWNER,simonw/sqlite-utils/pulls/45,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/45/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 471780443,MDU6SXNzdWU0NzE3ODA0NDM=,46,extracts= option for insert/update/etc,9599,simonw,closed,0,,,,,3,2019-07-23T15:55:46Z,2020-03-01T16:53:40Z,2019-07-23T17:00:44Z,OWNER,,"Relates to #42 and #44. I want the ability to extract values out into lookup tables during bulk insert/upsert operations. `db.insert_all(rows, extracts=[""species""])` - creates species table for values in the species column `db.insert_all(rows, extracts={""species"": ""Species""})` - as above but the new table is called `Species`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/46/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 471797101,MDExOlB1bGxSZXF1ZXN0MzAwMzc3NTk5,47,extracts= table parameter,9599,simonw,closed,0,,,,,0,2019-07-23T16:30:29Z,2019-07-23T17:00:43Z,2019-07-23T17:00:43Z,OWNER,simonw/sqlite-utils/pulls/47,Still needs docs. Refs #46,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/47/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 471818939,MDU6SXNzdWU0NzE4MTg5Mzk=,48,"Jupyter notebook demo of the library, launchable on Binder",9599,simonw,closed,0,,,,,2,2019-07-23T17:05:05Z,2022-01-26T02:08:46Z,2022-01-26T02:08:39Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/48/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 472097220,MDU6SXNzdWU0NzIwOTcyMjA=,7,Script uses a lot of RAM,9599,simonw,closed,0,,,,,3,2019-07-24T06:11:11Z,2019-07-24T06:35:52Z,2019-07-24T06:35:52Z,MEMBER,,"I'm using an XML pull parser which should avoid the need to slurp the whole XML file into memory, but it's not working - the script still uses over 1GB of RAM when it runs according to Activity Monitor. I think this is because I'm still causing the full root element to be incrementally loaded into memory just in case I try and access it later. http://effbot.org/elementtree/iterparse.htm says I should use `elem.clear()` as I go. It also says: > The above pattern has one drawback; it does not clear the root element, so you will end up with a single element with lots of empty child elements. If your files are huge, rather than just large, this might be a problem. To work around this, you need to get your hands on the root element. 
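A minimal sketch of the effbot `iterparse` recipe quoted in healthkit-to-sqlite #7 above, assuming the bulky elements in the export are `Record` tags:

```python
import xml.etree.ElementTree as ET

def iter_records(fp):
    it = ET.iterparse(fp, events=("start", "end"))
    _, root = next(it)  # grab the root element from the first event
    for event, el in it:
        if event == "end" and el.tag == "Record":
            yield dict(el.attrib)
            el.clear()    # free this element's children
            root.clear()  # also drop processed children accumulating on the root
```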
So I will try that recipe and see if it helps.",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 472104705,MDExOlB1bGxSZXF1ZXN0MzAwNTgwMjIx,8,Use less RAM,9599,simonw,closed,0,,,,,0,2019-07-24T06:35:01Z,2019-07-24T06:35:52Z,2019-07-24T06:35:52Z,MEMBER,dogsheep/healthkit-to-sqlite/pulls/8,Closes #7,197882382,healthkit-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 472115381,MDU6SXNzdWU0NzIxMTUzODE=,49,extracts= should support multiple-column extracts,9599,simonw,open,0,,,,,10,2019-07-24T07:06:41Z,2020-10-16T19:18:19Z,,OWNER,,"Lookup tables can be constructed on compound columns, but the `extracts=` option doesn't currently support that. Right now extracts can be defined in two ways: ```python # Extract these columns into tables with the same name: dogs = db.table(""dogs"", extracts=[""breed"", ""most_recent_trophy""]) # Same as above but with custom table names: dogs = db.table(""dogs"", extracts={""breed"": ""Breeds"", ""most_recent_trophy"": ""Trophies""}) ``` Need some kind of syntax for much more complicated extractions, like when two columns (say ""source"" and ""source_version"") are extracted into a single table.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/49/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 472429048,MDU6SXNzdWU0NzI0MjkwNDg=,9,Too many SQL variables,166463,tholo,closed,0,,,,,4,2019-07-24T18:24:17Z,2019-07-26T10:01:05Z,2019-07-26T10:01:05Z,NONE,,"Decided to try importing my data, and ran into this: ``` Traceback (most recent call last): File ""/Users/tholo/Source/health/bin/healthkit-to-sqlite"", line 10, in sys.exit(cli()) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/cli.py"", line 50, in cli convert_xml_to_sqlite(fp, db, progress_callback=bar.update) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py"", line 41, in convert_xml_to_sqlite write_records(records, db) File ""/Users/tholo/Source/health/lib/python3.7/site-packages/healthkit_to_sqlite/utils.py"", line 80, in write_records column_order=[""startDate"", ""endDate"", ""value"", ""unit""], File ""/Users/tholo/Source/health/lib/python3.7/site-packages/sqlite_utils/db.py"", line 911, in insert_all result = self.db.conn.execute(sql, values) sqlite3.OperationalError: too many SQL variables ``` Added some debug output in sqlite_utils/db.py, which resulted in: ``` INSERT INTO [rBodyMassIndex] ([creationDate], [endDate], [metadata_HKWasUserEntered], 
[metadata_Health Mate App Version], [metadata_Modified Date], [metadata_Withings Link], [metadata_Withings User Identifier], [sourceName], [sourceVersion], [startDate], [unit], [value]) VALUES (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) [... the same group of 12 placeholders repeated once per row for the remainder of the batch ...] , (?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?, ?) ; ``` with the attached data: ``` ['2019-06-27 22:55:10 -0700', '2011-06-22 21:05:53 -0700', '0', '4.4.2', '2011-06-23 04:05:53 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308801953&type=1', '301293', 'Health Mate', '4040200', '2011-06-22 21:05:53 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-06-23 09:36:27 -0700', '0', '4.4.2', '2011-06-23 16:36:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308846987&type=1', '301293', 'Health Mate', '4040200', '2011-06-23 09:36:27 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-06-23 23:54:07 -0700', '0', '4.4.2', '2011-06-24 06:55:19 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308898447&type=1', '301293', 'Health Mate', '4040200', '2011-06-23 23:54:07 -0700', 'count', '30.679', '2019-06-27 22:55:10 -0700', '2011-06-24 09:13:40 -0700', '0', '4.4.2', '2011-06-24 16:14:35 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1308932020&type=1', '301293', 'Health Mate', '4040200', '2011-06-24 09:13:40 -0700', 'count', '30.3549', '2019-06-27 22:55:10 -0700', '2011-06-25 08:30:08 -0700', '0', '4.4.2', '2011-06-25 15:30:49 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309015808&type=1', '301293', 'Health Mate', '4040200', '2011-06-25 08:30:08 -0700', 'count', '30.3395', '2019-06-27 22:55:10 -0700', '2011-06-26 07:47:51 -0700', '0', '4.4.2', '2011-06-26 14:48:27 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309099671&type=1', '301293', 'Health Mate', '4040200', '2011-06-26 07:47:51 -0700', 'count', '30.2315', '2019-06-27 22:55:10 -0700', '2011-06-28 08:48:26 -0700', '0', '4.4.2', '2011-06-28 15:49:13 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309276106&type=1', '301293', 'Health Mate', '4040200', '2011-06-28 08:48:26 -0700', 'count', '30.0617', '2019-06-27 22:55:10 -0700', '2011-06-29 09:21:16 -0700', '0', '4.4.2', '2011-06-29 16:21:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309364476&type=1', '301293', 'Health Mate', '4040200', '2011-06-29 09:21:16 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-06-30 08:41:46 -0700', '0', '4.4.2', '2011-06-30 15:42:30 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309448506&type=1', '301293', 'Health Mate', '4040200', '2011-06-30 08:41:46 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2011-07-01 09:05:28 -0700', '0', '4.4.2', '2011-07-01 16:06:24 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309536328&type=1', '301293', 'Health Mate', '4040200', '2011-07-01 09:05:28 -0700', 'count', '29.8611', '2019-06-27 22:55:10 -0700', '2011-07-02 08:58:50 -0700', '0', '4.4.2', '2011-07-02 15:59:40 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309622330&type=1', '301293', 'Health Mate', '4040200', '2011-07-02 08:58:50 -0700', 'count', '29.8765', '2019-06-27 22:55:10 -0700', '2011-07-04 09:33:43 -0700', '0', '4.4.2', '2011-07-04 16:34:19 +0000', 
'withings-bd2://timeline/measure?userid=301293&date=1309797223&type=1', '301293', 'Health Mate', '4040200', '2011-07-04 09:33:43 -0700', 'count', '30.0309', '2019-06-27 22:55:10 -0700', '2011-07-06 09:40:23 -0700', '0', '4.4.2', '2011-07-06 16:41:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1309970423&type=1', '301293', 'Health Mate', '4040200', '2011-07-06 09:40:23 -0700', 'count', '30.1852', '2019-06-27 22:55:10 -0700', '2011-07-08 08:08:48 -0700', '0', '4.4.2', '2011-07-08 15:09:51 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310137728&type=1', '301293', 'Health Mate', '4040200', '2011-07-08 08:08:48 -0700', 'count', '30.0309', '2019-06-27 22:55:10 -0700', '2011-07-09 08:31:05 -0700', '0', '4.4.2', '2011-07-09 15:31:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310225465&type=1', '301293', 'Health Mate', '4040200', '2011-07-09 08:31:05 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-07-10 08:14:36 -0700', '0', '4.4.2', '2011-07-10 15:15:12 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310310876&type=1', '301293', 'Health Mate', '4040200', '2011-07-10 08:14:36 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2011-07-12 07:55:21 -0700', '0', '4.4.2', '2011-07-12 14:55:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310482521&type=1', '301293', 'Health Mate', '4040200', '2011-07-12 07:55:21 -0700', 'count', '30.108', '2019-06-27 22:55:10 -0700', '2011-07-13 08:48:05 -0700', '0', '4.4.2', '2011-07-13 15:48:42 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310572085&type=1', '301293', 'Health Mate', '4040200', '2011-07-13 08:48:05 -0700', 'count', '30', '2019-06-27 22:55:10 -0700', '2011-07-14 09:05:16 -0700', '0', '4.4.2', '2011-07-14 16:05:57 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310659516&type=1', '301293', 'Health Mate', '4040200', '2011-07-14 09:05:16 -0700', 'count', '29.9074', '2019-06-27 22:55:10 -0700', '2011-07-15 07:09:56 -0700', '0', '4.4.2', '2011-07-15 14:10:35 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310738996&type=1', '301293', 'Health Mate', '4040200', '2011-07-15 07:09:56 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-07-16 09:26:04 -0700', '0', '4.4.2', '2011-07-16 16:26:44 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310833564&type=1', '301293', 'Health Mate', '4040200', '2011-07-16 09:26:04 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-07-17 09:52:59 -0700', '0', '4.4.2', '2011-07-17 16:53:38 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1310921579&type=1', '301293', 'Health Mate', '4040200', '2011-07-17 09:52:59 -0700', 'count', '29.8765', '2019-06-27 22:55:10 -0700', '2011-07-19 08:56:16 -0700', '0', '4.4.2', '2011-07-19 15:57:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311090976&type=1', '301293', 'Health Mate', '4040200', '2011-07-19 08:56:16 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-07-21 08:21:20 -0700', '0', '4.4.2', '2011-07-21 15:22:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311261680&type=1', '301293', 'Health Mate', '4040200', '2011-07-21 08:21:20 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-07-23 08:49:56 -0700', '0', '4.4.2', '2011-07-23 15:50:40 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311436196&type=1', '301293', 'Health Mate', '4040200', '2011-07-23 08:49:56 -0700', 'count', '29.7222', '2019-06-27 22:55:10 -0700', '2011-07-24 09:17:35 
-0700', '0', '4.4.2', '2011-07-24 16:18:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311524255&type=1', '301293', 'Health Mate', '4040200', '2011-07-24 09:17:35 -0700', 'count', '29.5833', '2019-06-27 22:55:10 -0700', '2011-07-25 07:51:55 -0700', '0', '4.4.2', '2011-07-25 14:52:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1311605515&type=1', '301293', 'Health Mate', '4040200', '2011-07-25 07:51:55 -0700', 'count', '29.5525', '2019-06-27 22:55:10 -0700', '2011-08-06 10:04:05 -0700', '0', '4.4.2', '2011-08-06 17:04:47 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312650245&type=1', '301293', 'Health Mate', '4040200', '2011-08-06 10:04:05 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-08-08 07:52:22 -0700', '0', '4.4.2', '2011-08-08 14:53:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312815142&type=1', '301293', 'Health Mate', '4040200', '2011-08-08 07:52:22 -0700', 'count', '29.6605', '2019-06-27 22:55:10 -0700', '2011-08-10 07:57:30 -0700', '0', '4.4.2', '2011-08-10 14:58:12 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1312988250&type=1', '301293', 'Health Mate', '4040200', '2011-08-10 07:57:30 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-08-12 07:51:14 -0700', '0', '4.4.2', '2011-08-12 14:51:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1313160674&type=1', '301293', 'Health Mate', '4040200', '2011-08-12 07:51:14 -0700', 'count', '29.6914', '2019-06-27 22:55:10 -0700', '2011-08-13 07:45:28 -0700', '0', '4.4.2', '2011-08-13 14:46:08 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1313246728&type=1', '301293', 'Health Mate', '4040200', '2011-08-13 07:45:28 -0700', 'count', '29.5833', '2019-06-27 22:55:10 -0700', '2011-08-17 09:06:20 -0700', '0', '4.4.2', '2011-08-17 16:07:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1313597180&type=1', '301293', 'Health Mate', '4040200', '2011-08-17 09:06:20 -0700', 'count', '29.5679', '2019-06-27 22:55:10 -0700', '2011-08-22 08:28:08 -0700', '0', '4.4.2', '2011-08-22 15:28:57 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314026888&type=1', '301293', 'Health Mate', '4040200', '2011-08-22 08:28:08 -0700', 'count', '29.9846', '2019-06-27 22:55:10 -0700', '2011-08-25 08:59:30 -0700', '0', '4.4.2', '2011-08-25 16:00:15 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314287970&type=1', '301293', 'Health Mate', '4040200', '2011-08-25 08:59:30 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-08-30 08:13:59 -0700', '0', '4.4.2', '2011-08-30 15:46:08 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1314717239&type=1', '301293', 'Health Mate', '4040200', '2011-08-30 08:13:59 -0700', 'count', '29.784', '2019-06-27 22:55:10 -0700', '2011-09-12 08:47:51 -0700', '0', '4.4.2', '2011-09-12 15:48:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1315842471&type=1', '301293', 'Health Mate', '4040200', '2011-09-12 08:47:51 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-09-13 09:17:27 -0700', '0', '4.4.2', '2011-09-13 16:48:30 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1315930647&type=1', '301293', 'Health Mate', '4040200', '2011-09-13 09:17:27 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-10-01 09:12:20 -0700', '0', '4.4.2', '2011-10-01 16:13:00 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1317485540&type=1', '301293', 'Health Mate', '4040200', '2011-10-01 09:12:20 -0700', 'count', 
'29.8148', '2019-06-27 22:55:10 -0700', '2011-10-11 11:14:11 -0700', '0', '4.4.2', '2011-10-11 18:15:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1318356851&type=1', '301293', 'Health Mate', '4040200', '2011-10-11 11:14:11 -0700', 'count', '29.7377', '2019-06-27 22:55:10 -0700', '2011-10-16 09:29:47 -0700', '0', '4.4.2', '2011-10-16 16:30:39 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1318782587&type=1', '301293', 'Health Mate', '4040200', '2011-10-16 09:29:47 -0700', 'count', '29.6914', '2019-06-27 22:55:10 -0700', '2011-10-19 09:21:44 -0700', '0', '4.4.2', '2011-10-19 16:22:25 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1319041304&type=1', '301293', 'Health Mate', '4040200', '2011-10-19 09:21:44 -0700', 'count', '29.7685', '2019-06-27 22:55:10 -0700', '2011-10-24 07:04:22 -0700', '0', '4.4.2', '2011-10-24 14:05:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1319465062&type=1', '301293', 'Health Mate', '4040200', '2011-10-24 07:04:22 -0700', 'count', '29.5988', '2019-06-27 22:55:10 -0700', '2011-11-07 09:33:17 -0700', '0', '4.4.2', '2011-11-07 16:33:58 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1320683597&type=1', '301293', 'Health Mate', '4040200', '2011-11-07 09:33:17 -0700', 'count', '29.8611', '2019-06-27 22:55:10 -0700', '2011-11-10 07:59:03 -0700', '0', '4.4.2', '2011-11-10 14:59:48 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1320937143&type=1', '301293', 'Health Mate', '4040200', '2011-11-10 07:59:03 -0700', 'count', '29.9383', '2019-06-27 22:55:10 -0700', '2011-11-13 09:28:31 -0700', '0', '4.4.2', '2011-11-13 16:29:20 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1321201711&type=1', '301293', 'Health Mate', '4040200', '2011-11-13 09:28:31 -0700', 'count', '29.7531', '2019-06-27 22:55:10 -0700', '2011-11-21 08:45:06 -0700', '0', '4.4.2', '2011-11-21 15:46:04 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1321890306&type=1', '301293', 'Health Mate', '4040200', '2011-11-21 08:45:06 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-11-23 09:55:44 -0700', '0', '4.4.2', '2011-11-23 16:56:18 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322067344&type=1', '301293', 'Health Mate', '4040200', '2011-11-23 09:55:44 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2011-11-29 09:50:44 -0700', '0', '4.4.2', '2011-11-29 16:51:31 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322585444&type=1', '301293', 'Health Mate', '4040200', '2011-11-29 09:50:44 -0700', 'count', '30.1698', '2019-06-27 22:55:10 -0700', '2011-11-30 11:13:21 -0700', '0', '4.4.2', '2011-11-30 18:14:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1322676801&type=1', '301293', 'Health Mate', '4040200', '2011-11-30 11:13:21 -0700', 'count', '30.0617', '2019-06-27 22:55:10 -0700', '2011-12-04 10:24:36 -0700', '0', '4.4.2', '2011-12-04 17:25:24 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1323019476&type=1', '301293', 'Health Mate', '4040200', '2011-12-04 10:24:36 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2011-12-10 09:22:18 -0700', '0', '4.4.2', '2011-12-10 16:23:07 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1323534138&type=1', '301293', 'Health Mate', '4040200', '2011-12-10 09:22:18 -0700', 'count', '29.9537', '2019-06-27 22:55:10 -0700', '2011-12-26 10:36:42 -0700', '0', '4.4.2', '2011-12-26 17:37:31 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1324921002&type=1', '301293', 
'Health Mate', '4040200', '2011-12-26 10:36:42 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2012-01-11 11:24:13 -0700', '0', '4.4.2', '2012-01-11 18:25:04 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326306253&type=1', '301293', 'Health Mate', '4040200', '2012-01-11 11:24:13 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2012-01-15 10:17:09 -0700', '0', '4.4.2', '2012-01-15 17:17:51 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326647829&type=1', '301293', 'Health Mate', '4040200', '2012-01-15 10:17:09 -0700', 'count', '29.8302', '2019-06-27 22:55:10 -0700', '2012-01-19 09:24:32 -0700', '0', '4.4.2', '2012-01-19 16:25:21 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1326990272&type=1', '301293', 'Health Mate', '4040200', '2012-01-19 09:24:32 -0700', 'count', '29.7994', '2019-06-27 22:55:10 -0700', '2012-01-29 10:26:13 -0700', '0', '4.4.2', '2012-01-29 17:26:52 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1327857973&type=1', '301293', 'Health Mate', '4040200', '2012-01-29 10:26:13 -0700', 'count', '30.0154', '2019-06-27 22:55:10 -0700', '2012-02-03 10:13:28 -0700', '0', '4.4.2', '2012-02-03 17:15:01 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1328289208&type=1', '301293', 'Health Mate', '4040200', '2012-02-03 10:13:28 -0700', 'count', '29.8457', '2019-06-27 22:55:10 -0700', '2012-02-12 09:23:01 -0700', '0', '4.4.2', '2012-02-12 16:23:53 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1329063781&type=1', '301293', 'Health Mate', '4040200', '2012-02-12 09:23:01 -0700', 'count', '30.1235', '2019-06-27 22:55:10 -0700', '2012-03-03 09:26:06 -0700', '0', '4.4.2', '2012-03-03 16:26:54 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1330791966&type=1', '301293', 'Health Mate', '4040200', '2012-03-03 09:26:06 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2012-03-11 11:23:15 -0700', '0', '4.4.2', '2012-03-11 18:24:16 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1331490195&type=1', '301293', 'Health Mate', '4040200', '2012-03-11 11:23:15 -0700', 'count', '30.2161', '2019-06-27 22:55:10 -0700', '2012-03-16 09:39:36 -0700', '0', '4.4.2', '2012-03-16 16:40:20 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1331915976&type=1', '301293', 'Health Mate', '4040200', '2012-03-16 09:39:36 -0700', 'count', '30.2778', '2019-06-27 22:55:10 -0700', '2012-03-21 08:33:07 -0700', '0', '4.4.2', '2012-03-21 15:34:00 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1332343987&type=1', '301293', 'Health Mate', '4040200', '2012-03-21 08:33:07 -0700', 'count', '30.1389', '2019-06-27 22:55:10 -0700', '2012-04-11 08:49:34 -0700', '0', '4.4.2', '2012-04-11 15:50:18 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1334159374&type=1', '301293', 'Health Mate', '4040200', '2012-04-11 08:49:34 -0700', 'count', '30.0154', '2019-06-27 22:55:10 -0700', '2012-04-13 08:32:06 -0700', '0', '4.4.2', '2012-04-13 15:32:49 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1334331126&type=1', '301293', 'Health Mate', '4040200', '2012-04-13 08:32:06 -0700', 'count', '29.9383', '2019-06-27 22:55:10 -0700', '2012-04-20 08:21:38 -0700', '0', '4.4.2', '2012-04-20 15:52:45 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1334935298&type=1', '301293', 'Health Mate', '4040200', '2012-04-20 08:21:38 -0700', 'count', '30.2006', '2019-06-27 22:55:10 -0700', '2012-04-25 09:00:01 -0700', '0', '4.4.2', '2012-04-25 16:00:42 +0000', 
'withings-bd2://timeline/measure?userid=301293&date=1335369601&type=1', '301293', 'Health Mate', '4040200', '2012-04-25 09:00:01 -0700', 'count', '30.2006', '2019-06-27 22:55:10 -0700', '2012-05-04 11:10:18 -0700', '0', '4.4.2', '2012-05-04 18:10:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1336155018&type=1', '301293', 'Health Mate', '4040200', '2012-05-04 11:10:18 -0700', 'count', '30.4321', '2019-06-27 22:55:10 -0700', '2012-05-12 09:35:00 -0700', '0', '4.4.2', '2012-05-12 16:35:43 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1336840500&type=1', '301293', 'Health Mate', '4040200', '2012-05-12 09:35:00 -0700', 'count', '30.1235', '2019-06-27 22:55:10 -0700', '2012-05-22 09:27:53 -0700', '0', '4.4.2', '2012-05-22 16:28:37 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1337704073&type=1', '301293', 'Health Mate', '4040200', '2012-05-22 09:27:53 -0700', 'count', '30.4167', '2019-06-27 22:55:10 -0700', '2012-05-31 09:23:16 -0700', '0', '4.4.2', '2012-05-31 16:24:04 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1338481396&type=1', '301293', 'Health Mate', '4040200', '2012-05-31 09:23:16 -0700', 'count', '30.2006', '2019-06-27 22:55:10 -0700', '2012-06-08 09:29:07 -0700', '0', '4.4.2', '2012-06-08 16:29:52 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1339172947&type=1', '301293', 'Health Mate', '4040200', '2012-06-08 09:29:07 -0700', 'count', '30.5247', '2019-06-27 22:55:10 -0700', '2012-06-21 08:07:33 -0700', '0', '4.4.2', '2012-06-21 15:08:20 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1340291253&type=1', '301293', 'Health Mate', '4040200', '2012-06-21 08:07:33 -0700', 'count', '30.5864', '2019-06-27 22:55:10 -0700', '2012-08-08 10:02:22 -0700', '0', '4.4.2', '2012-08-08 17:03:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1344445342&type=1', '301293', 'Health Mate', '4040200', '2012-08-08 10:02:22 -0700', 'count', '30.6636', '2019-06-27 22:55:10 -0700', '2012-08-17 09:11:32 -0700', '0', '4.4.2', '2012-08-17 16:42:05 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1345219892&type=1', '301293', 'Health Mate', '4040200', '2012-08-17 09:11:32 -0700', 'count', '30.8796', '2019-06-27 22:55:10 -0700', '2012-09-10 08:27:21 -0700', '0', '4.4.2', '2012-09-10 15:28:07 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1347290841&type=1', '301293', 'Health Mate', '4040200', '2012-09-10 08:27:21 -0700', 'count', '31.034', '2019-06-27 22:55:10 -0700', '2012-09-17 08:35:33 -0700', '0', '4.4.2', '2012-09-17 15:35:33 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1347896133&type=1', '301293', 'Health Mate', '4040200', '2012-09-17 08:35:33 -0700', 'count', '30.7099', '2019-06-27 22:55:10 -0700', '2012-09-26 08:59:46 -0700', '0', '4.4.2', '2012-09-26 16:13:18 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1348675186&type=1', '301293', 'Health Mate', '4040200', '2012-09-26 08:59:46 -0700', 'count', '30.679', '2019-06-27 22:55:10 -0700', '2012-10-18 08:51:16 -0700', '0', '4.4.2', '2012-10-18 15:51:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1350575476&type=1', '301293', 'Health Mate', '4040200', '2012-10-18 08:51:16 -0700', 'count', '30.7716', '2019-06-27 22:55:10 -0700', '2012-11-15 08:54:57 -0700', '0', '4.4.2', '2012-11-15 15:55:58 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1352994897&type=1', '301293', 'Health Mate', '4040200', '2012-11-15 08:54:57 -0700', 'count', '31.0802', '2019-06-27 22:55:10 -0700', '2012-12-17 
09:13:40 -0700', '0', '4.4.2', '2012-12-17 16:20:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1355760820&type=1', '301293', 'Health Mate', '4040200', '2012-12-17 09:13:40 -0700', 'count', '29.784', '2019-06-27 22:55:10 -0700', '2012-12-19 11:09:55 -0700', '0', '4.4.2', '2012-12-19 18:10:37 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1355940595&type=1', '301293', 'Health Mate', '4040200', '2012-12-19 11:09:55 -0700', 'count', '29.6914', '2019-06-27 22:55:10 -0700', '2012-12-25 10:37:41 -0700', '0', '4.4.2', '2012-12-25 17:38:25 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1356457061&type=1', '301293', 'Health Mate', '4040200', '2012-12-25 10:37:41 -0700', 'count', '29.8765', '2019-06-27 22:55:10 -0700', '2013-01-01 10:44:02 -0700', '0', '4.4.2', '2013-01-01 17:44:46 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1357062242&type=1', '301293', 'Health Mate', '4040200', '2013-01-01 10:44:02 -0700', 'count', '30.0772', '2019-06-27 22:55:10 -0700', '2013-01-15 09:10:46 -0700', '0', '4.4.2', '2013-01-15 16:11:28 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1358266246&type=1', '301293', 'Health Mate', '4040200', '2013-01-15 09:10:46 -0700', 'count', '29.9691', '2019-06-27 22:55:10 -0700', '2013-01-20 11:03:39 -0700', '0', '4.4.2', '2013-01-20 18:04:22 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1358705019&type=1', '301293', 'Health Mate', '4040200', '2013-01-20 11:03:39 -0700', 'count', '30.108', '2019-06-27 22:55:10 -0700', '2013-01-30 08:56:30 -0700', '0', '4.4.2', '2013-01-30 15:57:14 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1359561390&type=1', '301293', 'Health Mate', '4040200', '2013-01-30 08:56:30 -0700', 'count', '30.0926', '2019-06-27 22:55:10 -0700', '2013-02-04 11:02:35 -0700', '0', '4.4.2', '2013-02-04 18:03:25 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1360000955&type=1', '301293', 'Health Mate', '4040200', '2013-02-04 11:02:35 -0700', 'count', '29.8148', '2019-06-27 22:55:10 -0700', '2013-02-07 09:07:06 -0700', '0', '4.4.2', '2013-02-07 16:07:49 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1360253226&type=1', '301293', 'Health Mate', '4040200', '2013-02-07 09:07:06 -0700', 'count', '30.1389', '2019-06-27 22:55:10 -0700', '2013-02-19 08:49:57 -0700', '0', '4.4.2', '2013-02-19 15:50:39 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1361288997&type=1', '301293', 'Health Mate', '4040200', '2013-02-19 08:49:57 -0700', 'count', '30.1235', '2019-06-27 22:55:10 -0700', '2013-03-02 11:20:54 -0700', '0', '4.4.2', '2013-03-02 18:21:38 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1362248454&type=1', '301293', 'Health Mate', '4040200', '2013-03-02 11:20:54 -0700', 'count', '30', '2019-06-27 22:55:10 -0700', '2013-04-23 08:05:30 -0700', '0', '4.4.2', '2013-04-23 15:06:59 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1366729530&type=1', '301293', 'Health Mate', '4040200', '2013-04-23 08:05:30 -0700', 'count', '30.5247', '2019-06-27 22:55:10 -0700', '2013-05-09 09:49:18 -0700', '0', '4.4.2', '2013-05-09 16:50:02 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1368118158&type=1', '301293', 'Health Mate', '4040200', '2013-05-09 09:49:18 -0700', 'count', '30.4167', '2019-06-27 22:55:10 -0700', '2013-06-09 09:28:47 -0700', '0', '4.4.2', '2013-06-09 16:29:30 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1370795327&type=1', '301293', 'Health Mate', '4040200', '2013-06-09 09:28:47 -0700', 
'count', '30.8333', '2019-06-27 22:55:10 -0700', '2013-07-09 08:00:17 -0700', '0', '4.4.2', '2013-07-09 15:01:00 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1373382017&type=1', '301293', 'Health Mate', '4040200', '2013-07-09 08:00:17 -0700', 'count', '30.8179', '2019-06-27 22:55:10 -0700', '2013-07-28 09:16:55 -0700', '0', '4.4.2', '2013-07-28 16:17:39 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1375028215&type=1', '301293', 'Health Mate', '4040200', '2013-07-28 09:16:55 -0700', 'count', '30.5556', '2019-06-27 22:55:10 -0700', '2013-09-13 09:22:19 -0700', '0', '4.4.2', '2013-09-13 16:23:08 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1379089339&type=1', '301293', 'Health Mate', '4040200', '2013-09-13 09:22:19 -0700', 'count', '30.9568', '2019-06-27 22:55:10 -0700', '2013-09-24 08:08:23 -0700', '0', '4.4.2', '2013-09-24 15:09:03 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1380035303&type=1', '301293', 'Health Mate', '4040200', '2013-09-24 08:08:23 -0700', 'count', '31.4352', '2019-06-27 22:55:10 -0700', '2013-10-01 08:15:13 -0700', '0', '4.4.2', '2013-10-01 15:15:57 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1380640513&type=1', '301293', 'Health Mate', '4040200', '2013-10-01 08:15:13 -0700', 'count', '31.2037', '2019-06-27 22:55:10 -0700', '2013-10-23 09:31:25 -0700', '0', '4.4.2', '2013-10-23 16:32:13 +0000', 'withings-bd2://timeline/measure?userid=301293&date=1382545885&type=1', '301293', 'Health Mate', '4040200', '2013-10-23 09:31:25 -0700', 'count', '31.8056'] ```",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 473083260,MDU6SXNzdWU0NzMwODMyNjA=,50,"""Too many SQL variables"" on large inserts",9599,simonw,closed,0,,,,,4,2019-07-25T21:43:31Z,2022-11-04T14:38:36Z,2019-07-28T11:59:33Z,OWNER,,"Reported here: https://github.com/dogsheep/healthkit-to-sqlite/issues/9 It looks like there's a default limit of 999 variables - we need to be smart about that, maybe dynamically lower the batch size based on the number of columns.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/50/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 473288428,MDExOlB1bGxSZXF1ZXN0MzAxNDgzNjEz,564,First proof-of-concept of Datasette Library,9599,simonw,open,0,,,,,1,2019-07-26T10:22:26Z,2023-02-07T15:14:11Z,,OWNER,simonw/datasette/pulls/564,"Refs #417. Run it like this: datasette -d ~/Library Uses a new plugin hook - available_databases() ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/564/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 473307794,MDU6SXNzdWU0NzMzMDc3OTQ=,565,Conflict between datasette and uvicorn click versions,440503,jonheslop,closed,0,,,,,1,2019-07-26T11:13:40Z,2020-10-02T00:09:55Z,2020-10-02T00:09:55Z,NONE,,"Hello Datasette is awesome thanks so much! I'm not very familiar with Python but I think there is a problem with datasette docker builds - I keep getting this error ``` ERROR: uvicorn 0.8.4 has requirement click==7.*, but you'll have click 6.0 which is incompatible. 
ERROR: datasette 0.29.2 has requirement click~=7.0, but you'll have click 6.0 which is incompatible. ``` The full log from the docker build is here - https://gist.github.com/jonheslop/e01cd322e761cfaf34f0cb83f86411b0 Just in case it’s helpful this is my setup - https://github.com/dotwatcher/dotwatcher-data",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/565/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 473733752,MDExOlB1bGxSZXF1ZXN0MzAxODI0MDk3,51,"Fix for too many SQL variables, closes #50",9599,simonw,closed,0,,,,,1,2019-07-28T11:30:30Z,2019-07-28T11:59:32Z,2019-07-28T11:59:32Z,OWNER,simonw/sqlite-utils/pulls/51,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/51/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 476413293,MDU6SXNzdWU0NzY0MTMyOTM=,52,Throws error if .insert_all() / .upsert_all() called with empty list,9599,simonw,closed,0,,,,,1,2019-08-03T04:09:00Z,2019-11-07T04:32:39Z,2019-11-07T04:32:39Z,OWNER,,See also https://github.com/simonw/db-to-sqlite/issues/18,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/52/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 476436920,MDExOlB1bGxSZXF1ZXN0MzAzOTkwNjgz,53,Work in progress: m2m() method for creating many-to-many records,9599,simonw,closed,0,,,,,0,2019-08-03T10:03:56Z,2019-08-04T03:38:10Z,2019-08-04T03:37:33Z,OWNER,simonw/sqlite-utils/pulls/53,"- [x] `table.insert({""name"": ""Barry""}).m2m(""tags"", lookup={""tag"": ""Coworker""})` - [x] Explicit table name `.m2m(""humans"", ..., m2m_table=""relationships"")` - [x] Automatically use an existing m2m table if a single obvious candidate exists (a table with two foreign keys in the correct directions) - [x] Require the explicit `m2m_table=` argument if multiple candidates for the m2m table exist - [x] Documentation Refs #23",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/53/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 476437213,MDU6SXNzdWU0NzY0MzcyMTM=,566,Unexpected keyword argument 'hidden',8330931,dvot197007,closed,0,,,,,1,2019-08-03T10:07:57Z,2019-08-03T16:13:36Z,2019-08-03T16:13:36Z,NONE,,"I couldn't get a test example running. I am running python 3.6.8 and tried both windows and windows subsystem for linux, getting the same error. My test.db was created by converting a five line csv file with csvs-to-sqlite. 
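The fix direction described in sqlite-utils #50 above - dynamically lowering the batch size so rows-per-batch times columns-per-row never exceeds SQLite's variable limit - could look roughly like this. A sketch of the idea only, not the actual patch that landed in PR #51:

```python
SQLITE_MAX_VARS = 999  # default SQLITE_MAX_VARIABLE_NUMBER in many SQLite builds

def batches(rows, num_columns, max_vars=SQLITE_MAX_VARS):
    # e.g. 12 columns -> 999 // 12 = 83 rows per INSERT, keeping the
    # total count of "?" placeholders under the limit.
    batch_size = max(1, max_vars // num_columns)
    for i in range(0, len(rows), batch_size):
        yield rows[i : i + batch_size]
```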
The csv file is: col1, col2, col3 1,2,3 4,5,6 7,8,9 10,11,12 Here is the error message: (myvenv) davido@DESKTOP-L29G79U:~/dot/datasette-eg$ datasette test.db Traceback (most recent call last): File ""/home/davido/dot/datasette-eg/myvenv/bin/datasette"", line 7, in from datasette.cli import cli File ""/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/datasette/cli.py"", line 2, in import uvicorn File ""/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/uvicorn/__init__.py"", line 2, in from uvicorn.main import Server, main, run File ""/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/uvicorn/main.py"", line 224, in headers: typing.List[str], File ""/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/click/decorators.py"", line 170, in decorator _param_memo(f, OptionClass(param_decls, **attrs)) File ""/home/davido/dot/datasette-eg/myvenv/lib/python3.6/site-packages/click/core.py"", line 1430, in __init__ Parameter.__init__(self, param_decls, type=type, **attrs) TypeError: __init__() got an unexpected keyword argument 'hidden' Thanks.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/566/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 476573875,MDU6SXNzdWU0NzY1NzM4NzU=,567,Datasette Edit,9599,simonw,closed,0,,,,,3,2019-08-04T17:09:28Z,2020-02-25T03:40:50Z,2020-02-25T03:40:50Z,OWNER,,"Datasette started out immutable. Then it gained the ability to run against read-only databases that were being modified by other processes. It's time for the next logical progression: the option to allow Datasette (or more likely individual plugins) to write to the database! This is going to require some careful rethinking of how connection management works.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/567/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 476852861,MDU6SXNzdWU0NzY4NTI4NjE=,568,Add database_color as a configurable option,50906992,LBHELewis,open,0,,,,,0,2019-08-05T13:14:45Z,2019-08-05T13:14:45Z,,NONE,,This would be really useful as it would allow us to tie in with colour schemes.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/568/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 480961330,MDU6SXNzdWU0ODA5NjEzMzA=,54,"Ability to list views, and to access db[""view_name""].rows / rows_where / etc",20264,ftrain,closed,0,,,,,5,2019-08-15T02:00:28Z,2019-08-23T12:41:09Z,2019-08-23T12:20:15Z,NONE,,"The docs show me how to create a view via `db.create_view()` but I can't seem to get back to that view post-creation; if I query it as a table it returns `None`, and it doesn't appear in the table listing, even though querying the view works fine from inside the sqlite3 command-line. 
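What sqlite-utils #54 above is asking for, and what PR #55 below implements: views that can be introspected and queried like read-only pseudo-tables. A sketch with hypothetical data:

```python
import sqlite_utils

db = sqlite_utils.Database(memory=True)  # hypothetical example data
db["dogs"].insert_all([
    {"name": "Cleo", "is_good": 1},
    {"name": "Pancakes", "is_good": 1},
])
db.create_view("good_dogs", "select name from dogs where is_good = 1")

# The view is now addressable the same way a table is:
for row in db["good_dogs"].rows:
    print(row)
```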
It'd be great to have the view as a pseudo-table, or if the python/sqlite3 module makes that hard to pull off (I couldn't figure it out), to have that edge-case documented next to the `db.create_view()` docs.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/54/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 481885279,MDU6SXNzdWU0ODE4ODUyNzk=,569,More advanced connection pooling,9599,simonw,open,0,,,,,4,2019-08-17T13:20:41Z,2019-10-02T22:44:37Z,,OWNER,,"We need a much smarter way of handling database connections. Today, connections are simple: Datasette runs a number of threads (defaults to 3) and each thread gets a threadlocal read-only (or immutable) connection to each attached database - opened on demand. For Datasette Library (#417) I want to support potentially hundreds of attached databases. Datasette Edit (#567) is going to introduce a need for writable connections too. I'd also like to be able to run joins across multiple databases (#283) which further complicates things. Supporting thousands of open SQLite connections at once feels like it won't provide good enough performance (though I should benchmark that to be sure). Some kind of connection pooling is likely to be necessary.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/569/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 481887482,MDExOlB1bGxSZXF1ZXN0MzA4MjkyNDQ3,55,Ability to introspect and run queries against views,9599,simonw,closed,0,,,,,1,2019-08-17T13:40:56Z,2019-08-23T12:19:42Z,2019-08-23T12:19:42Z,OWNER,simonw/sqlite-utils/pulls/55,See #54 ,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/55/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 487598042,MDU6SXNzdWU0ODc1OTgwNDI=,1,Implement code to pull checkins from the Foursquare API,9599,simonw,closed,0,,,,,0,2019-08-30T17:40:02Z,2019-08-30T18:23:24Z,2019-08-30T18:23:24Z,MEMBER,,"The tool currently only works with a pre-prepared JSON file of checkins. When called without options, it should prompt the user to paste in a Foursquare OAuth token. The `--token=` option should work too, and should be backed up by an optional environment variable.",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487598468,MDU6SXNzdWU0ODc1OTg0Njg=,2,--save option to dump checkins to a JSON file on disk,9599,simonw,closed,0,,,,,1,2019-08-30T17:41:06Z,2019-08-31T02:40:21Z,2019-08-31T02:40:21Z,MEMBER,,"This is a complement to the `--load` option - mainly useful for development purposes. 
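One way the token handling described in swarm-to-sqlite #1 above could be wired up with Click - the `FOURSQUARE_TOKEN` variable name and `fetch_checkins` helper are assumptions for illustration, not the tool's actual code:

```python
import click

def fetch_checkins(token):
    # Hypothetical stand-in for calling the Foursquare checkins API.
    pass

@click.command()
@click.option("--token", envvar="FOURSQUARE_TOKEN", help="Foursquare OAuth token")
def cli(token):
    if not token:
        # Fall back to prompting the user to paste a token in:
        token = click.prompt("Please paste your Foursquare OAuth token")
    fetch_checkins(token)

if __name__ == "__main__":
    cli()
```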
(I'll rename `--file` to `--load` as part of this issue).",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487600595,MDU6SXNzdWU0ODc2MDA1OTU=,3,Option to fetch only checkins more recent than the current max checkin,9599,simonw,closed,0,,,,,4,2019-08-30T17:46:45Z,2019-10-16T20:41:23Z,2019-10-16T20:39:59Z,MEMBER,,"The Foursquare checkins API supports ""return every checkin occurring after this point"" - I can pass it the maximum createdAt date currently stored in the database. This will allow for quick incremental fetches via a cron.",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487601121,MDU6SXNzdWU0ODc2MDExMjE=,4,Online tool for getting a Foursquare OAuth token,9599,simonw,closed,0,,,,,1,2019-08-30T17:48:14Z,2019-08-31T18:07:26Z,2019-08-31T18:07:26Z,MEMBER,,"I will link to this from the documentation. See also this conversation on Twitter: https://twitter.com/simonw/status/1166822603023011840 I've decided to go with ""copy and paste in a token"" rather than hooking up a local web server that can have tokens passed to it.",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487721884,MDU6SXNzdWU0ODc3MjE4ODQ=,5,Treat Foursquare timestamps as UTC,9599,simonw,closed,0,,,,,0,2019-08-31T02:44:47Z,2019-08-31T02:50:41Z,2019-08-31T02:50:41Z,MEMBER,,"Current test failure is due to timezone differences between my laptop and Circle CI: https://circleci.com/gh/dogsheep/swarm-to-sqlite/3 ``` E Full diff: E - [{'created': '2018-07-01T04:48:19', E ? ^ E + [{'created': '2018-07-01T02:48:19', E ? ^ E 'createdAt': 1530413299, ``` The timestamps I store in `created` should always be UTC.",205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 487847945,MDExOlB1bGxSZXF1ZXN0MzEzMDA3NDgz,56,Escape the table name in populate_fts and search.,49260,amjith,closed,0,,,,,2,2019-09-01T06:29:05Z,2019-09-02T17:23:21Z,2019-09-02T17:23:21Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/56,"The table names weren't escaped using double quotes in the populate_fts method. Reproducible case: ``` >>> import sqlite_utils >>> db = sqlite_utils.Database(""abc.db"") >>> db[""http://example.com""].insert_all([ ... {""id"": 1, ""age"": 4, ""name"": ""Cleo""}, ... {""id"": 2, ""age"": 2, ""name"": ""Pancakes""} ... 
], pk=""id"") >>> db[""http://example.com""].enable_fts([""name""]) Traceback (most recent call last): File """", line 1, in db[""http://example.com""].enable_fts([""name""]) File ""/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/sqlite_utils/db.py"", line 705, in enable_fts self.populate_fts(columns) File ""/home/amjith/.virtualenvs/itsysearch/lib/python3.7/site-packages/sqlite_utils/db.py"", line 715, in populate_fts self.db.conn.executescript(sql) sqlite3.OperationalError: unrecognized token: "":"" >>> ```",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/56/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 487987958,MDExOlB1bGxSZXF1ZXN0MzEzMTA1NjM0,57,Add triggers while enabling FTS,49260,amjith,closed,0,,,,,4,2019-09-02T04:23:40Z,2019-09-03T01:03:59Z,2019-09-02T23:42:29Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/57,"This adds the option for a user to set up triggers in the database to keep their FTS table in sync with the parent table. Ref: https://sqlite.org/fts5.html#external_content_and_contentless_tables I would prefer to make the creation of triggers the default behavior, but that will break existing usage where people have been calling `populate_fts` after inserting new rows. I am happy to make changes to the PR as you see fit. ",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/57/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 488293926,MDU6SXNzdWU0ODgyOTM5MjY=,58,Support enabling FTS on views,49260,amjith,closed,0,,,,,1,2019-09-02T18:56:36Z,2020-10-16T18:39:36Z,2020-10-16T18:39:31Z,CONTRIBUTOR,,"Right now enable_fts() is only implemented for Table(). Technically sqlite supports enabling fts on views. But it requires deeper thought since views don't have `rowid` and the current implementation of enable_fts() relies on the presence of `rowid` column. It is possible to provide an alternative rowid using the `content_rowid` option to the FTS5() function. Ref: https://sqlite.org/fts5.html#fts5_table_creation_and_initialization > The ""content_rowid"" option, used to set the rowid field of an external content table. This will further complicate `enable_fts()` function by adding an extra argument. I'm wondering if that is outside the scope of this tool or should I work on that feature and send a PR? ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/58/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488338516,MDU6SXNzdWU0ODgzMzg1MTY=,570,detect_fts should handle alternative table escaping,9599,simonw,closed,0,,,,,0,2019-09-02T23:43:29Z,2019-09-03T00:32:28Z,2019-09-03T00:32:28Z,OWNER,,"sqlite-utils now uses a better way of escaping table names, which has highlighted a bug in Datasette. Datasette has its own version of the `detect_fts` function - at https://github.com/simonw/datasette/blob/d224ee2c98ac39c2c6e21a0ac0c62e5c3e1ccd11/datasette/utils/__init__.py#L466-L479 - which fails to pick up FTS tables created using the new escaping pattern. 
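To illustrate (a sketch - the table name is made up): the same FTS table is recorded differently in `sqlite_master` depending on which escaping style created it, so a detector that only matches one pattern misses the other.

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE [my table] (name TEXT)')
# sqlite-utils now escapes with double quotes rather than [brackets],
# so the content= option shows up differently in sqlite_master:
conn.execute(
    'CREATE VIRTUAL TABLE [my table_fts] USING FTS4 (name, content=""my table"")'
)
print(conn.execute(
    'SELECT sql FROM sqlite_master WHERE name = ?', ('my table_fts',)
).fetchone()[0])
```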
_Originally posted by @simonw in https://github.com/simonw/sqlite-utils/pull/57#issuecomment-527258212_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/570/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488338965,MDU6SXNzdWU0ODgzMzg5NjU=,59,Ability to introspect triggers,9599,simonw,closed,0,,,,,0,2019-09-02T23:47:16Z,2019-09-03T01:52:36Z,2019-09-03T00:09:42Z,OWNER,,"Now that we're creating triggers (thanks to @amjith in #57) it would be neat if we could introspect them too. I'm thinking: `db.triggers` - lists all triggers for the database `db[""tablename""].triggers` - lists triggers for that table The underlying query for this is `select * from sqlite_master where type = 'trigger'` I'll return the trigger information in a new namedtuple, similar to how Indexes and ForeignKeys work.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/59/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488341021,MDExOlB1bGxSZXF1ZXN0MzEzMzgzMzE3,60,db.triggers and table.triggers introspection,9599,simonw,closed,0,,,,,0,2019-09-03T00:04:32Z,2019-09-03T00:09:42Z,2019-09-03T00:09:42Z,OWNER,simonw/sqlite-utils/pulls/60,Closes #59,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/60/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 488343304,MDExOlB1bGxSZXF1ZXN0MzEzMzg0OTI2,571,detect_fts now works with alternative table escaping,9599,simonw,closed,0,,,,,0,2019-09-03T00:23:39Z,2019-09-03T00:32:28Z,2019-09-03T00:32:28Z,OWNER,simonw/datasette/pulls/571,Fixes #570,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/571/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 488833136,MDU6SXNzdWU0ODg4MzMxMzY=,1,"Imported followers should go in ""users"", relationships in ""following""",9599,simonw,closed,0,,,,,0,2019-09-03T21:27:37Z,2019-09-04T20:23:04Z,2019-09-04T20:23:04Z,MEMBER,,"Right now `twitter-to-sqlite followers` dumps everything in a `followers` table, and doesn't actually record which account they are following! It should instead save them all in a global `users` table and then set up m2m relationships in a `following` table. 
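A sketch of the shape this implies, using sqlite-utils (the column names here are my assumption):

```python
import sqlite_utils

db = sqlite_utils.Database('twitter.db')
# Every account - follower or followed - lives in one global users table:
db['users'].upsert({'id': 12497, 'screen_name': 'simonw'}, pk='id')
db['users'].upsert({'id': 456, 'screen_name': 'someone'}, pk='id')
# The m2m table records both sides of the relationship:
db['following'].upsert(
    {'follower_id': 456, 'followed_id': 12497},
    pk=['follower_id', 'followed_id'],
)
```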
This also means it should create a record for the specified user in order to record both sides of each relationship.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488833698,MDU6SXNzdWU0ODg4MzM2OTg=,2,"""twitter-to-sqlite user-timeline"" command for pulling tweets by a specific user",9599,simonw,closed,0,,,,,3,2019-09-03T21:29:12Z,2019-09-04T20:02:11Z,2019-09-04T20:02:11Z,MEMBER,,"Twitter only allows up to 3,200 tweets to be retrieved from https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-user_timeline.html I'm going to do: $ twitter-to-sqlite tweets simonw ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488833975,MDU6SXNzdWU0ODg4MzM5NzU=,3,Command for running a search and saving tweets for that search,9599,simonw,closed,0,,,,,6,2019-09-03T21:29:56Z,2019-11-04T05:31:56Z,2019-11-04T05:31:16Z,MEMBER,, $ twitter-to-sqlite search dogsheep,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488835586,MDU6SXNzdWU0ODg4MzU1ODY=,4,Command for importing data from a Twitter Export file,9599,simonw,closed,0,,,,,2,2019-09-03T21:34:13Z,2019-10-11T06:45:02Z,2019-10-11T06:45:02Z,MEMBER,,"Twitter lets you export all of your data as an archive file: https://twitter.com/settings/your_twitter_data A command for importing this data into SQLite would be extremely useful. 
$ twitter-to-sqlite import twitter.db path-to-archive.zip ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 488874815,MDU6SXNzdWU0ODg4NzQ4MTU=,5,Write tests that simulate the Twitter API,9599,simonw,open,0,,,,,1,2019-09-03T23:55:35Z,2019-09-03T23:56:28Z,,MEMBER,,I can use betamax for this: https://pypi.org/project/betamax/,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 489419782,MDU6SXNzdWU0ODk0MTk3ODI=,6,Extract extended_entities into a media table,9599,simonw,closed,0,,,,,0,2019-09-04T21:59:10Z,2019-09-04T22:08:01Z,2019-09-04T22:08:01Z,MEMBER,," ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 489429284,MDU6SXNzdWU0ODk0MjkyODQ=,572,Error running datasette publish with just --source_url,9599,simonw,closed,0,,,,,1,2019-09-04T22:19:22Z,2019-11-13T04:28:44Z,2019-11-13T04:28:44Z,OWNER,,"``` datasette publish now cleo.db \ --source_url=""https://twitter.com/cleopaws"" \ ``` Gave me this error: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/572/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 490798130,MDU6SXNzdWU0OTA3OTgxMzA=,7,users-lookup command for fetching users,9599,simonw,closed,0,,,,,0,2019-09-08T19:47:59Z,2019-09-08T20:32:13Z,2019-09-08T20:32:13Z,MEMBER,,"https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-users-lookup ``` https://api.twitter.com/1.1/users/lookup.json?user_id=783214,6253282 https://api.twitter.com/1.1/users/lookup.json?screen_name=simonw,cleopaws ``` CLI design: ``` $ twitter-to-sqlite users-lookup simonw cleopaws $ twitter-to-sqlite users-lookup 783214 6253282 --ids ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 490803176,MDU6SXNzdWU0OTA4MDMxNzY=,8,--sql and --attach options for feeding commands from SQL queries,9599,simonw,closed,0,,,,,4,2019-09-08T20:35:49Z,2020-03-20T23:13:01Z,2020-03-20T23:13:01Z,MEMBER,,"Say you want to fetch Twitter profiles for a list of accounts that are stored in another database: $ twitter-to-sqlite users-lookup users.db --attach attending.db \ --sql ""select Twitter from attending.attendes where Twitter is not null"" The SQL query you feed in is expected to return a list of screen names suitable for processing further by the command. Should be supported by all three of: - [x] `twitter-to-sqlite users-lookup` - [x] `twitter-to-sqlite user-timeline` - [x] `twitter-to-sqlite followers` and `friends` The `--attach` option allows other SQLite databases to be attached to the connection. 
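Under the hood this is just SQLite's `ATTACH DATABASE` - a sketch of what the command would do with the options above:

```python
import sqlite3

conn = sqlite3.connect('users.db')
# --attach exposes the second file under an alias for cross-database queries:
conn.execute(""ATTACH DATABASE 'attending.db' AS attending"")
screen_names = [
    row[0] for row in conn.execute(
        'select Twitter from attending.attendes where Twitter is not null'
    )
]
```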
Without it the SQL query will have to read from the single attached database.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 491219910,MDU6SXNzdWU0OTEyMTk5MTA=,61,importing CSV to SQLite as library,17739,witeshadow,closed,0,,,,,2,2019-09-09T17:12:40Z,2019-11-04T16:25:01Z,2019-11-04T16:25:01Z,NONE,,"CSV can be imported to SQLite when using the CLI, but I don't see documentation for doing the same when using it as a library. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/61/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 491791152,MDU6SXNzdWU0OTE3OTExNTI=,9,followers-ids and friends-ids subcommands,9599,simonw,closed,0,,,,,1,2019-09-10T16:58:15Z,2019-09-10T17:36:55Z,2019-09-10T17:36:55Z,MEMBER,,"These will import follower and friendship IDs into the following tables, using these APIs: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-followers-ids https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-ids",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 492153532,MDU6SXNzdWU0OTIxNTM1MzI=,573,Exposing Datasette via Jupyter-server-proxy,82988,psychemedia,closed,0,,,,,3,2019-09-11T10:32:36Z,2020-03-26T09:41:30Z,2020-03-26T09:41:30Z,CONTRIBUTOR,,"It is possible to expose a running `datasette` service in a Jupyter environment such as a MyBinder environment using the [`jupyter-server-proxy`](https://github.com/jupyterhub/jupyter-server-proxy). For example, using [this demo Binder](https://mybinder.org/v2/gh/binder-examples/r/master?filepath=index.ipynb) which has the server proxy installed, we can then upload a simple test database from the notebook homepage, install datasette from a Jupyter terminal, set it running against the test db on e.g. port 8001, and then view it via the path `proxy/8001`. Clicking links results in 404s though because the `datasette` links aren't relative to the current path? ![image](https://user-images.githubusercontent.com/82988/64689964-44b69280-d487-11e9-8f9f-3681422bcc9f.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/573/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 492297930,MDU6SXNzdWU0OTIyOTc5MzA=,10,Rethink progress bars for various commands,9599,simonw,closed,0,,,,,5,2019-09-11T15:06:47Z,2020-04-01T03:45:48Z,2020-04-01T03:45:48Z,MEMBER,,"Progress bars and the `--silent` option are implemented inconsistently across commands at the moment. This is made more challenging by the fact that for many operations the total length is not known. 
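One way to standardize this (a sketch, not a final design): a single helper that shows a Click progress bar when the total is known and degrades to plain iteration when it is not, with `--silent` handled in the same place:

```python
import click

def track(iterable, length=None, label=None, silent=False):
    # With no known total a progress bar is meaningless, so fall through:
    if silent or length is None:
        yield from iterable
    else:
        with click.progressbar(iterable, length=length, label=label) as bar:
            yield from bar
```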
https://click.palletsprojects.com/en/7.x/api/#click.progressbar",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493599818,MDU6SXNzdWU0OTM1OTk4MTg=,1,Command for fetching starred repos,9599,simonw,closed,0,,,,,0,2019-09-14T08:36:29Z,2019-09-14T21:30:48Z,2019-09-14T21:30:48Z,MEMBER,,,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493668862,MDU6SXNzdWU0OTM2Njg4NjI=,2,Extract licenses from repos into a separate table,9599,simonw,closed,0,,,,,0,2019-09-14T21:33:41Z,2019-09-14T21:46:58Z,2019-09-14T21:46:58Z,MEMBER,," ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493670426,MDU6SXNzdWU0OTM2NzA0MjY=,3,Command to fetch all repos belonging to a user or organization,9599,simonw,closed,0,,,,,2,2019-09-14T21:54:21Z,2019-09-17T00:17:53Z,2019-09-17T00:17:53Z,MEMBER,,"How about this: $ github-to-sqlite repos simonw",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493670730,MDU6SXNzdWU0OTM2NzA3MzA=,4,Command to fetch stargazers for one or more repos,9599,simonw,closed,0,,,,,8,2019-09-14T21:58:22Z,2020-05-02T21:30:27Z,2020-05-02T21:30:27Z,MEMBER,,"Maybe this: $ github-to-sqlite stargazers github.db simonw/datasette It could accept more than one repo. Maybe have options similar to `--sql` in [twitter-to-sqlite](https://github.com/dogsheep/twitter-to-sqlite) so you can e.g. fetch all stargazers for all of the repos you have fetched into the database already (or all of the repos belonging to owner X)",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 493671014,MDU6SXNzdWU0OTM2NzEwMTQ=,5,"Add ""incomplete"" boolean to users table for incomplete profiles",9599,simonw,closed,0,,,,,2,2019-09-14T22:01:50Z,2020-03-23T19:23:31Z,2020-03-23T19:23:30Z,MEMBER,,"User profiles that are fetched from e.g. stargazers (#4) are incomplete - they have a login but they don't have name, company etc. Add an `incomplete` boolean flag to the `users` table to record this. 
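A sketch of how the flag would be written when saving a stargazer (using sqlite-utils; `alter=True` is my assumption about how the new column gets added):

```python
import sqlite_utils

db = sqlite_utils.Database('github.db')
db['users'].upsert(
    {'id': 9599, 'login': 'simonw', 'incomplete': True},
    pk='id',
    alter=True,  # adds the incomplete column to an existing users table
)
```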
Then later I can add a `backfill-users` command which loops through and fetches missing data for those incomplete profiles.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 494685791,MDU6SXNzdWU0OTQ2ODU3OTE=,574,Improve usage description of --host option,132978,terrycojones,closed,0,,,,,2,2019-09-17T15:12:12Z,2019-11-01T21:58:17Z,2019-11-01T21:57:54Z,NONE,,"It would be nice if the `--host` option had a clearer description. I tried to get datasette running on an AWS instance and it took a while to realize it was only listening on localhost. So I wanted to make it listen on a non-localhost interface and tried giving a couple of values to `--host` (a host name, then an interface name), but none of them worked. In the end I read the source to see that the option is passed to `uvicorn` and looked at the uvicorn docs, which also didn't help. Then I searched the web for ""example running datasette on a host"" which led me to https://github.com/simonw/datasette/issues/514 where I saw someone using `-h 0.0.0.0`. I tried that and it works. That usage could be mentioned somewhere, and might save someone else some time.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/574/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 496415321,MDU6SXNzdWU0OTY0MTUzMjE=,1,Figure out some interesting example SQL queries,9599,simonw,open,0,,,,,9,2019-09-20T15:28:07Z,2021-05-03T03:46:23Z,,MEMBER,,My knowledge of genetics has left me short here. 
I'd love to be able to provide some interesting example SELECT queries - maybe one that spots if you are [likely to have red hair?](https://www.snpedia.com/index.php/Rs1805007),209590345,genome-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/genome-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 497162288,MDU6SXNzdWU0OTcxNjIyODg=,575,Plugin documentation should cover how to bundle static/templates in setup.py,9599,simonw,closed,0,,,6026070,0.51,1,2019-09-23T15:15:18Z,2020-10-24T20:06:17Z,2020-10-24T20:03:53Z,OWNER,,"These sections here should cover it: https://datasette.readthedocs.io/en/latest/plugins.html#static-assets Example: https://github.com/simonw/datasette-auth-github/blob/bf01f8f01b87a6cb09c47380ba0a86e0546ebb38/setup.py#L30 ``` package_data={""datasette_auth_github"": [""templates/*.html""]}, ``` Also from https://github.com/simonw/datasette-plugin-demos/blob/0ccf9e6189e923046047acd7878d1d19a2cccbb1/setup.py#L18-L22 package_data={ 'datasette_plugin_demos': [ 'static/plugin.js', ], }, ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/575/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 497170355,MDU6SXNzdWU0OTcxNzAzNTU=,576,Documented internals API for use in plugins,9599,simonw,closed,0,,,3268330,Datasette 1.0,10,2019-09-23T15:28:50Z,2021-01-05T23:12:51Z,2021-01-05T23:12:37Z,OWNER,,"Quite a few of the plugin hooks make a `datasette` instance of the Datasette class available to the plugins, so that they can look up configuration settings and execute database queries. This means it should provide a documented, stable API so that plugin authors can rely on it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/576/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 497171390,MDU6SXNzdWU0OTcxNzEzOTA=,577,Utility mechanism for plugins to render templates,9599,simonw,closed,0,,,3268330,Datasette 1.0,7,2019-09-23T15:30:36Z,2020-02-04T20:26:20Z,2020-02-04T20:26:19Z,OWNER,,"Sometimes a plugin will need to render a template for some custom UI. We need a documented API for doing this, which ensures that everything will work correctly if you extend base.html etc. See also #576. This could be a `.render()` method on the Datasette class, but that feels a bit weird - should that class also take responsibility for rendering?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/577/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 499954048,MDExOlB1bGxSZXF1ZXN0MzIyNTI5Mzgx,578,Added support for multi arch builds,887095,heussd,closed,0,,,,,3,2019-09-29T18:43:03Z,2019-11-13T19:13:15Z,2019-11-13T19:13:15Z,NONE,simonw/datasette/pulls/578,Minor changes in Dockerfile and new Makefile to support Docker multi architecture builds. `make` will build one image per architecture and push them as one Docker manifest to Docker Hub. 
Feel free to change `IMAGE_NAME` to `datasetteproject/datasette` to update your official Docker Hub image(s).,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/578/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 500783373,MDU6SXNzdWU1MDA3ODMzNzM=,62,[enhancement] Method to delete a row in python,4454869,Sergeileduc,closed,0,,,,,5,2019-10-01T09:45:47Z,2019-11-04T16:30:34Z,2019-11-04T16:18:18Z,NONE,,"Hi ! Thanks for the lib ! Obviously, every possible sql query won't have a dedicated method. But I was thinking : a method to delete a row (I'm terrible with names, maybe `delete_where()` or something) would be useful. I have a Database, with primary key. For the moment, I use : ```Python3 db.conn.execute(f""DELETE FROM table WHERE key = {key_id}"") db.conn.commit() ``` to delete a row I don't need anymore, given its primary key. Works like a charm. Just an idea : ```Python3 table.delete_where_pkey({'key': key_id}) ``` or something (I know, I'm terrible at naming methods...). Pros : well, no need to write SQL query. Cons : WHERE normally allows to do many more things (operators =, <>, >, <, BETWEEN), not to mention AND, OR, etc... Method is maybe too specific, and/or a pain to render more flexible. Again, just a thought. Writing your own sql works too, so... Thanks again. See yah.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/62/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 501773982,MDExOlB1bGxSZXF1ZXN0MzIzOTgzNzMy,579,New connection pooling,9599,simonw,open,0,,,,,1,2019-10-02T23:22:19Z,2019-11-15T22:57:21Z,,OWNER,simonw/datasette/pulls/579,See #569,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/579/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 502355384,MDU6SXNzdWU1MDIzNTUzODQ=,580,Testing utilities should be available to plugins,9599,simonw,closed,0,,,,,5,2019-10-03T23:58:26Z,2020-02-28T07:58:46Z,2020-02-28T07:58:46Z,OWNER,,"I'm trying to write a plugin at the moment ([datasette-atom](https://github.com/simonw/datasette-atom)) which needs to run unit tests against a full in-memory Datasette instance, in the same way that the Datasette test suite itself works. I got it working by creating copies of the [TestClient and TestResponse classes](https://github.com/simonw/datasette/blob/a314b761866d250c16f1ff6dd682010cf4181eb4/tests/fixtures.py#L22-L96) within the plugin itself: https://github.com/simonw/datasette-atom/commit/c0e3bd9556d7b31f253a8bf666d42205cd24f4fc#diff-33337525d2d877f7cc7f33737bfd2d7b I had to do this because those classes are in the `tests/` directory within Datasette, so they don't get included in the package that ships to PyPI. 
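Something like this is what I want plugin authors to be able to write (a sketch - the `datasette.testing` module path is hypothetical, not a real import today):

```python
from datasette.app import Datasette
from datasette.testing import TestClient  # hypothetical import path

def test_plugin_is_installed():
    # Mirrors the pattern in tests/fixtures.py: wrap the ASGI app directly.
    client = TestClient(Datasette([], memory=True).app())
    response = client.get('/-/plugins.json')
    assert response.status == 200
```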
It would be better if these classes were included in the main package in a way that made it easy for plugins to reuse them to write their own tests.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/580/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 502993509,MDU6SXNzdWU1MDI5OTM1MDk=,581,Redesign register_output_renderer callback,9599,simonw,closed,0,,,5471110,Datasette 0.43,24,2019-10-05T17:43:23Z,2020-05-28T02:24:14Z,2020-05-28T02:21:50Z,OWNER,,"In building https://github.com/simonw/datasette-atom it became clear that the callback function (which currently accepts just args, data and view_name) would also benefit from access to a mechanism to render templates and a `datasette` instance so it can execute SQL. To maintain backwards compatibility with existing plugins, we can introspect the callback function to see if it wants those new arguments or not. At a minimum I want to make `datasette` and ASGI `scope` available.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/581/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503045221,MDU6SXNzdWU1MDMwNDUyMjE=,11,Commands for recording real-time tweets from the streaming API,9599,simonw,closed,0,,,,,1,2019-10-06T03:09:30Z,2019-10-06T04:54:17Z,2019-10-06T04:48:31Z,MEMBER,,"https://developer.twitter.com/en/docs/tweets/filter-realtime/api-reference/post-statuses-filter We can support tracking keywords and following specific users.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503053243,MDU6SXNzdWU1MDMwNTMyNDM=,582,Datasette should not completely crash if one SQLite database is malformed,9599,simonw,open,0,,,,,0,2019-10-06T05:11:43Z,2019-10-06T05:11:43Z,,OWNER,,"If you run Datasette against a number of database files and one of them is malformed, you get this 500 error on the index page: It would be better if Datasette still worked and listed the databases that were NOT malformed, then showed an inline error message just for the one that could not be accessed.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/582/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 503053800,MDU6SXNzdWU1MDMwNTM4MDA=,12,"Extract ""source"" into a separate lookup table",9599,simonw,closed,0,,,,,3,2019-10-06T05:17:23Z,2019-10-17T15:49:24Z,2019-10-17T15:49:24Z,MEMBER,,"It's pretty bulky and ugly at the moment: ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503085013,MDU6SXNzdWU1MDMwODUwMTM=,13,statuses-lookup command,9599,simonw,closed,0,,,,,1,2019-10-06T11:00:20Z,2019-10-07T00:33:49Z,2019-10-07T00:31:44Z,MEMBER,,"For bulk retrieving tweets by their ID. 
https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-lookup Rate limit is 900/15 minutes (1 call per second) but each call can pull up to 100 IDs, so we can pull 6,000 per minute. Should support `--SQL` and `--attach` #8 ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503128914,MDU6SXNzdWU1MDMxMjg5MTQ=,583,"Enable ""explain"" and ""explain query plan"" for CTEs",9599,simonw,closed,0,,,,,1,2019-10-06T17:00:10Z,2019-10-06T17:24:07Z,2019-10-06T17:24:07Z,OWNER,,"This currently throws an error: https://latest.datasette.io/fixtures?sql=explain+WITH+RECURSIVE%0D%0A++xaxis%28x%29+AS+%28VALUES%28-2.0%29+UNION+ALL+SELECT+x%2B0.05+FROM+xaxis+WHERE+x%3C1.2%29%2C%0D%0A++yaxis%28y%29+AS+%28VALUES%28-1.0%29+UNION+ALL+SELECT+y%2B0.1+FROM+yaxis+WHERE+y%3C1.0%29%2C%0D%0A++m%28iter%2C+cx%2C+cy%2C+x%2C+y%29+AS+%28%0D%0A++++SELECT+0%2C+x%2C+y%2C+0.0%2C+0.0+FROM+xaxis%2C+yaxis%0D%0A++++UNION+ALL%0D%0A++++SELECT+iter%2B1%2C+cx%2C+cy%2C+x*x-y*y+%2B+cx%2C+2.0*x*y+%2B+cy+FROM+m+%0D%0A+++++WHERE+%28x*x+%2B+y*y%29+%3C+4.0+AND+iter%3C28%0D%0A++%29%2C%0D%0A++m2%28iter%2C+cx%2C+cy%29+AS+%28%0D%0A++++SELECT+max%28iter%29%2C+cx%2C+cy+FROM+m+GROUP+BY+cx%2C+cy%0D%0A++%29%2C%0D%0A++a%28t%29+AS+%28%0D%0A++++SELECT+group_concat%28+substr%28%27+.%2B*%23%27%2C+1%2Bmin%28iter%2F7%2C4%29%2C+1%29%2C+%27%27%29+%0D%0A++++FROM+m2+GROUP+BY+cy%0D%0A++%29%0D%0ASELECT+group_concat%28rtrim%28t%29%2Cx%270a%27%29+FROM+a%3B",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/583/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503190241,MDU6SXNzdWU1MDMxOTAyNDE=,584,Codec error in some CSV exports,9599,simonw,closed,0,,,,,2,2019-10-07T01:15:34Z,2021-06-17T18:13:20Z,2019-10-18T05:23:16Z,OWNER,,"Got this exploring my Swarm checkins: ![448DBFC4-71F8-4846-83C0-BEA511B2157A](https://user-images.githubusercontent.com/9599/66279259-3af53480-e865-11e9-9651-04fd2d895392.jpeg) `/swarm/stickers.csv?stickerType=messageOnly&_size=max`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/584/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503217375,MDU6SXNzdWU1MDMyMTczNzU=,585,"Databases on index page should display in order they were passed to ""datasette serve""?",9599,simonw,closed,0,,,,,1,2019-10-07T03:42:39Z,2019-10-14T03:52:34Z,2019-10-14T03:52:34Z,OWNER,,"If you run this: datasette serve -h 127.0.0.1 -p 8000 -m phone-locations.db healthkit.db locations.db genome.db Then the index page for that Datasette instance should show the databases in the order they were specified on the command-line. 
Mind you, when we add pagination to that page in #468 we may want to do something different here.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/585/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503218205,MDU6SXNzdWU1MDMyMTgyMDU=,586,Enable browser caching for plugin statics with datasette-auth,9599,simonw,closed,0,,,,,2,2019-10-07T03:47:14Z,2019-10-07T15:46:04Z,2019-10-07T15:46:03Z,OWNER,,"An authenticated Datasette I run is seeing delays on every page load. On looking at the network inspector it turns out it's because datasette-vega is nearly 1MB and a `cache-control: private` is preventing it from being cached! This may well turn out to be a bug in `datasette-auth-github` but it's still worth tracking here because caching of static assets from plugins is very important. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/586/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503233021,MDU6SXNzdWU1MDMyMzMwMjE=,1,Use better pagination (and implement progress bar),9599,simonw,closed,0,,,,,4,2019-10-07T04:58:11Z,2020-03-27T22:13:57Z,2020-03-27T22:13:57Z,MEMBER,,"Right now we attempt to load everything at once - which caps out at 5,000 items and is really slow. We can do better by implementing pagination using count and offset.",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503234169,MDU6SXNzdWU1MDMyMzQxNjk=,2,Track and use the 'since' value,9599,simonw,closed,0,,,,,3,2019-10-07T05:02:59Z,2020-03-27T22:22:30Z,2020-03-27T22:22:30Z,MEMBER,,"Pocket says: > Whenever possible, you should use the since parameter, or count and offset parameters when retrieving a user's list. After retrieving the list, you should store the current time (which is provided along with the list response) and pass that in the next request for the list. This way the server only needs to return a small set (changes since that time) instead of the user's entire list every time. At the bottom of https://getpocket.com/developer/docs/v3/retrieve",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 503243784,MDU6SXNzdWU1MDMyNDM3ODQ=,3,Extract images into separate tables,9599,simonw,open,0,,,,,1,2019-10-07T05:43:01Z,2020-09-01T06:17:45Z,,MEMBER,,"As already done with authors. Slightly harder because images do not have a universally unique ID. Also need to figure out what to do about there being columns for both `image` and `images`. 
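For the missing ID, one option (an assumption, not a settled decision - the `src` key name is also a guess) is to derive a stable key by hashing the image URL:

```python
import hashlib
import sqlite_utils

db = sqlite_utils.Database('pocket.db')

def image_id(image):
    # No universally unique ID exists, so hash the source URL instead:
    return hashlib.sha1(image['src'].encode('utf-8')).hexdigest()

image = {'src': 'https://example.com/photo.jpg', 'width': 640}
db['images'].upsert(dict(image, id=image_id(image)), pk='id')
```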
",213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 503244410,MDU6SXNzdWU1MDMyNDQ0MTA=,14,"When importing favorites, record which user favorited them",9599,simonw,closed,0,,,,,0,2019-10-07T05:45:11Z,2019-10-14T03:30:25Z,2019-10-14T03:30:25Z,MEMBER,,"This code currently just dumps them into the `tweets` table without recording who it was who had favorited them. https://github.com/dogsheep/twitter-to-sqlite/blob/436a170d74ec70903d1b4ca430c2c6b6435cdfcc/twitter_to_sqlite/cli.py#L152-L157",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 504238461,MDU6SXNzdWU1MDQyMzg0NjE=,6,sqlite3.OperationalError: table users has no column named bio,1055831,dazzag24,closed,0,,,,,2,2019-10-08T19:39:52Z,2019-10-13T05:31:28Z,2019-10-13T05:30:19Z,NONE,,"``` $ github-to-sqlite repos github.db $ github-to-sqlite starred github.db dazzag24 Traceback (most recent call last): File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/bin/github-to-sqlite"", line 10, in sys.exit(cli()) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/cli.py"", line 106, in starred utils.save_stars(db, user, stars) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 177, in save_stars user_id = save_user(db, user) File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 61, in save_user return db[""users""].upsert(to_save, pk=""id"").last_pk File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 1067, in upsert extracts=extracts, File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 916, in insert extracts=extracts, File ""/home/darreng/.virtualenvs/dogsheep-d2PjdrD7/lib/python3.6/site-packages/sqlite_utils/db.py"", line 1024, in insert_all result = self.db.conn.execute(sql, values) sqlite3.OperationalError: table users has no column named bio ``` ``` $ pipenv graph github-to-sqlite==0.4 - requests [required: Any, installed: 2.22.0] - certifi [required: >=2017.4.17, installed: 2019.9.11] - chardet [required: >=3.0.2,<3.1.0, installed: 3.0.4] - idna [required: >=2.5,<2.9, installed: 2.8] - urllib3 [required: >=1.21.1,<1.26,!=1.25.1,!=1.25.0, installed: 1.25.6] - sqlite-utils [required: ~=1.11, 
installed: 1.11] - click [required: Any, installed: 7.0] - click-default-group [required: Any, installed: 1.2.2] - click [required: Any, installed: 7.0] - tabulate [required: Any, installed: 0.8.5] Python 3.6.8 ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 504720731,MDU6SXNzdWU1MDQ3MjA3MzE=,1,Add more details on how to request data from google takeout correctly.,1055831,dazzag24,open,0,,,,,0,2019-10-09T15:17:34Z,2019-10-09T15:17:34Z,,NONE,,"The default is to download everything. This can result in an enormous amount of data when you only really need 2 types of data for now: - My Activity - Location History In addition unless you specify that ""My Activity"" is downloaded in JSON format the default is HTML. This then causes the `google-takeout-to-sqlite my-activity takeout.db takeout.zip` command to fail as it only contains html files not json files. Thanks",206649770,google-takeout-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 504805857,MDU6SXNzdWU1MDQ4MDU4NTc=,587,Use --platform=managed for publish cloudrun,9599,simonw,closed,0,,,,,0,2019-10-09T18:02:16Z,2019-10-17T21:51:57Z,2019-10-17T21:51:57Z,OWNER,,"Running `datasette publish cloudrun` now shows this message: > Please choose a target platform: > [1] Cloud Run (fully managed) > [2] Cloud Run on GKE > [3] a Kubernetes cluster > [4] cancel >Please enter your numeric choice: 1 > > To specify the platform yourself, pass `--platform managed`. Or, to make this the default target platform, run `gcloud config set run/platform managed`. May as well set that as a default.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/587/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 505512251,MDU6SXNzdWU1MDU1MTIyNTE=,588,Queries per DB table in metadata.json,12617395,bsilverm,closed,0,,,,,3,2019-10-10T21:08:19Z,2019-10-21T12:58:22Z,2019-10-21T01:48:42Z,NONE,,"It doesn't appear possible to have separate queries defined per database table. 
When I do something like below, my table descriptions show up but not the queries: ` ""databases"": { ""MYDB"": { ""tables"": { ""MYFIRSTTABLE"": { ""source"": ""Test"", ""source_url"": ""https://www.google.com"", ""queries"": { ""Query 1"": { ""sql"": ""select * from MYFIRSTTABLE"", ""title"": ""Query 1"", ""description"": ""This is the first query"" }, } }, ""MYSECONDTABLE"": { ""source"":""Test2"", ""source_url"":""https://www.google.com"", ""queries"": { ""Query 2"" : { ""sql"":""select * from MYSECONDTABLE;"", ""title"": ""Query 2"", ""description"":""This is the second query"" } } } }`",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/588/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 505666744,MDExOlB1bGxSZXF1ZXN0MzI3MDUxNjcz,15,"twitter-to-sqlite import command, refs #4",9599,simonw,closed,0,,,,,0,2019-10-11T06:37:14Z,2019-10-11T06:45:01Z,2019-10-11T06:45:01Z,MEMBER,dogsheep/twitter-to-sqlite/pulls/15,,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 505673645,MDU6SXNzdWU1MDU2NzM2NDU=,16,Do a better job with archived direct message threads,9599,simonw,open,0,,,,,0,2019-10-11T06:55:21Z,2019-10-11T06:55:27Z,,MEMBER,,https://github.com/dogsheep/twitter-to-sqlite/blob/fb2698086d766e0333a55bb73435e7283feeb438/twitter_to_sqlite/archive.py#L98-L99,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 505674949,MDU6SXNzdWU1MDU2NzQ5NDk=,17,import command should empty all archive-* tables first,9599,simonw,closed,0,,,,,2,2019-10-11T06:58:43Z,2019-10-11T15:40:08Z,2019-10-11T15:40:08Z,MEMBER,,Can have a CLI option for NOT doing that.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 505814865,MDExOlB1bGxSZXF1ZXN0MzI3MTY5NzQ4,589,Display metadata footer on custom SQL queries,2657547,rixx,closed,0,,,,,0,2019-10-11T12:10:28Z,2019-10-14T08:58:23Z,2019-10-14T03:53:22Z,CONTRIBUTOR,simonw/datasette/pulls/589,Closes #408,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/589/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 505818256,MDExOlB1bGxSZXF1ZXN0MzI3MTcyNTQ1,590,Handle spaces in DB names,2657547,rixx,closed,0,,,,,3,2019-10-11T12:18:22Z,2019-11-04T23:16:31Z,2019-11-04T23:16:30Z,CONTRIBUTOR,simonw/datasette/pulls/590,Closes #503,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/590/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 505837199,MDExOlB1bGxSZXF1ZXN0MzI3MTg4MDg3,591,Sort databases on homepage by argument 
order,2657547,rixx,closed,0,,,,,1,2019-10-11T12:57:38Z,2019-10-14T08:57:50Z,2019-10-14T03:52:34Z,CONTRIBUTOR,simonw/datasette/pulls/591,Closes #585,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/591/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 505928530,MDU6SXNzdWU1MDU5Mjg1MzA=,18,Command to import home-timeline,9599,simonw,closed,0,,,,,4,2019-10-11T15:47:54Z,2019-10-11T16:51:33Z,2019-10-11T16:51:12Z,MEMBER,,"Feature request: https://twitter.com/johankj/status/1182563563136868352 > Would it be possible to save all tweets in my timeline from the last X days? I would love to see how big a percentage some users are of my daily timeline as a metric on whether I should unfollow them/move them to a list.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 505950145,MDExOlB1bGxSZXF1ZXN0MzI3Mjc5ODE4,592,Offer SQL formatting,2657547,rixx,closed,0,,,,,1,2019-10-11T16:35:49Z,2019-10-14T08:57:12Z,2019-10-14T03:46:13Z,CONTRIBUTOR,simonw/datasette/pulls/592,"SQL code will be formatted on page load, and can additionally be formatted by clicking the ""Format SQL"" button. Closes #136",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/592/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 506087267,MDU6SXNzdWU1MDYwODcyNjc=,19,since_id support for home-timeline,9599,simonw,closed,0,,,,,3,2019-10-11T22:48:24Z,2019-10-16T19:13:06Z,2019-10-16T19:12:46Z,MEMBER,,Currently every time you run `home-timeline` we pull all 800 available tweets. We should offer to support `since_id` (which can be provided or can be pulled directly from the database) in order to work more efficiently if this command is executed e.g. on a cron.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506183241,MDU6SXNzdWU1MDYxODMyNDE=,593,make uvicorn optional dependency (because not ok on windows python yet),4312421,stonebig,closed,0,,,,,3,2019-10-12T12:51:07Z,2019-10-13T06:22:08Z,2019-10-13T06:22:07Z,NONE,,"would it be possible to: - remove the mandatory uvicorn dependency? - eventually make a fallback to hypercorn? reason: - uvloop not yet supported on Windows/Python-3.8 and below, may happen with Python-3.9 only. - it seems like a six-line effort (but I'm not an expert)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/593/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506268945,MDU6SXNzdWU1MDYyNjg5NDU=,20,--since support for various commands for refresh-by-cron,9599,simonw,closed,0,,,,,3,2019-10-13T03:40:46Z,2019-10-21T03:32:04Z,2019-10-16T19:26:11Z,MEMBER,,"I want to run a cron that updates my Twitter database every X minutes. 
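The key trick for that (a sketch - table and column names assumed) is to read the newest stored ID and pass it to the API as `since_id`, so only newer items come back:

```python
import sqlite_utils

db = sqlite_utils.Database('twitter.db')
# Tweet IDs increase over time, so the max stored id marks our high-water mark:
since_id = db.conn.execute('select max(id) from tweets').fetchone()[0]
# since_id would then be passed straight through to each Twitter API call.
```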
It should be able to retrieve the following without needing to paginate through everything: - [x] Tweets I have tweeted - [x] My home timeline (see #19) - [x] Tweets I have favourited It would be nice if this could be standardized across all commands as a `--since` option.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506276893,MDU6SXNzdWU1MDYyNzY4OTM=,7,issue-comments command for importing issue comments,9599,simonw,closed,0,,,,,1,2019-10-13T05:23:58Z,2019-10-14T14:44:12Z,2019-10-13T05:24:30Z,MEMBER,,Using this API: https://developer.github.com/v3/issues/comments/,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506297048,MDU6SXNzdWU1MDYyOTcwNDg=,594,upgrade to uvicorn-0.9 to be Python-3.8 friendly,4312421,stonebig,closed,0,,,,,3,2019-10-13T09:23:43Z,2019-11-12T04:47:04Z,2019-11-12T04:47:04Z,NONE,,uvicorn-0.8 relies on websockets-0.7 which lacks python-3.8 compatibility,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/594/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 506300941,MDExOlB1bGxSZXF1ZXN0MzI3NTQxMDQ2,595,bump uvicorn to 0.9.0 to be Python-3.8 friendly,4312421,stonebig,closed,0,,,,,9,2019-10-13T10:00:04Z,2019-11-12T04:46:48Z,2019-11-12T04:46:48Z,NONE,simonw/datasette/pulls/595,"as uvicorn-0.9 is needed to get websockets-8.0.2, which is needed to have Python-3.8 compatibility",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/595/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 506432572,MDU6SXNzdWU1MDY0MzI1NzI=,21,Fix & escapes in tweet text,9599,simonw,closed,0,,,,,1,2019-10-14T03:37:28Z,2019-10-15T18:48:16Z,2019-10-15T18:48:16Z,MEMBER,," Shouldn't be storing `&` here.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 507454958,MDU6SXNzdWU1MDc0NTQ5NTg=,596,Handle really wide tables better,9599,simonw,open,0,,,,,9,2019-10-15T20:05:46Z,2022-09-07T00:58:41Z,,OWNER,,"If a table has hundreds of columns the Datasette UI starts getting unwieldy. Addressing this would be neat. 
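The first-N-columns idea described next is straightforward at the SQL level (a sketch; `wide_table` is a placeholder):

```python
import sqlite3

conn = sqlite3.connect('data.db')
# PRAGMA table_info lists columns in declared order; row[1] is the name:
columns = [row[1] for row in conn.execute('PRAGMA table_info(wide_table)')]
select_clause = ', '.join('[{}]'.format(c) for c in columns[:30])
rows = conn.execute('select {} from wide_table limit 10'.format(select_clause))
```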
One option would be to only select the first 30 columns by default and provide a UI for selecting more.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/596/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 508024032,MDU6SXNzdWU1MDgwMjQwMzI=,22,Ability to import from uncompressed archive or from specific files,9599,simonw,closed,0,,,,,0,2019-10-16T18:31:57Z,2019-10-16T18:53:36Z,2019-10-16T18:53:36Z,MEMBER,,"Currently you can only import like this: $ twitter-to-sqlite import path-to-twitter.zip It would be useful if you could import from a folder that was decompressed from that zip: $ twitter-to-sqlite import path-to-twitter/ AND from individual files within that folder - since that would allow you to e.g. selectively import certain files: $ twitter-to-sqlite import path-to-twitter/favorites.js path-to-twitter/tweets.js",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508070977,MDU6SXNzdWU1MDgwNzA5Nzc=,597,If you have databases called foo.db and foo-bar.db you cannot visit /foo-bar,9599,simonw,closed,0,,,,,5,2019-10-16T20:07:41Z,2019-10-18T22:51:08Z,2019-10-18T22:51:08Z,OWNER,,"Weird bug I just came across. It appears that if you have one database called `foo.db` and another called `foo-bar.db` any attempts to visit `/foo-bar` will redirect to `/foo`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/597/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508100844,MDU6SXNzdWU1MDgxMDA4NDQ=,598,Character encoding bug with CSV export,46313,JoeGermuska,closed,0,,,,,1,2019-10-16T21:09:30Z,2021-06-17T18:13:20Z,2019-10-18T22:52:21Z,NONE,,"I was just poking around, and at [this URL](https://sql-murder-mystery.datasette.io/sql-murder-mystery/crime_scene_report.csv?_stream=on&type=arson&_size=max), I encountered this error: ``` 'latin-1' codec can't encode character '\u2019' in position 27: ordinal not in range(256) ``` ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/598/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508190730,MDU6SXNzdWU1MDgxOTA3MzA=,23,Extremely simple migration system,9599,simonw,closed,0,,,,,2,2019-10-17T02:13:57Z,2019-10-17T16:57:17Z,2019-10-17T16:57:17Z,MEMBER,,"Needed for #12. This is going to be an incredibly simple version of the Django migration system. * A `migrations` table, keeping track of which migrations were applied (and when) * A `migrate()` function which applies any pending migrations * A `MIGRATIONS` constant which is a list of functions to be applied The function names will be detected and used as the names of the migrations. Every time you run the CLI tool it will call the `migrate()` function before doing anything else. Needs to take into account that there might be no tables at all. 
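A sketch of the whole mechanism (the example migration itself is invented):

```python
import datetime
import sqlite_utils

def add_source_column(db):
    # The function's own name becomes the migration's recorded name.
    if 'tweets' in db.table_names():  # table may not exist on a fresh database
        db['tweets'].add_column('source', str)

MIGRATIONS = [add_source_column]

def migrate(db):
    if 'migrations' not in db.table_names():
        db['migrations'].create({'name': str, 'applied': str}, pk='name')
    applied = {row['name'] for row in db['migrations'].rows}
    for fn in MIGRATIONS:
        if fn.__name__ not in applied:
            fn(db)
            db['migrations'].insert({
                'name': fn.__name__,
                'applied': datetime.datetime.utcnow().isoformat(),
            })

migrate(sqlite_utils.Database('twitter.db'))
```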
As such, migration functions should sanity check that the tables they are going to work on actually exist.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 508553387,MDExOlB1bGxSZXF1ZXN0MzI5MzI0MzY4,24,Tweet source extraction and new migration system,9599,simonw,closed,0,,,,,0,2019-10-17T15:24:56Z,2019-10-17T15:49:29Z,2019-10-17T15:49:24Z,MEMBER,dogsheep/twitter-to-sqlite/pulls/24,Closes #12 and #23,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 508578780,MDU6SXNzdWU1MDg1Nzg3ODA=,25,Ensure migrations don't accidentally create foreign key twice,9599,simonw,closed,0,,,,,2,2019-10-17T16:08:50Z,2019-10-17T16:56:47Z,2019-10-17T16:56:47Z,MEMBER,,"Is it possible for these lines to run against a database table that already has these foreign keys? https://github.com/dogsheep/twitter-to-sqlite/blob/c9295233f219c446fa2085cace987067488a31b9/twitter_to_sqlite/migrations.py#L21-L22",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 509267608,MDExOlB1bGxSZXF1ZXN0MzI5ODkwMzIw,599,Fix for /foo v.s. /foo-bar issue in #597,9599,simonw,closed,0,,,,,0,2019-10-18T19:22:55Z,2019-10-18T22:51:07Z,2019-10-18T22:51:07Z,OWNER,simonw/datasette/pulls/599,Refs #597,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/599/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 509339999,MDU6SXNzdWU1MDkzMzk5OTk=,600,Don't auto-format SQL on first page load,9599,simonw,closed,0,,,,,0,2019-10-18T22:36:10Z,2019-10-18T23:56:46Z,2019-10-18T23:56:46Z,OWNER,,"I've gone back and forth on this a bit, but I've decided I'm not keen on the way Datasette now automatically formats SQL when a query (or canned query) page first loads. 
I like having an optional ""Format SQL"" button, but applying formatting automatically means that if the user has carefully formatted their SQL to a specific style, their formatting will be automatically overridden.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/600/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 509340359,MDExOlB1bGxSZXF1ZXN0MzI5OTQ3MTgw,601,Don't auto-format SQL on page load,9599,simonw,closed,0,,,,,5,2019-10-18T22:37:39Z,2019-10-20T02:29:49Z,2019-10-18T23:56:45Z,OWNER,simonw/datasette/pulls/601,Refs #600,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/601/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 509535510,MDExOlB1bGxSZXF1ZXN0MzMwMDc2MjYz,602,Offer to format readonly SQL,2657547,rixx,closed,0,,,,,3,2019-10-20T02:29:32Z,2019-11-04T07:29:33Z,2019-11-04T02:39:56Z,CONTRIBUTOR,simonw/datasette/pulls/602,"Following discussion in #601, this PR adds a ""Format SQL"" button to read-only SQL (if the SQL actually differs from the formatting result). It also removes a console error on readonly SQL queries.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/602/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 509612217,MDExOlB1bGxSZXF1ZXN0MzMwMTI5MzU4,603,always pop as_format off args dict,6025893,chris48s,closed,0,,,,,2,2019-10-20T15:44:22Z,2019-10-30T19:12:22Z,2019-10-21T02:03:09Z,CONTRIBUTOR,simonw/datasette/pulls/603,closes #563,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/603/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 509693773,MDU6SXNzdWU1MDk2OTM3NzM=,604,_where= parameter is not persisted in hidden form fields,9599,simonw,closed,0,,,,,3,2019-10-21T02:14:10Z,2019-10-30T19:12:38Z,2019-10-30T18:49:44Z,OWNER,,"e.g. on this page: https://v0-30.datasette.io/fixtures/roadside_attractions?_where=name%20like%20%27%museum%%27 Click the ""Apply"" button and the `_where=` parameter will be dropped.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/604/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 510076368,MDU6SXNzdWU1MTAwNzYzNjg=,605,Support queries at the table level,12617395,bsilverm,open,0,,,,,2,2019-10-21T15:58:30Z,2019-10-30T18:55:37Z,,NONE,,"Per the issue described in [issue #588](https://github.com/simonw/datasette/issues/588), it was determined queries are not supported at the table level. 
Per my last comment in the issue, I'd like to request support for this as it would help eliminate errors in the event certain tables are not present in the database.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/605/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 512218858,MDU6SXNzdWU1MTIyMTg4NTg=,606,/-/plugins shows incorrect name for plugins,9599,simonw,closed,0,,,,,3,2019-10-24T22:53:25Z,2019-11-01T05:41:04Z,2019-11-01T05:40:07Z,OWNER,,"https://fivethirtyeight.datasettes.com/-/plugins ```json [ { ""name"": ""datasette_jellyfish"", ""static"": false, ""templates"": false, ""version"": ""0.3"" }, { ""name"": ""datasette_vega"", ""static"": true, ""templates"": false, ""version"": ""0.6.2"" } ] ``` These should be shown as `datasette-jellyfish` and `datasette-vega` since those are the names on PyPI.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/606/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 512996469,MDU6SXNzdWU1MTI5OTY0Njk=,607,Ways to improve fuzzy search speed on larger data sets?,8431341,zeluspudding,closed,0,,,,,6,2019-10-27T17:31:37Z,2019-11-07T03:38:10Z,2019-11-07T03:38:10Z,NONE,,"I have an SQLite table with 16 million rows in it. Having read @simonw's article ""[Fast Autocomplete Search for Your Website](https://24ways.org/2018/fast-autocomplete-search-for-your-website/)"" I was curious to try Datasette to see what kind of query performance I could get out of it. In truth I don't need to do full-text search, since all I would like to do is give my users a way to search for the names of investors such as ""Warren Buffet"" or ""Tim Cook"" (whose names are in a single column). On the first search, Datasette takes over 20 seconds to return all records associated with `elon musk`: > ![image](https://user-images.githubusercontent.com/8431341/67638889-a86e1100-f8b7-11e9-9f7e-a9d13a42e988.png) > ![image](https://user-images.githubusercontent.com/8431341/67638825-ed457800-f8b6-11e9-94d1-b44f1a40ee8c.png) If I rerun the same search, it then takes almost 9 seconds: > ![image](https://user-images.githubusercontent.com/8431341/67638908-e4a17180-f8b7-11e9-9d00-748c80ef1f21.png) That's far too slow to implement an autocomplete feature. I could reduce the latency by making a special table of only unique investor names, thereby reducing the search space to less than a million rows (then I'd need to implement a way to add only new investor names to the table as I received new data... about 4,000 rows a day). If I did that, I'm still concerned the new table wouldn't be lean enough to look up investor names quickly. Plus, even if I can implement the autocomplete feature, I would still have to look up records for that investor, which would take between 8 and 20 seconds. Are there any tricks for speeding this up?
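To make the special-table idea above concrete, here is a sketch using only the standard library (the table and column names are invented):

```python
# Build a small table of distinct investor names so autocomplete
# queries scan under a million rows instead of 16 million.
import sqlite3

conn = sqlite3.connect('investors.db')
conn.executescript('''
    create table if not exists investor_names (name text primary key);
    insert or ignore into investor_names (name)
        select distinct investor from filings;
''')
conn.commit()

# Prefix searches can then use the primary key index (note that LIKE
# only uses an index when the comparison is case-sensitive):
rows = conn.execute(
    'select name from investor_names where name like ? limit 10',
    ['Warren%']
).fetchall()
print(rows)
```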
Here's my hardware: > ![image](https://user-images.githubusercontent.com/8431341/67638861-55945980-f8b7-11e9-96a8-ca76c7c68c5d.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/607/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 513008936,MDU6SXNzdWU1MTMwMDg5MzY=,608,"Improve UI of ""datasette publish cloudrun"" to reduce chances of accidentally over-writing a service",9599,simonw,closed,0,,,,,6,2019-10-27T19:21:28Z,2019-11-08T02:51:36Z,2019-11-08T02:48:46Z,OWNER,,"The concept of a ""service"" in Cloud Run is crucial: if you deploy to the same service, you will over-write what you deployed there last! As such, I'd like to make service a required positional argument for `publish cloudrun`: datasette publish cloudrun my-service one.db two.db three.db ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/608/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 513074501,MDU6SXNzdWU1MTMwNzQ1MDE=,26,Command for importing mentions timeline,9599,simonw,closed,0,,,,,1,2019-10-28T03:14:27Z,2019-10-30T02:36:13Z,2019-10-30T02:20:47Z,MEMBER,,"https://developer.twitter.com/en/docs/tweets/timelines/api-reference/get-statuses-mentions_timeline Almost identical to home-timeline #18 but it uses `https://api.twitter.com/1.1/statuses/mentions_timeline.json` instead.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/26/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 514459062,MDU6SXNzdWU1MTQ0NTkwNjI=,27,retweets-of-me command,9599,simonw,closed,0,,,,,4,2019-10-30T07:43:01Z,2019-11-03T01:12:58Z,2019-11-03T01:12:58Z,MEMBER,,https://developer.twitter.com/en/docs/tweets/post-and-engage/api-reference/get-statuses-retweets_of_me,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 514899195,MDExOlB1bGxSZXF1ZXN0MzM0NDQ4MjU4,609,Update to latest black,9599,simonw,closed,0,,,,,0,2019-10-30T18:42:35Z,2019-10-30T18:49:01Z,2019-10-30T18:49:01Z,OWNER,simonw/datasette/pulls/609,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/609/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 515658861,MDU6SXNzdWU1MTU2NTg4NjE=,28,Add indexes to followers table,9599,simonw,closed,0,,,,,1,2019-10-31T18:40:22Z,2019-11-09T20:15:42Z,2019-11-09T20:11:48Z,MEMBER,,`select follower_id from following where followed_id = 12497` takes over a second for me at the moment.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516310670,MDU6SXNzdWU1MTYzMTA2NzA=,610,Don't suggest array facet if column is only [] empty arrays,9599,simonw,closed,0,,,,,0,2019-11-01T19:42:02Z,2019-11-01T21:46:08Z,2019-11-01T21:46:08Z,OWNER,,Follow 
on from #562",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/610/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516370822,MDU6SXNzdWU1MTYzNzA4MjI=,611,Static assets no longer loading for installed plugins,9599,simonw,closed,0,,,,,3,2019-11-01T22:07:00Z,2019-11-01T22:15:55Z,2019-11-01T22:15:55Z,OWNER,,"Caused by a fix I made in #606 e.g. `/-/static-plugins/datasette_leaflet_geojson/datasette-leaflet-geojson.js` is a 404, but `/-/static-plugins/datasette-leaflet-geojson/datasette-leaflet-geojson.js` works correctly.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/611/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516748849,MDU6SXNzdWU1MTY3NDg4NDk=,612,CSV export is broken for tables with null foreign keys,9599,simonw,closed,0,,,,,2,2019-11-02T22:52:47Z,2021-06-17T18:13:20Z,2019-11-02T23:12:53Z,OWNER,,"Following on from #406 - this CSV export appears to be broken: https://14da705.datasette.io/fixtures/foreign_key_references.csv?_labels=on&_size=max ```csv pk,foreign_key_with_label,foreign_key_with_label_label,foreign_key_with_no_label,foreign_key_with_no_label_label 1,1,hello,1,1 2,, ``` That second row should have 5 values, but it only has 4.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/612/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516763727,MDExOlB1bGxSZXF1ZXN0MzM1OTgwMjQ2,8,"stargazers command, refs #4",9599,simonw,closed,0,,,,,5,2019-11-03T00:37:36Z,2020-05-02T20:00:27Z,2020-05-02T20:00:26Z,MEMBER,dogsheep/github-to-sqlite/pulls/8,Needs tests. Refs #4.,207052882,github-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 516769276,MDU6SXNzdWU1MTY3NjkyNzY=,9,Commands do not work without an auth.json file,9599,simonw,closed,0,,,,,0,2019-11-03T01:54:28Z,2019-11-11T05:30:48Z,2019-11-11T05:30:48Z,MEMBER,,"`auth.json` is meant to be optional. If it's not provided, the tool should make heavily rate-limited unauthenticated requests. ``` $ github-to-sqlite repos .data/repos.db simonw Usage: github-to-sqlite repos [OPTIONS] DB_PATH [USERNAME] Try ""github-to-sqlite repos --help"" for help. Error: Invalid value for ""-a"" / ""--auth"": File ""auth.json"" does not exist. ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516874735,MDU6SXNzdWU1MTY4NzQ3MzU=,613,Basic join support for table view,9599,simonw,open,0,,,,,1,2019-11-03T19:12:53Z,2019-11-03T19:14:01Z,,OWNER,,"I think it would be possible to support basic foreign key joins on the table page. The user could specify columns that should result in a join (from a set of suggestions, similar to how facets work right now) and they could then be passed as `?_join=city_id` arguments.
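To illustrate, a hypothetical `?_join=city_id` parameter might expand to SQL along these lines (a sketch - the table, column and label names are invented):

```python
# Build the LEFT JOIN that a ?_join=city_id parameter could generate.
# LEFT JOIN (rather than INNER) keeps rows whose foreign key is null.
def build_join_sql(table, column, other_table, label_column):
    return (
        'select {t}.*, {o}.{label} as {c}_label '
        'from {t} left join {o} on {t}.{c} = {o}.id'
    ).format(t=table, o=other_table, c=column, label=label_column)

print(build_join_sql('facetable', 'city_id', 'facet_cities', 'name'))
```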
This feature will make a lot of sense when combined with the ability to show / hide / customize columns, see #292",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/613/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 516950748,MDU6SXNzdWU1MTY5NTA3NDg=,614,"Add ""not in"" filter - ?pk__notin=x,y,z",9599,simonw,closed,0,,,,,1,2019-11-04T04:07:17Z,2019-11-04T04:31:58Z,2019-11-04T04:12:00Z,OWNER,,"We have a `__in` filter at the moment: https://latest.datasette.io/fixtures/facetable?pk__in=1,2,3 Today I found myself needing the inverse, a `?pk__notin=` filter, which isn't currently supported.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/614/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 516967682,MDU6SXNzdWU1MTY5Njc2ODI=,10,Add this repos_starred view,9599,simonw,closed,0,,,,,3,2019-11-04T05:44:38Z,2020-05-02T16:37:36Z,2020-05-02T16:37:36Z,MEMBER,,"```sql create view repos_starred as select stars.starred_at, users.login, repos.* from repos join stars on repos.id = stars.repo join users on repos.owner = users.id order by starred_at desc; ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 517241040,MDU6SXNzdWU1MTcyNDEwNDA=,63,ensure_index() method,9599,simonw,closed,0,,,,,1,2019-11-04T15:51:22Z,2019-11-04T16:20:36Z,2019-11-04T16:20:35Z,OWNER,,"```python db[""table""].ensure_index([""col1"", ""col2""]) ``` This will do the following: - if the specified table or column does not exist, do nothing - if they exist and already have an index, do nothing - otherwise, create the index I want this for tools like [twitter-to-sqlite search](https://github.com/dogsheep/twitter-to-sqlite/blob/801c0c2daf17d8abce9dcb5d8d610410e7e25dbe/README.md#running-searches) where the `search_runs` table may or may not have been created yet but, if it IS created, I want to put an index on the `hash` column.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/63/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 517451234,MDU6SXNzdWU1MTc0NTEyMzQ=,615,?_col= and ?_nocol= support for toggling columns on table view,9599,simonw,closed,0,,,,,16,2019-11-04T22:55:41Z,2021-05-27T04:26:10Z,2021-05-27T04:17:44Z,OWNER,,Split off from #292 (I guess this is a re-opening of #312).,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/615/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 518506242,MDU6SXNzdWU1MTg1MDYyNDI=,616,Datasette FTS detection bug,49656826,null92,closed,0,,,,,2,2019-11-06T14:25:47Z,2019-11-08T15:31:33Z,2019-11-08T02:06:56Z,NONE,,"I'm having trouble with Datasette. I deployed EXACTLY the same project on two different apps on Heroku. Both have databases (not all of them) with FTS activated, but only one app detects it and works fine.
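For anyone trying to reproduce this, one way to check whether a database actually has FTS configured is to look for FTS virtual tables in `sqlite_master` - a sketch using only the standard library (the database filename is made up):

```python
# List tables whose schema declares an FTS virtual table.
import sqlite3

conn = sqlite3.connect('amazonia_protege.db')
for name, sql in conn.execute(
    'select name, sql from sqlite_master where type = ?', ['table']
):
    if sql and 'USING FTS' in sql.upper():
        print('FTS table:', name)
```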
You can take a look here: With search: http://teste-templates.herokuapp.com/amazonia_protege/car Without search: http://bases.vortex.media/amazonia_protege/car ![teste](https://user-images.githubusercontent.com/49656826/68306310-11a80e00-0088-11ea-8d1c-db3bd3375518.jpg) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/616/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 518725064,MDU6SXNzdWU1MTg3MjUwNjQ=,29,`import` command fails on empty files,21148,jacobian,closed,0,,,,,4,2019-11-06T20:34:26Z,2019-11-09T20:33:38Z,2019-11-09T19:36:36Z,CONTRIBUTOR,,"If a file in the export is empty (in my case it was `account-suspensions.js`), `twitter-to-sqlite import` fails: ``` $ twitter-to-sqlite import twitter.db ~/Downloads/twitter-2019-11-06-926f4f3be4b3b1fcb1aa387c40cd14f7c8aaf9bbcdb2d78ac14d9989add501bb.zip Traceback (most recent call last): File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/bin/twitter-to-sqlite"", line 10, in sys.exit(cli()) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py"", line 627, in import_ archive.import_from_file(db, filename, content) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/archive.py"", line 224, in import_from_file db[table_name].upsert_all(rows, hash_id=""pk"") File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1113, in upsert_all extracts=extracts, File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/sqlite_utils/db.py"", line 980, in insert_all first_record = next(records) StopIteration ``` This appears to be because `db.upsert_all` is called with no rows -- I think? I hacked around this by modifying `import_from_file` to have an `if rows:` clause: ``` for table, rows in to_insert.items(): if rows: table_name = ""archive_{}"".format(table.replace(""-"", ""_"")) ... 
``` I'm happy to work up a real PR if that's the right approach, but I'm not sure it is.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 518739697,MDU6SXNzdWU1MTg3Mzk2OTc=,30,`followers` fails because `transform_user` is called twice,21148,jacobian,closed,0,,,,,2,2019-11-06T20:44:52Z,2019-11-09T20:15:28Z,2019-11-09T19:55:52Z,CONTRIBUTOR,,"Trying to run `twitter-to-sqlite followers` errors out: ``` Traceback (most recent call last): File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/bin/twitter-to-sqlite"", line 10, in sys.exit(cli()) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py"", line 130, in followers go(bar.update) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/cli.py"", line 116, in go utils.save_users(db, [profile]) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/utils.py"", line 302, in save_users transform_user(user) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/twitter_to_sqlite/utils.py"", line 181, in transform_user user[""created_at""] = parser.parse(user[""created_at""]) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 1374, in parse return DEFAULTPARSER.parse(timestr, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 646, in parse res, skipped_tokens = self._parse(timestr, **kwargs) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 725, in _parse l = _timelex.split(timestr) # Splits the timestr into tokens File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 207, in split return list(cls(s)) File ""/Users/jacob/Library/Caches/pypoetry/virtualenvs/jkm-dogsheep-ezLnyXZS-py3.7/lib/python3.7/site-packages/dateutil/parser/_parser.py"", line 76, in __init__ 
'{itype}'.format(itype=instream.__class__.__name__)) TypeError: Parser must be a string or character stream, not datetime ``` This appears to be because https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L111 calls `transform_user`, and then https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L116 calls `transform_user` again, which fails because the user is already transformed. I was able to work around this by commenting out https://github.com/dogsheep/twitter-to-sqlite/blob/master/twitter_to_sqlite/cli.py#L116. Shall I work up a patch for that, or is there a better approach?",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 519032008,MDExOlB1bGxSZXF1ZXN0MzM3ODQ3NTcz,64,test_insert_upsert_all_empty_list,9599,simonw,closed,0,,,,,0,2019-11-07T04:24:45Z,2019-11-07T04:32:38Z,2019-11-07T04:32:38Z,OWNER,simonw/sqlite-utils/pulls/64,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/64/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 519038979,MDU6SXNzdWU1MTkwMzg5Nzk=,10,Failed to import workout points,9599,simonw,closed,0,,,,,4,2019-11-07T04:50:22Z,2019-11-08T01:18:37Z,2019-11-08T01:18:37Z,MEMBER,,"I just ran the script and it failed to import any `workout_points`, though it did import `workouts`.",197882382,healthkit-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/healthkit-to-sqlite/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 519039316,MDExOlB1bGxSZXF1ZXN0MzM3ODUzMzk0,65,Release 1.12.1,9599,simonw,closed,0,,,,,0,2019-11-07T04:51:29Z,2019-11-07T04:58:48Z,2019-11-07T04:58:47Z,OWNER,simonw/sqlite-utils/pulls/65,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/65/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 519613116,MDU6SXNzdWU1MTk2MTMxMTY=,617,Refactor TableView.data() method,9599,simonw,closed,0,,,,,9,2019-11-08T01:55:41Z,2021-12-18T01:41:47Z,2021-12-11T19:17:11Z,OWNER,,"This is by far the most complex piece of Datasette - the `TableView.data()` method is over 500 lines long and is increasingly getting in the way of cleanly implementing new features (e.g. #615 and #613). 
Need to break it up into smaller, cleaner pieces.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/617/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 519979091,MDExOlB1bGxSZXF1ZXN0MzM4NjQ3Mzc4,1,Add parkrun-to-sqlite,1101318,mrw34,closed,0,,,,,0,2019-11-08T12:05:32Z,2020-10-12T00:35:16Z,2020-10-12T00:35:16Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/1,,214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 520507306,MDU6SXNzdWU1MjA1MDczMDY=,618,Mechanism for seeing indexes on a specific table,9599,simonw,closed,0,,,,,2,2019-11-09T20:10:41Z,2019-11-10T01:40:05Z,2019-11-10T01:30:25Z,OWNER,,"The only way to see the indexes that apply to a specific table at the moment is to run the following SQL manually: ```sql select * from sqlite_master where type = 'index' and tbl_name=? ``` For example: It would be good if this list of indexes was displayed in a neater way on the table page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/618/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520508502,MDU6SXNzdWU1MjA1MDg1MDI=,31,"""friends"" command (similar to ""followers"")",9599,simonw,closed,0,,,,,2,2019-11-09T20:20:20Z,2022-09-20T05:05:03Z,2020-02-07T07:03:28Z,MEMBER,,"Current list of commands: ``` followers Save followers for specified user (defaults to... 
followers-ids Populate followers table with IDs of account followers friends-ids Populate followers table with IDs of account friends ``` Obvious omission here is `friends`, which would be powered by `https://api.twitter.com/1.1/friends/list.json`: https://developer.twitter.com/en/docs/accounts-and-users/follow-search-get-users/api-reference/get-friends-list",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520521843,MDU6SXNzdWU1MjA1MjE4NDM=,11,Command to fetch releases,9599,simonw,closed,0,,,,,0,2019-11-09T22:23:30Z,2019-11-09T22:57:00Z,2019-11-09T22:57:00Z,MEMBER,,"https://developer.github.com/v3/repos/releases/#list-releases-for-a-repository `GET /repos/:owner/:repo/releases`",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520655983,MDU6SXNzdWU1MjA2NTU5ODM=,619,"""Invalid SQL"" page should let you edit the SQL",9599,simonw,closed,0,,,,,14,2019-11-10T20:54:12Z,2022-01-13T22:21:42Z,2021-06-02T04:15:54Z,OWNER,,"https://latest.datasette.io/fixtures?sql=select%0D%0A++*%0D%0Afrom%0D%0A++%5Bfoo%5D Would be useful if this page showed you the invalid SQL you entered so you can edit it and try again.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/619/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520667773,MDU6SXNzdWU1MjA2Njc3NzM=,620,Mechanism for indicating foreign key relationships in the table and query page URLs,9599,simonw,open,0,,,,,6,2019-11-10T22:26:27Z,2021-04-05T03:57:22Z,,OWNER,,"Datasette currently only inflates foreign keys (into name hyperlinks) if it detects them as foreign key constraints in the underlying database. It would be useful if you could specify additional ""foreign keys"" using both `metadata.json` and the querystring - similar to how you can pass `?_fts_table=x` https://datasette.readthedocs.io/en/stable/full_text_search.html#configuring-full-text-search-for-a-table-or-view",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/620/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 1}",, 520681725,MDU6SXNzdWU1MjA2ODE3MjU=,621,Syntax for ?_through= that works as a form field,9599,simonw,open,0,,,,,7,2019-11-11T00:19:03Z,2021-12-18T01:42:33Z,,OWNER,,"The current syntax for `?_through=` uses JSON to avoid any risk of confusion with table or column names that contain special characters. This means you can't target a form field at it.
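For reference, producing the current JSON-based syntax by hand looks something like this (the table and column names are borrowed from the fixtures examples):

```python
# URL-encode the JSON payload that ?_through= currently expects.
import json
import urllib.parse

through = {
    'table': 'roadside_attraction_characteristics',
    'column': 'characteristic_id',
    'value': '1',
}
print('?_through=' + urllib.parse.quote(json.dumps(through)))
```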
We should be able to support both - `?x.y.z=value` for tables and columns with ""regular"" names, falling back to the current JSON syntax for columns or tables that won't work with the key/value syntax.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/621/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 520715188,MDU6SXNzdWU1MjA3MTUxODg=,622,Datasette should work with Python 3.8 (and drop compatibility with Python 3.5),9599,simonw,closed,0,,,,,4,2019-11-11T03:12:36Z,2019-11-12T05:52:49Z,2019-11-12T05:09:13Z,OWNER,,"See #595, #594, #404. The big thing holding me back from ditching Python 3.5 was glitch.com - but they now offer Python 3.7: https://support.glitch.com/t/can-you-upgrade-python-to-latest-version/7980/25?u=simonw",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/622/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520718056,MDExOlB1bGxSZXF1ZXN0MzM5MjM2NjQ3,623,Test against Python 3.8 in Travis,9599,simonw,closed,0,,,,,2,2019-11-11T03:24:54Z,2019-11-11T03:45:35Z,2019-11-11T03:45:35Z,OWNER,simonw/datasette/pulls/623,Needed for #622,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/623/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 520728483,MDExOlB1bGxSZXF1ZXN0MzM5MjQ0ODg4,624,Bump pint to 0.9,9599,simonw,closed,0,,,,,0,2019-11-11T04:07:07Z,2019-11-11T04:19:02Z,2019-11-11T04:19:02Z,OWNER,simonw/datasette/pulls/624,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/624/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 520740741,MDU6SXNzdWU1MjA3NDA3NDE=,625,If you apply ?_facet_array=tags then &_facet=tags does nothing,9599,simonw,closed,0,,,7571612,Datasette 0.60,13,2019-11-11T04:59:29Z,2022-01-13T22:26:58Z,2021-12-16T20:12:22Z,OWNER,,"Start here: https://v0-30-2.datasette.io/fixtures/facetable?_facet_array=tags Note that `tags` is offered as a suggested facet. But if you click that you get this: https://v0-30-2.datasette.io/fixtures/facetable?_facet_array=tags&_facet=tags The `_facet=tags` is added to the URL and it's removed from the list of suggested tags... 
but the facet itself is not displayed: The `_facet=tags` facet should look like this: ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/625/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 520756546,MDU6SXNzdWU1MjA3NTY1NDY=,12,Add this view for seeing new releases,9599,simonw,closed,0,,,,,5,2019-11-11T06:00:12Z,2020-05-02T18:58:18Z,2020-05-02T18:58:17Z,MEMBER,,"```sql CREATE VIEW recent_releases AS select json_object(""label"", repos.full_name, ""href"", repos.html_url) as repo, json_object( ""href"", releases.html_url, ""label"", releases.name ) as release, substr(releases.published_at, 0, 11) as date, releases.body as body_markdown, releases.published_at from releases join repos on repos.id = releases.repo order by releases.published_at desc ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 521275281,MDU6SXNzdWU1MjEyNzUyODE=,13,Set up a live demo Datasette instance,9599,simonw,closed,0,,,5225818,1.0,9,2019-11-12T01:27:02Z,2020-03-24T00:03:26Z,2020-03-24T00:03:25Z,MEMBER,,"I deployed https://github-to-sqlite-releases-j7hipcg4aq-uc.a.run.app/ by running this: ``` #!/bin/bash # Fetch repos for simonw and dogsheep github-to-sqlite repos github.db simonw dogsheep -a auth.json # Fetch releases for the repos tagged 'datasette-io' sqlite-utils github.db "" select full_name from repos where rowid in ( select repos.rowid from repos, json_each(repos.topics) j where j.value = 'datasette-io' )"" --csv --no-headers | while read repo; do github-to-sqlite releases \ github.db $(echo $repo | tr -d '\r') \ -a auth.json; sleep 2; done; ``` And then deploying using this: ``` $ datasette publish cloudrun github.db \ --title ""github-to-sqlite releases demo"" \ --about_url=""https://github.com/simonw/github-to-sqlite"" \ --about='github-to-sqlite' \ --install=datasette-render-markdown \ --install=datasette-json-html \ --service=github-to-sqlite-releases ``` This should happen automatically for every release. I can run it once a day in Circle CI to keep the demo database up-to-date.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 521282013,MDU6SXNzdWU1MjEyODIwMTM=,626,Unit tests should fail under Python 3.8,9599,simonw,closed,0,,,,,1,2019-11-12T01:54:11Z,2019-11-12T04:31:26Z,2019-11-12T04:31:13Z,OWNER,,"The unit tests currently pass under Python 3.8. But... when you actually attempt to run Datasette you get an error: ``` ~/Dropbox/Development/datasette $ venv-py3.8.0/bin/datasette --memory -p 8855 Serve! 
files=() (immutables=()) on port 8855 Traceback (most recent call last): File ""venv-py3.8.0/bin/datasette"", line 11, in load_entry_point('datasette', 'console_scripts', 'datasette')() File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/simonw/Dropbox/Development/datasette/datasette/cli.py"", line 365, in serve uvicorn.run(ds.app(), host=host, port=port, log_level=""info"") File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/uvicorn/main.py"", line 279, in run server.run() File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/uvicorn/main.py"", line 305, in run self.config.setup_event_loop() File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/uvicorn/config.py"", line 218, in setup_event_loop loop_setup() File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/uvicorn/loops/auto.py"", line 3, in auto_loop_setup import uvloop File ""/Users/simonw/Dropbox/Development/datasette/venv-py3.8.0/lib/python3.8/site-packages/uvloop/__init__.py"", line 7, in from .loop import Loop as __BaseLoop # NOQA File ""uvloop/includes/stdlib.pxi"", line 114, in init uvloop.loop AttributeError: module 'sys' has no attribute 'set_coroutine_wrapper' ~/Dropbox/Development/datasette $ ``` If Datasette doesn't work under Python 3.8 the tests should fail.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/626/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 521323012,MDExOlB1bGxSZXF1ZXN0MzM5NzIyNzkw,627,"Support Python 3.8, stop supporting Python 3.5",9599,simonw,closed,0,,,,,2,2019-11-12T04:36:33Z,2020-04-05T10:23:58Z,2019-11-12T05:09:12Z,OWNER,simonw/datasette/pulls/627,Refs #622,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/627/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 521329771,MDU6SXNzdWU1MjEzMjk3NzE=,628,Render jinja2 templates in async mode,9599,simonw,closed,0,,,,,2,2019-11-12T05:01:55Z,2019-11-14T23:28:09Z,2019-11-14T23:14:24Z,OWNER,,"I started playing with this in #404 and got good results but it didn't work in Python 3.5. As of #627 I don't support 3.5 any more so this can go ahead. Rendering templates in async mode will mean that template plugins can include async code... 
which opens the door to custom template functions that execute SQL queries!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/628/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 521335335,MDU6SXNzdWU1MjEzMzUzMzU=,629,"""datasette publish"" commands should deploy with Python 3.8",9599,simonw,closed,0,,,,,1,2019-11-12T05:22:31Z,2019-11-12T06:03:10Z,2019-11-12T06:03:10Z,OWNER,,Now that we support 3.8 (#627) `datasette publish` should always deploy using Python 3.8.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/629/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 521346800,MDExOlB1bGxSZXF1ZXN0MzM5NzQyNDMy,630,Use python:3.8 base Docker image,9599,simonw,closed,0,,,,,0,2019-11-12T06:02:37Z,2019-11-12T06:03:10Z,2019-11-12T06:03:10Z,OWNER,simonw/datasette/pulls/630,Closes #629,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/630/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 521868864,MDU6SXNzdWU1MjE4Njg4NjQ=,66,"The "".upsert()"" method is misnamed",9599,simonw,closed,0,,,,,15,2019-11-12T23:48:28Z,2019-12-31T01:30:21Z,2019-12-31T01:30:20Z,OWNER,,"This thread here is illuminating: https://stackoverflow.com/questions/3634984/insert-if-not-exists-else-update The term `UPSERT` in SQLite has a specific meaning as-of 3.24.0 (2018-06-04): https://www.sqlite.org/lang_UPSERT.html It means ""behave as an UPDATE or a no-op if the INSERT would violate a uniqueness constraint"". The syntax in 3.24.0+ looks like this (confusingly it does not use the term ""upsert""): ```sql INSERT INTO phonebook(name,phonenumber) VALUES('Alice','704-555-1212') ON CONFLICT(name) DO UPDATE SET phonenumber=excluded.phonenumber ``` Here's the problem: the `sqlite-utils` `.upsert()` and `.upsert_all()` methods don't do this. They use the following SQL: ```sql INSERT OR REPLACE INTO [{table}] ({columns}) VALUES {rows}; ``` If the record already exists, it will be entirely replaced by a new record - as opposed to updating any specified fields but leaving existing fields as they are (the behaviour of ""upsert"" in SQLite itself).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/66/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 521923131,MDExOlB1bGxSZXF1ZXN0MzQwMjExMTQ5,631,bugfix issue 572,3683993,qwo,closed,0,,,,,1,2019-11-13T02:46:50Z,2019-11-13T04:28:43Z,2019-11-13T04:28:42Z,CONTRIBUTOR,simonw/datasette/pulls/631,closes bugfix issue #572 ,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/631/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 521995039,MDU6SXNzdWU1MjE5OTUwMzk=,632,Upgrade datasette publish Heroku runtime,9599,simonw,closed,0,,,,,2,2019-11-13T06:46:19Z,2019-11-13T16:44:07Z,2019-11-13T16:43:23Z,OWNER,,"``` Python has released a security update! 
Please consider upgrading to python-3.6.9 ``` https://devcenter.heroku.com/articles/python-support#supported-runtimes shows 3.8.0 is now supported.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/632/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 522334771,MDU6SXNzdWU1MjIzMzQ3NzE=,633,"Publish to Heroku is broken: ""WARNING: You must pass the application as an import string to enable 'reload' or 'workers""",9599,simonw,closed,0,,,,,3,2019-11-13T16:32:11Z,2020-04-28T20:37:50Z,2019-11-13T16:43:23Z,OWNER,,"``` 2019-11-13T16:27:59.821483+00:00 heroku[web.1]: Starting process with command `datasette serve --host 0.0.0.0 -i fixtures.db --cors --port 36817 --inspect-file inspect-data.json` 2019-11-13T16:28:01.856471+00:00 heroku[web.1]: State changed from starting to crashed 2019-11-13T16:28:01.750253+00:00 app[web.1]: Serve! files=() (immutables=('fixtures.db',)) on port 36817 2019-11-13T16:28:01.771524+00:00 app[web.1]: WARNING: You must pass the application as an import string to enable 'reload' or 'workers'. 2019-11-13T16:28:01.837839+00:00 heroku[web.1]: Process exited with status 1 ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/633/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 522352520,MDU6SXNzdWU1MjIzNTI1MjA=,634,Don't run tests twice when releasing a tag,9599,simonw,closed,0,,,,,2,2019-11-13T17:02:42Z,2020-09-15T20:37:58Z,2020-09-15T20:37:58Z,OWNER,,"Shipping a release currently runs the tests twice: https://travis-ci.org/simonw/datasette/builds/611463728 It does a regular test run on Python 3.6/7/8 - then the ""Release tagged version"" step runs the tests again before publishing to PyPI! This second run is not necessary.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/634/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 522566332,MDExOlB1bGxSZXF1ZXN0MzQwNzQzMjIw,635,Use Jinja async mode,9599,simonw,closed,0,,,,,0,2019-11-14T01:20:57Z,2019-11-14T23:14:23Z,2019-11-14T23:14:23Z,OWNER,simonw/datasette/pulls/635,Refs #628. Still needs documentation.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/635/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 525254973,MDU6SXNzdWU1MjUyNTQ5NzM=,636,rowid is not included in dropdown filter menus,9599,simonw,closed,0,,,,,3,2019-11-19T20:43:04Z,2019-11-19T23:01:17Z,2019-11-19T23:01:17Z,OWNER,,"For `rowid` tables the `rowid` column isn't shown in the list of filter options: This also means if you link to e.g. 
`?rowid__gt=1060124` the resulting filter interface will be slightly broken: clicking the ""apply"" button again will lose your filter, for example.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/636/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 525993034,MDU6SXNzdWU1MjU5OTMwMzQ=,637,"Custom queries with 0 results should say ""0 results""",9599,simonw,closed,0,,,,,3,2019-11-20T18:28:14Z,2019-11-23T06:17:23Z,2019-11-23T06:07:08Z,OWNER,,"Consider https://latest.datasette.io/fixtures/neighborhood_search?text=foop It's currently not obvious that the query executed and returned 0 results.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/637/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 526913133,MDU6SXNzdWU1MjY5MTMxMzM=,638,Don't suggest column for faceting if all values are 1,9599,simonw,closed,0,,,,,3,2019-11-22T00:14:22Z,2019-11-22T01:14:59Z,2019-11-22T00:57:49Z,OWNER,,"https://www.niche-museums.com/museums/museums?_facet=wikipedia_url Challenge is how to do this efficiently, since suggested facet queries need to be lightning fast.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/638/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 527670799,MDU6SXNzdWU1Mjc2NzA3OTk=,639,updating metadata.json without recreating the app,172847,pkoppstein,open,0,,,,,6,2019-11-24T09:19:53Z,2019-11-30T06:08:50Z,,NONE,,"I've successfully ""uploaded"" an SQLite database (with a metadata.json file) to heroku using: $ datasette publish heroku so-sales.db -m metadata.json -n so-sales The question is: how can I modify the (small) metadata.json file without having to upload the (large) SQLite database? The directions on heroku indicate I should run: heroku git:clone -a so-sales But this just results in an empty directory with a warning: warning: You appear to have cloned an empty repository. I've been able to ""clone"" the heroku ""app"" using the command: $ heroku slugs:download -a so-sales but this is not a git repository.... Ideally, it seems to me, there'd be an option in the `datasette` CLI to allow a file to be updated, or there'd be some way to create a local git ""clone"" of the app so that the heroku instructions for ""Deploying with git"" would apply. (p.s. I ran `datasette publish heroku -m metadata.json -n so-sales` in the hope that that would not cause the .db file to be wiped, but of course it was.) (p.p.s. Thanks for Datasette!)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/639/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 527710055,MDU6SXNzdWU1Mjc3MTAwNTU=,640,Nicer error message for heroku publish name clash,82988,psychemedia,open,0,,,,,1,2019-11-24T14:57:07Z,2019-12-06T07:19:34Z,,CONTRIBUTOR,,"If you try to publish to Heroku without specifying a name (i.e. using the default `datasette` name) and a project already exists under that name, you get a meaningful error report on the first line followed by Py error messages that drown it out: ``` Creating datasette... !
▸ Name datasette is already taken Traceback (most recent call last): File ""/usr/local/bin/datasette"", line 10, in sys.exit(cli()) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/local/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/NNNNN/Library/Python/3.7/lib/python/site-packages/datasette/publish/heroku.py"", line 124, in heroku create_output = check_output(cmd).decode(""utf8"") File ""/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/lib/python3.7/subprocess.py"", line 411, in check_output **kwargs).stdout File ""/usr/local/Cellar/python/3.7.5/Frameworks/Python.framework/Versions/3.7/lib/python3.7/subprocess.py"", line 512, in run output=stdout, stderr=stderr) subprocess.CalledProcessError: Command '['heroku', 'apps:create', 'datasette', '--json']' returned non-zero exit status 1. ``` It would be neater if: - the Py error message was caught; - the report suggested setting a project name using `-n` etc. It may also be useful to provide a command to list the current names that are being used, which I assume is available via a Heroku call?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/640/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 528442126,MDU6SXNzdWU1Mjg0NDIxMjY=,641,Better documentation for --static option,9599,simonw,closed,0,,,,,1,2019-11-26T02:07:57Z,2019-11-26T03:30:02Z,2019-11-26T02:31:53Z,OWNER,,"This is misleading: https://github.com/simonw/datasette/blob/aca41618f8761f99c47c8ae8e81b07a6d4af4d7a/docs/datasette-serve-help.txt#L23 The correct format is e.g. 
`static:static/` Also it's not mentioned in the regular documentation at all.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/641/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 529376481,MDExOlB1bGxSZXF1ZXN0MzQ2MjY0OTI2,67,Run tests against 3.5 too,9599,simonw,closed,0,,,,,2,2019-11-27T14:20:35Z,2019-12-31T01:29:44Z,2019-12-31T01:29:43Z,OWNER,simonw/sqlite-utils/pulls/67,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/67/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 529429214,MDU6SXNzdWU1Mjk0MjkyMTQ=,642,Provide a cookiecutter template for creating new plugins,9599,simonw,closed,0,,,3268330,Datasette 1.0,6,2019-11-27T15:46:36Z,2020-06-20T03:20:33Z,2020-06-20T03:20:25Z,OWNER,,See this conversation: https://twitter.com/psychemedia/status/1199707352540368896,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/642/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 530468212,MDU6SXNzdWU1MzA0NjgyMTI=,643,Set up some basic benchmarks as part of the unit tests,9599,simonw,open,0,,,,,0,2019-11-29T19:24:19Z,2019-11-29T19:24:19Z,,OWNER,,"https://pypi.org/project/pytest-benchmark/ looks great for this. Here's how to run it as a github action: https://github.com/rhysd/github-action-benchmark/blob/master/examples/pytest/README.md",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/643/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 530491074,MDU6SXNzdWU1MzA0OTEwNzQ=,14,Command for importing events,9599,simonw,open,0,,,,,3,2019-11-29T21:28:58Z,2020-04-14T19:38:34Z,,MEMBER,,"Eg from https://api.github.com/users/simonw/events Docs here: https://developer.github.com/v3/activity/events/#list-events-performed-by-a-user",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/14/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 530513784,MDExOlB1bGxSZXF1ZXN0MzQ3MTc5MDgx,644,Validate metadata json on startup,6025893,chris48s,closed,0,,,,,1,2019-11-30T00:32:15Z,2021-07-28T17:58:45Z,2021-07-28T17:58:45Z,CONTRIBUTOR,simonw/datasette/pulls/644,"This PR adds a sanity check which builds up a marshmallow schema on-the-fly based on the structure of the database(s) on startup and then validates the metadata json against it. In case of invalid data, this will raise with a descriptive error e.g: ``` marshmallow.exceptions.ValidationError: {'databases': {'fixtures': {'tables': {'not_a_table': ['Unknown field.']}}}} ``` Closes #260 --- This was intended to be fairly self-contained, but then while I was working on it, I hit some problems getting the tests to pass in the context of the test suite as a whole. My tests passed in isolation, but then failed while doing a full test suite run. 
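(Aside for reviewers: the on-the-fly validation works roughly along these lines - a simplified sketch, not the exact code in this PR.)

```python
# Build a schema from the inspected table names, then validate the
# metadata against it so unknown tables raise a descriptive error.
from marshmallow import Schema, ValidationError, fields

def schema_for_tables(table_names):
    tables_schema = Schema.from_dict(
        {name: fields.Dict() for name in table_names}
    )
    return Schema.from_dict({'tables': fields.Nested(tables_schema)})

schema = schema_for_tables(['facetable', 'searchable'])()
try:
    schema.load({'tables': {'not_a_table': {}}})
except ValidationError as e:
    print(e.messages)  # {'tables': {'not_a_table': ['Unknown field.']}}
```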
That's when the worms started coming out of the can :bug: After some sleuthing, it turned out this was essentially the result of several issues intersecting: * There are certain events in the application lifecycle where the metadata schema can be modified after it is loaded, e.g.: https://github.com/simonw/datasette/blob/a562f2965552fb2dbbbd74df245c9965ee23d886/datasette/app.py#L299-L320 This means that sometimes what goes in isn't always exactly what comes out when you call `/-/metadata`. * Because the test fixtures use session scope for performance reasons, if one unit test performs an action which mutates the metadata, that can impact on other unit tests which run after it using the same fixture. * Because the `self._metadata` property was being set with a simple assignment `self._metadata = metadata`, that created an object reference to the test fixture data, so operating on `self._metadata` was actually modifying the test fixture `METADATA`, meaning that depending on when it was loaded in the test suite lifecycle, `METADATA` had different content, which was somewhat unexpected. As such, I've added some band-aids in 3552024 and 6859fd8: * Switching the metadata object to a `deepcopy` of the input prevents us directly mutating the input fixture. * I've switched some of the tests to use a fixture with function scope instead of session scope, so we're working on a clean copy that hasn't been mutated by other tests where necessary, but keeping session scope in most cases for performance. * I haven't really addressed the fact that sometimes the metadata object gets mutated in place, so the object that is served from `/-/metadata` isn't necessarily always exactly the same as the file you fed into it on init. I'm not sure how much of a problem that is. The way the tests were written makes me think it was unexpected, but getting into it feels like too much scope creep for this PR, so it's probably best addressed as another issue.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/644/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 530653633,MDU6SXNzdWU1MzA2NTM2MzM=,645,Mechanism for register_output_renderer to suggest extension or not,9599,simonw,closed,0,,,,,4,2019-12-01T01:26:27Z,2020-05-28T02:22:18Z,2020-05-28T02:22:12Z,OWNER,,"[datasette-atom](https://github.com/simonw/datasette-atom) only works if the user constructs a SQL query with specific output columns (`atom_id`, `atom_updated`, etc.). It would be good if the `.atom` link wasn't shown on the query/table page unless those columns were present. Right now you get a link which results in a 400 error: See also #581.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/645/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 531502365,MDU6SXNzdWU1MzE1MDIzNjU=,646,Make database level information from metadata.json available in the index.html template,18017473,lagolucas,open,0,,,3268330,Datasette 1.0,3,2019-12-02T19:55:10Z,2022-03-15T20:50:34Z,,NONE,,"Did a search on the issues here and didn't find anything related to what I want. I want to have information that is on the database level of the JSON, like title, source and source_url, and use it on the index page. I tried some small tweaks on the Python and HTML files, but failed to get that result. Is there a way?
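For concreteness, this is the shape of metadata I mean, written here as a Python dict (the database name is hypothetical) - the three database-level keys are what I would like to reach from index.html:

```python
# Database-level metadata keys I want available in the index template.
metadata = {
    'databases': {
        'mydb': {
            'title': 'My database title',
            'source': 'Source name',
            'source_url': 'https://example.com/data',
        }
    }
}
```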
Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/646/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 531583658,MDU6SXNzdWU1MzE1ODM2NTg=,68,Add support for porter stemming in FTS,9599,simonw,closed,0,,,,,1,2019-12-02T22:35:52Z,2020-09-20T04:25:53Z,2020-09-20T04:25:47Z,OWNER,,FTS5 can have porter stemming enabled.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/68/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 531755959,MDU6SXNzdWU1MzE3NTU5NTk=,647,Move hashed URL mode out to a plugin,9599,simonw,closed,0,,,3268330,Datasette 1.0,9,2019-12-03T06:29:03Z,2022-03-19T11:56:05Z,2022-03-15T23:13:06Z,OWNER,,"They used to be the default until #418. Since making them optional I haven't felt the need to use them even once. That suggests to me that they should be removed. I think their effect could be entirely handled by an ASGI wrapping plugin. https://datasette.readthedocs.io/en/0.32/performance.html#hashed-url-mode",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/647/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 534492501,MDU6SXNzdWU1MzQ0OTI1MDE=,648,Mechanism for adding arbitrary pages like /about,9599,simonw,closed,0,,,,,13,2019-12-08T04:55:19Z,2020-05-07T15:21:19Z,2020-04-26T18:46:45Z,OWNER,,"For www.niche-museums.com I solved this by creating an empty `about.db` database file - see https://simonwillison.net/2019/Nov/25/niche-museums/ I want a neater mechanism for this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/648/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 534507142,MDU6SXNzdWU1MzQ1MDcxNDI=,69,Feature request: enable extensions loading,30607,aborruso,closed,0,,,,,3,2019-12-08T08:06:25Z,2022-02-05T00:04:25Z,2020-10-16T18:42:49Z,NONE,,"Hi, it would be great to add a parameter that enables the load of a sqlite extension you need. Something like ""-ext modspatialite"". In this way your great tool would be even more comfortable and powerful. Thank you very much",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/69/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 534530973,MDU6SXNzdWU1MzQ1MzA5NzM=,649,Reduce table counts on index page with many databases,9599,simonw,closed,0,,,,,2,2019-12-08T11:56:37Z,2020-02-29T01:08:29Z,2020-02-29T01:08:29Z,OWNER,,"Since #467 the index page has attempted to optimistically count times. 
My personal Dogsheep has enough connected databases and tables that the page can still take way too long to load - sometimes more than twenty seconds.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/649/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 534629631,MDU6SXNzdWU1MzQ2Mjk2MzE=,650,Add a glossary to the documentation,9599,simonw,open,0,,,,,3,2019-12-09T00:23:45Z,2022-01-13T22:04:56Z,,OWNER,,"Call it `glossary.rst` - it can use a definition list something like this:

```rst
.. _glossary:

Glossary
========

Term
    A definition of the term.

Another term
    Another definition.
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/650/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 539204432,MDU6SXNzdWU1MzkyMDQ0MzI=,70,Implement ON DELETE and ON UPDATE actions for foreign keys,26292069,LucasElArruda,open,0,,,,,2,2019-12-17T17:19:10Z,2020-02-27T04:18:53Z,,NONE,,"Hi! I did not find any mention in the library about ON DELETE and ON UPDATE actions for foreign keys. Are those expected to be implemented? If not, it would be a nice thing to include!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/70/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 539590148,MDU6SXNzdWU1Mzk1OTAxNDg=,651,fts5 syntax error when using punctuation,2181410,clausjuhl,closed,0,,,,,3,2019-12-18T10:25:35Z,2021-07-14T19:26:06Z,2019-12-30T06:42:55Z,NONE,,"Hi Simon. I get a syntax error when using punctuation or special characters in a fulltext search (using fts5). I created the virtual table using sqlite-utils' ""enable-fts""-command. The same error appears on Niche Museums [https://www.niche-museums.com/browse/search?q=park.](https://www.niche-museums.com/browse/search?q=park.), but works fine in most of your other datasette-examples, e.g. register-of-members-interests [https://register-of-members-interests.datasettes.com/regmem-98dc8b7/items?_search=mins.](https://register-of-members-interests.datasettes.com/regmem-98dc8b7/items?_search=mins.) What am I doing wrong? Many thanks! ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/651/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 539985017,MDExOlB1bGxSZXF1ZXN0MzU0ODY5Mzkx,652,Quick (and uninformed and perhaps misguided) attempt to add a url for hosting datasette at a particular host/URI,132978,terrycojones,closed,0,,,,,1,2019-12-18T23:37:16Z,2020-03-24T22:14:50Z,2020-03-24T22:14:50Z,NONE,simonw/datasette/pulls/652,"As usual, I don't really know what I'm doing... so this is just a suggested approach. I've not written tests, I've not run the tests, I don't know if I've missed some absolute URLs that would need to have the leading slash dropped. BUT, I tested it with `--config base_url:http://127.0.0.1:8001/` on the command line and from what little I know about datasette it's at least working in some obvious cases. My changes are based on what I saw in https://github.com/simonw/datasette/commit/8da2db4b71096b19e7a9ef1929369b8483d448bf (thanks!)
I'm happy to be more thorough on this if you think it's worth pursuing. Fixes #394 (he said, optimistically).",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/652/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 541274681,MDU6SXNzdWU1NDEyNzQ2ODE=,2,Add linkedin-to-sqlite,881925,mnp,open,0,,,,,0,2019-12-21T03:13:40Z,2019-12-21T03:13:40Z,,NONE,,"There is an API available. https://developer.linkedin.com/docs/rest-api# At the minimum, I would think contact list and messages would be of interest.",214746582,dogsheep.github.io,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 541331755,MDExOlB1bGxSZXF1ZXN0MzU2MDA0MjQy,653,allow leading comments in SQL input field,418191,jaywgraves,closed,0,,,,,8,2019-12-21T14:19:52Z,2020-02-05T02:35:41Z,2020-02-05T02:13:25Z,CONTRIBUTOR,simonw/datasette/pulls/653,"This changes the SQL validation to allow for lines that are commented out. My main use case for this is that I like to write a succession of queries when trying to solve a problem. In most native SQL clients there is a key binding that will run just the currently highlighted query, or the program is smart enough to run just the query that the cursor is in if it's properly delimited with a ';'. Typically my workflow will start with a single simple query and I'll copy/paste it to a new query below when I want to make big changes while debugging. This makes it easy to go back to a working version above when the query doesn't work. Since datasette sends the whole query to the DB I have to comment out the older queries by prefixing each line with `--`. This gets caught by the validators when I use my typical strategy of copy/pasting each successive query below the last one. So this is just a simple fix to allow for a query to be sent to the DB with leading comments.
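A rough sketch of the idea, assuming the validator is essentially a "must start with SELECT" check (the helper name is hypothetical; this is not the actual patch):

```python
import re

COMMENT_LINE = re.compile(r'^\s*--.*$', re.MULTILINE)

def is_allowed_sql(sql):
    # Drop fully commented-out lines, then validate what remains
    stripped = COMMENT_LINE.sub('', sql).strip()
    return stripped.lower().startswith('select')

assert is_allowed_sql('-- old attempt\n-- another\nselect 1')
assert not is_allowed_sql('-- hidden\ndelete from t')
```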
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/653/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 541467590,MDU6SXNzdWU1NDE0Njc1OTA=,654,Template debug mode that outputs template context,9599,simonw,closed,0,,,,,3,2019-12-22T15:51:25Z,2019-12-22T16:13:11Z,2019-12-22T16:04:51Z,OWNER,,It would make writing templates (including custom templates) easier if there was an option to dump out the full template context - maybe `?_context=1`,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/654/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 542553350,MDU6SXNzdWU1NDI1NTMzNTA=,655,Copy and paste doesn't work reliably on iPhone for SQL editor,9599,simonw,closed,0,,,3268330,Datasette 1.0,3,2019-12-26T13:15:10Z,2020-09-30T20:36:12Z,2020-08-30T17:51:40Z,OWNER,,I'm having a lot of trouble copying and pasting from the codemirror editor on my iPhone.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/655/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 542814756,MDU6SXNzdWU1NDI4MTQ3NTY=,71,Tests are failing due to missing FTS5,9599,simonw,closed,0,,,,,3,2019-12-27T09:41:16Z,2019-12-27T09:49:37Z,2019-12-27T09:49:37Z,OWNER,,"https://travis-ci.com/simonw/sqlite-utils/jobs/268436167 This is a recent change: 2 months ago they worked fine. I'm not sure what changed here. Maybe something to do with https://launchpad.net/~jonathonf/+archive/ubuntu/backports ?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/71/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 543355051,MDExOlB1bGxSZXF1ZXN0MzU3NjQwMTg2,6,don't break if source is missing,78035,mfa,closed,0,,,,,1,2019-12-29T10:46:47Z,2020-03-28T02:28:11Z,2020-03-28T02:28:11Z,CONTRIBUTOR,dogsheep/swarm-to-sqlite/pulls/6,broke for me. very old checkins in 2010 had no source set.,205429375,swarm-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 543717994,MDExOlB1bGxSZXF1ZXN0MzU3OTc0MzI2,3,Add todoist-to-sqlite,706257,bcongdon,closed,0,,,,,0,2019-12-30T04:02:59Z,2020-10-12T00:35:58Z,2020-10-12T00:35:57Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/3,"Really enjoying getting into the dogsheep/datasette ecosystem. 
I made a downloader for Todoist, and I think/hope others might find this useful",214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 543738004,MDExOlB1bGxSZXF1ZXN0MzU3OTkyNTg4,72,Fixed implementation of upsert,9599,simonw,closed,0,,,,,0,2019-12-30T05:08:05Z,2019-12-30T05:29:24Z,2019-12-30T05:29:24Z,OWNER,simonw/sqlite-utils/pulls/72,Refs #66,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/72/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 544571092,MDU6SXNzdWU1NDQ1NzEwOTI=,15,Assets table with downloads,2029,garethr,closed,0,,,5225818,1.0,4,2020-01-02T13:05:28Z,2020-03-28T12:17:01Z,2020-03-23T19:17:32Z,NONE,,"The `releases` command extracts the releases table, but data about the individual assets are locked up in the JSON document in the `assets` field. My main interest is in individual and aggregate download counts. I was wondering if creating a new table with a record per asset may be useful? If so I'm happy to send a PR when I get a moment. Do you have opinions about that simply being part of the `releases` command or would you prefer a separate command as well?",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 545407916,MDU6SXNzdWU1NDU0MDc5MTY=,73,upsert_all() throws issue when upserting to empty table,82988,psychemedia,closed,0,,,,,6,2020-01-05T11:58:57Z,2020-01-31T14:21:09Z,2020-01-05T17:20:18Z,NONE,,"If I try to add a list of `dict`s to an empty table using `upsert_all`, I get an error:

```python
import sqlite3
from sqlite_utils import Database
import pandas as pd

conx = sqlite3.connect(':memory')
cx = conx.cursor()
cx.executescript('CREATE TABLE ""test"" (""Col1"" TEXT);')

q=""SELECT * FROM test;""
pd.read_sql(q, conx) #shows empty table

db = Database(conx)
db['test'].upsert_all([{'Col1':'a'},{'Col1':'b'}])

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
 in
      1 db = Database(conx)
----> 2 db['test'].upsert_all([{'Col1':'a'},{'Col1':'b'}])

/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in upsert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, alter, extracts)
   1157             alter=alter,
   1158             extracts=extracts,
-> 1159             upsert=True,
   1160         )
   1161

/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, alter, ignore, replace, extracts, upsert)
   1040             sql = ""INSERT OR IGNORE INTO [{table}]({pks}) VALUES({pk_placeholders});"".format(
   1041                 table=self.name,
-> 1042                 pks="", "".join([""[{}]"".format(p) for p in pks]),
   1043                 pk_placeholders="", "".join([""?"" for p in pks]),
   1044             )

TypeError: 'NoneType' object is not iterable
```

A hacky workaround in use is:

```python
try:
    db['test'].upsert_all([{'Col1':'a'},{'Col1':'b'}])
except:
    db['test'].insert_all([{'Col1':'a'},{'Col1':'b'}])
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/73/reactions"",
""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 546051181,MDU6SXNzdWU1NDYwNTExODE=,16,Exception running first command: IndexError: list index out of range,15092,jayvdb,closed,0,,,,,4,2020-01-07T03:01:58Z,2020-04-14T18:37:21Z,2020-04-14T18:37:21Z,NONE,,"Exception running first command without an existing db or auth. ```py > mkdir ~/.github/coala > /usr/bin/github-to-sqlite repos ~/.github/coala coala Traceback (most recent call last): File ""/usr/bin/github-to-sqlite"", line 11, in load_entry_point('github-to-sqlite==0.6', 'console_scripts', 'github-to-sqlite')() File ""/usr/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/usr/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/usr/lib/python3.7/site-packages/github_to_sqlite/cli.py"", line 163, in repos utils.save_repo(db, repo) File ""/usr/lib/python3.7/site-packages/github_to_sqlite/utils.py"", line 120, in save_repo to_save[""owner""] = save_user(db, to_save[""owner""]) File ""/usr/lib/python3.7/site-packages/github_to_sqlite/utils.py"", line 61, in save_user return db[""users""].upsert(to_save, pk=""id"", alter=True).last_pk File ""/usr/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1135, in upsert extracts=extracts, File ""/usr/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1162, in upsert_all upsert=True, File ""/usr/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1105, in insert_all row = list(self.rows_where(""rowid = ?"", [self.last_rowid]))[0] IndexError: list index out of range ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/16/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 546073980,MDU6SXNzdWU1NDYwNzM5ODA=,74,Test failures on openSUSE 15.1: AssertionError: Explicit other_table and other_column,15092,jayvdb,open,0,,,,,3,2020-01-07T04:35:50Z,2020-01-12T07:21:17Z,,CONTRIBUTOR,,"openSUSE 15.1 is using python 3.6.5 and click-7.0 , however it has test failures while openSUSE Tumbleweed on py37 passes. 
Most fail on the CLI exit code, like:

```py
[ 74s] =================================== FAILURES ===================================
[ 74s] _________________________________ test_tables __________________________________
[ 74s]
[ 74s] db_path = '/tmp/pytest-of-abuild/pytest-0/test_tables0/test.db'
[ 74s]
[ 74s]     def test_tables(db_path):
[ 74s]         result = CliRunner().invoke(cli.cli, [""tables"", db_path])
[ 74s] >       assert '[{""table"": ""Gosh""},\n {""table"": ""Gosh2""}]' == result.output.strip()
[ 74s] E       assert '[{""table"": ""...e"": ""Gosh2""}]' == ''
[ 74s] E       - [{""table"": ""Gosh""},
[ 74s] E       - {""table"": ""Gosh2""}]
[ 74s]
[ 74s] tests/test_cli.py:28: AssertionError
```

Packaging project at https://build.opensuse.org/package/show/home:jayvdb:py-new/python-sqlite-utils I'll keep digging into this after I have github-to-sqlite working on Tumbleweed, as I'll need openSUSE Leap 15.1 working before I can submit this into the main python repo.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/74/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 546078359,MDExOlB1bGxSZXF1ZXN0MzU5ODIyNzcz,75,Explicitly include tests and docs in sdist,15092,jayvdb,closed,0,,,,,1,2020-01-07T04:53:20Z,2020-01-31T00:21:27Z,2020-01-31T00:21:27Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/75,Also exclude 'tests' from runtime installation.,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/75/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 546961357,MDU6SXNzdWU1NDY5NjEzNTc=,656,Display of the column definitions,6371750,JBPressac,closed,0,,,,,1,2020-01-08T16:16:53Z,2020-01-20T14:17:11Z,2020-01-20T14:14:33Z,CONTRIBUTOR,,"Hello, is the nice display of headers and definitions at the top of https://fivethirtyeight.datasettes.com/fivethirtyeight-ac35616/antiquities-act%2Factions_under_antiquities_act configured in the metadata.json file? Thank you,",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/656/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 548591089,MDU6SXNzdWU1NDg1OTEwODk=,657,Allow creation of virtual tables at startup,1055831,dazzag24,open,0,,,,,4,2020-01-12T16:10:55Z,2021-01-15T20:24:35Z,,NONE,,"Hi, I've been experimenting with SQLite reading from huge datasets using this excellent Parquet extension from @cldellow. https://cldellow.com/2018/06/22/sqlite-parquet-vtable.html https://github.com/cldellow/sqlite-parquet-vtable This works really well, but I was keen to see if I could combine datasette with this. Having previously experimented with the spatialite extension I knew that datasette supports loading extensions in the underlying sqlite instance. However I hit a blocker as the current design only allows SELECT statements to be executed and so I am unable to execute the crucial CREATE VIRTUAL TABLE ......... command that is required to load the data from the parquet file into the table. It seems like this would be a simple-ish change, but I don't know enough about the architecture of datasette to start implementing this myself? Could this be done as a datasette plugin? Or would this require more fundamental changes at initialisation time?
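A minimal sketch of that startup step with a plain sqlite3 connection - the extension path is invented, and the `USING parquet(...)` syntax follows the sqlite-parquet-vtable project linked above:

```python
import sqlite3

conn = sqlite3.connect(':memory:')
conn.enable_load_extension(True)
conn.load_extension('./libparquet')  # hypothetical build of the extension
# Create a virtual table backed by a Parquet file (path is a placeholder)
conn.execute("CREATE VIRTUAL TABLE flights USING parquet('/path/to/flights.parquet')")
```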
My thoughts are that something at init time could detect that the user was loading a *.parquet file and then switch to a mode where it loads that via the ""CREATE VIRTUAL TABLE..."" rather than loading the *.db file in the default case? I'm happy to contribute code and testing, I just need some pointers on the best approach. Thanks Darren",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/657/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 549287310,MDU6SXNzdWU1NDkyODczMTA=,76,order_by mechanism,10501166,metab0t,closed,0,,,,,4,2020-01-14T02:06:03Z,2020-04-16T06:23:29Z,2020-04-16T03:13:06Z,NONE,,"In some cases, I want to iterate rows in a table with an `ORDER BY` clause. It would be nice to have a `rows_order_by` function similar to `rows_where`. In a more general case, a `rows_filter` function might be added to allow more customized filtering to iterate rows.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/76/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 550293770,MDU6SXNzdWU1NTAyOTM3NzA=,658,How do I use the app.css as style sheet?,49656826,null92,open,0,,,,,2,2020-01-15T16:27:57Z,2020-02-07T00:29:50Z,,NONE,,"Simon, I'm trying to use the app.css (in the static folder) as a style sheet, but the datasette on Heroku simply ignores it! I read everything about customization here and on readthedocs but still can't. Is this possible? Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/658/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 551834842,MDU6SXNzdWU1NTE4MzQ4NDI=,659,README information is obscured by feature history,55480210,labstersteve,closed,0,,,,,1,2020-01-18T22:34:51Z,2020-12-10T23:28:51Z,2020-12-10T23:28:51Z,NONE,,"While it's sometimes valuable to know how a project has developed, there is usually little justification for including this information in the README, and certainly not immediately after other key information such as ""what does this package do, and who might want to use it?"" Might I recommend that the feature history be migrated to an Appendix in the documentation?",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/659/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 552773632,MDExOlB1bGxSZXF1ZXN0MzY1MjE4Mzkx,660,"gcloud run is now GA, s/beta//",813732,glasnt,closed,0,,,,,1,2020-01-21T10:08:38Z,2020-01-22T03:41:09Z,2020-01-21T23:28:12Z,CONTRIBUTOR,simonw/datasette/pulls/660,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/660/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 555832585,MDU6SXNzdWU1NTU4MzI1ODU=,661,"--port option to expose a port other than 8001 in ""datasette package""",134771,dvhthomas,closed,0,,,,,3,2020-01-27T21:05:56Z,2020-01-30T04:17:52Z,2020-01-29T22:46:45Z,NONE,,"I see how to alter the port using `datasette serve -p XXX` per the docs.
However, I'm packaging this up to serve the container on AppEngine flexible, which [requires](https://cloud.google.com/appengine/docs/flexible/custom-runtimes/build#listening_to_port_8080) that the container is serving traffic on port 8080. https://github.com/simonw/datasette/blob/7950105c278b140e6cb665c68b59df219870f9bc/Dockerfile#L41 Is there a way to inject a non-default port into the Dockerfile, or should I just do something like `sed` to replace 8001 with 8080 after `datasette package` has done its thing? Thanks for the advice.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/661/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 556814876,MDU6SXNzdWU1NTY4MTQ4NzY=,662,Escape_fts5_query-hookimplementation does not work with queries to standard tables,2181410,clausjuhl,closed,0,,,,,5,2020-01-29T11:56:03Z,2020-01-30T00:30:20Z,2020-01-30T00:30:19Z,NONE,,"Hi Simon. Thank you for adding the escape function, but it does not work on my datasette installation (0.33). I've added the following file to my datasette dir: /plugins/sql_functions.py:

```python
from datasette import hookimpl

def escape_fts_query(query):
    bits = query.split()
    return ' '.join('""{}""'.format(bit.replace('""', '')) for bit in bits)

@hookimpl
def prepare_connection(conn):
    conn.create_function(""escape_fts_query"", 1, escape_fts_query)
```

It has no effect on the standard queries to the tables though, as they still produce errors when including any characters like '-', '/', '+' or '?' Does the function only work when using custom queries, where I can include the escape_fts function explicitly in the SQL query? PS. I'm calling datasette with --plugins=plugins, and my other plugins work just fine. PPS. The fts5 virtual table is created with 'sqlite3' like so:

```sql
CREATE VIRTUAL TABLE ""cases_fts"" USING FTS5(
    title,
    subtitle,
    resume,
    suggestion,
    presentation,
    detail = full,
    content_rowid = 'id',
    content = 'cases',
    tokenize='unicode61', 'remove_diacritics 2', 'tokenchars ""-_""'
);
```

Thanks!
_Originally posted by @clausjuhl in https://github.com/simonw/datasette/issues/651#issuecomment-579675357_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/662/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 557077945,MDExOlB1bGxSZXF1ZXN0MzY4NzM0NTAw,663,"-p argument for datasette package, plus tests - refs #661",9599,simonw,closed,0,,,,,1,2020-01-29T19:47:50Z,2020-01-29T22:46:43Z,2020-01-29T22:46:43Z,OWNER,simonw/datasette/pulls/663,,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/663/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 557825032,MDU6SXNzdWU1NTc4MjUwMzI=,77,Ability to insert data that is transformed by a SQL function,9599,simonw,closed,0,,,,,2,2020-01-30T23:45:55Z,2022-02-05T00:04:25Z,2020-01-31T00:24:32Z,OWNER,,"I want to be able to run the equivalent of this SQL insert:

```python
# Convert to ""Well Known Text"" format
wkt = shape(geojson['geometry']).wkt

# Insert and commit the record
conn.execute(""INSERT INTO places (id, name, geom) VALUES(null, ?, GeomFromText(?, 4326))"", (
    ""Wales"", wkt
))
conn.commit()
```

From the Datasette SpatiaLite docs: https://datasette.readthedocs.io/en/stable/spatialite.html To do this, I need a way of telling `sqlite-utils` that a specific column should be wrapped in `GeomFromText(?, 4326)`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/77/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 557830332,MDExOlB1bGxSZXF1ZXN0MzY5MzQ4MDg0,78,"New conversions= feature, refs #77",9599,simonw,closed,0,,,,,0,2020-01-31T00:02:33Z,2020-09-22T07:48:29Z,2020-01-31T00:24:31Z,OWNER,simonw/sqlite-utils/pulls/78,,140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/78/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 557842245,MDU6SXNzdWU1NTc4NDIyNDU=,79,Helper methods for working with SpatiaLite,9599,simonw,closed,0,,,,,8,2020-01-31T00:39:19Z,2022-02-05T00:04:25Z,2022-02-04T05:55:11Z,OWNER,,"As demonstrated by this piece of documentation, using SpatiaLite with sqlite-utils requires a fair bit of boilerplate: https://github.com/simonw/sqlite-utils/blob/f7289174e66ae4d91d57de94bbd9d09fabf7aff4/docs/python-api.rst#L880-L909",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/79/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 557892819,MDExOlB1bGxSZXF1ZXN0MzY5Mzk0MDQz,80,on_create mechanism for after table creation,9599,simonw,closed,0,,,,,5,2020-01-31T03:38:48Z,2020-01-31T05:08:04Z,2020-01-31T05:08:04Z,OWNER,simonw/sqlite-utils/pulls/80,"I need this for `geojson-to-sqlite`, in particular https://github.com/simonw/geojson-to-sqlite/issues/6",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/80/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 
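A short sketch of what the `conversions=` feature from #77/#78 above is meant to enable - usage inferred from the issue, so treat the details as approximate:

```python
import sqlite_utils

db = sqlite_utils.Database('places.db')
# conversions= wraps the bound value for the named column in a SQL
# expression - here SpatiaLite's GeomFromText(?, 4326), which requires
# the SpatiaLite extension to be loaded for the insert to succeed.
db['places'].insert(
    {'name': 'Wales', 'geom': 'POLYGON((...))'},  # placeholder WKT
    conversions={'geom': 'GeomFromText(?, 4326)'},
)
```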
558600274,MDU6SXNzdWU1NTg2MDAyNzQ=,81,"Remove .detect_column_types() from table, make it a documented API",9599,simonw,closed,0,,,,,4,2020-02-01T21:25:54Z,2020-02-01T21:55:35Z,2020-02-01T21:55:35Z,OWNER,,"I used it in `geojson-to-sqlite` here: https://github.com/simonw/geojson-to-sqlite/blob/f10e44264712dd59ae7dfa2e6fd5a904b682fb33/geojson_to_sqlite/utils.py#L45-L50 It would make more sense for this method to live on the Database rather than the Table - or even to exist as a separate utility method entirely. Then it should be documented.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/81/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 558715564,MDExOlB1bGxSZXF1ZXN0MzcwMDI0Njk3,4,Add beeminder-to-sqlite,706257,bcongdon,closed,0,,,,,0,2020-02-02T15:51:36Z,2020-10-12T00:36:16Z,2020-10-12T00:36:16Z,CONTRIBUTOR,dogsheep/dogsheep.github.io/pulls/4,,214746582,dogsheep.github.io,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep.github.io/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 559197745,MDU6SXNzdWU1NTkxOTc3NDU=,82,Tutorial command no longer works,10350886,petey284,closed,0,,,,,3,2020-02-03T16:36:11Z,2020-02-27T04:16:43Z,2020-02-27T04:16:30Z,NONE,,"Issue with command on [tutorial](https://simonwillison.net/2019/Feb/25/sqlite-utils/) on Simon's site. The following command no longer works, and breaks with the previous too many variables error: #50

``` cmd
> curl ""https://data.nasa.gov/resource/y77d-th95.json"" | \
    sqlite-utils insert meteorites.db meteorites - --pk=id
```

Output:

``` cmd
Traceback (most recent call last):
  File ""continuum\miniconda3\envs\main\lib\runpy.py"", line 193, in _run_module_as_main
    ""__main__"", mod_spec)
  File ""continuum\miniconda3\envs\main\lib\runpy.py"", line 85, in _run_code
    exec(code, run_globals)
  File ""Continuum\miniconda3\envs\main\Scripts\sqlite-utils.exe\__main__.py"", line 9, in
  File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 764, in __call__
    return self.main(*args, **kwargs)
  File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 717, in main
    rv = self.invoke(ctx)
  File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File ""continuum\miniconda3\envs\main\lib\site-packages\click\core.py"", line 555, in invoke
    return callback(*args, **kwargs)
  File ""continuum\miniconda3\envs\main\lib\site-packages\sqlite_utils\cli.py"", line 434, in insert
    default=default,
  File ""continuum\miniconda3\envs\main\lib\site-packages\sqlite_utils\cli.py"", line 384, in insert_upsert_implementation
    docs, pk=pk, batch_size=batch_size, alter=alter, **extra_kwargs
  File ""continuum\miniconda3\envs\main\lib\site-packages\sqlite_utils\db.py"", line 1081, in insert_all
    result = self.db.conn.execute(query, params)
sqlite3.OperationalError: too many SQL variables
```

My thought is that maybe the dataset grew over the last few years and so didn't run into this issue before. No error when I reduce the count of entries to 83. Once the number of entries hits 84 the command fails.
// This passes

``` cmd
type meteorite_83.txt | sqlite-utils insert meteorites.db meteorites - --pk=id
```

// But this fails

``` cmd
type meteorite_84.txt | sqlite-utils insert meteorites.db meteorites - --pk=id
```

A potential fix might be to chunk the incoming data? I can work on a PR if pointed in the right direction. ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/82/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 559374410,MDU6SXNzdWU1NTkzNzQ0MTA=,83,"Make db[""table""].exists a documented API",9599,simonw,closed,0,,,,,1,2020-02-03T22:31:44Z,2020-02-08T23:58:35Z,2020-02-08T23:56:23Z,OWNER,,Right now it's a static thing which might get out-of-sync with the database. It should probably be a live check. Maybe call it `.exists()` instead?,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/83/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 559522877,MDExOlB1bGxSZXF1ZXN0MzcwNjc1MDA3,664,Datasette.render_template() method,9599,simonw,closed,0,,,,,5,2020-02-04T06:53:59Z,2020-02-04T20:26:18Z,2020-02-04T20:26:18Z,OWNER,simonw/datasette/pulls/664,Refs #577,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/664/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 559964149,MDU6SXNzdWU1NTk5NjQxNDk=,665,Introduce a SQL statement parser in Python,9599,simonw,open,0,,,,,1,2020-02-04T20:36:05Z,2020-02-04T20:36:48Z,,OWNER,,#254 and #653 are both examples of problems that could be solved using a real SQL parser in Python.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/665/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 561454071,MDU6SXNzdWU1NjE0NTQwNzE=,32,"Documentation for ""favorites"" command",9599,simonw,closed,0,,,,,0,2020-02-07T06:50:11Z,2020-02-07T06:59:10Z,2020-02-07T06:59:10Z,MEMBER,,"It looks like I forgot to document this one in the README.
https://github.com/dogsheep/twitter-to-sqlite/blob/6ebd482619bd94180e54bb7b56549c413077d329/twitter_to_sqlite/cli.py#L183-L194",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 561460274,MDU6SXNzdWU1NjE0NjAyNzQ=,84,.upsert() with hash_id throws error,9599,simonw,closed,0,,,,,0,2020-02-07T07:08:19Z,2020-02-07T07:17:11Z,2020-02-07T07:17:11Z,OWNER,,"```python db[table_name].upsert_all(rows, hash_id=""pk"") ``` This throws an error: `PrimaryKeyRequired('upsert() requires a pk')` The problem is, if you try this: ```python db[table_name].upsert_all(rows, hash_id=""pk"", pk=""pk"") ``` You get this error: `AssertionError('Use either pk= or hash_id=')` `hash_id=` should imply that `pk=` is that column.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/84/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 561469252,MDExOlB1bGxSZXF1ZXN0MzcyMjczNjA4,33,Upgrade to sqlite-utils 2.2.1,9599,simonw,closed,0,,,,,1,2020-02-07T07:32:12Z,2020-03-20T19:21:42Z,2020-03-20T19:21:41Z,MEMBER,dogsheep/twitter-to-sqlite/pulls/33,,206156866,twitter-to-sqlite,pull,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 562085508,MDExOlB1bGxSZXF1ZXN0MzcyNzYzOTA2,666,"Use inspect-file, if possible, for total row count",13896256,kevindkeogh,closed,0,,,,,3,2020-02-08T22:10:35Z,2020-03-09T02:47:15Z,2020-02-25T20:19:29Z,CONTRIBUTOR,simonw/datasette/pulls/666,"For large tables, counting the number of rows in the table can take a significant amount of time. Instead, where an inspect-file is provided for an immutable database, look up the row count from the inspect file rather than running a plain count(*).",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/666/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 562787785,MDU6SXNzdWU1NjI3ODc3ODU=,667,Allow injecting configuration data from plugins,870184,xrotwang,closed,0,,,,,2,2020-02-10T19:50:15Z,2020-02-12T16:18:22Z,2020-02-12T09:21:22Z,NONE,,"I'm trying to customize datasette as an explorer for [CLDF](https://cldf.clld.org) datasets. Such datasets can be converted automatically to SQLite, which then can be fed to datasette (e.g. https://github.com/cldf/cookbook/blob/master/recipes/datasette/README.md). Part of this customization would be support for the ""special"" data types described in the [CLDF ontology](https://cldf.clld.org/v1.0/terms.rdf). But while rendering of the values can be customized via the `render_cell` hook in a plugin, other things - e.g. custom labels for foreign keys - must be specified through the config file.
It would be nice to be able to programmatically inject config data from plugins as well.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/667/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 562911863,MDU6SXNzdWU1NjI5MTE4NjM=,85,Create index doesn't work for columns containing spaces,9599,simonw,closed,0,,,,,1,2020-02-11T00:34:46Z,2020-02-11T05:13:20Z,2020-02-11T05:13:20Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/85/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 563347679,MDU6SXNzdWU1NjMzNDc2Nzk=,668,Make it easier to load SpatiaLite,9599,simonw,closed,0,,,,,2,2020-02-11T17:03:43Z,2022-01-20T21:29:41Z,2021-01-04T20:18:39Z,OWNER,,"```
$ datasette spatial.db
Serve! files=('spatial.db',) (immutables=()) on port 8001
ERROR: conn=, sql = 'PRAGMA table_info(SpatialIndex);', params = None: no such module: VirtualSpatialIndex
Usage: datasette serve [OPTIONS] [FILES]...

Error: It looks like you're trying to load a SpatiaLite database without first loading the SpatiaLite module.

Read more: https://datasette.readthedocs.io/en/latest/spatialite.html
```

This error message could sniff around in the common locations for the SpatiaLite module and output the CLI command you should use to enable it:

```
datasette spatial.db --load-extension=/usr/local/lib/mod_spatialite.dylib
```

Even better: if Datasette had a `--spatialite` option which automatically loads the extension from common locations, if it can find it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/668/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 563348959,MDExOlB1bGxSZXF1ZXN0MzczNzc1Nzg4,669,fix db-to-sqlite command in ecosystem doc page,883348,adipasquale,closed,0,,,,,1,2020-02-11T17:05:41Z,2020-02-22T02:32:18Z,2020-02-22T02:32:17Z,CONTRIBUTOR,simonw/datasette/pulls/669,The `--connection` parameter has become positional.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/669/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 564579430,MDU6SXNzdWU1NjQ1Nzk0MzA=,86,Problem with square bracket in CSV column name,8149512,foscoj,closed,0,,,,,7,2020-02-13T10:19:57Z,2020-02-27T04:16:08Z,2020-02-27T04:16:07Z,NONE,,"Testing some data from European power information (entsoe.eu), the title of the CSV contains square brackets. As I am playing with Glitch, sqlite-utils is used for creating the db (a possible column-renaming workaround is sketched below).
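One possible workaround, sketched with csv plus sqlite-utils - the bracket-stripping rule is an assumption, while the file and table names come from the report:

```python
import csv
import sqlite_utils

def clean(name):
    # Replace the square brackets that SQLite's DDL chokes on
    return name.replace('[', '(').replace(']', ')')

db = sqlite_utils.Database('entsoe.db')
with open('entsoe_2016.csv', newline='') as f:
    rows = ({clean(k): v for k, v in row.items()} for row in csv.DictReader(f))
    db['entsoe'].insert_all(rows)
```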
Traceback (most recent call last):
  File ""/app/.local/bin/sqlite-utils"", line 8, in
    sys.exit(cli())
  File ""/app/.local/lib/python3.7/site-packages/click/core.py"", line 764, in __call__
    return self.main(*args, **kwargs)
  File ""/app/.local/lib/python3.7/site-packages/click/core.py"", line 717, in main
    rv = self.invoke(ctx)
  File ""/app/.local/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File ""/app/.local/lib/python3.7/site-packages/click/core.py"", line 956, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File ""/app/.local/lib/python3.7/site-packages/click/core.py"", line 555, in invoke
    return callback(*args, **kwargs)
  File ""/app/.local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 434, in insert
    default=default,
  File ""/app/.local/lib/python3.7/site-packages/sqlite_utils/cli.py"", line 384, in insert_upsert_implementation
    docs, pk=pk, batch_size=batch_size, alter=alter, **extra_kwargs
  File ""/app/.local/lib/python3.7/site-packages/sqlite_utils/db.py"", line 997, in insert_all
    extracts=extracts,
  File ""/app/.local/lib/python3.7/site-packages/sqlite_utils/db.py"", line 618, in create
    extracts=extracts,
  File ""/app/.local/lib/python3.7/site-packages/sqlite_utils/db.py"", line 310, in create_table
    self.conn.execute(sql)
sqlite3.OperationalError: unrecognized token: ""]""

entsoe_2016.csv renamed to txt for uploading compatibility [entsoe_2016.txt](https://github.com/simonw/sqlite-utils/files/4197688/entsoe_2016.txt) code is remixed directly from your https://glitch.com/edit/#!/datasette-csvs repo ",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/86/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 564833696,MDU6SXNzdWU1NjQ4MzM2OTY=,670,Prototype for Datasette on PostgreSQL,9599,simonw,open,0,,,,,14,2020-02-13T17:17:55Z,2023-07-17T02:23:32Z,,OWNER,,"I thought this would never happen, but now that I'm deep in the weeds of running SQLite in production for Datasette Cloud I'm starting to reconsider my policy of only supporting SQLite. Some of the factors making me think PostgreSQL support could be worth the effort:
- Serverless. I'm getting increasingly excited about writable-database use-cases for Datasette. If it could talk to PostgreSQL then users could easily deploy it on Heroku or other serverless providers that can talk to a managed RDS-style PostgreSQL.
- Existing databases. Plenty of organizations have PostgreSQL databases. They can export to SQLite using [db-to-sqlite](https://github.com/simonw/db-to-sqlite) but that's a pretty big barrier to getting started - being able to run `datasette postgresql://connection-string` and start trying it out would be a massively better experience.
- Data size. I keep running into use-cases where I want to run Datasette against many GBs of data. SQLite can do this but PostgreSQL is much more optimized for large data, especially given the existence of tools like Citus.
- Marketing. Convincing people to trust their data to SQLite is potentially a big barrier to adoption. Even if I've convinced myself it's trustworthy I still have to convince everyone else.
- It might not be that hard?
If this required a ground-up rewrite it wouldn't be worth the effort, but I have a hunch that it may not be too hard - most of the SQL in Datasette should work on both databases since it's almost all portable SELECT statements. If Datasette did DML this would be a lot harder, but it doesn't. - Plugins! This feels like a natural surface for a plugin - at which point people could add MySQL support and suchlike in the future. The above reasons feel strong enough to justify a prototype.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/670/reactions"", ""total_count"": 15, ""+1"": 11, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 4, ""rocket"": 0, ""eyes"": 0}",, 565041624,MDU6SXNzdWU1NjUwNDE2MjQ=,671,"datasette.add_database(name, db) and datasette.remove_database(name) methods",9599,simonw,closed,0,,,,,1,2020-02-14T01:05:48Z,2020-02-14T01:30:35Z,2020-02-14T01:30:30Z,OWNER,,"- `datasette.add_database(name, db)` - adds a new named database to the list of connected databases. `db` will be a `Database()` object, which may prove useful in the future for things like #670 and could also allow some plugins to provide in-memory SQLite databases. - `datasette.remove_database(name)` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/417#issuecomment-586047995_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/671/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 565064079,MDExOlB1bGxSZXF1ZXN0Mzc1MTgwODMy,672,--dirs option for scanning directories for SQLite databases,9599,simonw,open,0,,,,,15,2020-02-14T02:25:52Z,2020-03-27T01:03:53Z,,OWNER,simonw/datasette/pulls/672,Refs #417.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/672/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 565518772,MDU6SXNzdWU1NjU1MTg3NzI=,673,Mechanism for checking if a SQLite database file is safe to open,9599,simonw,closed,0,,,,,11,2020-02-14T19:36:04Z,2020-02-14T20:13:59Z,2020-02-14T20:13:59Z,OWNER,,"Opening a SpatiaLite database file without SpatiaLite will result in errors later on. Same for database files which use custom extensions, like the Apple Photos database. I've figured out how to tell if a database is safe to open or not: ```sql select sql from sqlite_master where sql like 'CREATE VIRTUAL TABLE%'; ``` This returns the SQL definitions for virtual tables. The bit after `using` tells you what they need. 
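A quick sketch of extracting those module names with a plain sqlite3 connection - the regex is an approximation, not Datasette's eventual code:

```python
import re
import sqlite3

def virtual_table_modules(conn):
    sql = "select sql from sqlite_master where sql like 'CREATE VIRTUAL TABLE%'"
    modules = set()
    for (ddl,) in conn.execute(sql):
        # The token after USING names the module the table depends on
        match = re.search(r'using\s+([a-zA-Z0-9_]+)', ddl, re.IGNORECASE)
        if match:
            modules.add(match.group(1).lower())
    return modules  # compare against a set of known-supported modules
```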
Run this against a SpatiaLite database and you get the following:

```sql
CREATE VIRTUAL TABLE SpatialIndex USING VirtualSpatialIndex()
CREATE VIRTUAL TABLE ElementaryGeometries USING VirtualElementary()
```

Run it against an Apple Photos `photos.db` file (found with `find ~/Library | grep photos.db`) and you get this (partial list):

```sql
CREATE VIRTUAL TABLE RidList_VirtualReader using RidList_VirtualReaderModule
CREATE VIRTUAL TABLE Array_VirtualReader using Array_VirtualReaderModule
CREATE VIRTUAL TABLE LiGlobals_VirtualBufferReader using VirtualBufferReaderModule
CREATE VIRTUAL TABLE RKPlace_RTree using rtree (modelId,minLongitude,maxLongitude,minLatitude,maxLatitude)
```

For a database with FTS4 you get:

```sql
CREATE VIRTUAL TABLE ""docs_fts"" USING FTS4 (
    [title], [content], content=""docs""
)
```

FTS5:

```sql
CREATE VIRTUAL TABLE [FARA_All_Registrants_fts] USING FTS5 (
    [Name], [Address_1], [Address_2], content=[FARA_All_Registrants]
)
```

So I can use this to figure out all of the `using` pieces and then compare them to a list of known supported ones. _Originally posted by @simonw in https://github.com/simonw/datasette/pull/672#issuecomment-586441484_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/673/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 565552217,MDU6SXNzdWU1NjU1NTIyMTc=,674,Rethink how sanity checks work,9599,simonw,closed,0,,,,,5,2020-02-14T20:57:02Z,2020-03-26T17:19:23Z,2020-02-15T17:57:46Z,OWNER,,"If you specify a file to open using `files` or `-i` then Datasette should show a useful error message and fail to start. Files found by scanning a directory #672 should just be skipped. _Split off from comment by @simonw in https://github.com/simonw/datasette/issues/673#issuecomment-586455321_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/674/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 565837965,MDU6SXNzdWU1NjU4Mzc5NjU=,87,Should detect collections.OrderedDict as a regular dictionary,9599,simonw,closed,0,,,,,2,2020-02-16T02:06:34Z,2020-02-16T02:20:59Z,2020-02-16T02:20:59Z,OWNER,,"```
File ""...python3.7/site-packages/sqlite_utils/db.py"", line 292, in create_table
    column_type=COLUMN_TYPE_MAPPING[column_type],
KeyError:
```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/87/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 567902704,MDU6SXNzdWU1Njc5MDI3MDQ=,675,--cp option for datasette publish and datasette package for shipping additional files and directories,141844,aviflax,open,0,,,,,12,2020-02-19T22:55:56Z,2020-12-28T18:49:21Z,,NONE,,"I'm working on integrating Datasette into a documentation-oriented publishing workflow internally in my company, and in order to deploy the Docker image created by `datasette package` I need to add an additional file to the image — in my case, it's a sort of a deployment directive. I've worked out a way to do this after the image has been created, but it's convoluted and brittle. So it'd be excellent if there was an additional option for this command, something like `--copy`.
I’d envision it looking something like:

```shell
$ datasette package --copy /the/source/path:/the/target/path data.db
```

I’d be happy to help design, specify, implement, and test this feature, if you’d be interested. Thanks for the fantastic tools!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/675/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 568091133,MDU6SXNzdWU1NjgwOTExMzM=,676,?_searchmode=raw option for running FTS searches without escaping characters,58088336,tunguyenatwork,closed,0,,,,,9,2020-02-20T06:56:57Z,2020-02-25T05:57:24Z,2020-02-25T05:56:04Z,NONE,,"After version 0.34, I am not able to use wildcards in the `_search` option (or the full text search). It will not return any results unless I specify the whole word for text search. If I use 'match :search || ""*"" ' in the SQL statement then it will work as expected.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/676/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 569237568,MDU6SXNzdWU1NjkyMzc1Njg=,677,The first time you click sort by ID it should show you results in reverse order,9599,simonw,closed,0,,,,,1,2020-02-21T23:38:50Z,2020-03-21T23:57:46Z,2020-03-21T23:57:46Z,OWNER,,"e.g. on https://latest.datasette.io/fixtures/roadside_attractions Clicking the ""pk"" column header doesn't actually do anything - it sorts by pk asc but since the page was already sorted like that nothing useful changes. The first click on a primary key column that the page is already implicitly sorted by should instead enable sort descending on that column.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/677/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 569253072,MDU6SXNzdWU1NjkyNTMwNzI=,678,prepare_connection() plugin hook should accept optional datasette argument,9599,simonw,closed,0,,,,,3,2020-02-22T00:50:26Z,2020-02-22T03:53:19Z,2020-02-22T02:28:51Z,OWNER,,"I want to build a plugin that allows users to configure certain database columns to be ""masked"" - so the `password` column on a users table is never revealed, for example. To do this, I need to use the `conn.set_authorizer()` SQLite mechanism. So the plugin needs to build off the `prepare_connection(conn)` hook.
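A sketch of that masking idea using sqlite3's authorizer callback - the users/password table and column names are illustrative only:

```python
import sqlite3

def mask_password(action, table, column, db_name, trigger):
    # SQLITE_IGNORE makes reads of this column come back as NULL
    if action == sqlite3.SQLITE_READ and (table, column) == ('users', 'password'):
        return sqlite3.SQLITE_IGNORE
    return sqlite3.SQLITE_OK

conn = sqlite3.connect(':memory:')
conn.set_authorizer(mask_password)
```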
But that hook doesn't currently get passed `datasette` so it doesn't have a way of looking up its plugin configuration!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/678/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 569268612,MDU6SXNzdWU1NjkyNjg2MTI=,679,Release 0.36,9599,simonw,closed,0,,,,,2,2020-02-22T02:41:01Z,2020-02-22T03:52:13Z,2020-02-22T03:52:13Z,OWNER,,"I think we have enough changes to warrant a release - and I want to take advantage of the changes to the `prepare_connection()` plugin hook in #678. Changes since 0.35 so far: https://github.com/simonw/datasette/compare/0.35...be2265b0e811d0ac2875c2f748125c17b0f9289e
- [x] Update ecosystem page
- [x] Write release notes
- [x] Ship the release",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/679/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 569275763,MDU6SXNzdWU1NjkyNzU3NjM=,680,Release automation: automate the bit that posts the GitHub release,9599,simonw,closed,0,,,,,5,2020-02-22T03:50:40Z,2020-09-12T18:18:50Z,2020-09-12T18:18:50Z,OWNER,,"The most manual part of [the release process](https://datasette.readthedocs.io/en/stable/contributing.html#release-process) right now is having to post a GitHub release that matches the updated changelog. This is particularly annoying because the changelog is in `.rst` while the GitHub release needs markdown - so I currently manually translate between the two. Having the release script automatically post a GitHub release at the end would be much more convenient.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/680/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 569317377,MDU6SXNzdWU1NjkzMTczNzc=,681,Cache header missing in http-response,2181410,clausjuhl,closed,0,,,,,4,2020-02-22T10:50:45Z,2020-02-24T20:53:57Z,2020-02-24T20:53:56Z,NONE,,"Hi Simon. I need some help with both understanding and adding http-headers. If I call datasette on localhost with --config default_cache_ttl:120 and --cors, I only get the following response-headers:

access-control-allow-origin: *
content-type: text/html; charset=utf-8
date: Sat, 22 Feb 2020 10:32:15 GMT
referrer-policy: no-referrer
server: uvicorn
transfer-encoding: chunked

CORS works, but no caching header is set. Same thing happens if I use the command in a Dockerfile and run datasette with docker. Second, how can one add headers to uvicorn? I've tried to add uvicorn commands to the Dockerfile, before the final datasette command, but it doesn't work. Is there any way to add headers to the uvicorn.run() command in datasette?
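On the "how can one add headers" question, the usual answer in an ASGI stack is a wrapping middleware. A generic sketch of the standard ASGI pattern - not a datasette- or uvicorn-specific API:

```python
def add_headers(app, extra_headers):
    # Wrap an ASGI app so every HTTP response carries the extra headers
    async def wrapped(scope, receive, send):
        async def send_wrapper(event):
            if event['type'] == 'http.response.start':
                event['headers'] = list(event.get('headers', [])) + [
                    (name.encode(), value.encode()) for name, value in extra_headers
                ]
            await send(event)
        await app(scope, receive, send_wrapper)
    return wrapped

# e.g. app = add_headers(app, [('x-content-type-options', 'nosniff')])
```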
In particular, I would like to add some of the missing security-headers: Thank you for a great product!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/681/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 569613563,MDU6SXNzdWU1Njk2MTM1NjM=,682,Mechanism for writing to database via a queue,9599,simonw,closed,0,,,,,10,2020-02-24T03:10:07Z,2020-02-25T04:45:10Z,2020-02-25T04:45:10Z,OWNER,,"I've been mulling this over for a long time, and I have a new approach that I think is worth exploring. The catch with writing to SQLite is that it should only accept one write at a time. I'm now thinking that an easy way to manage that would be with a write queue for each database which is then read by a single dedicated write thread which manages its own writable connection.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/682/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 570101428,MDExOlB1bGxSZXF1ZXN0Mzc5MTkyMjU4,683,.execute_write() and .execute_write_fn() methods on Database,9599,simonw,closed,0,,,3268330,Datasette 1.0,14,2020-02-24T19:51:58Z,2020-05-30T18:40:20Z,2020-02-25T04:45:08Z,OWNER,simonw/datasette/pulls/683,"See #682
- [x] Come up with design for `.execute_write()` and `.execute_write_fn()`
- [x] Build some quick demo plugins to exercise the design
- [x] Write some unit tests
- [x] Write the documentation",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/683/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 570301333,MDU6SXNzdWU1NzAzMDEzMzM=,684,Add documentation on Database introspection methods to internals.rst,9599,simonw,closed,0,,,3268330,Datasette 1.0,4,2020-02-25T04:20:24Z,2020-06-04T18:56:15Z,2020-05-30T18:40:39Z,OWNER,,`internals.rst` will be landing as part of #683,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/684/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 570309546,MDU6SXNzdWU1NzAzMDk1NDY=,685,Document (and reconsider design of) Database.execute() and Database.execute_against_connection_in_thread(),9599,simonw,closed,0,,,3268330,Datasette 1.0,15,2020-02-25T04:49:44Z,2020-05-30T13:20:50Z,2020-05-08T17:42:18Z,OWNER,,"In #683 I started a new section of internals documentation covering the `Database` class: https://datasette.readthedocs.io/en/latest/internals.html#database-class I decided not to document `.execute()` and `.execute_against_connection_in_thread()` yet because I'm not 100% happy with their API design yet.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/685/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 570327466,MDExOlB1bGxSZXF1ZXN0Mzc5Mzc4Nzgw,686,?_searchmode=raw option,9599,simonw,closed,0,,,,,0,2020-02-25T05:45:50Z,2020-02-25T05:56:09Z,2020-02-25T05:56:04Z,OWNER,simonw/datasette/pulls/686,Closes #676,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/686/reactions"",
""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 571805300,MDU6SXNzdWU1NzE4MDUzMDA=,88,"table.disable_fts() method and ""sqlite-utils disable-fts ..."" command",9599,simonw,closed,0,,,,,5,2020-02-27T04:00:50Z,2020-02-27T04:40:44Z,2020-02-27T04:40:44Z,OWNER,,This would make it easier to iterate on the FTS configuration for a database without having to wipe and recreate the database each time.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/88/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 572896293,MDU6SXNzdWU1NzI4OTYyOTM=,687,Expand plugins documentation to multiple pages,9599,simonw,closed,0,,,5533512,Datasette 0.45,11,2020-02-28T17:26:21Z,2020-06-22T03:55:20Z,2020-06-22T03:53:54Z,OWNER,,"I think the plugins docs need to extend beyond a single page now. I want to add a whole section on writing tests for plugins, showing how `httpx` can be used as seen in https://github.com/simonw/datasette-atom/issues/3 and suchlike.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/687/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 573088799,MDExOlB1bGxSZXF1ZXN0MzgxNjY2Nzc3,688,Don't count rows on homepage for DBs > 100MB,9599,simonw,closed,0,,,,,0,2020-02-29T01:01:06Z,2020-02-29T01:08:30Z,2020-02-29T01:08:29Z,OWNER,simonw/datasette/pulls/688,Closes #649.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/688/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 573578548,MDU6SXNzdWU1NzM1Nzg1NDg=,89,Ability to customize columns used by extracts= feature,9599,simonw,open,0,,,,,3,2020-03-01T16:54:48Z,2020-10-16T19:17:50Z,,OWNER,,"@simonw any thoughts on allow extracts to specify the lookup column name? If I'm understanding the documentation right, `.lookup()` allows you to define the ""value"" column (the documentation uses name), but when you use `extracts` keyword as part of `.insert()`, `.upsert()` etc. the lookup must be done against a column named ""value"". I have an existing lookup table that I've populated with columns ""id"" and ""name"" as opposed to ""id"" and ""value"", and seems I can't use `extracts=`, unless I'm missing something... Initial thought on how to do this would be to allow the dictionary value to be a tuple of table name column pair... so: ``` table = db.table(""trees"", extracts={""species_id"": (""Species"", ""name""}) ``` I haven't dug too much into the existing code yet, but does this make sense? Worth doing? _Originally posted by @chrishas35 in https://github.com/simonw/sqlite-utils/issues/46#issuecomment-592999503_",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/89/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 573583971,MDU6SXNzdWU1NzM1ODM5NzE=,689,"""Templates considered"" comment broken in >=0.35",35075,chrishas35,closed,0,,,,,6,2020-03-01T17:31:21Z,2020-04-05T19:39:44Z,2020-04-05T19:39:44Z,NONE,,"Noticed that the ""Templates Considered"" comment is missing in 0.37. 
Believe I traced it back to #664 as you can see it in https://v0-34.datasette.io/ but not https://v0-35.datasette.io/. Looking at the template context debug between the two you can see what is missing from 0.35 vs. 0.34: ```diff < ""datasette_version"": ""0.34"", < ""app_css_hash"": ""ffa51a"", < ""select_templates"": [ < ""*index.html"" < ], < ""zip"": """", < ""body_scripts"": [], < ""extra_css_urls"": """", < ""extra_js_urls"": """", < ""format_bytes"": """", < ""database_url"": "">"", < ""database_color"": "">"" --- > ""datasette_version"": ""0.35"", > ""database_url"": "">"", > ""database_color"": "">"" ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/689/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 573740712,MDU6SXNzdWU1NzM3NDA3MTI=,90,Cannot .enable_fts() for columns with spaces in their names,9599,simonw,closed,0,,,,,0,2020-03-02T06:06:03Z,2020-03-02T06:10:49Z,2020-03-02T06:10:49Z,OWNER,,"``` import sqlite_utils db = sqlite_utils.Database(memory=True) db[""test""].insert({""space in name"": ""hello""}) db[""test""].enable_fts([""space in name""]) --------------------------------------------------------------------------- OperationalError Traceback (most recent call last) in ----> 1 db['test'].enable_fts([""space in name""]) /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in enable_fts(self, columns, fts_version, create_triggers) 755 ) 756 self.db.conn.executescript(sql) --> 757 self.populate_fts(columns) 758 759 if create_triggers: /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in populate_fts(self, columns) 787 table=self.name, columns="", "".join(columns) 788 ) --> 789 self.db.conn.executescript(sql) 790 return self 791 OperationalError: near ""in"": syntax error ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/90/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 573755726,MDU6SXNzdWU1NzM3NTU3MjY=,690,Mechanism for plugins to add action menu items for various things,9599,simonw,closed,0,,,6026070,0.51,11,2020-03-02T06:48:36Z,2020-10-30T05:20:43Z,2020-10-30T05:20:42Z,OWNER,,"Now that we have support for plugins that can write I'm seeing all sorts of places where a plugin might need to add UI to the table page. 
Some examples: - `datasette-configure-fts` needs to add a ""configure search for this table"" link - a plugin that lets you render or delete tables needs to add a link or button somewhere - existing plugins like `datasette-vega` and `datasette-cluster-map` already do this with JavaScript The challenge here is that multiple plugins may want to do this, so simply overriding templates and populating named blocks doesn't entirely work as templates may override each other.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/690/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 574021194,MDU6SXNzdWU1NzQwMjExOTQ=,691,--reload should reload server if code in --plugins-dir changes,9599,simonw,open,0,,,,,1,2020-03-02T14:42:21Z,2020-06-14T02:35:17Z,,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/691/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 574035432,MDU6SXNzdWU1NzQwMzU0MzI=,692,is_hidden_table context variable on table.html page,9599,simonw,open,0,,,,,1,2020-03-02T15:03:25Z,2020-03-02T15:03:48Z,,OWNER,,It's useful to know if a table is hidden when rendering that page. `datasette-configure-fts` for example may want to disallow enabling search on hidden tables.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/692/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 574043218,MDU6SXNzdWU1NzQwNDMyMTg=,693,Variables from extra_template_vars() not exposed in _context=1,9599,simonw,closed,0,,,,,3,2020-03-02T15:14:51Z,2020-04-05T19:12:48Z,2020-04-05T19:12:48Z,OWNER,,The `_context=1` debugging mode does not show variables that should have been added to the context by the `extra_template_vars()` plugin hook.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/693/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 576582604,MDU6SXNzdWU1NzY1ODI2MDQ=,694,datasette publish cloudrun --memory option,9599,simonw,closed,0,,,,,8,2020-03-05T22:59:57Z,2020-06-23T17:10:51Z,2020-03-05T23:49:41Z,OWNER,,"Got this error deploying large (603MB) database with Cloud Run ``` X Deploying... Cloud Run error: Container failed to start. Failed to start and then listen on the port defined by the PORT environment variable. Logs for this revision might contain more information. X Creating Revision... Cloud Run error: Container failed to start. Failed to start and then listen on the port defined by the PORT environment variable. Logs for this revision might contain more information. . Routing traffic... ✓ Setting IAM Policy... Deployment failed ERROR: (gcloud.run.deploy) Cloud Run error: Container failed to start. Failed to start and then listen on the port defined by the PORT environment variable. Logs for this revision might contain more information.
```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/694/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 576711589,MDU6SXNzdWU1NzY3MTE1ODk=,695,Update SQLite bundled with Docker container,9599,simonw,closed,0,,,,,7,2020-03-06T05:42:12Z,2020-03-08T23:33:23Z,2020-03-06T06:15:27Z,OWNER,,"It's 3.26.0 at the moment: https://github.com/simonw/datasette/blob/af9cd4ca64652fae262e6f7b5d201f6e0adc989b/Dockerfile#L9-L11 Most recent release is 3.31.1: https://www.sqlite.org/releaselog/3_31_1.html",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/695/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 576722115,MDU6SXNzdWU1NzY3MjIxMTU=,696,Single failing unit test when run inside the Docker image,9599,simonw,closed,0,,,3268330,Datasette 1.0,2,2020-03-06T06:16:36Z,2021-03-29T17:04:19Z,2021-03-07T07:41:18Z,OWNER,,"``` docker run -it -v `pwd`:/mnt datasetteproject/datasette:latest /bin/bash root@0e1928cfdf79:/# cd /mnt root@0e1928cfdf79:/mnt# pip install -e .[test] root@0e1928cfdf79:/mnt# pytest ``` I get one failure! It was for `test_searchable[/fixtures/searchable.json?_search=te*+AND+do*&_searchmode=raw-expected_rows3]` ``` def test_searchable(app_client, path, expected_rows): response = app_client.get(path) > assert expected_rows == response.json[""rows""] E AssertionError: assert [[1, 'barry c...sel', 'puma']] == [] E Left contains 2 more items, first extra item: [1, 'barry cat', 'terry dog', 'panther'] E Full diff: E + [] E - [[1, 'barry cat', 'terry dog', 'panther'], E - [2, 'terry dog', 'sara weasel', 'puma']] ``` _Originally posted by @simonw in https://github.com/simonw/datasette/issues/695#issuecomment-595614469_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/696/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 577302229,MDU6SXNzdWU1NzczMDIyMjk=,91,Enable ordering FTS results by rank,416374,gfrmin,closed,0,,,6079500,3.0,1,2020-03-07T08:43:51Z,2020-11-06T23:53:26Z,2020-11-06T23:53:25Z,NONE,,According to https://www.sqlite.org/fts5.html (not sure about FTS4) results can be sorted by relevance. At the moment results are returned by default by `rowid`. Perhaps a flag can be added to the `search` method?,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/91/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 577578306,MDU6SXNzdWU1Nzc1NzgzMDY=,697,index.html is not reliably loaded from a plugin,9599,simonw,closed,0,,,,,7,2020-03-08T22:37:55Z,2020-03-08T23:33:28Z,2020-03-08T23:11:27Z,OWNER,,"Lots of detail in https://github.com/simonw/datasette-search-all/issues/2 - short version is that I have a plugin with its own `index.html` template and Datasette intermittently fails to load it and uses the default `index.html` that ships with Datasette instead. 
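(Editor's note on #91 a few rows up: with FTS5 the requested ordering is plain SQL, so a flag on the `search` method would roughly translate to the query below. The database and table names here are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect("content.db")
# FTS5 exposes a hidden "rank" column; scores are negative,
# so ascending order returns the most relevant matches first
rows = conn.execute(
    "select rowid, * from docs_fts where docs_fts match ? order by rank",
    ["datasette"],
).fetchall()
```)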
Related: * #689: ""Templates considered"" comment broken in >=0.35 * #693: Variables from extra_template_vars() not exposed in _context=1 (may as well fix this while I'm in there)",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/697/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 578883725,MDU6SXNzdWU1Nzg4ODM3MjU=,17,Command for importing commits,9599,simonw,closed,0,,,,,2,2020-03-10T21:55:12Z,2020-03-11T02:47:37Z,2020-03-11T02:47:37Z,MEMBER,,Using this API: https://api.github.com/repos/dogsheep/github-to-sqlite/commits,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 581339961,MDU6SXNzdWU1ODEzMzk5NjE=,92,.columns_dict doesn't work for all possible column types,9599,simonw,closed,0,,,,,7,2020-03-14T19:30:35Z,2020-03-15T18:37:43Z,2020-03-14T20:04:14Z,OWNER,,"Got this error: ``` File "".../python3.7/site-packages/sqlite_utils/db.py"", line 462, in for column in self.columns KeyError: 'REAL' ``` `.columns_dict` uses `REVERSE_COLUMN_TYPE_MAPPING`: https://github.com/simonw/sqlite-utils/blob/43f1c6ab4e3a6b76531fb6f5447adb83d26f3971/sqlite_utils/db.py#L457-L463 `REVERSE_COLUMN_TYPE_MAPPING` defines `FLOAT`, not `REAL`: https://github.com/simonw/sqlite-utils/blob/43f1c6ab4e3a6b76531fb6f5447adb83d26f3971/sqlite_utils/db.py#L68-L74",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/92/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 581795570,MDU6SXNzdWU1ODE3OTU1NzA=,93,Support more string values for types in .add_column(),9599,simonw,open,0,,,,,0,2020-03-15T19:32:49Z,2020-09-24T20:36:46Z,,OWNER,,"https://sqlite-utils.readthedocs.io/en/2.4.2/python-api.html#adding-columns says: > SQLite types you can specify are ""TEXT"", ""INTEGER"", ""FLOAT"" or ""BLOB"". As discovered in #92 this isn't the right list of values. I should expand this to match https://www.sqlite.org/datatype3.html",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/93/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 582517965,MDU6SXNzdWU1ODI1MTc5NjU=,698,Ability for a canned query to write to the database,9599,simonw,closed,0,,,5512395,Datasette 0.44,26,2020-03-16T18:31:59Z,2020-06-06T19:43:49Z,2020-06-06T19:43:48Z,OWNER,,"Canned queries are currently read-only: https://datasette.readthedocs.io/en/0.38/sql_queries.html#canned-queries Add a `""write"": true` option to their definition in `metadata.json` which turns them into queries that are submitted via POST and send their queries to the write queue.
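(Editor's sketch of how such a definition might look in `metadata.json` - the `"write": true` key is the proposal above; the database, query and table names are invented:

```json
{
    "databases": {
        "guestbook": {
            "queries": {
                "add_message": {
                    "sql": "insert into messages (message) values (:message)",
                    "write": true
                }
            }
        }
    }
}
```)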
Then they can be used as a really quick way to define a writable interface and JSON API!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/698/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 582526961,MDU6SXNzdWU1ODI1MjY5NjE=,699,Authentication (and permissions) as a core concept,9599,simonw,closed,0,,,5512395,Datasette 0.44,40,2020-03-16T18:48:00Z,2020-06-06T19:42:11Z,2020-06-06T19:42:11Z,OWNER,,"Right now Datasette authentication is provided exclusively by plugins: * https://github.com/simonw/datasette-auth-github * https://github.com/simonw/datasette-auth-existing-cookies This is an all-or-nothing approach: either your Datasette instance requires authentication at the top level or it does not. But... as I build new plugins like https://github.com/simonw/datasette-configure-fts and https://github.com/simonw/datasette-edit-tables I increasingly have individual features which should be reserved for logged-in users while still wanting other parts of Datasette to be open to all. This is too much for plugins to own independently of Datasette core. Datasette needs to ship a single ""user is authenticated"" concept (independent of how users actually sign in) so that different plugins can integrate with it.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/699/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 582713554,MDU6SXNzdWU1ODI3MTM1NTQ=,700,Request object utility for handling POST form data,9599,simonw,closed,0,,,,,1,2020-03-17T02:44:59Z,2020-03-17T02:47:50Z,2020-03-17T02:47:50Z,OWNER,,"> This is also going to need me to handle POST form submissions which means I need to be able to parse the form body. I guess that will go in [datasette/utils/asgi.py](https://github.com/simonw/datasette/blob/master/datasette/utils/asgi.py). 
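(Editor's note: for flavour, a minimal sketch of what parsing a form-encoded POST body from an ASGI `receive` channel involves - the helper name is invented, not Datasette's actual API:

```python
from urllib.parse import parse_qsl

async def post_vars(receive):
    # Drain the full request body from the ASGI receive channel
    body = b""
    more_body = True
    while more_body:
        message = await receive()
        body += message.get("body", b"")
        more_body = message.get("more_body", False)
    # Decode application/x-www-form-urlencoded into a dict of values
    return dict(parse_qsl(body.decode("utf-8")))
```)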
_Originally posted by @simonw in https://github.com/simonw/datasette/issues/698#issuecomment-599704264_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/700/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 583970196,MDU6SXNzdWU1ODM5NzAxOTY=,701,Search box CSS doesn't look great on OS X Safari,9599,simonw,closed,0,,,5234079,Datasette 0.39,3,2020-03-18T20:00:52Z,2020-03-24T22:57:18Z,2020-03-24T22:57:18Z,OWNER,," ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/701/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585266763,MDU6SXNzdWU1ODUyNjY3NjM=,34,IndexError running user-timeline command,9599,simonw,closed,0,,,,,2,2020-03-20T18:54:08Z,2020-03-20T19:20:52Z,2020-03-20T19:20:37Z,MEMBER,,"``` $ twitter-to-sqlite user-timeline data.db --screen_name Allen_Joines Traceback (most recent call last): File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/bin/twitter-to-sqlite"", line 11, in load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')() File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 256, in user_timeline utils.save_tweets(db, chunk) File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 289, in save_tweets db[""users""].upsert(user, pk=""id"", alter=True) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1128, in upsert conversions=conversions, File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1157, in upsert_all upsert=True, File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/sqlite_utils/db.py"", line 1096, in insert_all row = list(self.rows_where(""rowid = ?"", [self.last_rowid]))[0] IndexError: list index out of range ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585282212,MDU6SXNzdWU1ODUyODIyMTI=,35,twitter-to-sqlite user-timeline [screen_names] --sql / --attach,9599,simonw,closed,0,,,,,5,2020-03-20T19:26:07Z,2020-03-20T20:17:00Z,2020-03-20T20:16:35Z,MEMBER,,Split from 
#8.,206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/35/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585306847,MDU6SXNzdWU1ODUzMDY4NDc=,36,twitter-to-sqlite followers/friends --sql / --attach,9599,simonw,closed,0,,,,,0,2020-03-20T20:20:33Z,2020-03-20T23:12:38Z,2020-03-20T23:12:38Z,MEMBER,,"Split from #8. The `friends` and `followers` commands don't yet support `--sql` and `--attach`. (`friends-ids` and `followers-ids` do though).",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585353598,MDU6SXNzdWU1ODUzNTM1OTg=,37,"Handle ""User not found"" error",9599,simonw,closed,0,,,,,3,2020-03-20T22:14:32Z,2020-04-17T23:43:46Z,2020-04-17T23:43:46Z,MEMBER,,"While running `user-timeline` I got this bug (because a screen name I asked for didn't exist): ``` File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 185, in transform_user user[""created_at""] = parser.parse(user[""created_at""]) KeyError: 'created_at' >>> import pdb >>> pdb.pm() > /Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py(185)transform_user() -> user[""created_at""] = parser.parse(user[""created_at""]) (Pdb) user {'errors': [{'code': 50, 'message': 'User not found.'}]} ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585359363,MDU6SXNzdWU1ODUzNTkzNjM=,38,Screen name display for user-timeline is uneven,9599,simonw,closed,0,,,,,1,2020-03-20T22:30:23Z,2020-03-20T22:37:17Z,2020-03-20T22:37:17Z,MEMBER,,"``` CDPHE [####################################] 67 CHFSKy [####################################] 3216 DHSWI [####################################] 41 DPHHSMT [####################################] 742 Delaware_DHSS [####################################] 3231 DhhsNevada [####################################] 639 ``` I could format them to match the length of the longest screen name instead.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585390482,MDU6SXNzdWU1ODUzOTA0ODI=,702,Option in metadata.json to set default sort order for a table,9599,simonw,closed,0,,,5234079,Datasette 0.39,5,2020-03-21T00:19:56Z,2020-03-25T04:19:36Z,2020-03-22T02:40:35Z,OWNER,,If you access the table page without any `?_sort` or `?_sort_desc` arguments it currently defaults to order by primary key - would be neat to be able to change that.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/702/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585411547,MDU6SXNzdWU1ODU0MTE1NDc=,18,Commits in GitHub API can have null 
author,9599,simonw,closed,0,,,5225818,1.0,8,2020-03-21T02:20:56Z,2020-03-23T20:44:49Z,2020-03-23T20:44:26Z,MEMBER,,"``` Traceback (most recent call last): File ""/home/ubuntu/datasette-venv/bin/github-to-sqlite"", line 8, in sys.exit(cli()) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/github_to_sqlite/cli.py"", line 235, in commits utils.save_commits(db, commits, repo_full[""id""]) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 290, in save_commits commit_to_insert[""author""] = save_user(db, commit[""author""]) File ""/home/ubuntu/datasette-venv/lib/python3.6/site-packages/github_to_sqlite/utils.py"", line 54, in save_user for key, value in user.items() AttributeError: 'NoneType' object has no attribute 'items' ``` Got this running the `commits` command from cron.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585526292,MDU6SXNzdWU1ODU1MjYyOTI=,1,Set up full text search,9599,simonw,closed,0,,,,,1,2020-03-21T15:57:35Z,2020-03-21T19:47:46Z,2020-03-21T19:45:52Z,MEMBER,,"Should run against `title` and `text` in `items`, and `about` and `id` in `users`.",248903544,hacker-news-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/hacker-news-to-sqlite/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585597133,MDExOlB1bGxSZXF1ZXN0MzkxOTI0NTA5,703,WIP implementation of writable canned queries,9599,simonw,closed,0,,,,,3,2020-03-21T22:23:51Z,2020-06-03T00:08:14Z,2020-06-02T23:57:35Z,OWNER,simonw/datasette/pulls/703,Refs #698.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/703/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",1, 585597329,MDU6SXNzdWU1ODU1OTczMjk=,704,Add datasette-publish-fly to Datasette Publish documentation,9599,simonw,closed,0,,,5234079,Datasette 0.39,1,2020-03-21T22:25:10Z,2020-03-24T22:39:09Z,2020-03-24T22:39:09Z,OWNER,,It's a cool example of a plugin that provides a new publish provider - worth mentioning on https://datasette.readthedocs.io/en/stable/publish.html,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/704/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585626199,MDU6SXNzdWU1ODU2MjYxOTk=,705,latest.datasette.io is no longer updating,9599,simonw,closed,0,,,5234079,Datasette 
0.39,15,2020-03-22T01:59:30Z,2020-03-25T02:30:24Z,2020-03-25T02:30:24Z,OWNER,,https://latest.datasette.io/-/versions is stuck on 0.35.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/705/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585633142,MDU6SXNzdWU1ODU2MzMxNDI=,706,"Documentation for the ""request"" object",9599,simonw,closed,0,,,3268330,Datasette 1.0,6,2020-03-22T02:55:50Z,2020-05-30T13:20:00Z,2020-05-27T22:31:22Z,OWNER,,"Since that object is passed to the `extra_template_vars` hooks AND the classes registered by `register_facet_classes` it should be part of the documented interface on https://datasette.readthedocs.io/en/stable/internals.html I could also start passing it to the `register_output_renderer` callback.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/706/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 585850715,MDU6SXNzdWU1ODU4NTA3MTU=,19,"Enable full-text search for more stuff (like commits, issues and issue_comments)",9599,simonw,closed,0,,,5225818,1.0,2,2020-03-23T00:19:56Z,2020-03-23T19:06:39Z,2020-03-23T19:06:39Z,MEMBER,,Currently FTS is only enabled for repos and releases.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586454513,MDU6SXNzdWU1ODY0NTQ1MTM=,20,Upgrade to sqlite-utils 2.x,9599,simonw,closed,0,,,5225818,1.0,0,2020-03-23T19:17:58Z,2020-03-23T19:22:52Z,2020-03-23T19:22:52Z,MEMBER,,,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586477757,MDU6SXNzdWU1ODY0Nzc3NTc=,94,"If column data is a mixture of integers and nulls, detected type should be INTEGER",9599,simonw,closed,0,,,,,0,2020-03-23T19:51:46Z,2020-03-23T19:57:10Z,2020-03-23T19:57:10Z,OWNER,,It looks like detected type for that case is TEXT at the moment.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/94/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586486367,MDU6SXNzdWU1ODY0ODYzNjc=,95,Columns with only null values are no longer created in the database,9599,simonw,closed,0,,,,,0,2020-03-23T20:07:42Z,2020-03-23T20:31:15Z,2020-03-23T20:31:15Z,OWNER,,"Bug introduced in #94, and released in `2.4.3`.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/95/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586561727,MDU6SXNzdWU1ODY1NjE3Mjc=,21,Turn GitHub API errors into exceptions,9599,simonw,closed,0,,,5225818,1.0,2,2020-03-23T22:37:24Z,2020-03-23T23:48:23Z,2020-03-23T23:48:22Z,MEMBER,,"This would have really helped in debugging the mess in #13. 
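(Editor's note: the shape of the check is simple enough - a hedged sketch, with invented names; the GitHub API signals errors by returning a JSON object with a `message` key instead of the expected payload:

```python
class GitHubError(Exception):
    def __init__(self, payload):
        super().__init__(payload.get("message"))
        self.payload = payload

def raise_for_github_error(data):
    # Error responses are dicts carrying "message" (and often a
    # "documentation_url") rather than the expected list/object
    if isinstance(data, dict) and "message" in data:
        raise GitHubError(data)
    return data
```)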
Running with this `auth.json` is a useful demo: ```json {""github_personal_token"": """"} ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586567379,MDU6SXNzdWU1ODY1NjczNzk=,22,Handle empty git repositories,9599,simonw,closed,0,,,,,0,2020-03-23T22:49:48Z,2020-03-23T23:13:11Z,2020-03-23T23:13:11Z,MEMBER,,"Got this error: ``` github_to_sqlite.utils.GitHubError: {'message': 'Git Repository is empty.', 'documentation_url': 'https://developer.github.com/v3/repos/commits/#list-commits-on-a-repository'} ``` From https://api.github.com/repos/dogsheep/beta/commits",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 586595839,MDU6SXNzdWU1ODY1OTU4Mzk=,23,Release 1.0,9599,simonw,closed,0,,,5225818,1.0,1,2020-03-24T00:03:55Z,2020-03-24T00:15:50Z,2020-03-24T00:15:50Z,MEMBER,,Need to compile release notes.,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 587222354,MDU6SXNzdWU1ODcyMjIzNTQ=,707,"Consider configuring Jinja in Datasette() constructor, not .app()",9599,simonw,closed,0,,,,,0,2020-03-24T19:19:58Z,2020-03-27T01:12:57Z,2020-03-27T01:12:57Z,OWNER,,"Right now the following fails with an error: ```python ds = Datasette([], template_dir=""."") rendered = await ds.render_template(""index.html"") ``` The error is: ``` async def render_template( self, templates, context=None, request=None, view_name=None ): context = context or {} if isinstance(templates, Template): template = templates select_templates = [] else: if isinstance(templates, str): templates = [templates] > template = self.jinja_env.select_template(templates) E AttributeError: 'Datasette' object has no attribute 'jinja_env' ``` This is because `jinja_env` is configured in the `.app()` method, here: https://github.com/simonw/datasette/blob/a498d0fe6590f9bdbc4faf9e0dd5faeb3b06002c/datasette/app.py#L609-L633 This is a little surprising, especially now that `.render_template()` is part of the documented internals API: https://datasette.readthedocs.io/en/stable/internals.html#render-template-template-context-none-request-none Maybe this should happen in the Datasette class constructor instead.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/707/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 587302139,MDExOlB1bGxSZXF1ZXN0MzkzMjc0NDMz,708,"base_url configuration setting, refs #394",9599,simonw,closed,0,,,5234079,Datasette 0.39,2,2020-03-24T21:52:00Z,2020-03-25T00:18:44Z,2020-03-25T00:18:44Z,OWNER,simonw/datasette/pulls/708,Pull request implementing #394,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/708/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 587314002,MDU6SXNzdWU1ODczMTQwMDI=,709,Each plugin hook 
should link to example plugins built with it,9599,simonw,closed,0,,,5234079,Datasette 0.39,1,2020-03-24T22:18:48Z,2020-03-24T22:30:10Z,2020-03-24T22:29:43Z,OWNER,,,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/709/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 587322443,MDU6SXNzdWU1ODczMjI0NDM=,710,Remove Zeit Now v1 support,9599,simonw,closed,0,,,,,2,2020-03-24T22:39:49Z,2020-04-04T23:05:12Z,2020-04-04T23:05:12Z,OWNER,,It will remain supported as a plugin but since no-one can sign up for Docker hosting any more (for over a year now) there's no point including it in Datasette core.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/710/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 587398703,MDU6SXNzdWU1ODczOTg3MDM=,711,Release notes for Datasette 0.39,9599,simonw,closed,0,,,5234079,Datasette 0.39,2,2020-03-25T02:31:13Z,2020-03-25T04:06:55Z,2020-03-25T04:06:55Z,OWNER,,Then I can ship it.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/711/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 588108428,MDU6SXNzdWU1ODgxMDg0Mjg=,712,base_url doesn't entirely work for running Datasette inside Binder,9599,simonw,closed,0,,,,,12,2020-03-26T02:25:55Z,2020-03-26T15:11:49Z,2020-03-26T14:35:43Z,OWNER,,"> Thanks! I'm trying to launch Datasette from *within* a notebook using the jupyter-server-proxy and the new `base_url` parameter. While the assets load ok, and the breadcrumb navigation works, the facet links don't seem to use the `base_url`. Or have I missed something? 
_Originally posted by @wragge in https://github.com/simonw/datasette/issues/394#issuecomment-604166918_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/712/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 589402939,MDU6SXNzdWU1ODk0MDI5Mzk=,4,"Store authentication information as ""pocket_access_token"" etc",9599,simonw,closed,0,,,,,0,2020-03-27T20:43:22Z,2020-03-27T20:43:59Z,2020-03-27T20:43:59Z,MEMBER,,The `pocket_` prefix will mean that the same `auth.json` file can be used for other Dogsheep tools without Pocket over-riding a value set by some other tool.,213286752,pocket-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/pocket-to-sqlite/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 589491711,MDU6SXNzdWU1ODk0OTE3MTE=,7,Upgrade to sqlite-utils 2.x,9599,simonw,closed,0,,,,,0,2020-03-28T02:24:51Z,2020-03-28T02:25:03Z,2020-03-28T02:25:03Z,MEMBER,,,205429375,swarm-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/swarm-to-sqlite/issues/7/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 589801352,MDExOlB1bGxSZXF1ZXN0Mzk1MjU4Njg3,96,Add type conversion for Pandas Timestamp,32605365,b0b5h4rp13,closed,0,,,,,2,2020-03-29T14:13:09Z,2020-03-31T04:40:49Z,2020-03-31T04:40:48Z,CONTRIBUTOR,simonw/sqlite-utils/pulls/96,"Add type conversion for Pandas Timestamp, if the Pandas library is present on the system (thanks for this project, I was about to do the same thing from scratch)",140912432,sqlite-utils,pull,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/96/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 590666760,MDU6SXNzdWU1OTA2NjY3NjA=,39,--since feature can be confused by retweets,9599,simonw,closed,0,,,,,11,2020-03-30T23:25:33Z,2020-04-01T03:45:16Z,2020-04-01T03:45:16Z,MEMBER,,"If you run `twitter-to-sqlite user-timeline ... --since` it's supposed to fetch the Tweets those specific users tweeted since last time the command was run. It does this by seeking out the max ID of their previous tweets: https://github.com/dogsheep/twitter-to-sqlite/blob/810cb2af5a175837204389fd7f4b5721f8b325ab/twitter_to_sqlite/cli.py#L305-L311 BUT...
this has a nasty flaw: if another account had retweeted one of their recent tweets the retweeted-tweet will have been loaded into the database - so we may treat that as the most recent since ID and miss a bunch of their tweets!",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 590669793,MDU6SXNzdWU1OTA2Njk3OTM=,40,Feature: record history of follower counts,9599,simonw,closed,0,,,,,5,2020-03-30T23:32:28Z,2020-04-01T04:13:05Z,2020-04-01T04:13:05Z,MEMBER,,"We currently over-write the follower count every time we import a tweet (when we import that user profile again): https://github.com/dogsheep/twitter-to-sqlite/blob/810cb2af5a175837204389fd7f4b5721f8b325ab/twitter_to_sqlite/utils.py#L293-L294 It would be neat if we noticed if that user's follower count (and maybe other counts?) had changed since we last saved them and recorded that change in a separate history table. This would be an inexpensive way of building up rough charts of follower count over time.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/40/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 591613579,MDU6SXNzdWU1OTE2MTM1Nzk=,41,"Bug: recorded a since_id for None, None",9599,simonw,closed,0,,,,,0,2020-04-01T04:29:43Z,2020-04-01T04:31:11Z,2020-04-01T04:31:11Z,MEMBER,,"This shouldn't happen in the `since_ids` table (relates to #39): ",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/41/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 592829135,MDU6SXNzdWU1OTI4MjkxMzU=,713,Support YAML in metadata - metadata.yaml,9599,simonw,closed,0,,,,,6,2020-04-02T18:10:05Z,2020-04-02T19:36:17Z,2020-04-02T19:30:55Z,OWNER,,"I was originally going to do this with a plugin - see #357 - but the more I work with `metadata.json` the more I want it to just accept YAML as an optional alternative to JSON. The best example why is still this one: https://github.com/simonw/russian-ira-facebook-ads-datasette/blob/master/russian-ads-metadata.yaml YAML is just SO much better than JSON for multi-line strings - in particular HTML and SQL, both of which are common in `metadata.json` files.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/713/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 592844348,MDExOlB1bGxSZXF1ZXN0Mzk3NzQ5NjUz,714,--metadata accepts YAML as well as JSON,9599,simonw,closed,0,,,,,1,2020-04-02T18:36:02Z,2020-04-02T19:30:54Z,2020-04-02T19:30:54Z,OWNER,simonw/datasette/pulls/714,Refs #713. 
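(Editor's note: to illustrate the YAML appeal described in #713 above, multi-line SQL is far more readable with a block scalar - database, query and column names here are invented:

```yaml
databases:
  ads:
    queries:
      by_spend:
        sql: |-
          select ad_id, spend_usd
          from ads
          order by spend_usd desc
```)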
Still needs tests and documentation.,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/714/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 593006814,MDU6SXNzdWU1OTMwMDY4MTQ=,715,Refactor duplicate cell display logic,9599,simonw,open,0,,,,,0,2020-04-03T00:58:11Z,2020-04-03T00:58:11Z,,OWNER,,"The logic for rendering cells in table view and in database (or canned query) view is currently very similar: https://github.com/simonw/datasette/blob/7656fd64d8b6a32ebc34d89c1b8711cc5ea240f7/datasette/views/base.py#L514-L539 Compared with: https://github.com/simonw/datasette/blob/7656fd64d8b6a32ebc34d89c1b8711cc5ea240f7/datasette/views/table.py#L104-L195 I'll be changing this a bit in #698 but I should still try to clean this up more further in the future.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/715/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 593751293,MDU6SXNzdWU1OTM3NTEyOTM=,97,"Adding a ""recreate"" flag to the `Database` constructor",1448859,betatim,closed,0,,,,,4,2020-04-04T05:41:10Z,2020-04-15T14:29:31Z,2020-04-13T03:52:29Z,NONE,,"I have a [script](https://github.com/betatim/binder-datasette/blob/master/create-db.ipynb) that imports data into a sqlite DB. When I re-run that script I'd like to remove the existing sqlite DB, instead of adding to it. The pragmatic answer is to add the check and file deletion to my script. However I thought it would be easy and useful for others to add a `recreate=True` flag to `db = sqlite_utils.Database(""binder-launches.db"")`. After taking a look at the code for it I am not so sure any more. This is because the connection string could be a URL (or ""connection string"") like `""file:///tmp/foo.db""`. I don't know what the equivalent of `os.path.exists()` is for a connection string or how to detect that something is a connection string and raise an error ""can't use recreate=True and conn_string at the same time"". 
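(Editor's note: for the plain-filesystem-path case the check itself is tiny - a sketch only, with an invented helper name, deliberately ignoring the connection-string complication raised above:

```python
import os
import sqlite_utils

def fresh_database(path):
    # Only safe for plain file paths, not "file:..." connection strings
    if os.path.exists(path):
        os.remove(path)
    return sqlite_utils.Database(path)

db = fresh_database("binder-launches.db")
```)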
Does anyone have an idea/suggestion where to start investigating?",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/97/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 594168758,MDU6SXNzdWU1OTQxNjg3NTg=,716,extra_template_vars() sending wrong view_name for index,9599,simonw,closed,0,,,,,8,2020-04-04T23:57:09Z,2020-04-05T20:04:08Z,2020-04-05T18:28:48Z,OWNER,,"See https://github.com/simonw/museums/issues/20#issuecomment-609103663 - at some point between 286ed286b68793532c2a38436a08343b45cfbc91 and current master (e0e7a0facfc935a835cd73c720bc46661462f0b1 today) a bug was introduced where the `extra_template_vars(request, view_name)` plugin hook started being passed `None` instead of `index` for the `view_name` parameter on the site index page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/716/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 594189527,MDU6SXNzdWU1OTQxODk1Mjc=,717,See if I can get Datasette working on Zeit Now v2,9599,simonw,closed,0,,,,,10,2020-04-05T00:56:48Z,2020-04-06T22:47:22Z,2020-04-06T22:47:21Z,OWNER,,"I thought this was impossible because AWS Lambda doesn't ship the `sqlite3` standard library module... but apparently that's not the case on Now v2 any more! https://now-2-python-versions-ks69olzpi.now.sh/api ```
 ____________________________________________________________________________________________________________
/ Hello from Python from a ZEIT Now Serverless Function! Version is 3.6.10 (default, Mar 10 2020, 22:54:43)  \
\ [GCC 4.8.3 20140911 (Red Hat 4.8.3-9)], sqlite3 module = , sqlite3 version = [('3.7.17',)]                 /
 ------------------------------------------------------------------------------------------------------------
        \   ^__^
         \  (oo)\_______
            (__)\       )\/\
                ||----w |
                ||     ||
``` That's from shipping this code as `api/index.py`: ```python from http.server import BaseHTTPRequestHandler from cowpy import cow import sys try: import sqlite3 except ImportError: sqlite3 = None class handler(BaseHTTPRequestHandler): def do_GET(self): self.send_response(200) self.send_header(""Content-type"", ""text/plain"") self.end_headers() message = cow.Cowacter().milk( ""Hello from Python from a ZEIT Now Serverless Function!
Version is {}, sqlite3 module = {}, sqlite3 version = {}"".format( sys.version, sqlite3, sqlite3.connect("":memory:"").execute(""select sqlite_version()"").fetchall() ) ) self.wfile.write(message.encode()) return ``` Now v2 supports ASGI so this might be possible without too much work: https://zeit.co/docs/runtimes#advanced-usage/advanced-python-usage/asynchronous-server-gateway-interface",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/717/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 594237015,MDU6SXNzdWU1OTQyMzcwMTU=,718,Plugin idea: datasette-redirects,9599,simonw,open,0,,,,,0,2020-04-05T03:41:38Z,2020-04-05T03:41:38Z,,OWNER,,"I just had to write a one-off custom plugin to redirect niche-musems.com to www.niche-museums.com (https://github.com/simonw/museums/issues/21) - it would be great if this kind of thing could be handled by a configurable plugin. https://github.com/simonw/museums/blob/6b1faf00c463b2228860d4d62d104b11935e01b1/plugins/redirect_www.py",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/718/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 594553553,MDExOlB1bGxSZXF1ZXN0Mzk5MTY2NDMz,719,asgi: check raw_path is not None,193185,cldellow,closed,0,,,,,1,2020-04-05T16:53:58Z,2020-05-04T17:14:26Z,2020-05-04T17:14:26Z,CONTRIBUTOR,simonw/datasette/pulls/719,"The ASGI spec (https://asgi.readthedocs.io/en/latest/specs/www.html#http) seems to imply that `None` is a valid value, so we need to check the value itself, not just whether the key is present. In particular, the [mangum](https://github.com/erm/mangum) adapter passes `None` for this key's value. This change permits mangum to be used to front datasette in Amazon API Gateway + AWS Lambda deployments.",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/719/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 596245802,MDExOlB1bGxSZXF1ZXN0NDAwNTc4OTc5,720,"Update beautifulsoup4 requirement from ~=4.8.1 to >=4.8.1,<4.10.0",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-04-08T01:24:38Z,2020-05-04T17:14:51Z,2020-05-04T17:14:46Z,CONTRIBUTOR,simonw/datasette/pulls/720,"Updates the requirements on [beautifulsoup4](http://www.crummy.com/software/BeautifulSoup/bs4/) to permit the latest version. Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- **Note:** This repo was added to Dependabot recently, so you'll receive a maximum of 5 PRs for your first few update runs. Once an update run creates fewer than 5 PRs we'll remove that limit. You can always request more updates by clicking `Bump now` in your [Dependabot dashboard](https://app.dependabot.com).
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/720/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 596245923,MDExOlB1bGxSZXF1ZXN0NDAwNTc5MDc3,721,"Update pytest requirement from ~=5.2.2 to >=5.2.2,<5.5.0",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-04-08T01:25:04Z,2020-05-04T17:13:49Z,2020-05-04T17:13:41Z,CONTRIBUTOR,simonw/datasette/pulls/721,"Updates the requirements on [pytest](https://github.com/pytest-dev/pytest) to permit the latest version.
Release notes

Sourced from pytest's releases.

5.4.1

pytest 5.4.1 (2020-03-13)

Bug Fixes

  • #6909: Revert the change introduced by #6330, which required all arguments to @pytest.mark.parametrize to be explicitly defined in the function signature.

    The intention of the original change was to remove what was expected to be an unintended/surprising behavior, but it turns out many people relied on it, so the restriction has been reverted.

  • #6910: Fix crash when plugins return an unknown stats while using the --reportlog option.

Commits
  • 3d0f3ba Preparing release version 5.4.1
  • b9e2cd0 Merge pull request #6914 from nicoddemus/revert-6330
  • a84fcbf Revert "[parametrize] enforce explicit argnames declaration (#6330)"
  • 59c1bfa Merge pull request #6913 from nicoddemus/backport-6910
  • 3267f64 Merge pull request #6910 from nicoddemus/resultlog-logreport
  • c9fd1bd Preparing release version 5.4.0
  • 93aa988 Merge pull request #6901 from RonnyPfannschmidt/regendoc-fix-simple
  • 7996724 Merge pull request #6902 from RoyalTS/filterwarnings-docfix
  • 90ee8a7 docfix
  • 378a75d run and fix tox -e regen to prepare 5.4
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`. [//]: # (dependabot-automerge-start) [//]: # (dependabot-automerge-end) --- **Note:** This repo was added to Dependabot recently, so you'll receive a maximum of 5 PRs for your first few update runs. Once an update run creates fewer than 5 PRs we'll remove that limit. You can always request more updates by clicking `Bump now` in your [Dependabot dashboard](https://app.dependabot.com).
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/721/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 596246006,MDExOlB1bGxSZXF1ZXN0NDAwNTc5MTM2,722,"Update jinja2 requirement from ~=2.10.3 to >=2.10.3,<2.12.0",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-04-08T01:25:24Z,2020-05-04T17:13:26Z,2020-05-04T17:13:16Z,CONTRIBUTOR,simonw/datasette/pulls/722,"Updates the requirements on [jinja2](https://github.com/pallets/jinja) to permit the latest version.
Release notes

Sourced from jinja2's releases.

2.11.1

This fixes an issue in async environments when indexing the result of an attribute lookup, like {{ data.items[1:] }}.

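The release note above describes slicing the result of an attribute lookup inside an async template. A minimal sketch of that pattern, assuming a simple object with a plain `items` attribute (the class and values are illustrative):

```
import asyncio
from jinja2 import Environment

class Data:
    # a plain attribute named "items", not dict.items()
    items = [1, 2, 3]

async def main():
    env = Environment(enable_async=True)
    template = env.from_string("{{ data.items[1:] }}")
    # this indexing-after-attribute-lookup failed in async
    # environments before 2.11.1
    print(await template.render_async(data=Data()))

asyncio.run(main())
```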
Changelog

Sourced from jinja2's changelog.

Version 2.11.1

Released 2020-01-30

  • Fix a bug that prevented looking up a key after an attribute ({{ data.items[1:] }}) in an async template. #1141

Version 2.11.0

Released 2020-01-27

  • Drop support for Python 2.6, 3.3, and 3.4. This will be the last version to support Python 2.7 and 3.5.
  • Added a new ChainableUndefined class to support __getitem__ and __getattr__ on an undefined object. #977 (see the sketch after this changelog)
  • Allow {%+ syntax (with NOP behavior) when lstrip_blocks is disabled. #748
  • Added a default parameter for the map filter. #557
  • Exclude environment globals from meta.find_undeclared_variables. #931
  • Float literals can be written with scientific notation, like 2.56e-3. #912, #922
  • Int and float literals can be written with the '_' separator for legibility, like 12_345. #923
  • Fix a bug causing deadlocks in LRUCache.setdefault. #1000
  • The trim filter takes an optional string of characters to trim. #828
  • A new jinja2.ext.debug extension adds a {% debug %} tag to quickly dump the current context and available filters and tests. #174, #798, #983
  • Lexing templates with large amounts of whitespace is much faster. #857, #858
  • Parentheses around comparisons are preserved, so {{ 2 * (3 < 5) }} outputs "2" instead of "False". #755, #938
  • Add new boolean, false, true, integer and float tests. #824
  • The environment's finalize function is only applied to the output of expressions (constant or not), not static template data. #63
  • When providing multiple paths to FileSystemLoader, a template can have the same name as a directory. #821
  • Always return Undefined when omitting the else clause in a {{ 'foo' if bar }} expression, regardless of the environment's undefined class. Omitting the else clause is a valid shortcut and should not raise an error when using StrictUndefined. #710, #1079
  • Fix behavior of loop control variables such as length and revindex0 when looping over a generator. #459, #751, #794, #993
  • Async support is only loaded the first time an environment enables it, in order to avoid a slow initial import. #765
  • In async environments, the map filter will await the filter call if needed. #913
  • In for loops that access loop attributes, the iterator is not advanced ahead of the current iteration unless length, revindex, nextitem, or last are accessed. This makes it less likely to break groupby results. #555, #1101
  • In async environments, the loop attributes length and revindex work for async iterators. #1101
  • In async environments, values from attribute/property access will be awaited if needed. #1101
  • PackageLoader doesn't depend on setuptools or pkg_resources. #970
  • PackageLoader has limited support for PEP 420 namespace packages. #1097
  • Support os.PathLike objects in FileSystemLoader and ModuleLoader. #870
  • NativeTemplate correctly handles quotes between expressions. "'{{ a }}', '{{ b }}'" renders as the tuple ('1', '2') rather than the string '1, 2'. #1020
  • Creating a NativeTemplate directly creates a NativeEnvironment instead of a default Environment. #1091
  • After calling LRUCache.copy(), the copy's queue methods point to the correct queue. #843
  • Compiling templates always writes UTF-8 instead of defaulting to the system encoding. #889
  • The wordwrap filter treats existing newlines as separate paragraphs to be wrapped individually, rather than creating short intermediate lines. #175
  • Add break_on_hyphens parameter to the wordwrap filter. #550
  • Cython compiled functions decorated as context functions will be passed the context. #1108
  • When chained comparisons of constants are evaluated at compile time, the result follows Python's behavior of returning False if any comparison returns False, rather than only the last one. #1102
  • Tracebacks for exceptions in templates show the correct line numbers and source for Python >= 3.7. #1104
  • Tracebacks for template syntax errors in Python 3 no longer show internal compiler frames. #763
  • Add a DerivedContextReference node that can be used by extensions to get the current context and local variables such as loop. #860
  • Constant folding during compilation is applied to some node types that were previously overlooked. #733
  • TemplateSyntaxError.source is not empty when raised from an included template. #457
... (truncated)
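One of the 2.11.0 entries above introduces ChainableUndefined, which lets chained lookups on a missing value render as empty output instead of raising. A minimal sketch (the template string is illustrative):

```
from jinja2 import ChainableUndefined, Environment

env = Environment(undefined=ChainableUndefined)
# with the default Undefined class, "missing.attr" would raise an
# UndefinedError; ChainableUndefined resolves the chain to empty output
print(env.from_string("{{ missing.attr['key'] }}").render())  # prints ""
```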
Commits
  • b85283e release version 2.11.1
  • 3d5bfc6 Merge pull request #1143 from pallets/bugfix/attribute-access
  • d61c1ea add changelog
  • 15d7e61 Added regression test for slicing of attributes
  • 05dee9b Fix attribute access in async code. Fixes #1141
  • bbdafe3 release version 2.11.0
  • 9ff27f6 add python 3.8 classifier, clean up changelog
  • d312609 isolate bytecode cache tests
  • 9849979 import Markup from markupsafe, fix flake8 import warnings
  • c6d864c increment bytecode cache version
  • Additional commits viewable in compare view

",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/722/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 597671518,MDU6SXNzdWU1OTc2NzE1MTg=,98,"Only set .last_rowid and .last_pk for single update/inserts, not for .insert_all()/.upsert_all() with multiple records",9599,simonw,closed,0,,,,,7,2020-04-10T03:19:40Z,2021-09-28T04:38:44Z,2020-04-13T03:29:15Z,OWNER,,,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/98/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 598013965,MDU6SXNzdWU1OTgwMTM5NjU=,724,--plugin-secret over-rides existing metadata.json plugin config,9599,simonw,closed,0,,,,,3,2020-04-10T17:56:30Z,2020-04-16T04:58:12Z,2020-04-10T18:34:21Z,OWNER,,"This means if you use `--plugin-secret` at all (with e.g. `publish cloudrun`) any existing plugin configuration in your `metadata.json` will be ignored. https://github.com/simonw/datasette/blob/af9cd4ca64652fae262e6f7b5d201f6e0adc989b/datasette/publish/cloudrun.py#L98-L109 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/724/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 598640234,MDU6SXNzdWU1OTg2NDAyMzQ=,99,.upsert_all() should maybe error if dictionaries passed to it do not have the same keys,9599,simonw,closed,0,,,,,2,2020-04-13T03:02:25Z,2020-04-13T03:05:20Z,2020-04-13T03:05:04Z,OWNER,,"While investigating #98 I stumbled across this: ``` def test_upsert_compound_primary_key(fresh_db): table = fresh_db[""table""] table.upsert_all( [ {""species"": ""dog"", ""id"": 1, ""name"": ""Cleo"", ""age"": 4}, {""species"": ""cat"", ""id"": 1, ""name"": ""Catbag""}, ], pk=(""species"", ""id""), ) table.upsert_all( [ {""species"": ""dog"", ""id"": 1, ""age"": 5}, {""species"": ""dog"", ""id"": 2, ""name"": ""New Dog"", ""age"": 1}, ], pk=(""species"", ""id""), ) > assert [ {""species"": ""dog"", ""id"": 1, ""name"": ""Cleo"", ""age"": 5}, {""species"": ""cat"", ""id"": 1, ""name"": ""Catbag"", ""age"": None}, {""species"": ""dog"", ""id"": 2, ""name"": ""New Dog"", ""age"": 1}, ] == list(table.rows) E AssertionError: assert [{'age': 5, '...cies': 'dog'}] == [{'age': 5, '...cies': 'dog'}] E At index 0 diff: {'species': 'dog', 'id': 1, 'name': 'Cleo', 'age': 5} != {'species': 'dog', 'id': 1, 'name': None, 'age': 5} E Full diff: E - [{'age': 5, 'id': 1, 'name': 'Cleo', 'species': 'dog'}, E ? ^^^ -- E + [{'age': 5, 'id': 1, 'name': None, 'species': 'dog'}, E ? 
^^^ E {'age': None, 'id': 1, 'name': 'Catbag', 'species': 'cat'}, E {'age': 1, 'id': 2, 'name': 'New Dog', 'species': 'dog'}] ``` If you run `.upsert_all()` with multiple dictionaries it doesn't quite have the effect you might expect.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/99/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 598891570,MDExOlB1bGxSZXF1ZXN0NDAyNjQ1OTg0,725,"Update aiofiles requirement from ~=0.4.0 to >=0.4,<0.6",27856297,dependabot-preview[bot],closed,0,,,,,3,2020-04-13T13:32:47Z,2020-05-04T18:16:54Z,2020-05-04T16:17:49Z,CONTRIBUTOR,simonw/datasette/pulls/725,"Updates the requirements on [aiofiles](https://github.com/Tinche/aiofiles) to permit the latest version.
Commits

",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/725/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 599776345,MDU6SXNzdWU1OTk3NzYzNDU=,24,Feature idea: github-to-sqlite everything ...,9599,simonw,open,0,,,,,0,2020-04-14T18:34:00Z,2020-04-14T18:34:00Z,,MEMBER,,"At the moment if you want to pull all your repos, issues, issues comments etc you have to do it with a sequence of separate commands. Consider adding a `everything` or `all` command which fetches everything that the tool knows how to fetch, and is designed to be run on a cron in a way that fetches just new stuff each time.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/24/reactions"", ""total_count"": 7, ""+1"": 7, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 600120439,MDU6SXNzdWU2MDAxMjA0Mzk=,726,Foreign key : case of a link to the associated row not displayed,6371750,JBPressac,closed,0,,,,,1,2020-04-15T08:31:27Z,2020-04-27T22:05:47Z,2020-04-27T22:05:46Z,CONTRIBUTOR,,"Hello, I use Datasette to publish tsv files linked together by foreign keys declared thanks to sqlite-utils. In one table, [prelib_personne](http://crbc-dataset.huma-num.fr/prelib/prelib_personne), the foreign keys are properly noticed by a link to the associated row (for instance ville_naissance_id is properly linked to prelib_ville). But every link to the foreign key prelib_oeuvre.id fails. For instance, [prelib_ecritoeuvre](http://crbc-dataset.huma-num.fr/prelib/prelib_ecritoeuvre) has links to prelib_personne but none to prelib_oeuvre. In despite of the schema: CREATE TABLE ""prelib_ecritoeuvre"" ( ""id"" INTEGER, ""fonction_id"" INTEGER, ""oeuvre_id"" INTEGER, ""personne_id"" INTEGER ,PRIMARY KEY ([id]), FOREIGN KEY(fonction_id) REFERENCES prelib_fonctionecritoeuvre(id), FOREIGN KEY(personne_id) REFERENCES prelib_personne(id), FOREIGN KEY(oeuvre_id) REFERENCES prelib_oeuvre(id) ); Would you have any clue to investigate the reason of this problem? Thanks,",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/726/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 600583271,MDU6SXNzdWU2MDA1ODMyNzE=,727,Custom CSS class on body for styling canned queries,9599,simonw,closed,0,,,,,5,2020-04-15T20:57:32Z,2020-04-15T21:14:58Z,2020-04-15T21:07:50Z,OWNER,,"https://latest.datasette.io/fixtures/neighborhood_search is a canned query page. One of the templates scanned is `query-fixtures-neighborhood_search.html` BUT... 
the body CSS class just looks like this: ```html ``` It would be useful if that included a class that can be used to style that specific canned query page.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/727/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601265023,MDU6SXNzdWU2MDEyNjUwMjM=,25,Improvements to demo instance,9599,simonw,closed,0,,,,,1,2020-04-16T17:26:55Z,2020-04-16T18:07:12Z,2020-04-16T18:07:12Z,MEMBER,,- [x] Demo should pull issue-comments as well,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601271612,MDU6SXNzdWU2MDEyNzE2MTI=,26,Topics are missing from repositories,9599,simonw,closed,0,,,,,2,2020-04-16T17:36:32Z,2020-04-16T17:41:11Z,2020-04-16T17:41:11Z,MEMBER,,"I'm sure this used to work, but right now repositories are fetched without their topics. https://developer.github.com/v3/repos/ says you need to send a custom `Accept` header of `application/vnd.github.mercy-preview+json` to get topics.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/26/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601330277,MDU6SXNzdWU2MDEzMzAyNzc=,27,Repos have a big blob of JSON in the organization column,9599,simonw,closed,0,,,,,5,2020-04-16T18:43:14Z,2020-04-18T00:19:16Z,2020-04-18T00:18:52Z,MEMBER,,"e.g. https://github-to-sqlite.dogsheep.net/github/repos ![github__repos__11_rows_where_sorted_by_updated_at_descending](https://user-images.githubusercontent.com/9599/79494124-5640b980-7fd7-11ea-99a2-17ffbd82f9ce.png) This appears to be obsolete because the `owner` column already links to that record, albeit in the `users` table with `type` set to `Organization`: https://github-to-sqlite.dogsheep.net/github/users/53015001",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/27/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601333634,MDU6SXNzdWU2MDEzMzM2MzQ=,28,Pull repository contributors,9599,simonw,closed,0,,,,,3,2020-04-16T18:46:40Z,2020-04-18T15:05:10Z,2020-04-18T15:05:10Z,MEMBER,,"https://developer.github.com/v3/repos/#list-contributors `GET /repos/:owner/:repo/contributors` Not sure if this should be a separate command or should be part of the existing `repos` command. 
I'm leaning towards a new `contributors` command.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/28/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601358649,MDU6SXNzdWU2MDEzNTg2NDk=,100,"Mechanism for forcing column-type, over-riding auto-detection",9599,simonw,closed,0,,,,,3,2020-04-16T19:12:52Z,2020-04-17T23:53:32Z,2020-04-17T23:53:32Z,OWNER,,"As seen in https://github.com/dogsheep/github-to-sqlite/issues/27#issuecomment-614843406 - there's a problem where you insert a record with a `None` value for a column and that column is created as `TEXT` - but actually you intended it to be an `INT` (as later examples will demonstrate). Some kind of mechanism for over-riding the detected types of columns would be useful here.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/100/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 601392318,MDU6SXNzdWU2MDEzOTIzMTg=,101,README should include an example of CLI data insertion,9599,simonw,closed,0,,,,,0,2020-04-16T19:45:37Z,2020-04-17T23:59:49Z,2020-04-17T23:59:49Z,OWNER,,Maybe using `curl` from the GitHub API.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/101/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602173589,MDU6SXNzdWU2MDIxNzM1ODk=,42,Error running user-timeline with --sql and --ids together,9599,simonw,closed,0,,,,,0,2020-04-17T19:02:06Z,2020-04-17T23:34:40Z,2020-04-17T23:34:40Z,MEMBER,,"``` $ twitter-to-sqlite user-timeline tweets.db --sql='select id from users' --ids Traceback (most recent call last): File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/bin/twitter-to-sqlite"", line 11, in load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')() File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 764, in __call__ return self.main(*args, **kwargs) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 717, in main rv = self.invoke(ctx) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 1137, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simonw/.local/share/virtualenvs/twitter-to-sqlite-4ech4lJi/lib/python3.7/site-packages/click/core.py"", line 555, in invoke return callback(*args, **kwargs) File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 284, in user_timeline ""@{:"" + str(max(len(identifier) for identifier in identifiers)) + ""}"" File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 284, in ""@{:"" + str(max(len(identifier) for identifier in identifiers)) + ""}"" TypeError: object of type 'int' has no len() ``` But this DID work - casting to strings: ``` $ twitter-to-sqlite user-timeline tweets.db --sql='select """" 
|| id from users' --ids ... this worked ... ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/42/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602176870,MDU6SXNzdWU2MDIxNzY4NzA=,43,"""twitter-to-sqlite lists"" command for retrieving a user's owned lists",9599,simonw,closed,0,,,,,1,2020-04-17T19:08:59Z,2020-04-17T23:48:28Z,2020-04-17T23:30:39Z,MEMBER,,"https://developer.twitter.com/en/docs/accounts-and-users/create-manage-lists/api-reference/get-lists-ownerships `https://api.twitter.com/1.1/lists/ownerships.json `",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/43/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602181581,MDU6SXNzdWU2MDIxODE1ODE=,44,"tweet[""source""] can be an empty string",9599,simonw,closed,0,,,,,0,2020-04-17T19:18:26Z,2020-04-17T22:01:44Z,2020-04-17T22:01:44Z,MEMBER,,"Got this excepion: ``` File ""/Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 641, in extract_and_save_source details = m.groupdict() AttributeError: 'NoneType' object has no attribute 'groupdict' ``` I traced it back to this tweet: https://twitter.com/osder/status/578712651393576960 ``` (Pdb) source_re re.compile('.*?)"".*?>(?P.*?)') (Pdb) locals()['source'] '' (Pdb) u > /Users/simonw/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py(393)save_tweets() -> tweet[""source""] = extract_and_save_source(db, tweet[""source""]) (Pdb) tweet {'created_at': '2015-03-20T00:20:22+00:00', 'id': 578712651393576960, 'full_text': '@osder', 'truncated': False, 'display_text_range': [0, 6], 'source': '', 'in_reply_to_status_id': 578712521382715392, 'in_reply_to_user_id': 1545741, 'in_reply_to_screen_name': 'osder', 'geo': None, 'coordinates': None, 'place': None, 'contributors': None, 'is_quote_status': False, 'retweet_count': 0, 'favorite_count': 0, 'favorited': False, 'retweeted': False, 'lang': 'und', 'user': 1545741} ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/44/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602533300,MDU6SXNzdWU2MDI1MzMzMDA=,1,Import photo metadata from Apple Photos into SQLite,9599,simonw,open,0,,,5324096,Apple Photos online and securely browsable,8,2020-04-18T19:23:26Z,2020-05-04T02:41:40Z,,MEMBER,,"Faces, albums, locations, that kind of thing.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/1/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 602533352,MDU6SXNzdWU2MDI1MzMzNTI=,2,Ability to convert HEIC images to JPEG,9599,simonw,closed,0,,,5324096,Apple Photos online and securely browsable,1,2020-04-18T19:23:43Z,2020-04-28T16:47:21Z,2020-04-28T16:47:21Z,MEMBER,,,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/2/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602533481,MDU6SXNzdWU2MDI1MzM0ODE=,3,"Import EXIF 
data into SQLite - lens used, ISO, aperture etc",9599,simonw,open,0,,,5324096,Apple Photos online and securely browsable,2,2020-04-18T19:24:31Z,2021-10-05T12:38:24Z,,MEMBER,,,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/3/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 602533539,MDU6SXNzdWU2MDI1MzM1Mzk=,4,Upload all my photos to a secure S3 bucket,9599,simonw,closed,0,,,5324096,Apple Photos online and securely browsable,14,2020-04-18T19:24:50Z,2020-04-18T21:58:11Z,2020-04-18T21:57:13Z,MEMBER,,"- [x] Create a bucket with bucket credentials - [x] Programmatically upload some recent photos to it (from a notebook) - [x] Turn this into a script",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/4/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602551638,MDU6SXNzdWU2MDI1NTE2Mzg=,5,photos-to-sqlite s3-auth command,9599,simonw,closed,0,,,,,1,2020-04-18T21:05:25Z,2020-04-18T21:08:44Z,2020-04-18T21:08:44Z,MEMBER,,Modeled on `github-to-sqlite auth` - prompts the user for their S3 credentials and saves them to `auth.json`.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/5/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602569315,MDU6SXNzdWU2MDI1NjkzMTU=,102,Can't store an array or dictionary containing a bytes value,9599,simonw,closed,0,,,,,0,2020-04-18T22:49:21Z,2020-05-01T20:45:45Z,2020-05-01T20:45:45Z,OWNER,,"``` In [1]: import sqlite_utils In [2]: db = sqlite_utils.Database(memory=True) In [3]: db[""t""].insert({""id"": 1, ""data"": {""foo"": b""bytes""}}) --------------------------------------------------------------------------- TypeError Traceback (most recent call last) in ----> 1 db[""t""].insert({""id"": 1, ""data"": {""foo"": b""bytes""}}) ~/Dropbox/Development/sqlite-utils/sqlite_utils/db.py in insert(self, record, pk, foreign_keys, column_order, not_null, defaults, hash_id, alter, ignore, replace, extracts, conversions, columns) 950 extracts=extracts, 951 conversions=conversions, --> 952 columns=columns, 953 ) 954 ~/Dropbox/Development/sqlite-utils/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, column_order, not_null, defaults, batch_size, hash_id, alter, ignore, replace, extracts, conversions, columns, upsert) 1052 for key in all_columns: 1053 value = jsonify_if_needed( -> 1054 record.get(key, None if key != hash_id else _hash(record)) 1055 ) 1056 if key in extracts: ~/Dropbox/Development/sqlite-utils/sqlite_utils/db.py in jsonify_if_needed(value) 1318 def jsonify_if_needed(value): 1319 if isinstance(value, (dict, list, tuple)): -> 1320 return json.dumps(value) 1321 elif isinstance(value, (datetime.time, datetime.date, datetime.datetime)): 1322 return value.isoformat() /usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/__init__.py in dumps(obj, skipkeys, ensure_ascii, check_circular, allow_nan, cls, indent, separators, default, sort_keys, **kw) 229 cls is None and indent is None and separators is None and 230 default is None and not sort_keys and not kw): --> 231 return _default_encoder.encode(obj) 232 if cls is None: 233 cls = JSONEncoder 
/usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/encoder.py in encode(self, o) 197 # exceptions aren't as detailed. The list call should be roughly 198 # equivalent to the PySequence_Fast that ''.join() would do. --> 199 chunks = self.iterencode(o, _one_shot=True) 200 if not isinstance(chunks, (list, tuple)): 201 chunks = list(chunks) /usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/encoder.py in iterencode(self, o, _one_shot) 255 self.key_separator, self.item_separator, self.sort_keys, 256 self.skipkeys, _one_shot) --> 257 return _iterencode(o, 0) 258 259 def _make_iterencode(markers, _default, _encoder, _indent, _floatstr, /usr/local/Cellar/python/3.7.4_1/Frameworks/Python.framework/Versions/3.7/lib/python3.7/json/encoder.py in default(self, o) 177 178 """""" --> 179 raise TypeError(f'Object of type {o.__class__.__name__} ' 180 f'is not JSON serializable') 181 TypeError: Object of type bytes is not JSON serializable ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/102/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602575575,MDU6SXNzdWU2MDI1NzU1NzU=,6,Add progress bar to upload command,9599,simonw,closed,0,,,,,2,2020-04-18T23:32:41Z,2020-04-19T00:15:24Z,2020-04-19T00:15:24Z,MEMBER,,Upload was added in #4 ,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/6/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 602585497,MDU6SXNzdWU2MDI1ODU0OTc=,7,Integrate image content hashing,9599,simonw,open,0,,,,,2,2020-04-19T00:36:58Z,2021-08-26T02:01:01Z,,MEMBER,,To spot duplicate images (where the file content differs such that the sha256 is no longer a match) it would be useful to calculate and store perceptual hashes of some sort.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/7/reactions"", ""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",, 602619330,MDU6SXNzdWU2MDI2MTkzMzA=,45,Use raise_for_status() everywhere,9599,simonw,open,0,,,,,1,2020-04-19T04:38:28Z,2020-04-19T04:39:22Z,,MEMBER,,"I keep seeing errors which I think are caused by authentication or rate limit problems but which appear to be unexpected JSON responses - presumably because they are actually an error message. Recent example: https://github.com/simonw/jsk-fellows-on-twitter/runs/598892575 Using `response.raise_for_status()` everywhere will make these errors less confusing.",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/45/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 603242257,MDExOlB1bGxSZXF1ZXN0NDA2MDY3MDE5,728,"Update mergedeep requirement from ~=1.1.1 to >=1.1.1,<1.4.0",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-04-20T13:33:23Z,2020-05-04T16:45:58Z,2020-05-04T16:45:49Z,CONTRIBUTOR,simonw/datasette/pulls/728,"Updates the requirements on [mergedeep](https://github.com/clarketm/mergedeep) to permit the latest version.
Commits
  • 3d6e7b4 v1.3.0 - support additive merging of Counter types
  • 56a258a v1.2.1 - tidy docs and variable names
  • 61ab213 v1.2.0 - support both TYPESAFE_REPLACE and TYPESAFE_ADDITIVE merge strategies... (see the sketch below)
  • b331bb5 cleanup Makefile
  • 6f577bf officially label support for python3.8
  • 84faf37 use pipenv for managing dev dependencies
  • 3a8761a Update README.md
  • See full diff in compare view

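The commit list above mentions mergedeep's merge strategies (and, in v1.3.0, additive merging of Counter types). A hedged sketch of the difference between the default replace behavior and the additive strategy (the sample dictionaries are illustrative):

```
from mergedeep import Strategy, merge

base = {"counts": {"a": 1}, "tags": [1, 2]}
extra = {"counts": {"b": 2}, "tags": [3]}

# default strategy: nested dicts merge, but the "tags" list is replaced
replaced = merge({}, base, extra)
assert replaced == {"counts": {"a": 1, "b": 2}, "tags": [3]}

# additive strategy: list (and Counter) values are combined instead
added = merge({}, base, extra, strategy=Strategy.ADDITIVE)
assert added == {"counts": {"a": 1, "b": 2}, "tags": [1, 2, 3]}
```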
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/728/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 603295970,MDU6SXNzdWU2MDMyOTU5NzA=,729,Visually distinguish integer and text columns,9599,simonw,closed,0,,,,,8,2020-04-20T14:47:26Z,2020-05-18T17:20:02Z,2020-05-15T18:16:56Z,OWNER,,It would be useful if I could tell from looking at the table page if a column was a integer or a text (or a float I guess?). This is particularly important for knowing if it safe to sort by that column.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/729/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 603617013,MDU6SXNzdWU2MDM2MTcwMTM=,29,Milestones should have foreign key to creator and repo,9599,simonw,closed,0,,,,,1,2020-04-21T00:20:44Z,2020-04-21T00:43:58Z,2020-04-21T00:43:58Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/milestones Creator is an integer but not a foreign key to users Repo is missing entirely!",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/29/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 603618244,MDU6SXNzdWU2MDM2MTgyNDQ=,30,Issues milestone column is the wrong type,9599,simonw,closed,0,,,,,2,2020-04-21T00:24:34Z,2020-04-21T00:45:23Z,2020-04-21T00:36:22Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/issues?milestone=2857392 ![2A4C1185-2434-4F29-9EA0-3246E2F03F77](https://user-images.githubusercontent.com/9599/79811760-b7e08b00-832b-11ea-9ad7-684a6ae097a6.jpeg) It is TEXT when it should be an INTEGER - which is why the foreign key label is not correctly displayed.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/30/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 603624862,MDU6SXNzdWU2MDM2MjQ4NjI=,31,Issue and milestone should have foreign key to repo,9599,simonw,closed,0,,,,,3,2020-04-21T00:46:24Z,2020-04-22T01:20:19Z,2020-04-22T01:20:19Z,MEMBER,,"Currently the `repo` column on those tables is a string `simonw/datasette` rather than an ID referencing a row in `repos`. _Originally posted by @simonw in https://github.com/dogsheep/github-to-sqlite/issues/29#issuecomment-616883275_",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/31/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 604001627,MDExOlB1bGxSZXF1ZXN0NDA2Njc3MjA1,730,"Update pytest-asyncio requirement from ~=0.10.0 to >=0.10,<0.12",27856297,dependabot-preview[bot],closed,0,,,,,1,2020-04-21T13:32:35Z,2020-05-04T13:27:24Z,2020-05-04T13:27:23Z,CONTRIBUTOR,simonw/datasette/pulls/730,"Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.
Commits
  • 1026c39 0.11.0
  • ab2b140 Test on Python 3.8, drop 3.3 and 3.4
  • 6397a22 plugin: Use pytest 5.4.0 new Function API
  • 21a0f94 Replace yield_fixture() by fixture()
  • 964b295 Added min hypothesis version so that bugfix for https://github.com/Hypothesis...
  • 4a11a20 Add max supported pytest version to < 5.4.0 to prevent fails until #141 is fi...
  • b305594 Change event_loop to module scope in hypothesis tests, fixing #145. (see the sketch below)
  • d5a0f47 Enable test_subprocess to be run on win, by changing to ProactorEventLoop in ...
  • d07cd2d Fix required pytest version
  • 86cd9a6 Handle BaseExceptions from loop.run_until_complete (#126)
  • Additional commits viewable in compare view

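One commit above widens the scope of the event_loop fixture in pytest-asyncio's own tests. Overriding that fixture's scope is also the documented way to share one loop across a module; a minimal sketch, assuming pytest-asyncio is installed:

```
import asyncio

import pytest

@pytest.fixture(scope="module")
def event_loop():
    # widen pytest-asyncio's default function-scoped event_loop fixture
    loop = asyncio.new_event_loop()
    yield loop
    loop.close()

@pytest.mark.asyncio
async def test_sleep_runs_on_module_loop():
    await asyncio.sleep(0)
```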
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/730/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 604222295,MDU6SXNzdWU2MDQyMjIyOTU=,32,Issue comments don't appear to populate issues foreign key,9599,simonw,closed,0,,,,,3,2020-04-21T19:17:32Z,2020-04-22T01:17:44Z,2020-04-22T01:17:44Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github?sql=select+html_url%2C+id%2C+issue+from+issue_comments+order+by+updated_at+desc+limit+101 ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/32/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 605110015,MDU6SXNzdWU2MDUxMTAwMTU=,731,Option to automatically configure based on directory layout,9599,simonw,closed,0,,,,,9,2020-04-22T22:17:47Z,2020-04-27T16:32:44Z,2020-04-27T16:30:26Z,OWNER,,"My Datasette projects increasingly take on the following structure: - `metadata.json` with the metadata - One or more `something.db` database files - A `templates/` folder with some custom templates - A `plugins/` folder with some custom plugins Then I have to run Datasette like this: datasette *.db -m metadata.json --template-dir=templates --plugins-dir=plugins It would be really interesting if Datasette had a special mode where you could point it at a directory with the above layout and it would automatically configure itself based on the contents. Maybe even allow `datasette serve` to detect if it was passed a single argument that's a directory, not a file, and kick in to ""directory layout configuration mode"" in that case: datasette . ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/731/reactions"", ""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 605147638,MDU6SXNzdWU2MDUxNDc2Mzg=,8,Should I have used MD5 instead of SHA256?,9599,simonw,closed,0,,,,,2,2020-04-23T00:02:08Z,2020-04-23T00:03:35Z,2020-04-23T00:03:35Z,MEMBER,,"https://docs.aws.amazon.com/AmazonS3/latest/API/RESTCommonResponseHeaders.html > Objects created by the PUT Object, POST Object, or Copy operation, or through the AWS Management Console, and are encrypted by SSE-S3 or plaintext, have ETags that are an MD5 digest of their object data. ",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/8/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 605546606,MDExOlB1bGxSZXF1ZXN0NDA3OTI5MTI4,734,"Update janus requirement from ~=0.4.0 to >=0.4,<0.6",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-04-23T13:43:45Z,2020-05-04T16:48:14Z,2020-05-04T16:48:04Z,CONTRIBUTOR,simonw/datasette/pulls/734,"Updates the requirements on [janus](https://github.com/aio-libs/janus) to permit the latest version.
Changelog

Sourced from janus's changelog.

0.5.0 (2020-04-23)

  • Remove explicit loop arguments and forbid creating queues outside event loops #246 (see the sketch after this changelog)

0.4.0 (2018-07-28)

  • Add py.typed macro #89
  • Drop Python 3.4 support and fix the minimal supported version to Python 3.5.3 #88
  • Add a closed property that indicates if the queue is closed #86

0.3.2 (2018-07-06)

  • Fixed python 3.7 support #97

0.3.1 (2018-01-30)

  • Fixed bug with join() in case tasks are added by sync_q.put() #75

0.3.0 (2017-02-21)

  • Expose unfinished_tasks property #34

0.2.4 (2016-12-05)

  • Restore tarball deploying

0.2.3 (2016-07-12)

  • Fix exception type

0.2.2 (2016-07-11)

  • Update asyncio.async() to use asyncio.ensure_future() #6

0.2.1 (2016-03-24)

  • Fix python setup.py test command #4

0.2.0 (2015-09-20)

... (truncated)
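The 0.5.0 entry above means a janus.Queue must now be created while an event loop is running. A minimal sketch of the sync/async bridge that implies (the producer/consumer names are illustrative):

```
import asyncio
import threading

import janus

def producer(sync_q):
    # thread-side, blocking API
    for i in range(3):
        sync_q.put(i)

async def main():
    queue = janus.Queue()  # must be created inside the running loop (0.5.0+)
    threading.Thread(target=producer, args=(queue.sync_q,)).start()
    for _ in range(3):
        print(await queue.async_q.get())  # loop-side, awaitable API
        queue.async_q.task_done()

asyncio.run(main())
```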
Commits
  • 8e89b45 Bump to 0.5.0
  • ec8592b Fix up Python 3.8 loop argument warnings (#246)
  • 2543af6 Bump coverage from 5.0.4 to 5.1
  • 03d1b36 Bump tox from 3.14.5 to 3.14.6
  • 8219c38 Bump coverage from 5.0.3 to 5.0.4
  • 85ec71d Bump pytest from 5.4.0 to 5.4.1
  • 3b974c9 Bump pytest from 5.3.5 to 5.4.0
  • 282dc12 Bump mypy from 0.761 to 0.770
  • 1364fb3 Bump tox from 3.14.4 to 3.14.5
  • dc519bb Bump tox from 3.14.3 to 3.14.4
  • Additional commits viewable in compare view

",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/734/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 605806386,MDU6SXNzdWU2MDU4MDYzODY=,735,"Error when I click on ""View and edit SQL""",30607,aborruso,closed,0,,,,,2,2020-04-23T19:31:32Z,2020-04-28T06:10:20Z,2020-04-27T19:00:30Z,NONE,,"Hi, when I do it [here](https://my-database.now.sh/commissioniComunePalermo/youtube), I have ""unrecognized token: ""["""" error. Is it normal? Thank you",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/735/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 605938063,MDU6SXNzdWU2MDU5MzgwNjM=,9,"upload command should be resumable, should only upload photos not already uploaded",9599,simonw,closed,0,,,,,2,2020-04-23T23:31:08Z,2020-04-23T23:39:14Z,2020-04-23T23:39:14Z,MEMBER,,Follow on from #4. ,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/9/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 606028272,MDU6SXNzdWU2MDYwMjgyNzI=,10,Speed up hashing step using threads,9599,simonw,closed,0,,,,,0,2020-04-24T04:20:08Z,2020-04-24T04:32:35Z,2020-04-24T04:32:35Z,MEMBER,,"This TODO from the code: https://github.com/dogsheep/photos-to-sqlite/blob/2e7f2c67cc18b02c75bb64992a05b0196e507252/photos_to_sqlite/cli.py#L82-L90",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/10/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 606032950,MDU6SXNzdWU2MDYwMzI5NTA=,11,Try running S3 uploads in a thread pool,9599,simonw,closed,0,,,,,0,2020-04-24T04:34:31Z,2020-04-24T16:45:41Z,2020-04-24T16:45:41Z,MEMBER,,"Since #10 provided such a speedup, can the same thing be done for the actual uploads? http://ls.pwd.io/2013/06/parallel-s3-uploads-using-boto-and-threads-in-python/ suggests it can really help performance.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/11/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 606033104,MDU6SXNzdWU2MDYwMzMxMDQ=,12,"If less than 500MB, show size in MB not GB",9599,simonw,open,0,,,,,1,2020-04-24T04:35:01Z,2020-04-24T04:35:25Z,,MEMBER,,"Just saw this: ``` Uploading 0.05 GB ```",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/12/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 606720674,MDU6SXNzdWU2MDY3MjA2NzQ=,736,strange behavior using accented characters,30607,aborruso,closed,0,,,,,3,2020-04-25T08:34:51Z,2020-04-28T06:09:28Z,2020-04-27T18:59:16Z,NONE,,"Hi, when I search `incompatibilità` [here](https://my-database.now.sh/commissioniComunePalermo/youtube), using full text search, it becomes `incompatibilità` and I have no result. If I encode the `à` char in the URL (`incompatibilit%C3%A0`) I have the right result. 
![image](https://user-images.githubusercontent.com/30607/80275201-00a79380-86e0-11ea-865e-f7e1474e8098.png) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/736/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 607067303,MDExOlB1bGxSZXF1ZXN0NDA5MTIzODk3,737,"Custom pages mechanism, refs #648",9599,simonw,closed,0,,,,,4,2020-04-26T17:31:41Z,2020-04-26T18:46:43Z,2020-04-26T18:46:43Z,OWNER,simonw/datasette/pulls/737,"Refs #648. TODO: - [x] Pass a `view_name` to `render_template()` - [x] Mechanism for custom status code / headers / redirect - [x] Documentation",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/737/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 607086780,MDU6SXNzdWU2MDcwODY3ODA=,738,Pass a request object to custom page templates,9599,simonw,closed,0,,,,,1,2020-04-26T18:57:48Z,2020-04-26T19:01:54Z,2020-04-26T19:01:54Z,OWNER,,"Follow-up to #648. I'm not passing a request object to `.render_template()` at the moment, which breaks any other custom plugins using e.g. `extra_template_vars()` that were expecting to be able to access the request.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/738/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 607107849,MDExOlB1bGxSZXF1ZXN0NDA5MTUzODcw,739,Configuration directory mode,9599,simonw,closed,0,,,,,3,2020-04-26T20:37:46Z,2020-04-27T16:30:25Z,2020-04-27T16:30:25Z,OWNER,simonw/datasette/pulls/739,"Refs #731 TODO: - [x] Decide how to combine explicit command-line options with items detected from the directory structure - [x] Add unit tests - [x] Implement `inspect-data.json` mechanism for populating `immutables` - [x] Add documentation",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/739/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 607211058,MDU6SXNzdWU2MDcyMTEwNTg=,740,Don't throw 500 error on attempted directory browse,9599,simonw,closed,0,,,,,1,2020-04-27T03:50:11Z,2020-04-27T18:29:15Z,2020-04-27T18:29:15Z,OWNER,," This should be a 403 error instead, because the `--static` mechanism doesn't allow directory browsing.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/740/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 607223136,MDU6SXNzdWU2MDcyMjMxMzY=,741,"Replace ""datasette publish --extra-options"" with ""--setting""",9599,simonw,open,0,,,3268330,Datasette 1.0,9,2020-04-27T04:29:04Z,2022-05-12T19:21:16Z,,OWNER,,"See https://github.com/simonw/datasette-publish-now/issues/9#issuecomment-618155764 - the `--extra-options` mechanism is in practice just used to set `--config` options in data that you publish, but that means you end up with pretty messy looking commands: datasette publish my.db --extra-options=""--config default_page_size:50 --config sql_time_limit_ms:3500"" A neater design would be to support `--config` as an option for `datasette publish` directly: datasette publish my.db --config 
default_page_size:50 --config sql_time_limit_ms:3500 ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/741/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 607243940,MDU6SXNzdWU2MDcyNDM5NDA=,742,"Speed up tests with scope=""session""?",9599,simonw,closed,0,,,,,1,2020-04-27T05:23:54Z,2020-04-27T18:24:53Z,2020-04-27T18:24:53Z,OWNER,,"Tests are pretty slow - could I speed them up with pytest `scope=""session""` on some of the fixtures? Eg https://travis-ci.org/github/simonw/datasette/jobs/679940036 ran 452 tests in 3m53s - the `test_html` ones seem particularly slow.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/742/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 607770595,MDU6SXNzdWU2MDc3NzA1OTU=,743,escape_fts() does not correctly escape * wildcards,9599,simonw,closed,0,,,,,4,2020-04-27T18:48:53Z,2020-04-27T19:11:30Z,2020-04-27T19:11:01Z,OWNER,,"Spotted in #732. This should not return any results... but it does: https://latest.datasette.io/fixtures/searchable?_search=bar%2A&_trace=1 The query from trace is: ``` ""sql"": ""select count(*) from searchable where rowid in (select rowid from searchable_fts where searchable_fts match escape_fts(:search))"", ""params"": { ""search"": ""bar*"" } ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/743/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 607888367,MDU6SXNzdWU2MDc4ODgzNjc=,13,Also upload movie files,9599,simonw,open,0,,,,,2,2020-04-27T22:11:25Z,2020-04-28T00:39:45Z,,MEMBER,,"The `upload` command currently only handles static images: https://github.com/dogsheep/photos-to-sqlite/blob/d939455af00e07866686457ee2fcb9b2d1b7194e/photos_to_sqlite/utils.py#L26-L33 Need to cover movies taken by my phone and DSLR too.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/13/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 608058890,MDU6SXNzdWU2MDgwNTg4OTA=,744,link_or_copy_directory() error - Invalid cross-device link,30607,aborruso,closed,0,,,,,28,2020-04-28T06:26:45Z,2020-05-28T14:32:53Z,2020-05-27T06:01:28Z,NONE,,"Hi, when I run ``` datasette publish heroku -n myapp --template-dir ./template mydb.db ``` I have this error ``` Traceback (most recent call last): File ""/home/aborruso/.local/lib/python3.7/site-packages/datasette/utils/__init__.py"", line 607, in link_or_copy_directory shutil.copytree(src, dst, copy_function=os.link) File ""/usr/lib/python3.7/shutil.py"", line 365, in copytree raise Error(errors) shutil.Error: [('/myfolder/youtubeComunePalermo/processing/./template/base.html', '/tmp/tmps9_4mzc4/templates/base.html', ""[Errno 18] Invalid cross-device link: '/myfolder/youtubeComunePalermo/processing/./template/base.html' -> '/tmp/tmps9_4mzc4/templates/base.html'""), ('/myfolder/youtubeComunePalermo/processing/./template/index.html', '/tmp/tmps9_4mzc4/templates/index.html', ""[Errno 18] Invalid cross-device link: '/myfolder/youtubeComunePalermo/processing/./template/index.html' -> '/tmp/tmps9_4mzc4/templates/index.html'"")] During handling of 
the above exception, another exception occurred: Traceback (most recent call last): File ""/home/aborruso/.local/bin/datasette"", line 8, in sys.exit(cli()) File ""/home/aborruso/.local/lib/python3.7/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/home/aborruso/.local/lib/python3.7/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/home/aborruso/.local/lib/python3.7/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/aborruso/.local/lib/python3.7/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/home/aborruso/.local/lib/python3.7/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/home/aborruso/.local/lib/python3.7/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/home/aborruso/.local/lib/python3.7/site-packages/datasette/publish/heroku.py"", line 103, in heroku extra_metadata, File ""/usr/lib/python3.7/contextlib.py"", line 112, in __enter__ return next(self.gen) File ""/home/aborruso/.local/lib/python3.7/site-packages/datasette/publish/heroku.py"", line 191, in temporary_heroku_directory os.path.join(tmp.name, ""templates""), File ""/home/aborruso/.local/lib/python3.7/site-packages/datasette/utils/__init__.py"", line 609, in link_or_copy_directory shutil.copytree(src, dst) File ""/usr/lib/python3.7/shutil.py"", line 321, in copytree os.makedirs(dst) File ""/usr/lib/python3.7/os.py"", line 221, in makedirs mkdir(name, mode) FileExistsError: [Errno 17] File exists: '/tmp/tmps9_4mzc4/templates' ``` I'm attaching my very basic template folder. Thank you [template.zip](https://github.com/simonw/datasette/files/4543751/template.zip) ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/744/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 608512747,MDU6SXNzdWU2MDg1MTI3NDc=,14,Annotate photos using the Google Cloud Vision API,9599,simonw,open,0,,,,,5,2020-04-28T18:09:03Z,2020-04-28T18:19:06Z,,MEMBER,,"It can detect faces, run OCR, do image labeling (it knows what a lemur is!) and do object localization where it identifies objects and returns bounding polygons for them.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/14/reactions"", ""total_count"": 3, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 1, ""rocket"": 0, ""eyes"": 0}",, 608613033,MDU6SXNzdWU2MDg2MTMwMzM=,745,Extract the hash-URL mechanism out into a plugin,9599,simonw,closed,0,,,,,2,2020-04-28T21:00:38Z,2020-10-23T19:47:18Z,2020-10-23T19:47:10Z,OWNER,,"0.28 in May 2019 made this feature not-the-default: https://datasette.readthedocs.io/en/stable/changelog.html#v0-28 - see #418 I've not felt the need to use it myself since. 
I think I should move it into a plugin.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/745/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 608752766,MDExOlB1bGxSZXF1ZXN0NDEwNDY5Mjcy,746,"shutil.Error, not OSError",9599,simonw,closed,0,,,,,1,2020-04-29T03:30:51Z,2020-04-29T07:07:24Z,2020-04-29T07:07:23Z,OWNER,simonw/datasette/pulls/746,Refs #744,107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/746/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 609950090,MDU6SXNzdWU2MDk5NTAwOTA=,33,Fall back to authentication via ENV,2029,garethr,closed,0,,,,,4,2020-04-30T12:58:14Z,2020-05-02T18:46:10Z,2020-05-02T18:45:37Z,NONE,,"Would you accept a PR that falls back to looking for an environment variable for the GitHub token? Specifically a change here: https://github.com/dogsheep/github-to-sqlite/blob/c34d5a18bfc41fa08755ba3d5cf9fe09ff204238/github_to_sqlite/cli.py#L271 I'd like to use `github-to-sqlite` in a GitHub Action workflow and this would be simpler than trying to fill out the prompt or generate a file with sensitive content. Wanted to check first, I'm happy to submit a PR with tests and updates to the docs. ",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/33/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610192152,MDU6SXNzdWU2MTAxOTIxNTI=,747,Directory configuration mode should support metadata.yaml,9599,simonw,closed,0,,,,,4,2020-04-30T16:05:30Z,2020-04-30T19:04:19Z,2020-04-30T19:04:19Z,OWNER,,Refs #739 - `metadata.yml` or `metadata.yaml` should be detected in the same way as `metadata.json` is.,107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/747/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610284471,MDU6SXNzdWU2MTAyODQ0NzE=,46,Error running 'search' for the first time,9599,simonw,closed,0,,,,,0,2020-04-30T18:11:20Z,2020-04-30T18:11:58Z,2020-04-30T18:11:58Z,MEMBER,,"``` % twitter-to-sqlite search infodemic.db '#infodemic' Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/bin/twitter-to-sqlite"", line 11, in load_entry_point('twitter-to-sqlite', 'console_scripts', 'twitter-to-sqlite')() File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/twitter-to-sqlite-PBRUqIv6/lib/python3.7/site-packages/click/core.py"", line 610, in invoke return 
callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/cli.py"", line 867, in search for tweet in tweets: File ""/Users/simon/Dropbox/Development/twitter-to-sqlite/twitter_to_sqlite/utils.py"", line 165, in fetch_timeline [since_type_id, since_key], sqlite3.OperationalError: no such table: since_ids ```",206156866,twitter-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/46/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610342575,MDU6SXNzdWU2MTAzNDI1NzU=,748,?_searchmode=raw should be documented on full-text search page,9599,simonw,closed,0,,,,,0,2020-04-30T19:50:06Z,2020-04-30T21:06:12Z,2020-04-30T21:06:12Z,OWNER,,"It's currently documented here: https://datasette.readthedocs.io/en/stable/json_api.html#special-table-arguments But it should also be described here: https://datasette.readthedocs.io/en/stable/full_text_search.html#the-table-view-api",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/748/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610408908,MDU6SXNzdWU2MTA0MDg5MDg=,34,Command for retrieving dependents for a repo,9599,simonw,closed,0,,,,,6,2020-04-30T21:47:51Z,2020-05-03T15:53:01Z,2020-05-03T15:53:01Z,MEMBER,,"I really, really want to start grabbing this data: https://github.com/simonw/datasette/network/dependents",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/34/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610511450,MDU6SXNzdWU2MTA1MTE0NTA=,35,Create index on issue_comments(user) and other foreign keys,9599,simonw,closed,0,,,,,3,2020-05-01T02:06:56Z,2020-05-02T18:26:24Z,2020-05-02T18:26:24Z,MEMBER,,"``` create index issue_comments_user on issue_comments(user) ``` I'm sure there are other user columns that could benefit from an index.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/35/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610517472,MDU6SXNzdWU2MTA1MTc0NzI=,103,sqlite3.OperationalError: too many SQL variables in insert_all when using rows with varying numbers of columns,32605365,b0b5h4rp13,closed,0,,,,,8,2020-05-01T02:26:14Z,2020-05-14T00:18:57Z,2020-05-14T00:18:57Z,CONTRIBUTOR,,"If using insert_all to put in 1000 rows of data with varying number of columns, it comes up with this message `sqlite3.OperationalError: too many SQL variables` if the number of columns is larger in later records (past the first row) I've reduced `SQLITE_MAX_VARS` by 100 to 899 at the top of `db.py` to add wiggle room, so that if the column count increases it wont go past SQLite's batch limit as calculated by this line of code based on the count of the first row's dict keys batch_size = max(1, min(batch_size, SQLITE_MAX_VARS // num_columns))",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/103/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 
610829227,MDU6SXNzdWU2MTA4MjkyMjc=,749,Cloud Run fails to serve database files larger than 32MB,9599,simonw,closed,0,,,,,4,2020-05-01T16:06:46Z,2020-12-03T00:31:15Z,2020-12-03T00:31:14Z,OWNER,,"https://cloud.google.com/run/quotas lists the maximum response size as 32MB. I spotted a bug where attempting to download a database file larger than that from a Cloud Run deployment (in this case it was https://github-to-sqlite.dogsheep.net/github.db after I [accidentally increased the size of that database](https://github.com/dogsheep/github-to-sqlite/commit/630bdba68a23c0ac453e015518ef0bf41107a952)) returned a 500 error because of this.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/749/reactions"", ""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610842926,MDU6SXNzdWU2MTA4NDI5MjY=,36,Add view for better display of dependent repos,9599,simonw,closed,0,,,,,2,2020-05-01T16:33:44Z,2020-05-02T16:50:31Z,2020-05-02T16:30:11Z,MEMBER,,"```sql select repos.full_name as repo, 'https://github.com/' || repos2.full_name as dependent, repos2.created_at as dependent_repo_created, repos2.updated_at as dependent_repo_updated, repos2.stargazers_count as dependent_repo_stars, repos2.watchers_count as dependent_repo_watchers from dependents join repos as repos2 on dependents.dependent = repos2.id join repos on dependents.repo = repos.id order by repos2.created_at desc ``` https://dogsheep.simonwillison.net/github?sql=select%0D%0A++repos.full_name+as+repo%2C%0D%0A++%27https%3A%2F%2Fgithub.com%2F%27+%7C%7C+repos2.full_name+as+dependent%2C%0D%0A++repos2.created_at+as+dependent_repo_created%2C%0D%0A++repos2.updated_at+as+dependent_repo_updated%2C%0D%0A++repos2.stargazers_count+as+dependent_repo_stars%2C%0D%0A++repos2.watchers_count+as+dependent_repo_watchers%0D%0Afrom%0D%0A++dependents%0D%0A++join+repos+as+repos2+on+dependents.dependent+%3D+repos2.id%0D%0A++join+repos+on+dependents.repo+%3D+repos.id%0D%0Aorder+by%0D%0A++repos2.created_at+desc",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/36/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610843136,MDU6SXNzdWU2MTA4NDMxMzY=,37,Mechanism for creating views if they don't yet exist,9599,simonw,closed,0,,,,,3,2020-05-01T16:34:10Z,2020-05-02T16:19:47Z,2020-05-02T16:19:31Z,MEMBER,,Needed for #36 #10 #12 ,207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/37/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610853393,MDU6SXNzdWU2MTA4NTMzOTM=,104,"--schema option to ""sqlite-utils tables""",9599,simonw,closed,0,,,,,0,2020-05-01T16:55:49Z,2020-05-01T17:12:37Z,2020-05-01T17:12:37Z,OWNER,,Adds output showing the table schema.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/104/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 610853576,MDU6SXNzdWU2MTA4NTM1NzY=,105,"""sqlite-utils views"" command",9599,simonw,closed,0,,,,,1,2020-05-01T16:56:11Z,2020-05-01T20:40:07Z,2020-05-01T20:38:36Z,OWNER,,Similar to `sqlite-utils tables`. 
See also #104.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/105/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611216862,MDU6SXNzdWU2MTEyMTY4NjI=,106,"create_view(..., ignore=True, replace=True) parameters",9599,simonw,closed,0,,,,,1,2020-05-02T15:45:21Z,2020-05-02T16:04:51Z,2020-05-02T16:02:10Z,OWNER,,"Two new parameters which specify what should happen if the view already exists. I want this for https://github.com/dogsheep/github-to-sqlite/issues/37 Here's the current `create_view()` implementation: https://github.com/simonw/sqlite-utils/blob/b4d953d3ccef28bb81cea40ca165a647b59971fa/sqlite_utils/db.py#L325-L332 `ignore=True` will not do anything if the view exists already. `replace=True` will drop and redefine the view - but only if its SQL definition differs, otherwise it will be left alone.",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/106/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611222968,MDU6SXNzdWU2MTEyMjI5Njg=,107,sqlite-utils create-view CLI command,9599,simonw,closed,0,,,,,2,2020-05-02T16:15:13Z,2020-05-03T15:36:58Z,2020-05-03T15:36:37Z,OWNER,,Can go with #27 - `sqlite-utils create-table`.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/107/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611252244,MDU6SXNzdWU2MTEyNTIyNDQ=,750,Add notlike table filter,9599,simonw,closed,0,,,,,3,2020-05-02T18:54:36Z,2020-05-02T19:10:44Z,2020-05-02T19:10:44Z,OWNER,,"I found myself wanting that for applying the opposite of this: https://github-to-sqlite.dogsheep.net/github/dependent_repos?dependent__like=%25simonw%2F%25&_sort_desc=dependent_stars ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/750/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611284481,MDU6SXNzdWU2MTEyODQ0ODE=,38,[Feature Request] Support Repo Name in Search 🥺,5779832,zzeleznick,closed,0,,,,,4,2020-05-02T22:08:51Z,2020-05-03T02:34:32Z,2020-05-02T23:15:11Z,NONE,,"## Description Per your [v2.2 release tweet](https://twitter.com/simonw/status/1256700238099693568) I played with the demo, but the output did not match my expectations. ## Expected Behavior Expected a search query for ""twitter"" contained within the `repo` column to return non-zero results. ## Actual Behavior 😭 [0 rows where repo contains ""twitter"" sorted by starred_at descending](https://github-to-sqlite.dogsheep.net/github/stars?repo__contains=twitter&_sort_desc=starred_at) ## Best Explanation Per the table schema (see appendix) `repo` is of type `INTEGER` which built from `repo_id` and does not expose the repo name in search. ## Desired Behavior Given that searching for ""206156866"" is less intuitive than ""twitter"", it would be great to support this via extending the search capabilities or by adding an additional column. 
✅ 104 rows where repo contains ""twitter"" ❌ [104 rows where repo contains ""206156866"" sorted by starred_at descending](https://github-to-sqlite.dogsheep.net/github/stars?repo__contains=206156866&_sort_desc=starred_at) ## Appendix ``` CREATE TABLE [stars] ( [user] INTEGER REFERENCES [users]([id]), [repo] INTEGER REFERENCES [repos]([id]), [starred_at] TEXT, PRIMARY KEY ([user], [repo]) ); CREATE INDEX [idx_stars_repo] ON [stars] ([repo]); CREATE INDEX [idx_stars_user] ON [stars] ([user]); ```",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/38/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611326701,MDU6SXNzdWU2MTEzMjY3MDE=,108,Documentation unit tests for CLI commands,9599,simonw,closed,0,,,,,2,2020-05-03T03:58:42Z,2020-05-03T04:13:57Z,2020-05-03T04:13:57Z,OWNER,,Have a test that ensures all CLI commands are documented.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/108/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611540797,MDU6SXNzdWU2MTE1NDA3OTc=,751,Ability to set custom default _size on a per-table basis,9599,simonw,closed,0,,,5471110,Datasette 0.43,4,2020-05-04T00:13:03Z,2020-05-28T05:00:22Z,2020-05-28T05:00:20Z,OWNER,,"I have some tables where I'd like the default page size to be 10, without affecting the rest of my Datasette instance.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/751/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611835285,MDU6SXNzdWU2MTE4MzUyODU=,752,Non-utf8 encoding in exceptionhandlers and custom-pages,2181410,clausjuhl,closed,0,,,,,1,2020-05-04T12:24:42Z,2020-05-04T17:42:20Z,2020-05-04T17:42:20Z,NONE,,"Hi Simon. Whenever a response is not piped through a router-view, the template is encoded in latin-1 (I think). This is especially a problem (for me) with the new custom_pages-functionality, but also problematic with the 404- and 500-handlers. Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/752/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 611874514,MDExOlB1bGxSZXF1ZXN0NDEyOTUxMTkx,753,"Update pytest-asyncio requirement from ~=0.10.0 to >=0.10,<0.13",27856297,dependabot-preview[bot],closed,0,,,,,0,2020-05-04T13:27:19Z,2020-05-04T17:41:01Z,2020-05-04T17:40:49Z,CONTRIBUTOR,simonw/datasette/pulls/753,"Updates the requirements on [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) to permit the latest version.
Commits
  • b8e2a45 0.12.0
  • 06580c6 Update changelog
  • b45de23 Fixed failing test case, 'test_asyncio_marker_without_loop'.
  • 238cced Put event_loop first among the fixtures of asyncio tests, fixes #154.
  • e5e3dc7 Added unittests for issue #154.
  • a7e5795 0.12.0 open for business!
  • 1026c39 0.11.0
  • ab2b140 Test on Python 3.8, drop 3.3 and 3.4
  • 6397a22 plugin: Use pytest 5.4.0 new Function API
  • 21a0f94 Replace yield_fixture() by fixture()
  • Additional commits viewable in compare view

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)
- `@dependabot use these labels` will set the current labels as the default for future PRs for this repo and language
- `@dependabot use these reviewers` will set the current reviewers as the default for future PRs for this repo and language
- `@dependabot use these assignees` will set the current assignees as the default for future PRs for this repo and language
- `@dependabot use this milestone` will set the current milestone as the default for future PRs for this repo and language
- `@dependabot badge me` will comment on this PR with code to add a ""Dependabot enabled"" badge to your readme

Additionally, you can set the following in your Dependabot [dashboard](https://app.dependabot.com):

- Update frequency (including time of day and day of week)
- Pull request limits (per update run and/or open at any time)
- Out-of-range updates (receive only lockfile updates, if desired)
- Security updates (receive only security updates, if desired)
",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/753/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 611997130,MDU6SXNzdWU2MTE5OTcxMzA=,754,Clean up aiofiles warnings on 3.8,9599,simonw,closed,0,,,,,2,2020-05-04T16:14:59Z,2020-05-04T16:22:30Z,2020-05-04T16:22:30Z,OWNER,,"https://travis-ci.org/github/simonw/datasette/jobs/682624476 Lots of warnings like this: ``` /home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/utils.py:33 /home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/utils.py:33 /home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/utils.py:33: DeprecationWarning: ""@coroutine"" decorator is deprecated since Python 3.8, use ""async def"" instead def method(self, *args, **kwargs): /home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/__init__.py:27 /home/travis/virtualenv/python3.8.0/lib/python3.8/site-packages/aiofiles/threadpool/__init__.py:27: DeprecationWarning: ""@coroutine"" decorator is deprecated since Python 3.8, use ""async def"" instead def _open(file, mode='r', buffering=-1, encoding=None, errors=None, newline=None, ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/754/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612082842,MDU6SXNzdWU2MTIwODI4NDI=,755,"Fix ""no such column: id"" output in tests",9599,simonw,closed,0,,,,,1,2020-05-04T18:37:49Z,2020-05-04T18:42:14Z,2020-05-04T18:42:14Z,OWNER,,"``` pytest ... tests/test_custom_pages.py ........ [ 33%] tests/test_database.py ......no such column: id ... [ 35%] ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/755/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612089949,MDU6SXNzdWU2MTIwODk5NDk=,756,Add pipx to installation documentation,9599,simonw,closed,0,,,,,2,2020-05-04T18:49:01Z,2020-05-04T19:19:06Z,2020-05-04T19:10:33Z,OWNER,,"Add to this page: https://datasette.readthedocs.io/en/stable/installation.html Here's how to install plugins: https://twitter.com/simonw/status/1257348687979778050 ``` $ datasette plugins [] $ pipx inject datasette datasette-json-html injected package datasette-json-html into venv datasette done! ✨ 🌟 ✨ $ datasette plugins [ { ""name"": ""datasette-json-html"", ""static"": false, ""templates"": false, ""version"": ""0.6"" } ] ```",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/756/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612151767,MDU6SXNzdWU2MTIxNTE3Njc=,15,Expose scores from ZCOMPUTEDASSETATTRIBUTES,9599,simonw,closed,0,,,,,7,2020-05-04T20:36:07Z,2020-12-20T04:44:22Z,2020-05-05T00:11:45Z,MEMBER,,"The Apple Photos database has a `ZCOMPUTEDASSETATTRIBUTES` that looks absurdly interesting... 
it has calculated scores for every photo: ",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/15/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612287234,MDU6SXNzdWU2MTIyODcyMzQ=,16,"Import machine-learning detected labels (dog, llama etc) from Apple Photos",9599,simonw,open,0,,,,,13,2020-05-05T02:45:43Z,2020-05-05T05:38:16Z,,MEMBER,,"Follow-on from #1. Apple Photos runs some very sophisticated machine learning on-device to figure out if photos are of dogs, llamas and so on. I really want to extract those labels out into my own database.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/16/reactions"", ""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 1, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 612378203,MDU6SXNzdWU2MTIzNzgyMDM=,757,Question: Any fixed date for the release with the uft8-encoding fix?,2181410,clausjuhl,closed,0,,,,,3,2020-05-05T06:51:20Z,2020-05-06T18:41:29Z,2020-05-06T18:41:29Z,NONE,,Just a little impatient :),107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/757/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612382643,MDU6SXNzdWU2MTIzODI2NDM=,758,Question: Access to immutable database-path,2181410,clausjuhl,open,0,,,,,6,2020-05-05T07:01:18Z,2020-05-28T08:23:27Z,,NONE,,"Hi Simon Is there anywhere in the app-context where one can access the hashed urlpath of the database? Currently it's included in the template-context (`databases[0][""path"")` when rendering urls of the database (eg. `/db-44b06v9/cases`...), but where can I find the hashed url when rendering the index-page? I'm trying to avoid redirects. Thanks!",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/758/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 612658444,MDU6SXNzdWU2MTI2NTg0NDQ=,109,"table.create_index(..., ignore=True)",9599,simonw,closed,0,,,,,1,2020-05-05T14:44:21Z,2020-05-05T14:46:53Z,2020-05-05T14:46:53Z,OWNER,,Option to silently do nothing if the index already exists.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/109/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612673948,MDU6SXNzdWU2MTI2NzM5NDg=,759,fts search on a column doesn't work anymore due to escape_fts,133845,Krazybug,closed,0,,,,,3,2020-05-05T15:03:44Z,2021-07-16T02:11:54Z,2020-05-06T17:50:57Z,NONE,,"Hi and first, thank you for this awesome work you make with this projet. On a db indexed in full text search, I can't query on indexed column anymore. This request ""cauvin language:ita"": is running smoothly on a old version of datasette but not on the current version. 
Compare the current version query `select uuid, title, authors, year, series, language, formats, publisher, tags, identifiers from summary where rowid in (select rowid from summary_fts where summary_fts match escape_fts(:search)) order by uuid limit 101` To an older version: `select title, authors, series, uuid, language, identifiers, tags, publisher, formats, year, links from summary where rowid in (select rowid from summary_fts where summary_fts match :search) order by uuid limit 101` _language_ is a searchable column but now the search string is known as ""cauvin language:ita"" literally as a search term. columns are not parsed. ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/759/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612860531,MDU6SXNzdWU2MTI4NjA1MzE=,17,Only install osxphotos if running on macOS,9599,simonw,closed,0,,,,,3,2020-05-05T20:03:26Z,2020-05-05T20:20:05Z,2020-05-05T20:11:23Z,MEMBER,,The build is broken right now because you can't `pip install osxphotos` on Ubuntu.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/17/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 612860758,MDU6SXNzdWU2MTI4NjA3NTg=,18,Switch CI solution to GitHub Actions with a macOS runner,9599,simonw,open,0,,,,,1,2020-05-05T20:03:50Z,2020-05-05T23:49:18Z,,MEMBER,,Refs #17.,256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/18/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 613002220,MDU6SXNzdWU2MTMwMDIyMjA=,19,apple-photos command should work even if upload has not run,9599,simonw,closed,0,,,,,1,2020-05-06T02:02:25Z,2020-05-19T20:59:59Z,2020-05-19T20:59:59Z,MEMBER,,"I want people to be able to query their Apple Photos metadata without having to first run `upload` to upload all of their files to their own S3 bucket. To do this I can have `apple-photos` calculate SHA256 hashes of each photo if the `uploads` table does not yet exist (or does not contain that photo).",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/19/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 613006393,MDU6SXNzdWU2MTMwMDYzOTM=,20,Ability to serve thumbnailed Apple Photo from its place on disk,9599,simonw,closed,0,,,,,10,2020-05-06T02:17:50Z,2020-05-25T20:14:22Z,2020-05-25T20:09:41Z,MEMBER,,"A custom Datasette plugin that can be run locally on a Mac laptop which knows how to serve photos such that they can be seen in the browser. 
_Originally posted by @simonw in https://github.com/dogsheep/photos-to-sqlite/issues/19#issuecomment-624406285_",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/20/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 613422636,MDU6SXNzdWU2MTM0MjI2MzY=,760,Way of seeing full schema for a database,9599,simonw,open,0,,,,,3,2020-05-06T15:46:08Z,2020-05-06T23:49:06Z,,OWNER,,"I find myself wanting to quickly figure out all of the BLOB columns in a database. A `/-/schema` page showing the full schema (actually since it's per-database probably `/dbname/-/schema` or `/-/schema/dbname`) would be really handy. It would need to be carefully constructed from various queries against `sqlite_master` - just doing `select * from sqlite_master where type='table'` isn't quite enough because I also want to show indexes, triggers etc.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/760/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 613467382,MDU6SXNzdWU2MTM0NjczODI=,761,Allow-list pragma_table_info(tablename) and similar,9599,simonw,closed,0,,,,,8,2020-05-06T16:54:14Z,2020-05-07T03:09:05Z,2020-05-06T17:18:38Z,OWNER,,"It would be great if `pragma_table_info(tablename)` was allowed to be used in queries. See also https://github.com/simonw/til/blob/master/sqlite/list-all-columns-in-a-database.md > `select * from pragma_table_info(tablename);` is currently disallowed for user-provided queries via a regex restriction - but could help here too. > > https://github.com/simonw/datasette/blob/d349d57cdf3d577afb62bdf784af342a4d5be660/datasette/utils/__init__.py#L174 _Originally posted by @simonw in https://github.com/simonw/datasette/issues/760#issuecomment-624729459_",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/761/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 613491342,MDU6SXNzdWU2MTM0OTEzNDI=,762,Experiment with PRAGMA hard_heap_limit ,9599,simonw,open,0,,,,,0,2020-05-06T17:33:23Z,2020-05-07T03:08:44Z,,OWNER,,"This was added in SQLite 2020-01-22 (3.31.0): https://www.sqlite.org/changes.html#version_3_31_0 > Add the [sqlite3_hard_heap_limit64()](https://www.sqlite.org/c3ref/hard_heap_limit64.html) interface and the corresponding [PRAGMA hard_heap_limit](https://www.sqlite.org/pragma.html#pragma_hard_heap_limit) command. This sounds like it could be a nice extra safety measure.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/762/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 613755043,MDU6SXNzdWU2MTM3NTUwNDM=,110,Support decimal.Decimal type,134771,dvhthomas,closed,0,,,,,6,2020-05-07T03:57:19Z,2020-05-11T01:58:20Z,2020-05-11T01:50:11Z,NONE,,"Decimal types in Postgres cause a failure in db.py data type selection --- I have a Django app using a MoneyField, which uses a `numeric(14,0)` data type in Postgres (https://www.postgresql.org/docs/9.3/datatype-numeric.html). When attempting to export that table I get the following error: ```bash $ db-to-sqlite --table isaweb_proposal ""postgres://connection"" test.db .... 
column_type=COLUMN_TYPE_MAPPING[column_type], KeyError: ``` Looking at `sql_utils.db.py` at 292-ish it's clear that there is no matching type for what I assume SQLAlchemy interprets as Python decimal.Decimal. From the [SQLite docs](https://www.sqlite.org/datatype3.html#affinity_name_examples) it looks like DECIMAL in other DBs are considered numeric. I'm not quite sure if it's as simple as adding a data type to that list or if there are repercussions beyond it. Thanks for a great tool!",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/110/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 613777056,MDU6SXNzdWU2MTM3NzcwNTY=,39,issues foreign key to repo isn't working,9599,simonw,closed,0,,,,,1,2020-05-07T05:11:48Z,2020-08-18T14:24:46Z,2020-08-18T14:23:56Z,MEMBER,,"https://github-to-sqlite.dogsheep.net/github/issues?_facet=repo If the foreign key was working those would be repository names. From the schema at the bottom of the page: ``` [repo] TEXT, ``` That's the wrong type and not a foreign key.",207052882,github-to-sqlite,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/github-to-sqlite/issues/39/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 614806683,MDExOlB1bGxSZXF1ZXN0NDE1Mjg2MTA1,763,Documentation + improvements for db.execute() and Results class,9599,simonw,closed,0,,,,,0,2020-05-08T15:16:02Z,2020-06-11T16:05:48Z,2020-05-08T16:05:46Z,OWNER,simonw/datasette/pulls/763,"Refs #685 Still TODO: - [x] Implement `results.first()` - [x] Implement `results.single_value()` - [x] Unit tests for the above ",107914493,datasette,pull,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/763/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",0, 615474990,MDU6SXNzdWU2MTU0NzQ5OTA=,21,bpylist.archiver.CircularReference: archive has a cycle with uid(13),9599,simonw,closed,0,,,,,11,2020-05-10T20:58:06Z,2020-12-19T07:44:49Z,2020-05-10T21:57:13Z,MEMBER,,"``` % python -i $(which photos-to-sqlite) apple-photos photos.db Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/photoinfo.py"", line 611, in place return self._place # pylint: disable=access-member-before-definition AttributeError: 'PhotoInfo' object has no attribute '_place' During handling of the above exception, another exception occurred: Traceback (most recent call last): File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/bin/photos-to-sqlite"", line 11, in load_entry_point('photos-to-sqlite', 'console_scripts', 'photos-to-sqlite')() File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 829, in __call__ return self.main(*args, **kwargs) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 782, in main rv = self.invoke(ctx) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 1259, in invoke return _process_result(sub_ctx.command.invoke(sub_ctx)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 
1066, in invoke return ctx.invoke(self.callback, **ctx.params) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/click/core.py"", line 610, in invoke return callback(*args, **kwargs) File ""/Users/simon/Dropbox/Development/photos-to-sqlite/photos_to_sqlite/cli.py"", line 249, in apple_photos photo_row = osxphoto_to_row(sha256, photo) File ""/Users/simon/Dropbox/Development/photos-to-sqlite/photos_to_sqlite/utils.py"", line 91, in osxphoto_to_row place = photo.place File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/photoinfo.py"", line 614, in place self._place = PlaceInfo5(self._info[""reverse_geolocation""]) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py"", line 505, in __init__ self._plrevgeoloc = archiver.unarchive(revgeoloc_bplist) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 16, in unarchive return Unarchive(plist).top_object() File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 256, in top_object return self.decode_object(self.top_uid) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 247, in decode_object obj = klass.decode_archive(ArchivedObject(raw_obj, self)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py"", line 126, in decode_archive mapItem = archive.decode(""mapItem"") File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 140, in decode return self._unarchiver.decode_key(self._object, key) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 216, in decode_key return self.decode_object(val) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 247, in decode_object obj = klass.decode_archive(ArchivedObject(raw_obj, self)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py"", line 180, in decode_archive sortedPlaceInfos = archive.decode(""sortedPlaceInfos"") File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 140, in decode return self._unarchiver.decode_key(self._object, key) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 216, in decode_key return self.decode_object(val) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 247, in decode_object obj = klass.decode_archive(ArchivedObject(raw_obj, self)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 112, in decode_archive return [archive._decode_index(index) for index in uids] File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 112, in return [archive._decode_index(index) for index in uids] File 
""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 137, in _decode_index return self._unarchiver.decode_object(index) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 247, in decode_object obj = klass.decode_archive(ArchivedObject(raw_obj, self)) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/osxphotos/placeinfo.py"", line 217, in decode_archive placeType = archive.decode(""placeType"") File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 140, in decode return self._unarchiver.decode_key(self._object, key) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 216, in decode_key return self.decode_object(val) File ""/Users/simon/.local/share/virtualenvs/photos-to-sqlite-0uGSHd6e/lib/python3.8/site-packages/bpylist/archiver.py"", line 227, in decode_object raise CircularReference(index) bpylist.archiver.CircularReference: archive has a cycle with uid(13) ``` In the debugger I traced this back to: ``` 178 @staticmethod 179 def decode_archive(archive): 180 -> sortedPlaceInfos = archive.decode(""sortedPlaceInfos"") 181 finalPlaceInfos = archive.decode(""finalPlaceInfos"") 182 return PLRevGeoMapItem(sortedPlaceInfos, finalPlaceInfos) ```",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 615477131,MDU6SXNzdWU2MTU0NzcxMzE=,111,sqlite-utils drop-table and drop-view commands,9599,simonw,closed,0,,,,,2,2020-05-10T21:10:42Z,2020-05-11T01:58:36Z,2020-05-11T00:44:26Z,OWNER,,Would be useful to be able to drop views and tables from the CLI.,140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/111/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 615626118,MDU6SXNzdWU2MTU2MjYxMTg=,22,Try out ExifReader,9599,simonw,open,0,,,,,4,2020-05-11T06:32:13Z,2020-05-14T05:59:53Z,,MEMBER,,"https://pypi.org/project/ExifReader/ New fork that should be able to handle EXIF in HEIC files. 
Forked here: https://github.com/ianare/exif-py/issues/102#issuecomment-626376522 Refs #3 ",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/22/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 616012427,MDU6SXNzdWU2MTYwMTI0Mjc=,764,Add PyPI project urls to setup.py,9599,simonw,closed,0,,,5471110,Datasette 0.43,3,2020-05-11T16:23:08Z,2020-05-27T20:21:36Z,2020-05-11T18:28:55Z,OWNER,,"Spotted this example here: ```python project_urls={ ""Issues"": ""https://gitlab.com/Cyb3r-Jak3/ExifReader/issues"", ""Source Code"": ""https://gitlab.com/Cyb3r-Jak3/ExifReader/-/tree/publish"", ""CI"": ""https://gitlab.com/Cyb3r-Jak3/ExifReader/pipelines"", ""Releases"": ""https://github.com/Cyb3r-Jak3/ExifReader"" }, ``` Results in this on https://pypi.org/project/ExifReader/ ",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/764/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 616087149,MDU6SXNzdWU2MTYwODcxNDk=,765,publish heroku should default to currently tagged version,9599,simonw,open,0,,,,,1,2020-05-11T18:24:06Z,2020-05-11T18:25:43Z,,OWNER,,"Had a report that deploying to Heroku was using the previously installed version of Datasette, not the latest. Could be because of this: https://github.com/simonw/datasette/blob/af6c6c5d6f929f951c0e63bfd1c82e37a071b50f/datasette/publish/heroku.py#L172-L179 Heroku documentation recommends pinning to specific versions https://devcenter.heroku.com/articles/python-pip So... we could ensure we default to an install value of `[""datasette>=current_tag""]`.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/765/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 616271236,MDU6SXNzdWU2MTYyNzEyMzY=,112,"add_foreign_key(...., ignore=True)",9599,simonw,closed,0,,,5896742,2.19,4,2020-05-12T00:24:00Z,2020-09-20T22:17:34Z,2020-09-20T22:17:34Z,OWNER,,"When using this library I often find myself wanting to ""add this foreign key, but only if it doesn't exist yet"". The `ignore=True` parameter is increasingly being used for this else where in the library (e.g. in `create_view()`).",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/112/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 617323873,MDU6SXNzdWU2MTczMjM4NzM=,766,Enable wildcard-searches by default,2181410,clausjuhl,open,0,,,,,2,2020-05-13T10:14:48Z,2021-03-05T16:35:21Z,,NONE,,"Hi Simon. It seems that datasette currently has wildcard-searches disabled by default (along with the boolean search-options, NEAR-queries and more, and despite the docs). If I try out the search-url provided in the [docs](https://datasette.readthedocs.io/en/stable/full_text_search.html#the-table-page-and-table-view-api) (https://fara.datasettes.com/fara/FARA_All_ShortForms?_search=manafort), it does not handle wildcard-searches, and I'm unable to make it work on my datasette-instance. I would argue that wildcard-searches is such a standard query, that it should be enabled by default. Requiring ""_searchmode=raw"" when using prefix-searches seems unnecessary. 
Plus: What happens to non-ascii searches when using ""_searchmode=raw""? Is the ""escape_fts""-function from datasette.utils ignored? Thanks! /Claus",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/766/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 620969465,MDU6SXNzdWU2MjA5Njk0NjU=,767,Allow to specify a URL fragment for canned queries,2657547,rixx,closed,0,,,5471110,Datasette 0.43,2,2020-05-19T13:17:42Z,2020-05-27T21:52:25Z,2020-05-27T21:52:25Z,CONTRIBUTOR,,"Canned queries are very useful to direct users to prepared data and views. I like to use them with charts using datasette-vega a lot, because people get a direct impression at first glance. datasette-vega doesn't show up by default though, and users have to click through to it. Also, datasette-vega does not always guess the best way to render columns correctly though, so it would be nice if I could specify a URL fragment in my canned queries to make sure people see what I want them to see. My current workaround is to include a fragement link in ``description_html`` and ask people to reload the page, like [here](https://data.rixx.de/songs/show_by_bpm#g.mark=bar&g.x_column=bpm_floor&g.x_type=ordinal&g.y_column=bpm_count&g.y_type=quantitative), which is a bit hacky.",107914493,datasette,issue,,,"{""url"": ""https://api.github.com/repos/simonw/datasette/issues/767/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 621280529,MDU6SXNzdWU2MjEyODA1Mjk=,23,create-subset command for creating a publishable subset of a photos database,9599,simonw,closed,0,,,,,1,2020-05-19T20:58:20Z,2020-05-19T22:32:48Z,2020-05-19T22:32:37Z,MEMBER,,"I want to share a subset of my photos, without sharing everything. Idea: $ photos-to-sqlite create-subset photos.db public.db ""select sha256 from ... 
where ..."" So the command takes a SQL query that returns sha256 hashes, then creates a new file called `public.db` containing just the data corresponding to those photos.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/23/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 621286870,MDU6SXNzdWU2MjEyODY4NzA=,113,Syntactic sugar for ATTACH DATABASE,9599,simonw,closed,0,,,,,2,2020-05-19T21:10:00Z,2021-02-19T05:09:12Z,2021-02-19T04:56:36Z,OWNER,,"https://www.sqlite.org/lang_attach.html Maybe something like this: ```python db.attach(""other_db"", ""other_db.db"") ```",140912432,sqlite-utils,issue,,,"{""url"": ""https://api.github.com/repos/simonw/sqlite-utils/issues/113/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed 621323348,MDU6SXNzdWU2MjEzMjMzNDg=,24,Configurable URL for images,9599,simonw,open,0,,,,,1,2020-05-19T22:25:56Z,2020-05-20T06:00:29Z,,MEMBER,,"This is hard-coded at the moment, which is bad: https://github.com/dogsheep/photos-to-sqlite/blob/d5d69b9019703c47bc251444838578dd752801e2/photos_to_sqlite/cli.py#L269-L272",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/24/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",, 621332242,MDU6SXNzdWU2MjEzMzIyNDI=,25,Create a public demo,9599,simonw,closed,0,,,,,5,2020-05-19T22:47:20Z,2020-05-21T22:26:16Z,2020-05-20T05:54:18Z,MEMBER,,"So I can show people what this does, using some of my photos.",256834907,dogsheep-photos,issue,,,"{""url"": ""https://api.github.com/repos/dogsheep/dogsheep-photos/issues/25/reactions"", ""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",,completed