html_url,issue_url,id,node_id,user,user_label,created_at,updated_at,author_association,body,reactions,issue,issue_label,performed_via_github_app https://github.com/simonw/datasette/issues/44#issuecomment-345342512,https://api.github.com/repos/simonw/datasette/issues/44,345342512,MDEyOklzc3VlQ29tbWVudDM0NTM0MjUxMg==,9599,simonw,2017-11-17T19:27:53Z,2017-11-20T04:37:35Z,OWNER,"This should support multiple columns, e.g. `?_group_count=precinct&_group_count=candidate`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/44#issuecomment-345343079,https://api.github.com/repos/simonw/datasette/issues/44,345343079,MDEyOklzc3VlQ29tbWVudDM0NTM0MzA3OQ==,9599,simonw,2017-11-17T19:29:43Z,2017-11-17T19:29:43Z,OWNER,Should this support sum/avg/etc as well?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/pull/117#issuecomment-345404257,https://api.github.com/repos/simonw/datasette/issues/117,345404257,MDEyOklzc3VlQ29tbWVudDM0NTQwNDI1Nw==,9599,simonw,2017-11-18T00:53:58Z,2017-11-18T00:53:58Z,OWNER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274900388,Don't prevent tabbing to `Run SQL` button, https://github.com/simonw/datasette/pull/104#issuecomment-345447161,https://api.github.com/repos/simonw/datasette/issues/104,345447161,MDEyOklzc3VlQ29tbWVudDM0NTQ0NzE2MQ==,9599,simonw,2017-11-18T14:53:17Z,2017-11-18T14:53:17Z,OWNER,any reason I shouldn't land this?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274284246,[WIP] Add publish to heroku support, https://github.com/simonw/datasette/issues/36#issuecomment-345448756,https://api.github.com/repos/simonw/datasette/issues/36,345448756,MDEyOklzc3VlQ29tbWVudDM0NTQ0ODc1Ng==,9599,simonw,2017-11-18T15:17:43Z,2017-11-18T15:17:43Z,OWNER,"This may be useful: https://github.com/coleifer/peewee/blob/db85167d93861451a1fe7cde8c4f05748b222634/peewee.py#L162-L185","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268262480,"date, year, month and day querystring lookups", https://github.com/simonw/datasette/issues/121#issuecomment-345452215,https://api.github.com/repos/simonw/datasette/issues/121,345452215,MDEyOklzc3VlQ29tbWVudDM0NTQ1MjIxNQ==,9599,simonw,2017-11-18T16:11:23Z,2017-11-18T16:11:23Z,OWNER,"If a column value is invalid JSON, let's return the invalid JSON as a regular string.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275089535,?_json=foo&_json=bar query string argument , https://github.com/simonw/datasette/pull/104#issuecomment-345452669,https://api.github.com/repos/simonw/datasette/issues/104,345452669,MDEyOklzc3VlQ29tbWVudDM0NTQ1MjY2OQ==,21148,jacobian,2017-11-18T16:18:45Z,2017-11-18T16:18:45Z,CONTRIBUTOR,"I'd like to do a bit of cleanup, and some error checking in case heroku/heroku-builds isn't installed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274284246,[WIP] Add publish to heroku support, https://github.com/simonw/datasette/issues/105#issuecomment-345493344,https://api.github.com/repos/simonw/datasette/issues/105,345493344,MDEyOklzc3VlQ29tbWVudDM0NTQ5MzM0NA==,9599,simonw,2017-11-19T05:28:49Z,2017-11-19T05:28:49Z,OWNER,Looks like there are a ton of interesting datasets packaged in this way at http://datahub.io/docs/core-data - see also https://github.com/datasets,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274314940,Consider data-package as a format for metadata, https://github.com/simonw/datasette/issues/105#issuecomment-345494052,https://api.github.com/repos/simonw/datasette/issues/105,345494052,MDEyOklzc3VlQ29tbWVudDM0NTQ5NDA1Mg==,9599,simonw,2017-11-19T05:49:53Z,2017-11-19T05:49:53Z,OWNER,https://github.com/rgieseke/pandas-datapackage-reader,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274314940,Consider data-package as a format for metadata, https://github.com/simonw/datasette/issues/85#issuecomment-345494724,https://api.github.com/repos/simonw/datasette/issues/85,345494724,MDEyOklzc3VlQ29tbWVudDM0NTQ5NDcyNA==,9599,simonw,2017-11-19T06:08:19Z,2017-11-19T06:08:19Z,OWNER,"This is working really nicely now: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273678673,Detect foreign keys and use them to link HTML pages together, https://github.com/simonw/datasette/issues/86#issuecomment-345494775,https://api.github.com/repos/simonw/datasette/issues/86,345494775,MDEyOklzc3VlQ29tbWVudDM0NTQ5NDc3NQ==,9599,simonw,2017-11-19T06:09:43Z,2017-11-19T06:09:43Z,OWNER,"Now that we have foreign key support (#85) this is even more important, since foreign key support actively encourages linking to filtered table views.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-345494918,https://api.github.com/repos/simonw/datasette/issues/86,345494918,MDEyOklzc3VlQ29tbWVudDM0NTQ5NDkxOA==,9599,simonw,2017-11-19T06:14:17Z,2017-11-19T06:14:17Z,OWNER,"If the selected relationship is a foreign key reference, we should resolve that foreign key and display it on the page.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/44#issuecomment-345494971,https://api.github.com/repos/simonw/datasette/issues/44,345494971,MDEyOklzc3VlQ29tbWVudDM0NTQ5NDk3MQ==,9599,simonw,2017-11-19T06:15:39Z,2017-11-19T06:15:39Z,OWNER,It would be great if this could support foreign key references and automatically resolve and hyperlink them if they are detected.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/127#issuecomment-345495046,https://api.github.com/repos/simonw/datasette/issues/127,345495046,MDEyOklzc3VlQ29tbWVudDM0NTQ5NTA0Ng==,9599,simonw,2017-11-19T06:17:42Z,2017-11-19T06:17:42Z,OWNER,Maybe I should support `&_count=1` to handle this - that would be easy to Ajax-in in conjenction with the other filters.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275135719,"Filtered tables should show count of all matching rows, if fast enough", https://github.com/simonw/datasette/issues/86#issuecomment-345496540,https://api.github.com/repos/simonw/datasette/issues/86,345496540,MDEyOklzc3VlQ29tbWVudDM0NTQ5NjU0MA==,9599,simonw,2017-11-19T06:59:40Z,2017-11-19T06:59:40Z,OWNER,"OK,I've figured out how to do an initial version of this without JavaScript. I'll provide three form fields labell d ""add filter"": * a select box of all of the columns * a select box of the available operations * a value box Submit those and the site will redirect you to a correctly populated querystring for that filter. If you have filters applied, those will display as prepopulated form field triples. For foreign key reference filters, I will display the resolved value next to the text box containing the numeric ID. In the future this can get a select2 style treatment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-345497453,https://api.github.com/repos/simonw/datasette/issues/86,345497453,MDEyOklzc3VlQ29tbWVudDM0NTQ5NzQ1Mw==,9599,simonw,2017-11-19T07:21:22Z,2017-11-19T07:21:22Z,OWNER,I'm going to be a bit classier about this and auto generate a title for the page that describes the currently applied filters.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-345497534,https://api.github.com/repos/simonw/datasette/issues/86,345497534,MDEyOklzc3VlQ29tbWVudDM0NTQ5NzUzNA==,9599,simonw,2017-11-19T07:23:33Z,2017-11-19T07:23:33Z,OWNER,"""Tablename: 3,567 rows where status = 3 (published) and n > 55""","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-345497689,https://api.github.com/repos/simonw/datasette/issues/86,345497689,MDEyOklzc3VlQ29tbWVudDM0NTQ5NzY4OQ==,9599,simonw,2017-11-19T07:27:40Z,2017-11-19T07:27:40Z,OWNER,"I'll have to refactor the foreign key annotating code to be usable in other contexts - at the moment it only works for annotating displays of rows, but I need to use it to resolve selected filters as well. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/105#issuecomment-345503897,https://api.github.com/repos/simonw/datasette/issues/105,345503897,MDEyOklzc3VlQ29tbWVudDM0NTUwMzg5Nw==,198537,rgieseke,2017-11-19T09:38:08Z,2017-11-19T09:38:08Z,CONTRIBUTOR,"Thanks, I wrote this very simple reader because the default approach as described on the Datahub pages seemed to complicated. I had metadata from the `datapackage.json` attached to the returned DataFrames but removed this due to some attribute handling change in the latest Pandas version. This could also be useful for getting from Data Package to SQL db: https://github.com/frictionlessdata/tableschema-sql-py I maintain a few climate science related dataset at https://github.com/openclimatedata/ The Data Retriever (mainly ecological data) by @ethanwhite et al. is also using the Data Package format for metadata and has some tooling for different dbs: https://frictionlessdata.io/articles/the-data-retriever/ https://github.com/weecology/retriever The Open Power System Data project also has a couple of datasets that show nicely how CSV is great for assembling and then already make SQLite files available. It's one of the first data sets I tried with Datasette, perfect for the use case of getting an API for putting power stations on a map ... https://data.open-power-system-data.org/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274314940,Consider data-package as a format for metadata, https://github.com/simonw/datasette/issues/97#issuecomment-345509500,https://api.github.com/repos/simonw/datasette/issues/97,345509500,MDEyOklzc3VlQ29tbWVudDM0NTUwOTUwMA==,231923,yschimke,2017-11-19T11:26:58Z,2017-11-19T11:26:58Z,NONE,"Specifically docs should make it clearer this file exists https://parlgov.datasettes.com/.json And from that you can build https://parlgov.datasettes.com/parlgov-25f9855.json Then https://parlgov.datasettes.com/parlgov-25f9855/cabinet.json","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274022950,Link to JSON for the list of tables , https://github.com/simonw/datasette/issues/131#issuecomment-345526171,https://api.github.com/repos/simonw/datasette/issues/131,345526171,MDEyOklzc3VlQ29tbWVudDM0NTUyNjE3MQ==,9599,simonw,2017-11-19T15:44:30Z,2017-11-19T15:44:30Z,OWNER,"Relevant SQLite docs: * https://sqlite.org/fts5.html * https://www.sqlite.org/fts3.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275166669,UI support for running FTS searches, https://github.com/simonw/datasette/issues/131#issuecomment-345526517,https://api.github.com/repos/simonw/datasette/issues/131,345526517,MDEyOklzc3VlQ29tbWVudDM0NTUyNjUxNw==,9599,simonw,2017-11-19T15:48:28Z,2017-11-19T15:48:28Z,OWNER,"Since SQLite supports column specifications in the MATCH body itself, there's no need to provide a separate mechanism for specifying columns in the query string: https://sqlite.org/fts5.html#fts5_column_filters","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275166669,UI support for running FTS searches, https://github.com/simonw/datasette/issues/131#issuecomment-345533274,https://api.github.com/repos/simonw/datasette/issues/131,345533274,MDEyOklzc3VlQ29tbWVudDM0NTUzMzI3NA==,9599,simonw,2017-11-19T17:17:37Z,2017-11-19T17:18:05Z,OWNER,"Demo: https://sf-trees.now.sh/sf-trees-ebc2ad9/Street_Tree_List?_search=grove+st ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275166669,UI support for running FTS searches, https://github.com/simonw/datasette/issues/134#issuecomment-345537268,https://api.github.com/repos/simonw/datasette/issues/134,345537268,MDEyOklzc3VlQ29tbWVudDM0NTUzNzI2OA==,9599,simonw,2017-11-19T18:10:48Z,2017-11-19T18:10:48Z,OWNER,Dupe of #127 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275176094,Filtered table view should show a count, https://github.com/simonw/datasette/issues/44#issuecomment-345537315,https://api.github.com/repos/simonw/datasette/issues/44,345537315,MDEyOklzc3VlQ29tbWVudDM0NTUzNzMxNQ==,9599,simonw,2017-11-19T18:11:27Z,2017-11-19T18:11:27Z,OWNER,This would enable faceted search - moving it to the search milestone.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/127#issuecomment-345538016,https://api.github.com/repos/simonw/datasette/issues/127,345538016,MDEyOklzc3VlQ29tbWVudDM0NTUzODAxNg==,9599,simonw,2017-11-19T18:22:45Z,2017-11-19T18:22:45Z,OWNER,I implemented a basic version of this in f59c840e7db8870afcdeba7a53bdea07bb674334 for custom SQL.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275135719,"Filtered tables should show count of all matching rows, if fast enough", https://github.com/simonw/datasette/issues/122#issuecomment-345552358,https://api.github.com/repos/simonw/datasette/issues/122,345552358,MDEyOklzc3VlQ29tbWVudDM0NTU1MjM1OA==,9599,simonw,2017-11-19T21:45:38Z,2017-12-05T19:09:52Z,OWNER,"For the overall shape of the rows: `?_shape=lists` (default), `?_shape=objects`, `?_shape=object` (primary key as object keys) For getting back extra keys: `?_extras=schema,query,timing` For expanding columns: `?_expand_all=1` Or `?_expand=qSpecies&_expand=qCaretaker` The template view will only be allowed to work with data it can request using extra options. That leaves one sighted nasty edge-case: the default view will expand all columns, but the `.json` view of it won't? I think that's OK. The default view won't include the extras used by the template to render the page either.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275092453,"Redesign JSON output, ditch jsono, offer variants controlled by parameter instead", https://github.com/simonw/datasette/issues/122#issuecomment-345552440,https://api.github.com/repos/simonw/datasette/issues/122,345552440,MDEyOklzc3VlQ29tbWVudDM0NTU1MjQ0MA==,9599,simonw,2017-11-19T21:46:43Z,2017-11-19T21:46:43Z,OWNER,"This calls for refactoring the code so the table view, the row view and the custom SQL view share as much logic as possible.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275092453,"Redesign JSON output, ditch jsono, offer variants controlled by parameter instead", https://github.com/simonw/datasette/issues/122#issuecomment-345552500,https://api.github.com/repos/simonw/datasette/issues/122,345552500,MDEyOklzc3VlQ29tbWVudDM0NTU1MjUwMA==,9599,simonw,2017-11-19T21:47:27Z,2017-11-19T21:47:27Z,OWNER,"To start with, I could just ditch the .jsono in favour of the new _shape argument.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275092453,"Redesign JSON output, ditch jsono, offer variants controlled by parameter instead", https://github.com/simonw/datasette/issues/86#issuecomment-345559864,https://api.github.com/repos/simonw/datasette/issues/86,345559864,MDEyOklzc3VlQ29tbWVudDM0NTU1OTg2NA==,9599,simonw,2017-11-19T23:35:48Z,2017-11-19T23:35:48Z,OWNER,"I need a nicer abstraction around the concept of filters. It needs to be able to: - convert querystring parameters into filters - convert filters into a querystring - iterate through currently applied filters - convert selected filters into a human description (e.g. for a title) - expand filters that involve a foreign key - add filters - remove filters - define different types of filters It should replace my current `build_where_clauses` implementation, in particular this bit: https://github.com/simonw/datasette/blob/a5881e105a02830d26f07e98177248d5910893da/datasette/utils.py#L38-L56","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/44#issuecomment-345601103,https://api.github.com/repos/simonw/datasette/issues/44,345601103,MDEyOklzc3VlQ29tbWVudDM0NTYwMTEwMw==,9599,simonw,2017-11-20T06:13:35Z,2017-11-20T06:13:35Z,OWNER,"Some demos: Single column: https://sf-trees-flat.now.sh/sf-trees-flat-ba738ce/Street_Tree_List?_group_count=qSpecies Multi column: https://sf-trees-flat.now.sh/sf-trees-flat-ba738ce/Street_Tree_List?_group_count=qLegalStatus&_group_count=qSpecies ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/133#issuecomment-345601870,https://api.github.com/repos/simonw/datasette/issues/133,345601870,MDEyOklzc3VlQ29tbWVudDM0NTYwMTg3MA==,9599,simonw,2017-11-20T06:18:53Z,2017-11-20T06:18:53Z,OWNER,This may be tackled by the filters work happening in #86,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275176006,"If view is filtered, search should apply within those filtered rows", https://github.com/simonw/datasette/issues/27#issuecomment-345652450,https://api.github.com/repos/simonw/datasette/issues/27,345652450,MDEyOklzc3VlQ29tbWVudDM0NTY1MjQ1MA==,198537,rgieseke,2017-11-20T10:19:39Z,2017-11-20T10:19:39Z,CONTRIBUTOR,"If Data Package metadata gets adopted (#105) the views spec work might also be worth a look: http://frictionlessdata.io/specs/views/ http://datahub.io/docs/features/views ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267886330,Ability to plot a simple graph, https://github.com/simonw/datasette/issues/137#issuecomment-345750135,https://api.github.com/repos/simonw/datasette/issues/137,345750135,MDEyOklzc3VlQ29tbWVudDM0NTc1MDEzNQ==,9599,simonw,2017-11-20T16:30:56Z,2018-07-10T17:53:13Z,OWNER,"One possible route: introduce prefixes eg `?a.Trees.age__gt=5&a.Trees._group_count=qSpecies&b.Trees.age__gt=10&b.Trees._group_count=qSpecies` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275415799,Ability to combine multiple SQL queries on a single graph, https://github.com/simonw/datasette/issues/129#issuecomment-345793887,https://api.github.com/repos/simonw/datasette/issues/129,345793887,MDEyOklzc3VlQ29tbWVudDM0NTc5Mzg4Nw==,9599,simonw,2017-11-20T19:00:30Z,2017-11-20T19:00:30Z,OWNER,"Need to hide these from the index summary page as well: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275164558,Hide FTS-created tables by default on the database index page, https://github.com/simonw/datasette/issues/105#issuecomment-345809808,https://api.github.com/repos/simonw/datasette/issues/105,345809808,MDEyOklzc3VlQ29tbWVudDM0NTgwOTgwOA==,9599,simonw,2017-11-20T19:50:53Z,2017-11-20T19:50:53Z,OWNER,"OK, https://github.com/openclimatedata/global-carbon-budget/blob/master/datapackage.json really does look like it covers all of the bases I need for #138. Closing this ticket in favour of that new one.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274314940,Consider data-package as a format for metadata, https://github.com/simonw/datasette/issues/42#issuecomment-345810031,https://api.github.com/repos/simonw/datasette/issues/42,345810031,MDEyOklzc3VlQ29tbWVudDM0NTgxMDAzMQ==,9599,simonw,2017-11-20T19:51:29Z,2017-11-20T19:51:29Z,OWNER,See also #138,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268591332,Homepage UI for editing metadata file, https://github.com/simonw/datasette/issues/14#issuecomment-345893877,https://api.github.com/repos/simonw/datasette/issues/14,345893877,MDEyOklzc3VlQ29tbWVudDM0NTg5Mzg3Nw==,9599,simonw,2017-11-21T02:11:27Z,2017-11-21T02:11:27Z,OWNER,http://setuptools.readthedocs.io/en/latest/setuptools.html#dynamic-discovery-of-services-and-plugins Is pretty good ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/pull/104#issuecomment-346116745,https://api.github.com/repos/simonw/datasette/issues/104,346116745,MDEyOklzc3VlQ29tbWVudDM0NjExNjc0NQ==,21148,jacobian,2017-11-21T18:23:25Z,2017-11-21T18:23:25Z,CONTRIBUTOR,"@simonw ready for a review and merge if you want. There's still some nasty duplicated code in cli.py and utils.py, which is just going to get worse if/when we start adding any other deploy targets (and I want to do one for cloud.gov, at least). I think there's an opportunity for some refactoring here. I'm happy to do that now as part of this PR, or if you merge this first I'll do it in a different one.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274284246,[WIP] Add publish to heroku support, https://github.com/simonw/datasette/pull/104#issuecomment-346124073,https://api.github.com/repos/simonw/datasette/issues/104,346124073,MDEyOklzc3VlQ29tbWVudDM0NjEyNDA3Mw==,21148,jacobian,2017-11-21T18:49:55Z,2017-11-21T18:49:55Z,CONTRIBUTOR,"Actually hang on, don't merge - there are some bugs that #141 masked when I tested this out elsewhere.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274284246,[WIP] Add publish to heroku support, https://github.com/simonw/datasette/pull/104#issuecomment-346124764,https://api.github.com/repos/simonw/datasette/issues/104,346124764,MDEyOklzc3VlQ29tbWVudDM0NjEyNDc2NA==,21148,jacobian,2017-11-21T18:52:14Z,2017-11-21T18:52:14Z,CONTRIBUTOR,"OK, now this should work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274284246,[WIP] Add publish to heroku support, https://github.com/simonw/datasette/issues/141#issuecomment-346157542,https://api.github.com/repos/simonw/datasette/issues/141,346157542,MDEyOklzc3VlQ29tbWVudDM0NjE1NzU0Mg==,9599,simonw,2017-11-21T20:53:47Z,2017-11-21T20:53:47Z,OWNER,"I think a copy is the right thing to do here - it will be cleaned up when the temp directory is removed. The hard link thing was always intended to save space, but if we can't do a hard link I don't see any harm in a temporary file copy.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275814941,datasette publish can fail if /tmp is on a different device, https://github.com/simonw/datasette/issues/90#issuecomment-346161985,https://api.github.com/repos/simonw/datasette/issues/90,346161985,MDEyOklzc3VlQ29tbWVudDM0NjE2MTk4NQ==,9599,simonw,2017-11-21T21:10:22Z,2017-11-21T21:10:22Z,OWNER,"Woohoo! I've found one tiny issue: right now, the following doesn't work: datasette publish heroku ../demo-databses/google-trends.db It results in this error in the Heroku logs: 2017-11-21T21:03:29.210511+00:00 app[web.1]: Usage: datasette serve [OPTIONS] [FILES]... 2017-11-21T21:03:29.210524+00:00 app[web.1]: 2017-11-21T21:03:29.210555+00:00 app[web.1]: Error: Invalid value for ""files"": Path ""../demo-databses/google-trends.db"" does not exist. The command works fine if you run it in the same directory as the database file you are publishing.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/90#issuecomment-346163513,https://api.github.com/repos/simonw/datasette/issues/90,346163513,MDEyOklzc3VlQ29tbWVudDM0NjE2MzUxMw==,9599,simonw,2017-11-21T21:16:16Z,2017-11-21T21:16:16Z,OWNER,"The reason relative paths work for `publish now` is that the `make_dockerfile()` function is called by passing the file names, not the full file paths: https://github.com/simonw/datasette/blob/e47117ce1d15f11246a3120aa49de70205713d05/datasette/utils.py#L166 Clearly the correct thing to do here is for us to refactor the shared code between heroku/package/now.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/142#issuecomment-346217739,https://api.github.com/repos/simonw/datasette/issues/142,346217739,MDEyOklzc3VlQ29tbWVudDM0NjIxNzczOQ==,9599,simonw,2017-11-22T01:45:30Z,2017-11-22T01:45:30Z,OWNER,Might be nice to have a --no-limits option that disables time and maximum row count limits.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275917760,Show extra instructions with the interrupted, https://github.com/simonw/datasette/issues/14#issuecomment-346244871,https://api.github.com/repos/simonw/datasette/issues/14,346244871,MDEyOklzc3VlQ29tbWVudDM0NjI0NDg3MQ==,21148,jacobian,2017-11-22T05:06:30Z,2017-11-22T05:06:30Z,CONTRIBUTOR,"I'd also suggest taking a look at [stevedore](https://docs.openstack.org/stevedore/latest/), which has a ton of tools for doing plugin stuff. I've had good luck with it in the past.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/144#issuecomment-346405660,https://api.github.com/repos/simonw/datasette/issues/144,346405660,MDEyOklzc3VlQ29tbWVudDM0NjQwNTY2MA==,9599,simonw,2017-11-22T16:38:05Z,2017-11-22T16:38:05Z,OWNER,"I have a solution for FTS already, but I'm interested in apsw as a mechanism for allowing custom virtual tables to be written in Python (pysqlite only lets you write custom functions) Not having PyPI support is pretty tough though. I'm planning a plugin/extension system which would be ideal for things like an optional apsw mode, but that's a lot harder if apsw isn't in PyPI.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276091279,apsw as alternative sqlite3 binding (for full text search), https://github.com/simonw/datasette/issues/14#issuecomment-346406009,https://api.github.com/repos/simonw/datasette/issues/14,346406009,MDEyOklzc3VlQ29tbWVudDM0NjQwNjAwOQ==,9599,simonw,2017-11-22T16:39:08Z,2017-11-22T16:39:08Z,OWNER,"Oh thanks, that definitely looks like an interesting option.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/144#issuecomment-346427794,https://api.github.com/repos/simonw/datasette/issues/144,346427794,MDEyOklzc3VlQ29tbWVudDM0NjQyNzc5NA==,649467,mhalle,2017-11-22T17:55:45Z,2017-11-22T17:55:45Z,NONE,"Thanks. There is a way to use pip to grab apsw, which also let's you configure it (flags to build extensions, use an internal sqlite, etc). Don't know how that works as a dependency for another package, though. On November 22, 2017 11:38:06 AM EST, Simon Willison wrote: >I have a solution for FTS already, but I'm interested in apsw as a >mechanism for allowing custom virtual tables to be written in Python >(pysqlite only lets you write custom functions) > >Not having PyPI support is pretty tough though. I'm planning a >plugin/extension system which would be ideal for things like an >optional apsw mode, but that's a lot harder if apsw isn't in PyPI. > >-- >You are receiving this because you authored the thread. >Reply to this email directly or view it on GitHub: >https://github.com/simonw/datasette/issues/144#issuecomment-346405660 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276091279,apsw as alternative sqlite3 binding (for full text search), https://github.com/simonw/datasette/issues/129#issuecomment-346463342,https://api.github.com/repos/simonw/datasette/issues/129,346463342,MDEyOklzc3VlQ29tbWVudDM0NjQ2MzM0Mg==,9599,simonw,2017-11-22T20:22:02Z,2017-11-22T20:22:02Z,OWNER,"On the index page: On the database index page: After clicking that link: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275164558,Hide FTS-created tables by default on the database index page, https://github.com/simonw/datasette/issues/86#issuecomment-346530498,https://api.github.com/repos/simonw/datasette/issues/86,346530498,MDEyOklzc3VlQ29tbWVudDM0NjUzMDQ5OA==,9599,simonw,2017-11-23T04:35:07Z,2017-11-23T04:35:07Z,OWNER,"Here's where I am now. Needs a bit of UI tidy up and it will be good to release: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/146#issuecomment-346682905,https://api.github.com/repos/simonw/datasette/issues/146,346682905,MDEyOklzc3VlQ29tbWVudDM0NjY4MjkwNQ==,9599,simonw,2017-11-23T18:55:08Z,2017-11-23T18:55:08Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276455748,datasette publish gcloud, https://github.com/simonw/datasette/issues/86#issuecomment-346691243,https://api.github.com/repos/simonw/datasette/issues/86,346691243,MDEyOklzc3VlQ29tbWVudDM0NjY5MTI0Mw==,9599,simonw,2017-11-23T20:07:15Z,2017-11-23T20:07:15Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/86#issuecomment-346694211,https://api.github.com/repos/simonw/datasette/issues/86,346694211,MDEyOklzc3VlQ29tbWVudDM0NjY5NDIxMQ==,9599,simonw,2017-11-23T20:34:32Z,2017-11-23T20:34:32Z,OWNER,And with ef3eacf622e69723d48ab1ad597645770a7361db I'm ready to call this one done.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273703829,Filter UI on table page, https://github.com/simonw/datasette/issues/132#issuecomment-346701751,https://api.github.com/repos/simonw/datasette/issues/132,346701751,MDEyOklzc3VlQ29tbWVudDM0NjcwMTc1MQ==,9599,simonw,2017-11-23T21:51:51Z,2017-11-23T21:51:51Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275175929,Row view is not currently expanding foreign keys, https://github.com/simonw/datasette/issues/133#issuecomment-346705879,https://api.github.com/repos/simonw/datasette/issues/133,346705879,MDEyOklzc3VlQ29tbWVudDM0NjcwNTg3OQ==,9599,simonw,2017-11-23T22:43:42Z,2017-11-24T22:07:46Z,OWNER,"Easiest way to do this will be to move it into the same `
` as the filters. Would be nice to detect `?_search=` and redirect to URL without the `_search` parameter, just for aesthetics.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275176006,"If view is filtered, search should apply within those filtered rows", https://github.com/simonw/datasette/issues/147#issuecomment-346900554,https://api.github.com/repos/simonw/datasette/issues/147,346900554,MDEyOklzc3VlQ29tbWVudDM0NjkwMDU1NA==,9599,simonw,2017-11-24T22:02:22Z,2017-11-24T22:02:22Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276476670,Tidy up design of the header of the table page, https://github.com/simonw/datasette/issues/133#issuecomment-346902583,https://api.github.com/repos/simonw/datasette/issues/133,346902583,MDEyOklzc3VlQ29tbWVudDM0NjkwMjU4Mw==,9599,simonw,2017-11-24T22:30:32Z,2017-11-24T22:30:32Z,OWNER," ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275176006,"If view is filtered, search should apply within those filtered rows", https://github.com/simonw/datasette/issues/149#issuecomment-346903317,https://api.github.com/repos/simonw/datasette/issues/149,346903317,MDEyOklzc3VlQ29tbWVudDM0NjkwMzMxNw==,9599,simonw,2017-11-24T22:41:58Z,2017-11-24T22:41:58Z,OWNER,"Custom SQL results now look like this: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276704127,Update custom SQL results to match new table view header, https://github.com/simonw/datasette/issues/141#issuecomment-346974336,https://api.github.com/repos/simonw/datasette/issues/141,346974336,MDEyOklzc3VlQ29tbWVudDM0Njk3NDMzNg==,50138,janimo,2017-11-26T00:00:35Z,2017-11-26T00:00:35Z,NONE,FWIW I worked around this by setting TMPDIR to ~/tmp before running the command.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275814941,datasette publish can fail if /tmp is on a different device, https://github.com/simonw/datasette/issues/124#issuecomment-346987395,https://api.github.com/repos/simonw/datasette/issues/124,346987395,MDEyOklzc3VlQ29tbWVudDM0Njk4NzM5NQ==,50138,janimo,2017-11-26T06:24:08Z,2017-11-26T06:24:08Z,NONE,"Are there performance gains when using immutable as opposed to read-only? From what I see other processes can still modify the DB when immutable, but there are no change notifications.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125805,Option to open readonly but not immutable, https://github.com/simonw/datasette/issues/124#issuecomment-347049888,https://api.github.com/repos/simonw/datasette/issues/124,347049888,MDEyOklzc3VlQ29tbWVudDM0NzA0OTg4OA==,9599,simonw,2017-11-27T00:01:08Z,2017-11-27T00:01:08Z,OWNER,"https://sqlite.org/c3ref/open.html Is the only documentation I've been able to find of the immutable option: > **immutable**: The immutable parameter is a boolean query parameter that indicates that the database file is stored on read-only media. When immutable is set, SQLite assumes that the database file cannot be changed, even by a process with higher privilege, and so the database is opened read-only and all locking and change detection is disabled. Caution: Setting the immutable property on a database file that does in fact change can result in incorrect query results and/or SQLITE_CORRUPT errors. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125805,Option to open readonly but not immutable, https://github.com/simonw/datasette/issues/153#issuecomment-347050235,https://api.github.com/repos/simonw/datasette/issues/153,347050235,MDEyOklzc3VlQ29tbWVudDM0NzA1MDIzNQ==,9599,simonw,2017-11-27T00:06:24Z,2017-11-27T00:06:24Z,OWNER,"I've been thinking about 1. a bit - I actually think it would be fine to have a rule that says ""if the contents of the cell starts with `http://` or `https://` and doesn't contain any whitespace, turn that into a link"". If you need the non-linked version that will always be available in the JSON. For the other two... I think #12 may be the way to go here: if you can easily over-ride the `row.html` and `table.html` templates for specific databases you can easily set pre-formatted text or similar for certain values - maybe even with CSS that targets a specific table column.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-347051331,https://api.github.com/repos/simonw/datasette/issues/153,347051331,MDEyOklzc3VlQ29tbWVudDM0NzA1MTMzMQ==,9599,simonw,2017-11-27T00:23:40Z,2017-11-27T03:58:49Z,OWNER,"One quick fix could be to add a `extra_css_url` key to the `metadata.json` format (which currently hosts `title`, `license_url` etc) - if populated, we can inject a link to that stylesheet on every page. We could add a few classes in strategic places that include the database and table names to give people styling hooks. While we're at it, an `extra_js_url` key would let people go really nuts!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/124#issuecomment-347123991,https://api.github.com/repos/simonw/datasette/issues/124,347123991,MDEyOklzc3VlQ29tbWVudDM0NzEyMzk5MQ==,50138,janimo,2017-11-27T09:25:15Z,2017-11-27T09:25:15Z,NONE,"That's the only reference to immutable I saw as well, making me think that there may be no perceivable advantages over simply using mode=ro. Since the database is never or seldom updated the change notifications should not impact performance.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125805,Option to open readonly but not immutable, https://github.com/simonw/datasette/issues/124#issuecomment-347236102,https://api.github.com/repos/simonw/datasette/issues/124,347236102,MDEyOklzc3VlQ29tbWVudDM0NzIzNjEwMg==,9599,simonw,2017-11-27T16:24:15Z,2017-11-27T16:24:15Z,OWNER,I'd really like to get some benchmarks working so I can see the actual impact of this kind of thing.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125805,Option to open readonly but not immutable, https://github.com/simonw/datasette/issues/155#issuecomment-347713453,https://api.github.com/repos/simonw/datasette/issues/155,347713453,MDEyOklzc3VlQ29tbWVudDM0NzcxMzQ1Mw==,9599,simonw,2017-11-29T00:41:30Z,2017-11-29T00:41:30Z,OWNER,Could you provide the SQL to create a reproducible test case (both CREATE TABLE and INSERT statements)?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",277589569,A primary key column that has foreign key restriction associated won't rendering label column, https://github.com/simonw/datasette/issues/155#issuecomment-347714314,https://api.github.com/repos/simonw/datasette/issues/155,347714314,MDEyOklzc3VlQ29tbWVudDM0NzcxNDMxNA==,388154,wsxiaoys,2017-11-29T00:46:25Z,2017-11-29T00:46:25Z,NONE,"``` CREATE TABLE rhs ( id INTEGER PRIMARY KEY, name TEXT ); CREATE TABLE lhs ( symbol INTEGER PRIMARY KEY, FOREIGN KEY (symbol) REFERENCES rhs(id) ); INSERT INTO rhs VALUES (1, ""foo""); INSERT INTO rhs VALUES (2, ""bar""); INSERT INTO lhs VALUES (1); INSERT INTO lhs VALUES (2); ``` It's expected that in lhs's view, foo / bar should be displayed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",277589569,A primary key column that has foreign key restriction associated won't rendering label column, https://github.com/simonw/datasette/issues/155#issuecomment-347714471,https://api.github.com/repos/simonw/datasette/issues/155,347714471,MDEyOklzc3VlQ29tbWVudDM0NzcxNDQ3MQ==,9599,simonw,2017-11-29T00:47:21Z,2017-11-29T00:47:21Z,OWNER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",277589569,A primary key column that has foreign key restriction associated won't rendering label column, https://github.com/simonw/datasette/issues/155#issuecomment-347715452,https://api.github.com/repos/simonw/datasette/issues/155,347715452,MDEyOklzc3VlQ29tbWVudDM0NzcxNTQ1Mg==,9599,simonw,2017-11-29T00:52:30Z,2017-11-29T00:52:30Z,OWNER,"Interestingly, it almost does the right thing on the individual row page: https://bug-155-dkcqckhgki.now.sh/bug-155-9a7bb68/lhs/1 The symbol has been expanded, but there's a rogue '1' that shouldn't be there at all - I think that's bug #152 The table view itself is definitely doing the wrong thing: https://bug-155-dkcqckhgki.now.sh/bug-155-9a7bb68/lhs ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",277589569,A primary key column that has foreign key restriction associated won't rendering label column, https://github.com/simonw/datasette/issues/153#issuecomment-347735334,https://api.github.com/repos/simonw/datasette/issues/153,347735334,MDEyOklzc3VlQ29tbWVudDM0NzczNTMzNA==,9599,simonw,2017-11-29T02:45:03Z,2017-11-29T02:45:03Z,OWNER,"@ftrain OK I've shipped the first version of this. Here's the initial documentation: Create a `metadata.json` file that looks like this: { ""extra_css_urls"": [ ""https://simonwillison.net/static/css/all.bf8cd891642c.css"" ], ""extra_js_urls"": [ ""https://code.jquery.com/jquery-3.2.1.slim.min.js"" ] } Then start datasette like this: datasette mydb.db --metadata=metadata.json The CSS and JavaScript files will be linked in the `` of every page. You can also specify a SRI (subresource integrity hash) for these assets: { ""extra_css_urls"": [ { ""url"": ""https://simonwillison.net/static/css/all.bf8cd891642c.css"", ""sri"": ""sha384-9qIZekWUyjCyDIf2YK1FRoKiPJq4PHt6tp/ulnuuyRBvazd0hG7pWbE99zvwSznI"" } ], ""extra_js_urls"": [ { ""url"": ""https://code.jquery.com/jquery-3.2.1.slim.min.js"", ""sri"": ""sha256-k2WSCIexGzOj3Euiig+TlR8gA0EmPjuc79OEeY5L45g="" } ] } Modern browsers will only execute the stylsheet or JavaScript if the SRI hash matches the content served. You can generate hashes using www.srihash.org This isn't shipped in a release yet, but you can still access these features in `datasette publish` like so: datasette publish now mydb.db --metadata=metadata.json --branch=master The `--branch=master` option will pull the latest master build of Datasette from GitHub.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-347735598,https://api.github.com/repos/simonw/datasette/issues/153,347735598,MDEyOklzc3VlQ29tbWVudDM0NzczNTU5OA==,9599,simonw,2017-11-29T02:46:31Z,2017-11-29T02:47:27Z,OWNER,"To style individual columns you'll currently need to use the `nth-of-type` selector, e.g.: td:nth-of-type(5):before { white-space: pre }","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-347735724,https://api.github.com/repos/simonw/datasette/issues/153,347735724,MDEyOklzc3VlQ29tbWVudDM0NzczNTcyNA==,9599,simonw,2017-11-29T02:47:14Z,2017-11-29T02:47:14Z,OWNER,(This only addresses point 2 in your issue description - points 1 and point 3 are still to come),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-347928926,https://api.github.com/repos/simonw/datasette/issues/153,347928926,MDEyOklzc3VlQ29tbWVudDM0NzkyODkyNg==,9599,simonw,2017-11-29T17:09:40Z,2017-11-29T17:09:40Z,OWNER,"OK, that's point 1 covered.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-348103270,https://api.github.com/repos/simonw/datasette/issues/153,348103270,MDEyOklzc3VlQ29tbWVudDM0ODEwMzI3MA==,9599,simonw,2017-11-30T07:16:40Z,2017-11-30T07:16:40Z,OWNER,"Every template now gets CSS classes in the body designed to support custom styling. The index template (the top level page at /) gets this: The database template (/dbname/) gets this: The table template (/dbname/tablename) gets: The row template (/dbname/tablename/rowid) gets: The db-x and table-x classes use the database or table names themselves IF they are valid CSS identifiers. If they aren't, we strip any invalid characters out and append a 6 character md5 digest of the original name, in order to ensure that multiple tables which resolve to the same stripped character version still have different CSS classes. Some examples (extracted from the unit tests): ""simple"" => ""simple"" ""MixedCase"" => ""MixedCase"" ""-no-leading-hyphens"" => ""no-leading-hyphens-65bea6"" ""_no-leading-underscores"" => ""no-leading-underscores-b921bc"" ""no spaces"" => ""no-spaces-7088d7"" ""-"" => ""336d5e"" ""no $ characters"" => ""no--characters-59e024"" ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/12#issuecomment-348245757,https://api.github.com/repos/simonw/datasette/issues/12,348245757,MDEyOklzc3VlQ29tbWVudDM0ODI0NTc1Nw==,9599,simonw,2017-11-30T16:39:45Z,2017-11-30T16:39:45Z,OWNER,"It is now possible to over-ride templates on a per-database / per-row or per- table basis. When you access e.g. `/mydatabase/mytable` Datasette will look for the following: - table-mydatabase-mytable.html - table.html If you provided a `--template-dir` argument to datasette serve it will look in that directory first. The lookup rules are as follows: Index page (/): index.html Database page (/mydatabase): database-mydatabase.html database.html Table page (/mydatabase/mytable): table-mydatabase-mytable.html table.html Row page (/mydatabase/mytable/id): row-mydatabase-mytable.html row.html If a table name has spaces or other unexpected characters in it, the template filename will follow the same rules as our custom `` CSS classes introduced in 8ab3a16 - for example, a table called ""Food Trucks"" will attempt to load the following templates: table-mydatabase-Food-Trucks-399138.html table.html It is possible to extend the default templates using Jinja template inheritance. If you want to customize EVERY row template with some additional content you can do so by creating a `row.html` template like this: {% extends ""default:row.html"" %} {% block content %}

EXTRA HTML AT THE TOP OF THE CONTENT BLOCK

This line renders the original block:

{{ super() }} {% endblock %} ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267523511,Make it so you can override templates, https://github.com/simonw/datasette/issues/153#issuecomment-348245843,https://api.github.com/repos/simonw/datasette/issues/153,348245843,MDEyOklzc3VlQ29tbWVudDM0ODI0NTg0Mw==,9599,simonw,2017-11-30T16:40:02Z,2017-11-30T16:40:02Z,OWNER,"It is now possible to over-ride templates on a per-database / per-row or per- table basis. When you access e.g. `/mydatabase/mytable` Datasette will look for the following: - table-mydatabase-mytable.html - table.html If you provided a `--template-dir` argument to datasette serve it will look in that directory first. The lookup rules are as follows: Index page (/): index.html Database page (/mydatabase): database-mydatabase.html database.html Table page (/mydatabase/mytable): table-mydatabase-mytable.html table.html Row page (/mydatabase/mytable/id): row-mydatabase-mytable.html row.html If a table name has spaces or other unexpected characters in it, the template filename will follow the same rules as our custom `` CSS classes introduced in 8ab3a16 - for example, a table called ""Food Trucks"" will attempt to load the following templates: table-mydatabase-Food-Trucks-399138.html table.html It is possible to extend the default templates using Jinja template inheritance. If you want to customize EVERY row template with some additional content you can do so by creating a `row.html` template like this: {% extends ""default:row.html"" %} {% block content %}

EXTRA HTML AT THE TOP OF THE CONTENT BLOCK

This line renders the original block:

{{ super() }} {% endblock %} ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-348248406,https://api.github.com/repos/simonw/datasette/issues/153,348248406,MDEyOklzc3VlQ29tbWVudDM0ODI0ODQwNg==,9599,simonw,2017-11-30T16:47:45Z,2017-11-30T16:47:45Z,OWNER,Remaining work on this now lives in a milestone: https://github.com/simonw/datasette/milestone/6,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/126#issuecomment-348248957,https://api.github.com/repos/simonw/datasette/issues/126,348248957,MDEyOklzc3VlQ29tbWVudDM0ODI0ODk1Nw==,9599,simonw,2017-11-30T16:49:24Z,2017-11-30T16:49:24Z,OWNER,https://simonwillison.net/2017/Nov/25/new-in-datasette/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275135535,Blog entry announcing foreign key support, https://github.com/simonw/datasette/issues/153#issuecomment-348252037,https://api.github.com/repos/simonw/datasette/issues/153,348252037,MDEyOklzc3VlQ29tbWVudDM0ODI1MjAzNw==,20264,ftrain,2017-11-30T16:59:00Z,2017-11-30T16:59:00Z,NONE,"WOW! -- Paul Ford // (646) 369-7128 // @ftrain On Thu, Nov 30, 2017 at 11:47 AM, Simon Willison wrote: > Remaining work on this now lives in a milestone: > https://github.com/simonw/datasette/milestone/6 > > — > You are receiving this because you were mentioned. > Reply to this email directly, view it on GitHub > , > or mute the thread > > . > ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/156#issuecomment-348255782,https://api.github.com/repos/simonw/datasette/issues/156,348255782,MDEyOklzc3VlQ29tbWVudDM0ODI1NTc4Mg==,9599,simonw,2017-11-30T17:11:34Z,2017-11-30T17:11:34Z,OWNER,http://datasette.readthedocs.io/en/latest/custom_templates.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278189708,Document CSS hooks and custom templates, https://github.com/simonw/datasette/issues/153#issuecomment-348255925,https://api.github.com/repos/simonw/datasette/issues/153,348255925,MDEyOklzc3VlQ29tbWVudDM0ODI1NTkyNQ==,9599,simonw,2017-11-30T17:12:03Z,2017-11-30T17:12:03Z,OWNER,Documentation is now live for this: http://datasette.readthedocs.io/en/latest/custom_templates.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/160#issuecomment-348404864,https://api.github.com/repos/simonw/datasette/issues/160,348404864,MDEyOklzc3VlQ29tbWVudDM0ODQwNDg2NA==,9599,simonw,2017-12-01T05:26:57Z,2017-12-01T05:26:57Z,OWNER,"Question is... what should happen to the default static stuff? At the moment that's just https://fivethirtyeight.datasettes.com/-/static/app.css - though I want to improve that to include a content hash, see #154 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/154#issuecomment-348404988,https://api.github.com/repos/simonw/datasette/issues/154,348404988,MDEyOklzc3VlQ29tbWVudDM0ODQwNDk4OA==,9599,simonw,2017-12-01T05:27:40Z,2017-12-01T05:27:40Z,OWNER,If I do add additional static file bundling should that automatically get content hashes as well? #160 - problem with that is then I might have to parse the CSS files and rewrite their internal background-url references etc.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276873891,Datasette CSS should include content hash in the URL, https://github.com/simonw/datasette/issues/20#issuecomment-348420129,https://api.github.com/repos/simonw/datasette/issues/20,348420129,MDEyOklzc3VlQ29tbWVudDM0ODQyMDEyOQ==,9599,simonw,2017-12-01T07:16:25Z,2017-12-01T07:16:25Z,OWNER,"I've found some examples of canned queries I want to support that can't be represented as views, so I'm going to reopen this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/20#issuecomment-348420955,https://api.github.com/repos/simonw/datasette/issues/20,348420955,MDEyOklzc3VlQ29tbWVudDM0ODQyMDk1NQ==,9599,simonw,2017-12-01T07:21:08Z,2017-12-01T07:21:08Z,OWNER,"I'll use the existing metadata.json file: { ""databases"": { ""mydb"": { ""queries"": { ""custom_thingy"": {... The query definition can either be just a string of SQL, or it can be an object with a sql key and optional title and description keys. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/160#issuecomment-348719680,https://api.github.com/repos/simonw/datasette/issues/160,348719680,MDEyOklzc3VlQ29tbWVudDM0ODcxOTY4MA==,9599,simonw,2017-12-02T20:59:27Z,2017-12-02T20:59:27Z,OWNER,"This is about more than just CSS and JavaScript - there are plenty of reasons someone might want to bundle HTML as well, e.g. for building something like https://sf-tree-search.now.sh/ So, instead of thinking about this in terms of /static/, I'm going to think about this in terms of allowing people to mount one or more document roots (or docroots). datasette serve mydb.db -d my-doc-root/ This will cause the root of the server to show content from the `my-doc-root/` directory (assuming it has an index.html file in it). A more common option will be to mount specific folders to specific directories, like this: datasette serve mydb.db -d static:my-static/ Now any hits to `/static/foo.css` will serve content from `my-static/foo.css`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/160#issuecomment-348719752,https://api.github.com/repos/simonw/datasette/issues/160,348719752,MDEyOklzc3VlQ29tbWVudDM0ODcxOTc1Mg==,9599,simonw,2017-12-02T21:00:21Z,2017-12-02T21:00:21Z,OWNER,Not sure which I like better out of `-d/--docroot` or `-s/--static` or `-m/--mount` for this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/160#issuecomment-348719827,https://api.github.com/repos/simonw/datasette/issues/160,348719827,MDEyOklzc3VlQ29tbWVudDM0ODcxOTgyNw==,9599,simonw,2017-12-02T21:01:36Z,2017-12-02T21:01:36Z,OWNER,`-m` is already taken for `--metadata`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/160#issuecomment-348793054,https://api.github.com/repos/simonw/datasette/issues/160,348793054,MDEyOklzc3VlQ29tbWVudDM0ODc5MzA1NA==,9599,simonw,2017-12-03T16:35:22Z,2017-12-03T16:35:22Z,OWNER,"You can now tell Datasette to serve static files from a specific location at a specific mountpoint. For example: datasette serve mydb.db --static extra-css:/tmp/static/css Now if you visit this URL: http://localhost:8001/extra-css/blah.css The following file will be served: /tmp/static/css/blah.css ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/160#issuecomment-348793156,https://api.github.com/repos/simonw/datasette/issues/160,348793156,MDEyOklzc3VlQ29tbWVudDM0ODc5MzE1Ng==,9599,simonw,2017-12-03T16:35:53Z,2017-12-03T16:35:53Z,OWNER,Still TODO: teach `datasette publish` and friends about this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/161#issuecomment-348860191,https://api.github.com/repos/simonw/datasette/issues/161,348860191,MDEyOklzc3VlQ29tbWVudDM0ODg2MDE5MQ==,9599,simonw,2017-12-04T04:52:14Z,2017-12-04T04:52:14Z,OWNER,Seems like a reasonable thing for us to support.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278814220,Support WITH query , https://github.com/simonw/datasette/issues/20#issuecomment-348860623,https://api.github.com/repos/simonw/datasette/issues/20,348860623,MDEyOklzc3VlQ29tbWVudDM0ODg2MDYyMw==,9599,simonw,2017-12-04T04:56:21Z,2017-12-04T04:56:21Z,OWNER,"While I'm doing this, I could add per-database and per-table metadata too ala #68","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/20#issuecomment-349027974,https://api.github.com/repos/simonw/datasette/issues/20,349027974,MDEyOklzc3VlQ29tbWVudDM0OTAyNzk3NA==,9599,simonw,2017-12-04T17:01:19Z,2017-12-04T17:01:19Z,OWNER, This is also a good opportunity to re-factor out a separate query.html template - right now the database.html template is doing two jobs.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/135#issuecomment-349047335,https://api.github.com/repos/simonw/datasette/issues/135,349047335,MDEyOklzc3VlQ29tbWVudDM0OTA0NzMzNQ==,9599,simonw,2017-12-04T17:57:08Z,2017-12-04T17:57:08Z,OWNER,Turns out there's a bug in this: https://timezones-now-hrjgkinozh.now.sh/timezones-0d61a90/ElementaryGeometries should not be showing the search box.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275179724,?_search=x should work if used directly against a FTS virtual table, https://github.com/simonw/datasette/issues/20#issuecomment-349359498,https://api.github.com/repos/simonw/datasette/issues/20,349359498,MDEyOklzc3VlQ29tbWVudDM0OTM1OTQ5OA==,9599,simonw,2017-12-05T16:30:06Z,2017-12-05T16:30:06Z,OWNER,"Named canned queries can now be defined in metadata.json like this: { ""databases"": { ""timezones"": { ""queries"": { ""timezone_for_point"": ""select tzid from timezones ..."" } } } } These will be shown in a new ""Queries"" section beneath ""Views"" on the database page. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/20#issuecomment-349383276,https://api.github.com/repos/simonw/datasette/issues/20,349383276,MDEyOklzc3VlQ29tbWVudDM0OTM4MzI3Ng==,9599,simonw,2017-12-05T17:45:20Z,2017-12-05T17:45:20Z,OWNER,http://datasette.readthedocs.io/en/latest/sql_queries.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/20#issuecomment-349406761,https://api.github.com/repos/simonw/datasette/issues/20,349406761,MDEyOklzc3VlQ29tbWVudDM0OTQwNjc2MQ==,9599,simonw,2017-12-05T19:03:06Z,2017-12-05T19:03:06Z,OWNER,Demo: https://timezones-api.now.sh/timezones-3cb9f64/by_point,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267759136,Config file with support for defining canned queries, https://github.com/simonw/datasette/issues/122#issuecomment-349408214,https://api.github.com/repos/simonw/datasette/issues/122,349408214,MDEyOklzc3VlQ29tbWVudDM0OTQwODIxNA==,9599,simonw,2017-12-05T19:08:04Z,2017-12-05T19:08:04Z,OWNER,I think `.json` should continue to return rows as list-of-lists - it's a nice default because it produces a smaller overall JSON file. Encouraging people to specify an alternative shape to get the current `.jsono` format feels appropriate.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275092453,"Redesign JSON output, ditch jsono, offer variants controlled by parameter instead", https://github.com/simonw/datasette/issues/135#issuecomment-349860851,https://api.github.com/repos/simonw/datasette/issues/135,349860851,MDEyOklzc3VlQ29tbWVudDM0OTg2MDg1MQ==,9599,simonw,2017-12-07T04:37:59Z,2017-12-07T04:37:59Z,OWNER,"I'm testing this like so: datasette ~/Dropbox/Development/timezones-api/timezones.db --reload --load-extension /usr/local/lib/mod_spatialite.dylib ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275179724,?_search=x should work if used directly against a FTS virtual table, https://github.com/simonw/datasette/issues/135#issuecomment-349861461,https://api.github.com/repos/simonw/datasette/issues/135,349861461,MDEyOklzc3VlQ29tbWVudDM0OTg2MTQ2MQ==,9599,simonw,2017-12-07T04:43:12Z,2017-12-07T04:43:12Z,OWNER,"This query looks like it does the right thing: select * from sqlite_master where rootpage = 0 and ( sql like '%VIRTUAL TABLE%USING FTS%content=""ElementaryGeometries""%' or ( tbl_name = ""ElementaryGeometries"" and sql like '%VIRTUAL TABLE%USING FTS%' ) ) Against a table that should not be shown as FTS: https://timezones-now-hrjgkinozh.now.sh/timezones-0d61a90?sql=++++++++select+*+from+sqlite_master%0D%0A++++++++++++where+rootpage+%3D+0%0D%0A++++++++++++and+%28%0D%0A++++++++++++++++sql+like+%27%25VIRTUAL+TABLE%25USING+FTS%25content%3D%22ElementaryGeometries%22%25%27%0D%0A++++++++++++++++or+%28%0D%0A++++++++++++++++++tbl_name+%3D+%22ElementaryGeometries%22%0D%0A++++++++++++++++++and+sql+like+%27%25VIRTUAL+TABLE%25USING+FTS%25%27%0D%0A++++++++++++++++%29%0D%0A++++++++++++%29+ Against a table that SHOULD match: https://sf-trees.now.sh/sf-trees-ebc2ad9?sql=++++++++select+*+from+sqlite_master%0D%0A++++++++++++where+rootpage+%3D+0%0D%0A++++++++++++and+%28%0D%0A++++++++++++++++sql+like+%27%25VIRTUAL+TABLE%25USING+FTS%25content%3D%22Street_Tree_List_fts%22%25%27%0D%0A++++++++++++++++or+%28%0D%0A++++++++++++++++++tbl_name+%3D+%22Street_Tree_List_fts%22%0D%0A++++++++++++++++++and+sql+like+%27%25VIRTUAL+TABLE%25USING+FTS%25%27%0D%0A++++++++++++++++%29%0D%0A++++++++++++%29+","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275179724,?_search=x should work if used directly against a FTS virtual table, https://github.com/simonw/datasette/issues/158#issuecomment-349868849,https://api.github.com/repos/simonw/datasette/issues/158,349868849,MDEyOklzc3VlQ29tbWVudDM0OTg2ODg0OQ==,9599,simonw,2017-12-07T05:41:08Z,2017-12-07T05:41:08Z,OWNER,"I'm happy with this - we have extra_head, content, body_class and title blocks which should provide enough hooks for most reasonable customizations.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278190981,Ensure default templates are designed to be extended, https://github.com/simonw/datasette/issues/153#issuecomment-349874052,https://api.github.com/repos/simonw/datasette/issues/153,349874052,MDEyOklzc3VlQ29tbWVudDM0OTg3NDA1Mg==,9599,simonw,2017-12-07T06:17:33Z,2017-12-07T06:17:33Z,OWNER,"In #159 I added a mechanism for easily customizing per-column displays, and I've added documentation showing an example of using this mechanism to set certain columns to display as unescaped HTML: http://datasette.readthedocs.io/en/latest/custom_templates.html#custom-templates This fixes item 3, so I'm closing this ticket!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/164#issuecomment-349874709,https://api.github.com/repos/simonw/datasette/issues/164,349874709,MDEyOklzc3VlQ29tbWVudDM0OTg3NDcwOQ==,9599,simonw,2017-12-07T06:22:10Z,2017-12-07T06:22:10Z,OWNER,"Example usage: datasette skeleton parlgov.db -m parlgov.json Generates a `parlgov.json` file containing this: { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null, ""databases"": { ""parlgov"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null, ""queries"": {}, ""tables"": { ""info_data_source"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_castles_mair"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_chess"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_huber_inglehart"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""info_table"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_euprofiler"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""party_family"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""info_id"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""sqlite_stat1"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_benoit_laver"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_country_iso"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""viewcalc_party_position"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""viewcalc_election_parameter"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""viewcalc_parliament_composition"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""viewcalc_country_year_share"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""election"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""politician_president"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""party_name_change"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_commissioner_doering"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_ray"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""party_change"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""cabinet_party"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_ees"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""party"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""external_party_cmp"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""country"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""cabinet"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""info_variable"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null }, ""election_result"": { ""title"": null, ""description"": null, ""description_html"": null, ""license"": null, ""license_url"": null, ""source"": null, ""source_url"": null } } } } } ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280013907,datasette skeleton command for kick-starting database and table metadata, https://github.com/simonw/datasette/issues/164#issuecomment-349874844,https://api.github.com/repos/simonw/datasette/issues/164,349874844,MDEyOklzc3VlQ29tbWVudDM0OTg3NDg0NA==,9599,simonw,2017-12-07T06:22:58Z,2017-12-07T06:22:58Z,OWNER,This metadata doesn't yet do anything - need to implement #165,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280013907,datasette skeleton command for kick-starting database and table metadata, https://github.com/simonw/datasette/issues/165#issuecomment-350026183,https://api.github.com/repos/simonw/datasette/issues/165,350026183,MDEyOklzc3VlQ29tbWVudDM1MDAyNjE4Mw==,9599,simonw,2017-12-07T16:47:46Z,2017-12-07T16:47:46Z,OWNER,"Here's an example metadata.json file illustrating custom per-database and per- table metadata: { ""title"": ""Overall datasette title"", ""description_html"": ""This is a description with HTML."", ""databases"": { ""db1"": { ""title"": ""First database"", ""description"": ""This is a string description & has no HTML"", ""license_url"": ""http://example.com/"", ""license"": ""The example license"", ""queries"": { ""canned_query"": ""select * from table1 limit 3;"" }, ""tables"": { ""table1"": { ""title"": ""Custom title for table1"", ""description"": ""Tables can have descriptions too"", ""source"": ""This has a custom source"", ""source_url"": ""http://example.com/"" } } } } }","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280014287,metadata.json support for per-database and per-table information, https://github.com/simonw/datasette/issues/165#issuecomment-350026452,https://api.github.com/repos/simonw/datasette/issues/165,350026452,MDEyOklzc3VlQ29tbWVudDM1MDAyNjQ1Mg==,9599,simonw,2017-12-07T16:48:34Z,2017-12-07T16:48:34Z,OWNER,"Needs documentation, see #166 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280014287,metadata.json support for per-database and per-table information, https://github.com/simonw/datasette/issues/166#issuecomment-350035741,https://api.github.com/repos/simonw/datasette/issues/166,350035741,MDEyOklzc3VlQ29tbWVudDM1MDAzNTc0MQ==,9599,simonw,2017-12-07T17:20:35Z,2017-12-07T17:20:35Z,OWNER,"http://datasette.readthedocs.io/en/latest/metadata.html ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280023225,Documentation for metadata.json and datasette skeleton, https://github.com/simonw/datasette/issues/161#issuecomment-350108113,https://api.github.com/repos/simonw/datasette/issues/161,350108113,MDEyOklzc3VlQ29tbWVudDM1MDEwODExMw==,388154,wsxiaoys,2017-12-07T22:02:24Z,2017-12-07T22:02:24Z,NONE,"It's not throwing the validation error anymore, but i still cannot run following with query: ``` WITH RECURSIVE cnt(x) AS (SELECT 1 UNION ALL SELECT x+1 FROM cnt LIMIT 10) SELECT x FROM cnt; ``` I got `near ""WITH"": syntax error`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278814220,Support WITH query , https://github.com/simonw/datasette/issues/167#issuecomment-350125953,https://api.github.com/repos/simonw/datasette/issues/167,350125953,MDEyOklzc3VlQ29tbWVudDM1MDEyNTk1Mw==,9599,simonw,2017-12-07T23:25:28Z,2017-12-07T23:25:28Z,OWNER,"My column/row HTML display logic has got way too convoluted. This is a sign I need to add proper unit tests for it and clean it up. The complexity comes from: * Displaying a rowid for tables that do not have a primary key * Showing an additional Link column for rows with a primary key * Not displaying that Link column on the individual row pages * Trying to get foreign keys working correctly in all cases, e.g. #152 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, https://github.com/simonw/datasette/issues/161#issuecomment-350158037,https://api.github.com/repos/simonw/datasette/issues/161,350158037,MDEyOklzc3VlQ29tbWVudDM1MDE1ODAzNw==,9599,simonw,2017-12-08T02:52:34Z,2017-12-08T02:52:34Z,OWNER,That might mean your version of SQLite doesn't support that syntax. Unfortunately the version bundled with Python is a bit old - the one built by the Dockerfile in this repo should handle it though.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278814220,Support WITH query , https://github.com/simonw/datasette/issues/161#issuecomment-350182904,https://api.github.com/repos/simonw/datasette/issues/161,350182904,MDEyOklzc3VlQ29tbWVudDM1MDE4MjkwNA==,388154,wsxiaoys,2017-12-08T06:18:12Z,2017-12-08T06:18:12Z,NONE,"You're right..got this resolved after upgrading the sqlite version. Thanks you!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278814220,Support WITH query , https://github.com/simonw/datasette/issues/141#issuecomment-350292364,https://api.github.com/repos/simonw/datasette/issues/141,350292364,MDEyOklzc3VlQ29tbWVudDM1MDI5MjM2NA==,9599,simonw,2017-12-08T15:33:18Z,2017-12-08T15:33:18Z,OWNER,"I can emulate this on OS X using a disk image (Disk Utility -> File -> New Image -> Blank Image...) - once mounted, I get the following: >>> os.link('/tmp/hello', '/Volumes/Untitled/hello') Traceback (most recent call last): File """", line 1, in OSError: [Errno 18] Cross-device link: '/tmp/hello' -> '/Volumes/Untitled/hello' I can simulate that in a mock like this: >>> from unittest.mock import patch >>> @patch('os.link') ... def test_link(mock_link): ... mock_link.side_effect = OSError ... mock_link() ... ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275814941,datasette publish can fail if /tmp is on a different device, https://github.com/simonw/datasette/issues/141#issuecomment-350301248,https://api.github.com/repos/simonw/datasette/issues/141,350301248,MDEyOklzc3VlQ29tbWVudDM1MDMwMTI0OA==,9599,simonw,2017-12-08T16:07:04Z,2017-12-08T16:07:04Z,OWNER,"This fix should work, please have a go with latest master and let me know if you run into any problems.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275814941,datasette publish can fail if /tmp is on a different device, https://github.com/simonw/datasette/issues/154#issuecomment-350302417,https://api.github.com/repos/simonw/datasette/issues/154,350302417,MDEyOklzc3VlQ29tbWVudDM1MDMwMjQxNw==,9599,simonw,2017-12-08T16:11:24Z,2017-12-08T16:11:24Z,OWNER,I think I'll do this as a custom Jinja template filter. That way template authors can re-use it for their own static files if they want.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276873891,Datasette CSS should include content hash in the URL, https://github.com/simonw/datasette/issues/154#issuecomment-350323722,https://api.github.com/repos/simonw/datasette/issues/154,350323722,MDEyOklzc3VlQ29tbWVudDM1MDMyMzcyMg==,9599,simonw,2017-12-08T17:35:25Z,2017-12-08T17:35:25Z,OWNER,If I do this as a querystring parameter I won't need to worry about URL routing.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276873891,Datasette CSS should include content hash in the URL, https://github.com/simonw/datasette/pull/168#issuecomment-350413422,https://api.github.com/repos/simonw/datasette/issues/168,350413422,MDEyOklzc3VlQ29tbWVudDM1MDQxMzQyMg==,9599,simonw,2017-12-09T01:33:40Z,2017-12-09T01:33:40Z,OWNER,https://github.com/channelcat/sanic/releases/tag/0.7.0,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280662866,Upgrade to Sanic 0.7.0, https://github.com/simonw/datasette/issues/167#issuecomment-350421661,https://api.github.com/repos/simonw/datasette/issues/167,350421661,MDEyOklzc3VlQ29tbWVudDM1MDQyMTY2MQ==,9599,simonw,2017-12-09T03:52:46Z,2017-12-09T03:52:46Z,OWNER,"Input: results from the database, foreign key definitions, primary key definitions, type of page Output: display_columns and display_rows","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, https://github.com/simonw/datasette/issues/167#issuecomment-350424595,https://api.github.com/repos/simonw/datasette/issues/167,350424595,MDEyOklzc3VlQ29tbWVudDM1MDQyNDU5NQ==,9599,simonw,2017-12-09T05:08:27Z,2017-12-09T05:08:27Z,OWNER,Perhaps the row.html and table.html templates should be passed the same data but should themselves decide if they will display the Link column ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, https://github.com/simonw/datasette/issues/160#issuecomment-350496258,https://api.github.com/repos/simonw/datasette/issues/160,350496258,MDEyOklzc3VlQ29tbWVudDM1MDQ5NjI1OA==,9599,simonw,2017-12-09T18:29:28Z,2017-12-09T18:29:28Z,OWNER,"Example usage: datasette package --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --tag sf-trees --branch master This creates a local Docker image that includes copies of the templates/, extra-css/ and extra-js/ directories. You can then run it like this: docker run -p 8001:8001 sf-trees For publishing to Zeit now: datasette publish now --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --name sf-trees --branch master Example: https://sf-trees-wbihszoazc.now.sh/sf-trees-02c8ef1/Street_Tree_List For publishing to Heroku: datasette publish heroku --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --branch master ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/issues/157#issuecomment-350496277,https://api.github.com/repos/simonw/datasette/issues/157,350496277,MDEyOklzc3VlQ29tbWVudDM1MDQ5NjI3Nw==,9599,simonw,2017-12-09T18:29:41Z,2017-12-09T18:29:41Z,OWNER,"Example usage: datasette package --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --tag sf-trees --branch master This creates a local Docker image that includes copies of the templates/, extra-css/ and extra-js/ directories. You can then run it like this: docker run -p 8001:8001 sf-trees For publishing to Zeit now: datasette publish now --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --name sf-trees --branch master Example: https://sf-trees-wbihszoazc.now.sh/sf-trees-02c8ef1/Street_Tree_List For publishing to Heroku: datasette publish heroku --static css:extra-css/ --static js:extra-js/ \ sf-trees.db --template-dir templates/ --branch master ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278190321,"Teach ""datasette publish"" about custom template directories", https://github.com/simonw/datasette/issues/170#issuecomment-350506593,https://api.github.com/repos/simonw/datasette/issues/170,350506593,MDEyOklzc3VlQ29tbWVudDM1MDUwNjU5Mw==,9599,simonw,2017-12-09T21:25:50Z,2017-12-09T21:25:50Z,OWNER,Turns out this is already supported: https://github.com/simonw/datasette/blob/6bdfcf60760c27e29ff34692d06e62b36aeecc56/datasette/app.py#L307,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280745470,Custom template for named canned query, https://github.com/simonw/datasette/issues/170#issuecomment-350506751,https://api.github.com/repos/simonw/datasette/issues/170,350506751,MDEyOklzc3VlQ29tbWVudDM1MDUwNjc1MQ==,9599,simonw,2017-12-09T21:28:32Z,2017-12-09T21:28:32Z,OWNER,"My mistake, that's using the database name - there isn't a way of customizing for a specific named query yet.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280745470,Custom template for named canned query, https://github.com/simonw/datasette/issues/170#issuecomment-350507155,https://api.github.com/repos/simonw/datasette/issues/170,350507155,MDEyOklzc3VlQ29tbWVudDM1MDUwNzE1NQ==,9599,simonw,2017-12-09T21:35:30Z,2017-12-09T21:35:30Z,OWNER," Canned query page (/mydatabase/canned-query): query-mydatabase-canned-query.html query-mydatabase.html query.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280745470,Custom template for named canned query, https://github.com/simonw/datasette/issues/171#issuecomment-350508049,https://api.github.com/repos/simonw/datasette/issues/171,350508049,MDEyOklzc3VlQ29tbWVudDM1MDUwODA0OQ==,9599,simonw,2017-12-09T21:50:50Z,2017-12-09T21:50:50Z,OWNER,"Quoting the new documentation: You can find out which templates were considered for a specific page by viewing source on that page and looking for an HTML comment at the bottom. The comment will look something like this: This example is from the canned query page for a query called ""tz"" in the database called ""mydb"". The asterisk shows which template was selected - so in this case, Datasette found a template file called `query-mydb-tz.html` and used that - but if that template had not been found, it would have tried for `query-mydb.html` or the default `query.html`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280745746,HTML comments specifying custom templates for page, https://github.com/simonw/datasette/issues/167#issuecomment-350515616,https://api.github.com/repos/simonw/datasette/issues/167,350515616,MDEyOklzc3VlQ29tbWVudDM1MDUxNTYxNg==,9599,simonw,2017-12-10T00:21:58Z,2017-12-10T00:21:58Z,OWNER,This function signature is pretty gross: https://github.com/simonw/datasette/blob/7a7e4b2ed8c76c6d002a9d707dbc840f6a2abf7f/datasette/app.py#L418,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, https://github.com/simonw/datasette/issues/167#issuecomment-350515985,https://api.github.com/repos/simonw/datasette/issues/167,350515985,MDEyOklzc3VlQ29tbWVudDM1MDUxNTk4NQ==,9599,simonw,2017-12-10T00:28:39Z,2017-12-10T00:28:39Z,OWNER,"A better alternative: ```async def display_columns_and_rows(self, database, table, rows, link_column=False):```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, https://github.com/simonw/datasette/issues/167#issuecomment-350516782,https://api.github.com/repos/simonw/datasette/issues/167,350516782,MDEyOklzc3VlQ29tbWVudDM1MDUxNjc4Mg==,9599,simonw,2017-12-10T00:48:54Z,2017-12-10T00:48:54Z,OWNER,I can simplify this all by dropping the nicety where if a table is using a rowid the Link column is titled rowid instead.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280315352,Nasty bug: last column not being correctly displayed, https://github.com/simonw/datasette/issues/169#issuecomment-350519711,https://api.github.com/repos/simonw/datasette/issues/169,350519711,MDEyOklzc3VlQ29tbWVudDM1MDUxOTcxMQ==,9599,simonw,2017-12-10T02:04:56Z,2017-12-10T02:04:56Z,OWNER,Done! https://github.com/simonw/datasette/releases/tag/0.14,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280744309,Release v0.14 with templates and static files features, https://github.com/simonw/datasette/issues/153#issuecomment-350519736,https://api.github.com/repos/simonw/datasette/issues/153,350519736,MDEyOklzc3VlQ29tbWVudDM1MDUxOTczNg==,9599,simonw,2017-12-10T02:06:01Z,2017-12-10T02:06:01Z,OWNER,@ftrain Datasette 0.14 is now released with all of the above: https://github.com/simonw/datasette/releases/tag/0.14,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/153#issuecomment-350519821,https://api.github.com/repos/simonw/datasette/issues/153,350519821,MDEyOklzc3VlQ29tbWVudDM1MDUxOTgyMQ==,9599,simonw,2017-12-10T02:08:45Z,2017-12-10T02:08:45Z,OWNER,"Also worth mentioning: as of #160 and #157 the `datasette publish now`, `datasette publish heroku` and `datasette package` commands all know how to bundle up any `--static` or `--template-dir` content and include it in the Docker image / Heroku/Now deployment that gets generated.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276842536,Ability to customize presentation of specific columns in HTML view, https://github.com/simonw/datasette/issues/42#issuecomment-350521619,https://api.github.com/repos/simonw/datasette/issues/42,350521619,MDEyOklzc3VlQ29tbWVudDM1MDUyMTYxOQ==,9599,simonw,2017-12-10T03:02:14Z,2017-12-10T03:02:14Z,OWNER,I think the `datasette skeleton` command from #164 makes this obsolete.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268591332,Homepage UI for editing metadata file, https://github.com/simonw/datasette/issues/52#issuecomment-350521635,https://api.github.com/repos/simonw/datasette/issues/52,350521635,MDEyOklzc3VlQ29tbWVudDM1MDUyMTYzNQ==,9599,simonw,2017-12-10T03:02:56Z,2017-12-10T03:02:56Z,OWNER,I don't think this is necessary.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273026602,Solution for temporarily uploading DB so it can be built by docker, https://github.com/simonw/datasette/issues/90#issuecomment-350521711,https://api.github.com/repos/simonw/datasette/issues/90,350521711,MDEyOklzc3VlQ29tbWVudDM1MDUyMTcxMQ==,9599,simonw,2017-12-10T03:05:48Z,2017-12-10T03:05:48Z,OWNER,I fixed that last issue in c195ee4d46f2577b1943836a8270d84c8341d138,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/90#issuecomment-350521736,https://api.github.com/repos/simonw/datasette/issues/90,350521736,MDEyOklzc3VlQ29tbWVudDM1MDUyMTczNg==,9599,simonw,2017-12-10T03:06:34Z,2017-12-10T03:06:34Z,OWNER,Heroku is now in the README as of 6bdfcf60760c27e29ff34692d06e62b36aeecc56,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273846123,datasette publish heroku, https://github.com/simonw/datasette/issues/91#issuecomment-350521780,https://api.github.com/repos/simonw/datasette/issues/91,350521780,MDEyOklzc3VlQ29tbWVudDM1MDUyMTc4MA==,9599,simonw,2017-12-10T03:07:53Z,2017-12-10T03:07:53Z,OWNER,Won't fix - I think the custom templates and static stuff in https://github.com/simonw/datasette/releases/tag/0.14 renders this obsolete.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273878873,"Option to serve databases from a different prefix, serve regular content elsewhere", https://github.com/simonw/datasette/issues/138#issuecomment-350521806,https://api.github.com/repos/simonw/datasette/issues/138,350521806,MDEyOklzc3VlQ29tbWVudDM1MDUyMTgwNg==,9599,simonw,2017-12-10T03:08:26Z,2017-12-10T03:08:36Z,OWNER,Implemented this in 80bf3afa43e3cb396c7a7c9b168eedbc6fe0fa15 and #165. Didn't use data package though.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275476839,"Per-database and per-table metadata, probably using data-package", https://github.com/simonw/datasette/issues/123#issuecomment-350521853,https://api.github.com/repos/simonw/datasette/issues/123,350521853,MDEyOklzc3VlQ29tbWVudDM1MDUyMTg1Mw==,9599,simonw,2017-12-10T03:09:53Z,2017-12-10T03:09:53Z,OWNER,I'm going to keep this separate in csvs-to-sqlite.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats, https://github.com/simonw/datasette/issues/121#issuecomment-350527283,https://api.github.com/repos/simonw/datasette/issues/121,350527283,MDEyOklzc3VlQ29tbWVudDM1MDUyNzI4Mw==,9599,simonw,2017-12-10T06:00:47Z,2017-12-10T06:00:47Z,OWNER,This is also really interesting when combined with the spatialite AsGeoJSON function: http://www.gaia-gis.it/gaia-sins/spatialite-sql-4.2.0.html#p3misc,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275089535,?_json=foo&_json=bar query string argument , https://github.com/simonw/datasette/issues/175#issuecomment-353424169,https://api.github.com/repos/simonw/datasette/issues/175,353424169,MDEyOklzc3VlQ29tbWVudDM1MzQyNDE2OQ==,9599,simonw,2017-12-21T18:33:55Z,2017-12-21T18:33:55Z,OWNER,Done - thanks for curating these: https://github.com/topics/automatic-api,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",282971961,"Add project topic ""automatic-api""", https://github.com/simonw/datasette/issues/120#issuecomment-355487646,https://api.github.com/repos/simonw/datasette/issues/120,355487646,MDEyOklzc3VlQ29tbWVudDM1NTQ4NzY0Ng==,723567,nickdirienzo,2018-01-05T07:10:12Z,2018-01-05T07:10:12Z,NONE,"Ah, glad I found this issue. I have private data that I'd like to share to a few different people. Personally, a shared username and password would be sufficient for me, more-or-less Basic Auth. Do you have more complex requirements in mind? I'm not sure if ""plugin"" means ""build a plugin"" or ""find a plugin"" or something else entirely. FWIW, I stumbled upon [sanic-auth](https://github.com/pyx/sanic-auth) which looks like a new project to bring some interfaces around auth to sanic, similar to Flask. Alternatively, it shouldn't be too bad to add in Basic Auth. If we went down that route, that would probably be best built as a separate package for sanic that `datasette` brings in. What are your thoughts around this?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275087397,Plugin that adds an authentication layer of some sort, https://github.com/simonw/datasette/issues/176#issuecomment-356115657,https://api.github.com/repos/simonw/datasette/issues/176,356115657,MDEyOklzc3VlQ29tbWVudDM1NjExNTY1Nw==,4313116,wulfmann,2018-01-08T22:22:32Z,2018-01-08T22:22:32Z,NONE,"This project probably would not be the place for that. This is a layer for sqllite specifically. It solves a similar problem as graphql, so adding that here wouldn't make sense. Here's an example i found from google that uses micro to run a graphql microservice. you'd just then need to connect your db. https://github.com/timneutkens/micro-graphql","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/176#issuecomment-356161672,https://api.github.com/repos/simonw/datasette/issues/176,356161672,MDEyOklzc3VlQ29tbWVudDM1NjE2MTY3Mg==,173848,yozlet,2018-01-09T02:35:35Z,2018-01-09T02:35:35Z,NONE,"@wulfmann I think I disagree, except I'm not entirely sure what you mean by that first paragraph. The JSON API that Datasette currently exposes is quite different to GraphQL. Furthermore, there's no ""just"" about connecting micro-graphql to a DB; at least, no more ""just"" than adding any other API. You still need to configure the schema, which is exactly the kind of thing that Datasette does for JSON API. This is why I think that GraphQL's a good fit here.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/176#issuecomment-356175667,https://api.github.com/repos/simonw/datasette/issues/176,356175667,MDEyOklzc3VlQ29tbWVudDM1NjE3NTY2Nw==,4313116,wulfmann,2018-01-09T04:19:03Z,2018-01-09T04:19:03Z,NONE,"@yozlet Yes I think that I was confused when I posted my original comment. I see your main point now and am in agreement. ","{""total_count"": 2, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 2, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/pull/178#issuecomment-357542404,https://api.github.com/repos/simonw/datasette/issues/178,357542404,MDEyOklzc3VlQ29tbWVudDM1NzU0MjQwNA==,9599,simonw,2018-01-14T21:06:07Z,2018-01-14T21:06:07Z,OWNER,"Thanks for catching this, merged!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",287240246,"If metadata exists, add it to heroku launch command", https://github.com/simonw/datasette/issues/176#issuecomment-359697938,https://api.github.com/repos/simonw/datasette/issues/176,359697938,MDEyOklzc3VlQ29tbWVudDM1OTY5NzkzOA==,7193,gijs,2018-01-23T07:17:56Z,2018-01-23T07:17:56Z,NONE,👍 I'd like this too! ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/179#issuecomment-360535979,https://api.github.com/repos/simonw/datasette/issues/179,360535979,MDEyOklzc3VlQ29tbWVudDM2MDUzNTk3OQ==,82988,psychemedia,2018-01-25T17:18:24Z,2018-01-25T17:18:24Z,CONTRIBUTOR,"To summarise that thread: - expose full `metadata.json` object to the index page template, eg to allow tables to be referred to by name; - ability to import multiple `metadata.json` files, eg to allow metadata files created for a specific SQLite db to be reused in a datasette referring to several database files; It could also be useful to allow users to import a python file containing custom functions that can that be loaded into scope and made available to custom templates. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",288438570,More metadata options for template authors , https://github.com/simonw/datasette/issues/176#issuecomment-368625350,https://api.github.com/repos/simonw/datasette/issues/176,368625350,MDEyOklzc3VlQ29tbWVudDM2ODYyNTM1MA==,7431774,wuhland,2018-02-26T19:44:11Z,2018-02-26T19:44:11Z,NONE,great idea!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/185#issuecomment-370273359,https://api.github.com/repos/simonw/datasette/issues/185,370273359,MDEyOklzc3VlQ29tbWVudDM3MDI3MzM1OQ==,9599,simonw,2018-03-04T23:10:56Z,2018-03-04T23:10:56Z,OWNER,"Are you talking specifically about accessing metadata from HTML templates? That makes a lot of sense, I'll think about how this could work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-370461231,https://api.github.com/repos/simonw/datasette/issues/185,370461231,MDEyOklzc3VlQ29tbWVudDM3MDQ2MTIzMQ==,222245,carlmjohnson,2018-03-05T15:43:56Z,2018-03-05T15:44:27Z,NONE,"Yes. I think the simplest implementation is to change lines like ```python metadata = self.ds.metadata.get('databases', {}).get(name, {}) ``` to ```python metadata = { **self.ds.metadata, **self.ds.metadata.get('databases', {}).get(name, {}), } ``` so that specified inner values overwrite outer values, but only if they exist.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/186#issuecomment-374810115,https://api.github.com/repos/simonw/datasette/issues/186,374810115,MDEyOklzc3VlQ29tbWVudDM3NDgxMDExNQ==,9599,simonw,2018-03-21T01:21:13Z,2018-03-21T01:21:13Z,OWNER,"Hah, this is exactly the opposite of datasette's default approach to caching, which is to cache everything for as long as possible. I don't think we'll need to add `Cache-Control: no-cache` headers provided we instead set it up so you can turn off Datasette's caching.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",306811513,proposal new option to disable user agents cache, https://github.com/simonw/datasette/issues/186#issuecomment-374811114,https://api.github.com/repos/simonw/datasette/issues/186,374811114,MDEyOklzc3VlQ29tbWVudDM3NDgxMTExNA==,9599,simonw,2018-03-21T01:28:30Z,2018-03-21T01:28:30Z,OWNER,"We actually have this already: https://github.com/simonw/datasette/blob/012fc7c5cd3e9160c9a4c19cc964253e97fb054a/datasette/cli.py#L253-L255 You can disable the cache headers using the `datasette --debug` option.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",306811513,proposal new option to disable user agents cache, https://github.com/simonw/datasette/issues/186#issuecomment-374872202,https://api.github.com/repos/simonw/datasette/issues/186,374872202,MDEyOklzc3VlQ29tbWVudDM3NDg3MjIwMg==,47107,stefanocudini,2018-03-21T09:07:22Z,2018-03-21T09:07:22Z,NONE,--debug is perfect tnk,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",306811513,proposal new option to disable user agents cache, https://github.com/simonw/datasette/issues/185#issuecomment-376585911,https://api.github.com/repos/simonw/datasette/issues/185,376585911,MDEyOklzc3VlQ29tbWVudDM3NjU4NTkxMQ==,9599,simonw,2018-03-27T16:19:43Z,2018-03-27T16:19:43Z,OWNER,"OK, I have an implementation of this. I realised that not ALL metadata should be inherited: it makes sense for source/source_url/license/license_url to be inherited, but it doesn't make sense for the title and description to be inherited down to the individual databases and tables.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376587017,https://api.github.com/repos/simonw/datasette/issues/185,376587017,MDEyOklzc3VlQ29tbWVudDM3NjU4NzAxNw==,9599,simonw,2018-03-27T16:22:59Z,2018-03-27T16:22:59Z,OWNER,One thing that's missing from this: if you set source/license data at the individual database level they should be inherited by tables within that database.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376589591,https://api.github.com/repos/simonw/datasette/issues/185,376589591,MDEyOklzc3VlQ29tbWVudDM3NjU4OTU5MQ==,9599,simonw,2018-03-27T16:30:51Z,2018-03-27T16:30:51Z,OWNER,"Also needed: the ability to unset metadata. If the root metadata specifies a license_url it should be possible to set ""license_url"": null on a child database or table. The current implementation will ignore null (or empty string) values and default to the top level value. I think the templates themselves should be able to indicate if they want the inherited values or not. That way we could support arbitrary key/values and avoid the application code having special knowledge of license_url etc.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376590265,https://api.github.com/repos/simonw/datasette/issues/185,376590265,MDEyOklzc3VlQ29tbWVudDM3NjU5MDI2NQ==,222245,carlmjohnson,2018-03-27T16:32:51Z,2018-03-27T16:32:51Z,NONE,">I think the templates themselves should be able to indicate if they want the inherited values or not. That way we could support arbitrary key/values and avoid the application code having special knowledge of license_url etc. Yes, you could have `metadata` that works like `metadata` does currently and `inherited_metadata` that works with inheritance.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376592044,https://api.github.com/repos/simonw/datasette/issues/185,376592044,MDEyOklzc3VlQ29tbWVudDM3NjU5MjA0NA==,222245,carlmjohnson,2018-03-27T16:38:23Z,2018-03-27T16:38:23Z,NONE,"It would be nice to also allow arbitrary keys (maybe under a parent key called params or something to prevent conflicts). For our datasette project, we just have a bunch of dictionaries defined in the base template for things like site URL and column humanized names: https://github.com/baltimore-sun-data/salaries-datasette/blob/master/templates/base.html It would be cleaner if this were in the metadata.json.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/188#issuecomment-376594727,https://api.github.com/repos/simonw/datasette/issues/188,376594727,MDEyOklzc3VlQ29tbWVudDM3NjU5NDcyNw==,9599,simonw,2018-03-27T16:46:49Z,2018-05-28T21:34:34Z,OWNER,"One point of complexity: datasette can be used to bundle multiple .db files into a single ""app"". I think that's OK. We could require that the `datasette_files` table is present in the first database file passed on the command-line. Or we could even construct a search path and consult multiple versions of the table spread across multiple files. That said... any configuration that corresponds to a specific table should live in the same database file as that table. Ditto for general metadata: if we have license/source information for a specific table or database that information should be able to live in the same .db file as the data.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309047460,Ability to bundle metadata and templates inside the SQLite file, https://github.com/simonw/datasette/issues/185#issuecomment-376604558,https://api.github.com/repos/simonw/datasette/issues/185,376604558,MDEyOklzc3VlQ29tbWVudDM3NjYwNDU1OA==,9599,simonw,2018-03-27T17:16:27Z,2018-03-27T17:16:27Z,OWNER,"I am SO inspired by what you've done with https://salaries.news.baltimoresun.com/ - that's pretty much my ideal use-case for Datasette, and it's by far the most elaborate customization I've seen so far. I'd love to hear other ideas that came up while building that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/185#issuecomment-376614973,https://api.github.com/repos/simonw/datasette/issues/185,376614973,MDEyOklzc3VlQ29tbWVudDM3NjYxNDk3Mw==,222245,carlmjohnson,2018-03-27T17:49:00Z,2018-03-27T17:49:00Z,NONE,"@simonw Other than metadata, the biggest item on wishlist for the salaries project was the ability to reorder by column. Of course, that could be done with a custom SQL query, but we didn't want to have to reimplement all the nav/pagination stuff from scratch. @carolinp, feel free to add your thoughts. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/189#issuecomment-376981291,https://api.github.com/repos/simonw/datasette/issues/189,376981291,MDEyOklzc3VlQ29tbWVudDM3Njk4MTI5MQ==,9599,simonw,2018-03-28T18:06:08Z,2018-03-28T18:06:08Z,OWNER,"Given how unlikely it is that this will pose a real problem I think I like option 1: enable sort-by-column by default for all tables, then allow power users to instead switch to explicit enabling of the functionality in their `metadata.json` if they know their data is too big.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-376983741,https://api.github.com/repos/simonw/datasette/issues/189,376983741,MDEyOklzc3VlQ29tbWVudDM3Njk4Mzc0MQ==,9599,simonw,2018-03-28T18:12:35Z,2018-03-28T18:12:35Z,OWNER,"I think this can work with a `?_sort=xxx` parameter - and `?_sort=-xxx` to sort in the opposite direction. I'd like to support ""sort by X descending, then by Y ascending if there are dupes for X"" as well. Two ways that could work: `?_sort=-xxx,yyy` Or... `?_sort=-xxx&_sort=yyy` The second option is probably better in that it makes it easier for columns to have a comma in their name. Is it possible for a SQLite column to start with a `-` character?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-376986668,https://api.github.com/repos/simonw/datasette/issues/189,376986668,MDEyOklzc3VlQ29tbWVudDM3Njk4NjY2OA==,9599,simonw,2018-03-28T18:21:53Z,2018-03-28T18:21:53Z,OWNER,"Might have to do something special to get sort-by-nulls-last: https://stackoverflow.com/questions/12503120/how-to-do-nulls-last-in-sqlite order by ifnull(column_name, -999999) Would need to figure out a smart way to get the default value - maybe by running a min() or max() against the column first?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377049625,https://api.github.com/repos/simonw/datasette/issues/189,377049625,MDEyOklzc3VlQ29tbWVudDM3NzA0OTYyNQ==,9599,simonw,2018-03-28T21:52:05Z,2018-03-28T21:52:05Z,OWNER,"This is a better pattern as you don't have to pick a minimum value: ORDER BY CASE WHEN SOMECOL IS NULL THEN 1 ELSE 0 END, SOMECOL","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377050461,https://api.github.com/repos/simonw/datasette/issues/189,377050461,MDEyOklzc3VlQ29tbWVudDM3NzA1MDQ2MQ==,9599,simonw,2018-03-28T21:55:14Z,2018-03-28T22:06:30Z,OWNER,"I think there are actually four kinds of sort order we need to support; * ascending * descending * ascending, nulls last * descending, nulls last It looks like [-blah] is a valid SQLite table name, so mark I descending with a hyphen prefix isn't good. Instead, maybe this: ?_sort_asc=col1&_sort_desc_nulls_last=col2 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377051018,https://api.github.com/repos/simonw/datasette/issues/189,377051018,MDEyOklzc3VlQ29tbWVudDM3NzA1MTAxOA==,9599,simonw,2018-03-28T21:57:20Z,2018-03-28T22:00:17Z,OWNER,"I'd like to continue to support _next=token pagination even for custom sort orders. To do that I should include rowid (or general primary key) as the tie breaker on all sorts so I can incorporate that it into the _next= token.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377052634,https://api.github.com/repos/simonw/datasette/issues/189,377052634,MDEyOklzc3VlQ29tbWVudDM3NzA1MjYzNA==,9599,simonw,2018-03-28T22:03:16Z,2018-03-28T22:03:16Z,OWNER,"In terms of user interface: the obvious place to put this is as a drop down menu on the column headers. This also means the UI can support combined sort orders. Assuming you are already sorted by county descending and you select the candidate column header, the options could be: * sort all by candidate * sort all by candidate, descending * sort by county descending, then by candidate * sort by county descending, then by candidate descending","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377054358,https://api.github.com/repos/simonw/datasette/issues/189,377054358,MDEyOklzc3VlQ29tbWVudDM3NzA1NDM1OA==,9599,simonw,2018-03-28T22:09:25Z,2018-03-28T22:09:25Z,OWNER,I'm tempted to put these verbose sorting options inline in the page HTML but have them in the table footer so they don't clog up the top half of the page with uninteresting links - then use JavaScript to hoik them out into a dropdown menu attached to each column header.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377055663,https://api.github.com/repos/simonw/datasette/issues/189,377055663,MDEyOklzc3VlQ29tbWVudDM3NzA1NTY2Mw==,9599,simonw,2018-03-28T22:14:53Z,2018-03-28T22:14:53Z,OWNER,"There is one other interesting option for auto-enabling/disabling sort: the inspect command could include data about column index presence and whether or not a column has any null values in it. This would allow us to dynamically include a ""nulls last"" option but only for columns that contain at least one null. It's quite a lot of additional engineering for a very minor feature though, so I think I'll punt on that for the moment. We may find that the _group_count feature can benefit from column value statistics later on though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/190#issuecomment-377065541,https://api.github.com/repos/simonw/datasette/issues/190,377065541,MDEyOklzc3VlQ29tbWVudDM3NzA2NTU0MQ==,9599,simonw,2018-03-28T22:58:52Z,2018-03-28T22:58:52Z,OWNER,"This is because the SQL we are using here is: select * from compound_primary_key where ""pk1"" > ""d"" and ""pk2"" > ""v"" order by pk1, pk2 limit 101 This is incorrect. The correct SQL syntax (according to the example on https://www.sqlite.org/rowvalue.html#scrolling_window_queries ) is: select * from compound_primary_key where (""pk1"", ""pk2"") > (""d"", ""v"") order by pk1, pk2 limit 101 BUT... this uses ""row values"" syntax which was only added to SQLite in version 3.15.0 in October 2016: https://sqlite.org/changes.html#version_3_15_0 The version on https://datasette-issue-190-compound-pks.now.sh/compound-pks-9aafe8f?sql=select+sqlite_version%28%29%3B is 3.8.7.1 from October 2014.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/190#issuecomment-377066466,https://api.github.com/repos/simonw/datasette/issues/190,377066466,MDEyOklzc3VlQ29tbWVudDM3NzA2NjQ2Ng==,9599,simonw,2018-03-28T23:03:45Z,2018-03-28T23:03:57Z,OWNER,"Without row values syntax, the necessary SQL to retrieve the next page after `d, v` gets a bit gnarly: select * from compound_primary_key where pk1 >= ""d"" and not (pk1 = ""d"" and pk2 <= ""v"") order by pk1, pk2 See https://datasette-issue-190-compound-pks.now.sh/compound-pks-9aafe8f?sql=select+*+from+compound_primary_key+where+pk1+%3E%3D+%22d%22+and+not+%28pk1+%3D+%22d%22+and+pk2+%3C%3D+%22v%22%29+order+by+pk1%2C+pk2 This article was useful for figuring this out: https://use-the-index-luke.com/sql/partial-results/fetch-next-page","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/190#issuecomment-377067541,https://api.github.com/repos/simonw/datasette/issues/190,377067541,MDEyOklzc3VlQ29tbWVudDM3NzA2NzU0MQ==,9599,simonw,2018-03-28T23:09:18Z,2018-03-28T23:09:51Z,OWNER,"Here's how I generated the table for testing this with 3 compound primary keys: CREATE_SQL = ''' CREATE TABLE compound_three_primary_keys ( pk1 varchar(30), pk2 varchar(30), pk3 varchar(30), content text, PRIMARY KEY (pk1, pk2, pk3) );''' alphabet = 'abcdefghijklmnopqrstuvwxyz' for a in alphabet: for b in alphabet: for c in alphabet: print(''' INSERT INTO compound_three_primary_keys VALUES ('{}', '{}', '{}', '{}'); '''.strip().format(a, b, c, '{}-{}-{}-{}-{}-{}'.format(a,b,c,a,b,c))) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/190#issuecomment-377072022,https://api.github.com/repos/simonw/datasette/issues/190,377072022,MDEyOklzc3VlQ29tbWVudDM3NzA3MjAyMg==,9599,simonw,2018-03-28T23:32:24Z,2018-03-28T23:32:24Z,OWNER,"Here's the SQL for a next page with three compound primary keys: https://datasette-issue-190-compound-pks.now.sh/compound-pks-8e99805?sql=select+*+from+compound_three_primary_keys%0D%0Awhere%0D%0A++%28pk1+%3E+%3Apk1%29%0D%0A++++or%0D%0A++%28pk1+%3D+%3Apk1+and+pk2+%3E+%3Apk2%29%0D%0A++++or%0D%0A++%28pk1+%3D+%3Apk1+and+pk2+%3D+%3Apk2+and+pk3+%3E+%3Apk3%29%0D%0Aorder+by+pk1%2C+pk2%2C+pk3%3B%0D%0A%0D%0A%0D%0A&pk1=a&pk2=d&pk3=v ``` select * from compound_three_primary_keys where (pk1 > :pk1) or (pk1 = :pk1 and pk2 > :pk2) or (pk1 = :pk1 and pk2 = :pk2 and pk3 > :pk3) order by pk1, pk2, pk3; ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/189#issuecomment-377362466,https://api.github.com/repos/simonw/datasette/issues/189,377362466,MDEyOklzc3VlQ29tbWVudDM3NzM2MjQ2Ng==,9599,simonw,2018-03-29T20:29:14Z,2018-03-29T20:29:14Z,OWNER,"Alternative idea: by default enable all sorting in the UI. If a table has more than 100,000 rows disable sorting UI except for columns that have an index. Allow this to be overridden in metadata.json ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/190#issuecomment-377454591,https://api.github.com/repos/simonw/datasette/issues/190,377454591,MDEyOklzc3VlQ29tbWVudDM3NzQ1NDU5MQ==,9599,simonw,2018-03-30T06:11:59Z,2018-03-30T06:11:59Z,OWNER,"Re-opening this issue: my fix doesn't play nicely with extra filter arguments. Consider this page: https://datasette-issue-190-compound-pks-not-quite-fixed.now.sh/compound-pks-8e99805/compound_three_primary_keys?content__contains=d The next link is to `?_next=f%2Cz%2Ct&content__contains=z` (that's next of `f,z,t`) but that gives us https://datasette-issue-190-compound-pks-not-quite-fixed.now.sh/compound-pks-8e99805/compound_three_primary_keys?_next=b%2Cx%2Cd&content__contains=d which shows `a,a,d` at the top. Sure enough, the generated SQL looks like this: https://datasette-issue-190-compound-pks-not-quite-fixed.now.sh/compound-pks-8e99805?sql=select+%2A+from+compound_three_primary_keys+where+%22content%22+like+%3Ap0+and+%28%5Bpk1%5D+%3E+%3Ap0%29%0A++or%0A%28%5Bpk1%5D+%3D+%3Ap0+and+%5Bpk2%5D+%3E+%3Ap1%29%0A++or%0A%28%5Bpk1%5D+%3D+%3Ap0+and+%5Bpk2%5D+%3D+%3Ap1+and+%5Bpk3%5D+%3E+%3Ap2%29+order+by+pk1%2C+pk2%2C+pk3+limit+101&p0=%25d%25&p1=b&p2=x&p3=d select * from compound_three_primary_keys where ""content"" like :p0 and ([pk1] > :p0) or ([pk1] = :p0 and [pk2] > :p1) or ([pk1] = :p0 and [pk2] = :p1 and [pk3] > :p2) order by pk1, pk2, pk3 limit 101 The parameters here are confused. The :p0 should be reserved just for the like clause - the other parameters should be p1, p2 and p3 (not p0, p1 and p2).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/190#issuecomment-377457087,https://api.github.com/repos/simonw/datasette/issues/190,377457087,MDEyOklzc3VlQ29tbWVudDM3NzQ1NzA4Nw==,9599,simonw,2018-03-30T06:30:23Z,2018-03-30T06:30:23Z,OWNER,"Interestingly, in deploying a copy of the database to demonstrate this final bug fix I had to use the `--force` argument like so: datasette publish now --branch=master compound-pks.db --force This is because `now` had already deployed a Dockerfile referencing `--branch=master` once already, so it thought nothing had changed and it could re-use that last deployment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/190#issuecomment-377457214,https://api.github.com/repos/simonw/datasette/issues/190,377457214,MDEyOklzc3VlQ29tbWVudDM3NzQ1NzIxNA==,9599,simonw,2018-03-30T06:31:15Z,2018-03-30T06:31:15Z,OWNER,"Fixed! https://datasette-issue-190-compound-pks-second-fix.now.sh/compound-pks-8e99805/compound_three_primary_keys?_next=b%2Cx%2Cd&content__contains=d now correctly shows `b,y,d` as the first row on the page.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309558826,Keyset pagination doesn't work correctly for compound primary keys, https://github.com/simonw/datasette/issues/189#issuecomment-377459579,https://api.github.com/repos/simonw/datasette/issues/189,377459579,MDEyOklzc3VlQ29tbWVudDM3NzQ1OTU3OQ==,9599,simonw,2018-03-30T06:47:52Z,2018-03-30T06:47:52Z,OWNER,"I'm not entirely sure how to get `_next=` pagination working against sorted collections when a tie-breaker is needed. Consider this data: https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=select+rowid%2C+*+from+%5Bnfl-wide-receivers%2Fadvanced-historical%5D%0D%0Aorder+by+case+when+career_ranypa+is+null+then+1+else+0+end%2C+career_ranypa%2C+rowid+limit+11 ![2018-03-29 at 11 46 pm](https://user-images.githubusercontent.com/9599/38127549-790c8bd0-33ab-11e8-8d32-66f5d3847c8a.png) If the page size was set to 9 rather than 11, the page divide would be between those two rows with the same value in the `career_ranypa` column. What would the `?_next=` token look like such that the correct row would be returned? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377460127,https://api.github.com/repos/simonw/datasette/issues/189,377460127,MDEyOklzc3VlQ29tbWVudDM3NzQ2MDEyNw==,9599,simonw,2018-03-30T06:51:29Z,2018-03-30T06:51:52Z,OWNER,"The problem is that our `_next=` pagination currently works based on a `>` - but for this case a `>=` for the value is needed combined with a `>` on the tie-breaker (which would be the `rowid` column). So I think this is the right SQL: ``` select rowid, * from [nfl-wide-receivers/advanced-historical] where career_ranypa >= -6.331167749 and rowid > 2736 order by case when career_ranypa is null then 1 else 0 end, career_ranypa, rowid limit 11 ``` https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=select+rowid%2C+*+from+%5Bnfl-wide-receivers%2Fadvanced-historical%5D%0D%0Awhere+career_ranypa+%3E%3D+-6.331167749+and+rowid+%3E+2736%0D%0Aorder+by+case+when+career_ranypa+is+null+then+1+else+0+end%2C+career_ranypa%2C+rowid+limit+11 But how do I encode a `_next` token that means "">= X and > Y""?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377462334,https://api.github.com/repos/simonw/datasette/issues/189,377462334,MDEyOklzc3VlQ29tbWVudDM3NzQ2MjMzNA==,9599,simonw,2018-03-30T07:06:21Z,2018-03-30T07:06:21Z,OWNER,"Maybe the answer here is that anything that's encoded in the next token is treated as >= with the exception of columns known to be primary keys, which are treated as >","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377546510,https://api.github.com/repos/simonw/datasette/issues/189,377546510,MDEyOklzc3VlQ29tbWVudDM3NzU0NjUxMA==,9599,simonw,2018-03-30T15:13:11Z,2018-03-30T15:13:11Z,OWNER,"Pushed some work-in-progress with failing unit tests here: https://github.com/simonw/datasette/commit/2f8359c6f25768805431c80c74e5ec4213c2b2a6 Here's a demo: https://datasette-column-sort-wip.now.sh/sortable-4bbaa6f/sortable?_sort=sortable - note that the `_sort_desc` and `_sort_nulls_last` options aren't done yet, plus it doesn't correctly paginate (the `_next` tokens do not yet take sorting into account).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-377547265,https://api.github.com/repos/simonw/datasette/issues/189,377547265,MDEyOklzc3VlQ29tbWVudDM3NzU0NzI2NQ==,9599,simonw,2018-03-30T15:16:43Z,2018-03-30T15:16:43Z,OWNER,"I think this is the right incantation for a ""next"" link: https://datasette-column-sort-wip.now.sh/sortable-4bbaa6f?sql=select+*+from+sortable%0D%0Awhere+sortable+%3C%3D+94%0D%0Aand+%28%0D%0A++%28pk1+%3E+%27d%27%29%0D%0A++or%0D%0A++%28pk1+%3D+%27d%27+and+pk2+%3E+%27w%27%29%0D%0A%29%0D%0Aorder+by+sortable+desc%2C+pk1%2C+pk2%0D%0Alimit+7","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/122#issuecomment-378279612,https://api.github.com/repos/simonw/datasette/issues/122,378279612,MDEyOklzc3VlQ29tbWVudDM3ODI3OTYxMg==,9599,simonw,2018-04-03T14:55:54Z,2018-04-03T14:55:54Z,OWNER,The new documentation for the `_shape=` parameter is now live at http://datasette.readthedocs.io/en/latest/json_api.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275092453,"Redesign JSON output, ditch jsono, offer variants controlled by parameter instead", https://github.com/simonw/datasette/issues/183#issuecomment-378281740,https://api.github.com/repos/simonw/datasette/issues/183,378281740,MDEyOklzc3VlQ29tbWVudDM3ODI4MTc0MA==,9599,simonw,2018-04-03T15:01:43Z,2018-04-03T15:01:43Z,OWNER,"I'm having trouble replicating this bug. In particular, I don't understand what you mean by ""these are then rendered in the datasette query box using single quotes"" - since canned queries aren't displayed in a textarea. Do you have an example database / metadata.json I can use to investigate this further?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",291639118,Custom Queries - escaping strings, https://github.com/simonw/datasette/pull/181#issuecomment-378293484,https://api.github.com/repos/simonw/datasette/issues/181,378293484,MDEyOklzc3VlQ29tbWVudDM3ODI5MzQ4NA==,9599,simonw,2018-04-03T15:34:29Z,2018-04-03T15:34:29Z,OWNER,"Here's what this looks like: ![2018-04-03 at 8 32 am](https://user-images.githubusercontent.com/9599/38259345-9e1c75ea-3719-11e8-83c9-2160c6fa079c.png) I need to figure out the right way to handle licensing of bundled software like this - it's MIT licensed which is compatible with Datasette's Apache 2 license, but I feel like bundled licensed software (including codemirror) needs to be recognized in the README or docs somehow.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/pull/181#issuecomment-378293599,https://api.github.com/repos/simonw/datasette/issues/181,378293599,MDEyOklzc3VlQ29tbWVudDM3ODI5MzU5OQ==,9599,simonw,2018-04-03T15:34:50Z,2018-04-03T15:36:58Z,OWNER,"Let's only show the ""Format SQL"" button if the user has JavaScript enabled. We can do that in this code here: https://github.com/bsmithgall/datasette/blob/4a7151a58d6ab7c8404a91beef7083e8a5807cf8/datasette/templates/_codemirror_foot.html#L14-L21","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/pull/181#issuecomment-378295376,https://api.github.com/repos/simonw/datasette/issues/181,378295376,MDEyOklzc3VlQ29tbWVudDM3ODI5NTM3Ng==,9599,simonw,2018-04-03T15:39:57Z,2018-04-03T15:39:57Z,OWNER,"On the licensing front: it looks like the way Django handles this is to keep the licensing header in the files intact, e.g. https://github.com/django/django/blob/6deaddcca367d0143c815aaa42342021baa3b41e/django/contrib/admin/static/admin/js/vendor/jquery/jquery.js So for this change, adding a comment at the top of `sql-formatter.min.js` which references the MIT license would do the trick.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/pull/181#issuecomment-378297842,https://api.github.com/repos/simonw/datasette/issues/181,378297842,MDEyOklzc3VlQ29tbWVudDM3ODI5Nzg0Mg==,1957344,bsmithgall,2018-04-03T15:47:13Z,2018-04-03T15:47:13Z,NONE,I can work on that -- would you prefer to inline a `display: hidden` and then have the javascript flip the visibility or include it as css?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/issues/193#issuecomment-379142500,https://api.github.com/repos/simonw/datasette/issues/193,379142500,MDEyOklzc3VlQ29tbWVudDM3OTE0MjUwMA==,222245,carlmjohnson,2018-04-06T04:05:58Z,2018-04-06T04:05:58Z,NONE,"You could try pulling out a validate query strings method. If it fails validation build the error object from the message. If it passes, you only need to go down a happy path. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310882100,Cleaner mechanism for handling custom errors, https://github.com/simonw/datasette/issues/189#issuecomment-379555484,https://api.github.com/repos/simonw/datasette/issues/189,379555484,MDEyOklzc3VlQ29tbWVudDM3OTU1NTQ4NA==,9599,simonw,2018-04-08T14:39:57Z,2018-04-08T14:39:57Z,OWNER,I'm going to combine the code for explicit sorting with the existing code for _next= pagination - so even tables without an explicit sort order will run through the same code since they are ordered and paginated by primary key.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/48#issuecomment-379556637,https://api.github.com/repos/simonw/datasette/issues/48,379556637,MDEyOklzc3VlQ29tbWVudDM3OTU1NjYzNw==,9599,simonw,2018-04-08T14:56:52Z,2018-04-08T14:56:52Z,OWNER,It would be useful to have a microbenchmark in place to help understand how much of a performance benefit this would actually provide.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",272391665,Switch to ujson, https://github.com/simonw/datasette/issues/189#issuecomment-379556774,https://api.github.com/repos/simonw/datasette/issues/189,379556774,MDEyOklzc3VlQ29tbWVudDM3OTU1Njc3NA==,9599,simonw,2018-04-08T14:59:05Z,2018-04-08T14:59:05Z,OWNER,"A common problem with keyset pagination is that it can distort the ""total number of rows"" logic - every time you navigate to a further page the total rows count can decrease due to the extra arguments in the `where` clause. The `filtered_table_rows` value (see #194) calculated using `count_sql` currently has this problem.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/194#issuecomment-379556881,https://api.github.com/repos/simonw/datasette/issues/194,379556881,MDEyOklzc3VlQ29tbWVudDM3OTU1Njg4MQ==,9599,simonw,2018-04-08T15:00:48Z,2018-04-08T15:02:35Z,OWNER,`table_rows_count` is always the *total* number of rows in the table. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312312125,Rename table_rows and filtered_table_rows to have _count suffix, https://github.com/simonw/datasette/issues/194#issuecomment-379556981,https://api.github.com/repos/simonw/datasette/issues/194,379556981,MDEyOklzc3VlQ29tbWVudDM3OTU1Njk4MQ==,9599,simonw,2018-04-08T15:02:23Z,2018-04-08T15:02:23Z,OWNER,Maybe `table_rows_filtered_count` would be more aesthetically pleasing than `filtered_table_rows_count`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312312125,Rename table_rows and filtered_table_rows to have _count suffix, https://github.com/simonw/datasette/issues/195#issuecomment-379557743,https://api.github.com/repos/simonw/datasette/issues/195,379557743,MDEyOklzc3VlQ29tbWVudDM3OTU1Nzc0Mw==,9599,simonw,2018-04-08T15:13:18Z,2018-04-08T15:13:18Z,OWNER,https://github.com/simonw/datasette/blob/446d47fdb005b3776bc06ad8d1f44b01fc2e938b/datasette/app.py#L93-L102,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312313496,"Run pks_for_table in inspect, executing once at build time rather than constantly", https://github.com/simonw/datasette/issues/189#issuecomment-379557982,https://api.github.com/repos/simonw/datasette/issues/189,379557982,MDEyOklzc3VlQ29tbWVudDM3OTU1Nzk4Mg==,9599,simonw,2018-04-08T15:16:49Z,2018-04-08T15:16:49Z,OWNER,"A note about views: a view cannot be paginated using keyset pagination because records returned from a view don't have a primary key - so there's no way to reliably distinguish between _next= records when the sorted column has duplicates with the same value. Datasette already takes this into account: views are paginated using offset/limit instead. We can continue to do that even for views that have been sorted using a `_sort` parameter. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/195#issuecomment-379559074,https://api.github.com/repos/simonw/datasette/issues/195,379559074,MDEyOklzc3VlQ29tbWVudDM3OTU1OTA3NA==,9599,simonw,2018-04-08T15:31:49Z,2018-04-08T15:31:49Z,OWNER,"While I'm at it, doing the same thing for fts_table detection is worth considering: https://github.com/simonw/datasette/blob/446d47fdb005b3776bc06ad8d1f44b01fc2e938b/datasette/app.py#L598-L603","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312313496,"Run pks_for_table in inspect, executing once at build time rather than constantly", https://github.com/simonw/datasette/issues/150#issuecomment-379559214,https://api.github.com/repos/simonw/datasette/issues/150,379559214,MDEyOklzc3VlQ29tbWVudDM3OTU1OTIxNA==,9599,simonw,2018-04-08T15:33:58Z,2018-04-08T15:33:58Z,OWNER,The single biggest challenge here is expanding foreign key references. This is the blocker that prevents `_group_count` from being useful at the moment.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276704327,_group_count= feature improvements, https://github.com/simonw/datasette/issues/150#issuecomment-379559319,https://api.github.com/repos/simonw/datasette/issues/150,379559319,MDEyOklzc3VlQ29tbWVudDM3OTU1OTMxOQ==,9599,simonw,2018-04-08T15:35:43Z,2018-04-08T15:35:43Z,OWNER,"From a code point of view, the current mechanism for `_group_count` makes the `TableView` even **more** complicated: https://github.com/simonw/datasette/blob/446d47fdb005b3776bc06ad8d1f44b01fc2e938b/datasette/app.py#L644-L653 Instead, I think if `_group_count` is detected we should generate the SQL and then defer to `self.custom_sql`, like we do for canned queries: https://github.com/simonw/datasette/blob/446d47fdb005b3776bc06ad8d1f44b01fc2e938b/datasette/app.py#L539-L541","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276704327,_group_count= feature improvements, https://github.com/simonw/datasette/issues/195#issuecomment-379588602,https://api.github.com/repos/simonw/datasette/issues/195,379588602,MDEyOklzc3VlQ29tbWVudDM3OTU4ODYwMg==,9599,simonw,2018-04-08T22:40:16Z,2018-04-08T22:40:16Z,OWNER,"Could also identify all views for that database, which would save on these queries: https://github.com/simonw/datasette/blob/b2188f044265c95f7e54860e28107c17d2a6ed2e/datasette/app.py#L543-L545","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312313496,"Run pks_for_table in inspect, executing once at build time rather than constantly", https://github.com/simonw/datasette/issues/189#issuecomment-379591062,https://api.github.com/repos/simonw/datasette/issues/189,379591062,MDEyOklzc3VlQ29tbWVudDM3OTU5MTA2Mg==,9599,simonw,2018-04-08T23:23:12Z,2018-04-08T23:23:12Z,OWNER,"To break this up into smaller units, the first implementation of this will only support a single `_sort` or `_sort_desc` querystring parameter.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379592393,https://api.github.com/repos/simonw/datasette/issues/189,379592393,MDEyOklzc3VlQ29tbWVudDM3OTU5MjM5Mw==,9599,simonw,2018-04-08T23:45:42Z,2018-04-08T23:46:31Z,OWNER,"Actually next page SQL when sorting looks more like this: ``` select rowid, * from [alcohol-consumption/drinks] where ""country"" like :p0 and ( beer_servings > 111 or (beer_servings = 111 and rowid > 190) ) order by beer_servings, rowid limit 101 ``` The next page after row 190 with sortable value 111 should show either records that are greater than 111 or records that match 111 but have a greater primary key than the last one seen. https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=select+rowid%2C+*+from+%5Balcohol-consumption%2Fdrinks%5D%0D%0Awhere+%22country%22+like+%3Ap0%0D%0Aand+%28%0D%0A++++beer_servings+%3E+111%0D%0A++++or+%28beer_servings+%3D+111+and+rowid+%3E+190%29%0D%0A%29%0D%0Aorder+by+beer_servings%2C+rowid+limit+101&p0=%25a%25","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379594529,https://api.github.com/repos/simonw/datasette/issues/189,379594529,MDEyOklzc3VlQ29tbWVudDM3OTU5NDUyOQ==,9599,simonw,2018-04-09T00:15:03Z,2018-04-09T00:15:03Z,OWNER,"Demo: senator tweets ordered by number of replies: https://datasette-issue-189-demo.now.sh/fivethirtyeight-2628db9/twitter-ratio%2Fsenators?_sort_desc=replies Page 2 (note that since Senators retweet things there are tweets with the same text/number-of-replies but retweeted by different senators that span the page break): https://datasette-issue-189-demo.now.sh/fivethirtyeight-2628db9/twitter-ratio%2Fsenators?_next=8556%2C121799&_sort_desc=replies ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/185#issuecomment-379595253,https://api.github.com/repos/simonw/datasette/issues/185,379595253,MDEyOklzc3VlQ29tbWVudDM3OTU5NTI1Mw==,9599,simonw,2018-04-09T00:24:10Z,2018-04-09T00:24:10Z,OWNER,@carlmjohnson in case you aren't following along with #189 I've shipped the first working prototype of sort-by-column - you can try it out here: https://datasette-issue-189-demo-2.now.sh/salaries-7859114-7859114/2017+Maryland+state+salaries?_search=university&_sort_desc=annual_salary,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/189#issuecomment-379595274,https://api.github.com/repos/simonw/datasette/issues/189,379595274,MDEyOklzc3VlQ29tbWVudDM3OTU5NTI3NA==,9599,simonw,2018-04-09T00:24:37Z,2018-04-09T00:29:46Z,OWNER,"Another demo: https://datasette-issue-189-demo-2.now.sh/salaries-7859114-7859114/2017+Maryland+state+salaries?_search=university&_sort_desc=annual_salary https://datasette-issue-189-demo-2.now.sh/salaries-7859114-7859114/2017+Maryland+state+salaries?_search=university&last_name__exact=JOHNSON&_sort_desc=annual_salary","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379602339,https://api.github.com/repos/simonw/datasette/issues/189,379602339,MDEyOklzc3VlQ29tbWVudDM3OTYwMjMzOQ==,9599,simonw,2018-04-09T01:33:26Z,2018-04-09T01:33:26Z,OWNER,"Small bug: ""201 rows where sorted by sortable_with_nulls"" shouldn't have the word ""where"" in it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379602690,https://api.github.com/repos/simonw/datasette/issues/189,379602690,MDEyOklzc3VlQ29tbWVudDM3OTYwMjY5MA==,9599,simonw,2018-04-09T01:37:03Z,2018-04-09T01:37:03Z,OWNER,"I'm going to split the following out into separate tickets: * Ability to sort by multiple columns e.g. `?_sort=name&sort_desc=age&_sort=height` * Ability to specify nulls last e.g. `?_sort_desc_nulls_last=age`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379603156,https://api.github.com/repos/simonw/datasette/issues/189,379603156,MDEyOklzc3VlQ29tbWVudDM3OTYwMzE1Ng==,9599,simonw,2018-04-09T01:41:22Z,2018-04-09T01:41:22Z,OWNER,"Actually I think I always want nulls last when ordering asc, nulls first when ordering desc.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379608977,https://api.github.com/repos/simonw/datasette/issues/189,379608977,MDEyOklzc3VlQ29tbWVudDM3OTYwODk3Nw==,9599,simonw,2018-04-09T02:22:59Z,2018-04-09T02:22:59Z,OWNER,"Here's a demo of the new clickable column headers: https://datasette-issue-189-demo-3.now.sh/salaries-7859114-7859114/2017+Maryland+state+salaries?_search=university&_sort_desc=last_name ![2018-04-08 at 7 22 pm](https://user-images.githubusercontent.com/9599/38476370-3e62a60e-3b62-11e8-9d30-8dc6608133dd.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/193#issuecomment-379624163,https://api.github.com/repos/simonw/datasette/issues/193,379624163,MDEyOklzc3VlQ29tbWVudDM3OTYyNDE2Mw==,9599,simonw,2018-04-09T04:03:49Z,2018-04-09T04:03:49Z,OWNER,"This is harder than I thought, because the `_shape=` logic actually runs AFTER the main block of code which is set up to catch exceptions - this code here: https://github.com/simonw/datasette/blob/0abd3abacb309a2bd5913a7a2df4e9256585b1bb/datasette/app.py#L200-L216","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310882100,Cleaner mechanism for handling custom errors, https://github.com/simonw/datasette/issues/189#issuecomment-379634425,https://api.github.com/repos/simonw/datasette/issues/189,379634425,MDEyOklzc3VlQ29tbWVudDM3OTYzNDQyNQ==,9599,simonw,2018-04-09T05:16:02Z,2018-04-09T05:16:02Z,OWNER,I've merged this into master.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/184#issuecomment-379636068,https://api.github.com/repos/simonw/datasette/issues/184,379636068,MDEyOklzc3VlQ29tbWVudDM3OTYzNjA2OA==,9599,simonw,2018-04-09T05:26:21Z,2018-04-09T05:26:21Z,OWNER,Do you have steps to reproduce here - ideally a small example SQLite database that exhibits the error?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",292011379,500 from missing table name, https://github.com/simonw/datasette/pull/181#issuecomment-379636695,https://api.github.com/repos/simonw/datasette/issues/181,379636695,MDEyOklzc3VlQ29tbWVudDM3OTYzNjY5NQ==,9599,simonw,2018-04-09T05:30:16Z,2018-04-09T05:30:16Z,OWNER,"I'd prefer to have the JavaScript actually manipulate the DOM to add the button - something like this: var button = document.createElement('button'); button.value = 'Format SQL'; button.addEventListener( 'click', format, false ); document.getElementById('run-sql').parentNode.appendChild(button);","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/pull/181#issuecomment-379759875,https://api.github.com/repos/simonw/datasette/issues/181,379759875,MDEyOklzc3VlQ29tbWVudDM3OTc1OTg3NQ==,1957344,bsmithgall,2018-04-09T13:53:14Z,2018-04-09T13:53:14Z,NONE,I've implemented that approach in 86ac746. It does cause the button to pop in only after Codemirror is finished rendering which is a bit awkward.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",289425975,"add ""format sql"" button to query page, uses sql-formatter", https://github.com/simonw/datasette/issues/184#issuecomment-379788103,https://api.github.com/repos/simonw/datasette/issues/184,379788103,MDEyOklzc3VlQ29tbWVudDM3OTc4ODEwMw==,222245,carlmjohnson,2018-04-09T15:15:11Z,2018-04-09T15:15:11Z,NONE,Visit https://salaries.news.baltimoresun.com/salaries/bad-table.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",292011379,500 from missing table name, https://github.com/simonw/datasette/issues/189#issuecomment-379791047,https://api.github.com/repos/simonw/datasette/issues/189,379791047,MDEyOklzc3VlQ29tbWVudDM3OTc5MTA0Nw==,222245,carlmjohnson,2018-04-09T15:23:45Z,2018-04-09T15:23:45Z,NONE,Awesome!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379803864,https://api.github.com/repos/simonw/datasette/issues/189,379803864,MDEyOklzc3VlQ29tbWVudDM3OTgwMzg2NA==,9599,simonw,2018-04-09T16:02:09Z,2018-04-09T16:02:09Z,OWNER,This is now released in Datasette 0.15 https://github.com/simonw/datasette/releases/tag/0.15,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/189#issuecomment-379830529,https://api.github.com/repos/simonw/datasette/issues/189,379830529,MDEyOklzc3VlQ29tbWVudDM3OTgzMDUyOQ==,9599,simonw,2018-04-09T17:28:47Z,2018-04-09T17:28:47Z,OWNER,Another demo: https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9/congress-age%2Fcongress-terms,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/issues/199#issuecomment-379833216,https://api.github.com/repos/simonw/datasette/issues/199,379833216,MDEyOklzc3VlQ29tbWVudDM3OTgzMzIxNg==,9599,simonw,2018-04-09T17:37:47Z,2018-04-09T17:37:47Z,OWNER,I may do this by adding select boxes for _sort and _sort_desc to the filters UI. This would allow sorting in mobile portrait mode but would also ensure that the existing sort order is persisted if the user edits the current filters (right now sort resets when filters are applied).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312620566,Ability to apply sort on mobile in portrait mode, https://github.com/simonw/datasette/issues/199#issuecomment-379833481,https://api.github.com/repos/simonw/datasette/issues/199,379833481,MDEyOklzc3VlQ29tbWVudDM3OTgzMzQ4MQ==,9599,simonw,2018-04-09T17:38:39Z,2018-04-09T17:38:39Z,OWNER,"Since you can't apply `_sort` and `_sort_desc` at the same time, maybe just one select box for picking the column to sort by and a boolean checkbox for ""sort descending"" - which then redirects to the `_sort_desc=` URL variant.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312620566,Ability to apply sort on mobile in portrait mode, https://github.com/simonw/datasette/issues/199#issuecomment-379936068,https://api.github.com/repos/simonw/datasette/issues/199,379936068,MDEyOklzc3VlQ29tbWVudDM3OTkzNjA2OA==,9599,simonw,2018-04-10T00:32:37Z,2018-04-10T00:32:37Z,OWNER,"![2018-04-09 at 5 32 pm](https://user-images.githubusercontent.com/9599/38529802-fd2a7e68-3c1b-11e8-974a-bf5438fec701.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312620566,Ability to apply sort on mobile in portrait mode, https://github.com/simonw/datasette/issues/199#issuecomment-379936832,https://api.github.com/repos/simonw/datasette/issues/199,379936832,MDEyOklzc3VlQ29tbWVudDM3OTkzNjgzMg==,9599,simonw,2018-04-10T00:37:52Z,2018-04-10T00:37:52Z,OWNER,Demo: https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9/twitter-ratio%2Fsenators?_sort_desc=replies&text__contains=bipartisan,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",312620566,Ability to apply sort on mobile in portrait mode, https://github.com/simonw/datasette/pull/200#issuecomment-380606998,https://api.github.com/repos/simonw/datasette/issues/200,380606998,MDEyOklzc3VlQ29tbWVudDM4MDYwNjk5OA==,9599,simonw,2018-04-11T21:50:14Z,2018-04-11T21:50:14Z,OWNER,"We should only do this if we're certain the spatialite module has been loaded. I could imagine someone having a `sql_statements_log` table of their own without using spatialite for example. I think the most reliable way to detect spatialite is to run `SELECT AddGeometryColumn(1, 2, 3, 4, 5);` against a `:memory:` database and see if it throws an exception - similar to how we detect FTS. We could add this as a `detect_spatialite()` function in `utils.py` and call it once on startup.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313494458,Hide Spatialite system tables, https://github.com/simonw/datasette/issues/184#issuecomment-380608340,https://api.github.com/repos/simonw/datasette/issues/184,380608340,MDEyOklzc3VlQ29tbWVudDM4MDYwODM0MA==,9599,simonw,2018-04-11T21:55:41Z,2018-04-11T21:55:41Z,OWNER,"Yuck, nasty - OK I get it, this happens with ANY non-existent table name. Let's fix that - these should clearly return an HTTP 404.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",292011379,500 from missing table name, https://github.com/simonw/datasette/pull/200#issuecomment-380608372,https://api.github.com/repos/simonw/datasette/issues/200,380608372,MDEyOklzc3VlQ29tbWVudDM4MDYwODM3Mg==,45057,russss,2018-04-11T21:55:46Z,2018-04-11T21:55:46Z,CONTRIBUTOR,"> I think the most reliable way to detect spatialite is to run `SELECT AddGeometryColumn(1, 2, 3, 4, 5);` against a `:memory:` database and see if it throws an exception Or just see if there's a `geometry_columns` table? I think that's quite unlikely to be added by accident (and it's an OGC standard). It also tells you if Spatialite is installed in the database rather than just loaded.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313494458,Hide Spatialite system tables, https://github.com/simonw/datasette/issues/193#issuecomment-380619851,https://api.github.com/repos/simonw/datasette/issues/193,380619851,MDEyOklzc3VlQ29tbWVudDM4MDYxOTg1MQ==,9599,simonw,2018-04-11T22:48:19Z,2018-04-11T22:48:19Z,OWNER,I can clean this up further with the mechanism I'm using for #184,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310882100,Cleaner mechanism for handling custom errors, https://github.com/simonw/datasette/pull/200#issuecomment-380951474,https://api.github.com/repos/simonw/datasette/issues/200,380951474,MDEyOklzc3VlQ29tbWVudDM4MDk1MTQ3NA==,9599,simonw,2018-04-12T21:34:39Z,2018-04-12T21:34:39Z,OWNER,"Nice, thanks very much.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313494458,Hide Spatialite system tables, https://github.com/simonw/datasette/issues/203#issuecomment-380951815,https://api.github.com/repos/simonw/datasette/issues/203,380951815,MDEyOklzc3VlQ29tbWVudDM4MDk1MTgxNQ==,9599,simonw,2018-04-12T21:36:10Z,2018-04-12T21:36:10Z,OWNER,I like this. I'd like to be able to attach a full description to a column as well. We could support these in `metadata.json`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-380951920,https://api.github.com/repos/simonw/datasette/issues/203,380951920,MDEyOklzc3VlQ29tbWVudDM4MDk1MTkyMA==,9599,simonw,2018-04-12T21:36:38Z,2018-04-12T21:36:38Z,OWNER,This also feeds into the visualization features I want to add - we could use this kind of metadata to automatically apply meaningful labels to graphs.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-380966565,https://api.github.com/repos/simonw/datasette/issues/203,380966565,MDEyOklzc3VlQ29tbWVudDM4MDk2NjU2NQ==,45057,russss,2018-04-12T22:43:08Z,2018-04-12T22:43:08Z,CONTRIBUTOR,"Looks like [pint](https://pint.readthedocs.io/en/latest/tutorial.html) is pretty good at this. ```python In [1]: import pint In [2]: ureg = pint.UnitRegistry() In [3]: q = 3e6 * ureg('Hz') In [4]: '{:~P}'.format(q.to_compact()) Out[4]: '3.0 MHz' In [5]: q = 0.3 * ureg('m') In [5]: '{:~P}'.format(q.to_compact()) Out[5]: '300.0 mm' In [6]: q = 5 * ureg('') In [7]: '{:~P}'.format(q.to_compact()) Out[7]: '5' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/pull/202#issuecomment-381220441,https://api.github.com/repos/simonw/datasette/issues/202,381220441,MDEyOklzc3VlQ29tbWVudDM4MTIyMDQ0MQ==,9599,simonw,2018-04-13T18:19:15Z,2018-04-13T18:19:15Z,OWNER,I'm afraid I've just made this obsolete with 9f28bbe43dc277a3963a12aaae37b5ee3c277207,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313785206,Raise 404 on nonexistent table URLs, https://github.com/simonw/datasette/pull/202#issuecomment-381237440,https://api.github.com/repos/simonw/datasette/issues/202,381237440,MDEyOklzc3VlQ29tbWVudDM4MTIzNzQ0MA==,45057,russss,2018-04-13T19:22:53Z,2018-04-13T19:22:53Z,CONTRIBUTOR,I spotted you'd mentioned that in #184 but only after I'd written the patch!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313785206,Raise 404 on nonexistent table URLs, https://github.com/simonw/datasette/issues/201#issuecomment-381262824,https://api.github.com/repos/simonw/datasette/issues/201,381262824,MDEyOklzc3VlQ29tbWVudDM4MTI2MjgyNA==,9599,simonw,2018-04-13T21:17:14Z,2018-04-13T21:17:14Z,OWNER,"Demo: https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=explain+query+plan+select+*+from+%5Bmost-common-name%2Fsurnames%5D+order+by+rank+desc https://fivethirtyeight.datasettes.com/fivethirtyeight-2628db9?sql=explain+select+*+from+%5Bmost-common-name%2Fsurnames%5D+order+by+rank+desc","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313512748,Support explain select / explain query plan select, https://github.com/simonw/datasette/issues/203#issuecomment-381300336,https://api.github.com/repos/simonw/datasette/issues/203,381300336,MDEyOklzc3VlQ29tbWVudDM4MTMwMDMzNg==,9599,simonw,2018-04-14T03:35:02Z,2018-04-14T03:35:02Z,OWNER,"This is really cool - I'm very impressed by pint. I'd like to figure out a sensible opt-in way to expose this in the JSON output as well. Maybe with a `&_units=true` parameter? We should definitely expose the units section from the table metadata in the output of https://wtr-api.herokuapp.com/wtr-663ea99/license_frequency.json","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-381300386,https://api.github.com/repos/simonw/datasette/issues/203,381300386,MDEyOklzc3VlQ29tbWVudDM4MTMwMDM4Ng==,9599,simonw,2018-04-14T03:35:56Z,2018-04-14T03:35:56Z,OWNER,"In #204 you said ""I'd like to add support for using units when querying but this is PR is pretty usable as-is."" - I'm fascinated to hear more about how this could work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-381315675,https://api.github.com/repos/simonw/datasette/issues/203,381315675,MDEyOklzc3VlQ29tbWVudDM4MTMxNTY3NQ==,45057,russss,2018-04-14T09:14:45Z,2018-04-14T09:27:30Z,CONTRIBUTOR,"> I'd like to figure out a sensible opt-in way to expose this in the JSON output as well. Maybe with a &_units=true parameter? From a machine-readable perspective I'm not sure why it would be useful to decorate the values with units. Edit: Should have had some coffee first. It's clearly useful for stuff like map rendering! I agree that the unit metadata should definitely be exposed in the JSON. > In #204 you said ""I'd like to add support for using units when querying but this is PR is pretty usable as-is."" - I'm fascinated to hear more about how this could work. I'm thinking about a couple of approaches here. I think the simplest one is: if the column has a unit attached, optionally accept units in query fields: ```python column_units = ureg(""Hz"") # Create a unit object for the column's unit query_variable = ureg(""4 GHz"") # Supplied query variable # Now we can convert the query units into column units before querying supplied_value.to(column_units).magnitude > 4000000000.0 # If the user doesn't supply units, pint just returns the plain # number and we can query as usual assuming it's the base unit query_variable = ureg(""50"") query_variable > 50 isinstance(query_variable, numbers.Number) > True ``` This also lets us do some nice unit conversion on querying: ```python column_units = ureg(""m"") query_variable = ureg(""50 ft"") supplied_value.to(column_units) > ``` The alternative would be to provide a dropdown of units next to the query field (so a ""Hz"" field would give you ""kHz"", ""MHz"", ""GHz""). Although this would be clearer to the user, it isn't so easy - we'd need to know more about the context of the field to give you sensible SI prefixes (I'm not so interested in nanoHertz, for example). You also lose the bonus of being able to convert - although pint will happily show you all the compatible units, it again suffers from a lack of context: ```python ureg(""m"").compatible_units() > frozenset({, , , , , , , , , , , }) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/203#issuecomment-381330075,https://api.github.com/repos/simonw/datasette/issues/203,381330075,MDEyOklzc3VlQ29tbWVudDM4MTMzMDA3NQ==,9599,simonw,2018-04-14T13:41:53Z,2018-04-14T13:41:53Z,OWNER,"Presumably units only work for numeric fields? If that's the case then automatically processing them if the incoming query string argument has a unit suffix makes total sense to me. Here's a pretty crazy idea: what if we exposed unit conversion to SQL as a custom SQLite function? That way it would be possible to optionally use units in actual custom SQL queries. I'd have to think quite carefully about performance implications here - wouldn't want a poorly considered unit calculation over a 500,000 row table to lock up the server. But I think the 1s query time limit might still prevent that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/pull/205#issuecomment-381330220,https://api.github.com/repos/simonw/datasette/issues/205,381330220,MDEyOklzc3VlQ29tbWVudDM4MTMzMDIyMA==,9599,simonw,2018-04-14T13:44:15Z,2018-04-14T13:44:15Z,OWNER,This looks great so far - love the new documentation. Let's throw in a unit test or two for the basic unit filters (mainly as a protection against accidental regressions in the future).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314319372,Support filtering with units and more, https://github.com/simonw/datasette/pull/205#issuecomment-381332222,https://api.github.com/repos/simonw/datasette/issues/205,381332222,MDEyOklzc3VlQ29tbWVudDM4MTMzMjIyMg==,45057,russss,2018-04-14T14:16:35Z,2018-04-14T14:16:35Z,CONTRIBUTOR,I've added some tests and that docs link.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314319372,Support filtering with units and more, https://github.com/simonw/datasette/pull/207#issuecomment-381334973,https://api.github.com/repos/simonw/datasette/issues/207,381334973,MDEyOklzc3VlQ29tbWVudDM4MTMzNDk3Mw==,9599,simonw,2018-04-14T14:59:52Z,2018-04-14T14:59:52Z,OWNER,I'm going to merge this and then add a unit test.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314329002,Link foreign keys which don't have labels, https://github.com/simonw/datasette/pull/205#issuecomment-381336696,https://api.github.com/repos/simonw/datasette/issues/205,381336696,MDEyOklzc3VlQ29tbWVudDM4MTMzNjY5Ng==,9599,simonw,2018-04-14T15:24:04Z,2018-04-14T15:24:04Z,OWNER,I merged this to master in c857608738d6b6c3e4f3248304a22f8b2648dd3e - thanks @russss!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314319372,Support filtering with units and more, https://github.com/simonw/datasette/issues/203#issuecomment-381348849,https://api.github.com/repos/simonw/datasette/issues/203,381348849,MDEyOklzc3VlQ29tbWVudDM4MTM0ODg0OQ==,9599,simonw,2018-04-14T18:12:52Z,2018-04-14T18:12:52Z,OWNER,I think I'm going to hold on to the custom sql function idea for the moment and implement it as an example plugin.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/125#issuecomment-381361734,https://api.github.com/repos/simonw/datasette/issues/125,381361734,MDEyOklzc3VlQ29tbWVudDM4MTM2MTczNA==,45057,russss,2018-04-14T21:26:30Z,2018-04-14T21:26:30Z,CONTRIBUTOR,"FWIW I am now doing this on my WTR app (instead of silently limiting maps to 1000). [Telefonica](https://wtr-api.herokuapp.com/wtr-663ea99/licensee/18325) now has about 4000 markers and good old [BT](https://wtr-api.herokuapp.com/wtr-663ea99/licensee/8412) has 22,000 or so.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275135393,Plot rows on a map with Leaflet and Leaflet.markercluster, https://github.com/simonw/datasette/issues/189#issuecomment-381429213,https://api.github.com/repos/simonw/datasette/issues/189,381429213,MDEyOklzc3VlQ29tbWVudDM4MTQyOTIxMw==,222245,carlmjohnson,2018-04-15T18:54:22Z,2018-04-15T18:54:22Z,NONE,"I think I found a bug. I tried to sort by middle initial in my salaries set, and many middle initials are null. The next_url gets set by Datasette to: http://localhost:8001/salaries-d3a5631/2017+Maryland+state+salaries?_next=None%2C391&_sort=middle_initial But then `None` is interpreted literally and it tries to find a name with the middle initial ""None"" and ends up skipping ahead to O on page 2.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309471814,Ability to sort (and paginate) by column, https://github.com/simonw/datasette/pull/209#issuecomment-381441392,https://api.github.com/repos/simonw/datasette/issues/209,381441392,MDEyOklzc3VlQ29tbWVudDM4MTQ0MTM5Mg==,45057,russss,2018-04-15T21:59:15Z,2018-04-15T21:59:15Z,CONTRIBUTOR,"I suspected this would cause some test failures, but I'll wait for opinions before attempting to fix them.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/issues/14#issuecomment-381442233,https://api.github.com/repos/simonw/datasette/issues/14,381442233,MDEyOklzc3VlQ29tbWVudDM4MTQ0MjIzMw==,9599,simonw,2018-04-15T22:13:06Z,2018-04-15T22:13:06Z,OWNER,"I started a thread on Twitter asking people for good examples of Python projects with a strong plugin ecosystem: https://twitter.com/simonw/status/985377670388105216 The most impressive example that came back was pytest - which now has nearly 400 plugins: https://plugincompat.herokuapp.com/ The pytest plugin infrastructure is available as an independent package called pluggy - which appears to offer everything I need for Datasette. I'm going to give that a go and see how well it works: https://pluggy.readthedocs.io/en/latest/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381442494,https://api.github.com/repos/simonw/datasette/issues/14,381442494,MDEyOklzc3VlQ29tbWVudDM4MTQ0MjQ5NA==,9599,simonw,2018-04-15T22:17:59Z,2018-04-15T22:17:59Z,OWNER,"Datasette 1.0 will be the release of Datasette that attempts to provide a stable plugin API: https://github.com/simonw/datasette/milestone/7 There's a lot of work to be done before then, but as a starting point I'm going to support two very simple extension mechanisms: * Template system plugins - where the hook gets passed the Jinja environment and can freely register new template tags and filters * SQLite connection plugins - where the hook gets passed a new SQLite connection and can register custom SQLite functions The template system hook will go near here: https://github.com/simonw/datasette/blob/efbb4e83374a2c795e436c72fa79f70da72309b8/datasette/app.py#L1225-L1228 The SQLite connection hook will go near here: https://github.com/simonw/datasette/blob/efbb4e83374a2c795e436c72fa79f70da72309b8/datasette/app.py#L1094-L1098 These two feel simple enough that I'm not worried that I might design an API that I later regret.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381443728,https://api.github.com/repos/simonw/datasette/issues/14,381443728,MDEyOklzc3VlQ29tbWVudDM4MTQ0MzcyOA==,9599,simonw,2018-04-15T22:39:00Z,2018-04-15T22:39:00Z,OWNER,Tox is a good example of a project that uses pluggy in the way I want to use it (function hooks rather than classes): https://github.com/tox-dev/tox/blob/master/tox/hookspecs.py,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381446392,https://api.github.com/repos/simonw/datasette/issues/14,381446392,MDEyOklzc3VlQ29tbWVudDM4MTQ0NjM5Mg==,9599,simonw,2018-04-15T23:22:40Z,2018-04-16T05:25:57Z,OWNER,"OK, from that prototype in f2720b0c6b7172ebe8820 it looks like pluggy provides a solid path forward. Next steps: - [x] Build a demo plugin that uses setuptools entrypoints to register with the `datasette` plugin manager via pluggy - [x] Figure out a mechanism for registering plugins without first needing to publish them to PyPI. Can I load plugins from a special `plugins/` directory similar to the `--template-dir=templates/` option already supported by Datasette? #211","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381446511,https://api.github.com/repos/simonw/datasette/issues/14,381446511,MDEyOklzc3VlQ29tbWVudDM4MTQ0NjUxMQ==,9599,simonw,2018-04-15T23:25:04Z,2018-04-15T23:25:04Z,OWNER,"Here's a demo of the `convert_units()` SQL function I prototyped in f2720b0c6b7172ebe88 ![2018-04-15 at 4 23 pm](https://user-images.githubusercontent.com/9599/38784633-8c43821e-40c9-11e8-97dd-697755a0f858.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/203#issuecomment-381446554,https://api.github.com/repos/simonw/datasette/issues/203,381446554,MDEyOklzc3VlQ29tbWVudDM4MTQ0NjU1NA==,9599,simonw,2018-04-15T23:25:54Z,2018-04-15T23:26:03Z,OWNER,I built a prototype of the `convert_units()` custom SQL function as a plugin over in https://github.com/simonw/datasette/issues/14#issuecomment-381446511,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/14#issuecomment-381446906,https://api.github.com/repos/simonw/datasette/issues/14,381446906,MDEyOklzc3VlQ29tbWVudDM4MTQ0NjkwNg==,9599,simonw,2018-04-15T23:31:58Z,2018-04-15T23:34:10Z,OWNER,"Once I've got the plugins mechanism stable and people start releasing plugins it would be useful to have a dedicated Trove classifier on PyPI for Datasette plugins - `Framework :: Datasette` for example. This would help me build a Datasette equivalent of the http://plugincompat.herokuapp.com/ site, which works by scanning PyPI for items with the ``Framework :: Pytest`` classifier: https://github.com/pytest-dev/plugincompat/blob/8bdf1a6fb82807091ece0c68c196103ee8270194/update_index.py#L52-L53 It looks like the mechanism for requesting new PyPI classifiers is to file a ticket against warehouse, like these ones: https://github.com/pypa/warehouse/issues/3570 and https://github.com/pypa/warehouse/issues/2881","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381450394,https://api.github.com/repos/simonw/datasette/issues/14,381450394,MDEyOklzc3VlQ29tbWVudDM4MTQ1MDM5NA==,9599,simonw,2018-04-16T00:27:23Z,2018-04-16T00:27:23Z,OWNER,"I created https://github.com/simonw/datasette-plugin-demos which is now published to PyPI and can be installed with `pip install datasette-plugin-demos` - I've confirmed that if you DO install it my Datasette `plugins` branch picks up the plugins, and `select random_integer(1, 4)` works as it should.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381450591,https://api.github.com/repos/simonw/datasette/issues/14,381450591,MDEyOklzc3VlQ29tbWVudDM4MTQ1MDU5MQ==,9599,simonw,2018-04-16T00:30:22Z,2018-04-16T00:34:42Z,OWNER,"Slight code design problem... when I tried installing my branch in a fresh virtual environment I got this error, because `setup.py` now depends on `pluggy` (from importing `__version__`): ``` File ""/private/var/folders/jj/fngnv0810tn2lt_kd3911pdc0000gp/T/pip-req-build-dftqdezt/setup.py"", line 2, in from datasette import __version__ File ""/private/var/folders/jj/fngnv0810tn2lt_kd3911pdc0000gp/T/pip-req-build-dftqdezt/datasette/__init__.py"", line 2, in from .hookspecs import hookimpl # noqa File ""/private/var/folders/jj/fngnv0810tn2lt_kd3911pdc0000gp/T/pip-req-build-dftqdezt/datasette/hookspecs.py"", line 1, in from pluggy import HookimplMarker ModuleNotFoundError: No module named 'pluggy' ``` Looks like I've run into point 6 on https://packaging.python.org/guides/single-sourcing-package-version/ : ![2018-04-15 at 5 34 pm](https://user-images.githubusercontent.com/9599/38785314-403ce86a-40d3-11e8-8542-ba426eddf4ac.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/139#issuecomment-381455054,https://api.github.com/repos/simonw/datasette/issues/139,381455054,MDEyOklzc3VlQ29tbWVudDM4MTQ1NTA1NA==,9599,simonw,2018-04-16T01:24:13Z,2018-04-16T01:24:13Z,OWNER,"I think Vega-Lite is the way to go here: https://vega.github.io/vega-lite/ I've been playing around with it and Datasette with some really positive initial results: https://vega.github.io/editor/#/gist/vega-lite/simonw/89100ce80573d062d70f780d10e5e609/decada131575825875c0a076e418c661c2adb014/vice-shootings-gender-race-by-department.vl.json https://vega.github.io/editor/#/gist/vega-lite/simonw/5f69fbe29380b0d5d95f31a385f49ee4/7087b64df03cf9dba44a5258a606f29182cb8619/trees-san-francisco.vl.json","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275493851,Build a visualization plugin for Vega, https://github.com/simonw/datasette/issues/211#issuecomment-381456434,https://api.github.com/repos/simonw/datasette/issues/211,381456434,MDEyOklzc3VlQ29tbWVudDM4MTQ1NjQzNA==,9599,simonw,2018-04-16T01:36:16Z,2018-04-16T01:37:44Z,OWNER,"The easiest way to implement this in Python 2 would be `execfile(...)` - but that was removed in Python 3. According to https://stackoverflow.com/a/437857/6083 `2to3` replaces that with this, which ensures the filename is associated with the code for debugging purposes: ``` with open(""somefile.py"") as f: code = compile(f.read(), ""somefile.py"", 'exec') exec(code, global_vars, local_vars) ``` Implementing it this way would force this kind of plugin to be self-contained in a single file. I think that's OK: if you want a more complex plugin you can use the standard pluggy-powered setuptools mechanism to build it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/issues/211#issuecomment-381462005,https://api.github.com/repos/simonw/datasette/issues/211,381462005,MDEyOklzc3VlQ29tbWVudDM4MTQ2MjAwNQ==,9599,simonw,2018-04-16T02:23:07Z,2018-04-16T02:23:07Z,OWNER,This needs unit tests. I also need to manually test the `datasette package` and `datesette publish` commands.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/issues/211#issuecomment-381478217,https://api.github.com/repos/simonw/datasette/issues/211,381478217,MDEyOklzc3VlQ29tbWVudDM4MTQ3ODIxNw==,9599,simonw,2018-04-16T04:41:38Z,2018-04-16T04:41:38Z,OWNER,"Here's the result of running: datasette publish now fivethirtyeight.db \ --plugins-dir=plugins/ --title=""FiveThirtyEight"" --branch=plugins-dir https://datasette-phjtvzwwzl.now.sh/fivethirtyeight-2628db9?sql=select+convert_units%28100%2C+%27m%27%2C+%27ft%27%29 Where `plugins/pint_plugin.py` contains the following: ``` from datasette import hookimpl import pint ureg = pint.UnitRegistry() @hookimpl def prepare_connection(conn): def convert_units(amount, from_, to_): ""select convert_units(100, 'm', 'ft');"" return (amount * ureg(from_)).to(to_).to_tuple()[0] conn.create_function('convert_units', 3, convert_units) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/issues/211#issuecomment-381478253,https://api.github.com/repos/simonw/datasette/issues/211,381478253,MDEyOklzc3VlQ29tbWVudDM4MTQ3ODI1Mw==,9599,simonw,2018-04-16T04:42:02Z,2018-04-16T04:42:02Z,OWNER,"This worked as well: datasette package fivethirtyeight.db \ --plugins-dir=plugins/ --title=""FiveThirtyEight"" --branch=plugins-dir ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/issues/211#issuecomment-381481990,https://api.github.com/repos/simonw/datasette/issues/211,381481990,MDEyOklzc3VlQ29tbWVudDM4MTQ4MTk5MA==,9599,simonw,2018-04-16T05:14:57Z,2018-04-16T05:14:57Z,OWNER,Added unit tests in 33c6bcadb962457be6b0c7f369826b404e2bcef5,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/issues/211#issuecomment-381482407,https://api.github.com/repos/simonw/datasette/issues/211,381482407,MDEyOklzc3VlQ29tbWVudDM4MTQ4MjQwNw==,9599,simonw,2018-04-16T05:18:29Z,2018-04-16T05:18:29Z,OWNER,"Here's the result of running this: datasette publish heroku fivethirtyeight.db \ --plugins-dir=plugins/ --title=""FiveThirtyEight"" --branch=plugins-dir https://intense-river-24599.herokuapp.com/fivethirtyeight-2628db9?sql=select+convert_units%28100%2C+%27m%27%2C+%27ft%27%29","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314471743,Load plugins from a `--plugins-dir=plugins/` directory, https://github.com/simonw/datasette/pull/209#issuecomment-381483301,https://api.github.com/repos/simonw/datasette/issues/209,381483301,MDEyOklzc3VlQ29tbWVudDM4MTQ4MzMwMQ==,9599,simonw,2018-04-16T05:25:08Z,2018-04-16T05:25:08Z,OWNER,I think this is a good improvement. If you fix the tests I'll merge it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/issues/191#issuecomment-381488049,https://api.github.com/repos/simonw/datasette/issues/191,381488049,MDEyOklzc3VlQ29tbWVudDM4MTQ4ODA0OQ==,9599,simonw,2018-04-16T05:58:15Z,2018-04-16T05:58:15Z,OWNER,"I think this is pretty hard. @coleifer has done some work in this direction, including https://github.com/coleifer/pysqlite3 which ports the standalone pysqlite module to Python 3. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/214#issuecomment-381490361,https://api.github.com/repos/simonw/datasette/issues/214,381490361,MDEyOklzc3VlQ29tbWVudDM4MTQ5MDM2MQ==,9599,simonw,2018-04-16T06:13:02Z,2018-04-16T06:13:02Z,OWNER,"Packaging JS and CSS in a pip installable wheel is fiddly but possible. http://peak.telecommunity.com/DevCenter/PythonEggs#accessing-package-resources from pkg_resources import resource_string foo_config = resource_string(__name__, 'foo.conf')","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/issues/214#issuecomment-381491707,https://api.github.com/repos/simonw/datasette/issues/214,381491707,MDEyOklzc3VlQ29tbWVudDM4MTQ5MTcwNw==,9599,simonw,2018-04-16T06:21:23Z,2018-04-16T06:21:23Z,OWNER,This looks like a good example: https://github.com/funkey/nyroglancer/commit/d4438ab42171360b2b8e9020f672846dd70c8d80,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/issues/191#issuecomment-381602005,https://api.github.com/repos/simonw/datasette/issues/191,381602005,MDEyOklzc3VlQ29tbWVudDM4MTYwMjAwNQ==,119974,coleifer,2018-04-16T13:37:32Z,2018-04-16T13:37:32Z,NONE,I don't think it should be too difficult... you can look at what @ghaering did with pysqlite (and similarly what I copied for pysqlite3). You would theoretically take an amalgamation build of Sqlite (all code in a single .c and .h file). The `AmalgamationLibSqliteBuilder` class detects the presence of this amalgamated source file and builds a statically-linked pysqlite.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/14#issuecomment-381611738,https://api.github.com/repos/simonw/datasette/issues/14,381611738,MDEyOklzc3VlQ29tbWVudDM4MTYxMTczOA==,9599,simonw,2018-04-16T14:07:30Z,2018-04-16T14:07:30Z,OWNER,I should check if it's possible to have two template registration function plugins in a single plugin module. If it isn't maybe I should use class plugins instead of module plugins.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/214#issuecomment-381612585,https://api.github.com/repos/simonw/datasette/issues/214,381612585,MDEyOklzc3VlQ29tbWVudDM4MTYxMjU4NQ==,9599,simonw,2018-04-16T14:10:16Z,2018-04-16T14:10:16Z,OWNER,`resource_stream` returns a file-like object which may be better for serving from Sanic.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/issues/14#issuecomment-381621338,https://api.github.com/repos/simonw/datasette/issues/14,381621338,MDEyOklzc3VlQ29tbWVudDM4MTYyMTMzOA==,9599,simonw,2018-04-16T14:36:27Z,2018-04-16T14:36:27Z,OWNER,"Annoyingly, the following only results in the last of the two `prepare_connection` hooks being registered: ``` from datasette import hookimpl import pint import random ureg = pint.UnitRegistry() @hookimpl def prepare_connection(conn): def convert_units(amount, from_, to_): ""select convert_units(100, 'm', 'ft');"" return (amount * ureg(from_)).to(to_).to_tuple()[0] conn.create_function('convert_units', 3, convert_units) @hookimpl def prepare_connection(conn): conn.create_function('random_integer', 2, random.randint) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-381622793,https://api.github.com/repos/simonw/datasette/issues/14,381622793,MDEyOklzc3VlQ29tbWVudDM4MTYyMjc5Mw==,9599,simonw,2018-04-16T14:40:39Z,2018-04-17T01:47:15Z,OWNER,"I think that's OK. The two plugins I've implemented so far (`prepare_connection` and `prepare_jinja2_environment`) both make sense if they can only be defined once-per-plugin. For the moment I'll assume I can define future hooks to work well with the same limitation. The syntactic sugar idea in #220 can help here too.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/216#issuecomment-381643173,https://api.github.com/repos/simonw/datasette/issues/216,381643173,MDEyOklzc3VlQ29tbWVudDM4MTY0MzE3Mw==,9599,simonw,2018-04-16T15:21:17Z,2018-04-16T15:21:17Z,OWNER,"Yikes, definitely a bug.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381644355,https://api.github.com/repos/simonw/datasette/issues/216,381644355,MDEyOklzc3VlQ29tbWVudDM4MTY0NDM1NQ==,9599,simonw,2018-04-16T15:24:38Z,2018-04-16T15:24:38Z,OWNER,"So there are two tricky problems to solve here: * I need a way of encoding `null` into that `_next=` that is unambiguous from the string `None` or `null`. This means introducing some kind of escaping mechanism in those strings. I already use URL encoding as part of the construction of those components here, maybe that can help here? * I need to figure out what the SQL should be for the ""next"" set of results if the previous value was null. Thankfully we use the primary key as a tie-breaker so this shouldn't be impossible.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381645274,https://api.github.com/repos/simonw/datasette/issues/216,381645274,MDEyOklzc3VlQ29tbWVudDM4MTY0NTI3NA==,9599,simonw,2018-04-16T15:27:16Z,2018-04-16T15:27:16Z,OWNER,"Relevant code: https://github.com/simonw/datasette/blob/904f1c75a3c17671d25c53b91e177c249d14ab3b/datasette/app.py#L828-L832","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381645973,https://api.github.com/repos/simonw/datasette/issues/216,381645973,MDEyOklzc3VlQ29tbWVudDM4MTY0NTk3Mw==,9599,simonw,2018-04-16T15:29:11Z,2018-04-16T15:29:11Z,OWNER,"I could use `$null` as a magic value that means None. Since I'm applying `quote_plus()` to actual values, any legit strings that look like this will be encoded as `%24null`: ``` >>> urllib.parse.quote_plus('$null') '%24null' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381648053,https://api.github.com/repos/simonw/datasette/issues/216,381648053,MDEyOklzc3VlQ29tbWVudDM4MTY0ODA1Mw==,9599,simonw,2018-04-16T15:35:17Z,2018-04-16T15:35:17Z,OWNER,"I think the correct SQL is this: https://datasette-issue-189-demo-3.now.sh/salaries-7859114-7859114?sql=select+rowid%2C+*+from+%5B2017+Maryland+state+salaries%5D%0D%0Awhere+%28middle_initial+is+not+null+or+%28middle_initial+is+null+and+rowid+%3E+%3Ap0%29%29%0D%0Aorder+by+middle_initial+limit+101&p0=391 ``` select rowid, * from [2017 Maryland state salaries] where (middle_initial is not null or (middle_initial is null and rowid > :p0)) order by middle_initial limit 101 ``` Though this will also need to be taken into account for #198 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381649140,https://api.github.com/repos/simonw/datasette/issues/216,381649140,MDEyOklzc3VlQ29tbWVudDM4MTY0OTE0MA==,9599,simonw,2018-04-16T15:38:29Z,2018-04-16T15:38:29Z,OWNER,But what would that SQL look like for `_sort_desc`?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381649437,https://api.github.com/repos/simonw/datasette/issues/216,381649437,MDEyOklzc3VlQ29tbWVudDM4MTY0OTQzNw==,9599,simonw,2018-04-16T15:39:21Z,2018-04-16T15:39:21Z,OWNER,"Here's where that SQL gets constructed at the moment: https://github.com/simonw/datasette/blob/10a34f995c70daa37a8a2aa02c3135a4b023a24c/datasette/app.py#L761-L771","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/pull/209#issuecomment-381738137,https://api.github.com/repos/simonw/datasette/issues/209,381738137,MDEyOklzc3VlQ29tbWVudDM4MTczODEzNw==,45057,russss,2018-04-16T20:27:43Z,2018-04-16T20:27:43Z,CONTRIBUTOR,"Tests now fixed, honest. The failing test on Travis looks like an intermittent sqlite failure which should resolve itself on a retry...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/issues/203#issuecomment-381763651,https://api.github.com/repos/simonw/datasette/issues/203,381763651,MDEyOklzc3VlQ29tbWVudDM4MTc2MzY1MQ==,45057,russss,2018-04-16T21:59:17Z,2018-04-16T21:59:17Z,CONTRIBUTOR,"Ah, I had no idea you could bind python functions into sqlite! I think the primary purpose of this issue has been served now - I'm going to close this and create a new issue for the only bit of this that hasn't been touched yet, which is (optionally) exposing units in the JSON API.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",313837303,Support for units, https://github.com/simonw/datasette/issues/220#issuecomment-381777108,https://api.github.com/repos/simonw/datasette/issues/220,381777108,MDEyOklzc3VlQ29tbWVudDM4MTc3NzEwOA==,9599,simonw,2018-04-16T23:04:04Z,2018-04-16T23:04:04Z,OWNER,This could also help workaround the current predicament that a single plugin can only define one prepare_connection hook.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314847571,Investigate syntactic sugar for plugins, https://github.com/simonw/datasette/issues/216#issuecomment-381786522,https://api.github.com/repos/simonw/datasette/issues/216,381786522,MDEyOklzc3VlQ29tbWVudDM4MTc4NjUyMg==,9599,simonw,2018-04-16T23:58:45Z,2018-04-16T23:59:13Z,OWNER,"Weird... tests are failing in Travis, despite passing on my local machine. https://travis-ci.org/simonw/datasette/builds/367423706","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381788051,https://api.github.com/repos/simonw/datasette/issues/216,381788051,MDEyOklzc3VlQ29tbWVudDM4MTc4ODA1MQ==,9599,simonw,2018-04-17T00:07:48Z,2018-04-17T00:07:48Z,OWNER,Still failing. This is very odd.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381794744,https://api.github.com/repos/simonw/datasette/issues/216,381794744,MDEyOklzc3VlQ29tbWVudDM4MTc5NDc0NA==,9599,simonw,2018-04-17T00:51:41Z,2018-04-17T00:51:41Z,OWNER,I'm reverting this out of master until I can figure out why the tests are failing.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381798786,https://api.github.com/repos/simonw/datasette/issues/216,381798786,MDEyOklzc3VlQ29tbWVudDM4MTc5ODc4Ng==,9599,simonw,2018-04-17T01:18:25Z,2018-04-17T01:18:25Z,OWNER,"Here's the test that's failing: https://github.com/simonw/datasette/blob/59a3aa859c0e782aeda9a515b1b52c358e8458a2/tests/test_api.py#L437-L470 I got Travis to spit out the `fetched` and `expected` variables. `expected` has 201 items in it and is identical to what I get on my local laptop. `fetched` has 250 items in it, so it's clearly different from my local environment. I've managed to replicate the bug in production! I created a test database like this: python tests/fixtures.py sortable.db Then deployed that database like so: datasette publish now sortable.db \ --extra-options=""--page_size=50"" --branch=debug-travis-issue-216 And... if you click ""next"" on this page https://datasette-issue-216-pagination.now.sh/sortable-5679797/sortable?_sort_desc=sortable_with_nulls five times you get back 250 results, when you should only get back 201.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381799267,https://api.github.com/repos/simonw/datasette/issues/216,381799267,MDEyOklzc3VlQ29tbWVudDM4MTc5OTI2Nw==,9599,simonw,2018-04-17T01:21:35Z,2018-04-17T01:21:35Z,OWNER,"The version that I deployed which exhibits the bug is running SQLite `3.8.7.1` - https://datasette-issue-216-pagination.now.sh/sortable-5679797?sql=select+sqlite_version%28%29 The version that I have running locally which does NOT exhibit the bug is running SQLite `3.23.0`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381799408,https://api.github.com/repos/simonw/datasette/issues/216,381799408,MDEyOklzc3VlQ29tbWVudDM4MTc5OTQwOA==,9599,simonw,2018-04-17T01:22:30Z,2018-04-17T01:22:30Z,OWNER,"... which is VERY surprising, because `3.23.0` only came out on 2nd April this year: https://www.sqlite.org/changes.html - I have no idea how I came to be running that version on my laptop.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381801302,https://api.github.com/repos/simonw/datasette/issues/216,381801302,MDEyOklzc3VlQ29tbWVudDM4MTgwMTMwMg==,9599,simonw,2018-04-17T01:33:43Z,2018-04-17T01:33:43Z,OWNER,"This is the SQL that returns differing results in production and on my laptop: https://datasette-issue-216-pagination.now.sh/sortable-5679797?sql=select+%2A+from+sortable+where+%28sortable_with_nulls+is+null+and+%28%28pk1+%3E+%3Ap0%29%0A++or%0A%28pk1+%3D+%3Ap0+and+pk2+%3E+%3Ap1%29%29%29+order+by+sortable_with_nulls+desc+limit+51&p0=b&p1=t ``` select * from sortable where (sortable_with_nulls is null and ((pk1 > :p0) or (pk1 = :p0 and pk2 > :p1))) order by sortable_with_nulls desc limit 51 ``` I think that `order by sortable_with_nulls desc` bit is at fault - the primary keys should be included in that order by as well. Sure enough, changing the query to this one returns the same results across both environments: ``` select * from sortable where (sortable_with_nulls is null and ((pk1 > :p0) or (pk1 = :p0 and pk2 > :p1))) order by sortable_with_nulls desc, pk1, pk2 limit 51 ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/216#issuecomment-381803157,https://api.github.com/repos/simonw/datasette/issues/216,381803157,MDEyOklzc3VlQ29tbWVudDM4MTgwMzE1Nw==,9599,simonw,2018-04-17T01:45:24Z,2018-04-17T01:45:24Z,OWNER,Fixed!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314665147,Bug: Sort by column with NULL in next_page URL, https://github.com/simonw/datasette/issues/14#issuecomment-381809998,https://api.github.com/repos/simonw/datasette/issues/14,381809998,MDEyOklzc3VlQ29tbWVudDM4MTgwOTk5OA==,9599,simonw,2018-04-17T02:23:39Z,2018-04-17T02:23:39Z,OWNER,I just shipped Datasette 0.19 with where I'm at so far: https://github.com/simonw/datasette/releases/tag/0.19,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/pull/209#issuecomment-381905593,https://api.github.com/repos/simonw/datasette/issues/209,381905593,MDEyOklzc3VlQ29tbWVudDM4MTkwNTU5Mw==,45057,russss,2018-04-17T08:50:28Z,2018-04-17T08:50:28Z,CONTRIBUTOR,"I've added another commit which puts classes a class on each `` by default with its column name, and I've also made the PK column bold. Unfortunately the tests are still failing on 3.6, which is weird. I can't reproduce locally...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/issues/214#issuecomment-382038613,https://api.github.com/repos/simonw/datasette/issues/214,382038613,MDEyOklzc3VlQ29tbWVudDM4MjAzODYxMw==,9599,simonw,2018-04-17T15:38:23Z,2018-04-17T15:38:23Z,OWNER,"I figured out the recipe for bundling static assets in a plugin: https://github.com/simonw/datasette-plugin-demos/commit/26c5548f4ab7c6cc6d398df17767950be50d0edf (and then `python3 setup.py bdist_wheel`) Having done that, I ran `pip install ../datasette-plugin-demos/dist/datasette_plugin_demos-0.2-py3-none-any.whl` from my Datasette virtual environment and then did the following: ``` >>> import pkg_resources >>> pkg_resources.resource_stream( ... 'datasette_plugin_demos', 'static/plugin.js' ... ).read() b""alert('hello');\n"" >>> pkg_resources.resource_filename( ... 'datasette_plugin_demos', 'static/plugin.js' ... ) '..../venv/lib/python3.6/site-packages/datasette_plugin_demos/static/plugin.js' >>> pkg_resources.resource_string( ... 'datasette_plugin_demos', 'static/plugin.js' ... ) b""alert('hello');\n"" ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/issues/214#issuecomment-382048582,https://api.github.com/repos/simonw/datasette/issues/214,382048582,MDEyOklzc3VlQ29tbWVudDM4MjA0ODU4Mg==,9599,simonw,2018-04-17T16:04:42Z,2018-04-18T02:24:46Z,OWNER,"One possible option: let plugins bundle their own `static/` directory and then register themselves with Datasette, then have `/-/static-plugins/name-of-plugin/...` serve files from that directory.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/issues/214#issuecomment-382069980,https://api.github.com/repos/simonw/datasette/issues/214,382069980,MDEyOklzc3VlQ29tbWVudDM4MjA2OTk4MA==,9599,simonw,2018-04-17T17:08:28Z,2018-04-17T17:08:28Z,OWNER,"Even if we automatically serve ALL `static/` content from installed plugins, we'll still need them to register which files need to be linked to from `extra_css_urls` and `extra_js_urls`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506446,Ability for plugins to define extra JavaScript and CSS, https://github.com/simonw/datasette/pull/209#issuecomment-382205189,https://api.github.com/repos/simonw/datasette/issues/209,382205189,MDEyOklzc3VlQ29tbWVudDM4MjIwNTE4OQ==,9599,simonw,2018-04-18T00:42:44Z,2018-04-18T00:43:02Z,OWNER,"I managed to get a better error message out of that test. The server is returning this (but only on Python 3.6, not on Python 3.5 - and only in Travis, not in my local environment): ```{'error': 'interrupted', 'ok': False, 'status': 400, 'title': 'Invalid SQL'}``` https://travis-ci.org/simonw/datasette/jobs/367929134","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/pull/209#issuecomment-382210976,https://api.github.com/repos/simonw/datasette/issues/209,382210976,MDEyOklzc3VlQ29tbWVudDM4MjIxMDk3Ng==,9599,simonw,2018-04-18T01:12:26Z,2018-04-18T01:12:26Z,OWNER,"OK, aaf59db570ab7688af72c08bb5bc1edc145e3e07 should mean that the tests pass when I merge that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314455877, Don't duplicate simple primary keys in the link column, https://github.com/simonw/datasette/issues/14#issuecomment-382256729,https://api.github.com/repos/simonw/datasette/issues/14,382256729,MDEyOklzc3VlQ29tbWVudDM4MjI1NjcyOQ==,9599,simonw,2018-04-18T04:29:29Z,2018-04-18T04:30:14Z,OWNER,I added a mechanism for plugins to serve static files and define custom CSS and JS URLs in #214 - see new documentation on http://datasette.readthedocs.io/en/latest/plugins.html#static-assets and http://datasette.readthedocs.io/en/latest/plugins.html#extra-css-urls,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/223#issuecomment-382408128,https://api.github.com/repos/simonw/datasette/issues/223,382408128,MDEyOklzc3VlQ29tbWVudDM4MjQwODEyOA==,9599,simonw,2018-04-18T14:33:09Z,2018-04-18T14:33:09Z,OWNER,"Demo: datasette publish now sortable.db --install datasette-plugin-demos --branch=master Produced this deployment, with both the `random_integer()` function and the static file from https://github.com/simonw/datasette-plugin-demos/tree/0.2 https://datasette-issue-223.now.sh/-/static-plugins/datasette_plugin_demos/plugin.js https://datasette-issue-223.now.sh/sortable-4bbaa6f?sql=select+random_integer%280%2C+10%29 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315327860,datasette publish --install=name-of-plugin, https://github.com/simonw/datasette/issues/223#issuecomment-382409989,https://api.github.com/repos/simonw/datasette/issues/223,382409989,MDEyOklzc3VlQ29tbWVudDM4MjQwOTk4OQ==,9599,simonw,2018-04-18T14:38:08Z,2018-04-18T14:38:08Z,OWNER,"Tested on Heroku as well. datasette publish heroku sortable.db --install datasette-plugin-demos --branch=master https://morning-tor-45944.herokuapp.com/-/static-plugins/datasette_plugin_demos/plugin.js https://morning-tor-45944.herokuapp.com/sortable-4bbaa6f?sql=select+random_integer%280%2C+10%29","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315327860,datasette publish --install=name-of-plugin, https://github.com/simonw/datasette/issues/223#issuecomment-382413121,https://api.github.com/repos/simonw/datasette/issues/223,382413121,MDEyOklzc3VlQ29tbWVudDM4MjQxMzEyMQ==,9599,simonw,2018-04-18T14:47:18Z,2018-04-18T14:47:18Z,OWNER,"And tested `datasette package` - this time exercising the ability to pass more than one `--install` option: ``` $ datasette package sortable.db --branch=master --install requests --install datasette-plugin-demos Sending build context to Docker daemon 125.4kB Step 1/7 : FROM python:3 ---> 79e1dc9af1c1 Step 2/7 : COPY . /app ---> 6e8e40bce378 Step 3/7 : WORKDIR /app Removing intermediate container 7cdc9ab20d09 ---> f42258c2211f Step 4/7 : RUN pip install https://github.com/simonw/datasette/archive/master.zip requests datasette-plugin-demos ---> Running in a0f17cec08a4 Collecting ... Removing intermediate container a0f17cec08a4 ---> beea84e73271 Step 5/7 : RUN datasette inspect sortable.db --inspect-file inspect-data.json ---> Running in 4daa28792348 Removing intermediate container 4daa28792348 ---> c60312d21b99 Step 6/7 : EXPOSE 8001 ---> Running in fa728468482d Removing intermediate container fa728468482d ---> 8f219a61fddc Step 7/7 : CMD [""datasette"", ""serve"", ""--host"", ""0.0.0.0"", ""sortable.db"", ""--cors"", ""--port"", ""8001"", ""--inspect-file"", ""inspect-data.json""] ---> Running in cd4eaeb2ce9e Removing intermediate container cd4eaeb2ce9e ---> 066e257c7c44 Successfully built 066e257c7c44 (venv) datasette $ docker run -p 8081:8001 066e257c7c44 Serve! files=('sortable.db',) on port 8001 [2018-04-18 14:40:18 +0000] [1] [INFO] Goin' Fast @ http://0.0.0.0:8001 [2018-04-18 14:40:18 +0000] [1] [INFO] Starting worker [1] [2018-04-18 14:46:01 +0000] - (sanic.access)[INFO][1:7]: GET http://localhost:8081/-/static-plugins/datasette_plugin_demos/plugin.js 200 16 ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315327860,datasette publish --install=name-of-plugin, https://github.com/simonw/datasette/issues/224#issuecomment-382616527,https://api.github.com/repos/simonw/datasette/issues/224,382616527,MDEyOklzc3VlQ29tbWVudDM4MjYxNjUyNw==,9599,simonw,2018-04-19T05:40:28Z,2018-04-19T05:40:28Z,OWNER,"No need to use `PackageLoader` after all, we can use the same mechanism we used for the static path: https://github.com/simonw/datasette/blob/b55809a1e20986bb2e638b698815a77902e8708d/datasette/utils.py#L694-L695","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315517578,Ability for plugins to bundle templates, https://github.com/simonw/datasette/issues/227#issuecomment-382808266,https://api.github.com/repos/simonw/datasette/issues/227,382808266,MDEyOklzc3VlQ29tbWVudDM4MjgwODI2Ng==,9599,simonw,2018-04-19T16:59:23Z,2018-04-19T16:59:23Z,OWNER,"Maybe this should have a second argument indicating which codepath was being handled. That way plugins could say ""only inject this extra context variable on the row page"".","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/228#issuecomment-382924910,https://api.github.com/repos/simonw/datasette/issues/228,382924910,MDEyOklzc3VlQ29tbWVudDM4MjkyNDkxMA==,9599,simonw,2018-04-20T00:35:48Z,2018-04-20T00:35:48Z,OWNER,"Hiding tables with the `idx_` prefix should be good enough here, since false positives aren't very harmful.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316031566,"If spatialite detected, mark idx_XXX_Geometry tables as hidden", https://github.com/simonw/datasette/issues/227#issuecomment-382958693,https://api.github.com/repos/simonw/datasette/issues/227,382958693,MDEyOklzc3VlQ29tbWVudDM4Mjk1ODY5Mw==,9599,simonw,2018-04-20T03:15:52Z,2018-04-20T03:15:52Z,OWNER,"A better way to do this would be with many different plugin hooks, one for each view.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/227#issuecomment-382959857,https://api.github.com/repos/simonw/datasette/issues/227,382959857,MDEyOklzc3VlQ29tbWVudDM4Mjk1OTg1Nw==,9599,simonw,2018-04-20T03:21:43Z,2018-04-20T03:21:43Z,OWNER,"Plus a generic prepare_context() hook called in the common render method. prepare_context_table(), prepare_context_row() etc Arguments are context, request, self (hence can access self.ds) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/227#issuecomment-382964794,https://api.github.com/repos/simonw/datasette/issues/227,382964794,MDEyOklzc3VlQ29tbWVudDM4Mjk2NDc5NA==,9599,simonw,2018-04-20T03:45:18Z,2018-04-20T03:45:18Z,OWNER,"What if the context needs to make await calls? One possible option: plugins can either manipulate the context in place OR they can return an awaitable. If they do that, the caller will await it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/227#issuecomment-382966604,https://api.github.com/repos/simonw/datasette/issues/227,382966604,MDEyOklzc3VlQ29tbWVudDM4Mjk2NjYwNA==,9599,simonw,2018-04-20T03:54:56Z,2018-04-20T03:54:56Z,OWNER,Should this differentiate between preparing the data to be sent back as JSON and preparing the context for the template?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/227#issuecomment-382967238,https://api.github.com/repos/simonw/datasette/issues/227,382967238,MDEyOklzc3VlQ29tbWVudDM4Mjk2NzIzOA==,9599,simonw,2018-04-20T03:58:09Z,2018-04-20T03:58:09Z,OWNER,Maybe prepare_table_data() vs prepare_table_context(),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/230#issuecomment-383109984,https://api.github.com/repos/simonw/datasette/issues/230,383109984,MDEyOklzc3VlQ29tbWVudDM4MzEwOTk4NA==,9599,simonw,2018-04-20T14:15:39Z,2018-04-20T14:15:39Z,OWNER,Refs #229,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316128955,Setting page size AND max returned rows to 1000 doesn't seem to work, https://github.com/simonw/datasette/issues/14#issuecomment-383139889,https://api.github.com/repos/simonw/datasette/issues/14,383139889,MDEyOklzc3VlQ29tbWVudDM4MzEzOTg4OQ==,9599,simonw,2018-04-20T15:51:47Z,2018-04-20T15:51:47Z,OWNER,"I released everything we have so far in [Datasette 0.20](https://github.com/simonw/datasette/releases/tag/0.20) and built and released an example plugin, [datasette-cluster-map](https://pypi.org/project/datasette-cluster-map/). Here's my blog entry about it: https://simonwillison.net/2018/Apr/20/datasette-plugins/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/14#issuecomment-383140111,https://api.github.com/repos/simonw/datasette/issues/14,383140111,MDEyOklzc3VlQ29tbWVudDM4MzE0MDExMQ==,9599,simonw,2018-04-20T15:52:33Z,2018-04-20T15:52:33Z,OWNER,Here's a link demonstrating my new plugin: https://datasette-cluster-map-demo.now.sh/polar-bears-455fe3a/USGS_WC_eartags_output_files_2009-2011-Status,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/pull/232#issuecomment-383252624,https://api.github.com/repos/simonw/datasette/issues/232,383252624,MDEyOklzc3VlQ29tbWVudDM4MzI1MjYyNA==,9599,simonw,2018-04-21T00:19:00Z,2018-04-21T00:19:00Z,OWNER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316365426,Fix a typo, https://github.com/simonw/datasette/issues/231#issuecomment-383315348,https://api.github.com/repos/simonw/datasette/issues/231,383315348,MDEyOklzc3VlQ29tbWVudDM4MzMxNTM0OA==,9599,simonw,2018-04-21T17:37:50Z,2018-04-22T23:06:04Z,OWNER,"I could also have an `""autodetect"": false` option for that plugin to turn off autodetecting entirely. Would be useful if the plugin didn't append its JavaScript in pages that it wasn't used for - that might require making the `extra_js_urls()` hook optionally aware of the columns and table and metadata.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316323336,metadata.json support for plugin configuration options, https://github.com/simonw/datasette/issues/234#issuecomment-383398182,https://api.github.com/repos/simonw/datasette/issues/234,383398182,MDEyOklzc3VlQ29tbWVudDM4MzM5ODE4Mg==,9599,simonw,2018-04-22T17:31:12Z,2018-04-22T17:31:12Z,OWNER,"```{ ""databases"": { ""database1"": { ""tables"": { ""example_table"": { ""label_column"": ""name"" } } } } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316526433,label_column option in metadata.json, https://github.com/simonw/datasette/issues/234#issuecomment-383399762,https://api.github.com/repos/simonw/datasette/issues/234,383399762,MDEyOklzc3VlQ29tbWVudDM4MzM5OTc2Mg==,9599,simonw,2018-04-22T17:54:39Z,2018-04-22T17:54:39Z,OWNER,Docs here: http://datasette.readthedocs.io/en/latest/metadata.html#specifying-the-label-column-for-a-table,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316526433,label_column option in metadata.json, https://github.com/simonw/datasette/issues/234#issuecomment-383410146,https://api.github.com/repos/simonw/datasette/issues/234,383410146,MDEyOklzc3VlQ29tbWVudDM4MzQxMDE0Ng==,9599,simonw,2018-04-22T20:32:30Z,2018-04-22T20:47:02Z,OWNER,"I built this wrong: my implementation is looking for the `label_column` on the table-being-displayed, but it should be looking for it on the table-the-foreign-key-links-to.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316526433,label_column option in metadata.json, https://github.com/simonw/datasette/issues/235#issuecomment-383727973,https://api.github.com/repos/simonw/datasette/issues/235,383727973,MDEyOklzc3VlQ29tbWVudDM4MzcyNzk3Mw==,9599,simonw,2018-04-23T21:23:59Z,2018-04-23T21:23:59Z,OWNER,"There might also be something clever we can do here with PRAGMA statements: https://stackoverflow.com/questions/14146881/limit-the-maximum-amount-of-memory-sqlite3-uses And https://www.sqlite.org/pragma.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316621102,Add limit on the size in KB of data returned from a single query, https://github.com/simonw/datasette/issues/235#issuecomment-383764533,https://api.github.com/repos/simonw/datasette/issues/235,383764533,MDEyOklzc3VlQ29tbWVudDM4Mzc2NDUzMw==,9599,simonw,2018-04-24T00:30:02Z,2018-04-24T00:30:02Z,OWNER,The `resource` module in he standard library has the ability to set limits on memory usage for the current process: https://pymotw.com/2/resource/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316621102,Add limit on the size in KB of data returned from a single query, https://github.com/simonw/datasette/issues/238#issuecomment-384362028,https://api.github.com/repos/simonw/datasette/issues/238,384362028,MDEyOklzc3VlQ29tbWVudDM4NDM2MjAyOA==,9599,simonw,2018-04-25T17:07:11Z,2018-04-25T17:07:11Z,OWNER,"On further thought: this is actually only an issue for immutable deployments to platforms like Zeit Now and Heroku. As such, adding it to `datasette serve` feels clumsy. Maybe `datasette publish` should instead gain the ability to optionally install an extra mechanism that periodically pulls a fresh copy of `metadata.json` from a URL.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317714268,External metadata.json, https://github.com/simonw/datasette/issues/239#issuecomment-384500327,https://api.github.com/repos/simonw/datasette/issues/239,384500327,MDEyOklzc3VlQ29tbWVudDM4NDUwMDMyNw==,9599,simonw,2018-04-26T03:18:12Z,2018-04-26T03:18:20Z,OWNER,"``` { ""databases"": { ""database1"": { ""tables"": { ""example_table"": { ""hidden"": true } } } } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317760361,Support for hidden tables in metadata.json, https://github.com/simonw/datasette/issues/239#issuecomment-384503873,https://api.github.com/repos/simonw/datasette/issues/239,384503873,MDEyOklzc3VlQ29tbWVudDM4NDUwMzg3Mw==,9599,simonw,2018-04-26T03:45:11Z,2018-04-26T03:45:11Z,OWNER,Documentation: http://datasette.readthedocs.io/en/latest/metadata.html#hiding-tables,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317760361,Support for hidden tables in metadata.json, https://github.com/simonw/datasette/issues/229#issuecomment-384512192,https://api.github.com/repos/simonw/datasette/issues/229,384512192,MDEyOklzc3VlQ29tbWVudDM4NDUxMjE5Mg==,9599,simonw,2018-04-26T04:49:46Z,2018-04-26T04:49:46Z,OWNER,Documentation: http://datasette.readthedocs.io/en/latest/json_api.html#special-table-arguments,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316123256,Table view should support ?_size=400 parameter, https://github.com/simonw/datasette/issues/79#issuecomment-384675792,https://api.github.com/repos/simonw/datasette/issues/79,384675792,MDEyOklzc3VlQ29tbWVudDM4NDY3NTc5Mg==,9599,simonw,2018-04-26T15:08:13Z,2018-04-26T15:08:13Z,OWNER,"Docs now live at http://datasette.readthedocs.io/ I still need to document a few more parts of the API before closing this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273569068,Add more detailed API documentation to the README, https://github.com/simonw/datasette/issues/44#issuecomment-384676488,https://api.github.com/repos/simonw/datasette/issues/44,384676488,MDEyOklzc3VlQ29tbWVudDM4NDY3NjQ4OA==,9599,simonw,2018-04-26T15:09:57Z,2018-04-26T15:09:57Z,OWNER,Remaining work for this is tracked in #150,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",269731374,?_group_count=country - return counts by specific column(s), https://github.com/simonw/datasette/issues/125#issuecomment-384678319,https://api.github.com/repos/simonw/datasette/issues/125,384678319,MDEyOklzc3VlQ29tbWVudDM4NDY3ODMxOQ==,9599,simonw,2018-04-26T15:14:31Z,2018-04-26T15:14:31Z,OWNER,"I shipped this last week as the first plugin: https://simonwillison.net/2018/Apr/20/datasette-plugins/ Demo: https://datasette-cluster-map-demo.datasettes.com/polar-bears-455fe3a/USGS_WC_eartags_output_files_2009-2011-Status Plugin: https://github.com/simonw/datasette-cluster-map","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275135393,Plot rows on a map with Leaflet and Leaflet.markercluster, https://github.com/simonw/datasette/issues/244#issuecomment-386309928,https://api.github.com/repos/simonw/datasette/issues/244,386309928,MDEyOklzc3VlQ29tbWVudDM4NjMwOTkyOA==,9599,simonw,2018-05-03T14:13:49Z,2018-05-03T14:13:49Z,OWNER,Demo: https://datasette-versions-and-shape-demo.now.sh/-/versions,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",318738000,/-/versions page, https://github.com/simonw/datasette/issues/245#issuecomment-386310149,https://api.github.com/repos/simonw/datasette/issues/245,386310149,MDEyOklzc3VlQ29tbWVudDM4NjMxMDE0OQ==,9599,simonw,2018-05-03T14:14:33Z,2018-05-03T14:14:33Z,OWNER,"Demos: * https://datasette-versions-and-shape-demo.now.sh/sf-trees-02c8ef1/qSpecies.json?_shape=array * https://datasette-versions-and-shape-demo.now.sh/sf-trees-02c8ef1/qSpecies.json?_shape=object * https://datasette-versions-and-shape-demo.now.sh/sf-trees-02c8ef1/qSpecies.json?_shape=arrays * https://datasette-versions-and-shape-demo.now.sh/sf-trees-02c8ef1/qSpecies.json?_shape=objects","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",319358200,?_shape=array option, https://github.com/simonw/datasette/issues/248#issuecomment-386357645,https://api.github.com/repos/simonw/datasette/issues/248,386357645,MDEyOklzc3VlQ29tbWVudDM4NjM1NzY0NQ==,9599,simonw,2018-05-03T16:36:59Z,2018-05-03T16:36:59Z,OWNER,"Even better: use `plugin_manager.list_plugin_distinfo()` from pluggy to get back a list of tuples, the second item in each tuple is a `pkg_resources.DistInfoDistribution` with a `.version` attribute.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",319954545,/-/plugins should show version of each installed plugin, https://github.com/simonw/datasette/issues/248#issuecomment-386692333,https://api.github.com/repos/simonw/datasette/issues/248,386692333,MDEyOklzc3VlQ29tbWVudDM4NjY5MjMzMw==,9599,simonw,2018-05-04T18:25:40Z,2018-05-04T18:25:40Z,OWNER,Demo: https://datasette-plugins-and-max-size-demo.now.sh/-/plugins,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",319954545,/-/plugins should show version of each installed plugin, https://github.com/simonw/datasette/issues/249#issuecomment-386692534,https://api.github.com/repos/simonw/datasette/issues/249,386692534,MDEyOklzc3VlQ29tbWVudDM4NjY5MjUzNA==,9599,simonw,2018-05-04T18:26:30Z,2018-05-04T18:26:30Z,OWNER,Demo: https://datasette-plugins-and-max-size-demo.now.sh/sf-trees/Street_Tree_List.json?_size=max,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",320090329,?_size=max argument , https://github.com/simonw/datasette/issues/237#issuecomment-386840307,https://api.github.com/repos/simonw/datasette/issues/237,386840307,MDEyOklzc3VlQ29tbWVudDM4Njg0MDMwNw==,9599,simonw,2018-05-05T22:45:45Z,2018-05-05T22:45:45Z,OWNER,Documented here: http://datasette.readthedocs.io/en/latest/json_api.html#special-table-arguments,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317475156,Support for ?_search_colname=blah searches, https://github.com/simonw/datasette/issues/237#issuecomment-386840806,https://api.github.com/repos/simonw/datasette/issues/237,386840806,MDEyOklzc3VlQ29tbWVudDM4Njg0MDgwNg==,9599,simonw,2018-05-05T22:56:42Z,2018-05-05T22:56:42Z,OWNER,"Demo: datasette publish now ../datasettes/san-francisco/sf-film-locations.db --branch=master --name datasette-column-search-demo https://datasette-column-search-demo.now.sh/sf-film-locations/Film_Locations_in_San_Francisco?_search_Locations=justin","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317475156,Support for ?_search_colname=blah searches, https://github.com/simonw/datasette/issues/251#issuecomment-386879509,https://api.github.com/repos/simonw/datasette/issues/251,386879509,MDEyOklzc3VlQ29tbWVudDM4Njg3OTUwOQ==,9599,simonw,2018-05-06T13:29:26Z,2018-05-06T13:29:26Z,OWNER,"We can solve this using the `sqlite_timelimit(conn, 20)` helper, which can tell SQLite to give up after 20ms. We can wrap that around the following SQL: select distinct COLUMN from TABLE limit 21; Then we look at the number of rows returned. If it's 21 or more we know that this table had more than 21 distinct values, so we'll treat it as ""unlimited"". Likewise, if the SQL times out before 20ms is up we will skip this introspection.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",320592643,"Explore ""distinct values for column"" in inspect()", https://github.com/simonw/datasette/issues/251#issuecomment-386879840,https://api.github.com/repos/simonw/datasette/issues/251,386879840,MDEyOklzc3VlQ29tbWVudDM4Njg3OTg0MA==,9599,simonw,2018-05-06T13:34:24Z,2018-05-06T13:34:24Z,OWNER,"Here's a quick demo of that exploration: https://datasette-distinct-column-values.now.sh/-/inspect Example output: ``` { ""antiquities-act/actions_under_antiquities_act"": { ""columns"": [ ""current_name"", ""states"", ""original_name"", ""current_agency"", ""action"", ""date"", ""year"", ""pres_or_congress"", ""acres_affected"" ], ""count"": 344, ""distinct_values_by_column"": { ""acres_affected"": null, ""action"": null, ""current_agency"": [ ""NPS"", ""State of Montana"", ""BLM"", ""State of Arizona"", ""USFS"", ""State of North Dakota"", ""NPS, BLM"", ""State of South Carolina"", ""State of New York"", ""FWS"", ""FWS, NOAA"", ""NPS, FWS"", ""NOAA"", ""BLM, USFS"", ""NOAA, FWS"" ], ""current_name"": null, ""date"": null, ""original_name"": null, ""pres_or_congress"": null, ""states"": null, ""year"": null }, ""foreign_keys"": { ""incoming"": [], ""outgoing"": [] }, ""fts_table"": null, ""hidden"": false, ""label_column"": null, ""name"": ""antiquities-act/actions_under_antiquities_act"", ""primary_keys"": [] } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",320592643,"Explore ""distinct values for column"" in inspect()", https://github.com/simonw/datasette/issues/251#issuecomment-386879878,https://api.github.com/repos/simonw/datasette/issues/251,386879878,MDEyOklzc3VlQ29tbWVudDM4Njg3OTg3OA==,9599,simonw,2018-05-06T13:34:57Z,2018-05-06T13:34:57Z,OWNER,If I'm going to expand column introspection in this way it would be useful to also capture column type information.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",320592643,"Explore ""distinct values for column"" in inspect()", https://github.com/simonw/datasette/issues/254#issuecomment-388360255,https://api.github.com/repos/simonw/datasette/issues/254,388360255,MDEyOklzc3VlQ29tbWVudDM4ODM2MDI1NQ==,9599,simonw,2018-05-11T13:16:09Z,2018-05-11T22:45:31Z,OWNER,"Do you have an example I can look at? I think I have a possible route for fixing this, but it's pretty tricky (it involves adding a full SQL statement parser, but that's needed for some other potential improvements as well). In the meantime, is this causing actual errors for you or is it more of an inconvenience (form fields being displayed that don't actually do anything)? Another potential solution here could be to allow canned queries to optionally declare their parameters in metadata.json","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322283067,Escaping named parameters in canned queries, https://github.com/simonw/datasette/issues/254#issuecomment-388367027,https://api.github.com/repos/simonw/datasette/issues/254,388367027,MDEyOklzc3VlQ29tbWVudDM4ODM2NzAyNw==,247131,philroche,2018-05-11T13:41:46Z,2018-05-11T13:41:46Z,NONE,"An example deployment @ https://datasette-zkcvlwdrhl.now.sh/simplestreams-270f20c/cloudimage?content_id__exact=com.ubuntu.cloud%3Areleased%3Adownload It is not causing errors, more of an inconvenience. I have worked around it using a `like` query instead. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322283067,Escaping named parameters in canned queries, https://github.com/simonw/datasette/issues/254#issuecomment-388497467,https://api.github.com/repos/simonw/datasette/issues/254,388497467,MDEyOklzc3VlQ29tbWVudDM4ODQ5NzQ2Nw==,9599,simonw,2018-05-11T22:06:00Z,2018-05-11T22:06:34Z,OWNER,"Got it, this seems to trigger the problem: https://datasette-zkcvlwdrhl.now.sh/simplestreams-270f20c?sql=select+*+from+cloudimage+where+%22content_id%22+%3D+%22com.ubuntu.cloud%3Areleased%3Adownload%22+order+by+id+limit+10","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322283067,Escaping named parameters in canned queries, https://github.com/simonw/datasette/issues/255#issuecomment-388525357,https://api.github.com/repos/simonw/datasette/issues/255,388525357,MDEyOklzc3VlQ29tbWVudDM4ODUyNTM1Nw==,9599,simonw,2018-05-12T03:01:14Z,2018-05-12T03:01:14Z,OWNER,Facet counts will be generated by extra SQL queries with their own aggressive time limit.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/253#issuecomment-388550742,https://api.github.com/repos/simonw/datasette/issues/253,388550742,MDEyOklzc3VlQ29tbWVudDM4ODU1MDc0Mg==,9599,simonw,2018-05-12T12:09:02Z,2018-05-12T12:09:02Z,OWNER,http://datasette.readthedocs.io/en/latest/full_text_search.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",321631020,Documentation explaining how to use SQLite FTS with Datasette, https://github.com/simonw/datasette/issues/255#issuecomment-388587855,https://api.github.com/repos/simonw/datasette/issues/255,388587855,MDEyOklzc3VlQ29tbWVudDM4ODU4Nzg1NQ==,9599,simonw,2018-05-12T22:30:23Z,2018-05-12T22:30:23Z,OWNER,Adding some TODOs to the original description (so they show up as a todo progress bar),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388588011,https://api.github.com/repos/simonw/datasette/issues/255,388588011,MDEyOklzc3VlQ29tbWVudDM4ODU4ODAxMQ==,9599,simonw,2018-05-12T22:33:39Z,2018-05-12T22:33:39Z,OWNER,Initial documentation: http://datasette.readthedocs.io/en/latest/facets.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388588998,https://api.github.com/repos/simonw/datasette/issues/255,388588998,MDEyOklzc3VlQ29tbWVudDM4ODU4ODk5OA==,9599,simonw,2018-05-12T22:57:30Z,2018-05-12T23:00:24Z,OWNER,"A few demos: * https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/college-majors%2Fall-ages?_facet=Major_category * https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/congress-age%2Fcongress-terms?_facet=chamber&_facet=state&_facet=party&_facet=incumbent * https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/bechdel%2Fmovies?_facet=binary&_facet=test","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388589072,https://api.github.com/repos/simonw/datasette/issues/255,388589072,MDEyOklzc3VlQ29tbWVudDM4ODU4OTA3Mg==,9599,simonw,2018-05-12T22:59:07Z,2018-05-12T22:59:07Z,OWNER,"I need to decide how to display these. They currently look like this: https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/congress-age%2Fcongress-terms?_facet=chamber&_facet=state&_facet=party&_facet=incumbent&state=MO ![2018-05-12 at 7 58 pm](https://user-images.githubusercontent.com/9599/39962230-e7bf9e10-561e-11e8-80a7-0941b8991318.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/pull/257#issuecomment-388625703,https://api.github.com/repos/simonw/datasette/issues/257,388625703,MDEyOklzc3VlQ29tbWVudDM4ODYyNTcwMw==,9599,simonw,2018-05-13T13:10:09Z,2018-05-13T13:10:09Z,OWNER,"I'm still seeing intermittent Python 3.5 failures due to dictionary ordering differences. https://travis-ci.org/simonw/datasette/jobs/378356802 ``` > assert expected_facet_results == facet_results E AssertionError: assert {'city': [{'c...alue': 'MI'}]} == {'city': [{'co...alue': 'MI'}]} E Omitting 1 identical items, use -vv to show E Differing items: E {'city': [{'count': 4, 'toggle_url': '_facet=state&_facet=city&state=MI&city=Detroit', 'value': 'Detroit'}]} != {'city': [{'count': 4, 'toggle_url': 'state=MI&_facet=state&_facet=city&city=Detroit', 'value': 'Detroit'}]} E Use -v to get the full diff ``` To solve these cleanly I need to be able to run Python 3.5 on my local laptop rather than relying on Travis every time.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322591993,Refactor views, https://github.com/simonw/datasette/pull/257#issuecomment-388626721,https://api.github.com/repos/simonw/datasette/issues/257,388626721,MDEyOklzc3VlQ29tbWVudDM4ODYyNjcyMQ==,9599,simonw,2018-05-13T13:27:04Z,2018-05-13T13:27:04Z,OWNER,"I managed to get Python 3.5.0 running on my laptop using [pyenv](https://github.com/pyenv/pyenv). Here's the incantation I used: ``` # Install pyenv using homebrew (turns out I already had it) brew install pyenv # Check which versions of Python I have installed pyenv versions # Install Python 3.5.0 pyenv install 3.5.0 # Figure out where pyenv has been installing things pyenv root # Check I can run my newly installed Python 3.5.0 /Users/simonw/.pyenv/versions/3.5.0/bin/python # Use it to create a new virtualenv /Users/simonw/.pyenv/versions/3.5.0/bin/python -mvenv venv35 source venv35/bin/activate # Install datasette into that virtualenv python setup.py install ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322591993,Refactor views, https://github.com/simonw/datasette/pull/257#issuecomment-388626804,https://api.github.com/repos/simonw/datasette/issues/257,388626804,MDEyOklzc3VlQ29tbWVudDM4ODYyNjgwNA==,9599,simonw,2018-05-13T13:28:20Z,2018-05-13T13:28:20Z,OWNER,"Unfortunately, running `python setup.py test` on my laptop using Python 3.5.0 in that virtualenv results in a flow of weird Sanic-related errors: ``` File ""/Users/simonw/Dropbox/Development/datasette/venv35/lib/python3.5/site-packages/sanic-0.7.0-py3.5.egg/sanic/testing.py"", line 16, in _local_request import aiohttp File ""/Users/simonw/Dropbox/Development/datasette/.eggs/aiohttp-2.3.2-py3.5-macosx-10.13-x86_64.egg/aiohttp/__init__.py"", line 6, in from .client import * # noqa File ""/Users/simonw/Dropbox/Development/datasette/.eggs/aiohttp-2.3.2-py3.5-macosx-10.13-x86_64.egg/aiohttp/client.py"", line 13, in from yarl import URL File ""/Users/simonw/Dropbox/Development/datasette/.eggs/yarl-1.2.4-py3.5-macosx-10.13-x86_64.egg/yarl/__init__.py"", line 11, in from .quoting import _Quoter, _Unquoter File ""/Users/simonw/Dropbox/Development/datasette/.eggs/yarl-1.2.4-py3.5-macosx-10.13-x86_64.egg/yarl/quoting.py"", line 3, in from typing import Optional, TYPE_CHECKING, cast ImportError: cannot import name 'TYPE_CHECKING' ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322591993,Refactor views, https://github.com/simonw/datasette/pull/257#issuecomment-388627281,https://api.github.com/repos/simonw/datasette/issues/257,388627281,MDEyOklzc3VlQ29tbWVudDM4ODYyNzI4MQ==,9599,simonw,2018-05-13T13:36:21Z,2018-05-13T13:36:21Z,OWNER,"https://github.com/rtfd/readthedocs.org/issues/3812#issuecomment-373780860 suggests Python 3.5.2 may have the fix. Yup, that worked: ``` pyenv install 3.5.2 rm -rf venv35 /Users/simonw/.pyenv/versions/3.5.2/bin/python -mvenv venv35 source venv35/bin/activate # Not sure why I need this in my local environment but I do: pip install datasette_plugin_demos python setup.py test ``` This is now giving me the same test failure locally that I am seeing in Travis.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322591993,Refactor views, https://github.com/simonw/datasette/pull/257#issuecomment-388628966,https://api.github.com/repos/simonw/datasette/issues/257,388628966,MDEyOklzc3VlQ29tbWVudDM4ODYyODk2Ng==,9599,simonw,2018-05-13T14:00:47Z,2018-05-13T14:06:35Z,OWNER,"Running specific tests: ``` venv35/bin/pip install pytest beautifulsoup4 aiohttp venv35/bin/pytest tests/test_utils.py ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322591993,Refactor views, https://github.com/simonw/datasette/issues/255#issuecomment-388645828,https://api.github.com/repos/simonw/datasette/issues/255,388645828,MDEyOklzc3VlQ29tbWVudDM4ODY0NTgyOA==,9599,simonw,2018-05-13T18:18:56Z,2018-05-13T18:20:02Z,OWNER,I may be able to run the SQL for all of the facet counts in one go using a WITH CTE query - will have to microbenchmark this to make sure it is worthwhile: https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9?sql=with+blah+as+%28select+*+from+%5Bcollege-majors%2Fall-ages%5D%29%0D%0Aselect+*+from+%28select+%22Major_category%22%2C+Major_category%2C+count%28*%29+as+n+from%0D%0Ablah+group+by+Major_category+order+by+n+desc+limit+10%29%0D%0Aunion+all%0D%0Aselect+*+from+%28select+%22Major_category2%22%2C+Major_category%2C+count%28*%29+as+n+from%0D%0Ablah+group+by+Major_category+order+by+n+desc+limit+10%29,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/256#issuecomment-388684356,https://api.github.com/repos/simonw/datasette/issues/256,388684356,MDEyOklzc3VlQ29tbWVudDM4ODY4NDM1Ng==,9599,simonw,2018-05-14T03:05:37Z,2018-05-14T03:05:37Z,OWNER,"I just landed pull request #257 - I haven't refactored the tests, I may do that later if it looks worthwhile.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322551723,Break up app.py into separate view modules, https://github.com/simonw/datasette/issues/255#issuecomment-388686463,https://api.github.com/repos/simonw/datasette/issues/255,388686463,MDEyOklzc3VlQ29tbWVudDM4ODY4NjQ2Mw==,9599,simonw,2018-05-14T03:23:44Z,2018-05-14T03:25:22Z,OWNER,It would be neat if there was a mechanism for calculating aggregates per facet - e.g. calculating the sum() of specific columns against each facet result on https://datasette-facets-demo.now.sh/fivethirtyeight-2628db9/nba-elo%2Fnbaallelo?_facet=lg_id&_facet=fran_id&lg_id=ABA&_facet=team_id,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388784063,https://api.github.com/repos/simonw/datasette/issues/255,388784063,MDEyOklzc3VlQ29tbWVudDM4ODc4NDA2Mw==,9599,simonw,2018-05-14T11:25:00Z,2018-05-14T11:25:15Z,OWNER,"Can I get facets working across many2many relationships? This would be fiendishly useful, but the querystring and `metadata.json` syntax is non-obvious.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-388784787,https://api.github.com/repos/simonw/datasette/issues/255,388784787,MDEyOklzc3VlQ29tbWVudDM4ODc4NDc4Nw==,9599,simonw,2018-05-14T11:28:05Z,2018-05-14T11:28:05Z,OWNER,"To decide which facets to suggest: for each column, is the unique value count less than the number of rows matching the current query or is it less than 20 (if we are showing more than 20 rows)? Maybe only do this if there are less than ten non-float columns. Or always try for foreign keys and booleans, then if there are none of those try indexed text and integer fields, then finally try non-indexed text and integer fields but only if there are less than ten.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/259#issuecomment-388797919,https://api.github.com/repos/simonw/datasette/issues/259,388797919,MDEyOklzc3VlQ29tbWVudDM4ODc5NzkxOQ==,9599,simonw,2018-05-14T12:23:11Z,2018-05-14T12:23:11Z,OWNER,"For M2M to work we will need a mechanism for applying IN queries to the table view, so you can select multiple M2M filters. Maybe this would work: ?_m2m_category=123&_m2m_category=865","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322787470,inspect() should detect many-to-many relationships, https://github.com/simonw/datasette/issues/251#issuecomment-388987044,https://api.github.com/repos/simonw/datasette/issues/251,388987044,MDEyOklzc3VlQ29tbWVudDM4ODk4NzA0NA==,9599,simonw,2018-05-14T22:47:55Z,2018-05-14T22:47:55Z,OWNER,This work is now happening in the facets branch. Closing this in favor of #255.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",320592643,"Explore ""distinct values for column"" in inspect()", https://github.com/simonw/datasette/issues/255#issuecomment-389145872,https://api.github.com/repos/simonw/datasette/issues/255,389145872,MDEyOklzc3VlQ29tbWVudDM4OTE0NTg3Mg==,9599,simonw,2018-05-15T12:17:52Z,2018-05-15T12:17:52Z,OWNER,Activity has now moved to this branch: https://github.com/simonw/datasette/commits/suggested-facets,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-389147608,https://api.github.com/repos/simonw/datasette/issues/255,389147608,MDEyOklzc3VlQ29tbWVudDM4OTE0NzYwOA==,9599,simonw,2018-05-15T12:24:46Z,2018-05-15T12:24:46Z,OWNER,"New demo (published with `datasette publish now --branch=suggested-facets fivethirtyeight.db sf-trees.db --name=datastte-suggested-facets-demo`): https://datasette-suggested-facets-demo.now.sh/fivethirtyeight-2628db9/comic-characters%2Fmarvel-wikia-data After turning on a couple of suggested facets... https://datasette-suggested-facets-demo.now.sh/fivethirtyeight-2628db9/comic-characters%2Fmarvel-wikia-data?_facet=SEX&_facet=ID ![2018-05-15 at 7 24 am](https://user-images.githubusercontent.com/9599/40056411-fa265d16-5810-11e8-89ec-e38fe29ffb2c.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/pull/258#issuecomment-389386142,https://api.github.com/repos/simonw/datasette/issues/258,389386142,MDEyOklzc3VlQ29tbWVudDM4OTM4NjE0Mg==,9599,simonw,2018-05-16T03:51:13Z,2018-05-16T03:51:13Z,OWNER,"The URL does persist across deployments already, in that you can use the URL without the hash and it will redirect to the current location. Here's an example of that: https://san-francisco.datasettes.com/sf-trees/Street_Tree_List.json This also works if you attempt to hit the incorrect hash, e.g. if you have deployed a new version of the database with an updated hash. The old hash will redirect, e.g. https://san-francisco.datasettes.com/sf-trees-c4b972c/Street_Tree_List.json If you serve Datasette from a HTTP/2 proxy (I've been using Cloudflare for this) you won't even have to pay the cost of the redirect - Datasette sends a `Link: ; rel=preload` header with those redirects, which causes Cloudflare to push out the redirected source as part of that HTTP/2 request. You can fire up the Chrome DevTools to watch this happen. https://github.com/simonw/datasette/blob/2b79f2bdeb1efa86e0756e741292d625f91cb93d/datasette/views/base.py#L91 All of that said... I'm not at all opposed to this feature. For consistency with other Datasette options (e.g. `--cors`) I'd prefer to do this as an optional argument to the `datasette serve` command - something like this: datasette serve mydb.db --no-url-hash","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322741659,Add new metadata key persistent_urls which removes the hash from all database urls, https://github.com/simonw/datasette/issues/255#issuecomment-389386919,https://api.github.com/repos/simonw/datasette/issues/255,389386919,MDEyOklzc3VlQ29tbWVudDM4OTM4NjkxOQ==,9599,simonw,2018-05-16T03:57:47Z,2018-05-16T03:58:30Z,OWNER,"I updated that demo to demonstrate the new foreign key label expansions: https://datasette-suggested-facets-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List?_facet=qLegalStatus ![2018-05-15 at 8 58 pm](https://user-images.githubusercontent.com/9599/40095806-b645026a-5882-11e8-8100-76136df50212.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-389397457,https://api.github.com/repos/simonw/datasette/issues/255,389397457,MDEyOklzc3VlQ29tbWVudDM4OTM5NzQ1Nw==,9599,simonw,2018-05-16T05:20:04Z,2018-05-16T05:20:04Z,OWNER,Maybe `suggested_facets` should only be calculated for the HTML view.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/pull/258#issuecomment-389536870,https://api.github.com/repos/simonw/datasette/issues/258,389536870,MDEyOklzc3VlQ29tbWVudDM4OTUzNjg3MA==,9599,simonw,2018-05-16T14:22:31Z,2018-05-16T14:22:31Z,OWNER,"The principle benefit provided by the hash URLs is that Datasette can set a far-future cache expiry header on every response. This is particularly useful for JavaScript API work as it makes fantastic use of the browser's cache. It also means that if you are serving your API from behind a caching proxy like Cloudflare you get a fantastic cache hit rate. An option to serve without persistent hashes would also need to turn off the cache headers. Maybe the option should support both? If you hit a page with the hash in the URL you still get the cache headers, but hits to the URL without the hash serve uncashed content directly.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322741659,Add new metadata key persistent_urls which removes the hash from all database urls, https://github.com/simonw/datasette/issues/255#issuecomment-389546040,https://api.github.com/repos/simonw/datasette/issues/255,389546040,MDEyOklzc3VlQ29tbWVudDM4OTU0NjA0MA==,9599,simonw,2018-05-16T14:47:34Z,2018-05-16T14:47:34Z,OWNER,"Latest demo - now with multiple columns: https://datasette-suggested-facets-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List?_facet=qCaretaker&_facet=qCareAssistant&_facet=qLegalStatus ![2018-05-16 at 7 47 am](https://user-images.githubusercontent.com/9599/40124418-63e680ba-58dd-11e8-8063-9686826abb8e.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/255#issuecomment-389562708,https://api.github.com/repos/simonw/datasette/issues/255,389562708,MDEyOklzc3VlQ29tbWVudDM4OTU2MjcwOA==,9599,simonw,2018-05-16T15:32:12Z,2018-05-16T15:32:12Z,OWNER,"This is now landed in master, ready for the next release.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/263#issuecomment-389563719,https://api.github.com/repos/simonw/datasette/issues/263,389563719,MDEyOklzc3VlQ29tbWVudDM4OTU2MzcxOQ==,9599,simonw,2018-05-16T15:34:46Z,2018-05-16T15:34:46Z,OWNER,The underlying mechanics for the `_extras` mechanism described in #262 may help with this.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323671577,Facets should not execute for ?shape=array|object, https://github.com/simonw/datasette/issues/265#issuecomment-389566147,https://api.github.com/repos/simonw/datasette/issues/265,389566147,MDEyOklzc3VlQ29tbWVudDM4OTU2NjE0Nw==,9599,simonw,2018-05-16T15:41:42Z,2018-05-16T15:41:42Z,OWNER,"An official demo instance of Datasette dedicated to this use-case would be useful, especially if it was automatically deployed by Travis for every commit to master that passes the tests. Maybe there should be a permanent version of it deployed for each released version too?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323677499,Add links to example Datasette instances to appropiate places in docs, https://github.com/simonw/datasette/issues/266#issuecomment-389570841,https://api.github.com/repos/simonw/datasette/issues/266,389570841,MDEyOklzc3VlQ29tbWVudDM4OTU3MDg0MQ==,9599,simonw,2018-05-16T15:54:49Z,2018-06-15T07:41:09Z,OWNER,"At the most basic level, this will work based on an extension. Most places you currently put a `.json` extension should also allow a `.csv` extension. By default this will return the exact results you see on the current page (default max will remain 1000). ## Streaming all records Where things get interested is *streaming mode*. This will be an option which returns ALL matching records as a streaming CSV file, even if that ends up being millions of records. I think the best way to build this will be on top of the existing mechanism used to efficiently implement keyset pagination via `_next=` tokens. ## Expanding foreign keys For tables with foreign key references it would be useful if the CSV format could expand those references to include the labels from `label_column` - maybe via an additional `?_expand=1` option. When expanding each foreign key column will be shown twice: rowid,city_id,city_id_label,state","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389572201,https://api.github.com/repos/simonw/datasette/issues/266,389572201,MDEyOklzc3VlQ29tbWVudDM4OTU3MjIwMQ==,9599,simonw,2018-05-16T15:58:43Z,2018-05-16T16:00:47Z,OWNER,"This will likely be implemented in the `BaseView` class, which needs to know how to spot the `.csv` extension, call the underlying JSON generating function and then return the `columns` and `rows` as correctly formatted CSV. https://github.com/simonw/datasette/blob/9959a9e4deec8e3e178f919e8b494214d5faa7fd/datasette/views/base.py#L201-L207 This means it will take ALL arguments that are available to the `.json` view. It may ignore some (e.g. `_facet=` makes no sense since CSV tables don't have space to show the facet results). In streaming mode, things will behave a little bit differently - in particular, if `_stream=1` then `_next=` will be forbidden. It can't include a length header because we don't know how many bytes it will be CSV output will throw an error if the endpoint doesn't have rows and columns keys eg `/-/inspect.json` So the implementation... - looks for the `.csv` extension - internally fetches the `.json` data instead - If no `_stream` it just transposes that JSON to CSV with the correct content type header - If `_stream=1` - checks for `_next=` and throws an error if it was provided - Otherwise... fetch first page and emit CSV header and first set of rows - Then start async looping, emitting more CSV rows and following the `_next=` internal reference until done I like that this takes advantage of efficient pagination. It may not work so well for views which use offset/limit though. It won't work at all for custom SQL because custom SQL doesn't support _next= pagination. That's fine. For views... easiest fix is to cut off after first X000 records. That seems OK. View JSON would need to include a property that the mechanism can identify.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389579363,https://api.github.com/repos/simonw/datasette/issues/266,389579363,MDEyOklzc3VlQ29tbWVudDM4OTU3OTM2Mw==,9599,simonw,2018-05-16T16:20:06Z,2018-05-16T16:20:06Z,OWNER,I started a thread on Twitter discussing various CSV output dialects: https://twitter.com/simonw/status/996783395504979968 - I want to pick defaults which will work as well as possible for whatever tools people might be using to consume the data.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389579762,https://api.github.com/repos/simonw/datasette/issues/266,389579762,MDEyOklzc3VlQ29tbWVudDM4OTU3OTc2Mg==,9599,simonw,2018-05-16T16:21:12Z,2018-05-16T16:21:12Z,OWNER,"> I basically want someone to tell me which arguments I can pass to Python's csv.writer() function that will result in the least complaints from people who try to parse the results :) https://twitter.com/simonw/status/996786815938977792","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389592566,https://api.github.com/repos/simonw/datasette/issues/266,389592566,MDEyOklzc3VlQ29tbWVudDM4OTU5MjU2Ng==,9599,simonw,2018-05-16T17:01:29Z,2018-05-16T17:02:21Z,OWNER,Let's provide a CSV Dialect definition too: https://frictionlessdata.io/specs/csv-dialect/ - via https://twitter.com/drewdaraabrams/status/996794915680997382,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389608473,https://api.github.com/repos/simonw/datasette/issues/266,389608473,MDEyOklzc3VlQ29tbWVudDM4OTYwODQ3Mw==,9599,simonw,2018-05-16T17:52:35Z,2018-05-16T17:54:11Z,OWNER,"There are some code examples in this issue which should help with the streaming part: https://github.com/channelcat/sanic/issues/1067 Also https://github.com/channelcat/sanic/blob/master/docs/sanic/streaming.md#response-streaming","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389626715,https://api.github.com/repos/simonw/datasette/issues/266,389626715,MDEyOklzc3VlQ29tbWVudDM4OTYyNjcxNQ==,9599,simonw,2018-05-16T18:50:46Z,2018-05-16T18:50:46Z,OWNER,"> I’d recommend using the Windows-1252 encoding for maximum compatibility, unless you have any characters not in that set, in which case use UTF8 with a byte order mark. Bit of a pain, but some progams (eg various versions of Excel) don’t read UTF8. **frankieroberto** https://twitter.com/frankieroberto/status/996823071947460616 > There is software that consumes CSV and doesn't speak UTF8!? Huh. Well I can't just use Windows-1252 because I need to support the full UTF8 range of potential data - maybe I should support an optional ?_encoding=windows-1252 argument **simonw** https://twitter.com/simonw/status/996824677245857793","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/262#issuecomment-389702480,https://api.github.com/repos/simonw/datasette/issues/262,389702480,MDEyOklzc3VlQ29tbWVudDM4OTcwMjQ4MA==,9599,simonw,2018-05-17T00:00:39Z,2020-09-12T18:19:30Z,OWNER,Idea: `?_extra=sqllog` could output a lot of every individual SQL statement that was executed in order to generate the page - useful for seeing how foreign key expansion and faceting actually works.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323658641,Add ?_extra= mechanism for requesting extra properties in JSON, https://github.com/simonw/datasette/issues/266#issuecomment-389893810,https://api.github.com/repos/simonw/datasette/issues/266,389893810,MDEyOklzc3VlQ29tbWVudDM4OTg5MzgxMA==,9599,simonw,2018-05-17T14:49:35Z,2018-05-17T14:49:35Z,OWNER,Idea: add a `supports_csv = False` property to `BaseView` and over-ride it to `True` just on the view classes that should support CSV (Table and Row). Slight subtlety: the `DatabaseView` class only supports CSV in the `custom_sql()` path. Maybe that needs to be refactored a bit.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-389894382,https://api.github.com/repos/simonw/datasette/issues/266,389894382,MDEyOklzc3VlQ29tbWVudDM4OTg5NDM4Mg==,9599,simonw,2018-05-17T14:51:13Z,2018-05-17T14:53:23Z,OWNER,"I should definitely sanity check if the `_next=` route really is the most efficient way to build this. It may turn out that iterating over a SQLite cursor with a million rows in it is super-efficient and would provide much more reliable performance (plus solve the problem for retrieving full custom SQL queries where we can't do keyset pagination). Problem here is that we run SQL queries in a thread pool. A query that returns millions of rows would presumably tie up a SQL thread until it has finished, which could block the server. This may be a reason to stick with `_next=` keyset pagination - since it ensures each SQL thread yields back again after each 1,000 rows.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/271#issuecomment-389989015,https://api.github.com/repos/simonw/datasette/issues/271,389989015,MDEyOklzc3VlQ29tbWVudDM4OTk4OTAxNQ==,9599,simonw,2018-05-17T19:54:10Z,2018-05-17T19:54:10Z,OWNER,"This is a departure from how Datasette has been designed so far, and it may turn out that it's not feasible or it requires too many philosophical changes to be worthwhile. If we CAN do it though it would mean Datasette could stay running pointed at a directory on disk and new SQLite databases could be dropped into that directory by another process and served directly as they become available.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324162476,Mechanism for automatically picking up changes when on-disk .db file changes, https://github.com/simonw/datasette/issues/271#issuecomment-389989615,https://api.github.com/repos/simonw/datasette/issues/271,389989615,MDEyOklzc3VlQ29tbWVudDM4OTk4OTYxNQ==,9599,simonw,2018-05-17T19:56:13Z,2018-05-17T19:56:13Z,OWNER,"From https://www.sqlite.org/c3ref/open.html > **immutable**: The immutable parameter is a boolean query parameter that indicates that the database file is stored on read-only media. When immutable is set, SQLite assumes that the database file cannot be changed, even by a process with higher privilege, and so the database is opened read-only and all locking and change detection is disabled. Caution: Setting the immutable property on a database file that does in fact change can result in incorrect query results and/or SQLITE_CORRUPT errors. See also: SQLITE_IOCAP_IMMUTABLE. So this would probably have to be a new mode, `datasette serve --detect-db-changes`, which no longer opens in immutable mode. Or maybe current behavior becomes not-the-default and you opt into it with `datasette serve --immutable`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324162476,Mechanism for automatically picking up changes when on-disk .db file changes, https://github.com/simonw/datasette/issues/270#issuecomment-390105147,https://api.github.com/repos/simonw/datasette/issues/270,390105147,MDEyOklzc3VlQ29tbWVudDM5MDEwNTE0Nw==,9599,simonw,2018-05-18T06:13:07Z,2018-05-18T06:13:07Z,OWNER,I'm going to add a `/-/limits` page that shows the current limits.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323830051,--limit= CLI option for setting limits, https://github.com/simonw/datasette/issues/264#issuecomment-390105943,https://api.github.com/repos/simonw/datasette/issues/264,390105943,MDEyOklzc3VlQ29tbWVudDM5MDEwNTk0Mw==,9599,simonw,2018-05-18T06:18:00Z,2018-05-18T06:18:00Z,OWNER,Docs: http://datasette.readthedocs.io/en/latest/limits.html#default-facet-size,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323673899,Make it possible to customize various facet settings, https://github.com/simonw/datasette/issues/273#issuecomment-390250253,https://api.github.com/repos/simonw/datasette/issues/273,390250253,MDEyOklzc3VlQ29tbWVudDM5MDI1MDI1Mw==,198537,rgieseke,2018-05-18T15:49:52Z,2018-05-18T15:49:52Z,CONTRIBUTOR,"Shouldn't [versioneer](https://github.com/warner/python-versioneer) do that? E.g. 0.21+2.g1076c97 You'd need to install via `pip install git+https://github.com/simow/datasette.git` though, this does a temp git clone.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324451322,Figure out a way to have /-/version return current git commit hash, https://github.com/simonw/datasette/issues/274#issuecomment-390433040,https://api.github.com/repos/simonw/datasette/issues/274,390433040,MDEyOklzc3VlQ29tbWVudDM5MDQzMzA0MA==,9599,simonw,2018-05-19T21:12:42Z,2018-05-20T16:01:03Z,OWNER,Could also support these as optional environment variables - `DATASETTE_NAMEOFSETTING`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324652142,"Rename --limit to --config, add --help-config", https://github.com/simonw/datasette/issues/274#issuecomment-390496376,https://api.github.com/repos/simonw/datasette/issues/274,390496376,MDEyOklzc3VlQ29tbWVudDM5MDQ5NjM3Ng==,9599,simonw,2018-05-20T17:04:55Z,2018-05-20T17:04:55Z,OWNER,http://datasette.readthedocs.io/en/latest/config.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324652142,"Rename --limit to --config, add --help-config", https://github.com/simonw/datasette/pull/258#issuecomment-390577711,https://api.github.com/repos/simonw/datasette/issues/258,390577711,MDEyOklzc3VlQ29tbWVudDM5MDU3NzcxMQ==,247131,philroche,2018-05-21T07:38:15Z,2018-05-21T07:38:15Z,NONE,"Excellent, I was not aware of the auto redirect to the new hash. My bad This solves my use case. I do agree that your suggested --no-url-hash approach is much neater. I will investigate ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322741659,Add new metadata key persistent_urls which removes the hash from all database urls, https://github.com/simonw/datasette/issues/247#issuecomment-390689406,https://api.github.com/repos/simonw/datasette/issues/247,390689406,MDEyOklzc3VlQ29tbWVudDM5MDY4OTQwNg==,11912854,jsancho-gpl,2018-05-21T15:29:31Z,2018-05-21T15:29:31Z,NONE,"I've changed my mind about the way to support external connectors aside of SQLite and I'm working in a more simple style that respects the original Datasette, i.e. less refactoring. I present you [a version of Datasette wich supports other database connectors](https://github.com/jsancho-gpl/datasette/tree/external-connectors) and [a Datasette connector for HDF5/PyTables files](https://github.com/jsancho-gpl/datasette-pytables).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",319449852,SQLite code decoupled from Datasette, https://github.com/simonw/datasette/pull/277#issuecomment-390707183,https://api.github.com/repos/simonw/datasette/issues/277,390707183,MDEyOklzc3VlQ29tbWVudDM5MDcwNzE4Mw==,9599,simonw,2018-05-21T16:28:39Z,2018-05-21T16:28:39Z,OWNER,"This is definitely a big improvement. I'd like to refactor the unit tests that cover .inspect() too - currently they are a huge ugly blob at the top of test_api.py","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324836533,Refactor inspect logic, https://github.com/simonw/datasette/issues/276#issuecomment-390707760,https://api.github.com/repos/simonw/datasette/issues/276,390707760,MDEyOklzc3VlQ29tbWVudDM5MDcwNzc2MA==,9599,simonw,2018-05-21T16:30:35Z,2018-05-21T16:30:35Z,OWNER,"This probably needs to be in a plugin simply because getting Spatialite compiled and installed is a bit of a pain. It's a great opportunity to expand the plugin hooks in useful ways though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-390795067,https://api.github.com/repos/simonw/datasette/issues/276,390795067,MDEyOklzc3VlQ29tbWVudDM5MDc5NTA2Nw==,45057,russss,2018-05-21T21:55:57Z,2018-05-21T21:55:57Z,CONTRIBUTOR,"Well, we do have the capability to detect spatialite so my intention certainly wasn't to require it. I can see the advantage of having it as a plugin but it does touch a number of points in the code. I think I'm going to attack this by refactoring the necessary bits and seeing where that leads (which was my plan anyway). I think my main concern is - if I add certain plugin hooks for this, is anything else ever going to use them? I'm not sure I have an answer to that question yet, either way.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/pull/277#issuecomment-390804333,https://api.github.com/repos/simonw/datasette/issues/277,390804333,MDEyOklzc3VlQ29tbWVudDM5MDgwNDMzMw==,9599,simonw,2018-05-21T22:40:16Z,2018-05-21T22:43:50Z,OWNER,"We should merge this before refactoring the tests though, because that way we don't couple the new tests to the verification of this change.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324836533,Refactor inspect logic, https://github.com/simonw/datasette/issues/278#issuecomment-390991640,https://api.github.com/repos/simonw/datasette/issues/278,390991640,MDEyOklzc3VlQ29tbWVudDM5MDk5MTY0MA==,9599,simonw,2018-05-22T13:33:46Z,2018-05-22T13:33:46Z,OWNER,For SpatiaLite this example may be useful - though it's building 4.3.0 and not 4.4.0: https://github.com/terranodo/spatialite-docker/blob/master/Dockerfile,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325294102,Build smallest possible Docker image with Datasette plus recent SQLite (with json1) plus Spatialite 4.4.0, https://github.com/simonw/datasette/issues/278#issuecomment-390993397,https://api.github.com/repos/simonw/datasette/issues/278,390993397,MDEyOklzc3VlQ29tbWVudDM5MDk5MzM5Nw==,9599,simonw,2018-05-22T13:38:57Z,2018-05-22T13:38:57Z,OWNER,"Useful GitHub code search: https://github.com/search?utf8=✓&q=%22libspatialite-4.4.0%22+%22RC0%22&type=Code ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325294102,Build smallest possible Docker image with Datasette plus recent SQLite (with json1) plus Spatialite 4.4.0, https://github.com/simonw/datasette/issues/278#issuecomment-390993861,https://api.github.com/repos/simonw/datasette/issues/278,390993861,MDEyOklzc3VlQ29tbWVudDM5MDk5Mzg2MQ==,9599,simonw,2018-05-22T13:40:14Z,2018-05-22T14:38:05Z,OWNER,If we can't get `import sqlite3` to load the latest version but we can get `import pysqlite3` to work that's fine too - I can teach Datasette to import the best available version.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325294102,Build smallest possible Docker image with Datasette plus recent SQLite (with json1) plus Spatialite 4.4.0, https://github.com/simonw/datasette/issues/255#issuecomment-390999055,https://api.github.com/repos/simonw/datasette/issues/255,390999055,MDEyOklzc3VlQ29tbWVudDM5MDk5OTA1NQ==,9599,simonw,2018-05-22T13:54:55Z,2018-05-22T13:54:55Z,OWNER,This shipped in Datasette 0.22. Here's my blog post about it: https://simonwillison.net/2018/May/20/datasette-facets/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322477187,Facets, https://github.com/simonw/datasette/issues/276#issuecomment-391000659,https://api.github.com/repos/simonw/datasette/issues/276,391000659,MDEyOklzc3VlQ29tbWVudDM5MTAwMDY1OQ==,9599,simonw,2018-05-22T13:59:27Z,2018-05-22T13:59:27Z,OWNER,"Right now the plugin stuff is early enough that I'd like to get as many potential plugin hooks as possible crafted out A much easier to judge if they should be added as actual hooks if we have a working branch prototype of them. Some kind of mechanism for custom column display is already needed - eg there are columns where I want to say ""render this as markdown"" or ""URLify any links in this text"" - or even ""use this date format"" or ""add commas to this integer"". You can do it with a custom template but a lower-level mechanism would be nicer. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/273#issuecomment-391003285,https://api.github.com/repos/simonw/datasette/issues/273,391003285,MDEyOklzc3VlQ29tbWVudDM5MTAwMzI4NQ==,9599,simonw,2018-05-22T14:06:40Z,2018-05-22T14:06:40Z,OWNER,"That looks great. I don't think it's possible to derive the current commit version from the .zip downloaded directly from GitHub, so needing to pip install via git+https feels reasonable to me.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324451322,Figure out a way to have /-/version return current git commit hash, https://github.com/simonw/datasette/issues/272#issuecomment-391011268,https://api.github.com/repos/simonw/datasette/issues/272,391011268,MDEyOklzc3VlQ29tbWVudDM5MTAxMTI2OA==,9599,simonw,2018-05-22T14:28:12Z,2018-05-22T14:28:12Z,OWNER,"I think I can do this almost entirely within my existing BaseView class structure. First, decouple the async data() methods by teaching them to take a querystring object as an argument instead of a Sanic request object. The get() method can then send that new object instead of a request. Next teach the base class how to obey the ASGI protocol. I should be able to get support for both Sanic and uvicorn/daphne working in the same codebase, which will make it easy to compare their performance. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/276#issuecomment-391025841,https://api.github.com/repos/simonw/datasette/issues/276,391025841,MDEyOklzc3VlQ29tbWVudDM5MTAyNTg0MQ==,9599,simonw,2018-05-22T15:06:36Z,2018-05-22T15:06:36Z,OWNER,"The other reason I mention plugins is that I have an idea to outlaw JavaScript entirely from Datasette core and instead encourage ALL JavaScript functionality to move into plugins.right now that just means CodeMirror. I may set up some of those plugins (like CodeMirror) as default dependencies so you get them from ""pip install datasette"". I like the neatness of saying that core Datasette is a very simple JSON + HTML application, then encouraging people to go completely wild with JavaScript in the plugins.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/243#issuecomment-391030083,https://api.github.com/repos/simonw/datasette/issues/243,391030083,MDEyOklzc3VlQ29tbWVudDM5MTAzMDA4Mw==,9599,simonw,2018-05-22T15:17:10Z,2018-05-22T15:17:10Z,OWNER,See also #278,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",318737808,--spatialite option for datasette publish commands, https://github.com/simonw/datasette/issues/276#issuecomment-391050113,https://api.github.com/repos/simonw/datasette/issues/276,391050113,MDEyOklzc3VlQ29tbWVudDM5MTA1MDExMw==,45057,russss,2018-05-22T16:13:00Z,2018-05-22T16:13:00Z,CONTRIBUTOR,"Yup, I'll have a think about it. My current thoughts are for spatialite we'll need to hook into the following places: * Inspection, so we can detect which columns are geometry columns. (We also currently ignore spatialite tables during inspection, it may be worth moving that to the plugin as well.) * After data load, so we can convert WKB into the correct intermediate format for display. The alternative here is to alter the select SQL itself and get spatialite to do this conversion, but that strikes me as a bit more complex and possibly not as useful. * HTML rendering. * Querying? The rendering and querying hooks could also potentially be used to move the units support into a plugin.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/pull/279#issuecomment-391055490,https://api.github.com/repos/simonw/datasette/issues/279,391055490,MDEyOklzc3VlQ29tbWVudDM5MTA1NTQ5MA==,9599,simonw,2018-05-22T16:29:30Z,2018-05-22T16:29:30Z,OWNER,"This is fantastic! I think I prefer the aesthetics of just ""0.22"" for the version string if it's a tagged release with no additional changes - does that work? I'd like to continue to provide a tuple that can be imported from the version.py module as well, as seen here: https://github.com/simonw/datasette/blob/558d9d7bfef3dd633eb16389281b67d42c9bdeef/datasette/version.py#L1 Presumably we can generate that from the versioneer string? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325352370,Add version number support with Versioneer, https://github.com/simonw/datasette/pull/280#issuecomment-391059008,https://api.github.com/repos/simonw/datasette/issues/280,391059008,MDEyOklzc3VlQ29tbWVudDM5MTA1OTAwOA==,565628,r4vi,2018-05-22T16:40:27Z,2018-05-22T16:40:27Z,CONTRIBUTOR,"```python >>> import sqlite3 >>> sqlite3.sqlite_version '3.23.1' >>> ``` running the above in the container seems to show 3.23.1 too so maybe we don't need pysqlite3 at all?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/279#issuecomment-391073009,https://api.github.com/repos/simonw/datasette/issues/279,391073009,MDEyOklzc3VlQ29tbWVudDM5MTA3MzAwOQ==,198537,rgieseke,2018-05-22T17:23:26Z,2018-05-22T17:23:26Z,CONTRIBUTOR,"> I think I prefer the aesthetics of just ""0.22"" for the version string if it's a tagged release with no additional changes - does that work? Yes! That's the default versioneer behaviour. > I'd like to continue to provide a tuple that can be imported from the version.py module as well, as seen here: Should work now, it can be a two (for a tagged version), three or four items tuple. ``` In [2]: datasette.__version__ Out[2]: '0.12+292.ga70c2a8.dirty' In [3]: datasette.__version_info__ Out[3]: ('0', '12+292', 'ga70c2a8', 'dirty') ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325352370,Add version number support with Versioneer, https://github.com/simonw/datasette/pull/279#issuecomment-391073267,https://api.github.com/repos/simonw/datasette/issues/279,391073267,MDEyOklzc3VlQ29tbWVudDM5MTA3MzI2Nw==,198537,rgieseke,2018-05-22T17:24:16Z,2018-05-22T17:24:16Z,CONTRIBUTOR,"Sorry, just realised you rely on `version` being a module ...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325352370,Add version number support with Versioneer, https://github.com/simonw/datasette/pull/280#issuecomment-391076239,https://api.github.com/repos/simonw/datasette/issues/280,391076239,MDEyOklzc3VlQ29tbWVudDM5MTA3NjIzOQ==,9599,simonw,2018-05-22T17:33:33Z,2018-05-22T17:33:33Z,OWNER,This looks amazing! Can't wait to try this out this evening.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391076458,https://api.github.com/repos/simonw/datasette/issues/280,391076458,MDEyOklzc3VlQ29tbWVudDM5MTA3NjQ1OA==,9599,simonw,2018-05-22T17:34:13Z,2018-05-22T17:34:13Z,OWNER,Yeah let's try this without pysqlite3 and see if we still get the correct version.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/279#issuecomment-391077700,https://api.github.com/repos/simonw/datasette/issues/279,391077700,MDEyOklzc3VlQ29tbWVudDM5MTA3NzcwMA==,198537,rgieseke,2018-05-22T17:38:17Z,2018-05-22T17:38:17Z,CONTRIBUTOR,"Alright, that should work now -- let me know if you would prefer any different behaviour.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325352370,Add version number support with Versioneer, https://github.com/simonw/datasette/pull/280#issuecomment-391141391,https://api.github.com/repos/simonw/datasette/issues/280,391141391,MDEyOklzc3VlQ29tbWVudDM5MTE0MTM5MQ==,565628,r4vi,2018-05-22T21:08:39Z,2018-05-22T21:08:39Z,CONTRIBUTOR,"I'm going to clean this up for consistency tomorrow morning so hold off merging until then please On Tue, May 22, 2018 at 6:34 PM, Simon Willison wrote: > Yeah let's try this without pysqlite3 and see if we still get the correct > version. > > — > You are receiving this because you authored the thread. > Reply to this email directly, view it on GitHub > , or mute > the thread > > . > ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391190497,https://api.github.com/repos/simonw/datasette/issues/280,391190497,MDEyOklzc3VlQ29tbWVudDM5MTE5MDQ5Nw==,9599,simonw,2018-05-23T01:22:53Z,2018-05-23T01:22:53Z,OWNER,"I grabbed just your Dockerfile and built it like this: docker build . -t datasette Once it had built, I ran it like this: docker run -p 8001:8001 -v `pwd`:/mnt datasette \ datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db \ --load-extension=/usr/local/lib/mod_spatialite.so (The fixtures.db file is created by running `python tests/fixtures.py fixtures.db`) Then I visited http://localhost:8001/-/versions and I got this: { ""datasette"": { ""version"": ""0+unknown"" }, ""python"": { ""full"": ""3.6.3 (default, Dec 12 2017, 06:37:05) \n[GCC 6.3.0 20170516]"", ""version"": ""3.6.3"" }, ""sqlite"": { ""extensions"": { ""json1"": null, ""spatialite"": ""4.4.0-RC0"" }, ""fts_versions"": [ ""FTS4"", ""FTS3"" ], ""version"": ""3.23.1"" } } Fantastic! I'm getting SQLite `3.23.1` and SpatiaLite `4.4.0-RC0`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391290271,https://api.github.com/repos/simonw/datasette/issues/280,391290271,MDEyOklzc3VlQ29tbWVudDM5MTI5MDI3MQ==,565628,r4vi,2018-05-23T09:53:38Z,2018-05-23T09:53:38Z,CONTRIBUTOR,"Running: ```bash docker run -p 8001:8001 -v `pwd`:/mnt datasette \ datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db \ --load-extension=/usr/local/lib/mod_spatialite.so ``` is now returning FTS5 enabled in the versions output: ```json { ""datasette"": { ""version"": ""0.22"" }, ""python"": { ""full"": ""3.6.5 (default, May 5 2018, 03:07:21) \n[GCC 6.3.0 20170516]"", ""version"": ""3.6.5"" }, ""sqlite"": { ""extensions"": { ""json1"": null, ""spatialite"": ""4.4.0-RC0"" }, ""fts_versions"": [ ""FTS5"", ""FTS4"", ""FTS3"" ], ""version"": ""3.23.1"" } } ``` The old query didn't work because specifying `(t TEXT)` caused an error","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391354237,https://api.github.com/repos/simonw/datasette/issues/280,391354237,MDEyOklzc3VlQ29tbWVudDM5MTM1NDIzNw==,9599,simonw,2018-05-23T13:51:22Z,2018-05-23T13:51:22Z,OWNER,@r4vi any objections to me merging this?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/280#issuecomment-391355030,https://api.github.com/repos/simonw/datasette/issues/280,391355030,MDEyOklzc3VlQ29tbWVudDM5MTM1NTAzMA==,565628,r4vi,2018-05-23T13:53:27Z,2018-05-23T15:22:45Z,CONTRIBUTOR,"No objections; It's good to go @simonw On Wed, 23 May 2018, 14:51 Simon Willison, wrote: > @r4vi any objections to me merging this? > > — > You are receiving this because you were mentioned. > Reply to this email directly, view it on GitHub > , or mute > the thread > > . > ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/issues/282#issuecomment-391355099,https://api.github.com/repos/simonw/datasette/issues/282,391355099,MDEyOklzc3VlQ29tbWVudDM5MTM1NTA5OQ==,9599,simonw,2018-05-23T13:53:39Z,2018-05-23T13:53:39Z,OWNER,Confirmed fixed: https://fivethirtyeight-datasette-mipwbeadvr.now.sh/fivethirtyeight-5de27e3/nba-elo%2Fnbaallelo?_facet=lg_id&_next=100 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325705981,Faceting breaks pagination, https://github.com/simonw/datasette/pull/280#issuecomment-391437199,https://api.github.com/repos/simonw/datasette/issues/280,391437199,MDEyOklzc3VlQ29tbWVudDM5MTQzNzE5OQ==,9599,simonw,2018-05-23T17:44:20Z,2018-05-23T17:44:20Z,OWNER,Thank you very much! This is most excellent.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/pull/281#issuecomment-391437462,https://api.github.com/repos/simonw/datasette/issues/281,391437462,MDEyOklzc3VlQ29tbWVudDM5MTQzNzQ2Mg==,9599,simonw,2018-05-23T17:45:07Z,2018-05-23T17:45:07Z,OWNER,I'm afraid I just merged #280 which means this no longer applies. You're very welcome to see if you can further optimize the new Dockerfile though.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325553991,Reduces image size using Alpine + Multistage (re: #278), https://github.com/simonw/datasette/issues/276#issuecomment-391504199,https://api.github.com/repos/simonw/datasette/issues/276,391504199,MDEyOklzc3VlQ29tbWVudDM5MTUwNDE5OQ==,9599,simonw,2018-05-23T21:35:17Z,2018-05-23T21:35:17Z,OWNER,"I'm not keen on anything that modifies the SQLite file itself on startup - part of the datasette contract is that it should work with any SQLite file you throw at it without having any side-effects. A neat thing about SQLite is that because everything happens in the same process there's very little additional overhead involved in executing extra SQL queries - even if we ran a query-per-row to transform data in one specific column it shouldn't add more than a few ms to the total page load time (whereas with MySQL all of the extra query overhead would kill us).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-391504757,https://api.github.com/repos/simonw/datasette/issues/276,391504757,MDEyOklzc3VlQ29tbWVudDM5MTUwNDc1Nw==,9599,simonw,2018-05-23T21:37:07Z,2018-05-23T21:37:18Z,OWNER,"That said, it looks like we may be able to use a library like https://github.com/geomet/geomet to run the conversion from WKB entirely in Python space.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-391505930,https://api.github.com/repos/simonw/datasette/issues/276,391505930,MDEyOklzc3VlQ29tbWVudDM5MTUwNTkzMA==,45057,russss,2018-05-23T21:41:37Z,2018-05-23T21:41:37Z,CONTRIBUTOR,"> I'm not keen on anything that modifies the SQLite file itself on startup Ah I didn't mean that - I meant altering the SELECT query to fetch the data so that it ran a spatialite function to transform that specific column. I think that's less useful as a general-purpose plugin hook though, and it's not that hard to parse the WKB in Python (my default approach would be to use [shapely](https://github.com/Toblerity/Shapely), which is great, but geomet looks like an interesting pure-python alternative).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/283#issuecomment-391583528,https://api.github.com/repos/simonw/datasette/issues/283,391583528,MDEyOklzc3VlQ29tbWVudDM5MTU4MzUyOA==,9599,simonw,2018-05-24T04:21:49Z,2018-05-24T04:21:49Z,OWNER,"The challenge here is which database should be the ""default"" database. The first database attached to SQLite is treated as the default - if no database is specified in a query, that's the database that queries will be executed against. Currently, each database URL in Datasette (e.g. https://san-francisco.datasettes.com/sf-film-locations-84594a7 v.s. https://san-francisco.datasettes.com/sf-trees-ebc2ad9 ) gets its own independent connection, and all queries within that base URL run against that database. If we're going to attach multiple databases to the same connection, how do we set which database gets to be the default? The easiest thing to do here will be to have a special database (maybe which is turned off by default and can be enabled using `datasette serve --enable-cross-database-joins` or similar) which attaches to ALL the databases. Perhaps it starts as an in-memory database, maybe at `/memory`? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391584112,https://api.github.com/repos/simonw/datasette/issues/283,391584112,MDEyOklzc3VlQ29tbWVudDM5MTU4NDExMg==,9599,simonw,2018-05-24T04:26:29Z,2018-05-24T04:30:50Z,OWNER,"I built a very rough prototype of this to prove it could work. It's deployed here - and here's an example of a query that joins across two different databases: https://datasette-cross-database-joins-prototype.now.sh/memory?sql=select+fivethirtyeight.%5Blove-actually%2Flove_actually_adjacencies%5D.rowid%2C%0D%0Afivethirtyeight.%5Blove-actually%2Flove_actually_adjacencies%5D.actors%2C%0D%0A%5Bgoogle-trends%5D.%5B20150430_UKDebate%5D.city%0D%0Afrom+fivethirtyeight.%5Blove-actually%2Flove_actually_adjacencies%5D%0D%0Ajoin+%5Bgoogle-trends%5D.%5B20150430_UKDebate%5D%0D%0A++on+%5Bgoogle-trends%5D.%5B20150430_UKDebate%5D.rowid+%3D+fivethirtyeight.%5Blove-actually%2Flove_actually_adjacencies%5D.rowid ``` select fivethirtyeight.[love-actually/love_actually_adjacencies].rowid, fivethirtyeight.[love-actually/love_actually_adjacencies].actors, [google-trends].[20150430_UKDebate].city from fivethirtyeight.[love-actually/love_actually_adjacencies] join [google-trends].[20150430_UKDebate] on [google-trends].[20150430_UKDebate].rowid = fivethirtyeight.[love-actually/love_actually_adjacencies].rowid ``` I deployed it like this: datasette publish now --branch=cross-database-joins fivethirtyeight.db google-trends.db --name=datasette-cross-database-joins-prototype ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391584366,https://api.github.com/repos/simonw/datasette/issues/283,391584366,MDEyOklzc3VlQ29tbWVudDM5MTU4NDM2Ng==,9599,simonw,2018-05-24T04:28:20Z,2018-05-24T04:28:20Z,OWNER,"I used some pretty ugly hacks, like faking an entire `.inspect()` block for the `:memory:` database just to get past the errors I was seeing. To ship this as a feature it will need quite a bit of code refactoring to make those hacks unnecessary. https://github.com/simonw/datasette/blob/7a3040f5782375373b2b66e5969bc2c49b3a6f0e/datasette/views/database.py#L18-L26","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391584527,https://api.github.com/repos/simonw/datasette/issues/283,391584527,MDEyOklzc3VlQ29tbWVudDM5MTU4NDUyNw==,9599,simonw,2018-05-24T04:29:40Z,2018-05-24T04:29:40Z,OWNER,Rather than stealing the `/memory` namespace for this it would be nicer if these cross-database joins could be executed at the very top-level URL of the Datasette instance - `https://example.com/?sql=...`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391752218,https://api.github.com/repos/simonw/datasette/issues/283,391752218,MDEyOklzc3VlQ29tbWVudDM5MTc1MjIxOA==,9599,simonw,2018-05-24T15:15:19Z,2018-05-24T15:15:19Z,OWNER,Most of the time Datasette is used with just a single database file. So maybe it makes sense for this option to be turned on by default and to ALWAYS be available on the Datasette instance homepage unless the user has explicitly disabled it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391752425,https://api.github.com/repos/simonw/datasette/issues/283,391752425,MDEyOklzc3VlQ29tbWVudDM5MTc1MjQyNQ==,9599,simonw,2018-05-24T15:15:51Z,2018-05-24T15:15:51Z,OWNER,"This would make Datasett's SQL features a lot more instantly obvious to people who land on a homepage, which is probably a good thing.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391752629,https://api.github.com/repos/simonw/datasette/issues/283,391752629,MDEyOklzc3VlQ29tbWVudDM5MTc1MjYyOQ==,9599,simonw,2018-05-24T15:16:25Z,2018-05-24T15:16:25Z,OWNER,"Should this support canned queries too? I think it should, though that raises interesting questions regarding their URL structure.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391752882,https://api.github.com/repos/simonw/datasette/issues/283,391752882,MDEyOklzc3VlQ29tbWVudDM5MTc1Mjg4Mg==,9599,simonw,2018-05-24T15:17:10Z,2018-05-24T15:17:10Z,OWNER,Another option: give this the `/-/all` URL namespace.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391754506,https://api.github.com/repos/simonw/datasette/issues/283,391754506,MDEyOklzc3VlQ29tbWVudDM5MTc1NDUwNg==,9599,simonw,2018-05-24T15:21:37Z,2018-05-24T15:21:53Z,OWNER,"Giving it `/all/` would be easier since that way the existing URL routes (including canned queries) would all work... but I would have to teach it NOT to expect a database content hash on that URL. Or maybe it should still have a content hash (to enable far-future cache expiry headers on query results) but the hash should be constructed out of all of the other database hashes concatenated together. That way the URLs would be `/all-5de27e3` and `/all-5de27e3/canned-query-name` Only downside: this would make it impossible to have a database file with the name `all.db`. I think that's probably an OK trade-off. You could turn the feature off with a config flag if you really want to use that filename (for whatever reason). How about `/-all-5de27e3/` instead to avoid collisions?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391755300,https://api.github.com/repos/simonw/datasette/issues/283,391755300,MDEyOklzc3VlQ29tbWVudDM5MTc1NTMwMA==,9599,simonw,2018-05-24T15:23:37Z,2018-05-24T15:23:37Z,OWNER,On the `/-all-5de27e3` page we can show the regular https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3 interface but instead of the list of tables we can show a list of attached databases plus some help text showing how to construct a cross-database join.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/283#issuecomment-391756841,https://api.github.com/repos/simonw/datasette/issues/283,391756841,MDEyOklzc3VlQ29tbWVudDM5MTc1Njg0MQ==,9599,simonw,2018-05-24T15:27:42Z,2018-05-24T15:27:42Z,OWNER,"For an example query that pre-populates that textarea... maybe a UNION that pulls the first 10 rows from the first table of each of the first two databases? ``` select * from (select rowid, actors from fivethirtyeight.[love-actually/love_actually_adjacencies] limit 10) union all select * from (select rowid, city from [google-trends].[20150430_UKDebate] limit 10) ``` https://datasette-cross-database-joins-prototype.now.sh/memory?sql=select+*+from+%28select+rowid%2C+actors+from+fivethirtyeight.%5Blove-actually%2Flove_actually_adjacencies%5D+limit+10%29%0D%0A+++union+all%0D%0Aselect+*+from+%28select+rowid%2C+city+from+%5Bgoogle-trends%5D.%5B20150430_UKDebate%5D+limit+10%29","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/284#issuecomment-391765706,https://api.github.com/repos/simonw/datasette/issues/284,391765706,MDEyOklzc3VlQ29tbWVudDM5MTc2NTcwNg==,9599,simonw,2018-05-24T15:52:24Z,2018-05-24T15:52:24Z,OWNER,I'm not crazy about the `enable_` prefix on these.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326182814,Ability to enable/disable specific features via --config, https://github.com/simonw/datasette/issues/284#issuecomment-391765973,https://api.github.com/repos/simonw/datasette/issues/284,391765973,MDEyOklzc3VlQ29tbWVudDM5MTc2NTk3Mw==,9599,simonw,2018-05-24T15:53:08Z,2018-05-24T15:53:08Z,OWNER,This will also give us a mechanism for turning on and off the cross-database joins feature from #283,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326182814,Ability to enable/disable specific features via --config, https://github.com/simonw/datasette/issues/284#issuecomment-391766420,https://api.github.com/repos/simonw/datasette/issues/284,391766420,MDEyOklzc3VlQ29tbWVudDM5MTc2NjQyMA==,9599,simonw,2018-05-24T15:54:33Z,2018-05-24T15:54:33Z,OWNER,"Maybe `allow_sql`, `allow_facet` and `allow_download`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326182814,Ability to enable/disable specific features via --config, https://github.com/simonw/datasette/issues/283#issuecomment-391768302,https://api.github.com/repos/simonw/datasette/issues/283,391768302,MDEyOklzc3VlQ29tbWVudDM5MTc2ODMwMg==,9599,simonw,2018-05-24T16:00:05Z,2018-05-24T16:00:05Z,OWNER,I like `/-/all-5de27e3` for this (with `/-/all` redirecting to the correct hash),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325958506,Support cross-database joins, https://github.com/simonw/datasette/issues/275#issuecomment-391771202,https://api.github.com/repos/simonw/datasette/issues/275,391771202,MDEyOklzc3VlQ29tbWVudDM5MTc3MTIwMg==,9599,simonw,2018-05-24T16:08:41Z,2018-05-24T16:08:41Z,OWNER,"So the lookup priority order should be: * table level in metadata * database level in metadata * root level in metadata * `--config` options passed to `datasette serve` * `DATASETTE_X` environment variables","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324720095,"""config"" section in metadata.json (root, database and table level)", https://github.com/simonw/datasette/issues/275#issuecomment-391771658,https://api.github.com/repos/simonw/datasette/issues/275,391771658,MDEyOklzc3VlQ29tbWVudDM5MTc3MTY1OA==,9599,simonw,2018-05-24T16:09:55Z,2018-05-24T16:09:55Z,OWNER,It feels slightly weird continuing to call it `metadata.json` as it starts to grow support for config options (which already started with the `units` and `facets` keys) but I can live with that.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324720095,"""config"" section in metadata.json (root, database and table level)", https://github.com/simonw/datasette/issues/284#issuecomment-391912392,https://api.github.com/repos/simonw/datasette/issues/284,391912392,MDEyOklzc3VlQ29tbWVudDM5MTkxMjM5Mg==,9599,simonw,2018-05-25T01:16:56Z,2018-05-25T01:17:13Z,OWNER,`allow_sql` should only affect the `?sql=` parameter and whether or not the form is displayed. You should still be able to use and execute canned queries even if this option is turned off.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326182814,Ability to enable/disable specific features via --config, https://github.com/simonw/datasette/issues/284#issuecomment-391950691,https://api.github.com/repos/simonw/datasette/issues/284,391950691,MDEyOklzc3VlQ29tbWVudDM5MTk1MDY5MQ==,9599,simonw,2018-05-25T06:01:23Z,2018-05-25T06:05:02Z,OWNER,"Demo: datasette publish now --branch=master fixtures.db \ --source=""#284 Demo"" \ --source_url=""https://github.com/simonw/datasette/issues/284"" \ --extra-options ""--config allow_sql:off --config allow_facet:off --config allow_download:off"" \ --name=datasette-demo-284 now alias https://datasette-demo-284-jogjwngegj.now.sh datasette-demo-284.now.sh https://datasette-demo-284.now.sh/ Note the following: * https://datasette-demo-284.now.sh/fixtures-fda0fea has no SQL input textarea * https://datasette-demo-284.now.sh/fixtures-fda0fea has no database download link * https://datasette-demo-284.now.sh/fixtures-fda0fea.db returns 403 forbidden * https://datasette-demo-284.now.sh/fixtures-fda0fea?sql=select%20*%20from%20sqlite_master throws error 400 * https://datasette-demo-284.now.sh/fixtures-fda0fea/facetable shows no suggested facets * https://datasette-demo-284.now.sh/fixtures-fda0fea/facetable?_facet=city_id throws error 400","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326182814,Ability to enable/disable specific features via --config, https://github.com/simonw/datasette/issues/272#issuecomment-392118755,https://api.github.com/repos/simonw/datasette/issues/272,392118755,MDEyOklzc3VlQ29tbWVudDM5MjExODc1NQ==,9599,simonw,2018-05-25T16:56:40Z,2018-06-05T16:01:13Z,OWNER,"Thinking about this further, maybe I should embrace ASGI turtles-all-the-way-down and teach each datasette view class to take a scope to the constructor and act entirely as an ASGI component. Would be a nice way of diving deep into ASGI and I can add utility helpers for things like querystring evaluation as I need them.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/286#issuecomment-392121500,https://api.github.com/repos/simonw/datasette/issues/286,392121500,MDEyOklzc3VlQ29tbWVudDM5MjEyMTUwMA==,9599,simonw,2018-05-25T17:06:46Z,2018-05-25T17:06:46Z,OWNER,"A few extra thoughts: * Some users may want to opt out of this. We could have `--config version_in_hash:false` * should this affect the filename for the downloadable copy of the SQLite database? Maybe that should stay as just the hash of the contents, but that's a fair bit more complex * What about users who stick with the same version of datasette but deploy changes to their custom templates - how can we help them cache bust? Maybe with `--config cache_version:2`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326599525,Database hash should include current datasette version, https://github.com/simonw/datasette/issues/286#issuecomment-392121743,https://api.github.com/repos/simonw/datasette/issues/286,392121743,MDEyOklzc3VlQ29tbWVudDM5MjEyMTc0Mw==,9599,simonw,2018-05-25T17:07:36Z,2018-05-25T17:07:36Z,OWNER,This is also a great excuse to finally write up some detailed documentation on Datasette's caching strategy,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326599525,Database hash should include current datasette version, https://github.com/simonw/datasette/issues/267#issuecomment-392121905,https://api.github.com/repos/simonw/datasette/issues/267,392121905,MDEyOklzc3VlQ29tbWVudDM5MjEyMTkwNQ==,9599,simonw,2018-05-25T17:08:14Z,2018-05-25T17:08:14Z,OWNER,See also #286,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323716411,"Documentation for URL hashing, redirects and cache policy", https://github.com/simonw/datasette/issues/259#issuecomment-392212119,https://api.github.com/repos/simonw/datasette/issues/259,392212119,MDEyOklzc3VlQ29tbWVudDM5MjIxMjExOQ==,9599,simonw,2018-05-25T23:22:26Z,2018-05-25T23:22:26Z,OWNER,"This should detect any table which can be linked to the current table via some other table, based on the other table having a foreign key to them both. These join tables could be arbitrarily complicated. They might have foreign keys to more than two other tables, maybe even multiple foreign keys to the same column. Ideally M2M defection would catch all of these cases. Maybe the resulting inspect data looks something like this: ``` ""artists"": { ... ""m2m"": [{ ""other_table"": ""festivals"", ""through"": ""performances"", ""our_fk"": ""artist_id"", ""other_fk"": ""performance_id"" }] ``` Let's ignore compound primary keys: we k it detect m2m relationships where the join table has foreign keys to a single primary key on the other two tables.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322787470,inspect() should detect many-to-many relationships, https://github.com/simonw/datasette/issues/259#issuecomment-392214791,https://api.github.com/repos/simonw/datasette/issues/259,392214791,MDEyOklzc3VlQ29tbWVudDM5MjIxNDc5MQ==,9599,simonw,2018-05-25T23:43:15Z,2018-07-29T00:56:03Z,OWNER,"We may need to derive a usable name for each of these relationships that can be used in eg querystring parameters. The name of the join table is a reasonable choice here. Say the join table is called `event_tags` - the querystring for returning all events that are tagged `badger` could be `/db/events?_m2m_event_tags__tag=badger` perhaps? But what if `event_tags` has more than one foreign key back to `events`? Might need to specify the column in `events` that is referred back to by `event_tags` somehow in that case.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322787470,inspect() should detect many-to-many relationships, https://github.com/simonw/datasette/issues/276#issuecomment-392279508,https://api.github.com/repos/simonw/datasette/issues/276,392279508,MDEyOklzc3VlQ29tbWVudDM5MjI3OTUwOA==,9599,simonw,2018-05-26T18:32:07Z,2018-05-26T18:32:07Z,OWNER,Related: I started the documentation for using SpatiaLite with Datasette here: https://datasette.readthedocs.io/en/latest/spatialite.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-392279644,https://api.github.com/repos/simonw/datasette/issues/276,392279644,MDEyOklzc3VlQ29tbWVudDM5MjI3OTY0NA==,9599,simonw,2018-05-26T18:34:21Z,2018-05-26T18:34:21Z,OWNER,"I've been thinking a bit about modifying the SQL select statement used for the table view recently. I've run into a few examples of SQLite database that slow to a crawl when viewed with datasette because the rows are too big, so there's definitely scope for supporting custom select clauses (avoiding some columns, showing length(colname) for others).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/288#issuecomment-392288531,https://api.github.com/repos/simonw/datasette/issues/288,392288531,MDEyOklzc3VlQ29tbWVudDM5MjI4ODUzMQ==,9599,simonw,2018-05-26T21:14:37Z,2019-04-15T23:01:17Z,OWNER,"This might also be an opportunity to support an __in= operator - though that's an odd one as it acts equivalent to an OR whereas every other parameter is combined with an AND UPDATE 15th April 2019: I implemented `?column__in=` in a different way, see #433 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326767626,Support multiple filters of the same type, https://github.com/simonw/datasette/issues/289#issuecomment-392288990,https://api.github.com/repos/simonw/datasette/issues/289,392288990,MDEyOklzc3VlQ29tbWVudDM5MjI4ODk5MA==,9599,simonw,2018-05-26T21:24:10Z,2018-05-26T21:24:10Z,OWNER,An example of a query where you might want to use this option: https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3?sql=select+rowid%2C+*+from+%5Balcohol-consumption%2Fdrinks%5D+order+by+random%28%29+limit+1,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326768188,?_ttl= parameter to control caching, https://github.com/simonw/datasette/issues/289#issuecomment-392291605,https://api.github.com/repos/simonw/datasette/issues/289,392291605,MDEyOklzc3VlQ29tbWVudDM5MjI5MTYwNQ==,9599,simonw,2018-05-26T22:20:02Z,2018-05-26T22:20:02Z,OWNER,Documented here https://datasette.readthedocs.io/en/latest/json_api.html#special-table-arguments and here: https://datasette.readthedocs.io/en/latest/config.html#default-cache-ttl,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326768188,?_ttl= parameter to control caching, https://github.com/simonw/datasette/issues/289#issuecomment-392291716,https://api.github.com/repos/simonw/datasette/issues/289,392291716,MDEyOklzc3VlQ29tbWVudDM5MjI5MTcxNg==,9599,simonw,2018-05-26T22:22:47Z,2018-05-26T22:22:47Z,OWNER,Demo: hit refresh on https://fivethirtyeight.datasettes.com/fivethirtyeight-5de27e3?sql=select+rowid%2C+*+from+%5Balcohol-consumption%2Fdrinks%5D+order+by+random%28%29+limit+1&_ttl=0,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326768188,?_ttl= parameter to control caching, https://github.com/simonw/datasette/issues/287#issuecomment-392296758,https://api.github.com/repos/simonw/datasette/issues/287,392296758,MDEyOklzc3VlQ29tbWVudDM5MjI5Njc1OA==,9599,simonw,2018-05-27T00:32:53Z,2018-05-27T00:32:53Z,OWNER,Docs: https://datasette.readthedocs.io/en/latest/json_api.html#different-shapes,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326617744,?_shape=arrayfirst, https://github.com/simonw/datasette/issues/285#issuecomment-392297392,https://api.github.com/repos/simonw/datasette/issues/285,392297392,MDEyOklzc3VlQ29tbWVudDM5MjI5NzM5Mg==,9599,simonw,2018-05-27T00:50:27Z,2018-05-27T00:50:27Z,OWNER,"I ran a very rough micro-benchmark on the new `num_sql_threads` config option. datasette --config num_sql_threads:1 fivethirtyeight.db Then ab -n 100 -c 10 'http://127.0.0.1:8011/fivethirtyeight-2628db9/twitter-ratio%2Fsenators' | Number of threads | Requests/second | |---|---| | 1 | 4.57 | | 3 | 9.77 | | 10 | 13.53 | | 20 | 15.24 | 50 | 8.21 | ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326189744,num_threads and cache_max_age should be --config options, https://github.com/simonw/datasette/issues/285#issuecomment-392297508,https://api.github.com/repos/simonw/datasette/issues/285,392297508,MDEyOklzc3VlQ29tbWVudDM5MjI5NzUwOA==,9599,simonw,2018-05-27T00:53:35Z,2018-05-27T00:53:35Z,OWNER,Documentation: http://datasette.readthedocs.io/en/latest/config.html#num-sql-threads,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326189744,num_threads and cache_max_age should be --config options, https://github.com/simonw/datasette/issues/291#issuecomment-392302406,https://api.github.com/repos/simonw/datasette/issues/291,392302406,MDEyOklzc3VlQ29tbWVudDM5MjMwMjQwNg==,9599,simonw,2018-05-27T03:18:06Z,2018-05-27T03:18:06Z,OWNER,"My first attempt at this was to have plugins depend on each other - so there would be a `datasette-leaflet` plugin which adds Leaflet to the page, and the `datasette-cluster-map` and `datasette-leaflet-geojson` plugins would depend on that plugin. I tried this and it didn't work, because it turns out the order in which plugins are loaded isn't predictable. `datasette-cluster-map` ended up adding it's script link before Leaflet had been loaded by `datasette-leaflet`, resulting in JavaScript errors.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326783670,Avoid plugins accidentally loading dependencies twice, https://github.com/simonw/datasette/issues/291#issuecomment-392302416,https://api.github.com/repos/simonw/datasette/issues/291,392302416,MDEyOklzc3VlQ29tbWVudDM5MjMwMjQxNg==,9599,simonw,2018-05-27T03:18:16Z,2018-05-27T03:18:16Z,OWNER,For the moment then I'm going with a really simple solution: when iterating through `extra_css_urls` and `extra_js_urls` de-dupe by URL and avoid outputting the same link twice.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326783670,Avoid plugins accidentally loading dependencies twice, https://github.com/simonw/datasette/issues/291#issuecomment-392302456,https://api.github.com/repos/simonw/datasette/issues/291,392302456,MDEyOklzc3VlQ29tbWVudDM5MjMwMjQ1Ng==,9599,simonw,2018-05-27T03:19:24Z,2018-05-27T03:19:24Z,OWNER,The big gap in this solution is conflicting versions: I don't yet have a story for what happens if two plugins attempt to load different versions of Leaflet. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326783670,Avoid plugins accidentally loading dependencies twice, https://github.com/simonw/datasette/issues/231#issuecomment-392305776,https://api.github.com/repos/simonw/datasette/issues/231,392305776,MDEyOklzc3VlQ29tbWVudDM5MjMwNTc3Ng==,9599,simonw,2018-05-27T05:10:46Z,2018-05-27T05:10:46Z,OWNER,These plugin config options should be exposed to JavaScript as `datasette.config.plugins`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316323336,metadata.json support for plugin configuration options, https://github.com/simonw/datasette/issues/276#issuecomment-392316250,https://api.github.com/repos/simonw/datasette/issues/276,392316250,MDEyOklzc3VlQ29tbWVudDM5MjMxNjI1MA==,9599,simonw,2018-05-27T08:59:46Z,2018-05-27T08:59:46Z,OWNER,It looks like we can use the `geometry_columns` table to introspect which columns are SpatiaLite geometries. It includes a `geometry_type` integer which is documented here: https://www.gaia-gis.it/fossil/libspatialite/wiki?name=switching-to-4.0,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-392316306,https://api.github.com/repos/simonw/datasette/issues/276,392316306,MDEyOklzc3VlQ29tbWVudDM5MjMxNjMwNg==,9599,simonw,2018-05-27T09:00:46Z,2018-05-27T09:00:46Z,OWNER,Relevant to this ticket: I've been playing with a plugin that automatically renders any GeoJSON cells as leaflet maps: https://github.com/simonw/datasette-leaflet-geojson,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/292#issuecomment-392316673,https://api.github.com/repos/simonw/datasette/issues/292,392316673,MDEyOklzc3VlQ29tbWVudDM5MjMxNjY3Mw==,9599,simonw,2018-05-27T09:08:06Z,2018-05-27T09:08:06Z,OWNER,Open question: how should this affect the row page? Just because columns were hidden on the table page doesn't necessarily mean they should be hidden on the row page as well. ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392316701,https://api.github.com/repos/simonw/datasette/issues/292,392316701,MDEyOklzc3VlQ29tbWVudDM5MjMxNjcwMQ==,9599,simonw,2018-05-27T09:08:49Z,2018-05-27T09:08:49Z,OWNER,I could certainly see people wanting different custom column selects for the row page compared to the table page.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392338130,https://api.github.com/repos/simonw/datasette/issues/292,392338130,MDEyOklzc3VlQ29tbWVudDM5MjMzODEzMA==,9599,simonw,2018-05-27T15:09:18Z,2018-05-27T15:09:28Z,OWNER,"Here's my first sketch at a metadata format for this: * `columns`: optional list of columns to include - if missing, shows all * `column_selects`: dictionary mapping column names to alternative select clauses `column_selects` can also invent new keys and use them to create derived columns. These new keys will be selected at the end of the list of columns UNLESS they are mentioned in `columns`, in which case that sequence will define the order. Can you facet by things that are customized using `column_selects`? Yes, and let's try running suggested facets against those columns as well. ``` { ""databases"": { ""databasename"": { ""tables"": { ""tablename"": { ""columns"": [ ""id"", ""name"", ""size"" ], ""column_selects"": { ""name"": ""upper(name)"", ""geo_json"": ""AsGeoJSON(Geometry)"" } ""row_columns"": [...] ""row_column_selects"": {...} } ``` The `row_columns` and `row_column_selects` properties work the same as the `column*` ones, except they are applied on the row page instead. If omitted, the `column*` ones will be used on the row page as well. If you want the row page to switch back to Datasette's default behaviour you can set `""row_columns"": [], ""row_column_selects"": {}`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392342269,https://api.github.com/repos/simonw/datasette/issues/292,392342269,MDEyOklzc3VlQ29tbWVudDM5MjM0MjI2OQ==,9599,simonw,2018-05-27T15:55:40Z,2018-05-27T16:01:26Z,OWNER,"Here's the metadata I tried against that first working prototype: ``` { ""databases"": { ""timezones"": { ""tables"": { ""timezones"": { ""columns"": [""PK_UID""], ""column_selects"": { ""upper_tzid"": ""upper(tzid)"", ""Geometry"": ""AsGeoJSON(Geometry)"" } } } }, ""wtr"": { ""tables"": { ""license_frequency"": { ""columns"": [""id"", ""license"", ""tx_rx"", ""frequency""], ""column_selects"": { ""latitude"": ""Y(Geometry)"", ""longitude"": ""X(Geometry)"" } } } } } } ``` Run using this: datasette timezones.db wtr.db \ --reload --debug --load-extension=/usr/local/lib/mod_spatialite.dylib \ -m column-metadata.json --config sql_time_limit_ms:10000 Usefully, the `--reload` flag detects changes to the `metadata.json` file as well as Datasette's own Python code.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392342947,https://api.github.com/repos/simonw/datasette/issues/292,392342947,MDEyOklzc3VlQ29tbWVudDM5MjM0Mjk0Nw==,9599,simonw,2018-05-27T16:01:43Z,2018-05-27T16:01:43Z,OWNER,I'd still like to be able to over-ride this using querystring arguments.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392343690,https://api.github.com/repos/simonw/datasette/issues/292,392343690,MDEyOklzc3VlQ29tbWVudDM5MjM0MzY5MA==,9599,simonw,2018-05-27T16:08:25Z,2018-05-27T16:08:40Z,OWNER,"Turns out it's actually possible to pull data from other tables using the mechanism in the prototype: ``` { ""databases"": { ""wtr"": { ""tables"": { ""license"": { ""column_selects"": { ""count"": ""(select count(*) from license_frequency where license_frequency.license = license.id)"" } } } } } } ``` Performance using this technique is pretty terrible though: ![2018-05-27 at 9 07 am](https://user-images.githubusercontent.com/9599/40588124-8169d7fa-618d-11e8-9880-ccc1904b05d9.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392343839,https://api.github.com/repos/simonw/datasette/issues/292,392343839,MDEyOklzc3VlQ29tbWVudDM5MjM0MzgzOQ==,9599,simonw,2018-05-27T16:10:09Z,2018-06-04T17:38:04Z,OWNER,"The more efficient way of doing this kind of count would be to provide a mechanism which can also add extra fragments to a `GROUP BY` clause used for the `SELECT`. Or... how about a mechanism similar to Django's `prefetch_related` which lets you define extra queries that will be called with a list of primary keys (or values from other columns) and used to populate a new column? A little unconventional but could be extremely useful and efficient. Related to that: since the per-query overhead in SQLite is tiny, could even define an extra query to be run once-per-row before returning results.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392345062,https://api.github.com/repos/simonw/datasette/issues/292,392345062,MDEyOklzc3VlQ29tbWVudDM5MjM0NTA2Mg==,9599,simonw,2018-05-27T16:26:53Z,2018-05-27T16:26:53Z,OWNER,There needs to be a way to turn this off and return to Datasette default bahviour. Maybe a `?_raw=1` querystring parameter for the table view.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392350495,https://api.github.com/repos/simonw/datasette/issues/292,392350495,MDEyOklzc3VlQ29tbWVudDM5MjM1MDQ5NQ==,9599,simonw,2018-05-27T17:47:31Z,2018-05-27T17:47:31Z,OWNER,"Querystring design: * `?_column=a&_column=b` - equivalent of `""columns"": [""a"", ""b""]` in `metadata.json` * `?_select_nameupper=upper(name)` - equivalent of `""column_selects"": {""nameupper"": ""upper(name)""}`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392350568,https://api.github.com/repos/simonw/datasette/issues/292,392350568,MDEyOklzc3VlQ29tbWVudDM5MjM1MDU2OA==,9599,simonw,2018-05-27T17:48:45Z,2018-05-27T17:54:41Z,OWNER,"If any `?_column=` parameters are provided the metadata version is completely ignored. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/292#issuecomment-392350980,https://api.github.com/repos/simonw/datasette/issues/292,392350980,MDEyOklzc3VlQ29tbWVudDM5MjM1MDk4MA==,9599,simonw,2018-05-27T17:56:30Z,2018-05-27T17:56:50Z,OWNER,"Should `?_raw=1` also turn off foreign key expansions? No, we will eventually provide a separate mechanism for that (or leave it to nerds who care to figure out using JSON or CSV export).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/150#issuecomment-392568047,https://api.github.com/repos/simonw/datasette/issues/150,392568047,MDEyOklzc3VlQ29tbWVudDM5MjU2ODA0Nw==,9599,simonw,2018-05-28T16:41:28Z,2018-05-28T16:41:28Z,OWNER,Closing this as obsolete since we have facets now.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276704327,_group_count= feature improvements, https://github.com/simonw/datasette/issues/116#issuecomment-392574208,https://api.github.com/repos/simonw/datasette/issues/116,392574208,MDEyOklzc3VlQ29tbWVudDM5MjU3NDIwOA==,9599,simonw,2018-05-28T17:23:41Z,2018-05-28T17:23:41Z,OWNER,"I'm handling this as separate documentation sections instead, e.g. http://datasette.readthedocs.io/en/latest/spatialite.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274884209,Add documentation section about SQLite extensions, https://github.com/simonw/datasette/issues/79#issuecomment-392574358,https://api.github.com/repos/simonw/datasette/issues/79,392574358,MDEyOklzc3VlQ29tbWVudDM5MjU3NDM1OA==,9599,simonw,2018-05-28T17:24:48Z,2018-05-28T17:24:48Z,OWNER,Closing this as obsolete in favor of other issues [tagged documentation](https://github.com/simonw/datasette/labels/documentation).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273569068,Add more detailed API documentation to the README, https://github.com/simonw/datasette/issues/73#issuecomment-392574415,https://api.github.com/repos/simonw/datasette/issues/73,392574415,MDEyOklzc3VlQ29tbWVudDM5MjU3NDQxNQ==,9599,simonw,2018-05-28T17:25:14Z,2018-05-28T17:25:14Z,OWNER,I implemented this as `?_ttl=0` in #289 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273296178,_nocache=1 query string option for use with sort-by-random, https://github.com/simonw/datasette/issues/36#issuecomment-392575160,https://api.github.com/repos/simonw/datasette/issues/36,392575160,MDEyOklzc3VlQ29tbWVudDM5MjU3NTE2MA==,9599,simonw,2018-05-28T17:30:52Z,2018-05-28T17:30:52Z,OWNER,"I've changed my mind about this. ""Select every record on the 3rd day of the month"" doesn't strike me as an actually useful feature. ""Select every record in 2018 / in May 2018 / on 1st May 2018"", if you are using the SQLite-preferred datestring format, are already supported using LIKE queries (or the startswith filter): * https://fivethirtyeight.datasettes.com/fivethirtyeight/inconvenient-sequel%2Fratings?timestamp__startswith=2017 * https://fivethirtyeight.datasettes.com/fivethirtyeight/inconvenient-sequel%2Fratings?timestamp__startswith=2017-08 * https://fivethirtyeight.datasettes.com/fivethirtyeight/inconvenient-sequel%2Fratings?timestamp__startswith=2017-08-29 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268262480,"date, year, month and day querystring lookups", https://github.com/simonw/datasette/issues/121#issuecomment-392575448,https://api.github.com/repos/simonw/datasette/issues/121,392575448,MDEyOklzc3VlQ29tbWVudDM5MjU3NTQ0OA==,9599,simonw,2018-05-28T17:33:07Z,2018-05-28T17:33:07Z,OWNER,"This shouldn't be a comma-separated list, it should be an argument you can pass multiple times to better match #255 and #292 Maybe `?_json=foo&_json=bar` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275089535,?_json=foo&_json=bar query string argument , https://github.com/simonw/datasette/issues/31#issuecomment-392580715,https://api.github.com/repos/simonw/datasette/issues/31,392580715,MDEyOklzc3VlQ29tbWVudDM5MjU4MDcxNQ==,9599,simonw,2018-05-28T18:10:45Z,2018-05-28T18:10:45Z,OWNER,"Oops, that commit should have referenced #121 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268087542,Idea: colour scheme based on sha256 of db, https://github.com/simonw/datasette/issues/121#issuecomment-392580902,https://api.github.com/repos/simonw/datasette/issues/121,392580902,MDEyOklzc3VlQ29tbWVudDM5MjU4MDkwMg==,9599,simonw,2018-05-28T18:11:51Z,2018-05-28T18:11:51Z,OWNER,"Implemented in 76d11eb768e2f05f593c4d37a25280c0fcdf8fd6 Documented here: http://datasette.readthedocs.io/en/latest/json_api.html#special-json-arguments","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275089535,?_json=foo&_json=bar query string argument , https://github.com/simonw/datasette/issues/34#issuecomment-392600866,https://api.github.com/repos/simonw/datasette/issues/34,392600866,MDEyOklzc3VlQ29tbWVudDM5MjYwMDg2Ng==,9599,simonw,2018-05-28T20:45:34Z,2018-05-28T20:45:42Z,OWNER,"This is an accidental duplicate, work is now taking place in #266","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268176505,Support CSV export with a .csv extension, https://github.com/simonw/datasette/issues/38#issuecomment-392601114,https://api.github.com/repos/simonw/datasette/issues/38,392601114,MDEyOklzc3VlQ29tbWVudDM5MjYwMTExNA==,9599,simonw,2018-05-28T20:47:31Z,2018-05-28T20:47:31Z,OWNER,I think the way Datasette executes SQL queries in a thread pool introduced in #45 is a good solution for this ticket.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",268462768,Experiment with patterns for concurrent long running queries, https://github.com/simonw/datasette/issues/56#issuecomment-392601478,https://api.github.com/repos/simonw/datasette/issues/56,392601478,MDEyOklzc3VlQ29tbWVudDM5MjYwMTQ3OA==,9599,simonw,2018-05-28T20:50:24Z,2018-05-28T20:50:24Z,OWNER,I'm going to close this as WONTFIX for the moment. Once Plugins #14 grows the ability to add extra URL paths and views someone who needs this could build it as a plugin instead.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127443,Easy way to block search engine crawling in robots.txt, https://github.com/simonw/datasette/issues/97#issuecomment-392602334,https://api.github.com/repos/simonw/datasette/issues/97,392602334,MDEyOklzc3VlQ29tbWVudDM5MjYwMjMzNA==,9599,simonw,2018-05-28T20:57:21Z,2018-05-28T20:57:21Z,OWNER,"The `/.json` endpoint is more of an implementation detail of the homepage at this point. A better, documented ( http://datasette.readthedocs.io/en/stable/introspection.html#inspect ) endpoint for finding all of the databases and tables is https://parlgov.datasettes.com/-/inspect.json","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274022950,Link to JSON for the list of tables , https://github.com/simonw/datasette/issues/142#issuecomment-392602558,https://api.github.com/repos/simonw/datasette/issues/142,392602558,MDEyOklzc3VlQ29tbWVudDM5MjYwMjU1OA==,9599,simonw,2018-05-28T20:58:59Z,2018-05-28T20:58:59Z,OWNER,I'll have the error message display a link to the documentation.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275917760,Show extra instructions with the interrupted, https://github.com/simonw/datasette/issues/142#issuecomment-392605574,https://api.github.com/repos/simonw/datasette/issues/142,392605574,MDEyOklzc3VlQ29tbWVudDM5MjYwNTU3NA==,9599,simonw,2018-05-28T21:25:05Z,2018-05-28T21:25:05Z,OWNER,"![2018-05-28 at 2 24 pm](https://user-images.githubusercontent.com/9599/40629887-e991c61c-6282-11e8-9d66-6387f90e87ca.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275917760,Show extra instructions with the interrupted, https://github.com/simonw/datasette/issues/144#issuecomment-392606044,https://api.github.com/repos/simonw/datasette/issues/144,392606044,MDEyOklzc3VlQ29tbWVudDM5MjYwNjA0NA==,9599,simonw,2018-05-28T21:29:42Z,2018-05-28T21:29:42Z,OWNER,"The other major limitation of APSW is its treatment of unicode: https://rogerbinns.github.io/apsw/types.html - it tells you that it is your responsibility to ensure that TEXT columns in your SQLite database are correctly encoded. Since Datasette is designed to work against ANY SQLite database that someone may have already created, I see that as a show-stopping limitation. Thanks to https://github.com/coleifer/sqlite-vtfunc I now have a working mechanism for virtual tables (I've even built a demo plugin with them - https://github.com/simonw/datasette-sql-scraper ) which was the main thing that interested me about APSW. I'm going to close this as WONTFIX - I think Python's built-in `sqlite3` is good enough, and is now so firmly embedded in the project that making it pluggable would be more trouble than it's worth.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",276091279,apsw as alternative sqlite3 binding (for full text search), https://github.com/simonw/datasette/issues/179#issuecomment-392606418,https://api.github.com/repos/simonw/datasette/issues/179,392606418,MDEyOklzc3VlQ29tbWVudDM5MjYwNjQxOA==,9599,simonw,2018-05-28T21:32:37Z,2018-05-28T21:32:37Z,OWNER,"> It could also be useful to allow users to import a python file containing custom functions that can that be loaded into scope and made available to custom templates. That's now covered by the plugins mechanism - you can create plugins that define custom template functions: http://datasette.readthedocs.io/en/stable/plugins.html#prepare-jinja2-environment-env","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",288438570,More metadata options for template authors , https://github.com/simonw/datasette/issues/276#issuecomment-392815673,https://api.github.com/repos/simonw/datasette/issues/276,392815673,MDEyOklzc3VlQ29tbWVudDM5MjgxNTY3Mw==,9599,simonw,2018-05-29T15:17:04Z,2018-05-29T15:17:04Z,OWNER,"I'm coming round to the idea that this should be baked into Datasette core - see above referenced issues for some of the explorations I've been doing around this area. Datasette should absolutely work without SpatiaLite, but it's such a huge bonus part of the SQLite ecosystem that I'm happy to ship features that take advantage of it without being relegated to plugins. I'm also becoming aware that there aren't really that many other interesting loadable extensions for SQLite. If SpatiaLite was one of dozens I'd feel that a rule that ""anything dependent on an extension lives in a plugin"" would make sense, but as it stands I think 99% of the time the only loadable extensions people will be using will be SpatiaLite and json1 (and json1 is available in the amalgamation anyway). ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/191#issuecomment-392822050,https://api.github.com/repos/simonw/datasette/issues/191,392822050,MDEyOklzc3VlQ29tbWVudDM5MjgyMjA1MA==,9599,simonw,2018-05-29T15:33:25Z,2018-05-29T15:33:25Z,OWNER,"I don't know how it happened, but I've somehow got myself into a state where my local SQLite for Python 3 on OS X is `3.23.1`: ``` ~ $ python3 Python 3.6.5 (default, Mar 30 2018, 06:41:53) [GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.39.2)] on darwin Type ""help"", ""copyright"", ""credits"" or ""license"" for more information. >>> import sqlite3 >>> sqlite3.connect(':memory:').execute('select sqlite_version()').fetchall() [('3.23.1',)] >>> ``` Maybe I did something in homebrew that changed this? I'd love to understand what exactly I did to get to this state.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/276#issuecomment-392825746,https://api.github.com/repos/simonw/datasette/issues/276,392825746,MDEyOklzc3VlQ29tbWVudDM5MjgyNTc0Ng==,45057,russss,2018-05-29T15:42:53Z,2018-05-29T15:42:53Z,CONTRIBUTOR,"I haven't had time to look further into this, but if doing this as a plugin results in useful hooks then I think we should do it that way. We could always require the plugin as a standard dependency. I think this is going to result in quite a bit of refactoring anyway so it's a good time to add hooks regardless. On the other hand, if we have to add lots of specialist hooks for it then maybe it's worth integrating into the core.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/191#issuecomment-392828475,https://api.github.com/repos/simonw/datasette/issues/191,392828475,MDEyOklzc3VlQ29tbWVudDM5MjgyODQ3NQ==,119974,coleifer,2018-05-29T15:50:18Z,2018-05-29T15:50:18Z,NONE,"Python standard-library SQLite dynamically links against the system sqlite3. So presumably you installed a more up-to-date sqlite3 somewhere on your `LD_LIBRARY_PATH`. To compile a statically-linked pysqlite you need to include an amalgamation in the project root when building the extension. Read the relevant setup.py.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/191#issuecomment-392831543,https://api.github.com/repos/simonw/datasette/issues/191,392831543,MDEyOklzc3VlQ29tbWVudDM5MjgzMTU0Mw==,9599,simonw,2018-05-29T15:58:33Z,2018-05-29T15:58:33Z,OWNER,"I ran an informal survey on twitter and most people were on 3.21 - https://twitter.com/simonw/status/1001487546289815553 Maybe this is from upgrading to the latest OS X release.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/296#issuecomment-392840811,https://api.github.com/repos/simonw/datasette/issues/296,392840811,MDEyOklzc3VlQ29tbWVudDM5Mjg0MDgxMQ==,9599,simonw,2018-05-29T16:26:27Z,2018-05-29T19:43:23Z,OWNER,"Since #275 will allow configs to be overridden at the table and database level it also makes sense to expose a completely evaluated list of configs at: * `/dbname/-/config` * `/dbname/tablename/-/config` Similar to https://fivethirtyeight.datasettes.com/-/config","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327395270,Per-database and per-table /-/ URL namespace, https://github.com/simonw/datasette/issues/265#issuecomment-392890045,https://api.github.com/repos/simonw/datasette/issues/265,392890045,MDEyOklzc3VlQ29tbWVudDM5Mjg5MDA0NQ==,231923,yschimke,2018-05-29T18:37:49Z,2018-05-29T18:37:49Z,NONE,"Just about to ask for this! Move this page https://github.com/simonw/datasette/wiki/Datasettes into a datasette, with some concept of versioning as well.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323677499,Add links to example Datasette instances to appropiate places in docs, https://github.com/simonw/datasette/issues/97#issuecomment-392895733,https://api.github.com/repos/simonw/datasette/issues/97,392895733,MDEyOklzc3VlQ29tbWVudDM5Mjg5NTczMw==,231923,yschimke,2018-05-29T18:51:35Z,2018-05-29T18:51:35Z,NONE,Do you have an existing example with views?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",274022950,Link to JSON for the list of tables , https://github.com/simonw/datasette/issues/298#issuecomment-392917380,https://api.github.com/repos/simonw/datasette/issues/298,392917380,MDEyOklzc3VlQ29tbWVudDM5MjkxNzM4MA==,9599,simonw,2018-05-29T19:41:59Z,2018-05-29T19:41:59Z,OWNER,Creating URLs using concatenation as seen in `('https://twitter.com/' || user) as user_url` is likely to have all sorts of useful applications for ad-hoc analysis.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327459829,URLify URLs in results from custom SQL statements / views, https://github.com/simonw/datasette/issues/296#issuecomment-392918311,https://api.github.com/repos/simonw/datasette/issues/296,392918311,MDEyOklzc3VlQ29tbWVudDM5MjkxODMxMQ==,9599,simonw,2018-05-29T19:44:33Z,2018-05-29T19:44:33Z,OWNER,Should the `tablename` ones also work for views and canned queries? Probably not.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327395270,Per-database and per-table /-/ URL namespace, https://github.com/simonw/datasette/issues/276#issuecomment-392969173,https://api.github.com/repos/simonw/datasette/issues/276,392969173,MDEyOklzc3VlQ29tbWVudDM5Mjk2OTE3Mw==,9599,simonw,2018-05-29T22:32:08Z,2018-05-29T22:32:08Z,OWNER,The more time I spend with SpatiaLite the more convinced I am that this should be default behavior. There's nothing useful about the binary Geometry representation - it's not even valid WKB. I'm on board with WKT as the default display in HTML and GeoJSON as the default for `.json`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/265#issuecomment-393003340,https://api.github.com/repos/simonw/datasette/issues/265,393003340,MDEyOklzc3VlQ29tbWVudDM5MzAwMzM0MA==,9599,simonw,2018-05-30T01:44:22Z,2018-05-30T01:44:22Z,OWNER,Funny you should mention that... I'm planning on doing that as part of the official Datasette website at some point soon. A Datasette instance that lists other Datasette instances feels pleasingly appropriate.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323677499,Add links to example Datasette instances to appropiate places in docs, https://github.com/simonw/datasette/issues/276#issuecomment-393014943,https://api.github.com/repos/simonw/datasette/issues/276,393014943,MDEyOklzc3VlQ29tbWVudDM5MzAxNDk0Mw==,9599,simonw,2018-05-30T02:59:53Z,2018-05-30T02:59:53Z,OWNER,I just realised a problem with GeoJSON is that it assumes that the underlying geometry is WGS 84 latitude/longitude points - but it's very possible for a SpatiaLite geometry to contain geometric data that's nothing to do with geospatial projections.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/266#issuecomment-393020749,https://api.github.com/repos/simonw/datasette/issues/266,393020749,MDEyOklzc3VlQ29tbWVudDM5MzAyMDc0OQ==,9599,simonw,2018-05-30T03:42:54Z,2018-05-30T03:42:54Z,OWNER,"Challenge: how to deal with tables where the name ends in `.csv`? I actually have one of these in the test suite at the moment: https://github.com/simonw/datasette/blob/d69ebce53385b7c6fafb85fdab3b136dbf3f332c/tests/fixtures.py#L234-L237","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/265#issuecomment-393064224,https://api.github.com/repos/simonw/datasette/issues/265,393064224,MDEyOklzc3VlQ29tbWVudDM5MzA2NDIyNA==,9599,simonw,2018-05-30T07:48:37Z,2018-05-30T07:48:37Z,OWNER,"https://datasette-registry.now.sh Is now live, powered by https://github.com/simonw/datasette-registry - still needs plenty of work but it's an interesting start.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323677499,Add links to example Datasette instances to appropiate places in docs, https://github.com/simonw/datasette/issues/276#issuecomment-393106520,https://api.github.com/repos/simonw/datasette/issues/276,393106520,MDEyOklzc3VlQ29tbWVudDM5MzEwNjUyMA==,45057,russss,2018-05-30T10:09:25Z,2018-05-30T10:09:25Z,CONTRIBUTOR,"I don't think it's unreasonable to only support spatialite geometries in a coordinate reference system which is at least transformable to WGS84. It would be nice to support different CRSes in the database so conversion to spatialite from the source data is lossless. I think the working CRS for datasette should be WGS84 though (leaflet requires it, for example) - it's just a case of calling `ST_Transform(geom, 4326)` on the column while we're loading the data.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/295#issuecomment-393534579,https://api.github.com/repos/simonw/datasette/issues/295,393534579,MDEyOklzc3VlQ29tbWVudDM5MzUzNDU3OQ==,9599,simonw,2018-05-31T13:44:15Z,2018-05-31T13:44:15Z,OWNER,I actually started doing this in 45e502aace6cc1198cc5f9a04d61b4a1860a012b,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327383759,Extract unit tests for inspect out to test_inspect.py, https://github.com/simonw/datasette/issues/243#issuecomment-393544357,https://api.github.com/repos/simonw/datasette/issues/243,393544357,MDEyOklzc3VlQ29tbWVudDM5MzU0NDM1Nw==,9599,simonw,2018-05-31T14:14:49Z,2018-05-31T14:14:49Z,OWNER,"Demo: https://datasette-publish-spatialite-demo.now.sh/spatialite-test-c88bc35?sql=select+AsText(Geometry)+from+HighWays+limit+1%3B Published using `datasette publish now --spatialite /tmp/spatialite-test.db`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",318737808,--spatialite option for datasette publish commands, https://github.com/simonw/datasette/issues/294#issuecomment-393547960,https://api.github.com/repos/simonw/datasette/issues/294,393547960,MDEyOklzc3VlQ29tbWVudDM5MzU0Nzk2MA==,9599,simonw,2018-05-31T14:25:43Z,2018-05-31T14:25:43Z,OWNER,"SpatialLite columns are actually quite a bit more interesting than this - they also have a `geometry_type` (point, polygon, linestring etc), a `coord_dimension` (usually 2 but can be higher) and an `srid`. For example: https://datasette-publish-spatialite-demo.now.sh/spatialite-test-c88bc35/geometry_columns ![2018-05-31 at 7 22 am](https://user-images.githubusercontent.com/9599/40787843-6f9600ee-64a3-11e8-84e5-64d7cc69603a.png) The SRID here is particularly interesting, because it helps hint at the fact that the results from these queries won't be latitude/longitude co-ordinates - which means that `AsGeoJSON()` won't return results that can be easily rendered by Leaflet: https://datasette-publish-spatialite-demo.now.sh/spatialite-test-c88bc35?sql=select+AsGeoJSON(Geometry)+from+HighWays%20limit1 Compare with https://timezones-api.now.sh/timezones-a99b2e3/geometry_columns: ![2018-05-31 at 7 25 am](https://user-images.githubusercontent.com/9599/40787991-d2650756-64a3-11e8-936e-2dcce7dd1515.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327365110,inspect should record column types, https://github.com/simonw/datasette/issues/294#issuecomment-393548602,https://api.github.com/repos/simonw/datasette/issues/294,393548602,MDEyOklzc3VlQ29tbWVudDM5MzU0ODYwMg==,9599,simonw,2018-05-31T14:27:41Z,2018-05-31T14:27:56Z,OWNER,Presumably the difference in primary key structure between those two is caused by the fact that the `spatialite-test` database (actually https://www.gaia-gis.it/spatialite-2.3.1/test-2.3.sqlite.gz downloaded from https://www.gaia-gis.it/spatialite-2.3.1/resources.html ) was created by a much older version of SpatialLite - presumably v2.3.1,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327365110,inspect should record column types, https://github.com/simonw/datasette/issues/294#issuecomment-393549215,https://api.github.com/repos/simonw/datasette/issues/294,393549215,MDEyOklzc3VlQ29tbWVudDM5MzU0OTIxNQ==,9599,simonw,2018-05-31T14:29:37Z,2018-05-31T14:29:37Z,OWNER,"Also of note: `spatialite-test` uses readable strings in the `type` column, while `timezones` has a `geometry_type` column with integers in it. Those integers are documented here: https://www.gaia-gis.it/fossil/libspatialite/wiki?name=switching-to-4.0 ![2018-05-31 at 7 29 am](https://user-images.githubusercontent.com/9599/40788210-5d0f0dd4-64a4-11e8-8141-0386b5c7b384.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327365110,inspect should record column types, https://github.com/simonw/datasette/issues/297#issuecomment-393554151,https://api.github.com/repos/simonw/datasette/issues/297,393554151,MDEyOklzc3VlQ29tbWVudDM5MzU1NDE1MQ==,9599,simonw,2018-05-31T14:44:37Z,2018-05-31T14:44:37Z,OWNER,I fixed this in https://github.com/simonw/datasette/commit/b18e4515855c3f1eeca3dfcccdbb6df05869084a,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327420945,datasette publish Dockerfile should use python:3.6-slim-stretch, https://github.com/simonw/datasette/issues/303#issuecomment-393557406,https://api.github.com/repos/simonw/datasette/issues/303,393557406,MDEyOklzc3VlQ29tbWVudDM5MzU1NzQwNg==,9599,simonw,2018-05-31T14:54:03Z,2018-05-31T14:54:03Z,OWNER,"Our test fixtures currently have a table with a name ending in `.csv`: https://github.com/simonw/datasette/blob/d69ebce53385b7c6fafb85fdab3b136dbf3f332c/tests/fixtures.py#L234-L237","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328172521,Support table names ending with .json or .csv, https://github.com/simonw/datasette/issues/294#issuecomment-393557968,https://api.github.com/repos/simonw/datasette/issues/294,393557968,MDEyOklzc3VlQ29tbWVudDM5MzU1Nzk2OA==,9599,simonw,2018-05-31T14:55:46Z,2018-05-31T14:55:46Z,OWNER,"I'm not sure what the best JSON shape for this would be considering the potential complexity of geospatial columns. I do think it's worth exposing these in the inspect JSON though, mainly so Datasette Registry can keep track of all of the openly available geodata out there.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327365110,inspect should record column types, https://github.com/simonw/datasette/issues/303#issuecomment-393599840,https://api.github.com/repos/simonw/datasette/issues/303,393599840,MDEyOklzc3VlQ29tbWVudDM5MzU5OTg0MA==,9599,simonw,2018-05-31T16:54:22Z,2018-05-31T16:54:32Z,OWNER,The interesting thing about this is that it requires URL routing to become aware of the names of all of the available tables.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328172521,Support table names ending with .json or .csv, https://github.com/simonw/datasette/issues/303#issuecomment-393600441,https://api.github.com/repos/simonw/datasette/issues/303,393600441,MDEyOklzc3VlQ29tbWVudDM5MzYwMDQ0MQ==,9599,simonw,2018-05-31T16:56:25Z,2018-05-31T16:57:41Z,OWNER,"Here's a nasty challenge: what happens if a database has the following two tables: * `blah` * `blah.json` What would the URL be for the JSON endpoint for the `blah` table?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328172521,Support table names ending with .json or .csv, https://github.com/simonw/datasette/issues/304#issuecomment-393610731,https://api.github.com/repos/simonw/datasette/issues/304,393610731,MDEyOklzc3VlQ29tbWVudDM5MzYxMDczMQ==,9599,simonw,2018-05-31T17:29:31Z,2018-05-31T17:30:05Z,OWNER,I prototyped this a while ago here https://github.com/simonw/datasette/commit/04476ead53758044a5f272ae8696b63d6703115e before we had the ``--config`` mechanism.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328229224,Ability to configure SQLite cache_size, https://github.com/simonw/datasette/issues/303#issuecomment-394037368,https://api.github.com/repos/simonw/datasette/issues/303,394037368,MDEyOklzc3VlQ29tbWVudDM5NDAzNzM2OA==,9599,simonw,2018-06-01T23:50:17Z,2018-06-01T23:50:35Z,OWNER,"Solution for he above: support an optional `?_format=json/csv` parameter on the regular table view. Then if you have tables with the above colliding names you can use `/db/blah.json?_format=json` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328172521,Support table names ending with .json or .csv, https://github.com/simonw/datasette/issues/304#issuecomment-394400419,https://api.github.com/repos/simonw/datasette/issues/304,394400419,MDEyOklzc3VlQ29tbWVudDM5NDQwMDQxOQ==,9599,simonw,2018-06-04T15:39:03Z,2018-06-04T15:39:03Z,OWNER,"In the interest of getting this shipped, I'm going to ignore the `3.7.10` issue.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328229224,Ability to configure SQLite cache_size, https://github.com/simonw/datasette/issues/304#issuecomment-394412217,https://api.github.com/repos/simonw/datasette/issues/304,394412217,MDEyOklzc3VlQ29tbWVudDM5NDQxMjIxNw==,9599,simonw,2018-06-04T16:13:32Z,2018-06-04T16:13:32Z,OWNER,Docs: http://datasette.readthedocs.io/en/latest/config.html#cache-size-kb,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328229224,Ability to configure SQLite cache_size, https://github.com/simonw/datasette/issues/302#issuecomment-394412784,https://api.github.com/repos/simonw/datasette/issues/302,394412784,MDEyOklzc3VlQ29tbWVudDM5NDQxMjc4NA==,9599,simonw,2018-06-04T16:15:22Z,2018-06-04T16:15:22Z,OWNER,I think this is related to #303,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328171513,test-2.3.sqlite database filename throws a 404, https://github.com/simonw/datasette/issues/266#issuecomment-394417567,https://api.github.com/repos/simonw/datasette/issues/266,394417567,MDEyOklzc3VlQ29tbWVudDM5NDQxNzU2Nw==,9599,simonw,2018-06-04T16:30:48Z,2018-06-04T16:32:55Z,OWNER,"When serving streaming responses, I need to check that a large CSV file doesn't completely max out the CPU in a way that is harmful to the rest of the instance. If it does, one option may be to insert an async sleep call in between each chunk that is streamed back. This could be controlled by a `csv_pause_ms` config setting, defaulting to maybe 5 but can be disabled entirely by setting to 0. That's only if testing proves that this is a necessary mechanism.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/272#issuecomment-394431323,https://api.github.com/repos/simonw/datasette/issues/272,394431323,MDEyOklzc3VlQ29tbWVudDM5NDQzMTMyMw==,9599,simonw,2018-06-04T17:17:37Z,2018-06-04T17:17:37Z,OWNER,I built this ASGI debugging tool to help with this migration: https://asgi-scope.now.sh/fivethirtyeight-34d6604/most-common-name%2Fsurnames.json?foo=bar&bazoeuto=onetuh&a=.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-394503399,https://api.github.com/repos/simonw/datasette/issues/272,394503399,MDEyOklzc3VlQ29tbWVudDM5NDUwMzM5OQ==,9599,simonw,2018-06-04T21:20:14Z,2018-06-04T21:20:14Z,OWNER,Results of an extremely simple micro-benchmark comparing the two shows that uvicorn is at least as fast as Sanic (benchmarks a little faster with a very simple payload): https://gist.github.com/simonw/418950af178c01c416363cc057420851,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-394764713,https://api.github.com/repos/simonw/datasette/issues/272,394764713,MDEyOklzc3VlQ29tbWVudDM5NDc2NDcxMw==,9599,simonw,2018-06-05T15:58:54Z,2018-06-05T16:00:40Z,OWNER,"https://github.com/encode/uvicorn/blob/572b5fe6c811b63298d5350a06b664839624c860/uvicorn/run.py#L63 is how you start a Uvicorn server from code as opposed to the `uvicorn` CLI from uvicorn.run import UvicornServer UvicornServer().run(app, host=host, port=port) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/306#issuecomment-394894500,https://api.github.com/repos/simonw/datasette/issues/306,394894500,MDEyOklzc3VlQ29tbWVudDM5NDg5NDUwMA==,9599,simonw,2018-06-05T23:40:40Z,2018-06-05T23:40:40Z,OWNER,"Input: - function that says if a name is a valid database - Function that says if a table exists - URL Output: - view class - Arguments - Redirect (if it should redirect)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329661905,Custom URL routing with independent tests, https://github.com/simonw/datasette/issues/306#issuecomment-394894910,https://api.github.com/repos/simonw/datasette/issues/306,394894910,MDEyOklzc3VlQ29tbWVudDM5NDg5NDkxMA==,9599,simonw,2018-06-05T23:43:18Z,2018-06-05T23:49:41Z,OWNER,I'm going to use a named tuple for the output. That way I can support either tuple destructing or explicit property access on the returned value.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329661905,Custom URL routing with independent tests, https://github.com/simonw/datasette/issues/306#issuecomment-394895267,https://api.github.com/repos/simonw/datasette/issues/306,394895267,MDEyOklzc3VlQ29tbWVudDM5NDg5NTI2Nw==,9599,simonw,2018-06-05T23:45:26Z,2018-06-05T23:45:26Z,OWNER,To support a future where Datasette is an ASGI app that can be attached to a URL within a larger application the routing function should have the option to accept a path prefix which will then be automatically attached to any resulting redirects.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329661905,Custom URL routing with independent tests, https://github.com/simonw/datasette/issues/306#issuecomment-394895750,https://api.github.com/repos/simonw/datasette/issues/306,394895750,MDEyOklzc3VlQ29tbWVudDM5NDg5NTc1MA==,9599,simonw,2018-06-05T23:48:06Z,2018-06-06T23:50:31Z,OWNER,"A neat trick could be that if the router returns a redirect it could then resolve that redirect to see if it will 404 (or redirect itself) before returning that response. This would need its own counter to guard against infinite redirects. I'm not going to do this though: any view that results in a chain of redirects like this is a bug that should be fixed at the source.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329661905,Custom URL routing with independent tests, https://github.com/simonw/datasette/issues/306#issuecomment-395463497,https://api.github.com/repos/simonw/datasette/issues/306,395463497,MDEyOklzc3VlQ29tbWVudDM5NTQ2MzQ5Nw==,9599,simonw,2018-06-07T15:29:28Z,2018-06-07T15:29:28Z,OWNER,"I started sketching this out in a branch, see pull request #307 - but I've decided I don't like it. I'm going to close this ticket and stick with regular expression URL routing for the moment. If I change my mind in the future the code in #307 lives in separate files (`datasette/routes.py` and `tests/test_routes.py`) so bringing it back into the project will be trivial.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329661905,Custom URL routing with independent tests, https://github.com/simonw/datasette/pull/307#issuecomment-395463598,https://api.github.com/repos/simonw/datasette/issues/307,395463598,MDEyOklzc3VlQ29tbWVudDM5NTQ2MzU5OA==,9599,simonw,2018-06-07T15:29:41Z,2018-06-07T15:29:41Z,OWNER,Closing this pull request for reasons outlined here: https://github.com/simonw/datasette/issues/306#issuecomment-395463497,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",330323860,"Initial sketch of custom URL routing, refs #306", https://github.com/simonw/datasette/issues/305#issuecomment-396048471,https://api.github.com/repos/simonw/datasette/issues/305,396048471,MDEyOklzc3VlQ29tbWVudDM5NjA0ODQ3MQ==,9599,simonw,2018-06-10T13:16:13Z,2018-06-10T13:16:13Z,OWNER,https://github.com/kubernetes/community/blob/master/contributors/devel/help-wanted.md Is worth stealing from too.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",329147284,Add contributor guidelines to docs, https://github.com/simonw/datasette/issues/266#issuecomment-397534196,https://api.github.com/repos/simonw/datasette/issues/266,397534196,MDEyOklzc3VlQ29tbWVudDM5NzUzNDE5Ng==,9599,simonw,2018-06-15T07:12:16Z,2018-06-15T07:12:16Z,OWNER,"The first version of this is now shipped to master. I ended up rewriting most of the experimental branch to deal with the nasty issue described in #303 Demo is available on https://fivethirtyeight.datasettes.com/fivethirtyeight-ab24e01/most-common-name%2Fsurnames ![2018-06-15 at 12 11 am](https://user-images.githubusercontent.com/9599/41455090-bd5ece30-7030-11e8-8da4-11fbb1f2ef8b.png) Here's the CSV version of that page: https://fivethirtyeight.datasettes.com/fivethirtyeight-ab24e01/most-common-name%2Fsurnames.csv","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397534404,https://api.github.com/repos/simonw/datasette/issues/266,397534404,MDEyOklzc3VlQ29tbWVudDM5NzUzNDQwNA==,9599,simonw,2018-06-15T07:13:20Z,2018-06-15T07:13:20Z,OWNER,"Still to add: the streaming version that iterates through all of the pages, as seen in experimental commit https://github.com/simonw/datasette/commit/ced379ea325787b8c3bf0a614daba1fa4856a3bd","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397534498,https://api.github.com/repos/simonw/datasette/issues/266,397534498,MDEyOklzc3VlQ29tbWVudDM5NzUzNDQ5OA==,9599,simonw,2018-06-15T07:13:52Z,2018-06-15T07:13:52Z,OWNER,Also needs documentation.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397637302,https://api.github.com/repos/simonw/datasette/issues/233,397637302,MDEyOklzc3VlQ29tbWVudDM5NzYzNzMwMg==,9599,simonw,2018-06-15T14:24:08Z,2018-06-15T14:55:19Z,OWNER,"I'm going with the terminology ""labels"" here. You'll be able to add ``?_labels=1`` and the JSON will look something like this: ``` { ""rowid"": 233, ""TreeID"": 121240, ""qLegalStatus"": { ""value"" 2, ""label"": ""Private"" } ""qSpecies"": { ""value"": 16, ""label"": ""Sycamore"" } ""qAddress"": ""91 Commonwealth Ave"", ... } ``` I need this to help build foreign key expansions for CSV files, see #266 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397648080,https://api.github.com/repos/simonw/datasette/issues/233,397648080,MDEyOklzc3VlQ29tbWVudDM5NzY0ODA4MA==,9599,simonw,2018-06-15T14:56:21Z,2018-06-15T14:56:21Z,OWNER,"I considered including a `""table""` key like this: ``` ""qLegalStatus"": { ""value"" 2, ""label"": ""Private"", ""table"": ""qLegalStatus"" } ``` This would help generate the HTML links using just the JSON data. But... I realized that in a list of 50 rows that value would be duplicated 50 times which is a bit nasty.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397663968,https://api.github.com/repos/simonw/datasette/issues/233,397663968,MDEyOklzc3VlQ29tbWVudDM5NzY2Mzk2OA==,9599,simonw,2018-06-15T15:51:17Z,2018-06-15T15:51:17Z,OWNER,"Nearly done, but I need the HTML view to ignore the `?_labels=1` param (it throws an error at the moment).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397668427,https://api.github.com/repos/simonw/datasette/issues/233,397668427,MDEyOklzc3VlQ29tbWVudDM5NzY2ODQyNw==,9599,simonw,2018-06-15T16:07:43Z,2018-06-15T16:07:43Z,OWNER,Demo: https://datasette-json-labels-demo.now.sh/fixtures-fda0fea/facetable.json?_labels=1&_shape=array,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397729319,https://api.github.com/repos/simonw/datasette/issues/233,397729319,MDEyOklzc3VlQ29tbWVudDM5NzcyOTMxOQ==,9599,simonw,2018-06-15T20:10:24Z,2018-06-15T20:10:24Z,OWNER,I'm also going to add the ability to specify individual columns that you want to expand using `?_label=city_id&_label=state_id`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397729500,https://api.github.com/repos/simonw/datasette/issues/233,397729500,MDEyOklzc3VlQ29tbWVudDM5NzcyOTUwMA==,9599,simonw,2018-06-15T20:11:14Z,2018-06-15T20:11:14Z,OWNER,The `.json` and `.csv` links displayed on the table page should default to using `?_labels=1` if Datasette detects that there are foreign key expansions available for the page.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397729945,https://api.github.com/repos/simonw/datasette/issues/266,397729945,MDEyOklzc3VlQ29tbWVudDM5NzcyOTk0NQ==,9599,simonw,2018-06-15T20:13:05Z,2018-06-15T20:13:05Z,OWNER,"The ""This data as ..."" area of the page is getting a bit untidy, especially if I'm going to add other download options in the future. I think I'll move the HTML to the page footer (less concerns about taking up lots of space there) and then have a bit of JavaScript that turns it into a show/hide menu of some sort in its current location.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/pull/311#issuecomment-397823913,https://api.github.com/repos/simonw/datasette/issues/311,397823913,MDEyOklzc3VlQ29tbWVudDM5NzgyMzkxMw==,9599,simonw,2018-06-16T16:32:07Z,2018-06-16T16:48:48Z,OWNER,"Still todo: - [ ] HTML view to obey the ?_labels=1 param (it throws an error at the moment) - [ ] `?_label=one&_label=2` support for only expanding specific labels - [ ] Better docs","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",332998752,"?_labels=1 to expand foreign keys (in csv and json), refs #233", https://github.com/simonw/datasette/issues/233#issuecomment-397824991,https://api.github.com/repos/simonw/datasette/issues/233,397824991,MDEyOklzc3VlQ29tbWVudDM5NzgyNDk5MQ==,9599,simonw,2018-06-16T16:50:31Z,2018-06-16T16:50:42Z,OWNER,"I'm going to support `?_labels=` on HTML views, but I'll allow it to be used to turn them off (they are on by default) using `?_labels=off`. Related: 7e0caa1e62607c6579101cc0e62bec8899013715 where I added a new `value_as_boolean` helper extracted from how `--config` works in `cli.py`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/312#issuecomment-397825583,https://api.github.com/repos/simonw/datasette/issues/312,397825583,MDEyOklzc3VlQ29tbWVudDM5NzgyNTU4Mw==,9599,simonw,2018-06-16T17:00:12Z,2018-06-16T17:00:12Z,OWNER,This is already covered by #292 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333000163,"HTML, CSV and JSON views should support ?_col=&_col=", https://github.com/simonw/datasette/issues/233#issuecomment-397839482,https://api.github.com/repos/simonw/datasette/issues/233,397839482,MDEyOklzc3VlQ29tbWVudDM5NzgzOTQ4Mg==,9599,simonw,2018-06-16T21:21:03Z,2018-06-16T21:21:03Z,OWNER,Should facets always have their labels expanded or should they also obey the `_labels` and `_label` querystring arguments?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397839583,https://api.github.com/repos/simonw/datasette/issues/233,397839583,MDEyOklzc3VlQ29tbWVudDM5NzgzOTU4Mw==,9599,simonw,2018-06-16T21:23:14Z,2018-06-16T21:23:44Z,OWNER,"I'm a bit torn on naming - choices are: * `?_labels=on` and `?_label=col1&_label=col2` * `?_expands=on` (or `?_expand_all=on`) and `?_expand=col1&_expand=col2`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/233#issuecomment-397840676,https://api.github.com/repos/simonw/datasette/issues/233,397840676,MDEyOklzc3VlQ29tbWVudDM5Nzg0MDY3Ng==,9599,simonw,2018-06-16T21:49:50Z,2018-06-16T21:49:50Z,OWNER,For the moment I'm going with `_labels=`.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/pull/311#issuecomment-397841968,https://api.github.com/repos/simonw/datasette/issues/311,397841968,MDEyOklzc3VlQ29tbWVudDM5Nzg0MTk2OA==,9599,simonw,2018-06-16T22:20:31Z,2018-06-16T22:20:31Z,OWNER,I merged this manually in ed631e690b81e34fcaeaba1f16c9166f1c505990,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",332998752,"?_labels=1 to expand foreign keys (in csv and json), refs #233", https://github.com/simonw/datasette/issues/233#issuecomment-397842194,https://api.github.com/repos/simonw/datasette/issues/233,397842194,MDEyOklzc3VlQ29tbWVudDM5Nzg0MjE5NA==,9599,simonw,2018-06-16T22:26:21Z,2018-06-16T22:26:21Z,OWNER,"Some demos: * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List - regular HTML view * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List?_labels=off - no labels * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.json?_labels=on - JSON with all labels * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.json?_label=qSpecies&_shape=array - JSON with specific labels in array shape * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.csv?_labels=on - CSV with all labels * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.csv?_label=qSpecies - CSV with specific labels","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316444720,Option to expose expanded foreign keys in JSON/CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397842246,https://api.github.com/repos/simonw/datasette/issues/266,397842246,MDEyOklzc3VlQ29tbWVudDM5Nzg0MjI0Ng==,9599,simonw,2018-06-16T22:27:59Z,2018-06-16T22:27:59Z,OWNER,"Two demos of the new functionality in #233 as it applies to CSV: * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.csv?_labels=on - CSV with all foreign key columns expanded * https://datasette-labels-demo.now.sh/sf-trees-02c8ef1/Street_Tree_List.csv?_label=qSpecies - CSV with specific columns expanded","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397842667,https://api.github.com/repos/simonw/datasette/issues/266,397842667,MDEyOklzc3VlQ29tbWVudDM5Nzg0MjY2Nw==,9599,simonw,2018-06-16T22:38:15Z,2018-06-18T05:55:11Z,OWNER,"Still todo: - [x] Streaming version - [ ] Tidy up the ""This data as ..."" UI - [x] Default .csv (and .json) links to use `?_labels=on` (only if at least one foreign key detected) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/313#issuecomment-397900434,https://api.github.com/repos/simonw/datasette/issues/313,397900434,MDEyOklzc3VlQ29tbWVudDM5NzkwMDQzNA==,9599,simonw,2018-06-17T19:23:23Z,2018-06-17T19:23:23Z,OWNER,This will require some relatively sophisticated Travis build steps. Useful docs: https://docs.travis-ci.com/user/build-stages/ - useful example: https://docs.travis-ci.com/user/build-stages/deploy-heroku/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333086005,Deploy demo of Datasette on every commit that passes tests, https://github.com/simonw/datasette/issues/313#issuecomment-397907987,https://api.github.com/repos/simonw/datasette/issues/313,397907987,MDEyOklzc3VlQ29tbWVudDM5NzkwNzk4Nw==,9599,simonw,2018-06-17T21:32:52Z,2018-06-17T21:32:52Z,OWNER,"This very nearly works... * https://latest.datasette.io/ * https://f0c1722.datasette.io/ But... https://f0c1722.datasette.io/-/versions isn't showing the correct note: ``` { ""datasette"": { ""version"": ""0.22.1"" } ... ``` There should be a `""note""` field there with the full commit hash.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333086005,Deploy demo of Datasette on every commit that passes tests, https://github.com/simonw/datasette/issues/313#issuecomment-397908093,https://api.github.com/repos/simonw/datasette/issues/313,397908093,MDEyOklzc3VlQ29tbWVudDM5NzkwODA5Mw==,9599,simonw,2018-06-17T21:34:52Z,2018-06-17T21:34:52Z,OWNER,"It looks like all of my test deploys ended up going to the same Zeit deployment ID: https://zeit.co/simonw/datasette-latest/rbmtcedvlj This is strange... the Dockerfile should be different for each one (due to the differing version-note). https://github.com/simonw/datasette/commit/db1e6bc182d11f333e6addaa1a6be87625a4e12b#diff-34418c57343344c73271e13b01b7fcd9R255","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333086005,Deploy demo of Datasette on every commit that passes tests, https://github.com/simonw/datasette/issues/313#issuecomment-397908185,https://api.github.com/repos/simonw/datasette/issues/313,397908185,MDEyOklzc3VlQ29tbWVudDM5NzkwODE4NQ==,9599,simonw,2018-06-17T21:36:50Z,2018-06-17T21:36:50Z,OWNER,"``` The command ""datasette publish now fixtures.db -m fixtures.json --token=$NOW_TOKEN --branch=$TRAVIS_COMMIT --version-note=$TRAVIS_COMMIT"" exited with 0. ``` Partial log of the ``datasette publish now`` output: ``` > Step 5/7 : RUN datasette inspect fixtures.db --inspect-file inspect-data.json > ---> Running in d373f330e53e > ---> 09bab386aaa3 > Removing intermediate container d373f330e53e > Step 6/7 : EXPOSE 8001 > ---> Running in e0fe37b3061c > ---> 47798440e214 > Removing intermediate container e0fe37b3061c > Step 7/7 : CMD datasette serve --host 0.0.0.0 fixtures.db --cors --port 8001 --inspect-file inspect-data.json --metadata metadata.json --version-note f0c17229b7a7914d3da02e087dfd0e25d8321448 ``` So it looks like `--version-note` is being correctly set there.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333086005,Deploy demo of Datasette on every commit that passes tests, https://github.com/simonw/datasette/issues/313#issuecomment-397908614,https://api.github.com/repos/simonw/datasette/issues/313,397908614,MDEyOklzc3VlQ29tbWVudDM5NzkwODYxNA==,9599,simonw,2018-06-17T21:44:51Z,2018-06-17T21:45:03Z,OWNER,"Aha! ```1.03s$ now alias --token=$NOW_TOKEN > Error! Couldn't find a deployment to alias. Please provide one as an argument. The command ""now alias --token=$NOW_TOKEN"" exited with 1. ``` That explains it. I need to set the same alias in my call to `datasette publish`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333086005,Deploy demo of Datasette on every commit that passes tests, https://github.com/simonw/datasette/issues/313#issuecomment-397908947,https://api.github.com/repos/simonw/datasette/issues/313,397908947,MDEyOklzc3VlQ29tbWVudDM5NzkwODk0Nw==,9599,simonw,2018-06-17T21:51:34Z,2018-06-17T21:51:34Z,OWNER,"That fixed it! https://958b75c.datasette.io/-/versions ``` { ""python"": { ""version"": ""3.6.5"", ""full"": ""3.6.5 (default, Jun 6 2018, 19:19:24) \n[GCC 6.3.0 20170516]"" }, ""datasette"": { ""version"": ""0+unknown"", ""note"": ""958b75c69841ef5913da86e0eb2df634a9b95fda"" }, ""sqlite"": { ""version"": ""3.16.2"", ""fts_versions"": [ ""FTS5"", ""FTS4"", ""FTS3"" ], ""extensions"": { ""json1"": null } } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333086005,Deploy demo of Datasette on every commit that passes tests, https://github.com/simonw/datasette/issues/266#issuecomment-397912840,https://api.github.com/repos/simonw/datasette/issues/266,397912840,MDEyOklzc3VlQ29tbWVudDM5NzkxMjg0MA==,9599,simonw,2018-06-17T23:13:35Z,2018-06-17T23:16:42Z,OWNER,"This worked! https://github.com/simonw/datasette/commit/5a0a82faf9cf9dd109d76181ed00eea19472087c - it spat out a 76MB CSV when I ran it against the sf-trees demo database. It was just a quick hack though - it currently ignores `_labels=` and `_dl=` which need to be supported. I'm going to add a config option for turning full CSV export off just in case any Datasette users are uncomfortable with URLs that churn out that much data in one go. ``` ConfigOption(""allow_csv_stream"", True, """""" Allow .csv?_stream=1 to download all rows (ignoring max_returned_rows) """""".strip()), ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397915258,https://api.github.com/repos/simonw/datasette/issues/266,397915258,MDEyOklzc3VlQ29tbWVudDM5NzkxNTI1OA==,9599,simonw,2018-06-18T00:01:05Z,2018-06-18T00:01:05Z,OWNER,Someone malicious could use a UNION to generate an unpleasantly large CSV response. I'll add another config setting which limits the response size to 100MB but can be turned off by setting it to 0.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397915403,https://api.github.com/repos/simonw/datasette/issues/266,397915403,MDEyOklzc3VlQ29tbWVudDM5NzkxNTQwMw==,9599,simonw,2018-06-18T00:03:17Z,2018-06-18T00:14:37Z,OWNER,"Since CSV streaming export doesn't work for custom SQL queries (since they don't support `_next=` pagination) there's no need to provide a option that disables streams just for custom SQL. Related: the UI should not show the option to download everything on custom SQL pages.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397916091,https://api.github.com/repos/simonw/datasette/issues/266,397916091,MDEyOklzc3VlQ29tbWVudDM5NzkxNjA5MQ==,9599,simonw,2018-06-18T00:13:43Z,2018-06-18T00:15:50Z,OWNER,I was also worried about the performance of pagination over custom `_sort` orders or views which use offset pagination - but Datasette's SQL time limits should prevent those from getting out of hand. This does mean that a streaming CSV file may be truncated with an error - if this happens we should ensure the error is written out as the last line of the CSV so anyone who tried to import it gets a relevant error message informing them that the export did not complete.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397916321,https://api.github.com/repos/simonw/datasette/issues/266,397916321,MDEyOklzc3VlQ29tbWVudDM5NzkxNjMyMQ==,9599,simonw,2018-06-18T00:17:44Z,2018-06-18T00:18:05Z,OWNER,The export UI could be a GET form controlling various parameters. This would discourage crawlers from hitting the export links and would also allow us to express the full range of export options.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397918264,https://api.github.com/repos/simonw/datasette/issues/266,397918264,MDEyOklzc3VlQ29tbWVudDM5NzkxODI2NA==,9599,simonw,2018-06-18T00:49:35Z,2018-06-18T00:49:35Z,OWNER,"Simpler design: the top of the page will link to basic .json and .csv and ""advanced"" - which will fragment link to an advanced export format the bottom of the page.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397923253,https://api.github.com/repos/simonw/datasette/issues/266,397923253,MDEyOklzc3VlQ29tbWVudDM5NzkyMzI1Mw==,9599,simonw,2018-06-18T01:49:52Z,2018-06-18T03:02:28Z,OWNER,Ideally the downloadable filenames of exported CSVs would differ across different querystring parameters. Maybe S`treet_Trees-56cbd54.csv` where `56cbd54` is a hash of the querystring?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397949002,https://api.github.com/repos/simonw/datasette/issues/266,397949002,MDEyOklzc3VlQ29tbWVudDM5Nzk0OTAwMg==,9599,simonw,2018-06-18T05:53:17Z,2018-06-18T05:53:17Z,OWNER,"Advanced export pane: ![2018-06-17 at 10 52 pm](https://user-images.githubusercontent.com/9599/41520166-3809a45a-7281-11e8-9dfa-2b10f4cb9672.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/266#issuecomment-397952129,https://api.github.com/repos/simonw/datasette/issues/266,397952129,MDEyOklzc3VlQ29tbWVudDM5Nzk1MjEyOQ==,9599,simonw,2018-06-18T06:15:36Z,2018-06-18T06:15:51Z,OWNER,Advanced export pane demo: https://latest.datasette.io/fixtures-35b6eb6/facetable?_size=4,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/316#issuecomment-398030903,https://api.github.com/repos/simonw/datasette/issues/316,398030903,MDEyOklzc3VlQ29tbWVudDM5ODAzMDkwMw==,132230,gavinband,2018-06-18T12:00:43Z,2018-06-18T12:00:43Z,NONE,"I should add that I'm using datasette version 0.22, Python 2.7.10 on Mac OS X. Happy to send more info if helpful.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333238932,datasette inspect takes a very long time on large dbs, https://github.com/simonw/datasette/issues/266#issuecomment-398098582,https://api.github.com/repos/simonw/datasette/issues/266,398098582,MDEyOklzc3VlQ29tbWVudDM5ODA5ODU4Mg==,9599,simonw,2018-06-18T15:40:32Z,2018-06-18T15:40:32Z,OWNER,This is now released in Datasette 0.23! http://datasette.readthedocs.io/en/latest/changelog.html#v0-23,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323681589,Export to CSV, https://github.com/simonw/datasette/issues/316#issuecomment-398101670,https://api.github.com/repos/simonw/datasette/issues/316,398101670,MDEyOklzc3VlQ29tbWVudDM5ODEwMTY3MA==,9599,simonw,2018-06-18T15:49:35Z,2018-06-18T15:50:38Z,OWNER,"Wow, I've gone as high as 7GB but I've never tried it against 600GB. `datasette inspect` is indeed expected to take a long time for large databases. That's why it's available as a separate command: by running `datasette inspect` to generate `inspect-data.json` you can execute it just once against a large database and then have `datasette serve` take advantage of that cached metadata (hence avoiding `datasette serve` hanging on startup). As you spotted, most of the time is spent in those counts. I imagine you don't need those row counts in order for the rest of Datasette to function correctly (they are mainly used for display purposes - on the https://latest.datasette.io/fixtures index page for example). If your database changes infrequently, for the moment I recommend running `datasette inspect` once to generate the `inspect-data.json` file (let me know how long it takes) and then passing that file to `datasette serve mydb.db --inspect-file=inspect-data.json` If your database DOES change frequently then this workaround won't help you much. Let me know and I'll see how much work it would take to have those row counts be optional rather than required.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333238932,datasette inspect takes a very long time on large dbs, https://github.com/simonw/datasette/issues/265#issuecomment-398102537,https://api.github.com/repos/simonw/datasette/issues/265,398102537,MDEyOklzc3VlQ29tbWVudDM5ODEwMjUzNw==,9599,simonw,2018-06-18T15:52:15Z,2018-06-18T15:52:15Z,OWNER,https://latest.datasette.io/ now always hosts the latest version of the code. I've started linking to it from our documentation.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323677499,Add links to example Datasette instances to appropiate places in docs, https://github.com/simonw/datasette/issues/316#issuecomment-398109204,https://api.github.com/repos/simonw/datasette/issues/316,398109204,MDEyOklzc3VlQ29tbWVudDM5ODEwOTIwNA==,132230,gavinband,2018-06-18T16:12:45Z,2018-06-18T16:12:45Z,NONE,"Hi Simon, Thanks for the response. Ok I'll try running `datasette inspect` up front. In principle the db won't change. However, the site's in development and it's likely I'll need to add views and some auxiliary (smaller) tables as I go along. I will need to be careful with this if it involves an inspect step in each iteration, though. g. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333238932,datasette inspect takes a very long time on large dbs, https://github.com/simonw/datasette/issues/316#issuecomment-398133159,https://api.github.com/repos/simonw/datasette/issues/316,398133159,MDEyOklzc3VlQ29tbWVudDM5ODEzMzE1OQ==,9599,simonw,2018-06-18T17:29:59Z,2018-07-10T15:14:53Z,OWNER,"For #271 I've been contemplating having Datasette work against an on-disk database that gets modified without needing to restart the server. For that to work, I'll have to dramatically change the inspect() mechanism. It may be that inspect becomes an optional optimization in the future.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333238932,datasette inspect takes a very long time on large dbs, https://github.com/simonw/datasette/issues/271#issuecomment-398133924,https://api.github.com/repos/simonw/datasette/issues/271,398133924,MDEyOklzc3VlQ29tbWVudDM5ODEzMzkyNA==,9599,simonw,2018-06-18T17:32:22Z,2018-06-18T17:32:22Z,OWNER,"As seen in #316 inspect is already taking a VERY long time to run against large (600GB) databases. To get this working I may have to make inspect an optional optimization and run introspection for columns and primary keys in demand. The one catch here is the `count(*)` queries - Datasette may need to learn not to return full table counts in circumstances where the count has not been pre-calculates and takes more than Xms to generate.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324162476,Mechanism for automatically picking up changes when on-disk .db file changes, https://github.com/simonw/datasette/issues/188#issuecomment-398778485,https://api.github.com/repos/simonw/datasette/issues/188,398778485,MDEyOklzc3VlQ29tbWVudDM5ODc3ODQ4NQ==,12617395,bsilverm,2018-06-20T14:48:39Z,2018-06-20T14:48:39Z,NONE,This would be a great feature to have!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309047460,Ability to bundle metadata and templates inside the SQLite file, https://github.com/simonw/datasette/issues/302#issuecomment-398825294,https://api.github.com/repos/simonw/datasette/issues/302,398825294,MDEyOklzc3VlQ29tbWVudDM5ODgyNTI5NA==,9599,simonw,2018-06-20T17:06:36Z,2018-06-20T17:06:36Z,OWNER,Still a bug in 0.23,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328171513,test-2.3.sqlite database filename throws a 404, https://github.com/simonw/datasette/issues/215#issuecomment-398826108,https://api.github.com/repos/simonw/datasette/issues/215,398826108,MDEyOklzc3VlQ29tbWVudDM5ODgyNjEwOA==,9599,simonw,2018-06-20T17:09:18Z,2020-06-06T21:46:51Z,OWNER,This depends on #272 - Datasette ported to ASGI.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314506669,Allow plugins to define additional URL routes and views, https://github.com/simonw/datasette/issues/321#issuecomment-398973176,https://api.github.com/repos/simonw/datasette/issues/321,398973176,MDEyOklzc3VlQ29tbWVudDM5ODk3MzE3Ng==,9599,simonw,2018-06-21T04:34:11Z,2018-06-21T16:53:57Z,OWNER,"This is a little bit fiddly, but it's possible to do it using SQLite string concatenation. Here's an example: ``` select * from facetable where neighborhood like ""%"" || :text || ""%""; ``` Try it here: https://latest.datasette.io/fixtures-35b6eb6?sql=select+*+from+facetable+where+neighborhood+like+%22%25%22+%7C%7C+%3Atext+%7C%7C+%22%25%22%3B&text=town ![2018-06-20 at 9 33 pm](https://user-images.githubusercontent.com/9599/41698185-a52143f2-74d1-11e8-8d16-32bfc4542104.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334190959,Wildcard support in query parameters, https://github.com/simonw/datasette/issues/318#issuecomment-398973309,https://api.github.com/repos/simonw/datasette/issues/318,398973309,MDEyOklzc3VlQ29tbWVudDM5ODk3MzMwOQ==,9599,simonw,2018-06-21T04:35:12Z,2018-06-21T04:37:37Z,OWNER,"Demo of fix: the `on_earth` facet on https://latest.datasette.io/fixtures-cafd088/facetable?_facet=planet_int&_facet=on_earth&_facet=city_id ![2018-06-20 at 9 35 pm](https://user-images.githubusercontent.com/9599/41698208-ebb6b72a-74d1-11e8-9d85-de7600177f69.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334148669,Facets with value of 0 displayed incorrectly, https://github.com/simonw/datasette/issues/321#issuecomment-398976488,https://api.github.com/repos/simonw/datasette/issues/321,398976488,MDEyOklzc3VlQ29tbWVudDM5ODk3NjQ4OA==,9599,simonw,2018-06-21T04:59:33Z,2018-06-21T06:11:02Z,OWNER,"I've added this to the unit tests and the documentation. Docs: http://datasette.readthedocs.io/en/latest/sql_queries.html#canned-queries Canned query demo: https://latest.datasette.io/fixtures/neighborhood_search?text=town New unit test: https://github.com/simonw/datasette/blob/3683a6b626b2e79f4dc9600d45853ca4ae8de11a/tests/test_api.py#L333-L344 https://github.com/simonw/datasette/blob/3683a6b626b2e79f4dc9600d45853ca4ae8de11a/tests/fixtures.py#L145-L153","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334190959,Wildcard support in query parameters, https://github.com/simonw/datasette/issues/321#issuecomment-399098080,https://api.github.com/repos/simonw/datasette/issues/321,399098080,MDEyOklzc3VlQ29tbWVudDM5OTA5ODA4MA==,12617395,bsilverm,2018-06-21T13:10:48Z,2018-06-21T13:10:48Z,NONE,"Perfect, thank you!!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334190959,Wildcard support in query parameters, https://github.com/simonw/datasette/issues/321#issuecomment-399106871,https://api.github.com/repos/simonw/datasette/issues/321,399106871,MDEyOklzc3VlQ29tbWVudDM5OTEwNjg3MQ==,12617395,bsilverm,2018-06-21T13:39:37Z,2018-06-21T13:39:37Z,NONE,"One thing I've noticed with this approach is that the query is executed with no parameters which I do not believe was the case previously. In the case the table contains a lot of data, this adds some time executing the query before the user can enter their input and run it with the parameters they want.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334190959,Wildcard support in query parameters, https://github.com/simonw/datasette/issues/321#issuecomment-399126228,https://api.github.com/repos/simonw/datasette/issues/321,399126228,MDEyOklzc3VlQ29tbWVudDM5OTEyNjIyOA==,9599,simonw,2018-06-21T14:36:40Z,2018-06-21T14:36:53Z,OWNER,"This seems to fix that: ``` select neighborhood, facet_cities.name, state from facetable join facet_cities on facetable.city_id = facet_cities.id where :text != '' and neighborhood like '%' || :text || '%' order by neighborhood; ``` Compare this (with empty string): https://latest.datasette.io/fixtures-cafd088?sql=select+neighborhood%2C+facet_cities.name%2C+state%0D%0Afrom+facetable%0D%0A++++join+facet_cities+on+facetable.city_id+%3D+facet_cities.id%0D%0Awhere+%3Atext+%21%3D+%22%22+and+neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0D%0Aorder+by+neighborhood%3B To this: https://latest.datasette.io/fixtures-cafd088?sql=select+neighborhood%2C+facet_cities.name%2C+state%0D%0Afrom+facetable%0D%0A++++join+facet_cities+on+facetable.city_id+%3D+facet_cities.id%0D%0Awhere+%3Atext+%21%3D+%22%22+and+neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0D%0Aorder+by+neighborhood%3B&text=town","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334190959,Wildcard support in query parameters, https://github.com/simonw/datasette/issues/321#issuecomment-399129220,https://api.github.com/repos/simonw/datasette/issues/321,399129220,MDEyOklzc3VlQ29tbWVudDM5OTEyOTIyMA==,12617395,bsilverm,2018-06-21T14:45:02Z,2018-06-21T14:45:02Z,NONE,Those queries look identical. How can this be prevented if the queries are in a metadata.json file?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334190959,Wildcard support in query parameters, https://github.com/simonw/datasette/issues/309#issuecomment-399134680,https://api.github.com/repos/simonw/datasette/issues/309,399134680,MDEyOklzc3VlQ29tbWVudDM5OTEzNDY4MA==,9599,simonw,2018-06-21T14:59:57Z,2018-06-21T14:59:57Z,OWNER,I can use Sanic middleware for this: http://sanic.readthedocs.io/en/latest/sanic/middleware.html#responding-early,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",331343824,On 404s with a trailing slash redirect to that page without a trailing slash, https://github.com/simonw/datasette/issues/319#issuecomment-399139462,https://api.github.com/repos/simonw/datasette/issues/319,399139462,MDEyOklzc3VlQ29tbWVudDM5OTEzOTQ2Mg==,9599,simonw,2018-06-21T15:13:58Z,2018-06-21T15:13:58Z,OWNER,"Demo of fix: https://latest.datasette.io/fixtures-e14e080/searchable_tags ![2018-06-21 at 8 13 am](https://user-images.githubusercontent.com/9599/41728203-0b571e9a-752b-11e8-9702-9887e3ede5bc.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334149717,Incorrect display of compound primary keys with foreign key relationships, https://github.com/simonw/datasette/issues/309#issuecomment-399142274,https://api.github.com/repos/simonw/datasette/issues/309,399142274,MDEyOklzc3VlQ29tbWVudDM5OTE0MjI3NA==,9599,simonw,2018-06-21T15:22:02Z,2018-06-21T15:22:02Z,OWNER,Demo: https://latest.datasette.io/fixtures-e14e080/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",331343824,On 404s with a trailing slash redirect to that page without a trailing slash, https://github.com/simonw/datasette/issues/317#issuecomment-399144688,https://api.github.com/repos/simonw/datasette/issues/317,399144688,MDEyOklzc3VlQ29tbWVudDM5OTE0NDY4OA==,9599,simonw,2018-06-21T15:29:06Z,2018-06-21T15:29:16Z,OWNER,"From https://docs.travis-ci.com/user/deployment/pypi/ > Note that if your PyPI password contains special characters you need to escape them before encrypting your password. Some people have [reported difficulties](https://github.com/travis-ci/dpl/issues/377) connecting to PyPI with passwords containing anything except alphanumeric characters. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333326107,Travis CI fails to upload new releases to PyPI, https://github.com/simonw/datasette/issues/317#issuecomment-399150285,https://api.github.com/repos/simonw/datasette/issues/317,399150285,MDEyOklzc3VlQ29tbWVudDM5OTE1MDI4NQ==,9599,simonw,2018-06-21T15:45:47Z,2018-06-21T15:45:47Z,OWNER,That fixed it! https://travis-ci.org/simonw/datasette/jobs/395078407 ran successfully and https://pypi.org/project/datasette/ now hosts Datasette 0.23.1 deployed via Travis.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333326107,Travis CI fails to upload new releases to PyPI, https://github.com/simonw/datasette/issues/319#issuecomment-399154550,https://api.github.com/repos/simonw/datasette/issues/319,399154550,MDEyOklzc3VlQ29tbWVudDM5OTE1NDU1MA==,9599,simonw,2018-06-21T15:58:15Z,2018-06-21T15:58:15Z,OWNER,Fixed here too now: https://registry.datasette.io/registry-c10707b/datasette_tags,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334149717,Incorrect display of compound primary keys with foreign key relationships, https://github.com/simonw/datasette/issues/314#issuecomment-399156960,https://api.github.com/repos/simonw/datasette/issues/314,399156960,MDEyOklzc3VlQ29tbWVudDM5OTE1Njk2MA==,9599,simonw,2018-06-21T16:04:59Z,2018-06-21T16:04:59Z,OWNER,"Demo of fix: https://latest.datasette.io/fixtures-e14e080/simple_view ![2018-06-21 at 9 04 am](https://user-images.githubusercontent.com/9599/41731021-2be526aa-7532-11e8-9c3b-f787f918328e.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333096176,HTML table does not correctly display entirely blank rows, https://github.com/simonw/datasette/issues/259#issuecomment-399157944,https://api.github.com/repos/simonw/datasette/issues/259,399157944,MDEyOklzc3VlQ29tbWVudDM5OTE1Nzk0NA==,9599,simonw,2018-06-21T16:07:49Z,2018-06-21T16:07:49Z,OWNER,Thanks to #319 the test suite now includes a m2m table: https://latest.datasette.io/fixtures-e14e080/searchable_tags,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322787470,inspect() should detect many-to-many relationships, https://github.com/simonw/datasette/issues/321#issuecomment-399171239,https://api.github.com/repos/simonw/datasette/issues/321,399171239,MDEyOklzc3VlQ29tbWVudDM5OTE3MTIzOQ==,9599,simonw,2018-06-21T16:51:08Z,2018-06-21T16:51:29Z,OWNER,"I may have misunderstood your problem here. I understood that the problem is that when using the `""%"" || :text || ""%""` construct the first hit to that page (with an empty string for `:text`) results in a `where neighborhood like ""%%""` query which is slow because it matches every row in the database. My fix was to add this to the where clause: where :text != '' and ... Which means that when you first load the page the where fails to match any rows and you get no results (and hopefully instant loading times assuming SQLite is smart enough to optimize this away). That's why you don't see any rows returned on this page: https://latest.datasette.io/fixtures-cafd088?sql=select+neighborhood%2C+facet_cities.name%2C+state%0D%0Afrom+facetable%0D%0A++++join+facet_cities+on+facetable.city_id+%3D+facet_cities.id%0D%0Awhere+%3Atext+%21%3D+%22%22+and+neighborhood+like+%27%25%27+%7C%7C+%3Atext+%7C%7C+%27%25%27%0D%0Aorder+by+neighborhood%3B","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334190959,Wildcard support in query parameters, https://github.com/simonw/datasette/issues/321#issuecomment-399173916,https://api.github.com/repos/simonw/datasette/issues/321,399173916,MDEyOklzc3VlQ29tbWVudDM5OTE3MzkxNg==,12617395,bsilverm,2018-06-21T17:00:10Z,2018-06-21T17:00:10Z,NONE,"Oh I see.. My issue is that the query executes with an empty string prior to the user submitting the parameters. I'll try adding your workaround to some of my queries. Thanks again,","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334190959,Wildcard support in query parameters, https://github.com/simonw/datasette/issues/326#issuecomment-399721346,https://api.github.com/repos/simonw/datasette/issues/326,399721346,MDEyOklzc3VlQ29tbWVudDM5OTcyMTM0Ng==,9599,simonw,2018-06-24T01:10:26Z,2018-06-24T01:10:26Z,OWNER,"Demo: go to https://vega.github.io/editor/ and paste in the following: ``` { ""data"": { ""url"": ""https://fivethirtyeight.datasettes.com/fivethirtyeight/twitter-ratio%2Fsenators.csv?_size=max&_sort_desc=replies"", ""format"": { ""type"": ""csv"" } }, ""mark"": ""bar"", ""encoding"": { ""x"": { ""field"": ""created_at"", ""type"": ""temporal"" }, ""y"": { ""field"": ""replies"", ""type"": ""quantitative"" }, ""color"": { ""field"": ""user"", ""type"": ""nominal"" } } } ``` ![2018-06-23 at 6 10 pm](https://user-images.githubusercontent.com/9599/41814923-b1613370-7710-11e8-94ac-5b87b0b629ed.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",335141434,CSV should respect --cors and return cors headers, https://github.com/simonw/datasette/issues/272#issuecomment-400166540,https://api.github.com/repos/simonw/datasette/issues/272,400166540,MDEyOklzc3VlQ29tbWVudDQwMDE2NjU0MA==,9599,simonw,2018-06-26T03:29:43Z,2018-06-26T03:29:43Z,OWNER,This looks VERY relevant: https://github.com/encode/starlette,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-400571521,https://api.github.com/repos/simonw/datasette/issues/272,400571521,MDEyOklzc3VlQ29tbWVudDQwMDU3MTUyMQ==,647359,tomchristie,2018-06-27T07:30:07Z,2018-06-27T07:30:07Z,NONE,"I’m up for helping with this. Looks like you’d need static files support, which I’m planning on adding a component for. Anything else obviously missing? For a quick overview it looks very doable - the test client ought to me your test cases stay roughly the same. Are you using any middleware or other components for the Sanic ecosystem? Do you use cookies or sessions at all?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/328#issuecomment-400903687,https://api.github.com/repos/simonw/datasette/issues/328,400903687,MDEyOklzc3VlQ29tbWVudDQwMDkwMzY4Nw==,9599,simonw,2018-06-28T04:00:18Z,2018-06-28T04:00:18Z,OWNER,Need to ship docker image: #57 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336464733,"Installation instructions, including how to use the docker image", https://github.com/simonw/datasette/issues/57#issuecomment-400903871,https://api.github.com/repos/simonw/datasette/issues/57,400903871,MDEyOklzc3VlQ29tbWVudDQwMDkwMzg3MQ==,9599,simonw,2018-06-28T04:01:38Z,2018-06-28T04:01:38Z,OWNER,"Shipped to Docker Hub: https://hub.docker.com/r/datasetteproject/datasette/ I did this manually the first time. I'll set Travis up to do this automatically in #329","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273127694,Ship a Docker image of the whole thing, https://github.com/simonw/datasette/issues/328#issuecomment-400904514,https://api.github.com/repos/simonw/datasette/issues/328,400904514,MDEyOklzc3VlQ29tbWVudDQwMDkwNDUxNA==,9599,simonw,2018-06-28T04:06:39Z,2018-06-28T04:06:39Z,OWNER,https://datasette.readthedocs.io/en/latest/installation.html#using-docker,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336464733,"Installation instructions, including how to use the docker image", https://github.com/simonw/datasette/pull/280#issuecomment-401003061,https://api.github.com/repos/simonw/datasette/issues/280,401003061,MDEyOklzc3VlQ29tbWVudDQwMTAwMzA2MQ==,9599,simonw,2018-06-28T11:26:23Z,2018-06-28T11:26:23Z,OWNER,I pushed this to Docker Hub https://hub.docker.com/r/datasetteproject/datasette/ and added notes on how to use it to the documentation: http://datasette.readthedocs.io/en/latest/installation.html#using-docker,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",325373747,Build Dockerfile with recent Sqlite + Spatialite, https://github.com/simonw/datasette/issues/276#issuecomment-401310732,https://api.github.com/repos/simonw/datasette/issues/276,401310732,MDEyOklzc3VlQ29tbWVudDQwMTMxMDczMg==,82988,psychemedia,2018-06-29T10:05:04Z,2018-06-29T10:07:25Z,CONTRIBUTOR,"@russs Different map projections can presumably be handled on the client side using a leaflet plugin to transform the geometry (eg [kartena/Proj4Leaflet](https://kartena.github.io/Proj4Leaflet/)) although the leaflet side would need to detect or be informed of the original projection? Another possibility would be to provide an easy way/guidance for users to create an FK'd table containing the WGS84 projection of a non-WGS84 geometry in the original/principle table? This could then as a proxy for serving GeoJSON to the leaflet map?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/276#issuecomment-401312981,https://api.github.com/repos/simonw/datasette/issues/276,401312981,MDEyOklzc3VlQ29tbWVudDQwMTMxMjk4MQ==,45057,russss,2018-06-29T10:14:54Z,2018-06-29T10:14:54Z,CONTRIBUTOR,"> @RusSs Different map projections can presumably be handled on the client side using a leaflet plugin to transform the geometry (eg kartena/Proj4Leaflet) although the leaflet side would need to detect or be informed of the original projection? Well, as @simonw mentioned, GeoJSON only supports WGS84, and GeoJSON (and/or TopoJSON) is the standard we probably want to aim for. On-the-fly reprojection in spatialite is not an issue anyway, and in general I think you want to be serving stuff to web maps in WGS84 or Web Mercator.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324835838,Handle spatialite geometry columns better, https://github.com/simonw/datasette/issues/332#issuecomment-401477622,https://api.github.com/repos/simonw/datasette/issues/332,401477622,MDEyOklzc3VlQ29tbWVudDQwMTQ3NzYyMg==,9599,simonw,2018-06-29T21:23:17Z,2018-06-29T21:23:55Z,OWNER,"https://docs.python.org/3/library/json.html#json.dump > **json.dump**(obj, fp, *, skipkeys=False, ensure_ascii=True, check_circular=True, allow_nan=True, cls=None, indent=None, separators=None, default=None, sort_keys=False, **kw)¶ > If `allow_nan` is false (default: True), then it will be a ValueError to serialize out of range float values (nan, inf, -inf) in strict compliance of the JSON specification. If allow_nan is true, their JavaScript equivalents (NaN, Infinity, -Infinity) will be used.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/332#issuecomment-401478223,https://api.github.com/repos/simonw/datasette/issues/332,401478223,MDEyOklzc3VlQ29tbWVudDQwMTQ3ODIyMw==,9599,simonw,2018-06-29T21:26:12Z,2018-06-29T21:26:19Z,OWNER,"I'm not sure what the correct thing to do here is. I don't want to throw a `ValueError` when trying to render that data as JSON, but I also want to produce JSON that doesn't break when fetched by JavaScript.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/332#issuecomment-402243153,https://api.github.com/repos/simonw/datasette/issues/332,402243153,MDEyOklzc3VlQ29tbWVudDQwMjI0MzE1Mw==,9599,simonw,2018-07-03T17:58:50Z,2018-07-12T16:10:39Z,OWNER,"I think I'm going to return `null` in the JSON for infinity/nan values by default, but if you send `_nan=1` I will instead return invalid JSON with `Infinity` or `NaN` in it (since you have opted in to getting those and hence should be able to handle them).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/325#issuecomment-403263890,https://api.github.com/repos/simonw/datasette/issues/325,403263890,MDEyOklzc3VlQ29tbWVudDQwMzI2Mzg5MA==,9599,simonw,2018-07-08T05:35:20Z,2018-07-09T17:28:27Z,OWNER,Fixed: https://v0-23-2.datasette.io/fixtures-e14e080/table%2Fwith%2Fslashes.csv / https://v0-23-2.datasette.io/fixtures-e14e080/table%2Fwith%2Fslashes.csv/3,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",335064777,Error on row page if table has slashes in the name and ends in .csv, https://github.com/simonw/datasette/issues/334#issuecomment-403526263,https://api.github.com/repos/simonw/datasette/issues/334,403526263,MDEyOklzc3VlQ29tbWVudDQwMzUyNjI2Mw==,9599,simonw,2018-07-09T15:49:01Z,2018-07-09T15:49:01Z,OWNER,Yup that's definitely a bug.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",339095976,extra_options not passed to heroku publisher, https://github.com/simonw/datasette/issues/334#issuecomment-403672561,https://api.github.com/repos/simonw/datasette/issues/334,403672561,MDEyOklzc3VlQ29tbWVudDQwMzY3MjU2MQ==,9599,simonw,2018-07-10T01:45:28Z,2018-07-10T01:45:28Z,OWNER,"Tested with `datasette publish heroku fixtures.db --extra-options=""--config sql_time_limit_ms:4000""` https://blooming-anchorage-31561.herokuapp.com/-/config","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",339095976,extra_options not passed to heroku publisher, https://github.com/simonw/datasette/issues/323#issuecomment-403855639,https://api.github.com/repos/simonw/datasette/issues/323,403855639,MDEyOklzc3VlQ29tbWVudDQwMzg1NTYzOQ==,9599,simonw,2018-07-10T15:03:36Z,2018-07-10T15:03:36Z,OWNER,I'm satisified with the improvement we got from the pip wheel cache.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334698969,Speed up Travis CI builds, https://github.com/simonw/datasette/issues/330#issuecomment-403855963,https://api.github.com/repos/simonw/datasette/issues/330,403855963,MDEyOklzc3VlQ29tbWVudDQwMzg1NTk2Mw==,9599,simonw,2018-07-10T15:04:31Z,2018-07-10T15:04:31Z,OWNER,This relates to #276 - I'm definitely convinced now that displaying a giant `b'...'` blob on the page is not a useful default.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336924199,Limit text display in cells containing large amounts of text, https://github.com/simonw/datasette/issues/331#issuecomment-403856114,https://api.github.com/repos/simonw/datasette/issues/331,403856114,MDEyOklzc3VlQ29tbWVudDQwMzg1NjExNA==,9599,simonw,2018-07-10T15:04:56Z,2018-07-10T15:04:56Z,OWNER,Great idea.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336936010,Datasette throws error when loading spatialite db without extension loaded, https://github.com/simonw/datasette/issues/331#issuecomment-403858949,https://api.github.com/repos/simonw/datasette/issues/331,403858949,MDEyOklzc3VlQ29tbWVudDQwMzg1ODk0OQ==,9599,simonw,2018-07-10T15:12:53Z,2018-07-10T15:13:04Z,OWNER,"``` $ datasette airports.sqlite Serve! files=('airports.sqlite',) on port 8001 Usage: datasette airports.sqlite [OPTIONS] [FILES]... Error: It looks like you're trying to load a SpatiaLite database without first loading the SpatiaLite module. Read more: https://datasette.readthedocs.io/en/latest/spatialite.html ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336936010,Datasette throws error when loading spatialite db without extension loaded, https://github.com/simonw/datasette/issues/335#issuecomment-403863927,https://api.github.com/repos/simonw/datasette/issues/335,403863927,MDEyOklzc3VlQ29tbWVudDQwMzg2MzkyNw==,9599,simonw,2018-07-10T15:26:27Z,2018-07-10T15:29:54Z,OWNER,Here are some useful examples of other Python apps that have been packaged using the recipe described above: https://github.com/Homebrew/homebrew-core/search?utf8=%E2%9C%93&q=virtualenv_install_with_resources&type=,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",339505204,Package datasette for installation using homebrew, https://github.com/simonw/datasette/issues/335#issuecomment-403865063,https://api.github.com/repos/simonw/datasette/issues/335,403865063,MDEyOklzc3VlQ29tbWVudDQwMzg2NTA2Mw==,9599,simonw,2018-07-10T15:29:32Z,2018-07-10T15:29:32Z,OWNER,"Huh... from https://docs.brew.sh/Acceptable-Formulae > We frown on authors submitting their own work unless it is very popular. Marking this one as ""help wanted"" :)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",339505204,Package datasette for installation using homebrew, https://github.com/simonw/datasette/issues/335#issuecomment-403866099,https://api.github.com/repos/simonw/datasette/issues/335,403866099,MDEyOklzc3VlQ29tbWVudDQwMzg2NjA5OQ==,9599,simonw,2018-07-10T15:32:14Z,2018-07-10T15:32:14Z,OWNER,"I can host a custom tap without needing to get anything accepted into homebrew-core: https://docs.brew.sh/How-to-Create-and-Maintain-a-Tap Since my principle goal here is ensuring an easy installation path for people who are familiar with `brew` but don't know how to use pip and Python 3 that could be a good option.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",339505204,Package datasette for installation using homebrew, https://github.com/simonw/datasette/issues/330#issuecomment-403868584,https://api.github.com/repos/simonw/datasette/issues/330,403868584,MDEyOklzc3VlQ29tbWVudDQwMzg2ODU4NA==,9599,simonw,2018-07-10T15:39:12Z,2018-07-10T16:21:08Z,OWNER,"I think this makes sense for the HTML view (not for JSON or CSV). It could be controlled be a new [config option](http://datasette.readthedocs.io/en/latest/config.html), `truncate_cells_html` - which is on by default but can be turned off.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336924199,Limit text display in cells containing large amounts of text, https://github.com/simonw/datasette/issues/330#issuecomment-403906747,https://api.github.com/repos/simonw/datasette/issues/330,403906747,MDEyOklzc3VlQ29tbWVudDQwMzkwNjc0Nw==,9599,simonw,2018-07-10T17:39:46Z,2018-07-10T17:39:46Z,OWNER,"``` datasette publish now timezones.db --spatialite \ --extra-options=""--config truncate_cells_html:200"" \ --name=datasette-issue-330-demo \ --branch=master ``` https://datasette-issue-330-demo-sbelwxttfn.now.sh/timezones-3cb9f64/timezones ![2018-07-10 at 10 39 am](https://user-images.githubusercontent.com/9599/42527428-7eabc6c8-842d-11e8-91ac-5666dbc5872c.png) But https://datasette-issue-330-demo-sbelwxttfn.now.sh/timezones-3cb9f64/timezones/1 displays the full blob.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336924199,Limit text display in cells containing large amounts of text, https://github.com/simonw/datasette/issues/330#issuecomment-403907193,https://api.github.com/repos/simonw/datasette/issues/330,403907193,MDEyOklzc3VlQ29tbWVudDQwMzkwNzE5Mw==,9599,simonw,2018-07-10T17:41:14Z,2018-07-10T17:41:14Z,OWNER,Documentation: http://datasette.readthedocs.io/en/latest/config.html#truncate-cells-html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336924199,Limit text display in cells containing large amounts of text, https://github.com/simonw/datasette/issues/191#issuecomment-403908704,https://api.github.com/repos/simonw/datasette/issues/191,403908704,MDEyOklzc3VlQ29tbWVudDQwMzkwODcwNA==,9599,simonw,2018-07-10T17:46:13Z,2018-07-10T17:46:13Z,OWNER,I consider this resolved by #46 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",310533258,Figure out how to bundle a more up-to-date SQLite, https://github.com/simonw/datasette/issues/139#issuecomment-403909389,https://api.github.com/repos/simonw/datasette/issues/139,403909389,MDEyOklzc3VlQ29tbWVudDQwMzkwOTM4OQ==,9599,simonw,2018-07-10T17:48:18Z,2018-07-10T17:48:18Z,OWNER,This is done! https://github.com/simonw/datasette-vega,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275493851,Build a visualization plugin for Vega, https://github.com/simonw/datasette/issues/143#issuecomment-403909469,https://api.github.com/repos/simonw/datasette/issues/143,403909469,MDEyOklzc3VlQ29tbWVudDQwMzkwOTQ2OQ==,9599,simonw,2018-07-10T17:48:34Z,2018-07-10T17:48:34Z,OWNER,This is now a dupe of https://github.com/simonw/datasette-vega/issues/4,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275939188,"Mechanism for ""suggested visualizations""", https://github.com/simonw/datasette/issues/87#issuecomment-403909671,https://api.github.com/repos/simonw/datasette/issues/87,403909671,MDEyOklzc3VlQ29tbWVudDQwMzkwOTY3MQ==,9599,simonw,2018-07-10T17:49:12Z,2018-07-10T17:49:12Z,OWNER,This was fixed by https://github.com/simonw/datasette/commit/6a32684ebba89dfe882e1147b23aa8778479f5d8#diff-354f30a63fb0907d4ad57269548329e3,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273709194,Configure Travis to release new tags to PyPI, https://github.com/simonw/datasette/issues/140#issuecomment-403910318,https://api.github.com/repos/simonw/datasette/issues/140,403910318,MDEyOklzc3VlQ29tbWVudDQwMzkxMDMxOA==,9599,simonw,2018-07-10T17:51:11Z,2018-07-10T17:51:11Z,OWNER,This would be a nice example plugin to demonstrate plugin configuration options in #231,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275755475,Heatmap visualization plugin, https://github.com/simonw/datasette/issues/27#issuecomment-403910774,https://api.github.com/repos/simonw/datasette/issues/27,403910774,MDEyOklzc3VlQ29tbWVudDQwMzkxMDc3NA==,9599,simonw,2018-07-10T17:52:41Z,2018-07-10T17:52:41Z,OWNER,I consider this handled by https://github.com/simonw/datasette-vega,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267886330,Ability to plot a simple graph, https://github.com/simonw/datasette/issues/140#issuecomment-403939399,https://api.github.com/repos/simonw/datasette/issues/140,403939399,MDEyOklzc3VlQ29tbWVudDQwMzkzOTM5OQ==,9599,simonw,2018-07-10T19:30:17Z,2018-07-10T19:30:41Z,OWNER,Building this using Svelte would also produce a neat example of a plugin that uses Svelte: https://svelte.technology/guide - and if I like it I might part datasette-vega to it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275755475,Heatmap visualization plugin, https://github.com/simonw/datasette/issues/272#issuecomment-403959704,https://api.github.com/repos/simonw/datasette/issues/272,403959704,MDEyOklzc3VlQ29tbWVudDQwMzk1OTcwNA==,9599,simonw,2018-07-10T20:44:47Z,2018-07-10T20:44:47Z,OWNER,"No cookies or sessions - no POST requests in fact, Datasette just cares about GET (path and querystring) and being able to return custom HTTP headers.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/336#issuecomment-403996143,https://api.github.com/repos/simonw/datasette/issues/336,403996143,MDEyOklzc3VlQ29tbWVudDQwMzk5NjE0Mw==,9599,simonw,2018-07-10T23:21:27Z,2018-07-10T23:21:27Z,OWNER,Easiest way to do this I think would be to make those help blocks separate files in the docs/ directory (publish-help.txt perhaps) and then include them with a sphinx directive: https://reinout.vanrees.org/weblog/2010/12/08/include-external-in-sphinx.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",340039409,Ensure --help examples in docs are always up to date, https://github.com/simonw/datasette/issues/337#issuecomment-404021589,https://api.github.com/repos/simonw/datasette/issues/337,404021589,MDEyOklzc3VlQ29tbWVudDQwNDAyMTU4OQ==,9599,simonw,2018-07-11T02:07:32Z,2018-07-11T02:07:32Z,OWNER,http://datasette.readthedocs.io/en/latest/publish.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",340065374,Documentation for datasette publish and datasette package, https://github.com/simonw/datasette/issues/336#issuecomment-404021890,https://api.github.com/repos/simonw/datasette/issues/336,404021890,MDEyOklzc3VlQ29tbWVudDQwNDAyMTg5MA==,9599,simonw,2018-07-11T02:09:25Z,2018-07-11T02:09:25Z,OWNER,"I decided against the unit tests, instead I have a new script called `./update-docs-help.sh` which I can run any time I want to refresh the included documentation: https://github.com/simonw/datasette/commit/aec3ae53237e43b0c268dbf9b58fa265ef38cfe1#diff-cb15a1e5a244bb82ad4afce67f252543","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",340039409,Ensure --help examples in docs are always up to date, https://github.com/simonw/datasette/issues/335#issuecomment-404208602,https://api.github.com/repos/simonw/datasette/issues/335,404208602,MDEyOklzc3VlQ29tbWVudDQwNDIwODYwMg==,9599,simonw,2018-07-11T15:20:12Z,2018-07-11T15:20:12Z,OWNER,Here's a good example of a homebrew tap: https://github.com/saulpw/homebrew-vd,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",339505204,Package datasette for installation using homebrew, https://github.com/simonw/datasette/issues/338#issuecomment-404209205,https://api.github.com/repos/simonw/datasette/issues/338,404209205,MDEyOklzc3VlQ29tbWVudDQwNDIwOTIwNQ==,9599,simonw,2018-07-11T15:21:47Z,2018-07-11T15:21:47Z,OWNER,"Oops, opened this in the wrong repo - moved it here: https://github.com/simonw/datasette-vega/issues/13","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",340282796,Only load vegaEmbed if charting tools are enabled, https://github.com/simonw/datasette/issues/339#issuecomment-404338345,https://api.github.com/repos/simonw/datasette/issues/339,404338345,MDEyOklzc3VlQ29tbWVudDQwNDMzODM0NQ==,9599,simonw,2018-07-11T23:09:24Z,2018-07-11T23:09:24Z,OWNER,"It sounds like you're running into the Sanic default response timeout value of 60 seconds: https://github.com/channelcat/sanic/blob/master/docs/sanic/config.md#builtin-configuration-values For the moment you can over-ride that using an environment variable like this: SANIC_RESPONSE_TIMEOUT=6000 datasette fivethirtyeight.db -p 8008 --config sql_time_limit_ms:600000","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",340396247,Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way, https://github.com/simonw/datasette/issues/272#issuecomment-404514973,https://api.github.com/repos/simonw/datasette/issues/272,404514973,MDEyOklzc3VlQ29tbWVudDQwNDUxNDk3Mw==,647359,tomchristie,2018-07-12T13:38:24Z,2018-07-12T13:38:24Z,NONE,"Okay. I reckon the latest version should have all the kinds of components you'd need: Recently added ASGI components for Routing and Static Files support, as well as making few tweaks to make sure requests and responses are instantiated efficiently. Don't have any redirect-to-slash / redirect-to-non-slash stuff out of the box yet, which it looks like you might miss.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/339#issuecomment-404565566,https://api.github.com/repos/simonw/datasette/issues/339,404565566,MDEyOklzc3VlQ29tbWVudDQwNDU2NTU2Ng==,9599,simonw,2018-07-12T16:08:42Z,2018-07-12T16:08:42Z,OWNER,I'm going to turn this into an issue about better supporting the above option.,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",340396247,Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way, https://github.com/simonw/datasette/issues/332#issuecomment-404567587,https://api.github.com/repos/simonw/datasette/issues/332,404567587,MDEyOklzc3VlQ29tbWVudDQwNDU2NzU4Nw==,9599,simonw,2018-07-12T16:15:29Z,2018-07-12T16:17:54Z,OWNER,Here's how plotly handled this issue: https://github.com/plotly/plotly.py/pull/203 - see also https://github.com/plotly/plotly.py/blob/213602df6c89b45ce2b811ed2591171c961408e7/plotly/utils.py#L137,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/332#issuecomment-404569003,https://api.github.com/repos/simonw/datasette/issues/332,404569003,MDEyOklzc3VlQ29tbWVudDQwNDU2OTAwMw==,9599,simonw,2018-07-12T16:20:06Z,2018-07-12T16:20:06Z,OWNER,And here's how django-rest-framework did it: https://github.com/encode/django-rest-framework/pull/4918/files,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/332#issuecomment-404574598,https://api.github.com/repos/simonw/datasette/issues/332,404574598,MDEyOklzc3VlQ29tbWVudDQwNDU3NDU5OA==,9599,simonw,2018-07-12T16:39:51Z,2018-07-12T16:39:51Z,OWNER,Since my data is all flat lists of values I don't think I need to customize the JSON encoder itself (no need to deal with nested values). I'll fix the data on its way into the encoder instead. This will also help if I decide to move to uJSON for better performance #48,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/339#issuecomment-404576136,https://api.github.com/repos/simonw/datasette/issues/339,404576136,MDEyOklzc3VlQ29tbWVudDQwNDU3NjEzNg==,12617395,bsilverm,2018-07-12T16:45:08Z,2018-07-12T16:45:08Z,NONE,Thanks for the quick reply. Looks like that is working well.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",340396247,Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way, https://github.com/simonw/datasette/issues/327#issuecomment-404923318,https://api.github.com/repos/simonw/datasette/issues/327,404923318,MDEyOklzc3VlQ29tbWVudDQwNDkyMzMxOA==,9599,simonw,2018-07-13T18:58:11Z,2018-07-13T18:58:11Z,OWNER,Relevant: https://code.fb.com/data-infrastructure/xars-a-more-efficient-open-source-system-for-self-contained-executables/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",335200136,Explore if SquashFS can be used to shrink size of packaged Docker containers, https://github.com/simonw/datasette/issues/342#issuecomment-404953877,https://api.github.com/repos/simonw/datasette/issues/342,404953877,MDEyOklzc3VlQ29tbWVudDQwNDk1Mzg3Nw==,9599,simonw,2018-07-13T21:05:12Z,2018-07-13T21:05:12Z,OWNER,That's a good idea. We already do this for tables - e.g. on https://fivethirtyeight.datasettes.com/fivethirtyeight-ac35616/most-common-name%2Fsurnames - so having it as an option for canned queries definitely makes sense.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",341123355,Requesting support for query description, https://github.com/simonw/datasette/issues/342#issuecomment-404954202,https://api.github.com/repos/simonw/datasette/issues/342,404954202,MDEyOklzc3VlQ29tbWVudDQwNDk1NDIwMg==,9599,simonw,2018-07-13T21:06:53Z,2018-07-13T21:07:13Z,OWNER,"https://timezones-api.now.sh/-/metadata currently shows this: ``` { ""databases"": { ""timezones"": { ""license"": ""ODbL"", ""license_url"": ""http://opendatacommons.org/licenses/odbl/"", ""queries"": { ""by_point"": ""select tzid\nfrom\n timezones\nwhere\n within(GeomFromText(\u0027POINT(\u0027 || :longitude || \u0027 \u0027 || :latitude || \u0027)\u0027), timezones.Geometry)\n and rowid in (\n SELECT pkid FROM idx_timezones_Geometry\n where xmin \u003c :longitude\n and xmax \u003e :longitude\n and ymin \u003c :latitude\n and ymax \u003e :latitude\n )"" }, ""source"": ""timezone-boundary-builder"", ""source_url"": ""https://github.com/evansiroky/timezone-boundary-builder"", ""tables"": { ""timezones"": { ""license"": ""ODbL"", ""license_url"": ""http://opendatacommons.org/licenses/odbl/"", ""sortable_columns"": [ ""tzid"" ], ""source"": ""timezone-boundary-builder"", ""source_url"": ""https://github.com/evansiroky/timezone-boundary-builder"" } } } }, ""license"": ""ODbL"", ""license_url"": ""http://opendatacommons.org/licenses/odbl/"", ""source"": ""timezone-boundary-builder"", ""source_url"": ""https://github.com/evansiroky/timezone-boundary-builder"", ""title"": ""OpenStreetMap Time Zone Boundaries"" } ``` We could support the value part of the `""queries""` array optionally being a dictionary with the same set of metadata fields supported for a table, plus a new `""sql""` key to hold the SQL for the query. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",341123355,Requesting support for query description, https://github.com/simonw/datasette/issues/342#issuecomment-404954672,https://api.github.com/repos/simonw/datasette/issues/342,404954672,MDEyOklzc3VlQ29tbWVudDQwNDk1NDY3Mg==,9599,simonw,2018-07-13T21:09:01Z,2018-07-13T21:09:01Z,OWNER,"So it would look like this: ``` { ""databases"": { ""timezones"": { ""license"": ""ODbL"", ""license_url"": ""http://opendatacommons.org/licenses/odbl/"", ""queries"": { ""by_point"": { ""title"": ""Timezones by point"", ""description"": ""Find the timezone for a latitude/longitude point"", ""sql"": ""select tzid\nfrom\n timezones\nwhere\n within(GeomFromText('POINT(' || :longitude || ' ' || :latitude || ')'), timezones.Geometry)\n and rowid in (\n SELECT pkid FROM idx_timezones_Geometry\n where xmin < :longitude\n and xmax > :longitude\n and ymin < :latitude\n and ymax > :latitude\n )"" } } } } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",341123355,Requesting support for query description, https://github.com/simonw/datasette/issues/344#issuecomment-405022335,https://api.github.com/repos/simonw/datasette/issues/344,405022335,MDEyOklzc3VlQ29tbWVudDQwNTAyMjMzNQ==,45057,russss,2018-07-14T13:00:48Z,2018-07-14T13:00:48Z,CONTRIBUTOR,"Looks like this was a red herring actually, and heroku had a blip when I was testing it...","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",341229113,datasette publish heroku fails without name provided, https://github.com/simonw/datasette/pull/345#issuecomment-405025731,https://api.github.com/repos/simonw/datasette/issues/345,405025731,MDEyOklzc3VlQ29tbWVudDQwNTAyNTczMQ==,9599,simonw,2018-07-14T14:04:31Z,2018-07-14T14:04:31Z,OWNER,"Fantastic, we really needed this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",341235633,Allow app names for `datasette publish heroku`, https://github.com/simonw/datasette/issues/343#issuecomment-405026441,https://api.github.com/repos/simonw/datasette/issues/343,405026441,MDEyOklzc3VlQ29tbWVudDQwNTAyNjQ0MQ==,45057,russss,2018-07-14T14:17:14Z,2018-07-14T14:17:14Z,CONTRIBUTOR,This probably depends on #294.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",341228846,Render boolean fields better by default, https://github.com/simonw/datasette/issues/294#issuecomment-405026800,https://api.github.com/repos/simonw/datasette/issues/294,405026800,MDEyOklzc3VlQ29tbWVudDQwNTAyNjgwMA==,45057,russss,2018-07-14T14:24:31Z,2018-07-14T14:24:31Z,CONTRIBUTOR,"I had a quick look at this in relation to #343 and I feel like it might be worth modelling the inspected table metadata internally as an object rather than a dict. (We'd still have to serialise it back to JSON.) There are a few places where we rely on the structure of this metadata dict for various reasons, including in templates (and potentially also in user templates). It would be nice to have a reasonably well defined API for accessing metadata internally so that it's clearer what we're breaking.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327365110,inspect should record column types, https://github.com/simonw/datasette/issues/342#issuecomment-405138460,https://api.github.com/repos/simonw/datasette/issues/342,405138460,MDEyOklzc3VlQ29tbWVudDQwNTEzODQ2MA==,9599,simonw,2018-07-16T02:42:32Z,2018-07-16T02:42:32Z,OWNER,"Demos: * https://latest.datasette.io/fixtures/neighborhood_search * https://timezones-api.now.sh/timezones/by_point Documentation: http://datasette.readthedocs.io/en/latest/sql_queries.html#canned-queries","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",341123355,Requesting support for query description, https://github.com/simonw/datasette/issues/332#issuecomment-405968983,https://api.github.com/repos/simonw/datasette/issues/332,405968983,MDEyOklzc3VlQ29tbWVudDQwNTk2ODk4Mw==,9599,simonw,2018-07-18T15:18:57Z,2018-07-18T15:18:57Z,OWNER,Maybe argument should be `?_json_nan=1` since that makes it more explicitly obvious what is going on here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/308#issuecomment-405971920,https://api.github.com/repos/simonw/datasette/issues/308,405971920,MDEyOklzc3VlQ29tbWVudDQwNTk3MTkyMA==,9599,simonw,2018-07-18T15:27:12Z,2018-07-18T15:27:12Z,OWNER,"It looks like there are a few extra options we should support: https://devcenter.heroku.com/articles/heroku-cli-commands ``` -t, --team=team team to use --region=region specify region for the app to run in --space=space the private space to create the app in ``` Since these differ from the options for Zeit Now I think this means splitting up `datasette publish now` and `datasette publish Heroku` into separate subcommands.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",330826972,"Support extra Heroku apps:create options - region, space, team", https://github.com/simonw/datasette/issues/333#issuecomment-405975025,https://api.github.com/repos/simonw/datasette/issues/333,405975025,MDEyOklzc3VlQ29tbWVudDQwNTk3NTAyNQ==,9599,simonw,2018-07-18T15:36:11Z,2018-07-18T15:40:04Z,OWNER,"A `force_https_api_urls` config option would work here - if set, Datasette will ignore the incoming protocol and always use https. The `datasette deploy now` command could then add that as an option passed to `datasette serve`. This is the pattern which is producing incorrect URLs on Zeit Now, because the Sanic `request.url` property is not being correctly set. https://github.com/simonw/datasette/blob/6e37f091edec35e2706197489f54fff5d890c63c/datasette/views/table.py#L653-L655 Suggested help text: > Always use https:// for URLs output as part of Datasette API responses","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",338768551,Datasette on Zeit Now returns http URLs for facet and next links, https://github.com/simonw/datasette/issues/333#issuecomment-405988035,https://api.github.com/repos/simonw/datasette/issues/333,405988035,MDEyOklzc3VlQ29tbWVudDQwNTk4ODAzNQ==,9599,simonw,2018-07-18T16:12:35Z,2018-07-18T16:12:35Z,OWNER,"I'll add a `absolute_url(request, path)` method on the base view class which knows to check the new config option.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",338768551,Datasette on Zeit Now returns http URLs for facet and next links, https://github.com/simonw/datasette/issues/333#issuecomment-407109113,https://api.github.com/repos/simonw/datasette/issues/333,407109113,MDEyOklzc3VlQ29tbWVudDQwNzEwOTExMw==,9599,simonw,2018-07-23T15:59:02Z,2018-07-23T15:59:02Z,OWNER,I still need to modify `datasette publish now` to set this config option on the instances that it deploys.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",338768551,Datasette on Zeit Now returns http URLs for facet and next links, https://github.com/simonw/datasette/issues/332#issuecomment-407262311,https://api.github.com/repos/simonw/datasette/issues/332,407262311,MDEyOklzc3VlQ29tbWVudDQwNzI2MjMxMQ==,9599,simonw,2018-07-24T02:43:03Z,2018-07-24T02:43:03Z,OWNER,Actually SQLite doesn't handle NaN at all (it treats it as null) so I'm going to change this ticket to just deal with Infinity and -Infinity.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/332#issuecomment-407262436,https://api.github.com/repos/simonw/datasette/issues/332,407262436,MDEyOklzc3VlQ29tbWVudDQwNzI2MjQzNg==,9599,simonw,2018-07-24T02:43:50Z,2018-07-24T02:43:50Z,OWNER,I'm going with `_json_infinity=1` as the querystring argument.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/332#issuecomment-407262561,https://api.github.com/repos/simonw/datasette/issues/332,407262561,MDEyOklzc3VlQ29tbWVudDQwNzI2MjU2MQ==,9599,simonw,2018-07-24T02:44:39Z,2018-07-24T02:44:39Z,OWNER,According to https://www.mail-archive.com/sqlite-users@mailinglists.sqlite.org/msg110573.html you can insert Infinity/-Infinity in raw SQL (as used by our fixtures) using 1e999 and -1e999.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/332#issuecomment-407267707,https://api.github.com/repos/simonw/datasette/issues/332,407267707,MDEyOklzc3VlQ29tbWVudDQwNzI2NzcwNw==,9599,simonw,2018-07-24T03:20:08Z,2018-07-24T03:20:08Z,OWNER,"Demo: * https://700d83d.datasette.io/fixtures-dcc1dbf/infinity.json - Infinity converted to Null * https://700d83d.datasette.io/fixtures-dcc1dbf/infinity.json?_json_infinity=on - invalid JSON containing `Infinity` and `-Infinity`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/332#issuecomment-407267762,https://api.github.com/repos/simonw/datasette/issues/332,407267762,MDEyOklzc3VlQ29tbWVudDQwNzI2Nzc2Mg==,9599,simonw,2018-07-24T03:20:33Z,2018-07-24T03:20:33Z,OWNER,Documentation: http://datasette.readthedocs.io/en/latest/json_api.html#special-json-arguments,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",337141108,Sanely handle Infinity/-Infinity values in JSON using ?_json_infinity=1, https://github.com/simonw/datasette/issues/333#issuecomment-407267966,https://api.github.com/repos/simonw/datasette/issues/333,407267966,MDEyOklzc3VlQ29tbWVudDQwNzI2Nzk2Ng==,9599,simonw,2018-07-24T03:21:42Z,2018-07-24T03:21:42Z,OWNER,Demo: https://700d83d.datasette.io/fixtures-dcc1dbf/facetable.json?_facet=state&_size=5&_labels=on,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",338768551,Datasette on Zeit Now returns http URLs for facet and next links, https://github.com/simonw/datasette/issues/320#issuecomment-407269243,https://api.github.com/repos/simonw/datasette/issues/320,407269243,MDEyOklzc3VlQ29tbWVudDQwNzI2OTI0Mw==,9599,simonw,2018-07-24T03:30:32Z,2018-07-24T03:30:32Z,OWNER,"* No primary key => no ""object"" option: https://latest.datasette.io/fixtures-dcc1dbf/no_primary_key * Has a primary key => show ""object"" option: https://latest.datasette.io/fixtures-dcc1dbf/complex_foreign_keys * Has a next page => has ""stream all rows"" option: https://latest.datasette.io/fixtures-dcc1dbf/no_primary_key * Has foreign key references = show default-checked ""expand labels"" option: https://latest.datasette.io/fixtures-dcc1dbf/complex_foreign_keys * Does not have a next page => do not show ""stream all rows"" option: https://latest.datasette.io/fixtures-dcc1dbf/complex_foreign_keys ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",334169932,Need unit tests covering the different states for the advanced export box, https://github.com/simonw/datasette/issues/298#issuecomment-407274059,https://api.github.com/repos/simonw/datasette/issues/298,407274059,MDEyOklzc3VlQ29tbWVudDQwNzI3NDA1OQ==,9599,simonw,2018-07-24T04:03:05Z,2018-07-24T04:03:05Z,OWNER,Demo: https://latest.datasette.io/fixtures-dcc1dbf?sql=select+%28%27https%3A%2F%2Ftwitter.com%2F%27+%7C%7C+%27simonw%27%29+as+user_url%3B,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327459829,URLify URLs in results from custom SQL statements / views, https://github.com/simonw/datasette/issues/329#issuecomment-407275996,https://api.github.com/repos/simonw/datasette/issues/329,407275996,MDEyOklzc3VlQ29tbWVudDQwNzI3NTk5Ng==,9599,simonw,2018-07-24T04:18:28Z,2018-07-24T04:18:28Z,OWNER,Hopefully this will do the trick: https://github.com/simonw/datasette/commit/2bdab66772dca51b0c729b4e1063610cb2edd890,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336465018,Travis should push tagged images to Docker Hub for each release, https://github.com/simonw/datasette/issues/329#issuecomment-407280689,https://api.github.com/repos/simonw/datasette/issues/329,407280689,MDEyOklzc3VlQ29tbWVudDQwNzI4MDY4OQ==,9599,simonw,2018-07-24T04:52:58Z,2018-07-24T04:52:58Z,OWNER,"It almost worked... but I had to fix the `docker login` command: https://github.com/simonw/datasette/commit/3a46d5e3c4278e74c3694f36995ea134bff800bc Hopefully the next release will be published correctly.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336465018,Travis should push tagged images to Docker Hub for each release, https://github.com/simonw/datasette/issues/336#issuecomment-407450815,https://api.github.com/repos/simonw/datasette/issues/336,407450815,MDEyOklzc3VlQ29tbWVudDQwNzQ1MDgxNQ==,9599,simonw,2018-07-24T15:35:03Z,2018-07-24T15:35:03Z,OWNER,Actually I do like the idea of a unit test that reminds me if I've forgotten to update the included files.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",340039409,Ensure --help examples in docs are always up to date, https://github.com/simonw/datasette/issues/301#issuecomment-407979065,https://api.github.com/repos/simonw/datasette/issues/301,407979065,MDEyOklzc3VlQ29tbWVudDQwNzk3OTA2NQ==,9599,simonw,2018-07-26T05:17:34Z,2018-07-26T05:17:34Z,OWNER,This code now lives in https://github.com/simonw/datasette/blob/master/datasette/publish/heroku.py,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",328155946,"--spatialite option for ""datasette publish heroku""", https://github.com/simonw/datasette/issues/217#issuecomment-407980050,https://api.github.com/repos/simonw/datasette/issues/217,407980050,MDEyOklzc3VlQ29tbWVudDQwNzk4MDA1MA==,9599,simonw,2018-07-26T05:24:17Z,2018-07-26T05:24:17Z,OWNER,Documentation: http://datasette.readthedocs.io/en/latest/plugins.html#publish-subcommand-publish,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",314725342,Plugin support for datasette publish, https://github.com/simonw/datasette/pull/349#issuecomment-407980716,https://api.github.com/repos/simonw/datasette/issues/349,407980716,MDEyOklzc3VlQ29tbWVudDQwNzk4MDcxNg==,9599,simonw,2018-07-26T05:28:54Z,2018-07-26T05:28:54Z,OWNER,"Documentation here: http://datasette.readthedocs.io/en/latest/plugins.html#publish-subcommand-publish The best way to write a new publish plugin is to check out how the Heroku and Now default plugins are implemented: https://github.com/simonw/datasette/tree/master/datasette/publish","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",344695978,"publish_subcommand hook + default plugins mechanism, used for publish heroku/now", https://github.com/simonw/datasette/issues/348#issuecomment-407983375,https://api.github.com/repos/simonw/datasette/issues/348,407983375,MDEyOklzc3VlQ29tbWVudDQwNzk4MzM3NQ==,9599,simonw,2018-07-26T05:46:01Z,2018-07-26T05:46:01Z,OWNER,"Oops, forgot to commit those unit tests.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",344656114,"Unit tests for ""datasette publish""", https://github.com/simonw/datasette/issues/272#issuecomment-408093480,https://api.github.com/repos/simonw/datasette/issues/272,408093480,MDEyOklzc3VlQ29tbWVudDQwODA5MzQ4MA==,9599,simonw,2018-07-26T13:15:55Z,2018-07-26T13:46:40Z,OWNER,"I'm now hacking around with an initial version of this in the [starlette branch](https://github.com/simonw/datasette/tree/starlette). Here's my work in progress, deployed using `datasette publish now fixtures.db -n datasette-starlette-demo --branch=starlette --extra-options=""--asgi""` https://datasette-starlette-demo.now.sh/ Lots more work to do - the CSS isn't being served correctly for example, it's showing this error when I hit `/-/static/app.css`: ``` INFO: 127.0.0.1 - ""GET /-/static/app.css HTTP/1.1"" 200 ERROR: Exception in ASGI application Traceback (most recent call last): File ""/Users/simonw/Dropbox/Development/datasette/venv/lib/python3.6/site-packages/uvicorn/protocols/http/httptools_impl.py"", line 363, in run_asgi result = await asgi(self.receive, self.send) File ""/Users/simonw/Dropbox/Development/datasette/venv/lib/python3.6/site-packages/starlette/staticfiles.py"", line 91, in __call__ await response(receive, send) File ""/Users/simonw/Dropbox/Development/datasette/venv/lib/python3.6/site-packages/starlette/response.py"", line 180, in __call__ {""type"": ""http.response.body"", ""body"": chunk, ""more_body"": False} File ""/Users/simonw/Dropbox/Development/datasette/venv/lib/python3.6/site-packages/uvicorn/protocols/http/httptools_impl.py"", line 483, in send raise RuntimeError(""Response content shorter than Content-Length"") RuntimeError: Response content shorter than Content-Length ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-408097719,https://api.github.com/repos/simonw/datasette/issues/272,408097719,MDEyOklzc3VlQ29tbWVudDQwODA5NzcxOQ==,9599,simonw,2018-07-26T13:29:38Z,2018-07-26T13:29:38Z,OWNER,It looks like that's a bug in Starlette - filed here: https://github.com/encode/starlette/issues/32,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-408105251,https://api.github.com/repos/simonw/datasette/issues/272,408105251,MDEyOklzc3VlQ29tbWVudDQwODEwNTI1MQ==,9599,simonw,2018-07-26T13:54:06Z,2018-07-26T13:54:06Z,OWNER,"Tom shipped my fix for that bug already, so https://datasette-starlette-demo.now.sh/ is now serving CSS!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-408478935,https://api.github.com/repos/simonw/datasette/issues/272,408478935,MDEyOklzc3VlQ29tbWVudDQwODQ3ODkzNQ==,9599,simonw,2018-07-27T17:00:08Z,2018-07-27T17:00:08Z,OWNER,"Refs https://github.com/encode/uvicorn/issues/168","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/299#issuecomment-408581551,https://api.github.com/repos/simonw/datasette/issues/299,408581551,MDEyOklzc3VlQ29tbWVudDQwODU4MTU1MQ==,9599,simonw,2018-07-28T04:24:05Z,2018-07-28T04:24:05Z,OWNER,New documentation is now online here: https://datasette.readthedocs.io/en/latest/pages.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327461381,Documentation covering ALL datasette URLs, https://github.com/simonw/datasette/issues/259#issuecomment-409087501,https://api.github.com/repos/simonw/datasette/issues/259,409087501,MDEyOklzc3VlQ29tbWVudDQwOTA4NzUwMQ==,9599,simonw,2018-07-31T04:03:29Z,2018-07-31T04:03:29Z,OWNER,Parent ticket: #354,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322787470,inspect() should detect many-to-many relationships, https://github.com/simonw/datasette/issues/355#issuecomment-409087871,https://api.github.com/repos/simonw/datasette/issues/355,409087871,MDEyOklzc3VlQ29tbWVudDQwOTA4Nzg3MQ==,9599,simonw,2018-07-31T04:06:22Z,2018-07-31T04:06:22Z,OWNER,"I started playing with this in the `m2m` branch - work so far: https://github.com/simonw/datasette/compare/295d005ca48747faf046ed30c3c61e7563c61ed2...af4ce463e7518f9d7828b846efd5b528a1905eca Here's a demo: https://datasette-m2m-work-in-progress.now.sh/russian-ads-e8e09e2/ads?_m2m_ad_targets__target_id=ec3ac&_m2m_ad_targets__target_id=e128e","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",346027040,Table view should support filtering via many-to-many relationships, https://github.com/simonw/datasette/issues/356#issuecomment-409088967,https://api.github.com/repos/simonw/datasette/issues/356,409088967,MDEyOklzc3VlQ29tbWVudDQwOTA4ODk2Nw==,9599,simonw,2018-07-31T04:14:44Z,2018-07-31T04:14:44Z,OWNER,"Here's the query I'm playing with for facet counts: https://datasette-m2m-work-in-progress.now.sh/russian-ads-e8e09e2?sql=select+target_id%2C+count%28*%29+as+n+from+ad_targets%0D%0Awhere%0D%0A++target_id+not+in+%28%22ec3ac%22%2C+%22e128e%22%29%0D%0A++and+ad_id+in+%28select+ad_id+from+ad_targets+where+target_id+%3D+%22ec3ac%22%29%0D%0A++and+ad_id+in+%28select+ad_id+from+ad_targets+where+target_id+%3D+%22e128e%22%29%0D%0Agroup+by+target_id+order+by+n+desc%3B ``` select target_id, count(*) as n from ad_targets where target_id not in (""ec3ac"", ""e128e"") and ad_id in (select ad_id from ad_targets where target_id = ""ec3ac"") and ad_id in (select ad_id from ad_targets where target_id = ""e128e"") group by target_id order by n desc; ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",346028655,Ability to display facet counts for many-to-many relationships, https://github.com/simonw/datasette/issues/352#issuecomment-409715112,https://api.github.com/repos/simonw/datasette/issues/352,409715112,MDEyOklzc3VlQ29tbWVudDQwOTcxNTExMg==,9599,simonw,2018-08-01T20:41:04Z,2018-08-01T20:41:04Z,OWNER,The hook is currently only used on the custom SQL results page - it needs to run on table/view pages as well.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",345821500,render_cell(value) plugin hook, https://github.com/simonw/datasette/issues/352#issuecomment-410485995,https://api.github.com/repos/simonw/datasette/issues/352,410485995,MDEyOklzc3VlQ29tbWVudDQxMDQ4NTk5NQ==,9599,simonw,2018-08-05T00:16:21Z,2018-08-05T00:16:21Z,OWNER,"First plugin using this hook: https://github.com/simonw/datasette-json-html Hook documentation: http://datasette.readthedocs.io/en/latest/plugins.html#render-cell-value","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",345821500,render_cell(value) plugin hook, https://github.com/simonw/datasette/issues/352#issuecomment-410580202,https://api.github.com/repos/simonw/datasette/issues/352,410580202,MDEyOklzc3VlQ29tbWVudDQxMDU4MDIwMg==,9599,simonw,2018-08-06T03:39:40Z,2018-08-06T03:39:40Z,OWNER,I used `datasette-json-html` to build this: https://russian-ira-facebook-ads-datasette-whmbonekoj.now.sh/russian-ads-919cbfd/display_ads,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",345821500,render_cell(value) plugin hook, https://github.com/simonw/datasette/issues/357#issuecomment-410818501,https://api.github.com/repos/simonw/datasette/issues/357,410818501,MDEyOklzc3VlQ29tbWVudDQxMDgxODUwMQ==,9599,simonw,2018-08-06T19:04:54Z,2018-08-06T19:04:54Z,OWNER,Another potential use-case for this hook: loading metadata via a URL,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",348043884,Plugin hook for loading metadata.json, https://github.com/simonw/datasette/issues/174#issuecomment-412290986,https://api.github.com/repos/simonw/datasette/issues/174,412290986,MDEyOklzc3VlQ29tbWVudDQxMjI5MDk4Ng==,9599,simonw,2018-08-11T17:46:51Z,2018-08-11T17:46:51Z,OWNER,This was fixed in https://github.com/simonw/datasette/commit/89d9fbb91bfc0dd9091b34dbf3cf540ab849cc44,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",281197863,License/Source in footer should inherit from top level, https://github.com/simonw/datasette/issues/188#issuecomment-412291327,https://api.github.com/repos/simonw/datasette/issues/188,412291327,MDEyOklzc3VlQ29tbWVudDQxMjI5MTMyNw==,9599,simonw,2018-08-11T17:53:17Z,2018-08-11T17:53:17Z,OWNER,"Potential problem: the existing `metadata.json` format looks like this: ``` { ""title"": ""Custom title for your index page"", ""description"": ""Some description text can go here"", ""license"": ""ODbL"", ""license_url"": ""https://opendatacommons.org/licenses/odbl/"", ""databases"": { ""database1"": { ""source"": ""Alternative source"", ""source_url"": ""http://example.com/"", ""tables"": { ""example_table"": { ""description_html"": ""Custom table description"", ""license"": ""CC BY 3.0 US"", ""license_url"": ""https://creativecommons.org/licenses/by/3.0/us/"" } } } } } ``` This doesn't make sense for metadata that is bundled with a specific database - there's no point in having the `databases` key, instead the content of `database1` in the above example should be at the top level. This also means that if you rename the `*.db` file you won't have to edit its metadata at the same time. Calling such an embedded file `metadata.json` when the shape is different could be confusing. Maybe call it `database-metadata.json` instead.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309047460,Ability to bundle metadata and templates inside the SQLite file, https://github.com/simonw/datasette/issues/231#issuecomment-412291395,https://api.github.com/repos/simonw/datasette/issues/231,412291395,MDEyOklzc3VlQ29tbWVudDQxMjI5MTM5NQ==,9599,simonw,2018-08-11T17:54:41Z,2018-08-11T17:54:41Z,OWNER,"I'm going to separate the issue of enabling and disabling plugins from the existence of the `plugins` key. The format will simply be: ``` { ""plugins"": { ""name-of-plugin"": { ... any structures you like go here, defined by the plugin ... } } } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316323336,metadata.json support for plugin configuration options, https://github.com/simonw/datasette/issues/238#issuecomment-412291437,https://api.github.com/repos/simonw/datasette/issues/238,412291437,MDEyOklzc3VlQ29tbWVudDQxMjI5MTQzNw==,9599,simonw,2018-08-11T17:55:26Z,2018-08-11T18:02:48Z,OWNER,"On further thought, I'd much rather implement this using some kind of metadata plugin hook - see #357","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",317714268,External metadata.json, https://github.com/simonw/datasette/issues/185#issuecomment-412299013,https://api.github.com/repos/simonw/datasette/issues/185,412299013,MDEyOklzc3VlQ29tbWVudDQxMjI5OTAxMw==,9599,simonw,2018-08-11T20:14:54Z,2018-08-11T20:14:54Z,OWNER,"I've been worrying about how this one relates to #260 - I'd like to validate metadata (to help protect against people e.g. misspelling `license_url` and then being confused when their license isn't displayed properly), but this issue requests the ability to add arbitrary additional keys to the metadata structure. I think the solution is to introduce a metadata key called `extra_metadata_keys` which allows you to specifically list the extra keys that you want to enable. Something like this: ``` { ""title"": ""My title"", ""source"": ""Source"", ""source_url"": ""https://www.example.com/"", ""release_date"": ""2018-04-01"", ""extra_metadata_keys"": [""release_date""] } ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/359#issuecomment-412356537,https://api.github.com/repos/simonw/datasette/issues/359,412356537,MDEyOklzc3VlQ29tbWVudDQxMjM1NjUzNw==,9599,simonw,2018-08-12T17:01:21Z,2018-08-12T17:01:39Z,OWNER,"Example table: https://latest-code.datasette.io/code/definitions Here's a query that does facet counting against that column: https://latest-code.datasette.io/code-a26fa3c?sql=select+count%28*%29+as+n%2C+j.value+from+definitions+join+json_each%28params%29+j+group+by+j.value+order+by+n+desc%3B ``` select count(*) as n, j.value from definitions join json_each(params) j group by j.value order by n desc; ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",349827640,Faceted browse against a JSON list of tags, https://github.com/simonw/datasette/issues/359#issuecomment-412356746,https://api.github.com/repos/simonw/datasette/issues/359,412356746,MDEyOklzc3VlQ29tbWVudDQxMjM1Njc0Ng==,9599,simonw,2018-08-12T17:05:00Z,2018-08-12T17:05:00Z,OWNER,"And here's the query for pulling back every record tagged with a specific tag: https://latest-code.datasette.io/code-a26fa3c?sql=select+*+from+definitions+where+rowid+in+%28%0D%0A++select+definitions.rowid%0D%0A++from+definitions+join+json_each%28params%29+j%0D%0A++where+j.value+%3D+%3Atag%0D%0A%29&tag=filename ``` select * from definitions where rowid in ( select definitions.rowid from definitions join json_each(params) j where j.value = :tag ) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",349827640,Faceted browse against a JSON list of tags, https://github.com/simonw/datasette/issues/359#issuecomment-412357691,https://api.github.com/repos/simonw/datasette/issues/359,412357691,MDEyOklzc3VlQ29tbWVudDQxMjM1NzY5MQ==,9599,simonw,2018-08-12T17:17:29Z,2018-08-12T17:17:29Z,OWNER,"Note that there doesn't seem to be a way to use indexes (even [indexes on expressions](https://www.sqlite.org/expridx.html)) to speed these up, so this will only ever be effective on smaller data sets, probably in the 10,000-100,000 range. Datasette is often used with smaller data sets so this is still worth pursuing.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",349827640,Faceted browse against a JSON list of tags, https://github.com/simonw/datasette/issues/185#issuecomment-412663658,https://api.github.com/repos/simonw/datasette/issues/185,412663658,MDEyOklzc3VlQ29tbWVudDQxMjY2MzY1OA==,222245,carlmjohnson,2018-08-13T21:04:11Z,2018-08-13T21:04:11Z,NONE,That seems good to me.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",299760684,Metadata should be a nested arbitrary KV store, https://github.com/simonw/datasette/issues/360#issuecomment-413386332,https://api.github.com/repos/simonw/datasette/issues/360,413386332,MDEyOklzc3VlQ29tbWVudDQxMzM4NjMzMg==,9599,simonw,2018-08-16T00:51:00Z,2018-08-16T00:51:00Z,OWNER,Relevant: https://github.com/coleifer/pysqlite3/issues/2,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",351017129,Use pysqlite3 if available, https://github.com/simonw/datasette/issues/360#issuecomment-413387424,https://api.github.com/repos/simonw/datasette/issues/360,413387424,MDEyOklzc3VlQ29tbWVudDQxMzM4NzQyNA==,9599,simonw,2018-08-16T00:57:25Z,2018-08-16T00:57:25Z,OWNER,"I deployed a working demo of this here: https://pysqlite3-datasette.now.sh I used this command to deploy it: datasette publish now \ fixtures.db fivethirtyeight.db \ --branch=pysqlite3 \ --install=https://github.com/karlb/pysqlite3/archive/master.zip \ -n pysqlite3-datasette https://pysqlite3-datasette.now.sh/-/versions confirms version of SQLite is `3.25.0`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",351017129,Use pysqlite3 if available, https://github.com/simonw/datasette/issues/360#issuecomment-413396812,https://api.github.com/repos/simonw/datasette/issues/360,413396812,MDEyOklzc3VlQ29tbWVudDQxMzM5NjgxMg==,9599,simonw,2018-08-16T01:50:42Z,2018-08-16T01:50:42Z,OWNER,"Now that this has merged into master the command for deploying it can use `--branch=master` instead: datasette publish now \ fixtures.db fivethirtyeight.db \ --branch=master \ --install=https://github.com/karlb/pysqlite3/archive/master.zip \ -n pysqlite3-datasette ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",351017129,Use pysqlite3 if available, https://github.com/simonw/datasette/issues/267#issuecomment-414860009,https://api.github.com/repos/simonw/datasette/issues/267,414860009,MDEyOklzc3VlQ29tbWVudDQxNDg2MDAwOQ==,78156,annapowellsmith,2018-08-21T23:57:51Z,2018-08-21T23:57:51Z,NONE,"Looks to me like hashing, redirects and caching were documented as part of https://github.com/simonw/datasette/commit/788a542d3c739da5207db7d1fb91789603cdd336#diff-3021b0e065dce289c34c3b49b3952a07 - so perhaps this can be closed? :tada:","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",323716411,"Documentation for URL hashing, redirects and cache policy", https://github.com/simonw/datasette/issues/350#issuecomment-416659043,https://api.github.com/repos/simonw/datasette/issues/350,416659043,MDEyOklzc3VlQ29tbWVudDQxNjY1OTA0Mw==,9599,simonw,2018-08-28T16:48:19Z,2018-08-28T16:48:19Z,OWNER,Closed in https://github.com/simonw/datasette/commit/0bd41d4cb0a42d7d2baf8b49675418d1482ae39b,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",344701755,Don't list default plugins on /-/plugins, https://github.com/simonw/datasette/issues/350#issuecomment-416667565,https://api.github.com/repos/simonw/datasette/issues/350,416667565,MDEyOklzc3VlQ29tbWVudDQxNjY2NzU2NQ==,9599,simonw,2018-08-28T17:13:50Z,2018-08-28T17:13:50Z,OWNER,https://b7257a2.datasette.io/-/plugins is now correctly returning an empty list.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",344701755,Don't list default plugins on /-/plugins, https://github.com/simonw/datasette/issues/362#issuecomment-416727898,https://api.github.com/repos/simonw/datasette/issues/362,416727898,MDEyOklzc3VlQ29tbWVudDQxNjcyNzg5OA==,9599,simonw,2018-08-28T20:24:00Z,2018-08-28T20:24:00Z,OWNER,"Are you talking about these filters here? ![2018-08-28 at 9 22 pm](https://user-images.githubusercontent.com/9599/44748784-8688cb00-ab08-11e8-8baf-ace2e04e181f.png) I haven't thought much about how those could be made more usable - right now they basically expose all available options, but customizing them for particular use-cases is certainly an interesting potential space. Could you sketch out a bit more about how your ideal interface here would work?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",352768017,Add option to include/exclude columns in search filters, https://github.com/simonw/datasette/pull/363#issuecomment-417684877,https://api.github.com/repos/simonw/datasette/issues/363,417684877,MDEyOklzc3VlQ29tbWVudDQxNzY4NDg3Nw==,436032,kevboh,2018-08-31T14:39:45Z,2018-08-31T14:39:45Z,NONE,"It looks like the check passed, not sure why it's showing as running in GH.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",355299310,Search all apps during heroku publish, https://github.com/simonw/datasette/issues/308#issuecomment-418106781,https://api.github.com/repos/simonw/datasette/issues/308,418106781,MDEyOklzc3VlQ29tbWVudDQxODEwNjc4MQ==,9599,simonw,2018-09-03T12:53:21Z,2018-09-03T12:53:21Z,OWNER,Now that I've split the heroku command out into a separate default plugin this is a much easier thing to add: https://github.com/simonw/datasette/blob/master/datasette/publish/heroku.py,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",330826972,"Support extra Heroku apps:create options - region, space, team", https://github.com/simonw/datasette/issues/272#issuecomment-418695115,https://api.github.com/repos/simonw/datasette/issues/272,418695115,MDEyOklzc3VlQ29tbWVudDQxODY5NTExNQ==,647359,tomchristie,2018-09-05T11:21:25Z,2018-09-05T11:21:25Z,NONE,"Some notes: * Starlette just got a bump to 0.3.0 - there's some renamings in there. It's got enough functionality now that you can treat it either as a framework or as a toolkit. Either way the component design is all just *here's an ASGI app* all the way through. * Uvicorn got a bump to 0.3.3 - Removed some cyclical references that were causing garbage collection to impact performance. Ought to be a decent speed bump. * Wrt. passing config - Either use a single envvar that points to a config, or use multiple envvars for the config. Uvicorn could get a flag to read a `.env` file, but I don't see ASGI itself having a specific interface there.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/pull/293#issuecomment-420295524,https://api.github.com/repos/simonw/datasette/issues/293,420295524,MDEyOklzc3VlQ29tbWVudDQyMDI5NTUyNA==,11912854,jsancho-gpl,2018-09-11T14:32:45Z,2018-09-11T14:32:45Z,NONE,I close this PR because it's better to use the new one #364 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326987229,Support for external database connectors, https://github.com/simonw/datasette/issues/329#issuecomment-422821483,https://api.github.com/repos/simonw/datasette/issues/329,422821483,MDEyOklzc3VlQ29tbWVudDQyMjgyMTQ4Mw==,418191,jaywgraves,2018-09-19T14:17:42Z,2018-09-19T14:17:42Z,CONTRIBUTOR,"I'm using the docker image (0.23.2) and notice some differences/bugs between the docs and the published version with canned queries. (submitted a tiny doc fix also) I was able to build the docker container locally using `master` and I'm using that for now. Would it be possible to manually push 0.24 to DockerHub until the TravisCI stuff is fixed? I would like to run this in our Kubernetes cluster but don't want to publish a version in our internal registry if I don't have to. Thanks!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336465018,Travis should push tagged images to Docker Hub for each release, https://github.com/simonw/datasette/pull/365#issuecomment-422885014,https://api.github.com/repos/simonw/datasette/issues/365,422885014,MDEyOklzc3VlQ29tbWVudDQyMjg4NTAxNA==,9599,simonw,2018-09-19T17:15:16Z,2018-09-19T17:15:16Z,OWNER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",361764460,fix small doc typo, https://github.com/simonw/datasette/issues/329#issuecomment-422903031,https://api.github.com/repos/simonw/datasette/issues/329,422903031,MDEyOklzc3VlQ29tbWVudDQyMjkwMzAzMQ==,9599,simonw,2018-09-19T18:07:09Z,2018-09-19T18:07:09Z,OWNER,"The new 0.25 release has been successfully pushed to Docker Hub! https://hub.docker.com/r/datasetteproject/datasette/tags/ One catch: it looks like it didn't update the ""latest"" tag to point at it. Looking into that now.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336465018,Travis should push tagged images to Docker Hub for each release, https://github.com/simonw/datasette/issues/329#issuecomment-422908130,https://api.github.com/repos/simonw/datasette/issues/329,422908130,MDEyOklzc3VlQ29tbWVudDQyMjkwODEzMA==,9599,simonw,2018-09-19T18:23:02Z,2018-09-19T18:23:02Z,OWNER,"I fixed that by running the following on my laptop: $ docker pull datasetteproject/datasette:0.25 $ docker tag datasetteproject/datasette:0.25 datasetteproject/datasette:latest $ docker push datasetteproject/datasette The `latest` tag now points to the most recent release.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336465018,Travis should push tagged images to Docker Hub for each release, https://github.com/simonw/datasette/issues/329#issuecomment-422915450,https://api.github.com/repos/simonw/datasette/issues/329,422915450,MDEyOklzc3VlQ29tbWVudDQyMjkxNTQ1MA==,418191,jaywgraves,2018-09-19T18:45:02Z,2018-09-20T10:50:50Z,CONTRIBUTOR,"That works for me. Was able to pull the public image and no errors on my canned query. (~although a small rendering bug. I'll create an issue and if I have time today, a PR to fix~ this turned out to be my error.) Thanks for the quick response!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336465018,Travis should push tagged images to Docker Hub for each release, https://github.com/simonw/datasette/issues/292#issuecomment-423543060,https://api.github.com/repos/simonw/datasette/issues/292,423543060,MDEyOklzc3VlQ29tbWVudDQyMzU0MzA2MA==,9599,simonw,2018-09-21T14:06:31Z,2018-09-21T14:09:06Z,OWNER,"I keep on finding new reasons that I want this. The latest is that I'm playing with the more advanced features of FTS5 - in particular the highlight() function and the ability to sort by rank. The problem is... in order to do this, I need to properly join against the `_fts` table. Here's an example query: select highlight(events_fts, 0, '', ''), events_fts.rank, events.* from events join events_fts on events.rowid = events_fts.rowid where events_fts match :search order by rank Note that this is a different query from the usual FTS one (which does `where rowid in (select rowid from events_fts...)`) because I need the rank column somewhere I can sort against. I'd like to be able to use this on the table view page so I can get faceting etc for free, but this is a completely different query from the default. Maybe I need a way to customize the entire query? That feels weird though - why am I not using a view in that case? Answer: because views can't accept `:search` style parameters. I could use a canned query, but canned queries don't get faceting etc.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326800219,Mechanism for customizing the SQL used to select specific columns in the table view, https://github.com/simonw/datasette/issues/328#issuecomment-427261369,https://api.github.com/repos/simonw/datasette/issues/328,427261369,MDEyOklzc3VlQ29tbWVudDQyNzI2MTM2OQ==,13698964,chmaynard,2018-10-05T06:37:06Z,2018-10-05T06:37:06Z,NONE,"``` ~ $ docker pull datasetteproject/datasette ~ $ docker run -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db Usage: datasette -p [OPTIONS] [FILES]... Error: Invalid value for ""files"": Path ""/mnt/fixtures.db"" does not exist. ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336464733,"Installation instructions, including how to use the docker image", https://github.com/simonw/datasette/issues/187#issuecomment-427943710,https://api.github.com/repos/simonw/datasette/issues/187,427943710,MDEyOklzc3VlQ29tbWVudDQyNzk0MzcxMA==,1583271,progpow,2018-10-08T18:58:05Z,2018-10-08T18:58:05Z,NONE,"I have same error: ``` Collecting uvloop Using cached https://files.pythonhosted.org/packages/5c/37/6daa39aac42b2deda6ee77f408bec0419b600e27b89b374b0d440af32b10/uvloop-0.11.2.tar.gz Complete output from command python setup.py egg_info: Traceback (most recent call last): File """", line 1, in File ""C:\Users\sageev\AppData\Local\Temp\pip-install-bq64l8jy\uvloop\setup.py"", line 15, in raise RuntimeError('uvloop does not support Windows at the moment') RuntimeError: uvloop does not support Windows at the moment ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/datasette/issues/366#issuecomment-429737929,https://api.github.com/repos/simonw/datasette/issues/366,429737929,MDEyOklzc3VlQ29tbWVudDQyOTczNzkyOQ==,416374,gfrmin,2018-10-15T07:32:57Z,2018-10-15T07:32:57Z,CONTRIBUTOR,"Very hacky solution is to write now.json file forcing the usage of v1 of Zeit cloud, see https://github.com/slygent/datasette/commit/3ab824793ec6534b6dd87078aa46b11c4fa78ea3 This does work, at least.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",369716228,Default built image size over Zeit Now 100MiB limit, https://github.com/simonw/datasette/issues/176#issuecomment-431867885,https://api.github.com/repos/simonw/datasette/issues/176,431867885,MDEyOklzc3VlQ29tbWVudDQzMTg2Nzg4NQ==,634572,eads,2018-10-22T15:24:57Z,2018-10-22T15:24:57Z,NONE,"I'd like this as well. It would let me access Datasette-driven projects from GatsbyJS the same way I can access Postgres DBs via Hasura. While I don't see SQLite replacing Postgres for the 50m row datasets I sometimes have to work with, there's a whole class of smaller datasets that are great with Datasette but currently would find another option.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",285168503,Add GraphQL endpoint, https://github.com/simonw/datasette/issues/366#issuecomment-433680598,https://api.github.com/repos/simonw/datasette/issues/366,433680598,MDEyOklzc3VlQ29tbWVudDQzMzY4MDU5OA==,9599,simonw,2018-10-28T06:38:43Z,2018-10-28T06:38:43Z,OWNER,I've just started running into this as well. Looks like I'll have to anchor to v1 for the moment - I'm hoping the discussion on https://github.com/zeit/now-cli/issues/1523 encourages an increase in this limit policy :/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",369716228,Default built image size over Zeit Now 100MiB limit, https://github.com/simonw/datasette/issues/371#issuecomment-435767775,https://api.github.com/repos/simonw/datasette/issues/371,435767775,MDEyOklzc3VlQ29tbWVudDQzNTc2Nzc3NQ==,9599,simonw,2018-11-05T06:27:33Z,2018-11-05T06:27:33Z,OWNER,"This would be fantastic - that tutorial looks like many of the details needed for this. Do you know if Digital Ocean have the ability to provision URLs for a droplet without you needing to buy your own domain name? Heroku have https://example.herokuapp.com/ and Zeit have https://blah.now.sh/ - does Digital Ocean have an equivalent? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377156339,datasette publish digitalocean plugin, https://github.com/simonw/datasette/issues/369#issuecomment-435767827,https://api.github.com/repos/simonw/datasette/issues/369,435767827,MDEyOklzc3VlQ29tbWVudDQzNTc2NzgyNw==,9599,simonw,2018-11-05T06:27:55Z,2018-11-05T06:28:48Z,OWNER,"This is a good idea. Basically a version of this bug but on the custom SQL query page: ![2018-11-04 at 10 28 pm](https://user-images.githubusercontent.com/9599/47981499-fd9a8c80-e080-11e8-9c59-00e626d3aa4c.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",374953006,Interface should show same JSON shape options for custom SQL queries, https://github.com/simonw/datasette/issues/369#issuecomment-435768450,https://api.github.com/repos/simonw/datasette/issues/369,435768450,MDEyOklzc3VlQ29tbWVudDQzNTc2ODQ1MA==,416374,gfrmin,2018-11-05T06:31:59Z,2018-11-05T06:31:59Z,CONTRIBUTOR,"That would be ideal, but you know better than me whether the CSV streaming trick works for custom SQL queries.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",374953006,Interface should show same JSON shape options for custom SQL queries, https://github.com/simonw/datasette/issues/329#issuecomment-435772031,https://api.github.com/repos/simonw/datasette/issues/329,435772031,MDEyOklzc3VlQ29tbWVudDQzNTc3MjAzMQ==,9599,simonw,2018-11-05T06:53:28Z,2018-11-05T06:54:10Z,OWNER,"This works now! The `0.25.1` release was the first release which successfully pushed to Docker Hub: https://hub.docker.com/r/datasetteproject/datasette/tags/ ![2018-11-04 at 10 53 pm](https://user-images.githubusercontent.com/9599/47982395-70593700-e084-11e8-8870-9100677c2bde.png) Here's the log from the successful Travis release job: https://travis-ci.org/simonw/datasette/jobs/450714602 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",336465018,Travis should push tagged images to Docker Hub for each release, https://github.com/simonw/datasette/issues/371#issuecomment-435862009,https://api.github.com/repos/simonw/datasette/issues/371,435862009,MDEyOklzc3VlQ29tbWVudDQzNTg2MjAwOQ==,82988,psychemedia,2018-11-05T12:48:35Z,2018-11-05T12:48:35Z,CONTRIBUTOR,I think you need to register a domain name you own separately in order to get a non-IP address address? https://www.digitalocean.com/docs/networking/dns/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377156339,datasette publish digitalocean plugin, https://github.com/simonw/datasette/issues/370#issuecomment-435974786,https://api.github.com/repos/simonw/datasette/issues/370,435974786,MDEyOklzc3VlQ29tbWVudDQzNTk3NDc4Ng==,9599,simonw,2018-11-05T18:06:56Z,2018-11-05T18:06:56Z,OWNER,"I've been thinking a bit about ways of using Jupyter Notebook more effectively with Datasette (thinks like a `publish_dataframes(df1, df2, df3)` function which publishes some Pandas dataframes and returns you a URL to a new hosted Datasette instance) but you're right, Jupyter Lab is potentially a much more interesting fit.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377155320,Integration with JupyterLab, https://github.com/simonw/datasette/issues/374#issuecomment-435976262,https://api.github.com/repos/simonw/datasette/issues/374,435976262,MDEyOklzc3VlQ29tbWVudDQzNTk3NjI2Mg==,9599,simonw,2018-11-05T18:11:10Z,2018-11-05T18:11:10Z,OWNER,"I think there is a useful way forward here though: the image size may be limited to 100MB, but once the instance launches it gets access to a filesystem with a lot more space than that (possibly as much as 15GB given my initial poking around). So... one potential solution here is to teach Datasette to launch from a smaller image and then download a larger SQLite file from a known URL as part of its initial startup. Combined with the ability to get Now to always run at least one copy of an instance this could allow Datasette to host much larger SQLite databases on that platform while playing nicely with the Zeit v2 platform. See also https://github.com/zeit/now-cli/issues/1523","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377518499,Get Datasette working with Zeit Now v2's 100MB image size limit, https://github.com/simonw/datasette/issues/370#issuecomment-436037692,https://api.github.com/repos/simonw/datasette/issues/370,436037692,MDEyOklzc3VlQ29tbWVudDQzNjAzNzY5Mg==,82988,psychemedia,2018-11-05T21:15:47Z,2018-11-05T21:18:37Z,CONTRIBUTOR,"In terms of integration with `pandas`, I was pondering two different ways `datasette`/`csvs_to_sqlite` integration may work: - like [`pandasql`](https://github.com/yhat/pandasql), to provide a SQL query layer either by a direct connection to the sqlite db or via `datasette` API; - as an improvement of `pandas.to_sql()`, which is a bit ropey (e.g. `pandas.to_sql_from_csvs()`, routing the dataframe to sqlite via `csvs_tosqlite` rather than the dodgy mapping that `pandas` supports). The `pandas.publish_*` idea could be quite interesting though... Would it be useful/fruitful to think about `publish_` as a complement to [`pandas.to_`](https://pandas.pydata.org/pandas-docs/stable/api.html#id12)?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377155320,Integration with JupyterLab, https://github.com/simonw/datasette/issues/370#issuecomment-436042445,https://api.github.com/repos/simonw/datasette/issues/370,436042445,MDEyOklzc3VlQ29tbWVudDQzNjA0MjQ0NQ==,82988,psychemedia,2018-11-05T21:30:42Z,2018-11-05T21:31:48Z,CONTRIBUTOR,"Another route would be something like creating a `datasette` IPython magic for notebooks to take a dataframe and easily render it as a `datasette`. You'd need to run the app in the background rather than block execution in the notebook. Related to that, or to publishing a dataframe in notebook cell for use in other cells in a non-blocking way, there may be cribs in something like https://github.com/micahscopes/nbmultitask .","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377155320,Integration with JupyterLab, https://github.com/simonw/datasette/issues/227#issuecomment-439194286,https://api.github.com/repos/simonw/datasette/issues/227,439194286,MDEyOklzc3VlQ29tbWVudDQzOTE5NDI4Ng==,222245,carlmjohnson,2018-11-15T21:20:37Z,2018-11-15T21:20:37Z,NONE,I'm diving back into https://salaries.news.baltimoresun.com and what I really want is the ability to inject the request into my context.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",315960272,prepare_context() plugin hook, https://github.com/simonw/datasette/issues/120#issuecomment-439421164,https://api.github.com/repos/simonw/datasette/issues/120,439421164,MDEyOklzc3VlQ29tbWVudDQzOTQyMTE2NA==,36796532,ad-si,2018-11-16T15:05:18Z,2018-11-16T15:05:18Z,NONE,This would be an awesome feature ❤️ ,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275087397,Plugin that adds an authentication layer of some sort, https://github.com/simonw/datasette/issues/374#issuecomment-439762759,https://api.github.com/repos/simonw/datasette/issues/374,439762759,MDEyOklzc3VlQ29tbWVudDQzOTc2Mjc1OQ==,9599,simonw,2018-11-19T03:41:36Z,2018-11-19T03:41:36Z,OWNER,"It turned out Zeit didn't end up shipping the new 100MB-limit Docker-based Zeit 2.0 after all - they ended up going in a completely different direction, towards lambdas instead (which don't really fit the Datasette model): https://zeit.co/blog/now-2 But... as far as I can tell they have introduced the 100MB image size for all free Zeit accounts ever against their 1.0 platform. So we still need to solve this, or free Zeit users won't be able to use `datasette publish now` even while 1.0 is still available. I made some notes on this here: https://simonwillison.net/2018/Nov/19/smaller-python-docker-images/ I've got it working for the Datasette Publish webapp, but I still need to fix `datasette publish now` to create much smaller patterns. I know how to do this for regular datasette, but I haven't yet figured out an Alpine Linux pattern for spatialite extras: https://github.com/simonw/datasette/blob/5e3a432a0caa23837fa58134f69e2f82e4f632a6/datasette/utils.py#L287-L300","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377518499,Get Datasette working with Zeit Now v2's 100MB image size limit, https://github.com/simonw/datasette/issues/374#issuecomment-439763196,https://api.github.com/repos/simonw/datasette/issues/374,439763196,MDEyOklzc3VlQ29tbWVudDQzOTc2MzE5Ng==,9599,simonw,2018-11-19T03:45:13Z,2018-11-19T03:45:13Z,OWNER,This looks like it might be a recipe for spatialite Python on Alpine Linux: https://github.com/bentrm/geopython/blob/8e52062d9545f4b7c1f04a3516354a5a9155e31f/Dockerfile,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377518499,Get Datasette working with Zeit Now v2's 100MB image size limit, https://github.com/simonw/datasette/issues/374#issuecomment-439763268,https://api.github.com/repos/simonw/datasette/issues/374,439763268,MDEyOklzc3VlQ29tbWVudDQzOTc2MzI2OA==,9599,simonw,2018-11-19T03:45:44Z,2018-11-19T03:45:44Z,OWNER,Another example that might be useful: https://github.com/poc-flask/alpine/blob/8e9f48a2351e106347dab36d08cf21dee865993e/Dockerfile,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377518499,Get Datasette working with Zeit Now v2's 100MB image size limit, https://github.com/simonw/datasette/pull/389#issuecomment-440128762,https://api.github.com/repos/simonw/datasette/issues/389,440128762,MDEyOklzc3VlQ29tbWVudDQ0MDEyODc2Mg==,9599,simonw,2018-11-20T03:52:11Z,2018-11-20T03:52:11Z,OWNER,"The problem is Sanic. Here's the error I'm getting: ``` (venv) datasette $ pytest -x ============================================================= test session starts ============================================================== platform darwin -- Python 3.7.1, pytest-4.0.0, py-1.7.0, pluggy-0.8.0 rootdir: /Users/simonw/Dropbox/Development/datasette, inifile: collected 258 items tests/test_api.py ...................F =================================================================== FAILURES =================================================================== _______________________________________________________ test_table_with_slashes_in_name ________________________________________________________ app_client = def test_table_with_slashes_in_name(app_client): response = app_client.get('/fixtures/table%2Fwith%2Fslashes.csv?_shape=objects&_format=json') > assert response.status == 200 E AssertionError: assert 404 == 200 ``` That's because something about how Sanic handles escape characters in URLs changed between 0.7.0 and 0.8.3.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",382471625,Bump dependency versions, https://github.com/simonw/datasette/pull/390#issuecomment-447677798,https://api.github.com/repos/simonw/datasette/issues/390,447677798,MDEyOklzc3VlQ29tbWVudDQ0NzY3Nzc5OA==,9599,simonw,2018-12-16T21:32:45Z,2018-12-16T21:32:45Z,OWNER,Thanks for spotting this!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",386459810,tiny typo in customization docs, https://github.com/simonw/datasette/issues/374#issuecomment-448437245,https://api.github.com/repos/simonw/datasette/issues/374,448437245,MDEyOklzc3VlQ29tbWVudDQ0ODQzNzI0NQ==,9599,simonw,2018-12-19T01:35:59Z,2018-12-19T01:35:59Z,OWNER,"Closing this as Zeit went on a different direction with Now v2, so the 100MB limit is no longer a concern.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377518499,Get Datasette working with Zeit Now v2's 100MB image size limit, https://github.com/simonw/datasette/issues/393#issuecomment-450943172,https://api.github.com/repos/simonw/datasette/issues/393,450943172,MDEyOklzc3VlQ29tbWVudDQ1MDk0MzE3Mg==,9599,simonw,2019-01-02T18:28:43Z,2019-01-02T18:28:43Z,OWNER,"Definitely a bug, thanks.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",395236066,"CSV export in ""Advanced export"" pane doesn't respect query", https://github.com/simonw/datasette/issues/393#issuecomment-450943632,https://api.github.com/repos/simonw/datasette/issues/393,450943632,MDEyOklzc3VlQ29tbWVudDQ1MDk0MzYzMg==,9599,simonw,2019-01-02T18:30:20Z,2019-01-02T18:30:20Z,OWNER,"This is the code which is meant to add those options as hidden form fields: https://github.com/simonw/datasette/blob/fe5b6ea95a973534fe8a44907c0ea2449aae7602/datasette/templates/table.html#L150-L155 It's clearly not working. Need to fix this and add a corresponding unit test.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",395236066,"CSV export in ""Advanced export"" pane doesn't respect query", https://github.com/simonw/datasette/issues/393#issuecomment-450944166,https://api.github.com/repos/simonw/datasette/issues/393,450944166,MDEyOklzc3VlQ29tbWVudDQ1MDk0NDE2Ng==,9599,simonw,2019-01-02T18:32:12Z,2019-01-02T18:32:12Z,OWNER,"Here's the test that needs updating: https://github.com/simonw/datasette/blob/8b8ae55e7c8b9e1dceef53f55a330b596ca44d41/tests/test_html.py#L427-L435","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",395236066,"CSV export in ""Advanced export"" pane doesn't respect query", https://github.com/simonw/datasette/issues/391#issuecomment-450964512,https://api.github.com/repos/simonw/datasette/issues/391,450964512,MDEyOklzc3VlQ29tbWVudDQ1MDk2NDUxMg==,9599,simonw,2019-01-02T19:45:12Z,2019-01-02T19:45:12Z,OWNER,"Thanks, I've fixed this. I had to re-alias it against now: ``` ~ $ now alias google-trends-pnwhfwvgqf.now.sh https://google-trends.datasettes.com/ > Assigning alias google-trends.datasettes.com to deployment google-trends-pnwhfwvgqf.now.sh > Certificate for google-trends.datasettes.com (cert_uXaADIuNooHS3tZ) created [18s] > Success! google-trends.datasettes.com now points to google-trends-pnwhfwvgqf.now.sh [20s] ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",392610803,Google Trends example doesn’t work, https://github.com/simonw/datasette/issues/393#issuecomment-451046123,https://api.github.com/repos/simonw/datasette/issues/393,451046123,MDEyOklzc3VlQ29tbWVudDQ1MTA0NjEyMw==,9599,simonw,2019-01-03T03:05:07Z,2019-01-03T03:05:07Z,OWNER,The fix was released as part of Datasette 0.26 - you can see the fix working here: https://v0-26.datasette.io/fixtures-dd88475/facetable?_facet=planet_int&planet_int=1#export,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",395236066,"CSV export in ""Advanced export"" pane doesn't respect query", https://github.com/simonw/datasette/issues/393#issuecomment-451047426,https://api.github.com/repos/simonw/datasette/issues/393,451047426,MDEyOklzc3VlQ29tbWVudDQ1MTA0NzQyNg==,9599,simonw,2019-01-03T03:19:04Z,2019-01-03T03:19:04Z,OWNER,https://fivethirtyeight.datasettes.com/-/versions is now running 0.26 - so your initial bug demo is now fixed: https://fivethirtyeight.datasettes.com/fivethirtyeight-c300360/classic-rock%2Fclassic-rock-song-list?Release+Year__exact=1989#export,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",395236066,"CSV export in ""Advanced export"" pane doesn't respect query", https://github.com/simonw/datasette/issues/393#issuecomment-451415063,https://api.github.com/repos/simonw/datasette/issues/393,451415063,MDEyOklzc3VlQ29tbWVudDQ1MTQxNTA2Mw==,1727065,ltrgoddard,2019-01-04T11:04:08Z,2019-01-04T11:04:08Z,NONE,Awesome - will get myself up and running on 0.26,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",395236066,"CSV export in ""Advanced export"" pane doesn't respect query", https://github.com/simonw/datasette/issues/394#issuecomment-451704724,https://api.github.com/repos/simonw/datasette/issues/394,451704724,MDEyOklzc3VlQ29tbWVudDQ1MTcwNDcyNA==,9599,simonw,2019-01-06T00:32:23Z,2019-01-06T00:33:44Z,OWNER,"I found a really nice pattern for writing the unit tests for this (though it would look even nicer with a solution to #395) ```python @pytest.mark.parametrize(""prefix"", [""/prefix/"", ""https://example.com/""]) @pytest.mark.parametrize(""path"", [ ""/"", ""/fixtures"", ""/fixtures/compound_three_primary_keys"", ""/fixtures/compound_three_primary_keys/a,a,a"", ""/fixtures/paginated_view"", ]) def test_url_prefix_config(prefix, path): for client in make_app_client(config={ ""url_prefix"": prefix, }): response = client.get(path) soup = Soup(response.body, ""html.parser"") for a in soup.findAll(""a""): href = a[""href""] if href not in { ""https://github.com/simonw/datasette"", ""https://github.com/simonw/datasette/blob/master/LICENSE"", ""https://github.com/simonw/datasette/blob/master/tests/fixtures.py"", }: assert href.startswith(prefix), (href, a.parent) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",396212021,base_url configuration setting, https://github.com/simonw/datasette/issues/397#issuecomment-453251589,https://api.github.com/repos/simonw/datasette/issues/397,453251589,MDEyOklzc3VlQ29tbWVudDQ1MzI1MTU4OQ==,9599,simonw,2019-01-10T20:59:42Z,2019-01-10T20:59:42Z,OWNER,"What version of SQLite are you seeing in Datasette? You can tell by hitting http://localhost:8001/-/versions - e.g. here: https://latest.datasette.io/-/versions My best guess is that your Python SQLite module is running an older version that doesn't support window functions. One way you can fix that is with the `pysqlite3` module - try running this in your virtual environment: pip install git+git://github.com/karlb/pysqlite3 That's using a fork of the official module that embeds a full recent SQLite. See this issue thread for more details: https://github.com/coleifer/pysqlite3/issues/2","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",397129564,Update official datasetteproject/datasette Docker container to SQLite 3.26.0, https://github.com/simonw/datasette/issues/397#issuecomment-453252024,https://api.github.com/repos/simonw/datasette/issues/397,453252024,MDEyOklzc3VlQ29tbWVudDQ1MzI1MjAyNA==,9599,simonw,2019-01-10T21:00:57Z,2019-01-10T21:00:57Z,OWNER,"Oh I just saw you're using the official Datasette docker package - yeah, that's not bundled with a recent SQLite at the moment. We should update that: https://github.com/simonw/datasette/blob/5b026115126bedbb66457767e169139146d1c9fd/Dockerfile#L9-L11","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",397129564,Update official datasetteproject/datasette Docker container to SQLite 3.26.0, https://github.com/simonw/datasette/issues/271#issuecomment-453262703,https://api.github.com/repos/simonw/datasette/issues/271,453262703,MDEyOklzc3VlQ29tbWVudDQ1MzI2MjcwMw==,9599,simonw,2019-01-10T21:35:18Z,2019-01-10T21:35:18Z,OWNER,It turns out this was much easier to support than I expected: https://github.com/simonw/datasette/commit/eac08f0dfc61a99e8887442fc247656d419c76f8,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324162476,Mechanism for automatically picking up changes when on-disk .db file changes, https://github.com/simonw/datasette/issues/396#issuecomment-453324601,https://api.github.com/repos/simonw/datasette/issues/396,453324601,MDEyOklzc3VlQ29tbWVudDQ1MzMyNDYwMQ==,9599,simonw,2019-01-11T00:55:21Z,2019-01-11T00:55:21Z,OWNER,Demo: https://latest.datasette.io/-/versions,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",397098882,Add pragma compile_options output to /-/versions, https://github.com/simonw/datasette/issues/397#issuecomment-453330680,https://api.github.com/repos/simonw/datasette/issues/397,453330680,MDEyOklzc3VlQ29tbWVudDQ1MzMzMDY4MA==,9599,simonw,2019-01-11T01:17:11Z,2019-01-11T01:25:33Z,OWNER,"If you pull [the latest image](https://hub.docker.com/r/datasetteproject/datasette) you should get the right SQLite version now: docker pull datasetteproject/datasette docker run -p 8001:8001 \ datasetteproject/datasette \ datasette -p 8001 -h 0.0.0.0 http://0.0.0.0:8001/-/versions now gives me: ``` ""version"": ""3.26.0"" ```","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",397129564,Update official datasetteproject/datasette Docker container to SQLite 3.26.0, https://github.com/simonw/datasette/issues/400#issuecomment-453795040,https://api.github.com/repos/simonw/datasette/issues/400,453795040,MDEyOklzc3VlQ29tbWVudDQ1Mzc5NTA0MA==,9599,simonw,2019-01-13T01:46:30Z,2019-01-13T01:46:30Z,OWNER,I'm really excited about this - it looks like it could be a great plugin.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",398559195,datasette publish cloudrun plugin, https://github.com/simonw/datasette/issues/399#issuecomment-453874429,https://api.github.com/repos/simonw/datasette/issues/399,453874429,MDEyOklzc3VlQ29tbWVudDQ1Mzg3NDQyOQ==,9599,simonw,2019-01-13T23:09:09Z,2019-01-13T23:09:09Z,OWNER,"It looks like there are two reasons for this: - The `.git` directory was listed in `.dockerignore` so it wasn't being copied into the build process - The docker build stage wasn't installing the `git` executable, so it couldn't read the current version ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",398089089,/-/versions for official Docker image returns wrong Datasette version, https://github.com/simonw/datasette/issues/399#issuecomment-453876023,https://api.github.com/repos/simonw/datasette/issues/399,453876023,MDEyOklzc3VlQ29tbWVudDQ1Mzg3NjAyMw==,9599,simonw,2019-01-13T23:31:59Z,2019-01-13T23:31:59Z,OWNER,"``` docker pull datasetteproject/datasette docker run -p 8001:8001 datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 ``` http://0.0.0.0:8001/-/versions now returns: ``` { ""datasette"": { ""version"": ""0.26.2+0.ga418c8b.dirty"" }, ``` I'm not sure why it's showing `.dirty` there. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",398089089,/-/versions for official Docker image returns wrong Datasette version, https://github.com/simonw/datasette/issues/402#issuecomment-455223551,https://api.github.com/repos/simonw/datasette/issues/402,455223551,MDEyOklzc3VlQ29tbWVudDQ1NTIyMzU1MQ==,9599,simonw,2019-01-17T15:55:06Z,2019-01-17T15:55:06Z,OWNER,"It's new in SQLite 3.26.0 so I will need to figure out how to only apply it in that version or higher. https://sqlite.org/releaselog/3_26_0.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",400340905,Use SQLITE_DBCONFIG_DEFENSIVE plus other recommendations from SQLite security docs, https://github.com/simonw/datasette/issues/402#issuecomment-455224327,https://api.github.com/repos/simonw/datasette/issues/402,455224327,MDEyOklzc3VlQ29tbWVudDQ1NTIyNDMyNw==,9599,simonw,2019-01-17T15:56:57Z,2019-01-17T15:56:57Z,OWNER,https://sqlite.org/security.html has other recommmendations for apps that accept SQLite files from untrusted sources that we should apply.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",400340905,Use SQLITE_DBCONFIG_DEFENSIVE plus other recommendations from SQLite security docs, https://github.com/simonw/datasette/issues/401#issuecomment-455230501,https://api.github.com/repos/simonw/datasette/issues/401,455230501,MDEyOklzc3VlQ29tbWVudDQ1NTIzMDUwMQ==,9599,simonw,2019-01-17T16:12:59Z,2019-01-17T16:12:59Z,OWNER,"Datasette-cluster-map doesn't use the new plugin configuration mechanism yet - it really should! The best example of how to use this mechanism right now is embedded in the Datasette unit tests: https://github.com/simonw/datasette/blob/b7257a21bf3dfa7353980f343c83a616da44daa7/tests/fixtures.py#L266-L270 https://github.com/simonw/datasette/blob/b7257a21bf3dfa7353980f343c83a616da44daa7/tests/test_plugins.py#L139-L145","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",400229984,How to pass configuration to plugins?, https://github.com/simonw/datasette/issues/402#issuecomment-455231411,https://api.github.com/repos/simonw/datasette/issues/402,455231411,MDEyOklzc3VlQ29tbWVudDQ1NTIzMTQxMQ==,9599,simonw,2019-01-17T16:15:21Z,2019-01-17T16:15:21Z,OWNER,Unfortunately it looks like there isn't currently a mechanism in the Python sqlite3 library for setting configuration flags like SQLITE_DBCONFIG_DEFENSIVE,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",400340905,Use SQLITE_DBCONFIG_DEFENSIVE plus other recommendations from SQLite security docs, https://github.com/simonw/datasette/issues/401#issuecomment-455445069,https://api.github.com/repos/simonw/datasette/issues/401,455445069,MDEyOklzc3VlQ29tbWVudDQ1NTQ0NTA2OQ==,9599,simonw,2019-01-18T06:49:07Z,2019-01-18T06:49:07Z,OWNER,I've released a new version of the datasette-cluster-map plugin to illustrate how plugin configuration can work: https://github.com/simonw/datasette-cluster-map/commit/fcc86c450e3df3e6b81c41f31df458923181527a,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",400229984,How to pass configuration to plugins?, https://github.com/simonw/datasette/issues/403#issuecomment-455445392,https://api.github.com/repos/simonw/datasette/issues/403,455445392,MDEyOklzc3VlQ29tbWVudDQ1NTQ0NTM5Mg==,9599,simonw,2019-01-18T06:51:14Z,2019-01-18T06:51:14Z,OWNER,"I talk about that a bit here: https://simonwillison.net/2018/Oct/4/datasette-ideas/#Bundling_the_data_with_the_code One of the key ideas behind Datasette is that if your data is read-only you can package it up with the rest of your code - so the normal limitations that apply with hosting services like now.sh no longer prevent you from including a database. The SQLite database is just another static binary file that gets packaged up as part of your deployment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",400511206,How does persistence work?, https://github.com/simonw/datasette/issues/401#issuecomment-455520561,https://api.github.com/repos/simonw/datasette/issues/401,455520561,MDEyOklzc3VlQ29tbWVudDQ1NTUyMDU2MQ==,1055831,dazzag24,2019-01-18T11:48:13Z,2019-01-18T11:48:13Z,NONE,"Thanks. I'll take a look at your changes. I must admit I was struggling to see how to pass info from the python code in __init__.py into the javascript document.addEventListener function.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",400229984,How to pass configuration to plugins?, https://github.com/simonw/datasette/issues/403#issuecomment-455752238,https://api.github.com/repos/simonw/datasette/issues/403,455752238,MDEyOklzc3VlQ29tbWVudDQ1NTc1MjIzOA==,1794527,ccorcos,2019-01-19T05:47:55Z,2019-01-19T05:47:55Z,NONE,Ah. That makes much more sense. Interesting approach.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",400511206,How does persistence work?, https://github.com/simonw/datasette/issues/405#issuecomment-457975075,https://api.github.com/repos/simonw/datasette/issues/405,457975075,MDEyOklzc3VlQ29tbWVudDQ1Nzk3NTA3NQ==,9599,simonw,2019-01-28T01:41:51Z,2019-01-28T01:41:51Z,OWNER,Implemented in https://github.com/simonw/datasette/commit/b5dd83981a7dbff571284d4d90a950c740245b05,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403617881,.json?_nl=on option for exporting newline-delimited JSON, https://github.com/simonw/datasette/issues/405#issuecomment-457975857,https://api.github.com/repos/simonw/datasette/issues/405,457975857,MDEyOklzc3VlQ29tbWVudDQ1Nzk3NTg1Nw==,9599,simonw,2019-01-28T01:48:37Z,2019-01-28T01:49:00Z,OWNER,"Demo: https://latest.datasette.io/fixtures-dd88475/facetable.json?_shape=array&_nl=on Also https://b5dd839.datasette.io/fixtures-dd88475/facetable.json?_shape=array&_nl=on","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403617881,.json?_nl=on option for exporting newline-delimited JSON, https://github.com/simonw/datasette/pull/404#issuecomment-457976864,https://api.github.com/repos/simonw/datasette/issues/404,457976864,MDEyOklzc3VlQ29tbWVudDQ1Nzk3Njg2NA==,9599,simonw,2019-01-28T01:56:55Z,2019-01-28T01:56:55Z,OWNER,"This failed in Python 3.5: ``` File ""/home/travis/virtualenv/python3.5.6/lib/python3.5/site-packages/jinja2/environment.py"", line 1020, in render_async raise NotImplementedError('This feature is not available for this ' NotImplementedError: This feature is not available for this version of Python ``` It looks like this is caused by this feature detection code: https://github.com/pallets/jinja/blob/a7ba0b637805c53d442e975e3864d3ea38d8743f/jinja2/utils.py#L633-L638","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403499298,Experiment: run Jinja in async mode, https://github.com/simonw/sqlite-utils/issues/6#issuecomment-457978729,https://api.github.com/repos/simonw/sqlite-utils/issues/6,457978729,MDEyOklzc3VlQ29tbWVudDQ1Nzk3ODcyOQ==,9599,simonw,2019-01-28T02:12:19Z,2019-01-28T02:12:19Z,OWNER,Will need to solve #7 for this to become truly efficient.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403624090,"""sqlite-utils insert"" should support newline-delimited JSON", https://github.com/simonw/sqlite-utils/issues/7#issuecomment-457980966,https://api.github.com/repos/simonw/sqlite-utils/issues/7,457980966,MDEyOklzc3VlQ29tbWVudDQ1Nzk4MDk2Ng==,9599,simonw,2019-01-28T02:29:32Z,2019-01-28T02:29:32Z,OWNER,"Remember to remove this TODO (and turn the `[]` into `()` on this line) as part of this task: https://github.com/simonw/sqlite-utils/blob/5309c5c7755818323a0f5353bad0de98ecc866be/sqlite_utils/cli.py#L78-L80","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403625674,.insert_all() should accept a generator and process it efficiently, https://github.com/simonw/sqlite-utils/issues/7#issuecomment-458011885,https://api.github.com/repos/simonw/sqlite-utils/issues/7,458011885,MDEyOklzc3VlQ29tbWVudDQ1ODAxMTg4NQ==,9599,simonw,2019-01-28T06:25:48Z,2019-01-28T06:25:48Z,OWNER,Re-opening for the second bit involving the cli tool.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403625674,.insert_all() should accept a generator and process it efficiently, https://github.com/simonw/sqlite-utils/issues/7#issuecomment-458011906,https://api.github.com/repos/simonw/sqlite-utils/issues/7,458011906,MDEyOklzc3VlQ29tbWVudDQ1ODAxMTkwNg==,9599,simonw,2019-01-28T06:25:55Z,2019-01-28T06:25:55Z,OWNER,"I tested this with a script called `churn_em_out.py` ``` i = 0 while True: i += 1 print( '{""id"": I, ""another"": ""row"", ""number"": J}'.replace(""I"", str(i)).replace( ""J"", str(i + 1) ) ) ``` Then I ran this: ``` python churn_em_out.py | \ sqlite-utils insert /tmp/getbig.db stats - \ --nl --batch-size=10000 ``` And used `watch 'ls -lah /tmp/getbig.db'` to watch the file growing as it had 10,000 lines of junk committed in batches. The memory used by the process never grew about around 50MB.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403625674,.insert_all() should accept a generator and process it efficiently, https://github.com/simonw/datasette/issues/160#issuecomment-459915995,https://api.github.com/repos/simonw/datasette/issues/160,459915995,MDEyOklzc3VlQ29tbWVudDQ1OTkxNTk5NQ==,82988,psychemedia,2019-02-02T00:43:16Z,2019-02-02T00:58:20Z,CONTRIBUTOR,"Do you have any simple working examples of how to use `--static`? Inspection of default served files suggests locations such as `http://example.com/-/static/app.css?0e06ee`. If `datasette` is being proxied to `http://example.com/foo/datasette`, what form should arguments to `--static` take so that static files are correctly referenced? Use case is here: https://github.com/psychemedia/jupyterserverproxy-datasette-demo Trying to do a really simple `datasette` demo in MyBinder using jupyter-server-proxy.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",278208011,Ability to bundle and serve additional static files, https://github.com/simonw/datasette/pull/407#issuecomment-460897973,https://api.github.com/repos/simonw/datasette/issues/407,460897973,MDEyOklzc3VlQ29tbWVudDQ2MDg5Nzk3Mw==,9599,simonw,2019-02-06T04:31:30Z,2019-02-06T04:31:30Z,OWNER,This helped my figure out what to do: https://github.com/heroku/heroku-builds/issues/36,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",407073223,Heroku --include-vcs-ignore, https://github.com/simonw/datasette/issues/398#issuecomment-460901857,https://api.github.com/repos/simonw/datasette/issues/398,460901857,MDEyOklzc3VlQ29tbWVudDQ2MDkwMTg1Nw==,9599,simonw,2019-02-06T05:01:19Z,2019-02-06T05:01:19Z,OWNER,"I'd really like to use the content-length header here, but Sanic hasn't yet fixed the bug I filed about it: https://github.com/huge-success/sanic/issues/1194","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",398011658,Ensure downloading a 100+MB SQLite database file works, https://github.com/simonw/datasette/issues/172#issuecomment-460902824,https://api.github.com/repos/simonw/datasette/issues/172,460902824,MDEyOklzc3VlQ29tbWVudDQ2MDkwMjgyNA==,9599,simonw,2019-02-06T05:09:05Z,2019-02-06T05:09:05Z,OWNER,"Demo: https://latest.datasette.io/fixtures-dd88475 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",280896290,Show size of .db file next to download link, https://github.com/simonw/datasette/issues/187#issuecomment-463917744,https://api.github.com/repos/simonw/datasette/issues/187,463917744,MDEyOklzc3VlQ29tbWVudDQ2MzkxNzc0NA==,4190962,phoenixjun,2019-02-15T05:58:44Z,2019-02-15T05:58:44Z,NONE,is this supported or not? you can comment if it is not supported so that people like me can stop trying.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/sqlite-utils/issues/8#issuecomment-464341721,https://api.github.com/repos/simonw/sqlite-utils/issues/8,464341721,MDEyOklzc3VlQ29tbWVudDQ2NDM0MTcyMQ==,82988,psychemedia,2019-02-16T12:08:41Z,2019-02-16T12:08:41Z,NONE,We also get an error if a column name contains a `.`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403922644,Problems handling column names containing spaces or - , https://github.com/simonw/datasette/issues/187#issuecomment-466325528,https://api.github.com/repos/simonw/datasette/issues/187,466325528,MDEyOklzc3VlQ29tbWVudDQ2NjMyNTUyOA==,2892252,fkuhn,2019-02-22T09:03:50Z,2019-02-22T09:03:50Z,NONE,"I ran into the same issue when trying to install datasette on windows after successfully using it on linux. Unfortunately, there has not been any progress in implementing uvloop for windows - so I recommend not to use it on win. You can read about this issue here: [https://github.com/MagicStack/uvloop/issues/14](url)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/sqlite-utils/issues/8#issuecomment-466695500,https://api.github.com/repos/simonw/sqlite-utils/issues/8,466695500,MDEyOklzc3VlQ29tbWVudDQ2NjY5NTUwMA==,9599,simonw,2019-02-23T21:09:03Z,2019-02-23T21:09:03Z,OWNER,"Fixed in https://github.com/simonw/sqlite-utils/commit/228d595f7d10994f34e948888093c2cd290267c4 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403922644,Problems handling column names containing spaces or - , https://github.com/simonw/sqlite-utils/issues/11#issuecomment-466695672,https://api.github.com/repos/simonw/sqlite-utils/issues/11,466695672,MDEyOklzc3VlQ29tbWVudDQ2NjY5NTY3Mg==,9599,simonw,2019-02-23T21:10:23Z,2019-02-23T21:10:23Z,OWNER,"Rough sketch: ``` +try: + import numpy +except ImportError: + numpy = None + Column = namedtuple( ""Column"", (""cid"", ""name"", ""type"", ""notnull"", ""default_value"", ""is_pk"") ) @@ -70,6 +79,22 @@ class Database: datetime.time: ""TEXT"", None.__class__: ""TEXT"", } + # If numpy is available, add more types + if numpy: + col_type_mapping.update({ + numpy.int8: ""INTEGER"", + numpy.int16: ""INTEGER"", + numpy.int32: ""INTEGER"", + numpy.int64: ""INTEGER"", + numpy.uint8: ""INTEGER"", + numpy.uint16: ""INTEGER"", + numpy.uint32: ""INTEGER"", + numpy.uint64: ""INTEGER"", + numpy.float16: ""FLOAT"", + numpy.float32: ""FLOAT"", + numpy.float64: ""FLOAT"", + numpy.float128: ""FLOAT"", + }) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413740684,Detect numpy types when creating tables, https://github.com/simonw/sqlite-utils/issues/11#issuecomment-466695695,https://api.github.com/repos/simonw/sqlite-utils/issues/11,466695695,MDEyOklzc3VlQ29tbWVudDQ2NjY5NTY5NQ==,9599,simonw,2019-02-23T21:10:35Z,2019-02-23T21:10:35Z,OWNER,Need to test this both with and without `numpy` installed.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413740684,Detect numpy types when creating tables, https://github.com/simonw/sqlite-utils/issues/13#issuecomment-466732039,https://api.github.com/repos/simonw/sqlite-utils/issues/13,466732039,MDEyOklzc3VlQ29tbWVudDQ2NjczMjAzOQ==,9599,simonw,2019-02-24T04:07:57Z,2019-02-24T04:07:57Z,OWNER,"Example: http://api.nobelprize.org/v1/laureate.json This includes affiliations which look like this: ""affiliations"": [ { ""name"": ""Sorbonne University"", ""city"": ""Paris"", ""country"": ""France"" } ]","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413779210,Ability to automatically create IDs from content hash of row, https://github.com/simonw/sqlite-utils/issues/10#issuecomment-466794069,https://api.github.com/repos/simonw/sqlite-utils/issues/10,466794069,MDEyOklzc3VlQ29tbWVudDQ2Njc5NDA2OQ==,9599,simonw,2019-02-24T16:55:37Z,2019-02-24T16:55:37Z,OWNER,"This was fixed by https://github.com/simonw/sqlite-utils/commit/228d595f7d10994f34e948888093c2cd290267c4 - see also #8 ``` >>> db = sqlite_utils.Database("":memory:"") >>> dfX=pd.DataFrame({'order':range(3),'col2':range(3)}) >>> db[""test""].upsert_all(dfX.to_dict(orient='records')) ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",411066700,Error in upsert if column named 'order', https://github.com/simonw/sqlite-utils/issues/14#issuecomment-466794369,https://api.github.com/repos/simonw/sqlite-utils/issues/14,466794369,MDEyOklzc3VlQ29tbWVudDQ2Njc5NDM2OQ==,9599,simonw,2019-02-24T16:59:11Z,2019-02-24T16:59:43Z,OWNER,"https://www.sqlite.org/lang_createindex.html ![image](https://user-images.githubusercontent.com/9599/53302378-72512c80-3812-11e9-8828-46a03d893879.png) May as well support ``--if-not-exists`` as well.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413842611,Utilities for adding indexes, https://github.com/simonw/sqlite-utils/issues/14#issuecomment-466800090,https://api.github.com/repos/simonw/sqlite-utils/issues/14,466800090,MDEyOklzc3VlQ29tbWVudDQ2NjgwMDA5MA==,9599,simonw,2019-02-24T18:01:10Z,2019-02-24T18:01:10Z,OWNER,"The `WHERE` clause can be used to create partial indexes: https://www.sqlite.org/partialindex.html I'm going to ignore it for the moment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413842611,Utilities for adding indexes, https://github.com/simonw/sqlite-utils/issues/14#issuecomment-466800210,https://api.github.com/repos/simonw/sqlite-utils/issues/14,466800210,MDEyOklzc3VlQ29tbWVudDQ2NjgwMDIxMA==,9599,simonw,2019-02-24T18:02:23Z,2019-02-24T18:02:23Z,OWNER,Likewise I'm going to ignore indexes on expressions (as opposed to just columns): https://www.sqlite.org/expridx.html,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413842611,Utilities for adding indexes, https://github.com/simonw/sqlite-utils/issues/2#issuecomment-466807308,https://api.github.com/repos/simonw/sqlite-utils/issues/2,466807308,MDEyOklzc3VlQ29tbWVudDQ2NjgwNzMwOA==,9599,simonw,2019-02-24T19:18:19Z,2019-02-24T19:18:25Z,OWNER,"Python API: db[""articles""].add_foreign_key(""author_id"", ""authors"", ""id"") CLI: $ sqlite-utils add-foreign-key articles author_id authors id ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",349850687,Mechanism for adding foreign keys to an existing table, https://github.com/simonw/sqlite-utils/issues/17#issuecomment-466820167,https://api.github.com/repos/simonw/sqlite-utils/issues/17,466820167,MDEyOklzc3VlQ29tbWVudDQ2NjgyMDE2Nw==,9599,simonw,2019-02-24T21:42:33Z,2019-02-24T21:42:33Z,OWNER,"It looks like the type information isn't actually used for anything at all, so this: https://github.com/simonw/sqlite-utils/blob/f8d3b7cfe5c1950b0749d40eb2640df50b52f651/tests/test_create.py#L97-L103 Could actually be written like this: ``` fresh_db[""m2m""].insert( {""one_id"": 1, ""two_id"": 1}, foreign_keys=( (""one_id"", ""one"", ""id""), (""two_id"", ""two"", ""id""), ), ) ``` ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413868452,Improve and document foreign_keys=... argument to insert/create/etc, https://github.com/simonw/sqlite-utils/issues/17#issuecomment-466820188,https://api.github.com/repos/simonw/sqlite-utils/issues/17,466820188,MDEyOklzc3VlQ29tbWVudDQ2NjgyMDE4OA==,9599,simonw,2019-02-24T21:42:50Z,2019-02-24T21:42:50Z,OWNER,Sanity checking those foreign keys would be worthwhile.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413868452,Improve and document foreign_keys=... argument to insert/create/etc, https://github.com/simonw/sqlite-utils/issues/17#issuecomment-466821200,https://api.github.com/repos/simonw/sqlite-utils/issues/17,466821200,MDEyOklzc3VlQ29tbWVudDQ2NjgyMTIwMA==,9599,simonw,2019-02-24T21:55:08Z,2019-02-24T21:55:54Z,OWNER,"This involves a breaking API change. I need to call that out in the README and also fix my two other projects which use the old four-tuple version of `foreign_keys=`: https://github.com/simonw/db-to-sqlite/blob/c2f8e93bc6bbdfd135de3656ea0f497859ae49ff/db_to_sqlite/cli.py#L30-L42 And https://github.com/simonw/russian-ira-facebook-ads-datasette/blob/e7106710abdd7bdcae035bedd8bdaba75ae56a12/fetch_and_build_russian_ads.py#L71-L74 I'll also need to set a minimum version for `sqlite-utils` in the `db-to-sqlite` setup.py: https://github.com/simonw/db-to-sqlite/blob/c2f8e93bc6bbdfd135de3656ea0f497859ae49ff/setup.py#L25","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413868452,Improve and document foreign_keys=... argument to insert/create/etc, https://github.com/simonw/sqlite-utils/issues/17#issuecomment-466823422,https://api.github.com/repos/simonw/sqlite-utils/issues/17,466823422,MDEyOklzc3VlQ29tbWVudDQ2NjgyMzQyMg==,9599,simonw,2019-02-24T22:20:05Z,2019-02-24T22:20:05Z,OWNER,Re-opening this until I've fixed the other two projects.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413868452,Improve and document foreign_keys=... argument to insert/create/etc, https://github.com/simonw/sqlite-utils/issues/17#issuecomment-466827533,https://api.github.com/repos/simonw/sqlite-utils/issues/17,466827533,MDEyOklzc3VlQ29tbWVudDQ2NjgyNzUzMw==,9599,simonw,2019-02-24T23:03:29Z,2019-02-24T23:03:29Z,OWNER,Need to put out a new release of `sqlite-utils` so `db-to-sqlite` can depend on it.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413868452,Improve and document foreign_keys=... argument to insert/create/etc, https://github.com/simonw/sqlite-utils/issues/17#issuecomment-466828503,https://api.github.com/repos/simonw/sqlite-utils/issues/17,466828503,MDEyOklzc3VlQ29tbWVudDQ2NjgyODUwMw==,9599,simonw,2019-02-24T23:15:26Z,2019-02-24T23:15:26Z,OWNER,Released: https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v0-14,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413868452,Improve and document foreign_keys=... argument to insert/create/etc, https://github.com/simonw/sqlite-utils/issues/17#issuecomment-466830869,https://api.github.com/repos/simonw/sqlite-utils/issues/17,466830869,MDEyOklzc3VlQ29tbWVudDQ2NjgzMDg2OQ==,9599,simonw,2019-02-24T23:45:48Z,2019-02-24T23:45:48Z,OWNER,Both projects have been upgraded.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413868452,Improve and document foreign_keys=... argument to insert/create/etc, https://github.com/simonw/datasette/issues/187#issuecomment-467264937,https://api.github.com/repos/simonw/datasette/issues/187,467264937,MDEyOklzc3VlQ29tbWVudDQ2NzI2NDkzNw==,9599,simonw,2019-02-26T02:14:28Z,2019-02-26T02:14:28Z,OWNER,I'm working on a port of Datasette to Starlette which I think would fix this issue: https://github.com/encode/starlette,"{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/datasette/issues/409#issuecomment-472844001,https://api.github.com/repos/simonw/datasette/issues/409,472844001,MDEyOklzc3VlQ29tbWVudDQ3Mjg0NDAwMQ==,43100,Uninen,2019-03-14T13:04:20Z,2019-03-14T13:04:42Z,NONE,It seems this affects the Datasette Publish -site as well: https://github.com/simonw/datasette-publish-support/issues/3,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",408376825,Zeit API v1 does not work for new users - need to migrate to v2, https://github.com/simonw/datasette/issues/409#issuecomment-472875713,https://api.github.com/repos/simonw/datasette/issues/409,472875713,MDEyOklzc3VlQ29tbWVudDQ3Mjg3NTcxMw==,209967,michaelmcandrew,2019-03-14T14:14:39Z,2019-03-14T14:14:39Z,NONE,also linking this zeit issue in case it is helpful: https://github.com/zeit/now-examples/issues/163#issuecomment-440125769,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",408376825,Zeit API v1 does not work for new users - need to migrate to v2, https://github.com/simonw/datasette/pull/416#issuecomment-473154643,https://api.github.com/repos/simonw/datasette/issues/416,473154643,MDEyOklzc3VlQ29tbWVudDQ3MzE1NDY0Mw==,9599,simonw,2019-03-15T04:27:47Z,2019-03-15T04:28:00Z,OWNER,"Deployed a demo: https://datasette-optional-hash-demo.now.sh/ datasette publish now \ ../demo-databses/russian-ads.db \ ../demo-databses/polar-bears.db \ --branch=optional-hash \ -n datasette-optional-hash \ --alias datasette-optional-hash-demo \ --install=datasette-cluster-map \ --install=datasette-json-html ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421348146,URL hashing now optional: turn on with --config hash_urls:1 (#418), https://github.com/simonw/datasette/pull/416#issuecomment-473156513,https://api.github.com/repos/simonw/datasette/issues/416,473156513,MDEyOklzc3VlQ29tbWVudDQ3MzE1NjUxMw==,9599,simonw,2019-03-15T04:40:29Z,2019-03-15T04:40:29Z,OWNER,"Still TODO: need to figure out what to do about cache TTL. Defaulting to 365 days no longer makes sense without the hash_urls setting. Maybe drop that setting default to 0? Here's the setting: https://github.com/simonw/datasette/blob/9743e1d91b5f0a2b3c1c0bd6ffce8739341f43c4/datasette/app.py#L84-L86 And here's where it takes affect: https://github.com/simonw/datasette/blob/4462a5ab2817ac0d9ffe20dafbbf27c5c5b81466/datasette/views/base.py#L491-L501","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421348146,URL hashing now optional: turn on with --config hash_urls:1 (#418), https://github.com/simonw/datasette/issues/414#issuecomment-473156774,https://api.github.com/repos/simonw/datasette/issues/414,473156774,MDEyOklzc3VlQ29tbWVudDQ3MzE1Njc3NA==,9599,simonw,2019-03-15T04:42:06Z,2019-03-15T04:42:06Z,OWNER,"This has been bothering me as well, especially when I try to install `datasette` and `sqlite-utils` at the same time.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",415575624,datasette requires specific version of Click, https://github.com/simonw/datasette/issues/411#issuecomment-473156905,https://api.github.com/repos/simonw/datasette/issues/411,473156905,MDEyOklzc3VlQ29tbWVudDQ3MzE1NjkwNQ==,9599,simonw,2019-03-15T04:42:58Z,2019-03-15T04:42:58Z,OWNER,"Have you tried this? MakePoint(:Long || "", "" || :Lat) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",410384988,How to pass named parameter into spatialite MakePoint() function, https://github.com/simonw/datasette/issues/415#issuecomment-473157770,https://api.github.com/repos/simonw/datasette/issues/415,473157770,MDEyOklzc3VlQ29tbWVudDQ3MzE1Nzc3MA==,9599,simonw,2019-03-15T04:49:03Z,2019-03-15T04:49:03Z,OWNER,"Interesting idea. I can see how this would make sense if you are dealing with really long SQL queries. My own example of a long query that might benefit from this: https://russian-ads-demo.herokuapp.com/russian-ads-a42c4e8?sql=select%0D%0A++++target_id%2C%0D%0A++++targets.name%2C%0D%0A++++count(*)+as+n%2C%0D%0A++++json_object(%0D%0A++++++++%22href%22%2C+%22%2Frussian-ads%2Ffaceted-targets%3Ftargets%3D%22+||+%0D%0A++++++++++++json_insert(%3Atargets%2C+%27%24[%27+||+json_array_length(%3Atargets)+||+%27]%27%2C+target_id)%0D%0A++++++++%2C%0D%0A++++++++%22label%22%2C+json_insert(%3Atargets%2C+%27%24[%27+||+json_array_length(%3Atargets)+||+%27]%27%2C+target_id)%0D%0A++++)+as+apply_this_facet%2C%0D%0A++++json_object(%0D%0A++++++++%22href%22%2C+%22%2Frussian-ads%2Fdisplay_ads%3F_targets_json%3D%22+||+%0D%0A++++++++++++json_insert(%3Atargets%2C+%27%24[%27+||+json_array_length(%3Atargets)+||+%27]%27%2C+target_id)%0D%0A++++++++%2C%0D%0A++++++++%22label%22%2C+%22See+%22+||+count(*)+||+%22+ads+matching+%22+||+json_insert(%3Atargets%2C+%27%24[%27+||+json_array_length(%3Atargets)+||+%27]%27%2C+target_id)%0D%0A++++)+as+browse_these_ads%0D%0Afrom+ad_targets%0D%0Ajoin+targets+on+ad_targets.target_id+%3D+targets.id%0D%0Awhere%0D%0A++++json_array_length(%3Atargets)+%3D%3D+0+or%0D%0A++++ad_id+in+(%0D%0A++++++++select+ad_id%0D%0A++++++++from+%22ad_targets%22%0D%0A++++++++where+%22ad_targets%22.target_id+in+(select+value+from+json_each(%3Atargets))%0D%0A++++++++group+by+%22ad_targets%22.ad_id%0D%0A++++++++having+count(distinct+%22ad_targets%22.target_id)+%3D+json_array_length(%3Atargets)%0D%0A++++)%0D%0A++++and+target_id+not+in+(select+value+from+json_each(%3Atargets))%0D%0Agroup+by%0D%0A++++target_id+order+by+n+desc%0D%0A&targets=[%22e6200%22] Having a `show/hide` link would be an easy way to support this in the UI, and those could add/remove a `_hide_sql=1` parameter.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",418329842,Add query parameter to hide SQL textarea, https://github.com/simonw/datasette/issues/412#issuecomment-473158506,https://api.github.com/repos/simonw/datasette/issues/412,473158506,MDEyOklzc3VlQ29tbWVudDQ3MzE1ODUwNg==,9599,simonw,2019-03-15T04:53:53Z,2019-03-15T04:53:53Z,OWNER,"I've been thinking about how Datasette instances could query each other for a while - it's a really interesting direction. There are some tricky problems to solve to get this to work. There's a SQLite mechanism called ""virtual table functions"" which can implement things like this, but it's not supported by Python's `sqlite3` module out of the box. https://github.com/coleifer/sqlite-vtfunc is a library that enables this feature. I experimented with using that to implement a function that scrapes HTML content (with an eye to accessing data from other APIs and Datasette instances) a while ago: https://github.com/coleifer/sqlite-vtfunc/issues/6 The bigger challenge is how to get this kind of thing to behave well within a Python 3 async environment. I have some ideas here but they're going to require some very crafty engineering.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",411257981,Linked Data(sette), https://github.com/simonw/datasette/pull/416#issuecomment-473159679,https://api.github.com/repos/simonw/datasette/issues/416,473159679,MDEyOklzc3VlQ29tbWVudDQ3MzE1OTY3OQ==,9599,simonw,2019-03-15T05:01:27Z,2019-03-15T05:01:27Z,OWNER,"Also: if the option is False and the user visits a URL with a hash in it, should we redirect them? I'm inclined to say no: furthermore, I'd be OK continuing to serve a far-future cache header for that case.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421348146,URL hashing now optional: turn on with --config hash_urls:1 (#418), https://github.com/simonw/datasette/pull/413#issuecomment-473160476,https://api.github.com/repos/simonw/datasette/issues/413,473160476,MDEyOklzc3VlQ29tbWVudDQ3MzE2MDQ3Ng==,9599,simonw,2019-03-15T05:06:37Z,2019-03-15T05:06:37Z,OWNER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413887019,Update spatialite.rst, https://github.com/simonw/datasette/pull/416#issuecomment-473160702,https://api.github.com/repos/simonw/datasette/issues/416,473160702,MDEyOklzc3VlQ29tbWVudDQ3MzE2MDcwMg==,9599,simonw,2019-03-15T05:08:13Z,2019-03-15T05:08:13Z,OWNER,This also needs extensive tests to ensure that with the option turned on all of the redirects behave as they should.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421348146,URL hashing now optional: turn on with --config hash_urls:1 (#418), https://github.com/simonw/datasette/issues/415#issuecomment-473164038,https://api.github.com/repos/simonw/datasette/issues/415,473164038,MDEyOklzc3VlQ29tbWVudDQ3MzE2NDAzOA==,9599,simonw,2019-03-15T05:31:21Z,2019-03-15T05:31:21Z,OWNER,"Demo: https://latest.datasette.io/fixtures-dd88475?sql=select+%2A+from+sortable+order+by+pk1%2C+pk2+limit+101 v.s. https://latest.datasette.io/fixtures-dd88475?sql=select+%2A+from+sortable+order+by+pk1%2C+pk2+limit+101&_hide_sql=1 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",418329842,Add query parameter to hide SQL textarea, https://github.com/simonw/datasette/issues/415#issuecomment-473217334,https://api.github.com/repos/simonw/datasette/issues/415,473217334,MDEyOklzc3VlQ29tbWVudDQ3MzIxNzMzNA==,36796532,ad-si,2019-03-15T09:30:57Z,2019-03-15T09:30:57Z,NONE,"Awesome, thanks! 😁 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",418329842,Add query parameter to hide SQL textarea, https://github.com/simonw/datasette/issues/417#issuecomment-473308631,https://api.github.com/repos/simonw/datasette/issues/417,473308631,MDEyOklzc3VlQ29tbWVudDQ3MzMwODYzMQ==,9599,simonw,2019-03-15T14:32:13Z,2019-03-15T14:32:13Z,OWNER,"This would allow Datasette to be easily used as a ""data library"" (like a data warehouse but less expectation of big data querying technology such as Presto). One of the things I learned at the NICAR CAR 2019 conference in Newport Beach is that there is a very real need for some kind of easily accessible data library at most newsrooms.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421546944,Datasette Library, https://github.com/simonw/datasette/pull/416#issuecomment-473310026,https://api.github.com/repos/simonw/datasette/issues/416,473310026,MDEyOklzc3VlQ29tbWVudDQ3MzMxMDAyNg==,9599,simonw,2019-03-15T14:35:53Z,2019-03-15T14:35:53Z,OWNER,See #418 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421348146,URL hashing now optional: turn on with --config hash_urls:1 (#418), https://github.com/simonw/datasette/issues/417#issuecomment-473312514,https://api.github.com/repos/simonw/datasette/issues/417,473312514,MDEyOklzc3VlQ29tbWVudDQ3MzMxMjUxNA==,9599,simonw,2019-03-15T14:42:07Z,2019-03-17T22:12:30Z,OWNER,"A neat ability of Datasette Library would be if it can work against other files that have been dropped into the folder. In particular: if a user drops a CSV file into the folder, how about automatically converting that CSV file to SQLite using [sqlite-utils](https://github.com/simonw/sqlite-utils)?","{""total_count"": 2, ""+1"": 2, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421546944,Datasette Library, https://github.com/simonw/datasette/issues/123#issuecomment-473313975,https://api.github.com/repos/simonw/datasette/issues/123,473313975,MDEyOklzc3VlQ29tbWVudDQ3MzMxMzk3NQ==,9599,simonw,2019-03-15T14:45:46Z,2019-03-15T14:45:46Z,OWNER,"I'm reopening this one as part of #417. Further experience with Python's CSV standard library module has convinced me that pandas is not a required dependency for this. My [sqlite-utils](https://github.com/simonw/sqlite-utils) package can do most of the work here with very few dependencies.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats, https://github.com/simonw/datasette/issues/123#issuecomment-473323329,https://api.github.com/repos/simonw/datasette/issues/123,473323329,MDEyOklzc3VlQ29tbWVudDQ3MzMyMzMyOQ==,9599,simonw,2019-03-15T15:09:15Z,2019-05-14T15:53:05Z,OWNER,"How would Datasette accepting URLs work? I want to support not just SQLite files and CSVs but other extensible formats (geojson, Atom, shapefiles etc) as well. So `datasette serve` needs to be able to take filepaths or URLs to a variety of different content types. If it's a URL, we can use the first 200 downloaded bytes to decide which type of file it is. This is likely more reliable than hoping the web server provided the correct content-type. Also: let's have a threshold for downloading to disk. We will start downloading to a temp file (location controlled by an environment variable) if either the content length header is above that threshold OR we hit that much data cached in memory already and don't know how much more is still to come. There needs to be a command line option for saying ""grab from this URL but force treat it as CSV"" - same thing for files on disk. datasette mydb.db --type=db http://blah/blah --type=csv If you provide less `--type` options thatn you did URLs then the default behavior is used for all of the subsequent URLs. Auto detection could be tricky. Probably do this with a plugin hook. https://github.com/h2non/filetype.py is interesting but deals with images video etc so not right for this purpose. I think we need our own simple content sniffing code via a plugin hook. What if two plugin type hooks can both potentially handle a sniffed file? The CLI can quit and return an error saying content is ambiguous and you need to specify a `--type`, picking from the following list. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",275125561,Datasette serve should accept paths/URLs to CSVs and other file formats, https://github.com/simonw/datasette/issues/419#issuecomment-473708724,https://api.github.com/repos/simonw/datasette/issues/419,473708724,MDEyOklzc3VlQ29tbWVudDQ3MzcwODcyNA==,9599,simonw,2019-03-17T19:55:21Z,2019-05-16T03:35:59Z,OWNER,"Thinking about this further: I think I may have made a mistake establishing ""immutable"" as the default mode for databases opened by Datasette. What would it look like if files were NOT opened in immutable mode by default? Maybe the command to start Datasette looks like this: datasette mutable1.db mutable2.db --immutable=this_is_immutable.db --immutable=this_is_immutable2.db So regular file arguments are treated as mutable (and opened in `?mode=ro`) while file arguments passed using the new `--immutable` option are opened in immutable mode. The `-i` shortcut has not yet been taken, so this could be abbreviated to: datasette mutable1.db mutable2.db -i this_is_immutable.db -i this_is_immutable2.db","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421551434,"Default to opening files in mutable mode, special option for immutable files", https://github.com/simonw/datasette/issues/419#issuecomment-473708941,https://api.github.com/repos/simonw/datasette/issues/419,473708941,MDEyOklzc3VlQ29tbWVudDQ3MzcwODk0MQ==,9599,simonw,2019-03-17T19:58:11Z,2019-03-17T19:58:11Z,OWNER,"Some problems to solve: * Right now Datasette assumes it can always show the count of rows in a table, because this has been pre-calculated. If a database is mutable the pre-calculation trick no longer works, and for giant tables a `select count(*) from X` query can be expensive to run. Maybe we set a time limit on these? If time limit expires show ""many rows""? * Maintaining a content hash of the table no longer makes sense if it is changing (though interestingly there's a `.sha3sum` built-in SQLite CLI command which takes a hash of the content and stays the same even through vacuum runs). Without that we need a different mechanism for calculating table colours. It also means that we can't do the special dbname-hash URL trick (see #418) at all if the database is opened as mutable.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421551434,"Default to opening files in mutable mode, special option for immutable files", https://github.com/simonw/datasette/issues/418#issuecomment-473709815,https://api.github.com/repos/simonw/datasette/issues/418,473709815,MDEyOklzc3VlQ29tbWVudDQ3MzcwOTgxNQ==,9599,simonw,2019-03-17T20:08:31Z,2019-03-17T20:08:31Z,OWNER,"In #419 I'm now proposing that Datasette default to opening files in ""mutable"" mode, in which case it would not make sense to support hash URLs for those files at all. So actually this feature will only be available for files that are explicitly opened in immutable mode.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421548881,Hashed URLs should be optional, https://github.com/simonw/datasette/issues/419#issuecomment-473709883,https://api.github.com/repos/simonw/datasette/issues/419,473709883,MDEyOklzc3VlQ29tbWVudDQ3MzcwOTg4Mw==,9599,simonw,2019-03-17T20:09:47Z,2019-03-17T20:37:45Z,OWNER,"Could I persist the last calculated count for a table and somehow detect if that table has been changed in any way by another process, hence invalidating the cached count (and potentially scheduling a new count)? https://www.sqlite.org/c3ref/update_hook.html says that `sqlite3_update_hook()` can be used to register a handler invoked on almost all update/insert/delete operations to a specific table... except that it misses out on deletes triggered by `ON CONFLICT REPLACE` and only works for `ROWID` tables. Also this hook is not exposed in the Python `sqlite3` library - though it may be available using some terrifying `ctypes` hacks: https://stackoverflow.com/a/16920926 So on further research, I think the answer is *no*: I should assume that it won't be possible to cache counts and magically invalidate the cache when the underlying file is changed by another process. Instead I need to assume that counts will be an expensive operation. As such, I can introduce a time limit on counts and use that anywhere a count is displayed. If the time limit is exceeded by the `count(*)` query I can show ""many"" instead. That said... running `count(*)` against a table with 200,000 rows in only takes about 3ms, so even a timeout of 20ms is likely to work fine for tables of around a million rows. It would be really neat if I could generate a lower bound count in a limited amount of time. If I counted up to 4m rows before the timeout I could show ""more than 4m rows"". No idea if that would be possible though. Relevant: https://stackoverflow.com/questions/8988915/sqlite-count-slow-on-big-tables - reports of very slow counts on 6GB database file. Consensus seems to be ""yeah, that's just how SQLite is built"" - though there was a suggestion that you can use `select max(ROWID) from table` provided you are certain there have been no deletions. Also relevant: http://sqlite.1065341.n5.nabble.com/sqlite3-performance-on-select-count-very-slow-for-16-GB-file-td80176.html","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421551434,"Default to opening files in mutable mode, special option for immutable files", https://github.com/simonw/datasette/issues/419#issuecomment-473712820,https://api.github.com/repos/simonw/datasette/issues/419,473712820,MDEyOklzc3VlQ29tbWVudDQ3MzcxMjgyMA==,9599,simonw,2019-03-17T20:43:23Z,2019-03-17T20:43:51Z,OWNER,"So the differences here are: * For immutable databases we calculate content hash and table counts; mutable databases we do not * Immutable databasse open with `file:{}?immutable=1`, mutable databases open with `file:{}?mode=ro` * Anywhere that shows a table count now needs to call a new method which knows to run `count(*)` with a timeout for mutable databases, read from the precalculated counts for immutable databases * The url-hash option should no longer be available at all for mutable databases * New command-line tool syntax: `datasette mutable.db` v.s. `datasette -i immutable.db`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421551434,"Default to opening files in mutable mode, special option for immutable files", https://github.com/simonw/datasette/issues/419#issuecomment-473713363,https://api.github.com/repos/simonw/datasette/issues/419,473713363,MDEyOklzc3VlQ29tbWVudDQ3MzcxMzM2Mw==,9599,simonw,2019-03-17T20:49:39Z,2019-03-17T20:52:46Z,OWNER,"And a really important difference: the whole model of caching inspect data no longer works for mutable files, because another process might make a change to the database schema (adding a new table for example). https://fivethirtyeight.datasettes.com/-/inspect So everywhere that uses `self.ds.inspect()` right now will have to change to calling a routine which knows the difference between mutable and immutable databases and queries for live schema data for mutables while using a cache for immutables. I'll track this as a separate ticket.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421551434,"Default to opening files in mutable mode, special option for immutable files", https://github.com/simonw/datasette/issues/420#issuecomment-473713946,https://api.github.com/repos/simonw/datasette/issues/420,473713946,MDEyOklzc3VlQ29tbWVudDQ3MzcxMzk0Ng==,9599,simonw,2019-03-17T20:56:38Z,2019-03-17T20:58:17Z,OWNER,"Some examples: https://github.com/simonw/datasette/blob/1f54e092306b208125f39d06712b02895eb75168/datasette/views/table.py#L34-L40 https://github.com/simonw/datasette/blob/1f54e092306b208125f39d06712b02895eb75168/datasette/views/table.py#L45-L48 https://github.com/simonw/datasette/blob/1f54e092306b208125f39d06712b02895eb75168/datasette/views/table.py#L62-L65 https://github.com/simonw/datasette/blob/1f54e092306b208125f39d06712b02895eb75168/datasette/views/table.py#L112-L123 https://github.com/simonw/datasette/blob/1f54e092306b208125f39d06712b02895eb75168/datasette/views/index.py#L11-L19 https://github.com/simonw/datasette/blob/afe9aa3ae03c485c5d6652741438d09445a486c1/datasette/views/base.py#L143-L147 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421971339,Fix all the places that currently use .inspect() data, https://github.com/simonw/datasette/pull/416#issuecomment-473714545,https://api.github.com/repos/simonw/datasette/issues/416,473714545,MDEyOklzc3VlQ29tbWVudDQ3MzcxNDU0NQ==,9599,simonw,2019-03-17T21:03:08Z,2019-03-17T21:04:17Z,OWNER,I'm going to introduce a new config setting: `default_cache_ttl_hashed` - and set the default value for `default_cache_ttl` to 10s (to protect against dog-piling).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421348146,URL hashing now optional: turn on with --config hash_urls:1 (#418), https://github.com/simonw/datasette/pull/416#issuecomment-473715254,https://api.github.com/repos/simonw/datasette/issues/416,473715254,MDEyOklzc3VlQ29tbWVudDQ3MzcxNTI1NA==,9599,simonw,2019-03-17T21:11:37Z,2019-03-17T21:11:37Z,OWNER,The code for this has got a bit tricky. I need to make a decision at some point as to if the current request is a hashed_url request (if it includes a DB hash in the URL which is the current correct hash). I then need to be able to use that fact to decide which default TTL value to apply when returning the response.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421348146,URL hashing now optional: turn on with --config hash_urls:1 (#418), https://github.com/simonw/datasette/pull/416#issuecomment-473717052,https://api.github.com/repos/simonw/datasette/issues/416,473717052,MDEyOklzc3VlQ29tbWVudDQ3MzcxNzA1Mg==,9599,simonw,2019-03-17T21:32:24Z,2019-03-17T21:33:16Z,OWNER,"Since this feature is now controlled by a config setting, I'm inclined to make it also available via a URL parameter. If you hit this URL: /fixtures/table.json?_hash=1 We can redirect to: /fixtures-c2342/table.json In this way developers can opt-in to a hashed (and hence far-future cached) response on a per-query basis. This option won't be available against mutable databases though, which are coming in #419 This means that the `hash_urls:1` config basically has the effect of assuming `?_hash=1` on all URLs to mutable databases.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421348146,URL hashing now optional: turn on with --config hash_urls:1 (#418), https://github.com/simonw/datasette/issues/418#issuecomment-473724868,https://api.github.com/repos/simonw/datasette/issues/418,473724868,MDEyOklzc3VlQ29tbWVudDQ3MzcyNDg2OA==,9599,simonw,2019-03-17T23:07:31Z,2019-03-17T23:07:31Z,OWNER,"The design of this feature is discussed extensively in the comments on pull request #416 Some demos: * https://latest.datasette.io/fixtures/facetable now no longer redirects to the hash * https://latest.datasette.io/fixtures/facetable?_hash=1 redirects to https://latest.datasette.io/fixtures-dd88475/facetable ``` ~ $ curl -i 'https://latest.datasette.io/fixtures-dd88475/facetable' HTTP/2 200 date: Sun, 17 Mar 2019 23:05:21 GMT content-type: text/html; charset=utf-8 content-length: 17555 cache-control: max-age=31536000 ~ $ curl -i 'https://latest.datasette.io/fixtures/facetable' HTTP/2 200 date: Sun, 17 Mar 2019 23:05:40 GMT content-type: text/html; charset=utf-8 content-length: 17410 cache-control: max-age=5 ``` There are now three config settings relevant to the above: `default_cache_ttl` - defaults to 5s. The default cache TTL for non-hashed resources. `default_cache_ttl_hashed` - defaults to 31536000s. The default cache TTL for hashed resources. `hash_urls` - defaults to False. If True, all URLs will attempt to redirect to their hashed version.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421548881,Hashed URLs should be optional, https://github.com/simonw/datasette/issues/419#issuecomment-473726527,https://api.github.com/repos/simonw/datasette/issues/419,473726527,MDEyOklzc3VlQ29tbWVudDQ3MzcyNjUyNw==,9599,simonw,2019-03-17T23:28:41Z,2019-05-16T14:54:50Z,OWNER,"I've added the `-i` option, so this now works: datasette -i fixtures.db This feature is incomplete though. Some extra changes I need to make: * The `?_hash=1` and `--config hash_urls:1` options (introduced in #418) should only work for immutable databases #471 * Would be useful if there was a debug screen that could show which databases were mounted as mutable v.s. immutable - maybe a `/-/databases` page? - #470 * Need to rework how `.inspect()` works, see #420 * Documentation is needed #421 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421551434,"Default to opening files in mutable mode, special option for immutable files", https://github.com/simonw/datasette/issues/420#issuecomment-473726587,https://api.github.com/repos/simonw/datasette/issues/420,473726587,MDEyOklzc3VlQ29tbWVudDQ3MzcyNjU4Nw==,9599,simonw,2019-03-17T23:29:22Z,2019-03-17T23:29:22Z,OWNER,Needed for #419,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421971339,Fix all the places that currently use .inspect() data, https://github.com/simonw/datasette/issues/421#issuecomment-473726619,https://api.github.com/repos/simonw/datasette/issues/421,473726619,MDEyOklzc3VlQ29tbWVudDQ3MzcyNjYxOQ==,9599,simonw,2019-03-17T23:29:47Z,2019-03-17T23:29:47Z,OWNER,Needed for #419,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421985685,Documentation for ?_hash=1 and Datasette's hashed URL caching, https://github.com/simonw/datasette/issues/420#issuecomment-473744172,https://api.github.com/repos/simonw/datasette/issues/420,473744172,MDEyOklzc3VlQ29tbWVudDQ3Mzc0NDE3Mg==,9599,simonw,2019-03-18T02:08:12Z,2019-03-18T02:08:12Z,OWNER,Maybe this is a good opportunity to improve the introspection capabilities in [sqlite-utils](https://github.com/simonw/sqlite-utils) and add it as a dependency.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421971339,Fix all the places that currently use .inspect() data, https://github.com/simonw/datasette/issues/417#issuecomment-474280581,https://api.github.com/repos/simonw/datasette/issues/417,474280581,MDEyOklzc3VlQ29tbWVudDQ3NDI4MDU4MQ==,82988,psychemedia,2019-03-19T10:06:42Z,2019-03-19T10:06:42Z,CONTRIBUTOR,"This would be really interesting but several possibilities in use arise, I think? For example: - I put a new CSV file into the import dir and a new table is created therefrom - I put a CSV file into the import dir that replaces a previous file / table of the same name as a pre-existing table (eg files that contain monthly data in year to date). The data may also patch previous months, so a full replace / DROP on the original table may well be in order. - I put a CSV file into the import dir that updates a table of the same name as a pre-existing table (eg files that contain last month's data) CSV files may also have messy names compared to the table you want. Or for an update CSV, may have the form `MYTABLENAME-February2019.csv` etc","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421546944,Datasette Library, https://github.com/simonw/datasette/issues/412#issuecomment-474282321,https://api.github.com/repos/simonw/datasette/issues/412,474282321,MDEyOklzc3VlQ29tbWVudDQ3NDI4MjMyMQ==,82988,psychemedia,2019-03-19T10:09:46Z,2019-03-19T10:09:46Z,CONTRIBUTOR,Does this also relate to https://github.com/simonw/datasette/issues/283 and the ability to `ATTACH DATABASE`?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",411257981,Linked Data(sette), https://github.com/simonw/datasette/issues/420#issuecomment-474398127,https://api.github.com/repos/simonw/datasette/issues/420,474398127,MDEyOklzc3VlQ29tbWVudDQ3NDM5ODEyNw==,9599,simonw,2019-03-19T14:34:55Z,2019-03-19T14:34:55Z,OWNER,"I systematically reviewed the codebase for things that `.inspect()` is used for: In `app.py`: * `table_exists()` uses `table in self.inspect().get(database, {}).get(""tables"")` * `.execute()` looks up the database name to get the `info[""file""]` (the correct filename with the `.db` extension) In `cli.py`: * The `datasette inspect` command dumps it to JSON * `datasette skeleton` iterates over it * `datasette serve` calls it on startup (to populate static cache of inspect data) In `base.py`: * `.database_url(database)` calls it to lookup the hash (if `hash_urls` config turned on) * `.resolve_db_name()` uses it to lookup the hash In `database.py`: * `DatabaseView` uses it to find up the list of tables and views to display, plus the size of the DB file in bytes * `DatabaseDownload` uses it to get the filepath for download In `index.py`: * `IndexView` uses it _extensively_ - to loop through every database and every table. This would make a good starting point for the refactor. In `table.py`: * `sortable_columns_for_table()` uses it to find the columns in a table * `expandable_columns()` uses it to find foreign keys * `expand_foreign_keys()` uses it to find foreign keys * `display_columns_and_rows()` uses it to find primary keys and foreign keys... but also has access to a `cursor.description` which it uses to list the columns * `TableView.data` uses it to lookup columns and primary keys and the `table_rows_count` (used if the thing isn't a view) and probably a few more things, this method is huge! * `RowView.data` uses it for primary keys * `foreign_key_tables()` uses it for foreign keys In the tests it's used by `test_api.test_inspect_json()` and by a couple of tests in `test_inspect`.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421971339,Fix all the places that currently use .inspect() data, https://github.com/simonw/datasette/issues/420#issuecomment-474399630,https://api.github.com/repos/simonw/datasette/issues/420,474399630,MDEyOklzc3VlQ29tbWVudDQ3NDM5OTYzMA==,9599,simonw,2019-03-19T14:38:14Z,2019-03-19T14:38:14Z,OWNER,"Most of these can be replaced with relatively straight-forward direct introspection of the SQLite table. The one exception is the incoming foreign keys: these can only be found by inspecting ALL of the other tables. This requires running `PRAGMA foreign_key_list([table_name])` against every other table in the database. How expensive is doing this on a database with hundreds of tables?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421971339,Fix all the places that currently use .inspect() data, https://github.com/simonw/datasette/issues/420#issuecomment-474407617,https://api.github.com/repos/simonw/datasette/issues/420,474407617,MDEyOklzc3VlQ29tbWVudDQ3NDQwNzYxNw==,9599,simonw,2019-03-19T14:55:51Z,2019-03-19T14:55:51Z,OWNER,"A microbenchmark against `fivethirtyeight.db` (415 tables): In [1]: import sqlite3 In [2]: c = sqlite3.connect(""fivethirtyeight.db"") In [3]: %timeit c.execute(""select name from sqlite_master where type = 'table'"").fetchall() 283 µs ± 12.3 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each) In [4]: tables = [r[0] for r in c.execute(""select name from sqlite_master where type = 'table'"").fetchall()] In [5]: len(tables) Out[5]: 415 In [6]: %timeit [c.execute(""pragma foreign_keys([{}])"".format(t)).fetchall() for t in tables] 1.81 ms ± 161 µs per loop (mean ± std. dev. of 7 runs, 1000 loops each) So running `pragma foreign_keys()` against 415 tables only takes 1.81ms. This is going to be fine.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421971339,Fix all the places that currently use .inspect() data, https://github.com/simonw/datasette/issues/422#issuecomment-474888132,https://api.github.com/repos/simonw/datasette/issues/422,474888132,MDEyOklzc3VlQ29tbWVudDQ3NDg4ODEzMg==,9599,simonw,2019-03-20T15:34:37Z,2019-03-20T15:34:37Z,OWNER,"Here's a trick for lower bound counts which looks like it might actually work. Consider the following queries: ``` select count(*) from ( select rowid from [most-common-name/surnames] limit 1000 ) ``` https://fivethirtyeight.datasettes.com/fivethirtyeight-b76415d?sql=select+count%28*%29+from+%28%0D%0A++select+rowid+from+%5Bmost-common-name%2Fsurnames%5D+limit+1000%0D%0A%29 Takes 0.827ms (it took longer with `select * from` in the subquery). Same query but with limit 10,000: https://fivethirtyeight.datasettes.com/fivethirtyeight-b76415d?sql=select+count%28*%29+from+%28%0D%0A++select+rowid++from+%5Bmost-common-name%2Fsurnames%5D+limit+10000%0D%0A%29 Took 2.335ms With 100,000 limit: https://fivethirtyeight.datasettes.com/fivethirtyeight-b76415d?sql=select+count%28*%29+from+%28%0D%0A++select+rowid++from+%5Bmost-common-name%2Fsurnames%5D+limit+100000%0D%0A%29 Took 27.558ms So one solution here would be to pick an upper bound (maybe 100,001) and use this query, which should give an accurate count below that upper bound but allow us to show ""100,000+"" as a count if the table exceeds that boundary. Maybe the boundary is a config setting? Also, if a tighter timeout (maybe 20ms) is exceeded for that boundary we could halve it and try again.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",423316403,Figure out what to do about table counts in a mutable world, https://github.com/simonw/datasette/issues/422#issuecomment-474890920,https://api.github.com/repos/simonw/datasette/issues/422,474890920,MDEyOklzc3VlQ29tbWVudDQ3NDg5MDkyMA==,9599,simonw,2019-03-20T15:39:58Z,2019-03-20T15:39:58Z,OWNER,"The page with the most table counts on it is the index page: https://fivethirtyeight.datasettes.com/fivethirtyeight-b76415d If I paginate this (which needs to happen anyway for Datasette Library #417) the impact here won't be as bad. I could even load in the table row counts asynchronously via JavaScript? Bigger problem is this total summed count representation on the homepage: I think that feature just won't be feasibly against large databases in a mutable world. Maybe we consider to show that total but only for immutable databases? May be easier just to drop it entirely (we will still show the table count).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",423316403,Figure out what to do about table counts in a mutable world, https://github.com/simonw/datasette/issues/422#issuecomment-475277571,https://api.github.com/repos/simonw/datasette/issues/422,475277571,MDEyOklzc3VlQ29tbWVudDQ3NTI3NzU3MQ==,9599,simonw,2019-03-21T15:30:03Z,2019-03-21T15:30:03Z,OWNER,It would be useful to be able to detect if a table is a rowid table or not.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",423316403,Figure out what to do about table counts in a mutable world, https://github.com/simonw/datasette/issues/420#issuecomment-477633354,https://api.github.com/repos/simonw/datasette/issues/420,477633354,MDEyOklzc3VlQ29tbWVudDQ3NzYzMzM1NA==,9599,simonw,2019-03-28T15:01:37Z,2019-03-28T15:01:37Z,OWNER,"I started looking at how I would implement `table_exists()` with a direct call that uses `sqlite-utils` to see if a table exists. https://github.com/simonw/datasette/blob/82fec6048148b58748040a7e2caa163387e982a3/datasette/app.py#L303-L304 `sqlite-utils` needs access to the database connection - but the database connection itself is currently only available in code that runs in a thread inside the `.execute()` method: https://github.com/simonw/datasette/blob/82fec6048148b58748040a7e2caa163387e982a3/datasette/app.py#L413-L426 So I'm going to need to refactor this a bit. I think I need a way to say ""here is a function which needs access to the connection object for database named X - run that function in a thread, give it access to that connection and then give me back the result"". ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421971339,Fix all the places that currently use .inspect() data, https://github.com/simonw/datasette/issues/420#issuecomment-477636768,https://api.github.com/repos/simonw/datasette/issues/420,477636768,MDEyOklzc3VlQ29tbWVudDQ3NzYzNjc2OA==,9599,simonw,2019-03-28T15:09:27Z,2019-03-28T15:09:27Z,OWNER,Even more tricky: `table_exists()` is currently a synchronous function. If it's going to be executing a SQL query it needs to become an async function.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421971339,Fix all the places that currently use .inspect() data, https://github.com/simonw/datasette/issues/420#issuecomment-478391708,https://api.github.com/repos/simonw/datasette/issues/420,478391708,MDEyOklzc3VlQ29tbWVudDQ3ODM5MTcwOA==,9599,simonw,2019-03-31T22:33:32Z,2019-03-31T22:34:02Z,OWNER,"Next I need to fix this: https://github.com/simonw/datasette/blob/0209a0a344503157351e625f0629b686961763c9/datasette/app.py#L420-L435 Given the name of the database (from the URL e.g. https://latest.datasette.io/fixtures) I need to figure out what name I used to cache the collection.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421971339,Fix all the places that currently use .inspect() data, https://github.com/simonw/datasette/issues/420#issuecomment-478393116,https://api.github.com/repos/simonw/datasette/issues/420,478393116,MDEyOklzc3VlQ29tbWVudDQ3ODM5MzExNg==,9599,simonw,2019-03-31T22:52:48Z,2019-03-31T22:52:48Z,OWNER,"This means the `Datasette` class needs a new property, keeping track of all of the connected databases. ``` ds.databases = { ""name_used_in_urls"": { ""type"": ""file"", # or ""memory"" ""path"": filepath # or None if memory ""mutable"": True # or False, ""hash"": ""..."" # or None if mutable } } ``` Maybe these should be objects, not dictionaries.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421971339,Fix all the places that currently use .inspect() data, https://github.com/simonw/datasette/issues/420#issuecomment-480552387,https://api.github.com/repos/simonw/datasette/issues/420,480552387,MDEyOklzc3VlQ29tbWVudDQ4MDU1MjM4Nw==,9599,simonw,2019-04-07T02:06:20Z,2019-04-07T02:06:20Z,OWNER,"`expand_foreign_keys()` relies on the `.inspect()` command having automatically derived the `label_column` for a table, which it does using this code: https://github.com/simonw/datasette/blob/97331f3435ba1583a0f9dbcaffc25de8894cf1f8/datasette/inspect.py#L34-L42 This needs access to the column names for the table. I think we can drop this entirely in favour of a new utility function - and that function can incorporate the metadata check as well.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421971339,Fix all the places that currently use .inspect() data, https://github.com/simonw/datasette/issues/420#issuecomment-480556166,https://api.github.com/repos/simonw/datasette/issues/420,480556166,MDEyOklzc3VlQ29tbWVudDQ4MDU1NjE2Ng==,9599,simonw,2019-04-07T03:35:59Z,2019-04-07T03:48:14Z,OWNER,Still need to solve: `TableView.data()` - but this is the one with a row count in hence the need to solve #422 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421971339,Fix all the places that currently use .inspect() data, https://github.com/simonw/datasette/issues/425#issuecomment-480558501,https://api.github.com/repos/simonw/datasette/issues/425,480558501,MDEyOklzc3VlQ29tbWVudDQ4MDU1ODUwMQ==,9599,simonw,2019-04-07T04:32:28Z,2019-04-07T04:32:28Z,OWNER,"Here's the problem: https://github.com/simonw/datasette/blob/6f6d0ff2b41f1cacaf42287b1b230b646bcba9ee/datasette/templates/query.html#L30-L36 Need an else block here that adds the SQL as a hidden form field.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",430103450,Submitting SQL on hide page is broken, https://github.com/simonw/sqlite-utils/issues/18#issuecomment-480621924,https://api.github.com/repos/simonw/sqlite-utils/issues/18,480621924,MDEyOklzc3VlQ29tbWVudDQ4MDYyMTkyNA==,82988,psychemedia,2019-04-07T19:31:42Z,2019-04-07T19:31:42Z,NONE,"I've just noticed that SQLite lets you IGNORE inserts that collide with a pre-existing key. This can be quite handy if you have a dataset that keeps changing in part, and you don't want to upsert and replace pre-existing PK rows but you do want to ignore collisions to existing PK rows. Do `sqlite_utils` support such (cavalier!) behaviour?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413871266,.insert/.upsert/.insert_all/.upsert_all should add missing columns, https://github.com/simonw/datasette/issues/420#issuecomment-481310295,https://api.github.com/repos/simonw/datasette/issues/420,481310295,MDEyOklzc3VlQ29tbWVudDQ4MTMxMDI5NQ==,9599,simonw,2019-04-09T15:50:52Z,2019-04-09T15:50:52Z,OWNER,"Efficient row counts are even more important for the `DatabaseView` and `IndexView` pages. The row counts on those pages don't have to be precise, so one option is for me to calculate them and cache them occasionally. I could even have a dedicated thread which just does the counting? In #422 I've figured out a mechanism for getting accurate or lower-bound counts within a time limit (accurate if possible, lower-bound otherwise).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421971339,Fix all the places that currently use .inspect() data, https://github.com/simonw/datasette/issues/359#issuecomment-481724452,https://api.github.com/repos/simonw/datasette/issues/359,481724452,MDEyOklzc3VlQ29tbWVudDQ4MTcyNDQ1Mg==,9599,simonw,2019-04-10T14:52:59Z,2019-04-10T14:52:59Z,OWNER,"I'm going to go with `?_facet_array=definitions` as the querystring argument for this. And `?definitions_arraycontains=foo` as the filter argument.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",349827640,Faceted browse against a JSON list of tags, https://github.com/simonw/datasette/issues/359#issuecomment-481939013,https://api.github.com/repos/simonw/datasette/issues/359,481939013,MDEyOklzc3VlQ29tbWVudDQ4MTkzOTAxMw==,9599,simonw,2019-04-11T02:17:55Z,2019-04-11T02:17:55Z,OWNER,"Challenge: facets can also be defined in `metadata.json` like this: ``` { ""databases"": { ""sf-trees"": { ""tables"": { ""Street_Tree_List"": { ""facets"": [""qLegalStatus""] } } } } } ``` But... `?_facet_array=definitions` doesn't fit in that data structure. Need to have an alternative mechanism for defining this kind of facet.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",349827640,Faceted browse against a JSON list of tags, https://github.com/simonw/datasette/issues/427#issuecomment-481940539,https://api.github.com/repos/simonw/datasette/issues/427,481940539,MDEyOklzc3VlQ29tbWVudDQ4MTk0MDUzOQ==,9599,simonw,2019-04-11T02:26:43Z,2019-04-11T02:26:43Z,OWNER,"I quite like the Solr idea. It could look like this for Datasette: `?_facet=name` - default behaviour, same as today. But that's actually an alias for `?_facet.name=name` - which defines a name for the facet. `?_facet.tags.array=tags` - would define a facet called `tags` that uses an array facet against the `tags` column. I don't like the need to say `tags` twice in that though.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",431800286,"New design for facet abstraction, including querystring and metadata.json", https://github.com/simonw/datasette/issues/427#issuecomment-481957014,https://api.github.com/repos/simonw/datasette/issues/427,481957014,MDEyOklzc3VlQ29tbWVudDQ4MTk1NzAxNA==,9599,simonw,2019-04-11T04:05:07Z,2019-04-11T04:05:07Z,OWNER,"OK, I have a plan: `?_facet=foo` `?_facet_facettype=options` Options here can be one of the following: - A single value which is the name of a table - A comma separated list of options - A JSON object starting with `{` or `[` If the table name itself contains a `,`, `{` or `]` then you have to escape it by putting it in a JSON object, `?_facet_percentile={""column"":""{this_is,a_weird[column_name""}` for example.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",431800286,"New design for facet abstraction, including querystring and metadata.json", https://github.com/simonw/datasette/issues/427#issuecomment-481957313,https://api.github.com/repos/simonw/datasette/issues/427,481957313,MDEyOklzc3VlQ29tbWVudDQ4MTk1NzMxMw==,9599,simonw,2019-04-11T04:07:00Z,2019-04-11T04:07:40Z,OWNER,"This means the `metadata.json` format can look like this: ``` { ""databases"": { ""sf-trees"": { ""tables"": { ""Street_Tree_List"": { ""facets"": [""qLegalStatus"", {""array"": ""tags""}, {""percentile"": {""blah"": ""options""}}] } } } } } ``` So any advanced facets are represented here as a dictionary with a single key - the type - that maps to the options. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",431800286,"New design for facet abstraction, including querystring and metadata.json", https://github.com/simonw/datasette/issues/428#issuecomment-482424314,https://api.github.com/repos/simonw/datasette/issues/428,482424314,MDEyOklzc3VlQ29tbWVudDQ4MjQyNDMxNA==,9599,simonw,2019-04-12T03:33:35Z,2019-04-12T03:33:35Z,OWNER,"It looks like I accidentally broke the `fts_table` metadata mechanism here: https://github.com/simonw/datasette/commit/3a208a41d4dce35b97eca8b25f37055c3fda5aed#diff-5e0ffd62fced7d46339b9b2cd167c2f9L297 I'll fix that as part of this work.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432371762,Make ?_fts_table=x and ?_fts_pk=y available as URL parameters on table view, https://github.com/simonw/datasette/issues/428#issuecomment-482433121,https://api.github.com/repos/simonw/datasette/issues/428,482433121,MDEyOklzc3VlQ29tbWVudDQ4MjQzMzEyMQ==,9599,simonw,2019-04-12T04:30:29Z,2019-04-12T04:30:29Z,OWNER,"Configured in metadata: https://latest.datasette.io/fixtures/searchable_view_configured_by_metadata?_search=weasel Configured with querystring: https://latest.datasette.io/fixtures/searchable_view?_search=weasel&_fts_table=searchable_fts&_fts_pk=pk","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432371762,Make ?_fts_table=x and ?_fts_pk=y available as URL parameters on table view, https://github.com/simonw/datasette/issues/410#issuecomment-482434925,https://api.github.com/repos/simonw/datasette/issues/410,482434925,MDEyOklzc3VlQ29tbWVudDQ4MjQzNDkyNQ==,9599,simonw,2019-04-12T04:42:27Z,2019-04-12T04:42:27Z,OWNER,"You can pass multiple databases as command-line arguments, and each one will be loaded. For example: datasette mydb.db otherdb.db Then the URLs should be: http://127.0.0.1:8001/mydb for the first one http://127.0.0.1:8001/otherdb for the second one","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",408518024,How to setup a multi database environment?, https://github.com/simonw/datasette/issues/425#issuecomment-482439686,https://api.github.com/repos/simonw/datasette/issues/425,482439686,MDEyOklzc3VlQ29tbWVudDQ4MjQzOTY4Ng==,9599,simonw,2019-04-12T05:12:12Z,2019-04-12T05:12:12Z,OWNER,Fixed: https://latest.datasette.io/fixtures?sql=select+*+from+compound_three_primary_keys+order+by+pk1%2C+pk2%2C+pk3+limit+101&_hide_sql=1,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",430103450,Submitting SQL on hide page is broken, https://github.com/simonw/datasette/issues/356#issuecomment-482620313,https://api.github.com/repos/simonw/datasette/issues/356,482620313,MDEyOklzc3VlQ29tbWVudDQ4MjYyMDMxMw==,9599,simonw,2019-04-12T15:35:44Z,2019-04-12T15:35:44Z,OWNER,"One question here is how these facets should be defined in the table page query string. #427 started exploring this. For any m2m facet we need to know: - what is the join table? - how is the join table related to our current table? - what is the table on the other side of the relationship? - how does that table relate to the join table? - how should that table be displayed (what's the label column)? The simplest form of m2m relationship can be automatically derived from just knowing the table. We can support that like so: ?_facet_m2m=tagged This could work automatically if the following constraints turn out to apply: - the tagged table has a foreign key back to our table, against our primary key - the tagged table has a single other foreign key to one other table - that other table has a single text column we can use as the label (or has a label column defined in metadata) If any of the above rules don't hold, I think the solution is to have explicit configuration. Per #427 this will likely be done using JSON in the query string. Something like this (would be one line but indented for readability): ``` ?_facet_m2m={ ""through"":""tagged"", ""through_fk_us"":""tree_id"", ""other"":""tags"", ""through_fk_other"":""tag_id"", ""other_label"": ""tag"" } ``` Probably also need a way of specifying the outbound column used on both us and other - if the m2m table isn't linking to the foreign keys. I don't yet like the names of the above keys.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",346028655,Ability to display facet counts for many-to-many relationships, https://github.com/simonw/datasette/issues/427#issuecomment-482626534,https://api.github.com/repos/simonw/datasette/issues/427,482626534,MDEyOklzc3VlQ29tbWVudDQ4MjYyNjUzNA==,9599,simonw,2019-04-12T15:52:53Z,2019-04-12T15:52:53Z,OWNER,"I just realized: a key part of faceting is being able to correctly apply the facet (and know that it has been applied). Existing facets are exact match only, so they can be applied and detected with ?foo=bar More advanced facets like _facet_array and _facet_m2m will need different ways of applying themselves. This needs to be bundled up in the new Facet abstraction somehow.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",431800286,"New design for facet abstraction, including querystring and metadata.json", https://github.com/simonw/datasette/issues/427#issuecomment-482627099,https://api.github.com/repos/simonw/datasette/issues/427,482627099,MDEyOklzc3VlQ29tbWVudDQ4MjYyNzA5OQ==,9599,simonw,2019-04-12T15:54:41Z,2019-04-12T15:54:41Z,OWNER,"Bonus idea: since we are having a Facet abstraction we should allow additional facet type apps to be registered using a plugin. Fun idea for a (very inefficient) demo plugin: facet-by-emoji! Would work by counting all emoji in text fields using a horrible slow full-scan regular expression, then would apply selected emoji facets using a LIKE query.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",431800286,"New design for facet abstraction, including querystring and metadata.json", https://github.com/simonw/datasette/issues/429#issuecomment-482628978,https://api.github.com/repos/simonw/datasette/issues/429,482628978,MDEyOklzc3VlQ29tbWVudDQ4MjYyODk3OA==,9599,simonw,2019-04-12T16:00:04Z,2019-04-12T16:00:04Z,OWNER,I originally thought of this as a plugin but then realized that it's 100% compatible with Datasette's existing arbitrary SQL clauses and would make some of my other projects (especially involving custom queries that still need faceting) a whole lot easier.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432636432,?_where=sql-fragment parameter for table views, https://github.com/simonw/datasette/issues/429#issuecomment-482638695,https://api.github.com/repos/simonw/datasette/issues/429,482638695,MDEyOklzc3VlQ29tbWVudDQ4MjYzODY5NQ==,9599,simonw,2019-04-12T16:29:25Z,2019-04-13T01:14:17Z,OWNER,"Getting a prototype working was hardly any code at all: http://127.0.0.1:8001/fixtures/facetable?_where=city_id+in+(select+id+from+facet_cities+where+name+like+%22%25an%25%22) ``` diff --git a/datasette/views/table.py b/datasette/views/table.py index b7c9a4b..7ca9572 100644 --- a/datasette/views/table.py +++ b/datasette/views/table.py @@ -295,6 +295,10 @@ class TableView(RowTableShared): filters = Filters(sorted(other_args.items()), units, ureg) where_clauses, params = filters.build_where_clauses(table) + # Add _where= from querystring + if self.ds.config(""allow_sql"") and ""_where"" in request.args: + where_clauses.extend(request.args[""_where""]) + # _search support: fts_table = special_args.get(""_fts_table"") fts_table = fts_table or table_metadata.get(""fts_table"") ``` Still needed: - [x] Unit tests - [x] Probably some kind of visual display on the table page so you know that extra clauses have been added (and maybe a UI for dropping them again) I'm going to leave the `:named` parameter support out of the first version of this feature.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432636432,?_where=sql-fragment parameter for table views, https://github.com/simonw/datasette/issues/429#issuecomment-482640079,https://api.github.com/repos/simonw/datasette/issues/429,482640079,MDEyOklzc3VlQ29tbWVudDQ4MjY0MDA3OQ==,9599,simonw,2019-04-12T16:34:01Z,2019-04-12T16:34:01Z,OWNER,"UI concept: ```

1 extra where clause:

city_id in (select id from facet_cities where name like ""%an%"") [remove]

```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432636432,?_where=sql-fragment parameter for table views, https://github.com/simonw/datasette/issues/429#issuecomment-482640250,https://api.github.com/repos/simonw/datasette/issues/429,482640250,MDEyOklzc3VlQ29tbWVudDQ4MjY0MDI1MA==,9599,simonw,2019-04-12T16:34:32Z,2019-04-12T16:34:32Z,OWNER,"Keeping track of these and building the ""remove"" links correctly is going to be a tiny bit fiddly.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432636432,?_where=sql-fragment parameter for table views, https://github.com/simonw/datasette/issues/429#issuecomment-482640383,https://api.github.com/repos/simonw/datasette/issues/429,482640383,MDEyOklzc3VlQ29tbWVudDQ4MjY0MDM4Mw==,9599,simonw,2019-04-12T16:34:56Z,2019-04-12T16:34:56Z,OWNER,"Maybe put this section above the ""view and edit SQL"" link.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432636432,?_where=sql-fragment parameter for table views, https://github.com/simonw/datasette/issues/429#issuecomment-482766801,https://api.github.com/repos/simonw/datasette/issues/429,482766801,MDEyOklzc3VlQ29tbWVudDQ4Mjc2NjgwMQ==,9599,simonw,2019-04-13T01:56:19Z,2019-04-13T01:56:19Z,OWNER,"Documentation is here: https://datasette.readthedocs.io/en/latest/json_api.html#special-table-arguments Demo: * https://latest.datasette.io/fixtures/facetable?_where=state=%22MI%22&_where=city_id=3 * https://latest.datasette.io/fixtures/facetable?_where=city_id%20in%20(select%20id%20from%20facet_cities%20where%20name%20!=%20%22Detroit%22)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432636432,?_where=sql-fragment parameter for table views, https://github.com/simonw/datasette/issues/427#issuecomment-482864457,https://api.github.com/repos/simonw/datasette/issues/427,482864457,MDEyOklzc3VlQ29tbWVudDQ4Mjg2NDQ1Nw==,9599,simonw,2019-04-13T18:51:44Z,2019-04-13T18:57:51Z,OWNER,"A facet needs to: - given a sql query and a list of configs, return a list of buckets - Know how to generate URLs for selecting and deselecting a filter (along with underlying filter application sql logic) - Tell if a specific filter is currently selected or not - Set a time limit and report if it times out - Generate human readable labels - In some cases: expand foreign keys - which means they need access to foreign key information - just the name of the table and the name of the column is enough to call `expand_foreign_keys()` (I [moved that](https://github.com/simonw/datasette/commit/274ef43bb7b129ddc2e68805b4f4ff3776fb9503) to the Datasette class to make it easier to access) - Make suggestions for facets. Let's give it access to the whole table here so it could either run against each column in return and rely with a list of suggestions or it could spot eg a latitude and a longitude column","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",431800286,"New design for facet abstraction, including querystring and metadata.json", https://github.com/simonw/datasette/issues/427#issuecomment-482864837,https://api.github.com/repos/simonw/datasette/issues/427,482864837,MDEyOklzc3VlQ29tbWVudDQ4Mjg2NDgzNw==,9599,simonw,2019-04-13T18:53:43Z,2019-04-13T18:53:43Z,OWNER,"`TableView.data` is currently the longest, hairiest method in the codebase. It's 775 - 177 = 598 lines of code! Extracting faceting logic should help reduce that quite a bit. https://github.com/simonw/datasette/blob/274ef43bb7b129ddc2e68805b4f4ff3776fb9503/datasette/views/table.py#L177-L775","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",431800286,"New design for facet abstraction, including querystring and metadata.json", https://github.com/simonw/datasette/issues/427#issuecomment-482865424,https://api.github.com/repos/simonw/datasette/issues/427,482865424,MDEyOklzc3VlQ29tbWVudDQ4Mjg2NTQyNA==,9599,simonw,2019-04-13T18:56:25Z,2019-04-13T19:42:08Z,OWNER,"I think there's a `Facet` base class. `class ColumnFacet(Facet):` is the default behaviour we have today `class ArrayFacet(Facet):` facet by JSON array `class ManyToManyFacet(Facet):` facet by M2M table `class DateFacet(Facet):` facet by date `class DateTimeFacet(Facet):` facet by datetime `class EmojiFacet(Facet):` super-fun demo plugin I have planned Could even have a facet against a numerical column which loads the entire set of column values into numpy or pandas and calculates complex statistics facets in memory . There’s actually a lot of potential for Datasette plugins that load several MBs of data and analyze using other Python libraries.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",431800286,"New design for facet abstraction, including querystring and metadata.json", https://github.com/simonw/datasette/issues/431#issuecomment-482872210,https://api.github.com/repos/simonw/datasette/issues/431,482872210,MDEyOklzc3VlQ29tbWVudDQ4Mjg3MjIxMA==,9599,simonw,2019-04-13T19:37:57Z,2019-04-13T19:37:57Z,OWNER,"You should be able to see the reload happening in the console logs: I'm doing some work at the moment to handle mutating files MUCH better - #419 - my goal is to have Datasette work against SQLite files that are being updated out-of-the box, and change the current immutable behaviour to be an option rather than the default.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432870248,Datasette doesn't reload when database file changes, https://github.com/simonw/datasette/pull/432#issuecomment-482876432,https://api.github.com/repos/simonw/datasette/issues/432,482876432,MDEyOklzc3VlQ29tbWVudDQ4Mjg3NjQzMg==,9599,simonw,2019-04-13T20:06:32Z,2019-04-13T20:06:32Z,OWNER,"This has a bug which isn't being caught by the unit tests (yet) - facet suggestion suggests facets that have already been enabled: There is also a test failure due to missing plugin hook documentation.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432893491,"Refactor facets to a class and new plugin, refs #427", https://github.com/simonw/sqlite-utils/issues/8#issuecomment-482994231,https://api.github.com/repos/simonw/sqlite-utils/issues/8,482994231,MDEyOklzc3VlQ29tbWVudDQ4Mjk5NDIzMQ==,82988,psychemedia,2019-04-14T15:04:07Z,2019-04-14T15:29:33Z,NONE," PLEASE IGNORE THE BELOW... I did a package update and rebuilt the kernel I was working in... may just have been an old version of sqlite_utils, seems to be working now. (Too many containers / too many environments!) Has an issue been reintroduced here with FTS? eg I'm getting an error thrown by spaces in column names here: ``` /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order) def enable_fts(self, columns, fts_version=""FTS5""): --> 329 ""Enables FTS on the specified columns"" 330 sql = """""" 331 CREATE VIRTUAL TABLE ""{table}_fts"" USING {fts_version} ( ``` when trying an `insert_all`. Also, if a col has a `.` in it, I seem to get: ``` /usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order) 327 jsonify_if_needed(record.get(key, None)) for key in all_columns 328 ) --> 329 result = self.db.conn.execute(sql, values) 330 self.db.conn.commit() 331 self.last_id = result.lastrowid OperationalError: near ""."": syntax error ``` (Can't post a worked minimal example right now; racing trying to build something against a live timing screen that will stop until next weekend in an hour or two...) PS Hmmm I did a test and they seem to work; I must be messing up s/where else... ``` import sqlite3 from sqlite_utils import Database dbname='testingDB_sqlite_utils.db' #!rm $dbname conn = sqlite3.connect(dbname, timeout=10) #Setup database tables c = conn.cursor() setup=''' CREATE TABLE IF NOT EXISTS ""test1"" ( ""NO"" INTEGER, ""NAME"" TEXT ); CREATE TABLE IF NOT EXISTS ""test2"" ( ""NO"" INTEGER, `TIME OF DAY` TEXT ); CREATE TABLE IF NOT EXISTS ""test3"" ( ""NO"" INTEGER, `AVG. SPEED (MPH)` FLOAT ); ''' c.executescript(setup) DB = Database(conn) import pandas as pd df1 = pd.DataFrame({'NO':[1,2],'NAME':['a','b']}) DB['test1'].insert_all(df1.to_dict(orient='records')) df2 = pd.DataFrame({'NO':[1,2],'TIME OF DAY':['early on','late']}) DB['test2'].insert_all(df2.to_dict(orient='records')) df3 = pd.DataFrame({'NO':[1,2],'AVG. SPEED (MPH)':['123.3','123.4']}) DB['test3'].insert_all(df3.to_dict(orient='records')) ``` all seem to work ok. I'm still getting errors in my set up though, which is not too different to the text cases?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",403922644,Problems handling column names containing spaces or - , https://github.com/simonw/datasette/issues/431#issuecomment-483017176,https://api.github.com/repos/simonw/datasette/issues/431,483017176,MDEyOklzc3VlQ29tbWVudDQ4MzAxNzE3Ng==,82988,psychemedia,2019-04-14T16:58:37Z,2019-04-14T16:58:37Z,CONTRIBUTOR,Hmm... nope... I see an updated timestamp from `ls -al` on the db but no reload?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432870248,Datasette doesn't reload when database file changes, https://github.com/simonw/datasette/issues/288#issuecomment-483106262,https://api.github.com/repos/simonw/datasette/issues/288,483106262,MDEyOklzc3VlQ29tbWVudDQ4MzEwNjI2Mg==,9599,simonw,2019-04-15T04:48:59Z,2019-04-15T04:48:59Z,OWNER,"This has got more urgent now that I've added the `?column__arraycontains=foo` filter as part of the effort to implement facet-by-array for #359 I added that filter in https://github.com/simonw/datasette/commit/78e45ead4d771007c57b307edf8fc920101f8733 but it can only be applied once - for proper faceting this needs to work: https://latest.datasette.io/fixtures/facetable?tags__arraycontains=tag1&tags__arraycontains=tag2","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326767626,Support multiple filters of the same type, https://github.com/simonw/datasette/issues/429#issuecomment-483202658,https://api.github.com/repos/simonw/datasette/issues/429,483202658,MDEyOklzc3VlQ29tbWVudDQ4MzIwMjY1OA==,82988,psychemedia,2019-04-15T10:48:01Z,2019-04-15T10:48:01Z,CONTRIBUTOR,"Minor UI observation: ![image](https://user-images.githubusercontent.com/82988/56127017-2bf78e80-5f74-11e9-9120-9393eb5d4988.png) `_where=` renders a `[remove]` link whereas `_facet=` gets a cross to remove it. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432636432,?_where=sql-fragment parameter for table views, https://github.com/simonw/datasette/issues/288#issuecomment-483458569,https://api.github.com/repos/simonw/datasette/issues/288,483458569,MDEyOklzc3VlQ29tbWVudDQ4MzQ1ODU2OQ==,9599,simonw,2019-04-15T23:45:04Z,2019-04-15T23:45:04Z,OWNER,https://latest.datasette.io/fixtures/facetable?tags__arraycontains=tag1&tags__arraycontains=tag2 now works.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",326767626,Support multiple filters of the same type, https://github.com/simonw/datasette/pull/432#issuecomment-484584234,https://api.github.com/repos/simonw/datasette/issues/432,484584234,MDEyOklzc3VlQ29tbWVudDQ4NDU4NDIzNA==,9599,simonw,2019-04-18T16:33:52Z,2019-04-18T16:33:52Z,OWNER,"It would be nice to decouple the `request` object from the `Facet` class. The request is needed for two things at the moment: * To decide if a specific facet bucket has been selected or not * To construct the `toggle_url` for turning the selection on or off Can I pull those needs out of the Facet class somehow?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432893491,"Refactor facets to a class and new plugin, refs #427", https://github.com/simonw/datasette/pull/434#issuecomment-484694648,https://api.github.com/repos/simonw/datasette/issues/434,484694648,MDEyOklzc3VlQ29tbWVudDQ4NDY5NDY0OA==,9599,simonw,2019-04-18T21:23:56Z,2019-04-18T21:23:56Z,OWNER,"Thanks for looking into this! To clarify: currently, the Dockerfile that we generate looks something like this: ``` CMD [""datasette"", ""serve"", ""--host"", ""0.0.0.0"", ""fixtures.db"", ""--cors"", ""--port"", ""8001""] ``` Your code here changes that CMD line to look like this instead, in order to set the port based on an environment variable: ``` CMD [""sh"", ""-c"", ""datasette serve --port $PORT ...""] ``` I wonder if this is the only way to do this? ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",434321685,"""datasette publish cloudrun"" command to publish to Google Cloud Run", https://github.com/simonw/datasette/pull/434#issuecomment-484699119,https://api.github.com/repos/simonw/datasette/issues/434,484699119,MDEyOklzc3VlQ29tbWVudDQ4NDY5OTExOQ==,9599,simonw,2019-04-18T21:40:45Z,2019-04-18T21:40:45Z,OWNER,"I asked @andrewgodwin about this and he confirmed that if we want to read an environment variable we can't use the `CMD [...]` syntax in the way that we were using it. He did suggest that if we're doing `CMD [""sh"", ""-c"", ""datasette serve --port $PORT ...""]` we may as well do this instead: `CMD ""datasette serve --port $PORT ...""` We should apply some command-line escaping here - if the user passes `--version-note=hello$there` to `datasette publish` we need that $ not to be accidentally evaluated as an environment variable. It looks like [shlex.quote](https://docs.python.org/dev/library/shlex.html#shlex.quote) is the right way to do that.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",434321685,"""datasette publish cloudrun"" command to publish to Google Cloud Run", https://github.com/simonw/datasette/issues/435#issuecomment-485269362,https://api.github.com/repos/simonw/datasette/issues/435,485269362,MDEyOklzc3VlQ29tbWVudDQ4NTI2OTM2Mg==,9599,simonw,2019-04-21T17:38:05Z,2019-04-21T17:38:05Z,OWNER,I built a first version of this in https://github.com/simonw/datasette/commit/7d01ca34a10b5f8a993859cfd05790eb2870b94e which dumped SQL queries out to the terminal logs.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",435531034,Tracing support for seeing what SQL queries were executed, https://github.com/simonw/datasette/issues/435#issuecomment-485270230,https://api.github.com/repos/simonw/datasette/issues/435,485270230,MDEyOklzc3VlQ29tbWVudDQ4NTI3MDIzMA==,9599,simonw,2019-04-21T17:52:03Z,2019-04-21T17:52:03Z,OWNER,"Demos: * https://latest.datasette.io/fixtures/facetable?_trace=1 - SQL dumped at bottom of HTML * https://latest.datasette.io/fixtures/facetable.json?_trace=1 - SQL in a ""_traces"" key ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",435531034,Tracing support for seeing what SQL queries were executed, https://github.com/simonw/datasette/pull/426#issuecomment-485557574,https://api.github.com/repos/simonw/datasette/issues/426,485557574,MDEyOklzc3VlQ29tbWVudDQ4NTU1NzU3NA==,222245,carlmjohnson,2019-04-22T21:23:22Z,2019-04-22T21:23:22Z,NONE,Can you cut a new release with this?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",431756352,Upgrade to Jinja2==2.10.1, https://github.com/simonw/datasette/pull/437#issuecomment-487537452,https://api.github.com/repos/simonw/datasette/issues/437,487537452,MDEyOklzc3VlQ29tbWVudDQ4NzUzNzQ1Mg==,45057,russss,2019-04-29T10:58:49Z,2019-04-29T10:58:49Z,CONTRIBUTOR,I've just spotted that this implements #215.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",438048318,Add inspect and prepare_sanic hooks, https://github.com/simonw/datasette/pull/439#issuecomment-487542486,https://api.github.com/repos/simonw/datasette/issues/439,487542486,MDEyOklzc3VlQ29tbWVudDQ4NzU0MjQ4Ng==,45057,russss,2019-04-29T11:20:30Z,2019-04-29T11:20:30Z,CONTRIBUTOR,Actually I think this is not the whole story because of the rowid issue. I'm going to think about this one a bit more.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",438240541,[WIP] Add primary key to the extra_body_script hook arguments, https://github.com/simonw/datasette/pull/441#issuecomment-487686655,https://api.github.com/repos/simonw/datasette/issues/441,487686655,MDEyOklzc3VlQ29tbWVudDQ4NzY4NjY1NQ==,45057,russss,2019-04-29T18:14:25Z,2019-04-29T18:14:25Z,CONTRIBUTOR,"Subsidiary note which I forgot in the commit message: I've decided to give each view a short string name to aid in differentiating which view a hook is being called from. Since hooks are functions and not subclasses, and can get called from different places in the URL hierarchy, it's sometimes difficult to distinguish what data you're actually operating on. I think this will come in handy for other hooks as well.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",438437973,Add register_output_renderer hook, https://github.com/simonw/datasette/pull/424#issuecomment-487689477,https://api.github.com/repos/simonw/datasette/issues/424,487689477,MDEyOklzc3VlQ29tbWVudDQ4NzY4OTQ3Nw==,45057,russss,2019-04-29T18:22:40Z,2019-04-29T18:22:40Z,CONTRIBUTOR,This is pretty conflicty because I forgot how to use git fetch. If you're interested in merging this I'll rewrite it against an actual modern checkout...,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",427429265,Column types in inspected metadata, https://github.com/simonw/datasette/pull/424#issuecomment-487692377,https://api.github.com/repos/simonw/datasette/issues/424,487692377,MDEyOklzc3VlQ29tbWVudDQ4NzY5MjM3Nw==,45057,russss,2019-04-29T18:30:46Z,2019-04-29T18:30:46Z,CONTRIBUTOR,"Actually no, I ended up not using the inspected column types in my plugin, and the binary column issue can be solved a lot more simply, so I'll close this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",427429265,Column types in inspected metadata, https://github.com/simonw/datasette/pull/441#issuecomment-487721095,https://api.github.com/repos/simonw/datasette/issues/441,487721095,MDEyOklzc3VlQ29tbWVudDQ4NzcyMTA5NQ==,9599,simonw,2019-04-29T19:57:55Z,2019-04-29T19:57:55Z,OWNER,Do you have an example renderer plugin I can look at? ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",438437973,Add register_output_renderer hook, https://github.com/simonw/datasette/pull/441#issuecomment-487723476,https://api.github.com/repos/simonw/datasette/issues/441,487723476,MDEyOklzc3VlQ29tbWVudDQ4NzcyMzQ3Ng==,45057,russss,2019-04-29T20:05:23Z,2019-04-29T20:05:23Z,CONTRIBUTOR,"This is the minimal example (I also included it in the docs): ```python from datasette import hookimpl def render_test(args, data, view_name): return {   'body': 'Hello World', 'content_type': 'text/plain' } @hookimpl def register_output_renderer(): return { 'extension': 'test', 'callback': render_test } ``` I'm working on the GeoJSON one now and it should be ready soon. (I forgot I was going to run into the same problem as before - that Spatialite's stupid binary format isn't WKB and I have no way of altering the query to change that - but I've just managed to write some code to rearrange the bytes from Spatialite blob-geometry into WKB...)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",438437973,Add register_output_renderer hook, https://github.com/simonw/datasette/pull/441#issuecomment-487724539,https://api.github.com/repos/simonw/datasette/issues/441,487724539,MDEyOklzc3VlQ29tbWVudDQ4NzcyNDUzOQ==,45057,russss,2019-04-29T20:08:32Z,2019-04-29T20:08:32Z,CONTRIBUTOR,I also just realised that I should be passing the datasette object into the hook function...as I just found I need it. So hold off merging until I've fixed that.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",438437973,Add register_output_renderer hook, https://github.com/simonw/datasette/pull/441#issuecomment-487735247,https://api.github.com/repos/simonw/datasette/issues/441,487735247,MDEyOklzc3VlQ29tbWVudDQ4NzczNTI0Nw==,45057,russss,2019-04-29T20:39:43Z,2019-04-29T20:39:43Z,CONTRIBUTOR,"I updated the hook to pass the datasette object through now. You can see the working [GeoJSON render function here](https://github.com/russss/datasette-geo/blob/master/datasette_plugin_geo/geojson.py) - the [hook function is here](https://github.com/russss/datasette-geo/blob/master/datasette_plugin_geo/__init__.py#L65-L70).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",438437973,Add register_output_renderer hook, https://github.com/simonw/datasette/pull/441#issuecomment-487748271,https://api.github.com/repos/simonw/datasette/issues/441,487748271,MDEyOklzc3VlQ29tbWVudDQ4Nzc0ODI3MQ==,45057,russss,2019-04-29T21:20:17Z,2019-04-29T21:20:17Z,CONTRIBUTOR,"Also I just pushed a change to add registered output renderers to the templates: ![image](https://user-images.githubusercontent.com/45057/56927799-f18e0580-6acc-11e9-8ea9-a0ee961323ec.png) ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",438437973,Add register_output_renderer hook, https://github.com/simonw/datasette/pull/439#issuecomment-487859345,https://api.github.com/repos/simonw/datasette/issues/439,487859345,MDEyOklzc3VlQ29tbWVudDQ4Nzg1OTM0NQ==,45057,russss,2019-04-30T08:21:19Z,2019-04-30T08:21:19Z,CONTRIBUTOR,I think the best approach to this is to pass through the `view_name` parameter I added in #441. It's then simple enough for me to add `.geojson` to the URL in JS - I don't need the pkey.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",438240541,[WIP] Add primary key to the extra_body_script hook arguments, https://github.com/simonw/datasette/pull/441#issuecomment-488247617,https://api.github.com/repos/simonw/datasette/issues/441,488247617,MDEyOklzc3VlQ29tbWVudDQ4ODI0NzYxNw==,45057,russss,2019-05-01T09:57:50Z,2019-05-01T09:57:50Z,CONTRIBUTOR,"Just for the record, this PR is now finished and ready to merge from my perspective.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",438437973,Add register_output_renderer hook, https://github.com/simonw/datasette/issues/438#issuecomment-488370672,https://api.github.com/repos/simonw/datasette/issues/438,488370672,MDEyOklzc3VlQ29tbWVudDQ4ODM3MDY3Mg==,9599,simonw,2019-05-01T18:33:57Z,2019-05-01T18:33:57Z,OWNER,Yeah this is a good call. Having pytest set some kind of flag (maybe an environment variable?) that disables auto-plugin discovery is a very reasonable fix here.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",438200529,Plugins are loaded when running pytest, https://github.com/simonw/datasette/pull/441#issuecomment-488373631,https://api.github.com/repos/simonw/datasette/issues/441,488373631,MDEyOklzc3VlQ29tbWVudDQ4ODM3MzYzMQ==,9599,simonw,2019-05-01T18:43:17Z,2019-05-01T18:43:17Z,OWNER,"I plan to merge this in about 7 hours time (after work). I may tweak the plugin format a little bit before the next Datasette release though (best to merge first, make changes later I think) - for consistency with some other upcoming hooks (the facet hook in particular). I'll discuss that with you while I'm working on it. Thanks so much for this, it's a really neat addition!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",438437973,Add register_output_renderer hook, https://github.com/simonw/datasette/issues/419#issuecomment-488528111,https://api.github.com/repos/simonw/datasette/issues/419,488528111,MDEyOklzc3VlQ29tbWVudDQ4ODUyODExMQ==,9599,simonw,2019-05-02T01:14:58Z,2019-05-02T01:14:58Z,OWNER,I just closed #420 - all of the places in the codebase that were using `.inspect()` should have been eliminated.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421551434,"Default to opening files in mutable mode, special option for immutable files", https://github.com/simonw/datasette/issues/438#issuecomment-488554835,https://api.github.com/repos/simonw/datasette/issues/438,488554835,MDEyOklzc3VlQ29tbWVudDQ4ODU1NDgzNQ==,9599,simonw,2019-05-02T05:09:18Z,2019-05-02T05:09:18Z,OWNER,https://docs.pytest.org/en/latest/example/simple.html#detect-if-running-from-within-a-pytest-run,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",438200529,Plugins are loaded when running pytest, https://github.com/simonw/datasette/issues/431#issuecomment-488555399,https://api.github.com/repos/simonw/datasette/issues/431,488555399,MDEyOklzc3VlQ29tbWVudDQ4ODU1NTM5OQ==,9599,simonw,2019-05-02T05:13:54Z,2019-05-02T05:13:54Z,OWNER,"Datasette master now treats databases as readonly but NOT immutable. This means you can make changes to those databases from another process and those changes will be instantly reflected in the Datasette interface. As such, reloading on database change is no longer necessary. Closing this ticket.","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432870248,Datasette doesn't reload when database file changes, https://github.com/simonw/datasette/issues/422#issuecomment-488558896,https://api.github.com/repos/simonw/datasette/issues/422,488558896,MDEyOklzc3VlQ29tbWVudDQ4ODU1ODg5Ng==,9599,simonw,2019-05-02T05:43:11Z,2019-05-02T05:43:11Z,OWNER,"I'm doing this with time limits right now, which I think is good enough for the moment.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",423316403,Figure out what to do about table counts in a mutable world, https://github.com/simonw/datasette/issues/427#issuecomment-488564761,https://api.github.com/repos/simonw/datasette/issues/427,488564761,MDEyOklzc3VlQ29tbWVudDQ4ODU2NDc2MQ==,9599,simonw,2019-05-02T06:24:49Z,2019-05-03T00:07:16Z,OWNER,"https://github.com/simonw/datasette/compare/facet-refactor-2 is almost ready to merge now. The remaining things to do are listed as TODOs there: - [x] Ensure facet is not suggested if it is already active - [x] Don't allow facets to be hidden if they were configured in metadata.json ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",431800286,"New design for facet abstraction, including querystring and metadata.json", https://github.com/simonw/datasette/issues/427#issuecomment-488564891,https://api.github.com/repos/simonw/datasette/issues/427,488564891,MDEyOklzc3VlQ29tbWVudDQ4ODU2NDg5MQ==,9599,simonw,2019-05-02T06:25:41Z,2019-05-02T06:25:41Z,OWNER,It would be neat to ship at least one additional face with this work - probably either `ArrayFacet` or `DateFacet`. I think `ArrayFacet` because it demonstrates the only-if-json1-enabled functionality.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",431800286,"New design for facet abstraction, including querystring and metadata.json", https://github.com/simonw/datasette/pull/432#issuecomment-488595724,https://api.github.com/repos/simonw/datasette/issues/432,488595724,MDEyOklzc3VlQ29tbWVudDQ4ODU5NTcyNA==,45057,russss,2019-05-02T08:50:53Z,2019-05-02T08:50:53Z,CONTRIBUTOR,"> Can I pull those needs out of the Facet class somehow? I was thinking that it might be handy for datasette to have a request object which wraps the Sanic Request. This could include the datasette-specific querystring decoding and the `special_args` parsing from TableView.data. This would mean that we could expose the request object to plugin hooks without coupling them to Sanic.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432893491,"Refactor facets to a class and new plugin, refs #427", https://github.com/simonw/datasette/pull/432#issuecomment-488874364,https://api.github.com/repos/simonw/datasette/issues/432,488874364,MDEyOklzc3VlQ29tbWVudDQ4ODg3NDM2NA==,9599,simonw,2019-05-03T00:04:23Z,2019-05-03T00:04:23Z,OWNER,Abandoning this in favour of #445 - which contains the code from this branch but updated to incorporate recent changes in master.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432893491,"Refactor facets to a class and new plugin, refs #427", https://github.com/simonw/datasette/issues/419#issuecomment-489060765,https://api.github.com/repos/simonw/datasette/issues/419,489060765,MDEyOklzc3VlQ29tbWVudDQ4OTA2MDc2NQ==,45057,russss,2019-05-03T11:07:42Z,2019-05-03T11:07:42Z,CONTRIBUTOR,"Are you planning on removing inspect entirely? I didn't spot this work before I started on datasette-geo, but ironically I think it has a use case which really needs the inspect functionality (or some replacement). Datasette-geo uses it to store the bounding box of all the geographic features in the table. This is needed when rendering the map because it avoids having to send loads of tile requests for areas which are empty. Even with relatively small datasets, calculating the bounding box seems to take around 5 seconds, so I don't think it's really feasible to do this on page load. One possible fix would be to do this on startup, and then in a thread which watches the database for changes.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421551434,"Default to opening files in mutable mode, special option for immutable files", https://github.com/simonw/datasette/issues/359#issuecomment-489076725,https://api.github.com/repos/simonw/datasette/issues/359,489076725,MDEyOklzc3VlQ29tbWVudDQ4OTA3NjcyNQ==,9599,simonw,2019-05-03T12:20:38Z,2019-05-03T12:20:38Z,OWNER,Demo: https://latest.datasette.io/fixtures/facetable?_facet_array=tags#facet-tags,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",349827640,Faceted browse against a JSON list of tags, https://github.com/simonw/datasette/pull/434#issuecomment-489104146,https://api.github.com/repos/simonw/datasette/issues/434,489104146,MDEyOklzc3VlQ29tbWVudDQ4OTEwNDE0Ng==,9599,simonw,2019-05-03T13:56:45Z,2019-05-03T13:56:45Z,OWNER,This is amazing - works an absolute treat. Thank you very much!,"{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 1, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",434321685,"""datasette publish cloudrun"" command to publish to Google Cloud Run", https://github.com/simonw/datasette/pull/434#issuecomment-489105665,https://api.github.com/repos/simonw/datasette/issues/434,489105665,MDEyOklzc3VlQ29tbWVudDQ4OTEwNTY2NQ==,25778,eyeseast,2019-05-03T14:01:30Z,2019-05-03T14:01:30Z,CONTRIBUTOR,This is exactly what I needed. Thank you.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",434321685,"""datasette publish cloudrun"" command to publish to Google Cloud Run", https://github.com/simonw/datasette/pull/434#issuecomment-489138554,https://api.github.com/repos/simonw/datasette/issues/434,489138554,MDEyOklzc3VlQ29tbWVudDQ4OTEzODU1NA==,9599,simonw,2019-05-03T15:36:48Z,2019-05-03T15:36:48Z,OWNER,"Here's my first working deployment: https://datasette-j7hipcg4aq-uc.a.run.app/fixtures-c35b6a5/facetable?_facet_array=tags I deployed it using this: datasette publish cloudrun fixtures.db --branch=master The second time I ran the command I got an error: ERROR: (gcloud.beta.run.deploy) Deployment endpoint was not found. Perhaps the provided region was invalid. Set the `run/region` property to a valid region and retry. Ex: `gcloud config set run/region us-central1` So I ran the command it suggested and then everything worked: gcloud config set run/region us-central1 datasette publish cloudrun fixtures.db --branch=master","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",434321685,"""datasette publish cloudrun"" command to publish to Google Cloud Run", https://github.com/simonw/datasette/pull/434#issuecomment-489154360,https://api.github.com/repos/simonw/datasette/issues/434,489154360,MDEyOklzc3VlQ29tbWVudDQ4OTE1NDM2MA==,9599,simonw,2019-05-03T16:18:18Z,2019-05-03T16:18:18Z,OWNER,Documentation is now available here: https://datasette.readthedocs.io/en/latest/publish.html#publishing-to-google-cloud-run,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",434321685,"""datasette publish cloudrun"" command to publish to Google Cloud Run", https://github.com/simonw/datasette/pull/442#issuecomment-489162365,https://api.github.com/repos/simonw/datasette/issues/442,489162365,MDEyOklzc3VlQ29tbWVudDQ4OTE2MjM2NQ==,9599,simonw,2019-05-03T16:44:29Z,2019-05-03T16:44:29Z,OWNER,I'm going to merge this and add a unit test.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",438450757,Suppress rendering of binary data, https://github.com/simonw/datasette/pull/434#issuecomment-489163939,https://api.github.com/repos/simonw/datasette/issues/434,489163939,MDEyOklzc3VlQ29tbWVudDQ4OTE2MzkzOQ==,10352819,rprimet,2019-05-03T16:49:45Z,2019-05-03T16:50:03Z,CONTRIBUTOR,"> The second time I ran the command I got an error: > > ERROR: (gcloud.beta.run.deploy) Deployment endpoint was not found. Perhaps the > provided region was invalid. Set the `run/region` property to a valid region and > retry. Ex: `gcloud config set run/region us-central1` > Yes, I was able to reproduce this; I used to get prompted for a run region interactively by the `gcloud` tool before, but maybe this is changing? (the [documentation](https://cloud.google.com/run/docs/deploying) now assumes `run/region` is set). Not sure which course of action is best: making `datasette` ensure that `run/region` is set beforehand or wait a bit until the gcloud CLI stabilizes?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",434321685,"""datasette publish cloudrun"" command to publish to Google Cloud Run", https://github.com/simonw/datasette/issues/446#issuecomment-489167692,https://api.github.com/repos/simonw/datasette/issues/446,489167692,MDEyOklzc3VlQ29tbWVudDQ4OTE2NzY5Mg==,9599,simonw,2019-05-03T17:02:24Z,2019-05-03T17:02:24Z,OWNER,"I looked at using namedtuples for this but hey have one major constraint: there isn't a clean way to convert them to dictionary-style JSON: https://bugs.python.org/issue30343 So something that uses a class which knows how to be rendered as JSON would be a better fit.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",440134714,Define mechanism for plugins to return structured data, https://github.com/simonw/datasette/pull/447#issuecomment-489190440,https://api.github.com/repos/simonw/datasette/issues/447,489190440,MDEyOklzc3VlQ29tbWVudDQ4OTE5MDQ0MA==,9599,simonw,2019-05-03T18:13:56Z,2019-05-03T18:13:56Z,OWNER,"This appears to fix a very weird error we were getting just on Python 3.7-dev: https://travis-ci.org/simonw/datasette/jobs/527858613 That weird error boiled down to `count` being `None`: ``` { ""columns"": [""pk"", ""distance"", ""frequency""], ""name"": ""units"", ""count"": 3, ""hidden"": False, ""foreign_keys"": {""incoming"": [], ""outgoing"": []}, ""fts_table"": None, ""primary_keys"": [""pk""], } # compared to: { ""name"": ""units"", ""columns"": [""pk"", ""distance"", ""frequency""], ""primary_keys"": [""pk""], ""count"": None, ""hidden"": False, ""fts_table"": None, ""foreign_keys"": {""incoming"": [], ""outgoing"": []}, } ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",440159137,Use dist: xenial and python: 3.7 on Travis, https://github.com/simonw/datasette/pull/442#issuecomment-489194450,https://api.github.com/repos/simonw/datasette/issues/442,489194450,MDEyOklzc3VlQ29tbWVudDQ4OTE5NDQ1MA==,9599,simonw,2019-05-03T18:26:48Z,2019-05-03T18:26:48Z,OWNER,"Demo here: https://latest.datasette.io/fixtures/binary_data I slightly tweaked the copy: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",438450757,Suppress rendering of binary data, https://github.com/simonw/datasette/issues/446#issuecomment-489204605,https://api.github.com/repos/simonw/datasette/issues/446,489204605,MDEyOklzc3VlQ29tbWVudDQ4OTIwNDYwNQ==,9599,simonw,2019-05-03T18:59:51Z,2019-05-03T18:59:59Z,OWNER,"Potential design: ```python from collections import OrderedDict class DataSpec: __slots__ = [] def __init__(self, **kwargs): if list(kwargs.keys()) != self.__slots__: raise TypeError( ""{}() has required arguments {} (got {})"".format( self.__class__.__name__, self.__slots__, list(kwargs.keys()) ) ) for key in self.__slots__: setattr(self, key, kwargs[key]) def __repr__(self): return ""<{} {}>"".format(self.__class__.__name__, dict(self.as_dict())) def as_dict(self): return OrderedDict([(key, getattr(self, key)) for key in self.__slots__]) class Output(DataSpec): __slots__ = [""body"", ""content_type"", ""status_code""] ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",440134714,Define mechanism for plugins to return structured data, https://github.com/simonw/datasette/issues/446#issuecomment-489209255,https://api.github.com/repos/simonw/datasette/issues/446,489209255,MDEyOklzc3VlQ29tbWVudDQ4OTIwOTI1NQ==,9599,simonw,2019-05-03T19:15:23Z,2019-05-03T19:16:34Z,OWNER,"The `register_output_renderer()` hook currently returns a dictionary with `body`, `content_type` and `status_code` keys but each of these keys are optionaly. I'm tempted to make all three required to better fit this model - @russss any objections? Alternative would be to support default values for properties of the `DataSpec` subclass - maybe: `__defaults__ = {""body"": """", ""content_type"": ""text/plain"", ""status_code"": 200}`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",440134714,Define mechanism for plugins to return structured data, https://github.com/simonw/datasette/issues/446#issuecomment-489221481,https://api.github.com/repos/simonw/datasette/issues/446,489221481,MDEyOklzc3VlQ29tbWVudDQ4OTIyMTQ4MQ==,45057,russss,2019-05-03T19:58:31Z,2019-05-03T19:58:31Z,CONTRIBUTOR,"In this particular case I don't think there's an issue making all those required. However, I suspect we might have to allow optional values at some point - my preferred solution to russss/datasette-geo#2 would need one.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",440134714,Define mechanism for plugins to return structured data, https://github.com/simonw/datasette/issues/446#issuecomment-489222223,https://api.github.com/repos/simonw/datasette/issues/446,489222223,MDEyOklzc3VlQ29tbWVudDQ4OTIyMjIyMw==,45057,russss,2019-05-03T20:01:19Z,2019-05-03T20:01:29Z,CONTRIBUTOR,"Also I have a slight preference against (ab)using `__slots__` to enforce fields, although I have done it myself in the past. It would be possible to do this with `__setattr__` instead, although that's an implementation detail and I'm not too fussed about it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",440134714,Define mechanism for plugins to return structured data, https://github.com/simonw/datasette/issues/448#issuecomment-489240609,https://api.github.com/repos/simonw/datasette/issues/448,489240609,MDEyOklzc3VlQ29tbWVudDQ4OTI0MDYwOQ==,9599,simonw,2019-05-03T21:09:13Z,2019-05-03T21:09:13Z,OWNER,It may be that some facet implementations (`ArrayFacet` in this case) need a way to detect if they are supported by the thing they are running against (must be a rowid table in this case) and avoid suggesting themselves if they are not compatible. This may require a change to the information we make available to the `suggest()` method (information passed to the Facet class constructor).,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",440222719,_facet_array should work against views, https://github.com/simonw/datasette/pull/367#issuecomment-489240874,https://api.github.com/repos/simonw/datasette/issues/367,489240874,MDEyOklzc3VlQ29tbWVudDQ4OTI0MDg3NA==,9599,simonw,2019-05-03T21:10:13Z,2019-05-03T21:10:13Z,OWNER,"This is a neat fix, thanks!","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",374675798,Mark codemirror files as vendored, https://github.com/simonw/datasette/pull/367#issuecomment-489241377,https://api.github.com/repos/simonw/datasette/issues/367,489241377,MDEyOklzc3VlQ29tbWVudDQ4OTI0MTM3Nw==,9599,simonw,2019-05-03T21:12:09Z,2019-05-03T21:12:09Z,OWNER,"Before applying this fix, GitHub showed the following statistics: Python 50.1% JavaScript 46.0% HTML 3.0% Other 0.9% Afterwards, it shows: Python 92.8% HTML 5.5% CSS 1.3% Dockerfile 0.4% ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",374675798,Mark codemirror files as vendored, https://github.com/simonw/datasette/pull/434#issuecomment-489250828,https://api.github.com/repos/simonw/datasette/issues/434,489250828,MDEyOklzc3VlQ29tbWVudDQ4OTI1MDgyOA==,9599,simonw,2019-05-03T21:50:44Z,2019-05-03T21:50:44Z,OWNER,Since there's a useful error message I'm OK with revisiting this in a few weeks to see if they change the CLI tool.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",434321685,"""datasette publish cloudrun"" command to publish to Google Cloud Run", https://github.com/simonw/datasette/pull/450#issuecomment-489342728,https://api.github.com/repos/simonw/datasette/issues/450,489342728,MDEyOklzc3VlQ29tbWVudDQ4OTM0MjcyOA==,45057,russss,2019-05-04T16:37:35Z,2019-05-04T16:37:35Z,CONTRIBUTOR,For a bit more context: this fixes a crash with `unsupported operand type(s) for +: 'int' and 'NoneType'` on the index page for me.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",440304714,Coalesce hidden table count to 0, https://github.com/simonw/datasette/issues/187#issuecomment-489353316,https://api.github.com/repos/simonw/datasette/issues/187,489353316,MDEyOklzc3VlQ29tbWVudDQ4OTM1MzMxNg==,46059,carsonyl,2019-05-04T18:36:36Z,2019-05-04T18:36:36Z,NONE,"Hi @simonw - I just hit this issue when trying out Datasette after your PyCon talk today. Datasette is pinned to Sanic 0.7.0, but it looks like 0.8.0 added the option to remove the uvloop dependency for Windows by having an environment variable `SANIC_NO_UVLOOP` at install time. Maybe that'll be sufficient before a port to Starlette?","{""total_count"": 1, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 1, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/datasette/issues/454#issuecomment-489420385,https://api.github.com/repos/simonw/datasette/issues/454,489420385,MDEyOklzc3VlQ29tbWVudDQ4OTQyMDM4NQ==,9599,simonw,2019-05-05T12:07:56Z,2019-05-05T12:10:13Z,OWNER,"Since I want the option to store more than one host, I don't think this should be a command-line option or a `--config` setting. Instead, I'm inclined to add this to `metadata.json`. Maybe this should be a plugin? That way the `metadata.json` setting could look like this: ``` { ""title"": ""Title of this instance"", ""plugins"": { ""datasette-cors"": { ""allowed_origins"": [""https://example.com""] } } } ``` This could be implemented easily on top of ASGI #272. (It should probably raise an exception on startup if any of the `allowed_origins` ends with a slash e.g. `""https://example.com/""` since that's not actually a valid origin, and it's an easy mistake to make.)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",440437037,Plugin for allowing CORS from specified hosts, https://github.com/simonw/datasette/issues/454#issuecomment-489420661,https://api.github.com/repos/simonw/datasette/issues/454,489420661,MDEyOklzc3VlQ29tbWVudDQ4OTQyMDY2MQ==,9599,simonw,2019-05-05T12:11:01Z,2019-05-05T12:11:01Z,OWNER,"Also worth considering: `Access-Control-Max-Age: 86400` support - maybe as a `""max_age""` setting for the plugin. This can reduce the number of preflight checks the browser needs to make.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",440437037,Plugin for allowing CORS from specified hosts, https://github.com/simonw/datasette/issues/453#issuecomment-489421634,https://api.github.com/repos/simonw/datasette/issues/453,489421634,MDEyOklzc3VlQ29tbWVudDQ4OTQyMTYzNA==,9599,simonw,2019-05-05T12:24:24Z,2019-05-05T12:24:24Z,OWNER,"Demo: https://latest.datasette.io/fixtures.json?sql=select+blah ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",440332621,Error pages do not return CORS header with --cors, https://github.com/simonw/datasette/issues/454#issuecomment-489433651,https://api.github.com/repos/simonw/datasette/issues/454,489433651,MDEyOklzc3VlQ29tbWVudDQ4OTQzMzY1MQ==,9599,simonw,2019-05-05T14:52:46Z,2019-05-05T14:52:46Z,OWNER,"I really like the idea of this as a plugin, because it will provide a great example of an ASGI plugin including how to build unit tests against Datasette plugins which actually start up a Datasette server and run some requests through it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",440437037,Plugin for allowing CORS from specified hosts, https://github.com/simonw/datasette/issues/187#issuecomment-490039343,https://api.github.com/repos/simonw/datasette/issues/187,490039343,MDEyOklzc3VlQ29tbWVudDQ5MDAzOTM0Mw==,6422964,Maltazar,2019-05-07T11:24:42Z,2019-05-07T11:24:42Z,NONE,I totally agree with carsonyl,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",309033998,Windows installation error, https://github.com/simonw/datasette/issues/457#issuecomment-490979355,https://api.github.com/repos/simonw/datasette/issues/457,490979355,MDEyOklzc3VlQ29tbWVudDQ5MDk3OTM1NQ==,9599,simonw,2019-05-09T16:43:29Z,2019-05-09T16:43:29Z,OWNER,From https://cloud.google.com/sdk/gcloud/reference/beta/run/deploy it looks like the service can be provided as an optional positional argument. How about `datasette publish cloudrun ... --service=sf-trees`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",442330564,"Ability to ""publish cloudrun"" with no user input", https://github.com/simonw/datasette/issues/454#issuecomment-491003082,https://api.github.com/repos/simonw/datasette/issues/454,491003082,MDEyOklzc3VlQ29tbWVudDQ5MTAwMzA4Mg==,9599,simonw,2019-05-09T17:53:45Z,2019-05-09T17:53:57Z,OWNER,I built a new ASGI middleware component for CORS headers which I can use to implement this: https://pypi.org/project/asgi-cors/ and https://github.com/simonw/asgi-cors ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",440437037,Plugin for allowing CORS from specified hosts, https://github.com/simonw/datasette/issues/457#issuecomment-491038808,https://api.github.com/repos/simonw/datasette/issues/457,491038808,MDEyOklzc3VlQ29tbWVudDQ5MTAzODgwOA==,9599,simonw,2019-05-09T19:41:31Z,2019-05-09T19:41:31Z,OWNER,Updated docs now live at https://datasette.readthedocs.io/en/latest/publish.html#publishing-to-google-cloud-run ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",442330564,"Ability to ""publish cloudrun"" with no user input", https://github.com/simonw/datasette/issues/456#issuecomment-491039036,https://api.github.com/repos/simonw/datasette/issues/456,491039036,MDEyOklzc3VlQ29tbWVudDQ5MTAzOTAzNg==,9599,simonw,2019-05-09T19:42:11Z,2019-05-09T19:42:11Z,OWNER,Thanks for spotting this! Pull request would be very welcome.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",442327592,Installing installs the tests package, https://github.com/simonw/datasette/pull/458#issuecomment-491126028,https://api.github.com/repos/simonw/datasette/issues/458,491126028,MDEyOklzc3VlQ29tbWVudDQ5MTEyNjAyOA==,9599,simonw,2019-05-10T01:54:59Z,2019-05-10T01:54:59Z,OWNER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",442402832,setup: add tests to package exclusion, https://github.com/simonw/datasette/issues/459#issuecomment-491531364,https://api.github.com/repos/simonw/datasette/issues/459,491531364,MDEyOklzc3VlQ29tbWVudDQ5MTUzMTM2NA==,9599,simonw,2019-05-11T17:50:37Z,2019-05-11T17:50:37Z,OWNER,I also need to fix my various `.travis.yml` configs to adapt to this new change.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443020048,"Fix the ""datasette now publish ... --alias=x"" option", https://github.com/simonw/datasette/issues/459#issuecomment-491531399,https://api.github.com/repos/simonw/datasette/issues/459,491531399,MDEyOklzc3VlQ29tbWVudDQ5MTUzMTM5OQ==,9599,simonw,2019-05-11T17:51:11Z,2019-05-11T17:51:11Z,OWNER,"This is the publish code that needs updating: https://github.com/simonw/datasette/blob/e7b31ae8c1a28cab9db8e165b3f21407c2e581e6/datasette/publish/now.py#L79-L96","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443020048,"Fix the ""datasette now publish ... --alias=x"" option", https://github.com/simonw/datasette/issues/461#issuecomment-491532380,https://api.github.com/repos/simonw/datasette/issues/461,491532380,MDEyOklzc3VlQ29tbWVudDQ5MTUzMjM4MA==,9599,simonw,2019-05-11T18:06:01Z,2019-05-11T18:06:01Z,OWNER,"I plan to add a search filter box too, but only if there are more than X (probably 10) connected databases.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443021509,Paginate + search for databases/tables on the homepage, https://github.com/simonw/datasette/issues/373#issuecomment-491532557,https://api.github.com/repos/simonw/datasette/issues/373,491532557,MDEyOklzc3VlQ29tbWVudDQ5MTUzMjU1Nw==,9599,simonw,2019-05-11T18:08:47Z,2019-05-11T18:08:47Z,OWNER,I'll do this as part of #460 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",377266351,Views should be shown on root/index page along with tables, https://github.com/simonw/datasette/pull/450#issuecomment-491532671,https://api.github.com/repos/simonw/datasette/issues/450,491532671,MDEyOklzc3VlQ29tbWVudDQ5MTUzMjY3MQ==,9599,simonw,2019-05-11T18:10:09Z,2019-05-11T18:10:09Z,OWNER,I fixed this in https://github.com/simonw/datasette/commit/de005b9b7d3db72375e6b8b048d1616a98e6347a instead (hadn't seen this PR),"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",440304714,Coalesce hidden table count to 0, https://github.com/simonw/datasette/issues/316#issuecomment-491533700,https://api.github.com/repos/simonw/datasette/issues/316,491533700,MDEyOklzc3VlQ29tbWVudDQ5MTUzMzcwMA==,9599,simonw,2019-05-11T18:26:25Z,2019-05-11T18:26:25Z,OWNER,This will be fixed by #419 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",333238932,datasette inspect takes a very long time on large dbs, https://github.com/simonw/datasette/issues/459#issuecomment-491533931,https://api.github.com/repos/simonw/datasette/issues/459,491533931,MDEyOklzc3VlQ29tbWVudDQ5MTUzMzkzMQ==,9599,simonw,2019-05-11T18:29:37Z,2019-05-11T19:26:11Z,OWNER,This needs doing before the next release because it is interfering with my Travis code that actually runs the release: https://travis-ci.org/simonw/datasette/jobs/530597261,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443020048,"Fix the ""datasette now publish ... --alias=x"" option", https://github.com/simonw/datasette/issues/435#issuecomment-491534162,https://api.github.com/repos/simonw/datasette/issues/435,491534162,MDEyOklzc3VlQ29tbWVudDQ5MTUzNDE2Mg==,9599,simonw,2019-05-11T18:33:02Z,2019-05-11T18:36:33Z,OWNER,"I don't like the shape of the JSON: ``` ""_traces"": { ""num_traces"": 20, ""traces"": { ""duration"": 0.015190839767456055, ""queries"": [ [ ""sql"", [ ""fixtures"", ""select 1 from sqlite_master where type='table' and name=?"", [ ""facetable.json"" ] ], 1557599406.7231224, 1557599406.723611, 0.4887580871582031 ], [ ``` I want this instead: ``` ""_traces"": { ""num_traces"": 20, ""sum_duration_ms"": 0.015190839767456055, ""traces"": [ { ""type"": ""sql"", ""database"": ""fixtures"", ""sql"": ""select 1 from sqlite_master where type='table' and name=?"", ""args"": [""facetable.json""], ""start"": 1557599406.7231224, ""end"": 1557599406.723611, ""duration_ms"": 0.4887580871582031 }","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",435531034,Tracing support for seeing what SQL queries were executed, https://github.com/simonw/datasette/issues/460#issuecomment-491536725,https://api.github.com/repos/simonw/datasette/issues/460,491536725,MDEyOklzc3VlQ29tbWVudDQ5MTUzNjcyNQ==,9599,simonw,2019-05-11T19:11:44Z,2019-05-11T19:11:44Z,OWNER,I split pagination out to #461 - and I don't consider that necessary to ship the next release.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443020810,Design changes to homepage to support mutable files, https://github.com/simonw/datasette/issues/435#issuecomment-491541721,https://api.github.com/repos/simonw/datasette/issues/435,491541721,MDEyOklzc3VlQ29tbWVudDQ5MTU0MTcyMQ==,9599,simonw,2019-05-11T20:32:21Z,2019-05-11T20:32:21Z,OWNER,"Demo of the finished feature: https://latest.datasette.io/fixtures/facetable?_trace=1 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",435531034,Tracing support for seeing what SQL queries were executed, https://github.com/simonw/datasette/issues/462#issuecomment-491543635,https://api.github.com/repos/simonw/datasette/issues/462,491543635,MDEyOklzc3VlQ29tbWVudDQ5MTU0MzYzNQ==,9599,simonw,2019-05-11T21:03:10Z,2019-05-11T21:03:23Z,OWNER,"`test_inspect.py` currently just contains two tests that exercise a small portion of what `.inspect()` does - I'm going to repurpose that module and have it only test the `datasette inspect` CLI command instead. Here's the current contents of that file: https://github.com/simonw/datasette/blob/ce09e5d2d392634eced44c3c8d603d7c628e2822/tests/test_inspect.py","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443023308,Replace most of `.inspect()` (and `datasette inspect`) with table counting, https://github.com/simonw/datasette/issues/462#issuecomment-491543785,https://api.github.com/repos/simonw/datasette/issues/462,491543785,MDEyOklzc3VlQ29tbWVudDQ5MTU0Mzc4NQ==,9599,simonw,2019-05-11T21:05:54Z,2019-05-11T21:05:54Z,OWNER,"So I think `datasette inspect fixtures.db other.db` should output something like this: ```json { ""fixtures"": { ""hash"": ""894870db97229e9e18b40921dc32b581da813465d672445e96e040ab2adbd229"", ""file"": ""fixtures.db"", ""size"": 225280, ""tables"": { ""facetable"": { ""count"": 34, } } } ``` It currently writes it out to a file called `inspect-data.json`. Should I keep that as the default behaviour or switch it to outputting to stdout instead? Here's the current `datasette inspect --help`: ```$ datasette inspect --help Usage: datasette inspect [OPTIONS] [FILES]... Options: --inspect-file TEXT --load-extension PATH Path to a SQLite extension to load --help Show this message and exit.```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443023308,Replace most of `.inspect()` (and `datasette inspect`) with table counting, https://github.com/simonw/datasette/issues/462#issuecomment-491543817,https://api.github.com/repos/simonw/datasette/issues/462,491543817,MDEyOklzc3VlQ29tbWVudDQ5MTU0MzgxNw==,9599,simonw,2019-05-11T21:06:13Z,2019-05-11T21:06:13Z,OWNER,I'm going to change it to output to stdout unless you pass it the `--inspect-file` argument.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443023308,Replace most of `.inspect()` (and `datasette inspect`) with table counting, https://github.com/simonw/datasette/issues/465#issuecomment-491545872,https://api.github.com/repos/simonw/datasette/issues/465,491545872,MDEyOklzc3VlQ29tbWVudDQ5MTU0NTg3Mg==,9599,simonw,2019-05-11T21:40:07Z,2019-05-11T21:40:07Z,OWNER,I split this out from #462,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443038584,Decide what to do about /-/inspect, https://github.com/simonw/datasette/issues/295#issuecomment-491545892,https://api.github.com/repos/simonw/datasette/issues/295,491545892,MDEyOklzc3VlQ29tbWVudDQ5MTU0NTg5Mg==,9599,simonw,2019-05-11T21:40:32Z,2019-05-11T21:40:32Z,OWNER,"I'm not going to do this, as a result of #462 and #419 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",327383759,Extract unit tests for inspect out to test_inspect.py, https://github.com/simonw/datasette/issues/462#issuecomment-491545952,https://api.github.com/repos/simonw/datasette/issues/462,491545952,MDEyOklzc3VlQ29tbWVudDQ5MTU0NTk1Mg==,9599,simonw,2019-05-11T21:41:49Z,2019-05-11T21:41:49Z,OWNER,I now need to update `datasette serve ... --inspect-data=X` to understand and correctly handle the new format.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443023308,Replace most of `.inspect()` (and `datasette inspect`) with table counting, https://github.com/simonw/datasette/issues/466#issuecomment-491548189,https://api.github.com/repos/simonw/datasette/issues/466,491548189,MDEyOklzc3VlQ29tbWVudDQ5MTU0ODE4OQ==,9599,simonw,2019-05-11T22:21:40Z,2019-05-11T22:21:47Z,OWNER,"This is a little bit tricky. This SQL looks like it may detect Spatialite tables: ```sql select * from sqlite_master where type = ""table"" and sql like ""%CREATE VIRTUAL TABLE%"" and sql like ""%USING VirtualSpatialIndex%"" ``` But where to put it? I think this should go in a new ""checks"" mechanism, where we run checks against every connected database on Datasette startup.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443040665,"Move ""no such module: VirtualSpatialIndex"" code elsewhere", https://github.com/simonw/datasette/issues/466#issuecomment-491549929,https://api.github.com/repos/simonw/datasette/issues/466,491549929,MDEyOklzc3VlQ29tbWVudDQ5MTU0OTkyOQ==,9599,simonw,2019-05-11T22:55:23Z,2019-05-11T22:55:23Z,OWNER,"To build a unit test for this I'm going to have to ship a small spatialite.db binary database as part of the git repo. This is because I need the tests to run even when the spatialite module is not available - but you cannot create a spatialite database without having access to that module. I'll include a build script in the repo for constructing that database.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443040665,"Move ""no such module: VirtualSpatialIndex"" code elsewhere", https://github.com/simonw/datasette/issues/418#issuecomment-491551647,https://api.github.com/repos/simonw/datasette/issues/418,491551647,MDEyOklzc3VlQ29tbWVudDQ5MTU1MTY0Nw==,9599,simonw,2019-05-11T23:31:23Z,2019-05-11T23:31:23Z,OWNER,Actually right now https://latest.datasette.io/fixtures/facetable?_hash=1 redirects to https://latest.datasette.io/fixtures-000/facetable - because we are no longer calculating hashes on startup for non-immutable databases. So that's weird.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421548881,Hashed URLs should be optional, https://github.com/simonw/datasette/issues/418#issuecomment-491551702,https://api.github.com/repos/simonw/datasette/issues/418,491551702,MDEyOklzc3VlQ29tbWVudDQ5MTU1MTcwMg==,9599,simonw,2019-05-11T23:32:21Z,2019-05-11T23:32:21Z,OWNER,"I'm going to re-open this, because some of this needs revisiting now that we aren't running `.inspect()` and hence are not calculating hashes for anything other than immutable databases (and databases are treated as mutable by default).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421548881,Hashed URLs should be optional, https://github.com/simonw/datasette/issues/231#issuecomment-491943956,https://api.github.com/repos/simonw/datasette/issues/231,491943956,MDEyOklzc3VlQ29tbWVudDQ5MTk0Mzk1Ng==,9599,simonw,2019-05-13T18:56:21Z,2019-05-13T18:56:21Z,OWNER,I implemented this a while ago but forgot to close the issue: https://datasette.readthedocs.io/en/stable/plugins.html#plugin-configuration,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",316323336,metadata.json support for plugin configuration options, https://github.com/simonw/datasette/issues/14#issuecomment-491944613,https://api.github.com/repos/simonw/datasette/issues/14,491944613,MDEyOklzc3VlQ29tbWVudDQ5MTk0NDYxMw==,9599,simonw,2019-05-13T18:58:19Z,2019-05-13T18:58:19Z,OWNER,"We've grown a bunch of plugin hooks over the past two years: https://datasette.readthedocs.io/en/latest/plugins.html#plugin-hooks Since the plugin system will never be 100% ""finished"", I'm closing this in favor of the label: https://github.com/simonw/datasette/labels/plugins","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",267707940,Datasette Plugins, https://github.com/simonw/datasette/issues/59#issuecomment-491945391,https://api.github.com/repos/simonw/datasette/issues/59,491945391,MDEyOklzc3VlQ29tbWVudDQ5MTk0NTM5MQ==,9599,simonw,2019-05-13T19:00:44Z,2019-05-13T19:01:00Z,OWNER,Hyper shut down at the start of this year: https://news.ycombinator.com/item?id=18734658,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",273157085,datasette publish hyper, https://github.com/simonw/datasette/issues/460#issuecomment-492285114,https://api.github.com/repos/simonw/datasette/issues/460,492285114,MDEyOklzc3VlQ29tbWVudDQ5MjI4NTExNA==,9599,simonw,2019-05-14T15:22:58Z,2019-05-14T15:22:58Z,OWNER,If a database has less than 10 tables AND I can get a full count of all 10 of them in under 10ms each then I think I'll still show the row counts.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443020810,Design changes to homepage to support mutable files, https://github.com/simonw/datasette/issues/455#issuecomment-492296234,https://api.github.com/repos/simonw/datasette/issues/455,492296234,MDEyOklzc3VlQ29tbWVudDQ5MjI5NjIzNA==,9599,simonw,2019-05-14T15:49:09Z,2019-05-14T15:49:29Z,OWNER,Part of #460,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",441858747,Hidden tables shown on the index page, https://github.com/simonw/datasette/issues/461#issuecomment-492296836,https://api.github.com/repos/simonw/datasette/issues/461,492296836,MDEyOklzc3VlQ29tbWVudDQ5MjI5NjgzNg==,9599,simonw,2019-05-14T15:50:34Z,2019-05-14T15:51:11Z,OWNER,This is needed by Datasette Library #417 since that's going to demand listing a LOT of databases on the homepage.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443021509,Paginate + search for databases/tables on the homepage, https://github.com/simonw/datasette/issues/467#issuecomment-492883561,https://api.github.com/repos/simonw/datasette/issues/467,492883561,MDEyOklzc3VlQ29tbWVudDQ5Mjg4MzU2MQ==,9599,simonw,2019-05-16T01:40:09Z,2019-05-16T01:40:09Z,OWNER,"I'm setting X to 30 because the fixtures database currently has 26 tables (22 visible, 4 hidden) and I want to display counts for it.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",444711254,Index page row counts only for DBs with < 30 tables (10ms count limit per table), https://github.com/simonw/datasette/issues/467#issuecomment-492898241,https://api.github.com/repos/simonw/datasette/issues/467,492898241,MDEyOklzc3VlQ29tbWVudDQ5Mjg5ODI0MQ==,9599,simonw,2019-05-16T03:02:27Z,2019-05-16T03:02:27Z,OWNER,"I'm going to be lazy and skip the unit test for this, because I don't currently have a neat way of mocking a SQL interrupted exception to simulate a query taking too long (at least for these counts).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",444711254,Index page row counts only for DBs with < 30 tables (10ms count limit per table), https://github.com/simonw/datasette/issues/460#issuecomment-492898595,https://api.github.com/repos/simonw/datasette/issues/460,492898595,MDEyOklzc3VlQ29tbWVudDQ5Mjg5ODU5NQ==,9599,simonw,2019-05-16T03:04:29Z,2019-05-16T03:04:29Z,OWNER,One last thing before I close this: sort tables by number of inbound/outbound foreign keys.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443020810,Design changes to homepage to support mutable files, https://github.com/simonw/datasette/issues/460#issuecomment-492899100,https://api.github.com/repos/simonw/datasette/issues/460,492899100,MDEyOklzc3VlQ29tbWVudDQ5Mjg5OTEwMA==,9599,simonw,2019-05-16T03:07:41Z,2019-05-16T03:07:41Z,OWNER,"I'm going to sort by row counts first, but if row counts aren't available I'll fall back to number of inbound/outbound foreign keys. To make unit testing easier, I'll accept an undocumented ?_sort=relationships parameter","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443020810,Design changes to homepage to support mutable files, https://github.com/simonw/datasette/issues/419#issuecomment-492903398,https://api.github.com/repos/simonw/datasette/issues/419,492903398,MDEyOklzc3VlQ29tbWVudDQ5MjkwMzM5OA==,9599,simonw,2019-05-16T03:33:01Z,2019-05-16T03:33:01Z,OWNER,"@russss sorry I only just spotted your comment here. I think I have an alternative suggestion for what you need to do here. It sounds to me like you need to calculate a specific piece of information against a specific database. Instead of doing this in inspect, how about having a separate tool which runs this once against the database file and writes the result into a database file there? I've been thinking about this pattern a bit as part of the sqlite-utils work I've been doing. It's already something that's needed for SQLite FTS support - it's no good just creating a FTS index, you have to populate it as well. In sqlite-utils world you do that like this: https://sqlite-utils.readthedocs.io/en/latest/cli.html#configuring-full-text-search $ sqlite-utils enable-fts mydb.db documents title summary But then later if you've inserted new records you have to call this: $ sqlite-utils populate-fts mydb.db documents title summary So one option here could be for `datasette-geo` to know to look for a special `datasette_geo_bounding_box` database table and, if it's missing, to calculate at runtime (probably once on startup and then cache it). Another option: Datasette now has an option to open a database file in ""immutable"" mode, using `datasette -i mydatabase.db`. When you do that we calculate counts on startup - and we'll also be able to load counts from the `inspect-data.json` file (that's pretty much all that will be in there). I'm open to making this available as a plugin hook - all kinds of optimizations could be run against these `-i` databases. It would essentially be what we have with inspect today but just for databases opened in that specific mode.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421551434,"Default to opening files in mutable mode, special option for immutable files", https://github.com/simonw/datasette/issues/460#issuecomment-492903581,https://api.github.com/repos/simonw/datasette/issues/460,492903581,MDEyOklzc3VlQ29tbWVudDQ5MjkwMzU4MQ==,9599,simonw,2019-05-16T03:34:08Z,2019-05-16T03:34:08Z,OWNER,Demo of above: https://latest.datasette.io/?_sort=relationships compared to https://latest.datasette.io/,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443020810,Design changes to homepage to support mutable files, https://github.com/simonw/datasette/issues/465#issuecomment-492904704,https://api.github.com/repos/simonw/datasette/issues/465,492904704,MDEyOklzc3VlQ29tbWVudDQ5MjkwNDcwNA==,9599,simonw,2019-05-16T03:41:27Z,2019-05-16T03:41:27Z,OWNER,"The main use-case for this endpoint now is going to be [Datasette Registry](https://github.com/simonw/datasette-registry) (which really needs some more love). That tool needs to be able to query a Datasette and find out: * What tables are available * What their columns are * Ideally, their row counts A single `/-/inspect` call is no good here because with Datasette Library #417 I'm going to be encouraging MUCH larger Datasette instances, potentially with hundreds of attached databases and thousands of attached tables. So pagination will be essential. Maybe a smarter approach will be the older idea of having a separate inspect for each database (and maybe each table): * `/mydatabase/-/inspect` * `/mydatabase/mytable/-/inspect` Either way, I'm going to decouple this from milestone 0.28.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443038584,Decide what to do about /-/inspect, https://github.com/simonw/datasette/issues/464#issuecomment-492917925,https://api.github.com/repos/simonw/datasette/issues/464,492917925,MDEyOklzc3VlQ29tbWVudDQ5MjkxNzkyNQ==,9599,simonw,2019-05-16T05:04:35Z,2019-05-16T05:04:35Z,OWNER,https://datasette.readthedocs.io/en/latest/getting_started.html#glitch,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443034218,Add Glitch to Getting Started docs section, https://github.com/simonw/datasette/issues/471#issuecomment-493102841,https://api.github.com/repos/simonw/datasette/issues/471,493102841,MDEyOklzc3VlQ29tbWVudDQ5MzEwMjg0MQ==,9599,simonw,2019-05-16T14:56:50Z,2019-05-16T15:10:11Z,OWNER,This is a good opportunity to add some missing test coverage for this feature.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",445003029,?_hash=1 and --config hash_urls:1 should only work for immutable databases, https://github.com/simonw/datasette/issues/418#issuecomment-493109347,https://api.github.com/repos/simonw/datasette/issues/418,493109347,MDEyOklzc3VlQ29tbWVudDQ5MzEwOTM0Nw==,9599,simonw,2019-05-16T15:12:26Z,2019-05-16T15:12:26Z,OWNER,I'm ready to close this now thanks to fixing #471 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421548881,Hashed URLs should be optional, https://github.com/simonw/datasette/issues/419#issuecomment-493110184,https://api.github.com/repos/simonw/datasette/issues/419,493110184,MDEyOklzc3VlQ29tbWVudDQ5MzExMDE4NA==,9599,simonw,2019-05-16T15:14:31Z,2019-05-16T15:14:31Z,OWNER,"This is done bar the documentation, which is tracked in #421 ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421551434,"Default to opening files in mutable mode, special option for immutable files", https://github.com/simonw/datasette/issues/472#issuecomment-493564683,https://api.github.com/repos/simonw/datasette/issues/472,493564683,MDEyOklzc3VlQ29tbWVudDQ5MzU2NDY4Mw==,9599,simonw,2019-05-17T19:03:49Z,2019-05-17T19:03:49Z,OWNER,Should set up an alias for the existing `now` to avoid breaking existing automation scripts: http://click.palletsprojects.com/en/5.x/advanced/#command-aliases,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",445230077,"Rename ""publish now"" to ""publish nowv1""", https://github.com/simonw/datasette/issues/421#issuecomment-493728476,https://api.github.com/repos/simonw/datasette/issues/421,493728476,MDEyOklzc3VlQ29tbWVudDQ5MzcyODQ3Ng==,9599,simonw,2019-05-19T05:32:37Z,2019-05-19T05:32:37Z,OWNER,https://datasette.readthedocs.io/en/latest/performance.html#http-caching,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",421985685,Documentation for ?_hash=1 and Datasette's hashed URL caching, https://github.com/simonw/datasette/issues/473#issuecomment-493781417,https://api.github.com/repos/simonw/datasette/issues/473,493781417,MDEyOklzc3VlQ29tbWVudDQ5Mzc4MTQxNw==,9599,simonw,2019-05-19T18:45:15Z,2019-05-19T18:45:15Z,OWNER,This expands on the refactoring work from https://github.com/simonw/datasette/commit/6da567dda953c7ac0e5500f17d8e220467a3499e,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",445850934,Plugin hook: filters_from_request, https://github.com/simonw/datasette/issues/470#issuecomment-493784506,https://api.github.com/repos/simonw/datasette/issues/470,493784506,MDEyOklzc3VlQ29tbWVudDQ5Mzc4NDUwNg==,9599,simonw,2019-05-19T19:28:43Z,2019-05-19T19:28:43Z,OWNER,Demo: https://latest.datasette.io/-/databases,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",444997937,/-/databases showing currently attached database details, https://github.com/simonw/datasette/issues/474#issuecomment-493785170,https://api.github.com/repos/simonw/datasette/issues/474,493785170,MDEyOklzc3VlQ29tbWVudDQ5Mzc4NTE3MA==,9599,simonw,2019-05-19T19:36:25Z,2019-05-19T19:36:25Z,OWNER,"This needs to happen in two places: https://github.com/simonw/datasette/blob/260085838887ee343f4d3b177c422e7aef5ade9d/datasette/templates/database.html#L59-L61 https://github.com/simonw/datasette/blob/260085838887ee343f4d3b177c422e7aef5ade9d/datasette/views/database.py#L75-L80 Plus new unit test.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",445855789,Do not allow downloads of mutable databases, https://github.com/simonw/datasette/issues/463#issuecomment-493793021,https://api.github.com/repos/simonw/datasette/issues/463,493793021,MDEyOklzc3VlQ29tbWVudDQ5Mzc5MzAyMQ==,9599,simonw,2019-05-19T21:24:44Z,2019-05-19T21:24:44Z,OWNER,https://datasette.readthedocs.io/en/latest/changelog.html#v0-28,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",443034003,Write release notes for 0.28, https://github.com/simonw/datasette/issues/478#issuecomment-493794975,https://api.github.com/repos/simonw/datasette/issues/478,493794975,MDEyOklzc3VlQ29tbWVudDQ5Mzc5NDk3NQ==,9599,simonw,2019-05-19T21:53:04Z,2019-05-19T21:53:04Z,OWNER,"Example build that's taking ages to push the release: https://travis-ci.org/simonw/datasette/builds/534573406 Here's the relevant Travis CI configuration: https://github.com/simonw/datasette/blob/e518f76c5f5dd0138032bfb26387f5bb91086a3f/.travis.yml#L31-L52","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",445868234,Make it so Docker build doesn't delay PyPI release, https://github.com/simonw/datasette/issues/451#issuecomment-493797127,https://api.github.com/repos/simonw/datasette/issues/451,493797127,MDEyOklzc3VlQ29tbWVudDQ5Mzc5NzEyNw==,9599,simonw,2019-05-19T22:23:43Z,2019-05-19T22:23:43Z,OWNER,"Done - and shipped a release! See https://simonwillison.net/2019/May/19/datasette-0-28/","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",440313209,Update README, https://github.com/simonw/datasette/issues/469#issuecomment-493799012,https://api.github.com/repos/simonw/datasette/issues/469,493799012,MDEyOklzc3VlQ29tbWVudDQ5Mzc5OTAxMg==,9599,simonw,2019-05-19T22:53:18Z,2019-05-19T22:53:18Z,OWNER,"I manually tested this against heroku, cloud run and nowv1 and all three worked as expected - in particular they all offered working ""download this database"" links.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",444749373,publish commands should use new -i option, https://github.com/simonw/datasette/pull/479#issuecomment-494064003,https://api.github.com/repos/simonw/datasette/issues/479,494064003,MDEyOklzc3VlQ29tbWVudDQ5NDA2NDAwMw==,9599,simonw,2019-05-20T16:42:18Z,2019-05-20T16:42:18Z,OWNER,Thanks!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",445873563,doc typo fix, https://github.com/simonw/datasette/pull/480#issuecomment-494064357,https://api.github.com/repos/simonw/datasette/issues/480,494064357,MDEyOklzc3VlQ29tbWVudDQ5NDA2NDM1Nw==,9599,simonw,2019-05-20T16:43:14Z,2019-05-20T16:43:14Z,OWNER,Thanks for this - looks like exactly what I wanted. I'll test this next time I ship a minor release.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",445875242,Split pypi and docker travis tasks, https://github.com/simonw/datasette/issues/478#issuecomment-494064566,https://api.github.com/repos/simonw/datasette/issues/478,494064566,MDEyOklzc3VlQ29tbWVudDQ5NDA2NDU2Ng==,9599,simonw,2019-05-20T16:43:51Z,2019-05-20T16:43:51Z,OWNER,"The fix from @glasnt looks exactly right, I'll test it out next time I ship a minor release.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",445868234,Make it so Docker build doesn't delay PyPI release, https://github.com/simonw/datasette/issues/272#issuecomment-494190922,https://api.github.com/repos/simonw/datasette/issues/272,494190922,MDEyOklzc3VlQ29tbWVudDQ5NDE5MDkyMg==,9599,simonw,2019-05-21T00:00:40Z,2019-05-21T00:01:09Z,OWNER,"Wow, this issue has been open for a full year now! I've been thinking about this a lot. I've decided I want Datasette to use ASGI 3.0 internally with no dependencies on anything else - then I want the option to run Datasette under both daphne and uvicorn - because uvicorn doesn't support Python 3.5 but Datasette still needs to (primarily for Glitch), and daphne works with 3.5. So I'm going to try to go the following route: - Every Datasette view becomes an ASGI app - The Datasette application itself is an ASGI app that routes to those views - When you `pip install datasette` you get Daphne as a dependency (I'd like you to be able to opt-out of installing Daphne, I'm not yet sure how that would work) - A new `asgi_serve` plugin hook allows a plugin to serve Datasette using uvicorn (or hypercorn) instead","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-494191378,https://api.github.com/repos/simonw/datasette/issues/272,494191378,MDEyOklzc3VlQ29tbWVudDQ5NDE5MTM3OA==,9599,simonw,2019-05-21T00:02:48Z,2019-05-21T00:02:48Z,OWNER,"I said earlier that I only need to support GET - I actually need to be able to support POST too, mainly to support plugins (e.g. a plugin that allows authenticated login before you can view Datasette, but potentially also plugins that let you write data directly to SQLite as well).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-494191738,https://api.github.com/repos/simonw/datasette/issues/272,494191738,MDEyOklzc3VlQ29tbWVudDQ5NDE5MTczOA==,9599,simonw,2019-05-21T00:05:02Z,2019-05-21T00:05:02Z,OWNER,While I'm not depending on Starlette any more I will need to instead depend on https://github.com/andrew-d/python-multipart for POST form parsing - as used by Starlette here https://github.com/encode/starlette/blob/ab86530eddfcf56e0f7e5ca56f6ab69c15594a7d/starlette/requests.py#L178-L193,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-494192163,https://api.github.com/repos/simonw/datasette/issues/272,494192163,MDEyOklzc3VlQ29tbWVudDQ5NDE5MjE2Mw==,9599,simonw,2019-05-21T00:07:25Z,2019-05-21T00:07:25Z,OWNER,"Bah, I'd much rather depend on Starlette for things like form parsing - but it's 3.6+ only! https://github.com/encode/starlette/blob/ab86530eddfcf56e0f7e5ca56f6ab69c15594a7d/setup.py#L39 Maybe I could require Python 3.6 or higher if you want to handle POST data? This would make my internals far too complicated though I think.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/272#issuecomment-494192779,https://api.github.com/repos/simonw/datasette/issues/272,494192779,MDEyOklzc3VlQ29tbWVudDQ5NDE5Mjc3OQ==,9599,simonw,2019-05-21T00:10:47Z,2019-05-21T00:10:47Z,OWNER,"https://github.com/simonw/datasette/commit/9fdb47ca952b93b7b60adddb965ea6642b1ff523 added `decode_path_component()` and `encode_path_component()` functions because ASGI decodes %2F encoded slashes in URLs automatically. The new encoding scheme looks like this: ""table/and/slashes"" => ""tableU+002FandU+002Fslashes"" ""~table"" => ""U+007Etable"" ""+bobcats!"" => ""U+002Bbobcats!"" ""U+007Etable"" => ""UU+002B007Etable"" For background see this comment: https://github.com/django/asgiref/issues/51#issuecomment-450603464","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/481#issuecomment-494255074,https://api.github.com/repos/simonw/datasette/issues/481,494255074,MDEyOklzc3VlQ29tbWVudDQ5NDI1NTA3NA==,9599,simonw,2019-05-21T06:18:17Z,2019-05-21T06:18:17Z,OWNER,"Demo: https://latest.datasette.io/fixtures/facetable?_facet_date=created ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",446429421,Facet by date, https://github.com/simonw/datasette/issues/272#issuecomment-494297022,https://api.github.com/repos/simonw/datasette/issues/272,494297022,MDEyOklzc3VlQ29tbWVudDQ5NDI5NzAyMg==,647359,tomchristie,2019-05-21T08:39:17Z,2019-05-21T08:39:17Z,NONE,"Useful context stuff: > ASGI decodes %2F encoded slashes in URLs automatically `raw_path` for ASGI looks to be under consideration: https://github.com/django/asgiref/issues/87 > uvicorn doesn't support Python 3.5 That was an issue specifically against the <=3.5.2 minor point releases of Python, now resolved: https://github.com/encode/uvicorn/issues/330 👍 > Starlette for things like form parsing - but it's 3.6+ only! Yeah - the bits that require 3.6 are anywhere with the ""async for"" syntax. If it wasn't for that I'd downport it, but that one's a pain. It's the one bit of syntax to watch out for if you're looking to bring any bits of implementation across to Datasette. ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",324188953,Port Datasette to ASGI, https://github.com/simonw/datasette/issues/184#issuecomment-494459264,https://api.github.com/repos/simonw/datasette/issues/184,494459264,MDEyOklzc3VlQ29tbWVudDQ5NDQ1OTI2NA==,222245,carlmjohnson,2019-05-21T16:17:29Z,2019-05-21T16:17:29Z,NONE,"Reopening this because it still raises 500 for incorrect table capitalization. Example: - https://salaries.news.baltimoresun.com/salaries/2018+Maryland+state+salaries/1 200 OK - https://salaries.news.baltimoresun.com/salaries/bad-table/1 400 - https://salaries.news.baltimoresun.com/salaries/2018+maryland+state+salaries/1 500 Internal Error (note lowercase 'm') I think because the table name exists but is not in its canonical form, it triggers a dict lookup error.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",292011379,500 from missing table name, https://github.com/simonw/datasette/issues/483#issuecomment-495032818,https://api.github.com/repos/simonw/datasette/issues/483,495032818,MDEyOklzc3VlQ29tbWVudDQ5NTAzMjgxOA==,9599,simonw,2019-05-23T01:28:06Z,2019-05-23T01:28:06Z,OWNER,"Here's a UI concept: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",447408527,Option to facet by date using month or year, https://github.com/simonw/datasette/issues/483#issuecomment-495032933,https://api.github.com/repos/simonw/datasette/issues/483,495032933,MDEyOklzc3VlQ29tbWVudDQ5NTAzMjkzMw==,9599,simonw,2019-05-23T01:28:45Z,2019-05-23T01:28:45Z,OWNER,Would this be useful for other facets? How many facet types are likely to have a small number of options that could be linked to in this way?,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",447408527,Option to facet by date using month or year, https://github.com/simonw/datasette/issues/483#issuecomment-495034774,https://api.github.com/repos/simonw/datasette/issues/483,495034774,MDEyOklzc3VlQ29tbWVudDQ5NTAzNDc3NA==,45919695,jcmkk3,2019-05-23T01:38:32Z,2019-05-23T01:43:04Z,NONE,"I think that location information is one of the other common pieces of hierarchical data. At least one that is general enough that extra dimensions could be auto-generated. Also, I think this is an awesome project. Thank you for creating this.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",447408527,Option to facet by date using month or year, https://github.com/simonw/datasette/issues/259#issuecomment-495058104,https://api.github.com/repos/simonw/datasette/issues/259,495058104,MDEyOklzc3VlQ29tbWVudDQ5NTA1ODEwNA==,9599,simonw,2019-05-23T03:55:37Z,2019-05-23T03:55:37Z,OWNER,I got rid of inspect in #462 - I will still be doing many-to-many detection (initially as part of #356) but it doesn't need a separate ticket.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",322787470,inspect() should detect many-to-many relationships, https://github.com/simonw/datasette/issues/355#issuecomment-495058622,https://api.github.com/repos/simonw/datasette/issues/355,495058622,MDEyOklzc3VlQ29tbWVudDQ5NTA1ODYyMg==,9599,simonw,2019-05-23T03:58:58Z,2019-05-23T03:58:58Z,OWNER,"So the design I have so far is: `?_m2m_linktablename__linktablecolumn=value` I'm concerned that this doesn't take tables or columns with `__` in their name into account. Does that matter? Could I support this without them?","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",346027040,Table view should support filtering via many-to-many relationships, https://github.com/simonw/datasette/issues/355#issuecomment-495058828,https://api.github.com/repos/simonw/datasette/issues/355,495058828,MDEyOklzc3VlQ29tbWVudDQ5NTA1ODgyOA==,9599,simonw,2019-05-23T04:00:27Z,2019-05-23T04:00:27Z,OWNER,"The alternative would be to use JSON: `?_m2m={""table"":""ad_targets"",""column"":""target_id"",""value"":""ec3ac""}`","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",346027040,Table view should support filtering via many-to-many relationships, https://github.com/simonw/datasette/issues/355#issuecomment-495058964,https://api.github.com/repos/simonw/datasette/issues/355,495058964,MDEyOklzc3VlQ29tbWVudDQ5NTA1ODk2NA==,9599,simonw,2019-05-23T04:01:17Z,2019-05-23T04:01:17Z,OWNER,"I think I like this better. I don't think `?_m2m=` is the correct name for it though. `?_through={""table"":""ad_targets"",""column"":""target_id"",""value"":""ec3ac""}` feels a little more accurate.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",346027040,Table view should support filtering via many-to-many relationships, https://github.com/simonw/datasette/issues/355#issuecomment-495059236,https://api.github.com/repos/simonw/datasette/issues/355,495059236,MDEyOklzc3VlQ29tbWVudDQ5NTA1OTIzNg==,9599,simonw,2019-05-23T04:03:04Z,2019-05-23T04:03:04Z,OWNER,"This assumes that our current table has a single, unambiguous foreign key relationship with the table indicated by the `?through=` parameter. I think that's reasonable. The JSON format could be extended to allow that side of the relationship to optionally be defined there (if the foreign key relationship is missing).","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",346027040,Table view should support filtering via many-to-many relationships, https://github.com/simonw/datasette/issues/355#issuecomment-495061686,https://api.github.com/repos/simonw/datasette/issues/355,495061686,MDEyOklzc3VlQ29tbWVudDQ5NTA2MTY4Ng==,9599,simonw,2019-05-23T04:21:00Z,2019-05-23T04:21:00Z,OWNER,"Filtering through one table already works - you need to know that table's primary key, then you do `?column_id=pk` against the first table. Filtering through a m2m table will be handled by the new `?_through=` parameter. I'm going to leave out filtering through a second levels of joins for the moment. Potentially that could be added later as some extra complicated JSON.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",346027040,Table view should support filtering via many-to-many relationships, https://github.com/simonw/datasette/issues/484#issuecomment-495068273,https://api.github.com/repos/simonw/datasette/issues/484,495068273,MDEyOklzc3VlQ29tbWVudDQ5NTA2ODI3Mw==,9599,simonw,2019-05-23T05:03:48Z,2019-05-23T05:04:35Z,OWNER,"Ideally we would display a limited number of m2m related records with a ""..."" if there are more than our limit. I could also show a count of the total number of records, but this would have to be agressively time-limited or it could cause extremely poor performance. This could be implemented as a SQL query for every displayed row, taking advantage of [Many Small Queries Are Efficient In SQLite](https://sqlite.org/np1queryprob.html). Provided that SQL runs against an index this should be fast to display even on a table with hundreds of rows.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",447451492,Mechanism for displaying summary of m2m relationships in rows on table view, https://github.com/simonw/datasette/issues/355#issuecomment-495077443,https://api.github.com/repos/simonw/datasette/issues/355,495077443,MDEyOklzc3VlQ29tbWVudDQ5NTA3NzQ0Mw==,9599,simonw,2019-05-23T05:52:52Z,2019-05-23T05:52:52Z,OWNER,Documentation here: https://datasette.readthedocs.io/en/latest/json_api.html#special-table-arguments,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",346027040,Table view should support filtering via many-to-many relationships, https://github.com/simonw/datasette/issues/355#issuecomment-495077528,https://api.github.com/repos/simonw/datasette/issues/355,495077528,MDEyOklzc3VlQ29tbWVudDQ5NTA3NzUyOA==,9599,simonw,2019-05-23T05:53:20Z,2019-05-23T05:53:20Z,OWNER,"Demo: https://latest.datasette.io/fixtures/roadside_attractions?_through={%22table%22:%22roadside_attraction_characteristics%22,%22column%22:%22characteristic_id%22,%22value%22:%221%22}","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",346027040,Table view should support filtering via many-to-many relationships, https://github.com/simonw/datasette/issues/355#issuecomment-495079393,https://api.github.com/repos/simonw/datasette/issues/355,495079393,MDEyOklzc3VlQ29tbWVudDQ5NTA3OTM5Mw==,9599,simonw,2019-05-23T06:02:11Z,2019-05-23T06:02:11Z,OWNER,"I'm re-opening because we need a UI mechanism for deselecting this: ","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",346027040,Table view should support filtering via many-to-many relationships, https://github.com/simonw/datasette/issues/355#issuecomment-495079705,https://api.github.com/repos/simonw/datasette/issues/355,495079705,MDEyOklzc3VlQ29tbWVudDQ5NTA3OTcwNQ==,9599,simonw,2019-05-23T06:03:40Z,2019-05-23T06:04:03Z,OWNER,"I think an approach similar to how `?_where=` works would do the job here. Can address this feedback from @psychemedia while I'm at it: https://github.com/simonw/datasette/issues/429#issuecomment-483202658 ![image](https://user-images.githubusercontent.com/82988/56127017-2bf78e80-5f74-11e9-9120-9393eb5d4988.png)","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",346027040,Table view should support filtering via many-to-many relationships, https://github.com/simonw/datasette/issues/483#issuecomment-495080390,https://api.github.com/repos/simonw/datasette/issues/483,495080390,MDEyOklzc3VlQ29tbWVudDQ5NTA4MDM5MA==,9599,simonw,2019-05-23T06:06:53Z,2019-05-23T06:06:53Z,OWNER,"Yes there's definitely something exciting to be done with location facets. The easiest one would be a radius-distance-from-a-point facet (5km, 10km etc). A more sophisticated thing might be possible on top of GeoJSON and SpatiaLite - that's probably something I would put in a plugin rather than shipping in Datasette itself.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",447408527,Option to facet by date using month or year, https://github.com/simonw/datasette/issues/483#issuecomment-495080591,https://api.github.com/repos/simonw/datasette/issues/483,495080591,MDEyOklzc3VlQ29tbWVudDQ5NTA4MDU5MQ==,9599,simonw,2019-05-23T06:07:53Z,2019-05-23T06:09:05Z,OWNER,"As far as URL design goes... I'm going to stick with `?_facet_date=` for this and use the not-yet-fully-baked JSON alternative syntax. Probably something like this: ?_facet_date={""column"":""created"",""interval"":""month""} Where interval can be day (the default), month or year. And maybe week? Not sure about that. Still not sure what/if I should do about exposing these options in the UI.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",447408527,Option to facet by date using month or year, https://github.com/simonw/datasette/issues/485#issuecomment-495083670,https://api.github.com/repos/simonw/datasette/issues/485,495083670,MDEyOklzc3VlQ29tbWVudDQ5NTA4MzY3MA==,9599,simonw,2019-05-23T06:21:52Z,2019-05-23T06:22:36Z,OWNER,"If a table has more than two columns we could do a betterl job at guessing the label column. A few potential tricks: * look for a column called name or title * look for the first column of type text * check for the text column with the most diversity in values","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",447469253,Improvements to table label detection , https://github.com/simonw/datasette/issues/485#issuecomment-495085021,https://api.github.com/repos/simonw/datasette/issues/485,495085021,MDEyOklzc3VlQ29tbWVudDQ5NTA4NTAyMQ==,9599,simonw,2019-05-23T06:27:57Z,2019-05-26T23:15:51Z,OWNER,"I could attempt to calculate the statistics needed for this in a time limited SQL query something like this one: https://latest.datasette.io/fixtures?sql=select+%27name%27+as+column%2C+count+%28distinct+name%29+as+count_distinct%2C+avg%28length%28name%29%29+as+avg_length+from+roadside_attractions%0D%0A++union%0D%0Aselect+%27address%27+as+column%2C+count%28distinct+address%29+as+count_distinct%2C+avg%28length%28address%29%29+as+avg_length+from+roadside_attractions ``` select 'name' as column, count (distinct name) as count_distinct, avg(length(name)) as avg_length from roadside_attractions union select 'address' as column, count(distinct address) as count_distinct, avg(length(address)) as avg_length from roadside_attractions ```","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",447469253,Improvements to table label detection , https://github.com/simonw/datasette/issues/486#issuecomment-495659567,https://api.github.com/repos/simonw/datasette/issues/486,495659567,MDEyOklzc3VlQ29tbWVudDQ5NTY1OTU2Nw==,9599,simonw,2019-05-24T14:41:45Z,2019-05-24T14:41:45Z,OWNER,"I'm really keen to offer this as a plugin hook once I have Datasette working on ASGI - #272 I'll hopefully have that working in the next few weeks, but in the meantime there are a couple of tricks you can use: - you can add static HTML files (no templates though) using the static route configuration options - you can link to external hosted pages using the `about_url` metadata option - you can add information to an existing page with a custom template. I do that here for example: https://russian-ira-facebook-ads.datasettes.com/","{""total_count"": 1, ""+1"": 1, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",448189298,Ability to add extra routes and related templates, https://github.com/simonw/datasette/issues/486#issuecomment-495660184,https://api.github.com/repos/simonw/datasette/issues/486,495660184,MDEyOklzc3VlQ29tbWVudDQ5NTY2MDE4NA==,9599,simonw,2019-05-24T14:43:09Z,2019-05-24T14:43:09Z,OWNER,Closing this as a dupe of #215 ,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",448189298,Ability to add extra routes and related templates, https://github.com/simonw/sqlite-utils/issues/18#issuecomment-495818358,https://api.github.com/repos/simonw/sqlite-utils/issues/18,495818358,MDEyOklzc3VlQ29tbWVudDQ5NTgxODM1OA==,9599,simonw,2019-05-25T00:16:50Z,2019-05-25T00:16:50Z,OWNER,"Oh that's really interesting - yeah I think having an option to insert if missing, skip otherwise would be a good feature.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",413871266,.insert/.upsert/.insert_all/.upsert_all should add missing columns, https://github.com/simonw/sqlite-utils/issues/21#issuecomment-495818615,https://api.github.com/repos/simonw/sqlite-utils/issues/21,495818615,MDEyOklzc3VlQ29tbWVudDQ5NTgxODYxNQ==,9599,simonw,2019-05-25T00:19:04Z,2019-05-25T00:19:04Z,OWNER,I think this can be implemented as an `--ignore` option to `sqlite-utils insert`,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",448391492,Option to ignore inserts if primary key exists already, https://github.com/simonw/sqlite-utils/issues/20#issuecomment-495818849,https://api.github.com/repos/simonw/sqlite-utils/issues/20,495818849,MDEyOklzc3VlQ29tbWVudDQ5NTgxODg0OQ==,9599,simonw,2019-05-25T00:21:04Z,2019-05-25T00:21:04Z,OWNER,I'm going to implement this using a new `--json-cols` option which works by detecting any string value that starts with [ or { and is valid JSON.,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",432727685,JSON column values get extraneously quoted , https://github.com/simonw/sqlite-utils/issues/22#issuecomment-495823173,https://api.github.com/repos/simonw/sqlite-utils/issues/22,495823173,MDEyOklzc3VlQ29tbWVudDQ5NTgyMzE3Mw==,9599,simonw,2019-05-25T01:17:55Z,2019-05-25T01:17:55Z,OWNER,Shipped 1.0 to PyPI!,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",448395665,Release notes for 1.0, https://github.com/simonw/sqlite-utils/issues/22#issuecomment-495823219,https://api.github.com/repos/simonw/sqlite-utils/issues/22,495823219,MDEyOklzc3VlQ29tbWVudDQ5NTgyMzIxOQ==,9599,simonw,2019-05-25T01:18:27Z,2019-05-25T01:18:27Z,OWNER,https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v1-0,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",448395665,Release notes for 1.0, https://github.com/simonw/datasette/pull/365#issuecomment-495929743,https://api.github.com/repos/simonw/datasette/issues/365,495929743,MDEyOklzc3VlQ29tbWVudDQ5NTkyOTc0Mw==,9599,simonw,2019-05-25T16:09:54Z,2019-05-25T16:09:54Z,OWNER,That last commit referenced the wrong ticket - it was meant to reference #356,"{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",361764460,fix small doc typo, https://github.com/simonw/datasette/issues/356#issuecomment-495931078,https://api.github.com/repos/simonw/datasette/issues/356,495931078,MDEyOklzc3VlQ29tbWVudDQ5NTkzMTA3OA==,9599,simonw,2019-05-25T16:30:09Z,2019-05-25T16:30:09Z,OWNER,"Implemented in https://github.com/simonw/datasette/commit/d923d847545e829bf946bb9170bebfc7c3f9d993 Documentation here: https://datasette.readthedocs.io/en/latest/facets.html#facet-by-many-to-many Demo here: https://latest.datasette.io/fixtures/roadside_attractions","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",346028655,Ability to display facet counts for many-to-many relationships, https://github.com/simonw/datasette/issues/356#issuecomment-495931140,https://api.github.com/repos/simonw/datasette/issues/356,495931140,MDEyOklzc3VlQ29tbWVudDQ5NTkzMTE0MA==,9599,simonw,2019-05-25T16:30:59Z,2019-05-25T16:30:59Z,OWNER,"I went with a much more simple URL scheme: `?_facet_m2m=destination_table` (it then figures out which the middle table is by looking at the foreign keys). This can be extended to be more complicated in the future if needed.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",346028655,Ability to display facet counts for many-to-many relationships, https://github.com/simonw/datasette/issues/485#issuecomment-496038601,https://api.github.com/repos/simonw/datasette/issues/485,496038601,MDEyOklzc3VlQ29tbWVudDQ5NjAzODYwMQ==,9599,simonw,2019-05-26T23:08:41Z,2019-05-26T23:08:41Z,OWNER,"The code currently assumes the primary key is called ""id"" or ""pk"" - improving it to detect the primary key using database introspection should work much better.","{""total_count"": 0, ""+1"": 0, ""-1"": 0, ""laugh"": 0, ""hooray"": 0, ""confused"": 0, ""heart"": 0, ""rocket"": 0, ""eyes"": 0}",447469253,Improvements to table label detection ,