{"html_url": "https://github.com/simonw/datasette/issues/93#issuecomment-344424382", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/93", "id": 344424382, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQyNDM4Mg==", "user": {"value": 67420, "label": "atomotic"}, "created_at": "2017-11-14T22:42:16Z", "updated_at": "2017-11-14T22:42:16Z", "author_association": "NONE", "body": "tried quickly, this seems working:\r\n\r\n```\r\n~ pip3 install pyinstaller\r\n~ pyinstaller -F --add-data /usr/local/lib/python3.6/site-packages/datasette/templates:datasette/templates --add-data /usr/local/lib/python3.6/site-packages/datasette/static:datasette/static /usr/local/bin/datasette\r\n\r\n~ du -h dist/datasette\r\n6.8M\tdist/datasette\r\n~ file dist/datasette\r\ndist/datasette: Mach-O 64-bit executable x86_64\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273944952, "label": "Package as standalone binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/93#issuecomment-344430299", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/93", "id": 344430299, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDQzMDI5OQ==", "user": {"value": 67420, "label": "atomotic"}, "created_at": "2017-11-14T23:06:33Z", "updated_at": "2017-11-14T23:06:33Z", "author_association": "NONE", "body": "i will look better tomorrow, it's late i surely made some mistake\r\nhttps://asciinema.org/a/ZyAWbetrlriDadwWyVPUWB94H", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273944952, "label": "Package as standalone binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/93#issuecomment-344516406", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/93", "id": 
344516406, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDUxNjQwNg==", "user": {"value": 67420, "label": "atomotic"}, "created_at": "2017-11-15T08:09:41Z", "updated_at": "2017-11-15T08:09:41Z", "author_association": "NONE", "body": "actually you can use travis to build for linux/macos and [appveyor](https://www.appveyor.com/) to build for windows.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273944952, "label": "Package as standalone binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/101#issuecomment-344597274", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/101", "id": 344597274, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDU5NzI3NA==", "user": {"value": 450244, "label": "eaubin"}, "created_at": "2017-11-15T13:48:55Z", "updated_at": "2017-11-15T13:48:55Z", "author_association": "NONE", "body": "This is a duplicate of https://github.com/simonw/datasette/issues/100", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 274161964, "label": "TemplateAssertionError: no filter named 'tojson'"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/100#issuecomment-344864254", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/100", "id": 344864254, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDg2NDI1NA==", "user": {"value": 13304454, "label": "coisnepe"}, "created_at": "2017-11-16T09:25:10Z", "updated_at": "2017-11-16T09:25:10Z", "author_association": "NONE", "body": "@simonw I see. I upgraded sanic-jinja2 and jinja2: it now works flawlessly. 
Thank you!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 274160723, "label": "TemplateAssertionError: no filter named 'tojson'"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/97#issuecomment-345509500", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/97", "id": 345509500, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NTUwOTUwMA==", "user": {"value": 231923, "label": "yschimke"}, "created_at": "2017-11-19T11:26:58Z", "updated_at": "2017-11-19T11:26:58Z", "author_association": "NONE", "body": "Specifically docs should make it clearer this file exists\r\n\r\nhttps://parlgov.datasettes.com/.json\r\n\r\nAnd from that you can build https://parlgov.datasettes.com/parlgov-25f9855.json\r\n\r\nThen https://parlgov.datasettes.com/parlgov-25f9855/cabinet.json", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 274022950, "label": "Link to JSON for the list of tables "}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/144#issuecomment-346427794", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/144", "id": 346427794, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NjQyNzc5NA==", "user": {"value": 649467, "label": "mhalle"}, "created_at": "2017-11-22T17:55:45Z", "updated_at": "2017-11-22T17:55:45Z", "author_association": "NONE", "body": "Thanks. There is a way to use pip to grab apsw, which also let's you configure it (flags to build extensions, use an internal sqlite, etc). 
Don't know how that works as a dependency for another package, though.\n\nOn November 22, 2017 11:38:06 AM EST, Simon Willison wrote:\n>I have a solution for FTS already, but I'm interested in apsw as a\n>mechanism for allowing custom virtual tables to be written in Python\n>(pysqlite only lets you write custom functions)\n>\n>Not having PyPI support is pretty tough though. I'm planning a\n>plugin/extension system which would be ideal for things like an\n>optional apsw mode, but that's a lot harder if apsw isn't in PyPI.\n>\n>-- \n>You are receiving this because you authored the thread.\n>Reply to this email directly or view it on GitHub:\n>https://github.com/simonw/datasette/issues/144#issuecomment-346405660\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 276091279, "label": "apsw as alternative sqlite3 binding (for full text search)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/141#issuecomment-346974336", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/141", "id": 346974336, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Njk3NDMzNg==", "user": {"value": 50138, "label": "janimo"}, "created_at": "2017-11-26T00:00:35Z", "updated_at": "2017-11-26T00:00:35Z", "author_association": "NONE", "body": "FWIW I worked around this by setting TMPDIR to ~/tmp before running the command.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 275814941, "label": "datasette publish can fail if /tmp is on a different device"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/124#issuecomment-346987395", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/124", "id": 346987395, "node_id": "MDEyOklzc3VlQ29tbWVudDM0Njk4NzM5NQ==", "user": 
{"value": 50138, "label": "janimo"}, "created_at": "2017-11-26T06:24:08Z", "updated_at": "2017-11-26T06:24:08Z", "author_association": "NONE", "body": "Are there performance gains when using immutable as opposed to read-only? From what I see other processes can still modify the DB when immutable, but there are no change notifications.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 275125805, "label": "Option to open readonly but not immutable"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/124#issuecomment-347123991", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/124", "id": 347123991, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NzEyMzk5MQ==", "user": {"value": 50138, "label": "janimo"}, "created_at": "2017-11-27T09:25:15Z", "updated_at": "2017-11-27T09:25:15Z", "author_association": "NONE", "body": "That's the only reference to immutable I saw as well, making me think that there may be no perceivable advantages over simply using mode=ro. 
Since the database is never or seldom updated the change notifications should not impact performance.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 275125805, "label": "Option to open readonly but not immutable"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/155#issuecomment-347714314", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/155", "id": 347714314, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NzcxNDMxNA==", "user": {"value": 388154, "label": "wsxiaoys"}, "created_at": "2017-11-29T00:46:25Z", "updated_at": "2017-11-29T00:46:25Z", "author_association": "NONE", "body": "```\r\nCREATE TABLE rhs (\r\n id INTEGER PRIMARY KEY,\r\n name TEXT\r\n);\r\n\r\nCREATE TABLE lhs (\r\n symbol INTEGER PRIMARY KEY,\r\n FOREIGN KEY (symbol) REFERENCES rhs(id)\r\n);\r\n\r\nINSERT INTO rhs VALUES (1, \"foo\");\r\nINSERT INTO rhs VALUES (2, \"bar\");\r\nINSERT INTO lhs VALUES (1);\r\nINSERT INTO lhs VALUES (2);\r\n```\r\n\r\nIt's expected that in lhs's view, foo / bar should be displayed.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 277589569, "label": "A primary key column that has foreign key restriction associated won't rendering label column"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/153#issuecomment-348252037", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/153", "id": 348252037, "node_id": "MDEyOklzc3VlQ29tbWVudDM0ODI1MjAzNw==", "user": {"value": 20264, "label": "ftrain"}, "created_at": "2017-11-30T16:59:00Z", "updated_at": "2017-11-30T16:59:00Z", "author_association": "NONE", "body": "WOW!\n\n--\nPaul Ford // (646) 369-7128 // @ftrain\n\nOn Thu, Nov 30, 2017 at 11:47 AM, Simon Willison \nwrote:\n\n> 
Remaining work on this now lives in a milestone:\n> https://github.com/simonw/datasette/milestone/6\n>\n> \u2014\n> You are receiving this because you were mentioned.\n> Reply to this email directly, view it on GitHub\n> ,\n> or mute the thread\n> \n> .\n>\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 276842536, "label": "Ability to customize presentation of specific columns in HTML view"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/161#issuecomment-350108113", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/161", "id": 350108113, "node_id": "MDEyOklzc3VlQ29tbWVudDM1MDEwODExMw==", "user": {"value": 388154, "label": "wsxiaoys"}, "created_at": "2017-12-07T22:02:24Z", "updated_at": "2017-12-07T22:02:24Z", "author_association": "NONE", "body": "It's not throwing the validation error anymore, but i still cannot run following with query:\r\n```\r\nWITH RECURSIVE cnt(x) AS (SELECT 1 UNION ALL SELECT x+1 FROM cnt LIMIT 10) SELECT x FROM cnt;\r\n```\r\n\r\nI got `near \"WITH\": syntax error`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 278814220, "label": "Support WITH query "}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/161#issuecomment-350182904", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/161", "id": 350182904, "node_id": "MDEyOklzc3VlQ29tbWVudDM1MDE4MjkwNA==", "user": {"value": 388154, "label": "wsxiaoys"}, "created_at": "2017-12-08T06:18:12Z", "updated_at": "2017-12-08T06:18:12Z", "author_association": "NONE", "body": "You're right..got this resolved after upgrading the sqlite version.\r\n\r\nThanks you!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 
0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 278814220, "label": "Support WITH query "}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/120#issuecomment-355487646", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/120", "id": 355487646, "node_id": "MDEyOklzc3VlQ29tbWVudDM1NTQ4NzY0Ng==", "user": {"value": 723567, "label": "nickdirienzo"}, "created_at": "2018-01-05T07:10:12Z", "updated_at": "2018-01-05T07:10:12Z", "author_association": "NONE", "body": "Ah, glad I found this issue. I have private data that I'd like to share to a few different people. Personally, a shared username and password would be sufficient for me, more-or-less Basic Auth. Do you have more complex requirements in mind?\r\n\r\nI'm not sure if \"plugin\" means \"build a plugin\" or \"find a plugin\" or something else entirely. FWIW, I stumbled upon [sanic-auth](https://github.com/pyx/sanic-auth) which looks like a new project to bring some interfaces around auth to sanic, similar to Flask.\r\n\r\nAlternatively, it shouldn't be too bad to add in Basic Auth. 
If we went down that route, that would probably be best built as a separate package for sanic that `datasette` brings in.\r\n\r\nWhat are your thoughts around this?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 275087397, "label": "Plugin that adds an authentication layer of some sort"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/176#issuecomment-356115657", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/176", "id": 356115657, "node_id": "MDEyOklzc3VlQ29tbWVudDM1NjExNTY1Nw==", "user": {"value": 4313116, "label": "wulfmann"}, "created_at": "2018-01-08T22:22:32Z", "updated_at": "2018-01-08T22:22:32Z", "author_association": "NONE", "body": "This project probably would not be the place for that. This is a layer for sqllite specifically. It solves a similar problem as graphql, so adding that here wouldn't make sense.\r\n\r\nHere's an example i found from google that uses micro to run a graphql microservice. you'd just then need to connect your db.\r\nhttps://github.com/timneutkens/micro-graphql", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 285168503, "label": "Add GraphQL endpoint"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/176#issuecomment-356161672", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/176", "id": 356161672, "node_id": "MDEyOklzc3VlQ29tbWVudDM1NjE2MTY3Mg==", "user": {"value": 173848, "label": "yozlet"}, "created_at": "2018-01-09T02:35:35Z", "updated_at": "2018-01-09T02:35:35Z", "author_association": "NONE", "body": "@wulfmann I think I disagree, except I'm not entirely sure what you mean by that first paragraph. 
The JSON API that Datasette currently exposes is quite different to GraphQL.\r\n\r\nFurthermore, there's no \"just\" about connecting micro-graphql to a DB; at least, no more \"just\" than adding any other API. You still need to configure the schema, which is exactly the kind of thing that Datasette does for JSON API. This is why I think that GraphQL's a good fit here.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 285168503, "label": "Add GraphQL endpoint"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/176#issuecomment-356175667", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/176", "id": 356175667, "node_id": "MDEyOklzc3VlQ29tbWVudDM1NjE3NTY2Nw==", "user": {"value": 4313116, "label": "wulfmann"}, "created_at": "2018-01-09T04:19:03Z", "updated_at": "2018-01-09T04:19:03Z", "author_association": "NONE", "body": "@yozlet Yes I think that I was confused when I posted my original comment. I see your main point now and am in agreement.\r\n\r\n", "reactions": "{\"total_count\": 2, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 2, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 285168503, "label": "Add GraphQL endpoint"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/176#issuecomment-359697938", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/176", "id": 359697938, "node_id": "MDEyOklzc3VlQ29tbWVudDM1OTY5NzkzOA==", "user": {"value": 7193, "label": "gijs"}, "created_at": "2018-01-23T07:17:56Z", "updated_at": "2018-01-23T07:17:56Z", "author_association": "NONE", "body": "\ud83d\udc4d I'd like this too! 
", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 285168503, "label": "Add GraphQL endpoint"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/176#issuecomment-368625350", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/176", "id": 368625350, "node_id": "MDEyOklzc3VlQ29tbWVudDM2ODYyNTM1MA==", "user": {"value": 7431774, "label": "wuhland"}, "created_at": "2018-02-26T19:44:11Z", "updated_at": "2018-02-26T19:44:11Z", "author_association": "NONE", "body": "great idea!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 285168503, "label": "Add GraphQL endpoint"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/185#issuecomment-370461231", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/185", "id": 370461231, "node_id": "MDEyOklzc3VlQ29tbWVudDM3MDQ2MTIzMQ==", "user": {"value": 222245, "label": "carlmjohnson"}, "created_at": "2018-03-05T15:43:56Z", "updated_at": "2018-03-05T15:44:27Z", "author_association": "NONE", "body": "Yes. 
I think the simplest implementation is to change lines like\r\n\r\n```python\r\n metadata = self.ds.metadata.get('databases', {}).get(name, {})\r\n```\r\n\r\nto\r\n\r\n```python\r\nmetadata = {\r\n **self.ds.metadata,\r\n **self.ds.metadata.get('databases', {}).get(name, {}),\r\n}\r\n```\r\n\r\nso that specified inner values overwrite outer values, but only if they exist.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 299760684, "label": "Metadata should be a nested arbitrary KV store"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/186#issuecomment-374872202", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/186", "id": 374872202, "node_id": "MDEyOklzc3VlQ29tbWVudDM3NDg3MjIwMg==", "user": {"value": 47107, "label": "stefanocudini"}, "created_at": "2018-03-21T09:07:22Z", "updated_at": "2018-03-21T09:07:22Z", "author_association": "NONE", "body": "--debug is perfect tnk", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 306811513, "label": "proposal new option to disable user agents cache"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/185#issuecomment-376590265", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/185", "id": 376590265, "node_id": "MDEyOklzc3VlQ29tbWVudDM3NjU5MDI2NQ==", "user": {"value": 222245, "label": "carlmjohnson"}, "created_at": "2018-03-27T16:32:51Z", "updated_at": "2018-03-27T16:32:51Z", "author_association": "NONE", "body": ">I think the templates themselves should be able to indicate if they want the inherited values or not. 
That way we could support arbitrary key/values and avoid the application code having special knowledge of license_url etc.\r\n\r\nYes, you could have `metadata` that works like `metadata` does currently and `inherited_metadata` that works with inheritance.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 299760684, "label": "Metadata should be a nested arbitrary KV store"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/185#issuecomment-376592044", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/185", "id": 376592044, "node_id": "MDEyOklzc3VlQ29tbWVudDM3NjU5MjA0NA==", "user": {"value": 222245, "label": "carlmjohnson"}, "created_at": "2018-03-27T16:38:23Z", "updated_at": "2018-03-27T16:38:23Z", "author_association": "NONE", "body": "It would be nice to also allow arbitrary keys (maybe under a parent key called params or something to prevent conflicts). 
For our datasette project, we just have a bunch of dictionaries defined in the base template for things like site URL and column humanized names: https://github.com/baltimore-sun-data/salaries-datasette/blob/master/templates/base.html It would be cleaner if this were in the metadata.json.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 299760684, "label": "Metadata should be a nested arbitrary KV store"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/185#issuecomment-376614973", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/185", "id": 376614973, "node_id": "MDEyOklzc3VlQ29tbWVudDM3NjYxNDk3Mw==", "user": {"value": 222245, "label": "carlmjohnson"}, "created_at": "2018-03-27T17:49:00Z", "updated_at": "2018-03-27T17:49:00Z", "author_association": "NONE", "body": "@simonw Other than metadata, the biggest item on wishlist for the salaries project was the ability to reorder by column. Of course, that could be done with a custom SQL query, but we didn't want to have to reimplement all the nav/pagination stuff from scratch. 
\r\n\r\n@carolinp, feel free to add your thoughts.\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 299760684, "label": "Metadata should be a nested arbitrary KV store"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/181#issuecomment-378297842", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/181", "id": 378297842, "node_id": "MDEyOklzc3VlQ29tbWVudDM3ODI5Nzg0Mg==", "user": {"value": 1957344, "label": "bsmithgall"}, "created_at": "2018-04-03T15:47:13Z", "updated_at": "2018-04-03T15:47:13Z", "author_association": "NONE", "body": "I can work on that -- would you prefer to inline a `display: hidden` and then have the javascript flip the visibility or include it as css?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 289425975, "label": "add \"format sql\" button to query page, uses sql-formatter"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/193#issuecomment-379142500", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/193", "id": 379142500, "node_id": "MDEyOklzc3VlQ29tbWVudDM3OTE0MjUwMA==", "user": {"value": 222245, "label": "carlmjohnson"}, "created_at": "2018-04-06T04:05:58Z", "updated_at": "2018-04-06T04:05:58Z", "author_association": "NONE", "body": "You could try pulling out a validate query strings method. If it fails validation build the error object from the message. If it passes, you only need to go down a happy path. 
", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 310882100, "label": "Cleaner mechanism for handling custom errors"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/181#issuecomment-379759875", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/181", "id": 379759875, "node_id": "MDEyOklzc3VlQ29tbWVudDM3OTc1OTg3NQ==", "user": {"value": 1957344, "label": "bsmithgall"}, "created_at": "2018-04-09T13:53:14Z", "updated_at": "2018-04-09T13:53:14Z", "author_association": "NONE", "body": "I've implemented that approach in 86ac746. It does cause the button to pop in only after Codemirror is finished rendering which is a bit awkward.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 289425975, "label": "add \"format sql\" button to query page, uses sql-formatter"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/184#issuecomment-379788103", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/184", "id": 379788103, "node_id": "MDEyOklzc3VlQ29tbWVudDM3OTc4ODEwMw==", "user": {"value": 222245, "label": "carlmjohnson"}, "created_at": "2018-04-09T15:15:11Z", "updated_at": "2018-04-09T15:15:11Z", "author_association": "NONE", "body": "Visit https://salaries.news.baltimoresun.com/salaries/bad-table.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 292011379, "label": "500 from missing table name"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/189#issuecomment-379791047", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/189", "id": 379791047, "node_id": 
"MDEyOklzc3VlQ29tbWVudDM3OTc5MTA0Nw==", "user": {"value": 222245, "label": "carlmjohnson"}, "created_at": "2018-04-09T15:23:45Z", "updated_at": "2018-04-09T15:23:45Z", "author_association": "NONE", "body": "Awesome!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 309471814, "label": "Ability to sort (and paginate) by column"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/189#issuecomment-381429213", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/189", "id": 381429213, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MTQyOTIxMw==", "user": {"value": 222245, "label": "carlmjohnson"}, "created_at": "2018-04-15T18:54:22Z", "updated_at": "2018-04-15T18:54:22Z", "author_association": "NONE", "body": "I think I found a bug. I tried to sort by middle initial in my salaries set, and many middle initials are null. The next_url gets set by Datasette to:\r\n\r\nhttp://localhost:8001/salaries-d3a5631/2017+Maryland+state+salaries?_next=None%2C391&_sort=middle_initial\r\n\r\nBut then `None` is interpreted literally and it tries to find a name with the middle initial \"None\" and ends up skipping ahead to O on page 2.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 309471814, "label": "Ability to sort (and paginate) by column"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/191#issuecomment-381602005", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/191", "id": 381602005, "node_id": "MDEyOklzc3VlQ29tbWVudDM4MTYwMjAwNQ==", "user": {"value": 119974, "label": "coleifer"}, "created_at": "2018-04-16T13:37:32Z", "updated_at": "2018-04-16T13:37:32Z", "author_association": "NONE", "body": "I don't think it should be too difficult... 
you can look at what @ghaering did with pysqlite (and similarly what I copied for pysqlite3). You would theoretically take an amalgamation build of Sqlite (all code in a single .c and .h file). The `AmalgamationLibSqliteBuilder` class detects the presence of this amalgamated source file and builds a statically-linked pysqlite.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 310533258, "label": "Figure out how to bundle a more up-to-date SQLite"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/254#issuecomment-388367027", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/254", "id": 388367027, "node_id": "MDEyOklzc3VlQ29tbWVudDM4ODM2NzAyNw==", "user": {"value": 247131, "label": "philroche"}, "created_at": "2018-05-11T13:41:46Z", "updated_at": "2018-05-11T13:41:46Z", "author_association": "NONE", "body": "An example deployment @ https://datasette-zkcvlwdrhl.now.sh/simplestreams-270f20c/cloudimage?content_id__exact=com.ubuntu.cloud%3Areleased%3Adownload\r\n\r\nIt is not causing errors, more of an inconvenience. I have worked around it using a `like` query instead. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322283067, "label": "Escaping named parameters in canned queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/258#issuecomment-390577711", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/258", "id": 390577711, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MDU3NzcxMQ==", "user": {"value": 247131, "label": "philroche"}, "created_at": "2018-05-21T07:38:15Z", "updated_at": "2018-05-21T07:38:15Z", "author_association": "NONE", "body": "Excellent, I was not aware of the auto redirect to the new hash. 
My bad\r\n\r\nThis solves my use case.\r\n\r\nI do agree that your suggested --no-url-hash approach is much neater. I will investigate ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 322741659, "label": "Add new metadata key persistent_urls which removes the hash from all database urls"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/247#issuecomment-390689406", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/247", "id": 390689406, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MDY4OTQwNg==", "user": {"value": 11912854, "label": "jsancho-gpl"}, "created_at": "2018-05-21T15:29:31Z", "updated_at": "2018-05-21T15:29:31Z", "author_association": "NONE", "body": "I've changed my mind about the way to support external connectors aside of SQLite and I'm working in a more simple style that respects the original Datasette, i.e. less refactoring. 
I present you [a version of Datasette wich supports other database connectors](https://github.com/jsancho-gpl/datasette/tree/external-connectors) and [a Datasette connector for HDF5/PyTables files](https://github.com/jsancho-gpl/datasette-pytables).", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 319449852, "label": "SQLite code decoupled from Datasette"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/191#issuecomment-392828475", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/191", "id": 392828475, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MjgyODQ3NQ==", "user": {"value": 119974, "label": "coleifer"}, "created_at": "2018-05-29T15:50:18Z", "updated_at": "2018-05-29T15:50:18Z", "author_association": "NONE", "body": "Python standard-library SQLite dynamically links against the system sqlite3. So presumably you installed a more up-to-date sqlite3 somewhere on your `LD_LIBRARY_PATH`.\r\n\r\nTo compile a statically-linked pysqlite you need to include an amalgamation in the project root when building the extension. Read the relevant setup.py.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 310533258, "label": "Figure out how to bundle a more up-to-date SQLite"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/265#issuecomment-392890045", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/265", "id": 392890045, "node_id": "MDEyOklzc3VlQ29tbWVudDM5Mjg5MDA0NQ==", "user": {"value": 231923, "label": "yschimke"}, "created_at": "2018-05-29T18:37:49Z", "updated_at": "2018-05-29T18:37:49Z", "author_association": "NONE", "body": "Just about to ask for this! 
Move this page https://github.com/simonw/datasette/wiki/Datasettes\r\n\r\ninto a datasette, with some concept of versioning as well.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 323677499, "label": "Add links to example Datasette instances to appropiate places in docs"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/97#issuecomment-392895733", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/97", "id": 392895733, "node_id": "MDEyOklzc3VlQ29tbWVudDM5Mjg5NTczMw==", "user": {"value": 231923, "label": "yschimke"}, "created_at": "2018-05-29T18:51:35Z", "updated_at": "2018-05-29T18:51:35Z", "author_association": "NONE", "body": "Do you have an existing example with views?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 274022950, "label": "Link to JSON for the list of tables "}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/316#issuecomment-398030903", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/316", "id": 398030903, "node_id": "MDEyOklzc3VlQ29tbWVudDM5ODAzMDkwMw==", "user": {"value": 132230, "label": "gavinband"}, "created_at": "2018-06-18T12:00:43Z", "updated_at": "2018-06-18T12:00:43Z", "author_association": "NONE", "body": "I should add that I'm using datasette version 0.22, Python 2.7.10 on Mac OS X. 
Happy to send more info if helpful.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 333238932, "label": "datasette inspect takes a very long time on large dbs"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/316#issuecomment-398109204", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/316", "id": 398109204, "node_id": "MDEyOklzc3VlQ29tbWVudDM5ODEwOTIwNA==", "user": {"value": 132230, "label": "gavinband"}, "created_at": "2018-06-18T16:12:45Z", "updated_at": "2018-06-18T16:12:45Z", "author_association": "NONE", "body": "Hi Simon,\r\nThanks for the response. Ok I'll try running `datasette inspect` up front.\r\nIn principle the db won't change. However, the site's in development and it's likely I'll need to add views and some auxiliary (smaller) tables as I go along. I will need to be careful with this if it involves an inspect step in each iteration, though.\r\ng.\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 333238932, "label": "datasette inspect takes a very long time on large dbs"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/188#issuecomment-398778485", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/188", "id": 398778485, "node_id": "MDEyOklzc3VlQ29tbWVudDM5ODc3ODQ4NQ==", "user": {"value": 12617395, "label": "bsilverm"}, "created_at": "2018-06-20T14:48:39Z", "updated_at": "2018-06-20T14:48:39Z", "author_association": "NONE", "body": "This would be a great feature to have!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 309047460, "label": "Ability to bundle metadata and 
templates inside the SQLite file"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/321#issuecomment-399098080", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/321", "id": 399098080, "node_id": "MDEyOklzc3VlQ29tbWVudDM5OTA5ODA4MA==", "user": {"value": 12617395, "label": "bsilverm"}, "created_at": "2018-06-21T13:10:48Z", "updated_at": "2018-06-21T13:10:48Z", "author_association": "NONE", "body": "Perfect, thank you!!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 334190959, "label": "Wildcard support in query parameters"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/321#issuecomment-399106871", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/321", "id": 399106871, "node_id": "MDEyOklzc3VlQ29tbWVudDM5OTEwNjg3MQ==", "user": {"value": 12617395, "label": "bsilverm"}, "created_at": "2018-06-21T13:39:37Z", "updated_at": "2018-06-21T13:39:37Z", "author_association": "NONE", "body": "One thing I've noticed with this approach is that the query is executed with no parameters which I do not believe was the case previously. 
In the case the table contains a lot of data, this adds some time executing the query before the user can enter their input and run it with the parameters they want.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 334190959, "label": "Wildcard support in query parameters"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/321#issuecomment-399129220", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/321", "id": 399129220, "node_id": "MDEyOklzc3VlQ29tbWVudDM5OTEyOTIyMA==", "user": {"value": 12617395, "label": "bsilverm"}, "created_at": "2018-06-21T14:45:02Z", "updated_at": "2018-06-21T14:45:02Z", "author_association": "NONE", "body": "Those queries look identical. How can this be prevented if the queries are in a metadata.json file?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 334190959, "label": "Wildcard support in query parameters"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/321#issuecomment-399173916", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/321", "id": 399173916, "node_id": "MDEyOklzc3VlQ29tbWVudDM5OTE3MzkxNg==", "user": {"value": 12617395, "label": "bsilverm"}, "created_at": "2018-06-21T17:00:10Z", "updated_at": "2018-06-21T17:00:10Z", "author_association": "NONE", "body": "Oh I see.. My issue is that the query executes with an empty string prior to the user submitting the parameters. I'll try adding your workaround to some of my queries. 
Thanks again,", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 334190959, "label": "Wildcard support in query parameters"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/272#issuecomment-400571521", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/272", "id": 400571521, "node_id": "MDEyOklzc3VlQ29tbWVudDQwMDU3MTUyMQ==", "user": {"value": 647359, "label": "tomchristie"}, "created_at": "2018-06-27T07:30:07Z", "updated_at": "2018-06-27T07:30:07Z", "author_association": "NONE", "body": "I\u2019m up for helping with this.\r\n\r\nLooks like you\u2019d need static files support, which I\u2019m planning on adding a component for. Anything else obviously missing?\r\n\r\nFor a quick overview it looks very doable - the test client ought to me your test cases stay roughly the same.\r\n\r\nAre you using any middleware or other components for the Sanic ecosystem? Do you use cookies or sessions at all?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324188953, "label": "Port Datasette to ASGI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/272#issuecomment-404514973", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/272", "id": 404514973, "node_id": "MDEyOklzc3VlQ29tbWVudDQwNDUxNDk3Mw==", "user": {"value": 647359, "label": "tomchristie"}, "created_at": "2018-07-12T13:38:24Z", "updated_at": "2018-07-12T13:38:24Z", "author_association": "NONE", "body": "Okay. 
I reckon the latest version should have all the kinds of components you'd need:\r\n\r\nRecently added ASGI components for Routing and Static Files support, as well as making a few tweaks to make sure requests and responses are instantiated efficiently.\r\n\r\nDon't have any redirect-to-slash / redirect-to-non-slash stuff out of the box yet, which it looks like you might miss.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324188953, "label": "Port Datasette to ASGI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/339#issuecomment-404576136", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/339", "id": 404576136, "node_id": "MDEyOklzc3VlQ29tbWVudDQwNDU3NjEzNg==", "user": {"value": 12617395, "label": "bsilverm"}, "created_at": "2018-07-12T16:45:08Z", "updated_at": "2018-07-12T16:45:08Z", "author_association": "NONE", "body": "Thanks for the quick reply. 
Looks like that is working well.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 340396247, "label": "Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/185#issuecomment-412663658", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/185", "id": 412663658, "node_id": "MDEyOklzc3VlQ29tbWVudDQxMjY2MzY1OA==", "user": {"value": 222245, "label": "carlmjohnson"}, "created_at": "2018-08-13T21:04:11Z", "updated_at": "2018-08-13T21:04:11Z", "author_association": "NONE", "body": "That seems good to me.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 299760684, "label": "Metadata should be a nested arbitrary KV store"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/267#issuecomment-414860009", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/267", "id": 414860009, "node_id": "MDEyOklzc3VlQ29tbWVudDQxNDg2MDAwOQ==", "user": {"value": 78156, "label": "annapowellsmith"}, "created_at": "2018-08-21T23:57:51Z", "updated_at": "2018-08-21T23:57:51Z", "author_association": "NONE", "body": "Looks to me like hashing, redirects and caching were documented as part of https://github.com/simonw/datasette/commit/788a542d3c739da5207db7d1fb91789603cdd336#diff-3021b0e065dce289c34c3b49b3952a07 - so perhaps this can be closed? 
:tada:", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 323716411, "label": "Documentation for URL hashing, redirects and cache policy"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/363#issuecomment-417684877", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/363", "id": 417684877, "node_id": "MDEyOklzc3VlQ29tbWVudDQxNzY4NDg3Nw==", "user": {"value": 436032, "label": "kevboh"}, "created_at": "2018-08-31T14:39:45Z", "updated_at": "2018-08-31T14:39:45Z", "author_association": "NONE", "body": "It looks like the check passed, not sure why it's showing as running in GH.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 355299310, "label": "Search all apps during heroku publish"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/272#issuecomment-418695115", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/272", "id": 418695115, "node_id": "MDEyOklzc3VlQ29tbWVudDQxODY5NTExNQ==", "user": {"value": 647359, "label": "tomchristie"}, "created_at": "2018-09-05T11:21:25Z", "updated_at": "2018-09-05T11:21:25Z", "author_association": "NONE", "body": "Some notes:\r\n\r\n* Starlette just got a bump to 0.3.0 - there's some renamings in there. It's got enough functionality now that you can treat it either as a framework or as a toolkit. Either way the component design is all just *here's an ASGI app* all the way through.\r\n* Uvicorn got a bump to 0.3.3 - Removed some cyclical references that were causing garbage collection to impact performance. Ought to be a decent speed bump.\r\n* Wrt. passing config - Either use a single envvar that points to a config, or use multiple envvars for the config. 
Uvicorn could get a flag to read a `.env` file, but I don't see ASGI itself having a specific interface there.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324188953, "label": "Port Datasette to ASGI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/293#issuecomment-420295524", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/293", "id": 420295524, "node_id": "MDEyOklzc3VlQ29tbWVudDQyMDI5NTUyNA==", "user": {"value": 11912854, "label": "jsancho-gpl"}, "created_at": "2018-09-11T14:32:45Z", "updated_at": "2018-09-11T14:32:45Z", "author_association": "NONE", "body": "I close this PR because it's better to use the new one #364 ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 326987229, "label": "Support for external database connectors"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/328#issuecomment-427261369", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/328", "id": 427261369, "node_id": "MDEyOklzc3VlQ29tbWVudDQyNzI2MTM2OQ==", "user": {"value": 13698964, "label": "chmaynard"}, "created_at": "2018-10-05T06:37:06Z", "updated_at": "2018-10-05T06:37:06Z", "author_association": "NONE", "body": "```\r\n~ $ docker pull datasetteproject/datasette\r\n~ $ docker run -p 8001:8001 -v `pwd`:/mnt datasetteproject/datasette datasette -p 8001 -h 0.0.0.0 /mnt/fixtures.db\r\nUsage: datasette -p [OPTIONS] [FILES]...\r\n\r\nError: Invalid value for \"files\": Path \"/mnt/fixtures.db\" does not exist.\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 336464733, "label": "Installation instructions, 
including how to use the docker image"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/187#issuecomment-427943710", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/187", "id": 427943710, "node_id": "MDEyOklzc3VlQ29tbWVudDQyNzk0MzcxMA==", "user": {"value": 1583271, "label": "progpow"}, "created_at": "2018-10-08T18:58:05Z", "updated_at": "2018-10-08T18:58:05Z", "author_association": "NONE", "body": "I have same error:\r\n```\r\nCollecting uvloop\r\n Using cached https://files.pythonhosted.org/packages/5c/37/6daa39aac42b2deda6ee77f408bec0419b600e27b89b374b0d440af32b10/uvloop-0.11.2.tar.gz\r\n Complete output from command python setup.py egg_info:\r\n Traceback (most recent call last):\r\n File \"\", line 1, in \r\n File \"C:\\Users\\sageev\\AppData\\Local\\Temp\\pip-install-bq64l8jy\\uvloop\\setup.py\", line 15, in \r\n raise RuntimeError('uvloop does not support Windows at the moment')\r\n RuntimeError: uvloop does not support Windows at the moment\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 309033998, "label": "Windows installation error"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/176#issuecomment-431867885", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/176", "id": 431867885, "node_id": "MDEyOklzc3VlQ29tbWVudDQzMTg2Nzg4NQ==", "user": {"value": 634572, "label": "eads"}, "created_at": "2018-10-22T15:24:57Z", "updated_at": "2018-10-22T15:24:57Z", "author_association": "NONE", "body": "I'd like this as well. It would let me access Datasette-driven projects from GatsbyJS the same way I can access Postgres DBs via Hasura. 
While I don't see SQLite replacing Postgres for the 50m row datasets I sometimes have to work with, there's a whole class of smaller datasets that are great with Datasette but currently would find another option.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 285168503, "label": "Add GraphQL endpoint"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/227#issuecomment-439194286", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/227", "id": 439194286, "node_id": "MDEyOklzc3VlQ29tbWVudDQzOTE5NDI4Ng==", "user": {"value": 222245, "label": "carlmjohnson"}, "created_at": "2018-11-15T21:20:37Z", "updated_at": "2018-11-15T21:20:37Z", "author_association": "NONE", "body": "I'm diving back into https://salaries.news.baltimoresun.com and what I really want is the ability to inject the request into my context.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 315960272, "label": "prepare_context() plugin hook"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/120#issuecomment-439421164", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/120", "id": 439421164, "node_id": "MDEyOklzc3VlQ29tbWVudDQzOTQyMTE2NA==", "user": {"value": 36796532, "label": "ad-si"}, "created_at": "2018-11-16T15:05:18Z", "updated_at": "2018-11-16T15:05:18Z", "author_association": "NONE", "body": "This would be an awesome feature \u2764\ufe0f ", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 275087397, "label": "Plugin that adds an authentication layer of some sort"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/simonw/datasette/issues/393#issuecomment-451415063", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/393", "id": 451415063, "node_id": "MDEyOklzc3VlQ29tbWVudDQ1MTQxNTA2Mw==", "user": {"value": 1727065, "label": "ltrgoddard"}, "created_at": "2019-01-04T11:04:08Z", "updated_at": "2019-01-04T11:04:08Z", "author_association": "NONE", "body": "Awesome - will get myself up and running on 0.26", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 395236066, "label": "CSV export in \"Advanced export\" pane doesn't respect query"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/401#issuecomment-455520561", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/401", "id": 455520561, "node_id": "MDEyOklzc3VlQ29tbWVudDQ1NTUyMDU2MQ==", "user": {"value": 1055831, "label": "dazzag24"}, "created_at": "2019-01-18T11:48:13Z", "updated_at": "2019-01-18T11:48:13Z", "author_association": "NONE", "body": "Thanks. I'll take a look at your changes.\r\nI must admit I was struggling to see how to pass info from the python code in __init__.py into the javascript document.addEventListener function.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 400229984, "label": "How to pass configuration to plugins?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/403#issuecomment-455752238", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/403", "id": 455752238, "node_id": "MDEyOklzc3VlQ29tbWVudDQ1NTc1MjIzOA==", "user": {"value": 1794527, "label": "ccorcos"}, "created_at": "2019-01-19T05:47:55Z", "updated_at": "2019-01-19T05:47:55Z", "author_association": "NONE", "body": "Ah. That makes much more sense. 
Interesting approach.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 400511206, "label": "How does persistence work?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/187#issuecomment-463917744", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/187", "id": 463917744, "node_id": "MDEyOklzc3VlQ29tbWVudDQ2MzkxNzc0NA==", "user": {"value": 4190962, "label": "phoenixjun"}, "created_at": "2019-02-15T05:58:44Z", "updated_at": "2019-02-15T05:58:44Z", "author_association": "NONE", "body": "is this supported or not? you can comment if it is not supported so that people like me can stop trying.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 309033998, "label": "Windows installation error"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/8#issuecomment-464341721", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/8", "id": 464341721, "node_id": "MDEyOklzc3VlQ29tbWVudDQ2NDM0MTcyMQ==", "user": {"value": 82988, "label": "psychemedia"}, "created_at": "2019-02-16T12:08:41Z", "updated_at": "2019-02-16T12:08:41Z", "author_association": "NONE", "body": "We also get an error if a column name contains a `.`", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 403922644, "label": "Problems handling column names containing spaces or - "}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/187#issuecomment-466325528", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/187", "id": 466325528, "node_id": "MDEyOklzc3VlQ29tbWVudDQ2NjMyNTUyOA==", "user": {"value": 
2892252, "label": "fkuhn"}, "created_at": "2019-02-22T09:03:50Z", "updated_at": "2019-02-22T09:03:50Z", "author_association": "NONE", "body": "I ran into the same issue when trying to install datasette on windows after successfully using it on linux. Unfortunately, there has not been any progress in implementing uvloop for windows - so I recommend not to use it on win. You can read about this issue here:\r\n[https://github.com/MagicStack/uvloop/issues/14](url)", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 309033998, "label": "Windows installation error"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/409#issuecomment-472844001", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/409", "id": 472844001, "node_id": "MDEyOklzc3VlQ29tbWVudDQ3Mjg0NDAwMQ==", "user": {"value": 43100, "label": "Uninen"}, "created_at": "2019-03-14T13:04:20Z", "updated_at": "2019-03-14T13:04:42Z", "author_association": "NONE", "body": "It seems this affects the Datasette Publish -site as well: https://github.com/simonw/datasette-publish-support/issues/3", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 408376825, "label": "Zeit API v1 does not work for new users - need to migrate to v2"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/409#issuecomment-472875713", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/409", "id": 472875713, "node_id": "MDEyOklzc3VlQ29tbWVudDQ3Mjg3NTcxMw==", "user": {"value": 209967, "label": "michaelmcandrew"}, "created_at": "2019-03-14T14:14:39Z", "updated_at": "2019-03-14T14:14:39Z", "author_association": "NONE", "body": "also linking this zeit issue in case it is helpful: 
https://github.com/zeit/now-examples/issues/163#issuecomment-440125769", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 408376825, "label": "Zeit API v1 does not work for new users - need to migrate to v2"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/415#issuecomment-473217334", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/415", "id": 473217334, "node_id": "MDEyOklzc3VlQ29tbWVudDQ3MzIxNzMzNA==", "user": {"value": 36796532, "label": "ad-si"}, "created_at": "2019-03-15T09:30:57Z", "updated_at": "2019-03-15T09:30:57Z", "author_association": "NONE", "body": "Awesome, thanks! \ud83d\ude01 ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 418329842, "label": "Add query parameter to hide SQL textarea"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/18#issuecomment-480621924", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/18", "id": 480621924, "node_id": "MDEyOklzc3VlQ29tbWVudDQ4MDYyMTkyNA==", "user": {"value": 82988, "label": "psychemedia"}, "created_at": "2019-04-07T19:31:42Z", "updated_at": "2019-04-07T19:31:42Z", "author_association": "NONE", "body": "I've just noticed that SQLite lets you IGNORE inserts that collide with a pre-existing key. This can be quite handy if you have a dataset that keeps changing in part, and you don't want to upsert and replace pre-existing PK rows but you do want to ignore collisions to existing PK rows.\r\n\r\nDo `sqlite_utils` support such (cavalier!) 
behaviour?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 413871266, "label": ".insert/.upsert/.insert_all/.upsert_all should add missing columns"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/8#issuecomment-482994231", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/8", "id": 482994231, "node_id": "MDEyOklzc3VlQ29tbWVudDQ4Mjk5NDIzMQ==", "user": {"value": 82988, "label": "psychemedia"}, "created_at": "2019-04-14T15:04:07Z", "updated_at": "2019-04-14T15:29:33Z", "author_association": "NONE", "body": "\r\n\r\nPLEASE IGNORE THE BELOW... I did a package update and rebuilt the kernel I was working in... may just have been an old version of sqlite_utils, seems to be working now. (Too many containers / too many environments!)\r\n\r\n\r\nHas an issue been reintroduced here with FTS? eg I'm getting an error thrown by spaces in column names here:\r\n\r\n```\r\n/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order)\r\n\r\ndef enable_fts(self, columns, fts_version=\"FTS5\"):\r\n--> 329 \"Enables FTS on the specified columns\"\r\n 330 sql = \"\"\"\r\n 331 CREATE VIRTUAL TABLE \"{table}_fts\" USING {fts_version} (\r\n```\r\n\r\nwhen trying an `insert_all`.\r\n\r\nAlso, if a col has a `.` in it, I seem to get:\r\n\r\n```\r\n/usr/local/lib/python3.7/site-packages/sqlite_utils/db.py in insert_all(self, records, pk, foreign_keys, upsert, batch_size, column_order)\r\n 327 jsonify_if_needed(record.get(key, None)) for key in all_columns\r\n 328 )\r\n--> 329 result = self.db.conn.execute(sql, values)\r\n 330 self.db.conn.commit()\r\n 331 self.last_id = result.lastrowid\r\n\r\nOperationalError: near \".\": syntax error\r\n```\r\n\r\n(Can't post a worked minimal example right now; racing trying to build something 
against a live timing screen that will stop until next weekend in an hour or two...)\r\n\r\nPS Hmmm I did a test and they seem to work; I must be messing up s/where else...\r\n\r\n```\r\nimport sqlite3\r\nfrom sqlite_utils import Database\r\n\r\ndbname='testingDB_sqlite_utils.db'\r\n\r\n#!rm $dbname\r\nconn = sqlite3.connect(dbname, timeout=10)\r\n\r\n\r\n#Setup database tables\r\nc = conn.cursor()\r\n\r\nsetup='''\r\nCREATE TABLE IF NOT EXISTS \"test1\" (\r\n \"NO\" INTEGER,\r\n \"NAME\" TEXT\r\n);\r\n\r\nCREATE TABLE IF NOT EXISTS \"test2\" (\r\n \"NO\" INTEGER,\r\n `TIME OF DAY` TEXT\r\n);\r\n\r\nCREATE TABLE IF NOT EXISTS \"test3\" (\r\n \"NO\" INTEGER,\r\n `AVG. SPEED (MPH)` FLOAT\r\n);\r\n'''\r\n\r\nc.executescript(setup)\r\n\r\n\r\nDB = Database(conn)\r\n\r\nimport pandas as pd\r\n\r\ndf1 = pd.DataFrame({'NO':[1,2],'NAME':['a','b']})\r\nDB['test1'].insert_all(df1.to_dict(orient='records'))\r\n\r\ndf2 = pd.DataFrame({'NO':[1,2],'TIME OF DAY':['early on','late']})\r\nDB['test2'].insert_all(df2.to_dict(orient='records'))\r\n\r\ndf3 = pd.DataFrame({'NO':[1,2],'AVG. SPEED (MPH)':['123.3','123.4']})\r\nDB['test3'].insert_all(df3.to_dict(orient='records'))\r\n```\r\n\r\nall seem to work ok. 
I'm still getting errors in my setup though, which is not too different to the test cases?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 403922644, "label": "Problems handling column names containing spaces or - "}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/426#issuecomment-485557574", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/426", "id": 485557574, "node_id": "MDEyOklzc3VlQ29tbWVudDQ4NTU1NzU3NA==", "user": {"value": 222245, "label": "carlmjohnson"}, "created_at": "2019-04-22T21:23:22Z", "updated_at": "2019-04-22T21:23:22Z", "author_association": "NONE", "body": "Can you cut a new release with this?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 431756352, "label": "Upgrade to Jinja2==2.10.1"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/187#issuecomment-489353316", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/187", "id": 489353316, "node_id": "MDEyOklzc3VlQ29tbWVudDQ4OTM1MzMxNg==", "user": {"value": 46059, "label": "carsonyl"}, "created_at": "2019-05-04T18:36:36Z", "updated_at": "2019-05-04T18:36:36Z", "author_association": "NONE", "body": "Hi @simonw - I just hit this issue when trying out Datasette after your PyCon talk today. Datasette is pinned to Sanic 0.7.0, but it looks like 0.8.0 added the option to remove the uvloop dependency for Windows by having an environment variable `SANIC_NO_UVLOOP` at install time. 
Maybe that'll be sufficient before a port to Starlette?", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 1, \"eyes\": 0}", "issue": {"value": 309033998, "label": "Windows installation error"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/187#issuecomment-490039343", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/187", "id": 490039343, "node_id": "MDEyOklzc3VlQ29tbWVudDQ5MDAzOTM0Mw==", "user": {"value": 6422964, "label": "Maltazar"}, "created_at": "2019-05-07T11:24:42Z", "updated_at": "2019-05-07T11:24:42Z", "author_association": "NONE", "body": "I totally agree with carsonyl", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 309033998, "label": "Windows installation error"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/272#issuecomment-494297022", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/272", "id": 494297022, "node_id": "MDEyOklzc3VlQ29tbWVudDQ5NDI5NzAyMg==", "user": {"value": 647359, "label": "tomchristie"}, "created_at": "2019-05-21T08:39:17Z", "updated_at": "2019-05-21T08:39:17Z", "author_association": "NONE", "body": "Useful context stuff:\r\n\r\n> ASGI decodes %2F encoded slashes in URLs automatically\r\n\r\n`raw_path` for ASGI looks to be under consideration: https://github.com/django/asgiref/issues/87\r\n\r\n> uvicorn doesn't support Python 3.5\r\n\r\nThat was an issue specifically against the <=3.5.2 minor point releases of Python, now resolved: https://github.com/encode/uvicorn/issues/330 \ud83d\udc4d\r\n\r\n> Starlette for things like form parsing - but it's 3.6+ only!\r\n\r\nYeah - the bits that require 3.6 are anywhere with the \"async for\" syntax. If it wasn't for that I'd downport it, but that one's a pain. 
It's the one bit of syntax to watch out for if you're looking to bring any bits of implementation across to Datasette.\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324188953, "label": "Port Datasette to ASGI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/184#issuecomment-494459264", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/184", "id": 494459264, "node_id": "MDEyOklzc3VlQ29tbWVudDQ5NDQ1OTI2NA==", "user": {"value": 222245, "label": "carlmjohnson"}, "created_at": "2019-05-21T16:17:29Z", "updated_at": "2019-05-21T16:17:29Z", "author_association": "NONE", "body": "Reopening this because it still raises 500 for incorrect table capitalization. \r\n\r\nExample:\r\n\r\n- https://salaries.news.baltimoresun.com/salaries/2018+Maryland+state+salaries/1 200 OK\r\n- https://salaries.news.baltimoresun.com/salaries/bad-table/1 400\r\n- https://salaries.news.baltimoresun.com/salaries/2018+maryland+state+salaries/1 500 Internal Error (note lowercase 'm')\r\n\r\nI think because the table name exists but is not in its canonical form, it triggers a dict lookup error.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 292011379, "label": "500 from missing table name"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/483#issuecomment-495034774", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/483", "id": 495034774, "node_id": "MDEyOklzc3VlQ29tbWVudDQ5NTAzNDc3NA==", "user": {"value": 45919695, "label": "jcmkk3"}, "created_at": "2019-05-23T01:38:32Z", "updated_at": "2019-05-23T01:43:04Z", "author_association": "NONE", "body": "I think that location information is one of the other common pieces of hierarchical data. 
At least one that is general enough that extra dimensions could be auto-generated.\r\n\r\nAlso, I think this is an awesome project. Thank you for creating this.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 447408527, "label": "Option to facet by date using month or year"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/120#issuecomment-496966227", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/120", "id": 496966227, "node_id": "MDEyOklzc3VlQ29tbWVudDQ5Njk2NjIyNw==", "user": {"value": 26342344, "label": "duarteocarmo"}, "created_at": "2019-05-29T14:40:52Z", "updated_at": "2019-05-29T14:40:52Z", "author_association": "NONE", "body": "I would really like this. If you give me some pointers @simonw I'm willing to PR!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 275087397, "label": "Plugin that adds an authentication layer of some sort"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/496#issuecomment-497885590", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/496", "id": 497885590, "node_id": "MDEyOklzc3VlQ29tbWVudDQ5Nzg4NTU5MA==", "user": {"value": 1740337, "label": "costrouc"}, "created_at": "2019-05-31T23:05:05Z", "updated_at": "2019-05-31T23:05:05Z", "author_association": "NONE", "body": "Upon doing a \"fix\" which allowed a longer build timeout the cloudrun container was too slow when it actually ran. 
So I would say that if your SQLite database is over 1 GB, Heroku and Cloud Run are not good options.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 450862577, "label": "Additional options to gcloud build command in cloudrun - timeout"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/499#issuecomment-499260727", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/499", "id": 499260727, "node_id": "MDEyOklzc3VlQ29tbWVudDQ5OTI2MDcyNw==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-06-05T21:22:55Z", "updated_at": "2019-06-05T21:22:55Z", "author_association": "NONE", "body": "I was thinking of having some kind of GUI in which regular reporters can upload a CSV and choose how to name the tables, columns and whatnot. Maybe it's possible to make such a GUI using the Jinja template language? I ask because I'm unsure how to pursue this, but I'd like to try. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 451585764, "label": "Accessibility for non-techie newsies? "}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/498#issuecomment-499262397", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/498", "id": 499262397, "node_id": "MDEyOklzc3VlQ29tbWVudDQ5OTI2MjM5Nw==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-06-05T21:28:32Z", "updated_at": "2019-06-05T21:28:32Z", "author_association": "NONE", "body": "Thinking about this more, I'd probably have to make a template page to go along with this, right? 
I'm guessing there's no way to add an all-databases-all-tables search to datasette's \"home page\" except by copying the \"home page\" template and editing it?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 451513541, "label": "Full text search of all tables at once?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/506#issuecomment-500238035", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/506", "id": 500238035, "node_id": "MDEyOklzc3VlQ29tbWVudDUwMDIzODAzNQ==", "user": {"value": 1059677, "label": "Gagravarr"}, "created_at": "2019-06-09T19:21:18Z", "updated_at": "2019-06-09T19:21:18Z", "author_association": "NONE", "body": "If you don't mind calling out to Java, then Apache Tika is able to tell you what a load of \"binary stuff\" is, plus render it to XHTML where possible.\r\n\r\nThere's a python wrapper around the Apache Tika server, but for a more typical datasette usecase you'd probably just want to grab the Tika CLI jar, and call it with `--detect` and/or `--xhtml` to process the unknown binary blob", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 453846217, "label": "Option to display binary data"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/498#issuecomment-501903071", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/498", "id": 501903071, "node_id": "MDEyOklzc3VlQ29tbWVudDUwMTkwMzA3MQ==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-06-13T22:35:06Z", "updated_at": "2019-06-13T22:35:06Z", "author_association": "NONE", "body": "I'd like to start working on this. I've made a custom template for `index.html` that contains a `form` that contains a search `input`. 
But I'm not sure where to go from here. When a user enters a search term, I'd like for that term to go into a function I'll make that will search all tables with full text search enabled. \r\n\r\nCan I make additional custom Python scripts for this, or must I edit datasette's files directly?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 451513541, "label": "Full text search of all tables at once?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/512#issuecomment-503236800", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/512", "id": 503236800, "node_id": "MDEyOklzc3VlQ29tbWVudDUwMzIzNjgwMA==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-06-18T17:36:37Z", "updated_at": "2019-06-18T17:36:37Z", "author_association": "NONE", "body": "Oh I didn't know the `description` field could be used for a database's metadata. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 457147936, "label": "\"about\" parameter in metadata does not appear when alone"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/502#issuecomment-503237884", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/502", "id": 503237884, "node_id": "MDEyOklzc3VlQ29tbWVudDUwMzIzNzg4NA==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-06-18T17:39:18Z", "updated_at": "2019-06-18T17:46:08Z", "author_association": "NONE", "body": "It appears that I cannot reopen this issue but the proposed solution did not solve it. The link is not there. 
I have full text search enabled for a bunch of tables in my database and even clicking the link to reveal hidden tables did not show the download DB link.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 453131917, "label": "Exporting sqlite database(s)?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/513#issuecomment-503249999", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/513", "id": 503249999, "node_id": "MDEyOklzc3VlQ29tbWVudDUwMzI0OTk5OQ==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-06-18T18:11:36Z", "updated_at": "2019-06-18T18:11:36Z", "author_association": "NONE", "body": "Ah, so basically put the SQLite databases on Linode, for example, and run `datasette serve` on there? I'm comfortable with that. ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 457201907, "label": "Is it possible to publish to Heroku despite slug size being too large?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/514#issuecomment-504684709", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/514", "id": 504684709, "node_id": "MDEyOklzc3VlQ29tbWVudDUwNDY4NDcwOQ==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-06-22T17:36:25Z", "updated_at": "2019-06-22T17:36:25Z", "author_association": "NONE", "body": "> WorkingDirectory=/path/to/data\r\n\r\n@russss, Which directory does this represent?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 459397625, "label": "Documentation with recommendations on running Datasette in production without using Docker"}, 
"performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/514#issuecomment-504685187", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/514", "id": 504685187, "node_id": "MDEyOklzc3VlQ29tbWVudDUwNDY4NTE4Nw==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-06-22T17:43:24Z", "updated_at": "2019-06-22T17:43:24Z", "author_association": "NONE", "body": "> > > WorkingDirectory=/path/to/data\r\n> > \r\n> > \r\n> > @russss, Which directory does this represent?\r\n> \r\n> It's the working directory (cwd) of the spawned process. In this case if you set it to the directory your data is in, you can use relative paths to the db (and metadata/templates/etc) in the `ExecStart` command.\r\n\r\nIn my case, on a remote server, I set up a virtual environment in `/home/chris/Env/datasette`, and when I activated that environment I ran `pip install datasette`. \r\n\r\nMy datasette project is in `/home/chris/datatsette-project`, so I guess I'd use that directory in the `WorkingDirectory` parameter?\r\n\r\nAnd the `ExecStart` parameter would be `/home/chris/Env/datasette/lib/python3.7/site-packages/datasette serve -h 0.0.0.0 my.db` I'm guessing?\r\n ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 459397625, "label": "Documentation with recommendations on running Datasette in production without using Docker"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/514#issuecomment-504686266", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/514", "id": 504686266, "node_id": "MDEyOklzc3VlQ29tbWVudDUwNDY4NjI2Ng==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-06-22T17:58:50Z", "updated_at": "2019-06-23T21:21:57Z", "author_association": "NONE", "body": "@russss \r\n\r\nActually, here's what I've got in 
`/etc/systemd/system/datasette.service`\r\n\r\n```\r\n[Unit]\r\nDescription=Datasette\r\nAfter=network.target\r\n\r\n[Service]\r\nType=simple\r\nUser=chris\r\nWorkingDirectory=/home/chris/digital-library\r\nExecStart=/home/chris/Env/datasette/lib/python3.7/site-packages/datasette serve -h 0.0.0.0 databases/*.db --cors --metadata metadata.json\r\nRestart=on-failure\r\n\r\n[Install]\r\nWantedBy=multi-user.target\r\n```\r\n\r\nI ran: \r\n```\r\n$ sudo systemctl daemon-reload\r\n$ sudo systemctl enable datasette\r\n$ sudo systemctl start datasette\r\n```\r\nThen I ran:\r\n`$ journalctl -u datasette -f`\r\n\r\nGot this message.\r\n\r\n```\r\nHint: You are currently not seeing messages from other users and the system.\r\n Users in groups 'adm', 'systemd-journal', 'wheel' can see all messages.\r\n Pass -q to turn off this notice.\r\n-- Logs begin at Thu 2019-06-20 00:05:23 CEST. --\r\nJun 22 19:55:57 ns331247 systemd[16176]: datasette.service: Failed to execute command: Permission denied\r\nJun 22 19:55:57 ns331247 systemd[16176]: datasette.service: Failed at step EXEC spawning /home/chris/Env/datasette/lib/python3.7/site-packages/datasette: Permission denied\r\nJun 22 19:55:57 ns331247 systemd[16184]: datasette.service: Failed to execute command: Permission denied\r\nJun 22 19:55:57 ns331247 systemd[16184]: datasette.service: Failed at step EXEC spawning /home/chris/Env/datasette/lib/python3.7/site-packages/datasette: Permission denied\r\nJun 22 19:55:58 ns331247 systemd[16186]: datasette.service: Failed to execute command: Permission denied\r\nJun 22 19:55:58 ns331247 systemd[16186]: datasette.service: Failed at step EXEC spawning /home/chris/Env/datasette/lib/python3.7/site-packages/datasette: Permission denied\r\nJun 22 19:55:58 ns331247 systemd[16190]: datasette.service: Failed to execute command: Permission denied\r\nJun 22 19:55:58 ns331247 systemd[16190]: datasette.service: Failed at step EXEC spawning 
/home/chris/Env/datasette/lib/python3.7/site-packages/datasette: Permission denied\r\nJun 22 19:55:58 ns331247 systemd[16191]: datasette.service: Failed to execute command: Permission denied\r\nJun 22 19:55:58 ns331247 systemd[16191]: datasette.service: Failed at step EXEC spawning /home/chris/Env/datasette/lib/python3.7/site-packages/datasette: Permission denied\r\n```\r\nWhen I go to the address for my server, I am met with the standard \"Welcome to nginx\" message:\r\n\r\n```\r\nWelcome to nginx!\r\nIf you see this page, the nginx web server is successfully installed and working. Further configuration is required.\r\n\r\nFor online documentation and support please refer to nginx.org.\r\nCommercial support is available at nginx.com.\r\n\r\nThank you for using nginx.\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 459397625, "label": "Documentation with recommendations on running Datasette in production without using Docker"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/498#issuecomment-504785662", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/498", "id": 504785662, "node_id": "MDEyOklzc3VlQ29tbWVudDUwNDc4NTY2Mg==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-06-23T20:47:37Z", "updated_at": "2019-06-23T20:47:37Z", "author_association": "NONE", "body": "Very cool, thank you.\r\n\r\nUsing http://search-24ways.herokuapp.com as an example, let's say I want to search all FTS columns in all tables in all databases for the word \"web.\" \r\n\r\n[Here's a link](http://search-24ways.herokuapp.com/24ways-f8f455f?sql=select+count%28*%29from+articles+where+rowid+in+%28select+rowid+from+articles_fts+where+articles_fts+match+%3Asearch%29&search=web) to the query I'd need to run to search \"web\" on FTS columns in `articles` table of the `24ways` database. 
\r\n\r\nAnd [here's a link](http://search-24ways.herokuapp.com/24ways-f8f455f.json?sql=select+count%28*%29from+articles+where+rowid+in+%28select+rowid+from+articles_fts+where+articles_fts+match+%3Asearch%29&search=web) to the JSON version of the above result. I'd like to get the JSON result of that query for each FTS table of each database in my datasette project. \r\n\r\nIs it possible in Javascript to automate the construction of query URLs like the one I linked, but for every FTS table in my datasette project?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 451513541, "label": "Full text search of all tables at once?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/514#issuecomment-504789231", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/514", "id": 504789231, "node_id": "MDEyOklzc3VlQ29tbWVudDUwNDc4OTIzMQ==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-06-23T21:35:33Z", "updated_at": "2019-06-23T21:35:33Z", "author_association": "NONE", "body": "@russss \r\n\r\nThanks, just one more thing.\r\n\r\nI edited `datasette.service`:\r\n\r\n```\r\n[Unit]\r\nDescription=Datasette\r\nAfter=network.target\r\n\r\n[Service]\r\nType=simple\r\nUser=chris\r\nWorkingDirectory=/home/chris/digital-library\r\nExecStart=/home/chris/Env/datasette/bin/datasette serve -h 0.0.0.0 databases/*.db --cors --metadata metadata.json\r\nRestart=on-failure\r\n\r\n[Install]\r\nWantedBy=multi-user.target\r\n```\r\n\r\nThen ran:\r\n\r\n```\r\n$ sudo systemctl daemon-reload\r\n$ sudo systemctl enable datasette\r\n$ sudo systemctl start datasette\r\n```\r\n\r\nBut the logs from `journalctl` show this datasette error:\r\n\r\n```\r\nJun 23 23:31:41 ns331247 datasette[1771]: Error: Invalid value for \"[FILES]...\": Path \"databases/*.db\" does not exist.\r\nJun 23 23:31:44 ns331247 
datasette[1778]: Usage: datasette serve [OPTIONS] [FILES]...\r\nJun 23 23:31:44 ns331247 datasette[1778]: Try \"datasette serve --help\" for help.\r\n```\r\n\r\nBut the `databases` directory does exist in the directory specified by `WorkingDirectory`. Is this a datasette problem or did I write something incorrectly in the `.service` file?\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 459397625, "label": "Documentation with recommendations on running Datasette in production without using Docker"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/514#issuecomment-504998302", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/514", "id": 504998302, "node_id": "MDEyOklzc3VlQ29tbWVudDUwNDk5ODMwMg==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-06-24T12:57:19Z", "updated_at": "2019-06-24T12:57:19Z", "author_association": "NONE", "body": "Same error when I used the full path.\n\nOn Sun, Jun 23, 2019 at 18:31 Simon Willison \nwrote:\n\n> I suggest trying a full path in ExecStart like this:\n>\n> ExecStart=/home/chris/Env/datasette/bin/datasette serve -h 0.0.0.0\n> /home/chris/digital-library/databases/*.db --cors --metadata metadata.json\n>\n> That should eliminate the chance of some kind of path confusion.\n>\n> \u2014\n> You are receiving this because you authored the thread.\n> Reply to this email directly, view it on GitHub\n> ,\n> or mute the thread\n> \n> .\n>\n-- \n*Chris Persaud*\nChrisPersaud.com\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 459397625, "label": "Documentation with recommendations on running Datasette in production without using Docker"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/simonw/datasette/issues/498#issuecomment-505228873", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/498", "id": 505228873, "node_id": "MDEyOklzc3VlQ29tbWVudDUwNTIyODg3Mw==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-06-25T00:21:17Z", "updated_at": "2019-06-25T00:21:17Z", "author_association": "NONE", "body": "Eh, I'm not concerned with a relevance score right now. I think I'd be fine with a search whose results show links to data tables with at least one result.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 451513541, "label": "Full text search of all tables at once?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/514#issuecomment-505232675", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/514", "id": 505232675, "node_id": "MDEyOklzc3VlQ29tbWVudDUwNTIzMjY3NQ==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-06-25T00:43:12Z", "updated_at": "2019-06-25T00:43:12Z", "author_association": "NONE", "body": "Yep, that worked to get the site up and running at `my-server.com:8000` but when I edited `run-datasette.sh` to contain this...\r\n\r\n```\r\n#!/bin/bash\r\n/home/chris/Env/datasette/bin/datasette serve -h 0.0.0.0 -p 80 /home/chris/digital-library/databases/*.db --cors --metadata /home/chris/digital-library/metadata.json\r\n```\r\n\r\nI got this error.\r\n\r\n```\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: [2019-06-25 02:42:41 +0200] [752] [INFO] Goin' Fast @ http://0.0.0.0:80\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: [2019-06-25 02:42:41 +0200] [752] [ERROR] Unable to start server\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: Traceback (most recent call last):\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: File \"uvloop/loop.pyx\", line 1111, in 
uvloop.loop.Loop._create_server\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: File \"uvloop/handles/tcp.pyx\", line 89, in uvloop.loop.TCPServer.bind\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: File \"uvloop/handles/streamserver.pyx\", line 95, in uvloop.loop.UVStreamServer._fatal_error\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: File \"uvloop/handles/tcp.pyx\", line 87, in uvloop.loop.TCPServer.bind\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: File \"uvloop/handles/tcp.pyx\", line 26, in uvloop.loop.__tcp_bind\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: PermissionError: [Errno 13] Permission denied\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: During handling of the above exception, another exception occurred:\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: Traceback (most recent call last):\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: File \"/home/chris/Env/datasette/lib/python3.7/site-packages/sanic/server.py\", line 591, in serve\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: http_server = loop.run_until_complete(server_coroutine)\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: File \"uvloop/loop.pyx\", line 1451, in uvloop.loop.Loop.run_until_complete\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: File \"uvloop/loop.pyx\", line 1684, in create_server\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: File \"uvloop/loop.pyx\", line 1116, in uvloop.loop.Loop._create_server\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: PermissionError: [Errno 13] error while attempting to bind on address ('0.0.0.0', 80): permission denied\r\nJun 25 02:42:41 ns331247 run-datasette.sh[747]: [2019-06-25 02:42:41 +0200] [752] [INFO] Server Stopped\r\n```\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 459397625, "label": "Documentation with recommendations on running Datasette in production without 
using Docker"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/529#issuecomment-505424665", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/529", "id": 505424665, "node_id": "MDEyOklzc3VlQ29tbWVudDUwNTQyNDY2NQ==", "user": {"value": 1383872, "label": "nathancahill"}, "created_at": "2019-06-25T12:35:07Z", "updated_at": "2019-06-25T12:35:07Z", "author_association": "NONE", "body": "Oops, wrote this late last night, didn't see you'd already worked on the issue.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 460396952, "label": "Use keyed rows - fixes #521"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/522#issuecomment-506000023", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/522", "id": 506000023, "node_id": "MDEyOklzc3VlQ29tbWVudDUwNjAwMDAyMw==", "user": {"value": 1383872, "label": "nathancahill"}, "created_at": "2019-06-26T18:48:53Z", "updated_at": "2019-06-26T18:48:53Z", "author_association": "NONE", "body": "Reference implementation from Requests: https://github.com/kennethreitz/requests/blob/3.0/requests/structures.py#L14", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 459622390, "label": "Handle case-insensitive headers in a nicer way"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/498#issuecomment-506985050", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/498", "id": 506985050, "node_id": "MDEyOklzc3VlQ29tbWVudDUwNjk4NTA1MA==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-06-29T20:28:21Z", "updated_at": "2019-06-29T20:28:21Z", "author_association": "NONE", "body": "In my case, I have an ever-growing number of 
databases and tables within them. Most tables have FTS enabled. I cannot predict the names of future tables and databases, nor can I predict the names of the columns for which I wish to enable FTS.\r\n\r\nFor my purposes, I was thinking of writing up something that sends these two GET requests to each of my databases' tables.\r\n\r\n```\r\nhttp://my-server.com/database-name/table-name.json?_search=mySearchString\r\nhttp://my-server.com/database-name/table-name.json\r\n```\r\n\r\nIn the resulting JSON strings, I'd check the value of the key `filtered_table_rows_count`. If the value is `0` in the first URL's result, or if values from both requests are the same, that means FTS is either disabled for the table or it has no rows matching the search query.\r\n\r\nIs this feasible within the datasette library, or would it require some type of plugin? Or maybe you know of a better way of accomplishing this goal. Maybe I overlooked something.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 451513541, "label": "Full text search of all tables at once?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/498#issuecomment-508590397", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/498", "id": 508590397, "node_id": "MDEyOklzc3VlQ29tbWVudDUwODU5MDM5Nw==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-07-04T23:34:41Z", "updated_at": "2019-07-04T23:34:41Z", "author_association": "NONE", "body": "I'll take your suggestion and do this all in Javascript. Would I need to make a `static/` folder in my datasette project's root directory and make a custom `index.html` template that pulls from `static/js/search-all-fts.js`? 
Or would you suggest another way?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 451513541, "label": "Full text search of all tables at once?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/498#issuecomment-509042334", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/498", "id": 509042334, "node_id": "MDEyOklzc3VlQ29tbWVudDUwOTA0MjMzNA==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-07-08T00:18:29Z", "updated_at": "2019-07-08T00:18:29Z", "author_association": "NONE", "body": "@simonw I made this primitive search that I've put in my Datasette project's custom templates directory: https://gist.github.com/chrismp/e064b41f08208a6f9a93150a23cf7e03", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 451513541, "label": "Full text search of all tables at once?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/514#issuecomment-509154312", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/514", "id": 509154312, "node_id": "MDEyOklzc3VlQ29tbWVudDUwOTE1NDMxMg==", "user": {"value": 4363711, "label": "JesperTreetop"}, "created_at": "2019-07-08T09:36:25Z", "updated_at": "2019-07-08T09:40:33Z", "author_association": "NONE", "body": "@chrismp: Ports 1024 and under are privileged and can usually only be bound by a root or supervisor user, so it makes sense if you're running as the user `chris` that port 8000 works but 80 doesn't.\r\n\r\nSee [this generic question-and-answer](https://superuser.com/questions/710253/allow-non-root-process-to-bind-to-port-80-and-443) and [this systemd question-and-answer](https://stackoverflow.com/questions/40865775/linux-systemd-service-on-port-80) for more information 
about ways to skin this cat. Without knowing your specific circumstances, either extending those privileges to that service/executable/user, proxying them through something like nginx or indeed looking at what the nginx systemd job has to do to listen at port 80 all sound like good ways to start.\r\n\r\nAt this point, this is more generic systemd/Linux support than a Datasette issue, which is why a complete rando like me is able to contribute anything. ", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 459397625, "label": "Documentation with recommendations on running Datasette in production without using Docker"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/514#issuecomment-509431603", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/514", "id": 509431603, "node_id": "MDEyOklzc3VlQ29tbWVudDUwOTQzMTYwMw==", "user": {"value": 7936571, "label": "chrismp"}, "created_at": "2019-07-08T23:39:52Z", "updated_at": "2019-07-08T23:39:52Z", "author_association": "NONE", "body": "In `datasette.service`, I edited\r\n\r\n```\r\nUser=chris\r\n```\r\n\r\nTo...\r\n\r\n```\r\nUser=root\r\n```\r\n\r\nIt worked. I can access `http://my-server.com`. I hope this is safe. 
Thanks for all the help, everyone.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 459397625, "label": "Documentation with recommendations on running Datasette in production without using Docker"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/558#issuecomment-511252718", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/558", "id": 511252718, "node_id": "MDEyOklzc3VlQ29tbWVudDUxMTI1MjcxOA==", "user": {"value": 380586, "label": "0x1997"}, "created_at": "2019-07-15T01:29:29Z", "updated_at": "2019-07-15T01:29:29Z", "author_association": "NONE", "body": "Thanks, the latest version works.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 467218270, "label": "Support unicode in url"}, "performed_via_github_app": null}