{"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/33#issuecomment-622279374", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/33", "id": 622279374, "node_id": "MDEyOklzc3VlQ29tbWVudDYyMjI3OTM3NA==", "user": {"value": 2029, "label": "garethr"}, "created_at": "2020-05-01T07:12:47Z", "updated_at": "2020-05-01T07:12:47Z", "author_association": "NONE", "body": "I also go it working with:\r\n\r\n```yaml\r\nrun: echo ${{ secrets.github_token }} | github-to-sqlite auth\r\n```", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 609950090, "label": "Fall back to authentication via ENV"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1375#issuecomment-860548546", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1375", "id": 860548546, "node_id": "MDEyOklzc3VlQ29tbWVudDg2MDU0ODU0Ng==", "user": {"value": 4068, "label": "frafra"}, "created_at": "2021-06-14T09:41:59Z", "updated_at": "2021-06-14T09:41:59Z", "author_association": "NONE", "body": "> There is a feature for this at the moment, but it's a little bit hidden: you can use `?_json=col` to tell\r\n> Datasette that you would like a specific column to be exported as nested JSON: https://docs.datasette.io/en/stable/json_api.html#special-json-arguments\r\n\r\nThanks :)\r\n \r\n> I considered trying to make this automatic - so it detects columns that appear to contain valid JSON and outputs them as nested objects - but the problem with that is that it can lead to inconsistent results - you might hit the API and find that not every column contains valid JSON (compared to the previous day) resulting in the API retuning string instead of the expected dictionary and breaking your code.\r\n\r\nIf a developer is not sure if the JSON fields are valid, but then retrieves and parse them, it should handle errors too. Handling inconsistent data is necessary due to the nature of SQLite. A global or dataset option to render the data as they have been defined (JSON, boolean, etc.) when requesting JSON could allow the user to download a regular JSON from the browser without having to rely on APIs. I would guess someone could just make a custom template with an extra JSON-parsed download button otherwise :)", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 919508498, "label": "JSON export dumps JSON fields as TEXT"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/46#issuecomment-344161226", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/46", "id": 344161226, "node_id": "MDEyOklzc3VlQ29tbWVudDM0NDE2MTIyNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-11-14T06:41:21Z", "updated_at": "2017-11-14T06:41:21Z", "author_association": "OWNER", "body": "Spatial extensions would be really useful too. 
https://www.gaia-gis.it/spatialite-2.1/SpatiaLite-manual.html", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 271301468, "label": "Dockerfile should build more recent SQLite with FTS5 and spatialite support"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/175#issuecomment-353424169", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/175", "id": 353424169, "node_id": "MDEyOklzc3VlQ29tbWVudDM1MzQyNDE2OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2017-12-21T18:33:55Z", "updated_at": "2017-12-21T18:33:55Z", "author_association": "OWNER", "body": "Done - thanks for curating these: https://github.com/topics/automatic-api", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 282971961, "label": "Add project topic \"automatic-api\""}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/97#issuecomment-392602334", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/97", "id": 392602334, "node_id": "MDEyOklzc3VlQ29tbWVudDM5MjYwMjMzNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-05-28T20:57:21Z", "updated_at": "2018-05-28T20:57:21Z", "author_association": "OWNER", "body": "The `/.json` endpoint is more of an implementation detail of the homepage at this point. A better, documented ( http://datasette.readthedocs.io/en/stable/introspection.html#inspect ) endpoint for finding all of the databases and tables is https://parlgov.datasettes.com/-/inspect.json", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 274022950, "label": "Link to JSON for the list of tables "}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/339#issuecomment-404565566", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/339", "id": 404565566, "node_id": "MDEyOklzc3VlQ29tbWVudDQwNDU2NTU2Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-07-12T16:08:42Z", "updated_at": "2018-07-12T16:08:42Z", "author_association": "OWNER", "body": "I'm going to turn this into an issue about better supporting the above option.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 340396247, "label": "Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/308#issuecomment-405971920", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/308", "id": 405971920, "node_id": "MDEyOklzc3VlQ29tbWVudDQwNTk3MTkyMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-07-18T15:27:12Z", "updated_at": "2018-07-18T15:27:12Z", "author_association": "OWNER", "body": "It looks like there are a few extra options we should support:\r\n\r\nhttps://devcenter.heroku.com/articles/heroku-cli-commands\r\n\r\n```\r\n -t, --team=team team to use\r\n --region=region specify region for the app to run in\r\n --space=space the private space to create the app in\r\n```\r\n\r\nSince these differ from the options for Zeit Now I think this means splitting up `datasette 
publish now` and `datasette publish heroku` into separate subcommands.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 330826972, "label": "Support extra Heroku apps:create options - region, space, team"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/370#issuecomment-435974786", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/370", "id": 435974786, "node_id": "MDEyOklzc3VlQ29tbWVudDQzNTk3NDc4Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2018-11-05T18:06:56Z", "updated_at": "2018-11-05T18:06:56Z", "author_association": "OWNER", "body": "I've been thinking a bit about ways of using Jupyter Notebook more effectively with Datasette (things like a `publish_dataframes(df1, df2, df3)` function which publishes some Pandas dataframes and returns you a URL to a new hosted Datasette instance) but you're right, Jupyter Lab is potentially a much more interesting fit.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 377155320, "label": "Integration with JupyterLab"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/391#issuecomment-450964512", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/391", "id": 450964512, "node_id": "MDEyOklzc3VlQ29tbWVudDQ1MDk2NDUxMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-01-02T19:45:12Z", "updated_at": "2019-01-02T19:45:12Z", "author_association": "OWNER", "body": "Thanks, I've fixed this. I had to re-alias it against now:\r\n```\r\n~ $ now alias google-trends-pnwhfwvgqf.now.sh https://google-trends.datasettes.com/\r\n> Assigning alias google-trends.datasettes.com to deployment google-trends-pnwhfwvgqf.now.sh\r\n> Certificate for google-trends.datasettes.com (cert_uXaADIuNooHS3tZ) created [18s]\r\n> Success! 
google-trends.datasettes.com now points to google-trends-pnwhfwvgqf.now.sh [20s]\r\n```", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 392610803, "label": "Google Trends example doesn\u2019t work"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/397#issuecomment-453330680", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/397", "id": 453330680, "node_id": "MDEyOklzc3VlQ29tbWVudDQ1MzMzMDY4MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-01-11T01:17:11Z", "updated_at": "2019-01-11T01:25:33Z", "author_association": "OWNER", "body": "If you pull [the latest image](https://hub.docker.com/r/datasetteproject/datasette) you should get the right SQLite version now:\r\n\r\n docker pull datasetteproject/datasette\r\n docker run -p 8001:8001 \\\r\n datasetteproject/datasette \\\r\n datasette -p 8001 -h 0.0.0.0\r\n\r\nhttp://0.0.0.0:8001/-/versions now gives me:\r\n\r\n```\r\n \"version\": \"3.26.0\"\r\n```", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 397129564, "label": "Update official datasetteproject/datasette Docker container to SQLite 3.26.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/187#issuecomment-467264937", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/187", "id": 467264937, "node_id": "MDEyOklzc3VlQ29tbWVudDQ2NzI2NDkzNw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-02-26T02:14:28Z", "updated_at": "2019-02-26T02:14:28Z", "author_association": "OWNER", "body": "I'm working on a port of Datasette to Starlette which I think would fix this issue: https://github.com/encode/starlette", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 309033998, "label": "Windows installation error"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/419#issuecomment-473708941", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/419", "id": 473708941, "node_id": "MDEyOklzc3VlQ29tbWVudDQ3MzcwODk0MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-03-17T19:58:11Z", "updated_at": "2019-03-17T19:58:11Z", "author_association": "OWNER", "body": "Some problems to solve:\r\n\r\n* Right now Datasette assumes it can always show the count of rows in a table, because this has been pre-calculated. If a database is mutable the pre-calculation trick no longer works, and for giant tables a `select count(*) from X` query can be expensive to run. Maybe we set a time limit on these? If time limit expires show \"many rows\"?\r\n* Maintaining a content hash of the table no longer makes sense if it is changing (though interestingly there's a `.sha3sum` built-in SQLite CLI command which takes a hash of the content and stays the same even through vacuum runs). Without that we need a different mechanism for calculating table colours. 
It also means that we can't do the special dbname-hash URL trick (see #418) at all if the database is opened as mutable.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 421551434, "label": "Default to opening files in mutable mode, special option for immutable files"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/431#issuecomment-488555399", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/431", "id": 488555399, "node_id": "MDEyOklzc3VlQ29tbWVudDQ4ODU1NTM5OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-05-02T05:13:54Z", "updated_at": "2019-05-02T05:13:54Z", "author_association": "OWNER", "body": "Datasette master now treats databases as readonly but NOT immutable. This means you can make changes to those databases from another process and those changes will be instantly reflected in the Datasette interface.\r\n\r\nAs such, reloading on database change is no longer necessary. Closing this ticket.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 432870248, "label": "Datasette doesn't reload when database file changes"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/486#issuecomment-495659567", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/486", "id": 495659567, "node_id": "MDEyOklzc3VlQ29tbWVudDQ5NTY1OTU2Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-05-24T14:41:45Z", "updated_at": "2019-05-24T14:41:45Z", "author_association": "OWNER", "body": "I'm really keen to offer this as a plugin hook once I have Datasette working on ASGI - #272 \r\n\r\nI'll hopefully have that working in the next few weeks, but in the meantime there are a couple of tricks you can use:\r\n\r\n- you can add static HTML files (no templates though) using the static route configuration options\r\n- you can link to external hosted pages using the `about_url` metadata option\r\n- you can add information to an existing page with a custom template. 
I do that here for example: https://russian-ira-facebook-ads.datasettes.com/", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 448189298, "label": "Ability to add extra routes and related templates"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/21#issuecomment-496786354", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/21", "id": 496786354, "node_id": "MDEyOklzc3VlQ29tbWVudDQ5Njc4NjM1NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-05-29T05:09:01Z", "updated_at": "2019-05-29T05:09:01Z", "author_association": "OWNER", "body": "Shipped this feature in sqlite-utils 1.1: https://sqlite-utils.readthedocs.io/en/latest/changelog.html#v1-1", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 448391492, "label": "Option to ignore inserts if primary key exists already"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/498#issuecomment-498839428", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/498", "id": 498839428, "node_id": "MDEyOklzc3VlQ29tbWVudDQ5ODgzOTQyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-06-04T20:53:21Z", "updated_at": "2019-06-04T20:53:21Z", "author_association": "OWNER", "body": "It does not, but that's a really great idea for a feature.\r\n\r\nOne challenge here is that FTS ranking calculations take overall table statistics into account, which means it's usually not possible to combine rankings from different tables in a sensible way. But that doesn't mean it's not possible to return grouped results.\r\n\r\nI think this makes a lot of sense as a plugin.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 451513541, "label": "Full text search of all tables at once?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/499#issuecomment-498840129", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/499", "id": 498840129, "node_id": "MDEyOklzc3VlQ29tbWVudDQ5ODg0MDEyOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-06-04T20:55:30Z", "updated_at": "2019-06-04T21:01:22Z", "author_association": "OWNER", "body": "I really want this too!\r\n\r\nIt's one of the goals of the Datasette Library #417 concept, which I'm hoping to turn into an actual feature in the coming months.\r\n\r\nIt's also going to be a major focus of my ten month JSK fellowship at Stanford, which starts in September. https://twitter.com/simonw/status/1123624552867565569", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 451585764, "label": "Accessibility for non-techie newsies? 
"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/527#issuecomment-505057520", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/527", "id": 505057520, "node_id": "MDEyOklzc3VlQ29tbWVudDUwNTA1NzUyMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-06-24T15:21:18Z", "updated_at": "2019-06-24T15:21:18Z", "author_association": "OWNER", "body": "I just released csvs-to-sqlite 0.9.1 with this bug fix: https://github.com/simonw/csvs-to-sqlite/releases/tag/0.9.1", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 459936585, "label": "Unable to use rank when fts-table generated with csvs-to-sqlite"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/437#issuecomment-505087020", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/437", "id": 505087020, "node_id": "MDEyOklzc3VlQ29tbWVudDUwNTA4NzAyMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-06-24T16:38:56Z", "updated_at": "2019-06-24T16:38:56Z", "author_association": "OWNER", "body": "Closing this because it doesn't really fit the new model of inspect (though we should discuss in #465 how to further evolve this feature) and because as-of #272 we no longer use Sanic - though #520 will implement the equivalent of `prepare_sanic` against ASGI.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 438048318, "label": "Add inspect and prepare_sanic hooks"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/556#issuecomment-510550279", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/556", "id": 510550279, "node_id": "MDEyOklzc3VlQ29tbWVudDUxMDU1MDI3OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-07-11T16:07:27Z", "updated_at": "2019-07-11T16:07:27Z", "author_association": "OWNER", "body": "This is a really neat trick, thanks!", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 465773546, "label": "Add support for running datasette as a module"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/557#issuecomment-511625212", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/557", "id": 511625212, "node_id": "MDEyOklzc3VlQ29tbWVudDUxMTYyNTIxMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-07-16T01:12:14Z", "updated_at": "2019-07-16T01:12:14Z", "author_association": "OWNER", "body": "This looks useful for dealing with the `The process cannot access the file because it is being used by another process` error: https://stackoverflow.com/a/28032829", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 466996584, "label": "Get tests running on Windows using Travis CI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/595#issuecomment-541931047", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/595", "id": 541931047, "node_id": "MDEyOklzc3VlQ29tbWVudDU0MTkzMTA0Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": 
"2019-10-14T21:25:38Z", "updated_at": "2019-10-14T21:25:38Z", "author_association": "OWNER", "body": "I like the conditional dependency for the moment - maybe until 3.5 becomes officially unsupported.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 506300941, "label": "bump uvicorn to 0.9.0 to be Python-3.8 friendly"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/twitter-to-sqlite/issues/20#issuecomment-544335363", "issue_url": "https://api.github.com/repos/dogsheep/twitter-to-sqlite/issues/20", "id": 544335363, "node_id": "MDEyOklzc3VlQ29tbWVudDU0NDMzNTM2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-10-21T03:32:04Z", "updated_at": "2019-10-21T03:32:04Z", "author_association": "MEMBER", "body": "In case anyone is interested, here's an extract from the crontab I'm running these under at the moment:\r\n```\r\n1,11,21,31,41,51 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite user-timeline /home/ubuntu/twitter.db -a /home/ubuntu/auth.json --since\r\n2,7,12,17,22,27,32,37,42,47,52,57 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite home-timeline /home/ubuntu/timeline.db -a /home/ubuntu/auth.json --since\r\n6,16,26,36,46,56 * * * * /home/ubuntu/datasette-venv/bin/twitter-to-sqlite favorites /home/ubuntu/twitter.db -a /home/ubuntu/auth.json --stop_after=50\r\n```", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 506268945, "label": "--since support for various commands for refresh-by-cron"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/62#issuecomment-549435364", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/62", "id": 549435364, "node_id": "MDEyOklzc3VlQ29tbWVudDU0OTQzNTM2NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-04T16:30:34Z", "updated_at": "2019-11-04T16:30:34Z", "author_association": "OWNER", "body": "Released as 1.12.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 500783373, "label": "[enhancement] Method to delete a row in python"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/567#issuecomment-549665423", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/567", "id": 549665423, "node_id": "MDEyOklzc3VlQ29tbWVudDU0OTY2NTQyMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-05T05:11:14Z", "updated_at": "2019-11-05T05:11:14Z", "author_association": "OWNER", "body": "@clausjuhl I wrote a bit about that here: https://simonwillison.net/2019/May/19/datasette-0-28/\r\n\r\nShort version: just point Datasette at a SQLite file and update it from another process - it should work fine! I do it all the time now - I'll have a script running that writes to a database and I'll use Datasette to monitor progress. 
", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 476573875, "label": "Datasette Edit"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/595#issuecomment-552275668", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/595", "id": 552275668, "node_id": "MDEyOklzc3VlQ29tbWVudDU1MjI3NTY2OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-11T03:09:43Z", "updated_at": "2019-11-11T03:09:43Z", "author_association": "OWNER", "body": "Glitch has been upgraded to Python 3.7. I think I'm happy to drop 3.5 support now - users who want Python 3.5 can get it by installing `datasette==0.30.2`", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 506300941, "label": "bump uvicorn to 0.9.0 to be Python-3.8 friendly"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/646#issuecomment-561022224", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/646", "id": 561022224, "node_id": "MDEyOklzc3VlQ29tbWVudDU2MTAyMjIyNA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-12-03T06:30:42Z", "updated_at": "2019-12-03T06:30:42Z", "author_association": "OWNER", "body": "I don't think this is possible at the moment but you're right, it totally should be.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 531502365, "label": "Make database level information from metadata.json available in the index.html template"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/73#issuecomment-570930239", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/73", "id": 570930239, "node_id": "MDEyOklzc3VlQ29tbWVudDU3MDkzMDIzOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-01-05T17:15:18Z", "updated_at": "2020-01-05T17:15:18Z", "author_association": "OWNER", "body": "I think this is because you forgot to include a `pk=` argument. I'll change the code to throw a more useful error in this case.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 545407916, "label": "upsert_all() throws issue when upserting to empty table"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/662#issuecomment-579787057", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/662", "id": 579787057, "node_id": "MDEyOklzc3VlQ29tbWVudDU3OTc4NzA1Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-01-29T14:43:46Z", "updated_at": "2020-01-29T14:43:46Z", "author_association": "OWNER", "body": "Can you share the exact queries you're having trouble with? 
The SQL itself or even just the full URL to the page (it doesn't matter if it's to a Datasette instance that isn't available online - I just need to see the URL parameters).", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 556814876, "label": "Escape_fts5_query-hookimplementation does not work with queries to standard tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/662#issuecomment-579832857", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/662", "id": 579832857, "node_id": "MDEyOklzc3VlQ29tbWVudDU3OTgzMjg1Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-01-29T16:12:08Z", "updated_at": "2020-01-29T16:12:08Z", "author_association": "OWNER", "body": "I think I see what's happening here.\r\n\r\nAdding the new plugin isn't quite enough: the change I made to master also alters the table view code to call the new function:\r\n\r\nhttps://github.com/simonw/datasette/commit/3c861f363df02a59a67c59036278338e4760d2ed#diff-5e0ffd62fced7d46339b9b2cd167c2f9\r\n\r\nIf you add the escape function as a plugin in Datasette 0.33 you will have to use a custom SQL query to run it, like this:\r\n\r\nhttps://latest.datasette.io/fixtures?sql=select+pk%2C+text1%2C+text2%2C+%5Bname+with+.+and+spaces%5D+from+searchable+where+rowid+in+%28select+rowid+from+searchable_fts+where+searchable_fts+match+escape_fts%28%3Asearch%29%29+order+by+pk+limit+101&search=Dog\r\n\r\nOr you can hold out for Datasette 0.34 which will have this fix and will hopefully ship within the next 24 hours.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 556814876, "label": "Escape_fts5_query-hookimplementation does not work with queries to standard tables"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/661#issuecomment-580028593", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/661", "id": 580028593, "node_id": "MDEyOklzc3VlQ29tbWVudDU4MDAyODU5Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-01-30T00:30:04Z", "updated_at": "2020-01-30T00:30:04Z", "author_association": "OWNER", "body": "This has now shipped as part of Datasette 0.34: https://datasette.readthedocs.io/en/stable/changelog.html#v0-34", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 555832585, "label": "--port option to expose a port other than 8001 in \"datasette package\""}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/658#issuecomment-580029288", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/658", "id": 580029288, "node_id": "MDEyOklzc3VlQ29tbWVudDU4MDAyOTI4OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-01-30T00:32:43Z", "updated_at": "2020-01-30T00:32:43Z", "author_association": "OWNER", "body": "Can you share how your file layout is working?\r\n\r\nYou should have something like this:\r\n\r\n`static/app.css` - a CSS file\r\n\r\nThen run Datasette like this:\r\n\r\n`datasette my.db --static-dir=static:static/`\r\n\r\nThen `http://127.0.0.1:8001/static/app.css` should serve your CSS.\r\n\r\nCould you share the command you're using to deploy to Heroku?", "reactions": 
"{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 550293770, "label": "How do I use the app.css as style sheet?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/577#issuecomment-581758728", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/577", "id": 581758728, "node_id": "MDEyOklzc3VlQ29tbWVudDU4MTc1ODcyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-02-04T06:11:53Z", "updated_at": "2020-02-04T06:11:53Z", "author_association": "OWNER", "body": "For the moment I'm going to move it to `async def render_template()` on `datasette` but otherwise keep the implementation the same.\r\n\r\nThe new signature will be:\r\n\r\n async def render_template(self, template, context=None, request=None, view_name=None):\r\n\r\n`template` can be a list of strings or a single string. If a list of strings a template will be selected from them.\r\n\r\nI'll reconsider the large list of default context variables later on in a separate ticket.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 497171390, "label": "Utility mechanism for plugins to render templates"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/675#issuecomment-589908912", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/675", "id": 589908912, "node_id": "MDEyOklzc3VlQ29tbWVudDU4OTkwODkxMg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-02-22T02:38:21Z", "updated_at": "2020-02-22T02:38:21Z", "author_association": "OWNER", "body": "Interesting feature suggestion.\r\n\r\nMy initial instinct was that this would be better handled using the layered nature of Docker - so build a Docker image with `datasette package` and then have a separate custom script which takes that image, copies in the extra data and outputs a new image.\r\n\r\nBut... 
`datasette package` is already meant to be more convenient than messing around with Docker by hand like this - so actually having a `--copy` option like you describe here feels like it's within scope of what `datasette package` is meant to do.\r\n\r\nSo yeah - if you're happy to design this I think it would be worth us adding.\r\n\r\nSmall design suggestion: allow `--copy` to be applied multiple times, so you can do something like this:\r\n\r\n datasette package \\\r\n --copy ~/project/templates /templates \\\r\n --copy ~/project/README.md /README.md \\\r\n data.db\r\n\r\nAlso since Click arguments can take multiple options I don't think you need to have the `:` in there - although if it better matches Docker's own UI it might be more consistent to have it.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 567902704, "label": "--cp option for datasette publish and datasette package for shipping additional files and directories"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/682#issuecomment-590517338", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/682", "id": 590517338, "node_id": "MDEyOklzc3VlQ29tbWVudDU5MDUxNzMzOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-02-24T19:51:21Z", "updated_at": "2020-02-24T19:51:21Z", "author_association": "OWNER", "body": "I filed a question / feature request with Janus about supporting timeouts for `.get()` against async queues here: https://github.com/aio-libs/janus/issues/240\r\n\r\nI'm going to move ahead without needing that ability though. I figure SQLite writes are _fast_, and plugins can be trusted to implement just fast writes. So I'm going to support either fire-and-forget writes (they get added to the queue and a task ID is returned) or have the option to block awaiting the completion of the write (using Janus) but let callers decide which version they want. I may add optional timeouts some time in the future.\r\n\r\nI am going to make both `execute_write()` and `execute_write_fn()` awaitable functions though, for consistency with `.execute()` and to give me flexibility to change how they work in the future.\r\n\r\nI'll also add a `block=True` option to both of them which causes the function to wait for the write to be successfully executed - defaults to `False` (fire-and-forget mode).\r\n", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 569613563, "label": "Mechanism for writing to database via a queue"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/683#issuecomment-590679273", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/683", "id": 590679273, "node_id": "MDEyOklzc3VlQ29tbWVudDU5MDY3OTI3Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-02-25T04:37:21Z", "updated_at": "2020-02-25T04:37:21Z", "author_association": "OWNER", "body": "I'm happy with this now. 
I'm going to merge to master.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 570101428, "label": ".execute_write() and .execute_write_fn() methods on Database"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/675#issuecomment-592399256", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/675", "id": 592399256, "node_id": "MDEyOklzc3VlQ29tbWVudDU5MjM5OTI1Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-02-28T08:09:12Z", "updated_at": "2020-02-28T08:09:12Z", "author_association": "OWNER", "body": "Sure, `--cp` looks good to me.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 567902704, "label": "--cp option for datasette publish and datasette package for shipping additional files and directories"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/394#issuecomment-603631640", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/394", "id": 603631640, "node_id": "MDEyOklzc3VlQ29tbWVudDYwMzYzMTY0MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-03-25T04:19:08Z", "updated_at": "2020-03-25T04:19:08Z", "author_association": "OWNER", "body": "Shipped in 0.39: https://datasette.readthedocs.io/en/latest/changelog.html#v0-39", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 396212021, "label": "base_url configuration setting"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/717#issuecomment-610076073", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/717", "id": 610076073, "node_id": "MDEyOklzc3VlQ29tbWVudDYxMDA3NjA3Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-04-06T22:47:21Z", "updated_at": "2020-04-06T22:47:21Z", "author_association": "OWNER", "body": "I'm confident it's possible to create a plugin that deploys to Now v2 now. I'll do the rest of the work in a separate repo: https://github.com/simonw/datasette-publish-now", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 594189527, "label": "See if I can get Datasette working on Zeit Now v2"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/731#issuecomment-618155472", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/731", "id": 618155472, "node_id": "MDEyOklzc3VlQ29tbWVudDYxODE1NTQ3Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-04-23T03:28:42Z", "updated_at": "2020-04-23T03:28:56Z", "author_association": "OWNER", "body": "As an alternative to `--static` this could work by letting you create the following:\r\n\r\n- `static/css/`\r\n- `static/js/`\r\n\r\nWhich would be automatically mounted at `/js/...` and `/css/...`\r\n\r\nOr maybe just mount `static/` at `/static/` instead? 
", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 605110015, "label": "Option to automatically configure based on directory layout"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/16#issuecomment-623807568", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/16", "id": 623807568, "node_id": "MDEyOklzc3VlQ29tbWVudDYyMzgwNzU2OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-05-05T02:56:06Z", "updated_at": "2020-05-05T02:56:06Z", "author_association": "MEMBER", "body": "I'm pretty sure this is what I'm after. The `groups` table has what looks like identified labels in the rows with category = 2025:\r\n\r\n\"words__groups__2_528_rows_where_where_category___2025\"\r\n\r\nThen there's a `ga` table that maps groups to assets:\r\n\r\n\"words__ga__633_653_rows\"\r\n\r\nAnd an `assets` table which looks like it has one row for every one of my photos:\r\n\r\n\"words__assets__40_419_rows\"\r\n\r\nOne major challenge: these UUIDs are split into two integer numbers, `uuid_0` and `uuid_1` - but the main photos database uses regular UUIDs like this:\r\n\r\n![image](https://user-images.githubusercontent.com/9599/81031481-39164280-8e41-11ea-983b-005ced641a18.png)\r\n\r\nI need to figure out how to match up these two different UUID representations. I asked on Twitter if anyone has any ideas: https://twitter.com/simonw/status/1257500689019703296", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 612287234, "label": "Import machine-learning detected labels (dog, llama etc) from Apple Photos"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/758#issuecomment-624797119", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/758", "id": 624797119, "node_id": "MDEyOklzc3VlQ29tbWVudDYyNDc5NzExOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-05-06T17:53:46Z", "updated_at": "2020-05-06T17:53:46Z", "author_association": "OWNER", "body": "It's interesting to hear from someone who's using this feature - I'm considering moving it out into a plugin #647.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 612382643, "label": "Question: Access to immutable database-path"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/757#issuecomment-624821090", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/757", "id": 624821090, "node_id": "MDEyOklzc3VlQ29tbWVudDYyNDgyMTA5MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-05-06T18:41:29Z", "updated_at": "2020-05-06T18:41:29Z", "author_association": "OWNER", "body": "OK, I just released 0.41 with that and a bunch of other stuff: https://datasette.readthedocs.io/en/latest/changelog.html#v0-41", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 612378203, "label": "Question: Any fixed date for the release with the uft8-encoding fix?"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-626395209", "issue_url": 
"https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21", "id": 626395209, "node_id": "MDEyOklzc3VlQ29tbWVudDYyNjM5NTIwOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-05-10T21:52:42Z", "updated_at": "2020-05-10T21:52:42Z", "author_association": "MEMBER", "body": "Aha! It looks like I accidentally installed the old bplist into the same environment:\r\n```\r\n$ pip freeze | grep bpylist\r\nbpylist==0.1.4\r\nbpylist2==3.0.0\r\n```", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 615474990, "label": "bpylist.archiver.CircularReference: archive has a cycle with uid(13)"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/dogsheep-photos/issues/21#issuecomment-626395781", "issue_url": "https://api.github.com/repos/dogsheep/dogsheep-photos/issues/21", "id": 626395781, "node_id": "MDEyOklzc3VlQ29tbWVudDYyNjM5NTc4MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-05-10T21:57:09Z", "updated_at": "2020-05-10T21:57:09Z", "author_association": "MEMBER", "body": "Yes, I just recreated my virtual environment from scratch and the error went away.\r\n\r\nThe problem occurred when I ran `pip install datasette-bplist` in the same virtual environment - https://github.com/simonw/datasette-bplist/blob/master/setup.py depends on `bpylist` which is incompatible with `bpylist2`.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 615474990, "label": "bpylist.archiver.CircularReference: archive has a cycle with uid(13)"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/868#issuecomment-650600606", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/868", "id": 650600606, "node_id": "MDEyOklzc3VlQ29tbWVudDY1MDYwMDYwNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-06-27T18:44:28Z", "updated_at": "2020-06-27T18:44:28Z", "author_association": "OWNER", "body": "This is really exciting! Thanks so much for looking into this.\r\n\r\nI'm interested in moving CI for this repo over to GitHub Actions, so I'd be fine with you getting this to work as an Action rather than through Travis. If you can get it working in Travis though I'll happily land that and figure out how to convert that to GitHub Actions later on.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 646448486, "label": "initial windows ci setup"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/877#issuecomment-652520496", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/877", "id": 652520496, "node_id": "MDEyOklzc3VlQ29tbWVudDY1MjUyMDQ5Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-07-01T16:26:52Z", "updated_at": "2020-07-01T16:26:52Z", "author_association": "OWNER", "body": "Tokens get verified by plugins. So far there's only one: https://github.com/simonw/datasette-auth-tokens - which has you hard-coding plugins in a configuration file. 
I have a issue there to add support for database-backed tokens too: https://github.com/simonw/datasette-auth-tokens/issues/1", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 648421105, "label": "Consider dropping explicit CSRF protection entirely?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/121#issuecomment-655673896", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/121", "id": 655673896, "node_id": "MDEyOklzc3VlQ29tbWVudDY1NTY3Mzg5Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-07-08T18:08:11Z", "updated_at": "2020-07-08T18:08:11Z", "author_association": "OWNER", "body": "I'm with you on most of this. Completely agreed that the CLI should do everything in a transaction.\r\n\r\nThe one thing I'm not keen on is forcing calling code to explicitly start a transaction, for a couple of reasons:\r\n\r\n1. It will break all of the existing code out there\r\n2. It doesn't match to how I most commonly use this library - as an interactive tool in a Jupyter notebook, where I'm generally working against a brand new scratch database and any errors don't actually matter\r\n\r\nSo... how about this: IF you wrap your code in a `with db:` block then the `.insert()` and suchlike methods expect you to manage transactions yourself. But if you don't use the context manager they behave like they do at the moment (or maybe a bit more sensibly).\r\n\r\nThat way existing code works as it does today, lazy people like me can call `.insert()` without thinking about transactions, but people writing actual production code (as opposed to Jupyter hacks) have a sensible way to take control of the transactions themselves.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 652961907, "label": "Improved (and better documented) support for transactions"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/942#issuecomment-675718593", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/942", "id": 675718593, "node_id": "MDEyOklzc3VlQ29tbWVudDY3NTcxODU5Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-08-18T21:02:11Z", "updated_at": "2020-08-18T21:02:24Z", "author_association": "OWNER", "body": "Easiest solution: if you provide column metadata it gets displayed above the table, something like on https://fivethirtyeight.datasettes.com/fivethirtyeight/antiquities-act%2Factions_under_antiquities_act\r\n\r\n\"fivethirtyeight__antiquities-act_actions_under_antiquities_act__344_rows\"\r\n\r\nHTML `title=` tooltips are also added to the table headers, which won't be visible on touch devices but that's OK because the information is visible on the page already.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 681334912, "label": "Support column descriptions in metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/142#issuecomment-683173375", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/142", "id": 683173375, "node_id": "MDEyOklzc3VlQ29tbWVudDY4MzE3MzM3NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-08-28T22:29:02Z", 
"updated_at": "2020-08-28T22:29:02Z", "author_association": "OWNER", "body": "Yeah I think that failure is actually because there's a brand new release of Black out and it subtly changes some of the formatting rules. I'll merge this and then run Black against the entire codebase.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 688386219, "label": "insert_all(..., alter=True) should work for new columns introduced after the first 100 records"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/159#issuecomment-693199049", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/159", "id": 693199049, "node_id": "MDEyOklzc3VlQ29tbWVudDY5MzE5OTA0OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-09-16T06:20:26Z", "updated_at": "2020-09-16T06:20:26Z", "author_association": "OWNER", "body": "See #121 - I need to think harder about how this all interacts with transactions.\r\n\r\nYou can do this:\r\n\r\n```python\r\nwith db.conn:\r\n db[\"mytable\"].delete_where()\r\n```\r\nBut that should be documented and maybe rethought.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 702386948, "label": ".delete_where() does not auto-commit (unlike .insert() or .upsert())"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/970#issuecomment-695896557", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/970", "id": 695896557, "node_id": "MDEyOklzc3VlQ29tbWVudDY5NTg5NjU1Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-09-21T04:40:12Z", "updated_at": "2020-09-21T04:40:12Z", "author_association": "OWNER", "body": "The Python standard library has a module for this: https://docs.python.org/3/library/webbrowser.html", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 705108492, "label": "request an \"-o\" option on \"datasette server\" to open the default browser at the running url"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/178#issuecomment-701627158", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/178", "id": 701627158, "node_id": "MDEyOklzc3VlQ29tbWVudDcwMTYyNzE1OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-09-30T20:29:11Z", "updated_at": "2020-09-30T20:29:11Z", "author_association": "OWNER", "body": "Thanks for the fix!", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 709043182, "label": "Update README.md"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/986#issuecomment-702265255", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/986", "id": 702265255, "node_id": "MDEyOklzc3VlQ29tbWVudDcwMjI2NTI1NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-10-01T16:51:45Z", "updated_at": "2020-10-01T16:51:45Z", "author_association": "OWNER", "body": "Thanks for taking a look! 
The fix ended up being a little different from this because I still want to disable faceting on regular single primary keys (since faceting by those won't ever produce interesting results) - here's what I used: https://github.com/simonw/datasette/commit/5d6bc4c268f9f155e59561671f8617addd3e91bc", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712889459, "label": "Allow facet by primary keys, fixes #985"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/778#issuecomment-702493047", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/778", "id": 702493047, "node_id": "MDEyOklzc3VlQ29tbWVudDcwMjQ5MzA0Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-10-02T02:26:25Z", "updated_at": "2020-10-02T02:26:25Z", "author_association": "OWNER", "body": "I think this could work for arbitrary SQL queries too. Those would need querystring configuration that specifies which sorted column(s) should be used for the \"next\" cursor.\r\n\r\nOne example: I'd like to be able to offer a paginated list of counts of values in a table - e.g. this query:\r\n\r\nhttps://fivethirtyeight.datasettes.com/fivethirtyeight?sql=select+replies%2C+count%28*%29+from+%5Btwitter-ratio%2Fsenators%5D+group+by+replies+order+by+count%28*%29+desc%3B\r\n\r\nThat could even become a query that gets linked to from the column actions menu.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 626211658, "label": "Ability to configure keyset pagination for views and queries"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/991#issuecomment-712317638", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/991", "id": 712317638, "node_id": "MDEyOklzc3VlQ29tbWVudDcxMjMxNzYzOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-10-19T17:30:56Z", "updated_at": "2020-10-19T17:30:56Z", "author_association": "OWNER", "body": "https://biglocal.datasettes.com/ is one of my larger Datasettes in terms of number of databases.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 714377268, "label": "Redesign application homepage"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1044#issuecomment-715584579", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1044", "id": 715584579, "node_id": "MDEyOklzc3VlQ29tbWVudDcxNTU4NDU3OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-10-23T20:53:01Z", "updated_at": "2020-10-23T20:53:01Z", "author_association": "OWNER", "body": "Thanks for this!", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 727916744, "label": "Add minimum supported python"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1043#issuecomment-715585140", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1043", "id": 715585140, "node_id": "MDEyOklzc3VlQ29tbWVudDcxNTU4NTE0MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-10-23T20:54:29Z", "updated_at": "2020-10-23T20:54:29Z", 
"author_association": "OWNER", "body": "Thanks. I'll push a source release of `asgi-csrf`.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 727915394, "label": "Include LICENSE in sdist"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1033#issuecomment-716048564", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1033", "id": 716048564, "node_id": "MDEyOklzc3VlQ29tbWVudDcxNjA0ODU2NA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-10-24T20:08:31Z", "updated_at": "2020-10-24T20:08:31Z", "author_association": "OWNER", "body": "Documentation here: https://docs.datasette.io/en/latest/internals.html#datasette-urls", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 725099777, "label": "datasette.urls.static_plugins(...) method"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1050#issuecomment-718342036", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1050", "id": 718342036, "node_id": "MDEyOklzc3VlQ29tbWVudDcxODM0MjAzNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-10-29T03:49:57Z", "updated_at": "2020-10-29T03:49:57Z", "author_association": "OWNER", "body": "@thadk from that error it looks like the problem may have been that you had a BLOB column containing a `null` value? If so that's definitely a bug, I'll fix that.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 729057388, "label": "Switch to .blob render extension for BLOB downloads"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/865#issuecomment-726412057", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/865", "id": 726412057, "node_id": "MDEyOklzc3VlQ29tbWVudDcyNjQxMjA1Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-11-12T23:49:23Z", "updated_at": "2020-11-12T23:49:23Z", "author_association": "OWNER", "body": "@tballison thanks, I've split that out into a new issue #1091", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 644582921, "label": "base_url doesn't seem to work when adding criteria and clicking \"apply\""}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1114#issuecomment-735443626", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1114", "id": 735443626, "node_id": "MDEyOklzc3VlQ29tbWVudDczNTQ0MzYyNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-11-29T19:40:49Z", "updated_at": "2020-11-29T19:40:49Z", "author_association": "OWNER", "body": "Fix is out in 0.52.1: https://docs.datasette.io/en/latest/changelog.html#v0-52-1", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 752966476, "label": "--load-extension=spatialite not working with datasetteproject/datasette docker image"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/942#issuecomment-737463116", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/942", "id": 737463116, "node_id": "MDEyOklzc3VlQ29tbWVudDczNzQ2MzExNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-12-02T20:02:10Z", "updated_at": "2020-12-02T20:03:01Z", "author_association": "OWNER", "body": "My idea is that if you installed my proposed plugin you wouldn't need `metadata.json` at all - your metadata would instead live in a table in the connected SQLite database files - either one table per database (so the metadata can live in the same place as the data) or maybe also in a dedicated separate database file, for if you want to add metadata to an otherwise read-only database.\r\n\r\nThe plugin would then provide a UI for editing that metadata - maybe by configuring some writable canned queries or maybe something more custom than that. Or you could edit the metadata by manually editing the SQLite database file (or loading data into it using a tool like [yaml-to-sqlite](https://github.com/simonw/yaml-to-sqlite)).", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 681334912, "label": "Support column descriptions in metadata.json"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/111#issuecomment-738904347", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/111", "id": 738904347, "node_id": "MDEyOklzc3VlQ29tbWVudDczODkwNDM0Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-12-04T17:16:56Z", "updated_at": "2020-12-04T17:16:56Z", "author_association": "OWNER", "body": "This is STILL a good idea.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 274615452, "label": "Add \u201cupdated\u201d to metadata"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1128#issuecomment-739355855", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1128", "id": 739355855, "node_id": "MDEyOklzc3VlQ29tbWVudDczOTM1NTg1NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-12-05T19:34:57Z", "updated_at": "2020-12-05T19:34:57Z", "author_association": "OWNER", "body": "Thanks for this!", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 756867924, "label": "Fix startup error on windows"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1142#issuecomment-744563209", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1142", "id": 744563209, "node_id": "MDEyOklzc3VlQ29tbWVudDc0NDU2MzIwOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-12-14T16:41:11Z", "updated_at": "2020-12-14T16:41:11Z", "author_association": "OWNER", "body": "To check out and start the server:\r\n\r\n /tmp % git clone git@github.com:nitinpaul/datasette\r\n Cloning into 'datasette'...\r\n remote: Enumerating objects: 124, done.\r\n # ...\r\n datasette % python3 -m venv venv\r\n datasette % source venv/bin/activate\r\n (venv) datasette % pip install -e '.[test]'\r\n Obtaining file:///private/tmp/datasette\r\n Collecting asgiref<3.4.0,>=3.2.10\r\n Using cached asgiref-3.3.1-py3-none-any.whl (19 kB)\r\n # ...\r\n (venv) datasette % datasette\r\n INFO: Started server process [24002]\r\n INFO: 
Waiting for application startup.\r\n INFO: Application startup complete.\r\n INFO: Uvicorn running on http://127.0.0.1:8001 (Press CTRL+C to quit)\r\n\r\nAnd to run the tests:\r\n\r\n (venv) datasette % pytest\r\n ======================================================================== test session starts ========================================================================\r\n platform darwin -- Python 3.9.1, pytest-6.1.2, py-1.10.0, pluggy-0.13.1\r\n SQLite: 3.34.0\r\n rootdir: /private/tmp/datasette, configfile: pytest.ini\r\n plugins: asyncio-0.14.0, timeout-1.4.2\r\n collected 841 items \r\n\r\n tests/test_package.py .. [ 0%]\r\n", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 763361458, "label": "\"Stream all rows\" is not at all obvious"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1148#issuecomment-747062909", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1148", "id": 747062909, "node_id": "MDEyOklzc3VlQ29tbWVudDc0NzA2MjkwOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-12-16T21:51:54Z", "updated_at": "2020-12-16T21:51:54Z", "author_association": "OWNER", "body": "This is a really frustrating bug with Vercel: https://github.com/simonw/datasette-publish-vercel/issues/28\r\n\r\n`+` characters in URLs get translated into spaces before they get to Datasette. They know about the bug and said they were working on a fix a few months ago, but looks like it's still a problem.\r\n\r\nA workaround is to avoid `+` and use `-` instead - I think this SQL query does the same thing as yours:\r\n\r\nhttps://aws-partners-singapore.vercel.app/partners?sql=select%0D%0A++A.launch_rank%2C%0D%0A++A.partner_info%0D%0Afrom%0D%0A++summary+A%0D%0A++INNER+JOIN+summary+B+ON+A.launch_rank+%3E%3D+B.launch_rank+-+3%0D%0A++AND+A.launch_rank+-4+%3C%3D+B.launch_rank%0D%0AWHERE%0D%0A++B.%22partner_info%22+LIKE+%27%25Palo+Alto%25%27\r\n\r\n```sql\r\nselect\r\n A.launch_rank,\r\n A.partner_info\r\nfrom\r\n summary A\r\n INNER JOIN summary B ON A.launch_rank >= B.launch_rank - 3\r\n AND A.launch_rank -4 <= B.launch_rank\r\nWHERE\r\n B.\"partner_info\" LIKE '%Palo Alto%'\r\n```\r\nI've been moving projects from Vercel to Cloud Run when they run into this, but that's not a great situation to be in.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 767561886, "label": "Syntax error with + symbol when deployed to Vercel"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1149#issuecomment-747207787", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1149", "id": 747207787, "node_id": "MDEyOklzc3VlQ29tbWVudDc0NzIwNzc4Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-12-17T05:06:16Z", "updated_at": "2020-12-17T05:06:16Z", "author_association": "OWNER", "body": "So, an idea: what if Datasette's default CSS applied only to elements with classes - or maybe to childen of a `body class=\"datasette\"` element? 
In such a way that you could write your own custom HTML that reused elements of Datasette's CSS - the cog menu styling for example - but only on an opt-in basis?", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 769520939, "label": "Make it easier to theme Datasette with CSS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1158#issuecomment-750390741", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1158", "id": 750390741, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MDM5MDc0MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-12-23T17:05:32Z", "updated_at": "2020-12-23T17:05:32Z", "author_association": "OWNER", "body": "Thanks for this!\r\n\r\nI'm fine keeping the `os.path` stuff as is.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 773913793, "label": "Modernize code to Python 3.6+"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/987#issuecomment-752714747", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/987", "id": 752714747, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MjcxNDc0Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-12-30T18:23:08Z", "updated_at": "2020-12-30T18:23:20Z", "author_association": "OWNER", "body": "In terms of \"places to put your plugin content\", the simplest solution I can think of is something like this:\r\n```html\r\n
\r\n```\r\nAlternative designs:\r\n\r\n- A documented JavaScript function that returns the CSS selector where plugins should put their content\r\n- A documented JavaScript function that returns a DOM node where plugins should put their content. This would allow the JavaScript to create the element if it does not already exist (though it wouldn't be obvious WHERE that element should be created)\r\n- Documented JavaScript functions for things like \"append this node/HTML to the place-where-plugins-go\"\r\n\r\nI think the original option - an empty `
` with a known `id` attribute - is the right one to go with here. It's the simplest, it's very easy for custom template authors to understand and it acknowledges that plugins may have all kinds of extra crazy stuff they want to do - like checking in that div to see if another plugin has written to it already, for example.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 712984738, "label": "Documented HTML hooks for JavaScript plugin authors"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1169#issuecomment-753653260", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1169", "id": 753653260, "node_id": "MDEyOklzc3VlQ29tbWVudDc1MzY1MzI2MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-03T17:54:40Z", "updated_at": "2021-01-03T17:54:40Z", "author_association": "OWNER", "body": "And @benpickles yes I would land that pull request straight away as-is. Thanks!", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 777677671, "label": "Prettier package not actually being cached"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/913#issuecomment-754187326", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/913", "id": 754187326, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDE4NzMyNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T20:03:50Z", "updated_at": "2021-01-04T20:03:50Z", "author_association": "OWNER", "body": "I renamed `--config` to `--setting` and changed it to work like this:\r\n\r\n datasette --setting sql_time_limit_ms 1000\r\n\r\nNote the lack of colons.\r\n\r\nThis actually makes colons cleaner to use for plugins - I could support this:\r\n\r\n datasette --setting datasette-insert:unsafe 1", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 670209331, "label": "Mechanism for passing additional options to `datasette my.db` that affect plugins"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/93#issuecomment-754215392", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/93", "id": 754215392, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDIxNTM5Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-04T20:59:20Z", "updated_at": "2021-01-04T21:03:14Z", "author_association": "OWNER", "body": "Updated `pyinstaller` recipe - lots of hidden imports needed now:\r\n```\r\npip install wheel\r\npip install datasette pyinstaller\r\n\r\nBASE=$(python -c 'import os; print(os.path.dirname(__import__(\"datasette\").__file__))') \\\r\n pyinstaller -F \\\r\n --add-data \"$BASE/templates:datasette/templates\" \\\r\n --add-data \"$BASE/static:datasette/static\" \\\r\n --hidden-import datasette.publish \\\r\n --hidden-import datasette.publish.heroku \\\r\n --hidden-import datasette.publish.cloudrun \\\r\n --hidden-import datasette.facets \\\r\n --hidden-import datasette.sql_functions \\\r\n --hidden-import datasette.actor_auth_cookie \\\r\n --hidden-import datasette.default_permissions \\\r\n --hidden-import datasette.default_magic_parameters \\\r\n --hidden-import datasette.blob_renderer \\\r\n --hidden-import 
datasette.default_menu_links \\\r\n --hidden-import uvicorn \\\r\n --hidden-import uvicorn.logging \\\r\n --hidden-import uvicorn.loops \\\r\n --hidden-import uvicorn.loops.auto \\\r\n --hidden-import uvicorn.protocols \\\r\n --hidden-import uvicorn.protocols.http \\\r\n --hidden-import uvicorn.protocols.http.auto \\\r\n --hidden-import uvicorn.protocols.websockets \\\r\n --hidden-import uvicorn.protocols.websockets.auto \\\r\n --hidden-import uvicorn.lifespan \\\r\n --hidden-import uvicorn.lifespan.on \\\r\n $(which datasette)\r\n```", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 273944952, "label": "Package as standalone binary"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/657#issuecomment-761179229", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/657", "id": 761179229, "node_id": "MDEyOklzc3VlQ29tbWVudDc2MTE3OTIyOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-15T20:24:35Z", "updated_at": "2021-01-15T20:24:35Z", "author_association": "OWNER", "body": "I'm not sure how I missed this issue but it's almost a year later and I'm finally taking a look at your Parquet work.\r\n\r\nThis is yet more evidence that allowing plugins to provide their own custom `Database` objects would be a good idea.\r\n\r\nI started exploring what Datasette would like on PostgreSQL in #670 - my concern was that I would need to add a large amount of database abstraction code which would dramatically increase the complexity of the core project, but my thinking now is that it might be tractable - Datasette doesn't actually construct SQL in complex ways anywhere outside of the `TableView` class so abstracting away just that bit should be feasible.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 548591089, "label": "Allow creation of virtual tables at startup"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1209#issuecomment-769455370", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1209", "id": 769455370, "node_id": "MDEyOklzc3VlQ29tbWVudDc2OTQ1NTM3MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-28T23:00:21Z", "updated_at": "2021-01-28T23:00:21Z", "author_association": "OWNER", "body": "Good catch on the workaround here. The root problem is that `datasette-template-sql` looks for the first available databsae if you don't provide it with a `database=` argument, and in Datasette 0.54 the first available database changed to being the new `_internal` database.\r\n\r\nIs this a bug? I think it is - because the documented behaviour on https://docs.datasette.io/en/stable/internals.html#get-database-name is this:\r\n\r\n> `name` - string, optional\r\n>\r\n> The name to be used for this database - this will be used in the URL path, e.g. `/dbname`. 
If not specified Datasette will pick one based on the filename or memory name.\r\n\r\nSince the new behaviour differs from what was in the documentation I'm going to treat this as a bug and fix it.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 795367402, "label": "v0.54 500 error from sql query in custom template; code worked in v0.53; found a workaround"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/60#issuecomment-770071568", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/60", "id": 770071568, "node_id": "MDEyOklzc3VlQ29tbWVudDc3MDA3MTU2OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-01-29T21:56:15Z", "updated_at": "2021-01-29T21:56:15Z", "author_association": "MEMBER", "body": "I really like the way you're using pipes here - really smart. It's similar to how I build the demo database in this GitHub Actions workflow:\r\n\r\nhttps://github.com/dogsheep/github-to-sqlite/blob/62dfd3bc4014b108200001ef4bc746feb6f33b45/.github/workflows/deploy-demo.yml#L52-L82\r\n\r\n`twitter-to-sqlite` actually has a mechanism for doing this kind of thing, documented at https://github.com/dogsheep/twitter-to-sqlite#providing-input-from-a-sql-query-with---sql-and---attach\r\n\r\nIt lets you do things like:\r\n\r\n```\r\n$ twitter-to-sqlite users-lookup my.db --sql=\"select follower_id from following\" --ids\r\n```\r\nMaybe I should add something similar to `github-to-sqlite`? Feels like it could be really useful.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 797097140, "label": "Use Data from SQLite in other commands"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1200#issuecomment-777178728", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1200", "id": 777178728, "node_id": "MDEyOklzc3VlQ29tbWVudDc3NzE3ODcyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-11T03:13:59Z", "updated_at": "2021-02-11T03:13:59Z", "author_association": "OWNER", "body": "I came up with the need for this while playing with this tool: 
https://calands.datasettes.com/calands?sql=select%0D%0A++AsGeoJSON(geometry)%2C+*%0D%0Afrom%0D%0A++CPAD_2020a_SuperUnits%0D%0Awhere%0D%0A++PARK_NAME+like+'%25mini%25'+and%0D%0A++Intersects(GeomFromGeoJSON(%3Afreedraw)%2C+geometry)+%3D+1%0D%0A++and+CPAD_2020a_SuperUnits.rowid+in+(%0D%0A++++select%0D%0A++++++rowid%0D%0A++++from%0D%0A++++++SpatialIndex%0D%0A++++where%0D%0A++++++f_table_name+%3D+'CPAD_2020a_SuperUnits'%0D%0A++++++and+search_frame+%3D+GeomFromGeoJSON(%3Afreedraw)%0D%0A++)&freedraw={\"type\"%3A\"MultiPolygon\"%2C\"coordinates\"%3A[[[[-122.42202758789064%2C37.82280243352759]%2C[-122.39868164062501%2C37.823887203271454]%2C[-122.38220214843751%2C37.81846319511331]%2C[-122.35061645507814%2C37.77071473849611]%2C[-122.34924316406251%2C37.74465712069939]%2C[-122.37258911132814%2C37.703380457832374]%2C[-122.39044189453125%2C37.690340943717715]%2C[-122.41241455078126%2C37.680559803205135]%2C[-122.44262695312501%2C37.67295135774715]%2C[-122.47283935546876%2C37.67295135774715]%2C[-122.52502441406251%2C37.68382032669382]%2C[-122.53463745117189%2C37.6892542140253]%2C[-122.54699707031251%2C37.690340943717715]%2C[-122.55798339843751%2C37.72945260537781]%2C[-122.54287719726564%2C37.77831314799672]%2C[-122.49893188476564%2C37.81303878836991]%2C[-122.46185302734376%2C37.82822612280363]%2C[-122.42889404296876%2C37.82822612280363]%2C[-122.42202758789064%2C37.82280243352759]]]]} - before I fixed https://github.com/simonw/datasette-leaflet-geojson/issues/16 it was loading a LOT of maps, which felt bad. I wanted to be able to link people to that page with a hard limit on the number of rows displayed on that page.\r\n\r\nIt's mainly to guard against unexpected behaviour from limit-less queries though. It's not a very high priority feature!", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 792890765, "label": "?_size=10 option for the arbitrary query page would be useful"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/evernote-to-sqlite/issues/11#issuecomment-777798330", "issue_url": "https://api.github.com/repos/dogsheep/evernote-to-sqlite/issues/11", "id": 777798330, "node_id": "MDEyOklzc3VlQ29tbWVudDc3Nzc5ODMzMA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-11T21:18:58Z", "updated_at": "2021-02-11T21:18:58Z", "author_association": "MEMBER", "body": "Thanks for the fix!", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 792851444, "label": "XML parse error"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/131#issuecomment-778510528", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/131", "id": 778510528, "node_id": "MDEyOklzc3VlQ29tbWVudDc3ODUxMDUyOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-12T23:25:06Z", "updated_at": "2021-02-12T23:25:06Z", "author_association": "OWNER", "body": "If `-c` isn't available, maybe `-t` or `--type` would work for specifying column types:\r\n```\r\nsqlite-utils insert db.db images images.tsv \\\r\n --tsv \\\r\n --type id int \\\r\n --type score float\r\n```\r\nor\r\n```\r\nsqlite-utils insert db.db images images.tsv \\\r\n --tsv \\\r\n -t id int \\\r\n -t score float\r\n```", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, 
\"rocket\": 0, \"eyes\": 0}", "issue": {"value": 675753042, "label": "sqlite-utils insert: options for column types"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/782#issuecomment-782789598", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/782", "id": 782789598, "node_id": "MDEyOklzc3VlQ29tbWVudDc4Mjc4OTU5OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-21T03:30:02Z", "updated_at": "2021-02-21T03:30:02Z", "author_association": "OWNER", "body": "Another benefit to default:object - I could include a key that shows a list of available extras. I could then use that to power an interactive API explorer.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 627794879, "label": "Redesign default .json format"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1241#issuecomment-784567547", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1241", "id": 784567547, "node_id": "MDEyOklzc3VlQ29tbWVudDc4NDU2NzU0Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-23T22:45:56Z", "updated_at": "2021-02-23T22:46:12Z", "author_association": "OWNER", "body": "I really like the way the Share feature on Stack Overflow works: https://stackoverflow.com/questions/18934149/how-can-i-use-postgresqls-text-column-type-in-django\r\n", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 814595021, "label": "Share button for copying current URL"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/google-takeout-to-sqlite/pull/5#issuecomment-786925280", "issue_url": "https://api.github.com/repos/dogsheep/google-takeout-to-sqlite/issues/5", "id": 786925280, "node_id": "MDEyOklzc3VlQ29tbWVudDc4NjkyNTI4MA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-26T22:23:10Z", "updated_at": "2021-02-26T22:23:10Z", "author_association": "MEMBER", "body": "Thanks!\r\n\r\nI requested my Gmail export from takeout - once that arrives I'll test it against this and then merge the PR.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 813880401, "label": "WIP: Add Gmail takeout mbox import"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/838#issuecomment-795895436", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/838", "id": 795895436, "node_id": "MDEyOklzc3VlQ29tbWVudDc5NTg5NTQzNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-03-10T18:44:46Z", "updated_at": "2021-03-10T18:44:57Z", "author_association": "OWNER", "body": "Let's reopen this.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 637395097, "label": "Incorrect URLs when served behind a proxy with base_url set"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/249#issuecomment-803501756", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/249", "id": 803501756, "node_id": "MDEyOklzc3VlQ29tbWVudDgwMzUwMTc1Ng==", "user": {"value": 9599, "label": "simonw"}, 
"created_at": "2021-03-21T02:33:45Z", "updated_at": "2021-03-21T02:33:45Z", "author_association": "OWNER", "body": "Did you run `enable-fts` before you inserted the data?\r\n\r\nIf so you'll need to run `populate-fts` after the insert to populate the FTS index.\r\n\r\nA better solution may be to add `--create-triggers` to the `enable-fts` command to add triggers that will automatically keep the index updated as you insert new records.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 836963850, "label": "Full text search possibly broken?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1153#issuecomment-805109341", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1153", "id": 805109341, "node_id": "MDEyOklzc3VlQ29tbWVudDgwNTEwOTM0MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-03-23T17:55:48Z", "updated_at": "2021-03-23T18:41:57Z", "author_association": "OWNER", "body": "Beginnings of a UI element for switching between them:\r\n```html\r\n
\r\nJSON\r\nYAML\r\n
\r\n```\r\n\"Metadata_\u2014_Datasette_documentation\"\r\n\r\nThat `
` has a padding of 12px, so using 12px padding on the tab links should get them to line up better.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 771202454, "label": "Use YAML examples in documentation by default, not JSON"}, "performed_via_github_app": null}
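The tab element described above switches the documentation example between JSON and YAML views of the same metadata. As a minimal sketch of that equivalence (the metadata content here is made up, and PyYAML is assumed to be installed), the conversion is just a load/dump round trip:

```python
import json
import yaml  # PyYAML, assumed available

# Hypothetical metadata example, shown in both formats.
metadata_json = '{"title": "My Datasette", "databases": {"fixtures": {"source": "tests"}}}'

as_dict = json.loads(metadata_json)
print(yaml.safe_dump(as_dict, sort_keys=False))
# title: My Datasette
# databases:
#   fixtures:
#     source: tests
```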
{"html_url": "https://github.com/simonw/datasette/pull/1260#issuecomment-808988697", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1260", "id": 808988697, "node_id": "MDEyOklzc3VlQ29tbWVudDgwODk4ODY5Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-03-29T00:22:21Z", "updated_at": "2021-03-29T00:22:21Z", "author_association": "OWNER", "body": "This is interesting!\r\n\r\nI've decided to apply a subset of these - the `if` and `elif` blocks are a deliberate style choice from me, because I find code clearer when it has if/else as opposed to relying on early termination. Likewise the iteration against `.keys()` on dictionaries.\r\n\r\nI like the other fixes though, I'm about to land them in a separate commit that credits you.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 831163537, "label": "Fix: code quality issues"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/pull/1031#issuecomment-809010713", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1031", "id": 809010713, "node_id": "MDEyOklzc3VlQ29tbWVudDgwOTAxMDcxMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-03-29T01:46:45Z", "updated_at": "2021-03-29T01:46:45Z", "author_association": "OWNER", "body": "Sorry I didn't get to this PR sooner. I've joint-credited you in the release notes for this fix: https://docs.datasette.io/en/stable/changelog.html#v0-56", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 724369025, "label": "Fallback to databases in inspect-data.json when no -i options are passed"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/696#issuecomment-809548363", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/696", "id": 809548363, "node_id": "MDEyOklzc3VlQ29tbWVudDgwOTU0ODM2Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-03-29T17:04:19Z", "updated_at": "2021-03-29T17:04:19Z", "author_association": "OWNER", "body": "I tried this just now against Datasette 0.56 with the new Dockerfile from #1249 (that uses SQLite and SpatiaLite installed with `apt-get install`) and the tests all passed.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 576722115, "label": "Single failing unit test when run inside the Docker image"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1284#issuecomment-810740486", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1284", "id": 810740486, "node_id": "MDEyOklzc3VlQ29tbWVudDgxMDc0MDQ4Ng==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-03-31T03:57:55Z", "updated_at": "2021-03-31T03:57:55Z", "author_association": "OWNER", "body": "You're right, doing this is really hard at the moment - I'm not sure I know how I would tackle this either, and it's something I've wanted in the past!\r\n\r\nI'll have a think about this one.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 845794436, "label": "Feature or Documentation Request: Individual table as home page template"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1286#issuecomment-812664443", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1286", "id": 812664443, "node_id": "MDEyOklzc3VlQ29tbWVudDgxMjY2NDQ0Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-04-02T18:52:45Z", "updated_at": "2021-04-02T18:52:51Z", "author_association": "OWNER", "body": "Idea: default to displaying single-dimension JSON arrays of strings as a comma-separated list but show the comma in a different colour - something like this:\r\n\r\n\"fixtures__facetable__15_rows\"\r\n\r\nI used this HTML for the prototype (re-using `.type-int` just to get the colour):\r\n```html\r\ntag1, tag2\r\n```", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 849220154, "label": "Better default display of arrays of items"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/sqlite-utils/pull/258#issuecomment-843702392", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/258", "id": 843702392, "node_id": "MDEyOklzc3VlQ29tbWVudDg0MzcwMjM5Mg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-05-19T02:47:37Z", "updated_at": "2021-05-19T02:47:37Z", "author_association": "OWNER", "body": "I'm going to merge this and add a test - thanks!", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 868191959, "label": "Fixing insert from JSON containing strings with non-ascii characters \u2026"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/sqlite-utils/issues/253#issuecomment-843718859", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/253", "id": 843718859, "node_id": "MDEyOklzc3VlQ29tbWVudDg0MzcxODg1OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-05-19T03:31:47Z", "updated_at": "2021-05-19T03:31:47Z", "author_association": "OWNER", "body": "Fixed: https://simonwillison.net/2020/Sep/23/sqlite-advanced-alter-table/", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 847423559, "label": "fixtures.db example error in sql-utils blog post"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/pull/1352#issuecomment-852673695", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1352", "id": 852673695, "node_id": "MDEyOklzc3VlQ29tbWVudDg1MjY3MzY5NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-06-02T02:52:26Z", "updated_at": "2021-06-02T02:52:26Z", "author_association": "OWNER", "body": "@dependabot recreate", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 908276134, "label": "Bump black from 21.5b1 to 21.5b2"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/526#issuecomment-853567413", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/526", "id": 853567413, "node_id": "MDEyOklzc3VlQ29tbWVudDg1MzU2NzQxMw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-06-03T05:11:27Z", "updated_at": "2021-06-03T05:11:27Z", "author_association": "OWNER", "body": "Another potential way to implement this would be to hold the SQLite connection open and execute the full query there.\r\n\r\nI've avoided this in the past due to concerns of resource exhaustion - if multiple requests attempt this at the same time all of the connections in the pool will become tied up and the site will be unable to respond to further requests.\r\n\r\nBut... now that Datasette has authentication there's the possibility of making this feature only available to specific authenticated users - the `--root` user for example. Which avoids the danger while unlocking a super-useful feature.\r\n\r\nNot to mention people who are running Datasette privately on their own laptop, or the proposed `--query` CLI feature in #1356.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 459882902, "label": "Stream all results for arbitrary SQL and canned queries"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/sqlite-utils/issues/264#issuecomment-853567861", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/264", "id": 853567861, "node_id": "MDEyOklzc3VlQ29tbWVudDg1MzU2Nzg2MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-06-03T05:12:21Z", "updated_at": "2021-06-03T05:12:21Z", "author_association": "OWNER", "body": "I think this is more likely to happen in Datasette than in sqlite-utils - see https://github.com/simonw/datasette/issues/1356 for thoughts on this.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 907642546, "label": "Supporting additional output formats, like GeoJSON"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1375#issuecomment-860230385", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1375", "id": 860230385, "node_id": "MDEyOklzc3VlQ29tbWVudDg2MDIzMDM4NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-06-13T15:37:49Z", "updated_at": "2021-06-13T15:37:49Z", "author_association": "OWNER", "body": "There is a feature for this at the moment, but it's a little bit hidden: you can use `?_json=col` to tell\r\nDatasette that you would like a specific column to be exported as nested JSON: https://docs.datasette.io/en/stable/json_api.html#special-json-arguments\r\n\r\nI considered trying to make this automatic - so it detects columns that appear to contain valid JSON and outputs them as nested objects - but the problem with that is that it can lead to inconsistent results - you might hit the API and find that not every column contains valid JSON (compared to the previous day) resulting in the API retuning  string instead of the expected dictionary and breaking your code.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 919508498, "label": "JSON export dumps JSON fields as TEXT"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/sqlite-utils/issues/272#issuecomment-861987651", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/272", "id": 861987651, "node_id": "MDEyOklzc3VlQ29tbWVudDg2MTk4NzY1MQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-06-16T02:27:20Z", "updated_at": "2021-06-16T02:27:20Z", "author_association": "OWNER", "body": "Solution: `sqlite-utils memory -` attempts to detect the input based on if it starts with a `{` or `[` (likely JSON) or if it doesn't use the `csv.Sniffer()` mechanism. Or you can use `sqlite-utils memory -:csv` to specifically indicate the type of input.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 921878733, "label": "Idea: import CSV to memory, run SQL, export in a single command"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/sqlite-utils/issues/278#issuecomment-864128489", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/278", "id": 864128489, "node_id": "MDEyOklzc3VlQ29tbWVudDg2NDEyODQ4OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-06-18T15:46:24Z", "updated_at": "2021-06-18T15:46:24Z", "author_association": "OWNER", "body": "A workaround could be to define a bash or zsh alias of some sort.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 923697888, "label": "Support db as first parameter before subcommand, or as environment variable"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1396#issuecomment-880326049", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1396", "id": 880326049, "node_id": "MDEyOklzc3VlQ29tbWVudDg4MDMyNjA0OQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-07-15T01:50:05Z", "updated_at": "2021-07-15T01:50:05Z", "author_association": "OWNER", "body": "I think I made a mistake in this commit: https://github.com/simonw/datasette/commit/0486303b60ce2784fd2e2ecdbecf304b7d6e6659\r\n\r\n\"Explicitly_push_version_tag__refs__1281_\u00b7_simonw_datasette_0486303\"\r\n\r\nIt looks like I copied `$VERSION_TAG` from here - but it's not available in the `publish.yml` flow: https://github.com/simonw/datasette/blob/0486303b60ce2784fd2e2ecdbecf304b7d6e6659/.github/workflows/push_docker_tag.yml#L18-L25", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 944903881, "label": "\"invalid reference format\" publishing Docker image"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/sqlite-utils/issues/298#issuecomment-891359751", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/298", "id": 891359751, "node_id": "IC_kwDOCGYnMM41IRIH", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-08-02T21:55:16Z", "updated_at": "2021-08-02T21:55:16Z", "author_association": "OWNER", "body": "This is a feature already! You can do this:\r\n\r\n    sqlite-utils insert nl-demo.db mytable data.ndjson --nl\r\n\r\nSee https://sqlite-utils.datasette.io/en/stable/cli.html#inserting-newline-delimited-json\r\n", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 951581763, "label": "Read lines with JSON object"}, "performed_via_github_app": null}