{"html_url": "https://github.com/simonw/datasette/issues/1715#issuecomment-1106945876", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1715", "id": 1106945876, "node_id": "IC_kwDOBm6k_c5B-qdU", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-22T22:24:29Z", "updated_at": "2022-04-22T22:24:29Z", "author_association": "OWNER", "body": "Looking at the start of `TableView.data()`:\r\n\r\nhttps://github.com/simonw/datasette/blob/d57c347f35bcd8cff15f913da851b4b8eb030867/datasette/views/table.py#L333-L346\r\n\r\nI'm going to resolve `table_name` and `database` from the URL - `table_name` will be a string, `database` will be the DB object returned by `datasette.get_database()`. Then those can be passed in separately too.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1212823665, "label": "Refactor TableView to use asyncinject"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1716#issuecomment-1106923258", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1716", "id": 1106923258, "node_id": "IC_kwDOBm6k_c5B-k76", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-22T22:02:07Z", "updated_at": "2022-04-22T22:02:07Z", "author_association": "OWNER", "body": "https://github.com/simonw/datasette/blame/main/datasette/views/base.py\r\n\r\n\"image\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1212838949, "label": "Configure git blame to ignore Black commit"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1715#issuecomment-1106908642", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1715", "id": 1106908642, "node_id": "IC_kwDOBm6k_c5B-hXi", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-22T21:47:55Z", "updated_at": "2022-04-22T21:47:55Z", "author_association": "OWNER", "body": "I need an `asyncio.Registry` with functions registered to perform the role of the table view.\r\n\r\nSomething like this perhaps:\r\n```python\r\ndef table_html_context(facet_results, query, datasette, rows):\r\n    return {...}\r\n```\r\nThat then gets called like this:\r\n```python\r\nasync def view(request):\r\n    registry = Registry(facet_results, query, datasette, rows)\r\n    context = await registry.resolve(table_html, request=request, datasette=datasette)\r\n    return Response.html(await datasette.render(\"table.html\", context))\r\n```\r\nIt's also interesting to start thinking about this from a Python client library point of view. 
If I'm writing code outside of the HTTP request cycle, what would it look like?\r\n\r\nOne thing I could do: break out is the code that turns a request into a list of pairs extracted from the request - this code here: https://github.com/simonw/datasette/blob/8338c66a57502ef27c3d7afb2527fbc0663b2570/datasette/views/table.py#L442-L449\r\n\r\nI could turn that into a typed dependency injection function like this:\r\n\r\n```python\r\ndef filter_args(request: Request) -> List[Tuple[str, str]]:\r\n # Arguments that start with _ and don't contain a __ are\r\n # special - things like ?_search= - and should not be\r\n # treated as filters.\r\n filter_args = []\r\n for key in request.args:\r\n if not (key.startswith(\"_\") and \"__\" not in key):\r\n for v in request.args.getlist(key):\r\n filter_args.append((key, v))\r\n return filter_args\r\n```\r\nThen I can either pass a `request` into a `.resolve()` call, or I can instead skip that function by passing:\r\n\r\n```python\r\noutput = registry.resolve(table_context, filter_args=[(\"foo\", \"bar\")])\r\n```\r\nI do need to think about where plugins get executed in all of this.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1212823665, "label": "Refactor TableView to use asyncinject"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1101#issuecomment-1105642187", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1101", "id": 1105642187, "node_id": "IC_kwDOBm6k_c5B5sLL", "user": {"value": 25778, "label": "eyeseast"}, "created_at": "2022-04-21T18:59:08Z", "updated_at": "2022-04-21T18:59:08Z", "author_association": "CONTRIBUTOR", "body": "Ha! That was your idea (and a good one).\r\n\r\nBut it's probably worth measuring to see what overhead it adds. It did require both passing in the database and making the whole thing `async`. \r\n\r\nJust timing the queries themselves:\r\n\r\n1. [Using `AsGeoJSON(geometry) as geometry`](https://alltheplaces-datasette.fly.dev/alltheplaces?sql=select%0D%0A++id%2C%0D%0A++properties%2C%0D%0A++AsGeoJSON%28geometry%29+as+geometry%2C%0D%0A++spider%0D%0Afrom%0D%0A++places%0D%0Aorder+by%0D%0A++id%0D%0Alimit%0D%0A++1000) takes 10.235 ms\r\n2. [Leaving as binary](https://alltheplaces-datasette.fly.dev/alltheplaces?sql=select%0D%0A++id%2C%0D%0A++properties%2C%0D%0A++geometry%2C%0D%0A++spider%0D%0Afrom%0D%0A++places%0D%0Aorder+by%0D%0A++id%0D%0Alimit%0D%0A++1000) takes 8.63 ms\r\n\r\nLooking at the network panel:\r\n\r\n1. Takes about 200 ms for the `fetch` request\r\n2. Takes about 300 ms\r\n\r\nI'm not sure how best to time the GeoJSON generation, but it would be interesting to check. Maybe I'll write a plugin to add query times to response headers.\r\n\r\nThe other thing to consider with async streaming is that it might be well-suited for a slower response. When I have to get the whole result and send a response in a fixed amount of time, I need the most efficient query possible. 
If I can hang onto a connection and get things one chunk at a time, maybe it's ok if there's some overhead.\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 749283032, "label": "register_output_renderer() should support streaming data"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1101#issuecomment-1105615625", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1101", "id": 1105615625, "node_id": "IC_kwDOBm6k_c5B5lsJ", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-21T18:31:41Z", "updated_at": "2022-04-21T18:32:22Z", "author_association": "OWNER", "body": "The `datasette-geojson` plugin is actually an interesting case here, because of the way it converts SpatiaLite geometries into GeoJSON: https://github.com/eyeseast/datasette-geojson/blob/602c4477dc7ddadb1c0a156cbcd2ef6688a5921d/datasette_geojson/__init__.py#L61-L66\r\n\r\n```python\r\n\r\n if isinstance(geometry, bytes):\r\n results = await db.execute(\r\n \"SELECT AsGeoJSON(:geometry)\", {\"geometry\": geometry}\r\n )\r\n return geojson.loads(results.single_value())\r\n```\r\nThat actually seems to work really well as-is, but it does worry me a bit that it ends up having to execute an extra `SELECT` query for every single returned row - especially in streaming mode where it might be asked to return 1m rows at once.\r\n\r\nMy PostgreSQL/MySQL engineering brain says that this would be better handled by doing a chunk of these (maybe 100) at once, to avoid the per-query-overhead - but with SQLite that might not be necessary.\r\n\r\nAt any rate, this is one of the reasons I'm interested in \"iterate over this sequence of chunks of 100 rows at a time\" as a potential option here.\r\n\r\nOf course, a better solution would be for `datasette-geojson` to have a way to influence the SQL query before it is executed, adding a `AsGeoJSON(geometry)` clause to it - so that's something I'm open to as well.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 749283032, "label": "register_output_renderer() should support streaming data"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1101#issuecomment-1105608964", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1101", "id": 1105608964, "node_id": "IC_kwDOBm6k_c5B5kEE", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-21T18:26:29Z", "updated_at": "2022-04-21T18:26:29Z", "author_association": "OWNER", "body": "I'm questioning if the mechanisms should be separate at all now - a single response rendering is really just a case of a streaming response that only pulls the first N records from the iterator.\r\n\r\nIt probably needs to be an `async for` iterator, which I've not worked with much before. 
Good opportunity to learn.\r\n\r\nThis actually gets a fair bit more complicated due to the work I'm doing right now to improve the default JSON API:\r\n\r\n- #1709\r\n\r\nI want to do things like make faceting results optionally available to custom renderers - which is a separate concern from streaming rows.\r\n\r\nI'm going to poke around with a bunch of prototypes and see what sticks.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 749283032, "label": "register_output_renderer() should support streaming data"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1101#issuecomment-1105588651", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1101", "id": 1105588651, "node_id": "IC_kwDOBm6k_c5B5fGr", "user": {"value": 25778, "label": "eyeseast"}, "created_at": "2022-04-21T18:15:39Z", "updated_at": "2022-04-21T18:15:39Z", "author_association": "CONTRIBUTOR", "body": "What if you split rendering and streaming into two things:\r\n\r\n- `render` is a function that returns a response\r\n- `stream` is a function that sends chunks, or yields chunks passed to an ASGI `send` callback\r\n\r\nThat way current plugins still work, and streaming is purely additive. A `stream` function could get a cursor or iterator of rows, instead of a list, so it could more efficiently handle large queries.\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 749283032, "label": "register_output_renderer() should support streaming data"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1101#issuecomment-1105571003", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1101", "id": 1105571003, "node_id": "IC_kwDOBm6k_c5B5ay7", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-21T18:10:38Z", "updated_at": "2022-04-21T18:10:46Z", "author_association": "OWNER", "body": "Maybe the simplest design for this is to add an optional `can_stream` to the contract:\r\n\r\n```python\r\n @hookimpl\r\n def register_output_renderer(datasette):\r\n return {\r\n \"extension\": \"tsv\",\r\n \"render\": render_tsv,\r\n \"can_render\": lambda: True,\r\n \"can_stream\": lambda: True\r\n }\r\n```\r\nWhen streaming, a new parameter could be passed to the render function - maybe `chunks` - which is an iterator/generator over a sequence of chunks of rows.\r\n\r\nOr it could use the existing `rows` parameter but treat that as an iterator?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 749283032, "label": "register_output_renderer() should support streaming data"}, "performed_via_github_app": null} {"html_url": "https://github.com/dogsheep/github-to-sqlite/issues/72#issuecomment-1105474232", "issue_url": "https://api.github.com/repos/dogsheep/github-to-sqlite/issues/72", "id": 1105474232, "node_id": "IC_kwDODFdgUs5B5DK4", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-21T17:02:15Z", "updated_at": "2022-04-21T17:02:15Z", "author_association": "MEMBER", "body": "That's interesting - yeah it looks like the number of pages can be derived from the `Link` header, which is enough information to show a progress bar, probably using Click just to avoid 
adding another dependency.\r\n\r\nhttps://docs.github.com/en/rest/guides/traversing-with-pagination", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1211283427, "label": "feature: display progress bar when downloading multi-page responses"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1574#issuecomment-1105464661", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1574", "id": 1105464661, "node_id": "IC_kwDOBm6k_c5B5A1V", "user": {"value": 208018, "label": "dholth"}, "created_at": "2022-04-21T16:51:24Z", "updated_at": "2022-04-21T16:51:24Z", "author_association": "NONE", "body": "tfw you have more ephemeral storage than upstream bandwidth\r\n\r\n```\r\nFROM python:3.10-slim AS base\r\n\r\nRUN apt update && apt -y install zstd\r\n\r\nENV DATASETTE_SECRET 'sosecret'\r\nRUN --mount=type=cache,target=/root/.cache/pip\r\n pip install -U datasette datasette-pretty-json datasette-graphql\r\n\r\nENV PORT 8080\r\nEXPOSE 8080\r\n\r\nFROM base AS pack\r\n\r\nCOPY . /app\r\nWORKDIR /app\r\n\r\nRUN datasette inspect --inspect-file inspect-data.json\r\nRUN zstd --rm *.db\r\n\r\nFROM base AS unpack\r\n\r\nCOPY --from=pack /app /app\r\nWORKDIR /app\r\n\r\nCMD [\"/bin/bash\", \"-c\", \"shopt -s nullglob && zstd --rm -d *.db.zst && datasette serve --host 0.0.0.0 --cors --inspect-file inspect-data.json --metadata metadata.json --create --port $PORT *.db\"]\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1084193403, "label": "introduce new option for datasette package to use a slim base image"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1713#issuecomment-1103312860", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1713", "id": 1103312860, "node_id": "IC_kwDOBm6k_c5Bwzfc", "user": {"value": 536941, "label": "fgregg"}, "created_at": "2022-04-20T00:52:19Z", "updated_at": "2022-04-20T00:52:19Z", "author_association": "CONTRIBUTOR", "body": "feels related to #1402 ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1203943272, "label": "Datasette feature for publishing snapshots of query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/425#issuecomment-1101594549", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/425", "id": 1101594549, "node_id": "IC_kwDOCGYnMM5BqP-1", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-18T17:36:14Z", "updated_at": "2022-04-18T17:36:14Z", "author_association": "OWNER", "body": "Releated:\r\n- #408", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1203842656, "label": "`sqlite3.NotSupportedError`: deterministic=True requires SQLite 3.8.3 or higher"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1159#issuecomment-1100243987", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1159", "id": 1100243987, "node_id": "IC_kwDOBm6k_c5BlGQT", "user": {"value": 552629, "label": "lovasoa"}, "created_at": "2022-04-15T17:24:43Z", "updated_at": 
"2022-04-15T17:24:43Z", "author_association": "NONE", "body": "@simonw : do you think this could be merged ?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 774332247, "label": "Improve the display of facets information"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1713#issuecomment-1099540225", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1713", "id": 1099540225, "node_id": "IC_kwDOBm6k_c5BiacB", "user": {"value": 25778, "label": "eyeseast"}, "created_at": "2022-04-14T19:09:57Z", "updated_at": "2022-04-14T19:09:57Z", "author_association": "CONTRIBUTOR", "body": "I wonder if this overlaps with what I outlined in #1605. You could run something like this:\r\n\r\n```sh\r\ndatasette freeze -d exports/\r\naws s3 cp exports/ s3://my-export-bucket/$(date)\r\n```\r\n\r\nAnd maybe that does what you need. Of course, that plugin isn't built yet. But that's the idea.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1203943272, "label": "Datasette feature for publishing snapshots of query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1713#issuecomment-1099443468", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1713", "id": 1099443468, "node_id": "IC_kwDOBm6k_c5BiC0M", "user": {"value": 9308268, "label": "rayvoelker"}, "created_at": "2022-04-14T17:26:27Z", "updated_at": "2022-04-14T17:26:27Z", "author_association": "NONE", "body": "What would be an awesome feature as a plugin would be to be able to save a query (and possibly even results) to a github gist. Being able to share results that way would be super fantastic. Possibly even in Jupyter Notebook format (since github and github gists nicely render those)! 
\r\n\r\nI know there's the handy datasette-saved-queries plugin, but a button that could export stuff out and then even possibly import stuff back in (I'm sort of thinking the way that Google Colab allows you to save to github, and then pull the notebook back in is a really great workflow \r\n![image](https://user-images.githubusercontent.com/9308268/163441612-9ad2649f-c73e-4557-aaf2-e3d0fdc48fbf.png)\r\nhttps://github.com/cincinnatilibrary/collection-analysis/blob/master/reports/colab_datasette_example.ipynb )", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1203943272, "label": "Datasette feature for publishing snapshots of query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1713#issuecomment-1098628334", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1713", "id": 1098628334, "node_id": "IC_kwDOBm6k_c5Be7zu", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-14T01:43:00Z", "updated_at": "2022-04-14T01:43:13Z", "author_association": "OWNER", "body": "Current workaround for fast publishing to S3:\r\n\r\n datasette fixtures.db --get /fixtures/facetable.json | \\\r\n s3-credentials put-object my-bucket facetable.json -", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1203943272, "label": "Datasette feature for publishing snapshots of query results"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/421#issuecomment-1098548931", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/421", "id": 1098548931, "node_id": "IC_kwDOCGYnMM5BeobD", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-13T22:41:59Z", "updated_at": "2022-04-13T22:41:59Z", "author_association": "OWNER", "body": "I'm going to close this ticket since it looks like this is a bug in the way the Dockerfile builds Python, but I'm going to ship a fix for that issue I found so the `LD_PRELOAD` workaround above should work OK with the next release of `sqlite-utils`. 
Thanks for the detailed bug report!", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1180427792, "label": "\"Error: near \"(\": syntax error\" when using sqlite-utils indexes CLI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/424#issuecomment-1098548090", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/424", "id": 1098548090, "node_id": "IC_kwDOCGYnMM5BeoN6", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-13T22:40:15Z", "updated_at": "2022-04-13T22:40:15Z", "author_association": "OWNER", "body": "New error:\r\n```pycon\r\n>>> from sqlite_utils import Database\r\n>>> db = Database(memory=True)\r\n>>> db[\"foo\"].create({})\r\nTraceback (most recent call last):\r\n File \"\", line 1, in \r\n File \"/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/db.py\", line 1465, in create\r\n self.db.create_table(\r\n File \"/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/db.py\", line 885, in create_table\r\n sql = self.create_table_sql(\r\n File \"/Users/simon/Dropbox/Development/sqlite-utils/sqlite_utils/db.py\", line 771, in create_table_sql\r\n assert columns, \"Tables must have at least one column\"\r\nAssertionError: Tables must have at least one column\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1200866134, "label": "Better error message if you try to create a table with no columns"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/425#issuecomment-1098545390", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/425", "id": 1098545390, "node_id": "IC_kwDOCGYnMM5Benju", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-13T22:34:52Z", "updated_at": "2022-04-13T22:34:52Z", "author_association": "OWNER", "body": "That broke Python 3.7 because it doesn't support `deterministic=True` even being passed:\r\n\r\n> function takes at most 3 arguments (4 given)", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1203842656, "label": "`sqlite3.NotSupportedError`: deterministic=True requires SQLite 3.8.3 or higher"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/425#issuecomment-1098537000", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/425", "id": 1098537000, "node_id": "IC_kwDOCGYnMM5Belgo", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-13T22:18:22Z", "updated_at": "2022-04-13T22:18:22Z", "author_association": "OWNER", "body": "I figured out a workaround in https://github.com/simonw/sqlite-utils/issues/421#issuecomment-1098535531\r\n\r\nThe current `register(fn)` method looks like this: https://github.com/simonw/sqlite-utils/blob/95522ad919f96eb6cc8cd3cd30389b534680c717/sqlite_utils/db.py#L389-L403\r\n\r\nThis alternative implementation worked in the environment where that failed:\r\n\r\n```python\r\n def register(fn):\r\n name = fn.__name__\r\n arity = len(inspect.signature(fn).parameters)\r\n if not replace and (name, arity) in self._registered_functions:\r\n return fn\r\n kwargs = {}\r\n done = False\r\n if deterministic:\r\n # Try this, but fall back if 
sqlite3.NotSupportedError\r\n try:\r\n self.conn.create_function(name, arity, fn, **dict(kwargs, deterministic=True))\r\n done = True\r\n except sqlite3.NotSupportedError:\r\n pass\r\n if not done:\r\n self.conn.create_function(name, arity, fn, **kwargs)\r\n self._registered_functions.add((name, arity))\r\n return fn\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1203842656, "label": "`sqlite3.NotSupportedError`: deterministic=True requires SQLite 3.8.3 or higher"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/421#issuecomment-1098535531", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/421", "id": 1098535531, "node_id": "IC_kwDOCGYnMM5BelJr", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-13T22:15:48Z", "updated_at": "2022-04-13T22:15:48Z", "author_association": "OWNER", "body": "Trying this alternative implementation of the `register()` method:\r\n\r\n```python\r\n def register(fn):\r\n name = fn.__name__\r\n arity = len(inspect.signature(fn).parameters)\r\n if not replace and (name, arity) in self._registered_functions:\r\n return fn\r\n kwargs = {}\r\n done = False\r\n if deterministic:\r\n # Try this, but fall back if sqlite3.NotSupportedError\r\n try:\r\n self.conn.create_function(name, arity, fn, **dict(kwargs, deterministic=True))\r\n done = True\r\n except sqlite3.NotSupportedError:\r\n pass\r\n if not done:\r\n self.conn.create_function(name, arity, fn, **kwargs)\r\n self._registered_functions.add((name, arity))\r\n return fn\r\n```\r\nWith that fix, the following worked!\r\n```\r\nLD_PRELOAD=./build/sqlite-autoconf-3360000/.libs/libsqlite3.so sqlite-utils indexes /tmp/global.db --table\r\ntable index_name seqno cid name desc coll key\r\n--------- -------------------------- ------- ----- ------- ------ ------ -----\r\ncountries idx_countries_country_name 0 1 country 0 BINARY 1\r\ncountries idx_countries_country_name 1 2 name 0 BINARY 1\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1180427792, "label": "\"Error: near \"(\": syntax error\" when using sqlite-utils indexes CLI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/421#issuecomment-1098532220", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/421", "id": 1098532220, "node_id": "IC_kwDOCGYnMM5BekV8", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-13T22:09:52Z", "updated_at": "2022-04-13T22:09:52Z", "author_association": "OWNER", "body": "That error is weird - it's not supposed to happen according to this code here: https://github.com/simonw/sqlite-utils/blob/95522ad919f96eb6cc8cd3cd30389b534680c717/sqlite_utils/db.py#L389-L400", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1180427792, "label": "\"Error: near \"(\": syntax error\" when using sqlite-utils indexes CLI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/421#issuecomment-1098531354", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/421", "id": 1098531354, "node_id": "IC_kwDOCGYnMM5BekIa", "user": {"value": 9599, "label": "simonw"}, "created_at": 
"2022-04-13T22:08:20Z", "updated_at": "2022-04-13T22:08:20Z", "author_association": "OWNER", "body": "OK I figured out what's going on here. First I added an extra `print(sql)` statement to the `indexes` command to see what SQL it was running:\r\n```\r\n(app-root) sqlite-utils indexes global.db --table\r\n\r\n select\r\n sqlite_master.name as \"table\",\r\n indexes.name as index_name,\r\n xinfo.*\r\n from sqlite_master\r\n join pragma_index_list(sqlite_master.name) indexes\r\n join pragma_index_xinfo(index_name) xinfo\r\n where\r\n sqlite_master.type = 'table'\r\n and xinfo.key = 1\r\nError: near \"(\": syntax error\r\n```\r\nThis made me suspicious that the SQLite version being used here didn't support joining against the `pragma_index_list(...)` table-valued functions in that way. So I checked the version:\r\n```\r\n(app-root) sqlite3\r\nSQLite version 3.36.0 2021-06-18 18:36:39\r\n```\r\nThat version should be fine - it's the one you compiled in the Dockerfile.\r\n\r\nThen I checked the version that `sqlite-utils` itself was using:\r\n```\r\n(app-root) sqlite-utils memory 'select sqlite_version()'\r\n[{\"sqlite_version()\": \"3.7.17\"}]\r\n```\r\nIt's running SQLite 3.7.17!\r\n\r\nSo the problem here is that the Python in that Docker image is running a very old version of SQLite.\r\n\r\nI tried using the trick in https://til.simonwillison.net/sqlite/ld-preload as a workaround, and it almost worked:\r\n\r\n```\r\n(app-root) python3 -c 'import sqlite3; print(sqlite3.connect(\":memory\").execute(\"select sqlite_version()\").fetchone())'\r\n('3.7.17',)\r\n(app-root) LD_PRELOAD=./build/sqlite-autoconf-3360000/.libs/libsqlite3.so python3 -c 'import sqlite3; print(sqlite3.connect(\":memory\").execute(\"select sqlite_version()\").fetchone())'\r\n('3.36.0',)\r\n```\r\nBut when I try to run `sqlite-utils` like that I get an error:\r\n\r\n```\r\n(app-root) LD_PRELOAD=./build/sqlite-autoconf-3360000/.libs/libsqlite3.so sqlite-utils indexes /tmp/global.db \r\n...\r\n File \"/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/cli.py\", line 1624, in query\r\n db.register_fts4_bm25()\r\n File \"/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py\", line 412, in register_fts4_bm25\r\n self.register_function(rank_bm25, deterministic=True)\r\n File \"/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py\", line 408, in register_function\r\n register(fn)\r\n File \"/opt/app-root/lib64/python3.8/site-packages/sqlite_utils/db.py\", line 401, in register\r\n self.conn.create_function(name, arity, fn, **kwargs)\r\nsqlite3.NotSupportedError: deterministic=True requires SQLite 3.8.3 or higher\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1180427792, "label": "\"Error: near \"(\": syntax error\" when using sqlite-utils indexes CLI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/421#issuecomment-1098295517", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/421", "id": 1098295517, "node_id": "IC_kwDOCGYnMM5Bdqjd", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-13T17:16:20Z", "updated_at": "2022-04-13T17:16:20Z", "author_association": "OWNER", "body": "Aha! 
I was able to replicate the bug using your `Dockerfile` - thanks very much for providing that.\r\n```\r\n(app-root) sqlite-utils indexes global.db --table\r\nError: near \"(\": syntax error\r\n```\r\n(That wa sbefore I even ran the `extract` command.)\r\n\r\nTo build your `Dockerfile` I copied it into an empty folder and ran the following:\r\n```\r\nwget https://www.sqlite.org/2021/sqlite-autoconf-3360000.tar.gz\r\ndocker build . -t centos-sqlite-utils\r\ndocker run -it centos-sqlite-utils /bin/bash\r\n```\r\nThis gave me a shell in which I could replicate the bug.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1180427792, "label": "\"Error: near \"(\": syntax error\" when using sqlite-utils indexes CLI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/421#issuecomment-1098288158", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/421", "id": 1098288158, "node_id": "IC_kwDOCGYnMM5Bdowe", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-13T17:07:53Z", "updated_at": "2022-04-13T17:07:53Z", "author_association": "OWNER", "body": "I can't replicate the bug I'm afraid:\r\n```\r\n% wget \"https://github.com/wri/global-power-plant-database/blob/232a6666/output_database/global_power_plant_database.csv?raw=true\" \r\n...\r\n2022-04-13 10:06:29 (8.97 MB/s) - \u2018global_power_plant_database.csv?raw=true\u2019 saved [8856038/8856038]\r\n% sqlite-utils insert global.db power_plants \\ \r\n 'global_power_plant_database.csv?raw=true' --csv\r\n [------------------------------------] 0%\r\n [###################################-] 99% 00:00:00%\r\n% sqlite-utils indexes global.db --table \r\ntable index_name seqno cid name desc coll key\r\n------- ------------ ------- ----- ------ ------ ------ -----\r\n% sqlite-utils extract global.db power_plants country country_long \\\r\n --table countries \\\r\n --fk-column country_id \\\r\n --rename country_long name\r\n% sqlite-utils indexes global.db --table \r\ntable index_name seqno cid name desc coll key\r\n--------- -------------------------- ------- ----- ------- ------ ------ -----\r\ncountries idx_countries_country_name 0 1 country 0 BINARY 1\r\ncountries idx_countries_country_name 1 2 name 0 BINARY 1\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1180427792, "label": "\"Error: near \"(\": syntax error\" when using sqlite-utils indexes CLI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1712#issuecomment-1097115034", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1712", "id": 1097115034, "node_id": "IC_kwDOBm6k_c5BZKWa", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-12T19:12:21Z", "updated_at": "2022-04-12T19:12:21Z", "author_association": "OWNER", "body": "Got a TIL out of this too: https://til.simonwillison.net/spatialite/gunion-to-combine-geometries", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1202227104, "label": "Make \"\" easier to read"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1712#issuecomment-1097076622", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/1712", "id": 1097076622, "node_id": "IC_kwDOBm6k_c5BZA-O", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-12T18:42:04Z", "updated_at": "2022-04-12T18:42:04Z", "author_association": "OWNER", "body": "I'm not going to show the tooltip if the formatted number is in bytes.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1202227104, "label": "Make \"\" easier to read"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1712#issuecomment-1097068474", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1712", "id": 1097068474, "node_id": "IC_kwDOBm6k_c5BY--6", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-12T18:38:18Z", "updated_at": "2022-04-12T18:38:18Z", "author_association": "OWNER", "body": "\"image\"\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1202227104, "label": "Make \"\" easier to read"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1708#issuecomment-1095687566", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1708", "id": 1095687566, "node_id": "IC_kwDOBm6k_c5BTt2O", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-11T23:24:30Z", "updated_at": "2022-04-11T23:24:30Z", "author_association": "OWNER", "body": "## Redesigned template context\r\n\r\n**Warning:** if you use any custom templates with your Datasette instance they are likely to break when you upgrade to 1.0.\r\n\r\nThe template context has been redesigned to be based on the documented JSON API. This means that the template context can be considered stable going forward, so any custom templates you implement should continue to work when you upgrade Datasette in the future.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1200649124, "label": "Datasette 1.0 alpha upcoming release notes"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1708#issuecomment-1095675839", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1708", "id": 1095675839, "node_id": "IC_kwDOBm6k_c5BTq-_", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-11T23:06:30Z", "updated_at": "2022-11-15T19:57:53Z", "author_association": "OWNER", "body": "# Datasette 1.0 alpha 1\r\n\r\nThis alpha release is a preview of Datasette 1.0.\r\n\r\nDatasette 1.0 marks a significant milestone in the project: it is the point from which various aspects of Datasette can be considered \"stable\", in that code developed against them should expect not to be broken by future releases in the 1.x series.\r\n\r\nThis will hold true until the next major version release, Datasette 2.0 - which we hope to hold off releasing for as long as possible.\r\n\r\nThe following Datasette components should be considered stable after 1.0:\r\n\r\n- The plugin API. Plugins developed against 1.0 should continue to work unmodified throughout the 1.x series.\r\n- The JSON API. Code written that interacts with Datasette's default JSON web API should continue to work.\r\n- The template context. 
If you build custom templates against Datasette your custom pages should continue to work.\r\n\r\nNote that none of these components will cease to introduce new features. New plugin hooks, new JSON APIs and new template context variables can be introduced without breaking existing code.\r\n\r\nSince this alpha release previews features that will be frozen for 1.0, please test this thoroughly against your existing Datasette projects.\r\n\r\nYou can install the alpha using:\r\n\r\n pip install datasette==1.0a0\r\n\r\n## JSON API changes\r\n\r\nThe most significant changes introduced in this new alpha concern Datasette's JSON API.\r\n\r\nThe default JSON returned by the `/database/table.json` endpoint has changed. It now returns an object with two keys: `rows` - which contains a list of objects representing the rows in the table or query, and `more` containing a `boolean` that shows if there are more rows or if this object contains them all.\r\n\r\n```json\r\n{\r\n \"rows\": [{\r\n \"id\": 1,\r\n \"name\": \"Name 1\"\r\n }, {\r\n \"id\": 2,\r\n \"name\": \"Name 2\"\r\n }],\r\n \"more\": false\r\n}\r\n```\r\n[ Initially I thought about going with `next_url`, which would be `null` if you have reached the last page of records. Maybe that would be better? But since `next_url` cannot be provided on query pages, should this be part of the default format at all? ]\r\n\r\n## Use ?_extra= to retrieve extra fields\r\n\r\nThe default format can be expanded using one or more `?_extra=` parameters. This takes names of extra keys you would like to include. These can be comma-separated or `?_extra=` can be applied multiple times.\r\n\r\nFor example:\r\n\r\n /database/table.json?_extra=total\r\n\r\nThis adds a `\"total\": 124` field to the returned JSON.\r\n\r\n[ Question: if you do `?_facet=foo` then do you still need to do `?_extra=facets` - I think not? 
]", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1200649124, "label": "Datasette 1.0 alpha upcoming release notes"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1705#issuecomment-1095673947", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1705", "id": 1095673947, "node_id": "IC_kwDOBm6k_c5BTqhb", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-11T23:03:49Z", "updated_at": "2022-04-11T23:03:49Z", "author_association": "OWNER", "body": "I'll also encourage testing against both Datasette 0.x and Datasette 1.0 using a GitHub Actions matrix.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1197926598, "label": "How to upgrade your plugin for 1.0 documentation"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1710#issuecomment-1095673670", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1710", "id": 1095673670, "node_id": "IC_kwDOBm6k_c5BTqdG", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-11T23:03:25Z", "updated_at": "2022-04-11T23:03:25Z", "author_association": "OWNER", "body": "Dupe of:\r\n- #1705", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1200649889, "label": "Guide for plugin authors to upgrade their plugins for 1.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1711#issuecomment-1095672127", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1711", "id": 1095672127, "node_id": "IC_kwDOBm6k_c5BTqE_", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-11T23:00:58Z", "updated_at": "2022-04-11T23:00:58Z", "author_association": "OWNER", "body": "- #1510", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1200650491, "label": "Template context powered entirely by the JSON API format"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1709#issuecomment-1095671940", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1709", "id": 1095671940, "node_id": "IC_kwDOBm6k_c5BTqCE", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-11T23:00:39Z", "updated_at": "2022-04-11T23:01:41Z", "author_association": "OWNER", "body": "- #262\r\n- #782 \r\n- #1509", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1200649502, "label": "Redesigned JSON API with ?_extra= parameters"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1707#issuecomment-1095277937", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1707", "id": 1095277937, "node_id": "IC_kwDOBm6k_c5BSJ1x", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-11T16:32:31Z", "updated_at": "2022-04-11T16:33:00Z", "author_association": "OWNER", "body": "That's a really interesting idea!\r\n\r\nThat page is one of the least developed at the moment. 
There's plenty of room for it to grow new useful features.\r\n\r\nI like this suggestion because it feels like a good opportunity to introduce some unobtrusive JavaScript. Could use a details/summary element that uses `fetch()` to load in the extra data for example.\r\n\r\nCould even do something with the `` Web Component here... https://github.com/simonw/datasette-table", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1200224939, "label": "[feature] expanded detail page"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1699#issuecomment-1094453751", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1699", "id": 1094453751, "node_id": "IC_kwDOBm6k_c5BPAn3", "user": {"value": 25778, "label": "eyeseast"}, "created_at": "2022-04-11T01:32:12Z", "updated_at": "2022-04-11T01:32:12Z", "author_association": "CONTRIBUTOR", "body": "Was looking through old issues and realized a bunch of this got discussed in #1101 (including by me!), so sorry to rehash all this. Happy to help with whatever piece of it I can. Would be very excited to be able to use format plugins with exports.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1193090967, "label": "Proposal: datasette query"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1706#issuecomment-1094152642", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1706", "id": 1094152642, "node_id": "IC_kwDOBm6k_c5BN3HC", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-10T01:11:54Z", "updated_at": "2022-04-10T01:11:54Z", "author_association": "OWNER", "body": "This relates to this much larger vision:\r\n- #417 ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1198822563, "label": "[feature] immutable mode for a directory, not just individual sqlite file"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1706#issuecomment-1094152173", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1706", "id": 1094152173, "node_id": "IC_kwDOBm6k_c5BN2_t", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-10T01:08:50Z", "updated_at": "2022-04-10T01:08:50Z", "author_association": "OWNER", "body": "This is a good idea - it matches the way `datasette .` works for mutable database files already.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1198822563, "label": "[feature] immutable mode for a directory, not just individual sqlite file"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1693#issuecomment-1093454899", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1693", "id": 1093454899, "node_id": "IC_kwDOBm6k_c5BLMwz", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-08T23:07:04Z", "updated_at": "2022-04-08T23:07:04Z", "author_association": "OWNER", "body": "Tests failed here due to this issue:\r\n- https://github.com/psf/black/pull/2987\r\n\r\nA future Black release should fix that.", "reactions": "{\"total_count\": 0, 
\"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1184850337, "label": "Bump black from 22.1.0 to 22.3.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1703#issuecomment-1092850719", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1703", "id": 1092850719, "node_id": "IC_kwDOBm6k_c5BI5Qf", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2022-04-08T13:18:04Z", "updated_at": "2022-04-08T13:18:04Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1703?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report\n> Merging [#1703](https://codecov.io/gh/simonw/datasette/pull/1703?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (73aabe6) into [main](https://codecov.io/gh/simonw/datasette/commit/90d1be9952db9aaddc21a536e4d00a8de44765d7?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (90d1be9) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n```diff\n@@ Coverage Diff @@\n## main #1703 +/- ##\n=======================================\n Coverage 91.75% 91.75% \n=======================================\n Files 34 34 \n Lines 4573 4573 \n=======================================\n Hits 4196 4196 \n Misses 377 377 \n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1703?src=pr&el=continue&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1703?src=pr&el=footer&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Last update [90d1be9...73aabe6](https://codecov.io/gh/simonw/datasette/pull/1703?src=pr&el=lastupdated&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). 
Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1197298420, "label": "Update beautifulsoup4 requirement from <4.11.0,>=4.8.1 to >=4.8.1,<4.12.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1699#issuecomment-1092386254", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1699", "id": 1092386254, "node_id": "IC_kwDOBm6k_c5BHH3O", "user": {"value": 25778, "label": "eyeseast"}, "created_at": "2022-04-08T02:39:25Z", "updated_at": "2022-04-08T02:39:25Z", "author_association": "CONTRIBUTOR", "body": "And just to think this through a little more, here's what `stream_geojson` might look like:\r\n\r\n```python\r\nasync def stream_geojson(datasette, columns, rows, database, stream):\r\n db = datasette.get_database(database)\r\n for row in rows:\r\n feature = await row_to_geojson(row, db)\r\n stream.write(feature + \"\\n\") # just assuming newline mode for now\r\n```\r\n\r\nAlternately, that could be an async generator, like this:\r\n\r\n```python\r\nasync def stream_geojson(datasette, columns, rows, database):\r\n db = datasette.get_database(database)\r\n for row in rows:\r\n feature = await row_to_geojson(row, db)\r\n yield feature\r\n```\r\n\r\nNot sure which makes more sense, but I think this pattern would open up a lot of possibility. If you had your [stream_indented_json](https://til.simonwillison.net/python/output-json-array-streaming) function, you could do `yield from stream_indented_json(rows, 2)` and be one your way.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1193090967, "label": "Proposal: datasette query"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1699#issuecomment-1092370880", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1699", "id": 1092370880, "node_id": "IC_kwDOBm6k_c5BHEHA", "user": {"value": 25778, "label": "eyeseast"}, "created_at": "2022-04-08T02:07:40Z", "updated_at": "2022-04-08T02:07:40Z", "author_association": "CONTRIBUTOR", "body": "So maybe `render_output_render` returns something like this:\r\n\r\n```python\r\n@hookimpl\r\ndef register_output_renderer(datasette):\r\n return {\r\n \"extension\": \"geojson\",\r\n \"render\": render_geojson,\r\n \"stream\": stream_geojson,\r\n \"can_render\": can_render_geojson,\r\n }\r\n```\r\n\r\nAnd stream gets an iterator, instead of a list of rows, so it can efficiently handle large queries. Maybe it also gets passed a destination stream, or it returns an iterator. I'm not sure what makes more sense. 
Either way, that might cover both CLI exports and streaming responses.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1193090967, "label": "Proposal: datasette query"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1699#issuecomment-1092361727", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1699", "id": 1092361727, "node_id": "IC_kwDOBm6k_c5BHB3_", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-08T01:47:43Z", "updated_at": "2022-04-08T01:47:43Z", "author_association": "OWNER", "body": "A render mode for that plugin hook that writes to a stream is exactly what I have in mind:\r\n- #1062 ", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1193090967, "label": "Proposal: datasette query"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1699#issuecomment-1092357672", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1699", "id": 1092357672, "node_id": "IC_kwDOBm6k_c5BHA4o", "user": {"value": 25778, "label": "eyeseast"}, "created_at": "2022-04-08T01:39:40Z", "updated_at": "2022-04-08T01:39:40Z", "author_association": "CONTRIBUTOR", "body": "> My best thought on how to differentiate them so far is plugins: if Datasette plugins that provide alternative outputs - like .geojson and .yml and suchlike - also work for the datasette query command that would make a lot of sense to me.\r\n\r\nThat's my thinking, too. It's really the thing I've been wanting since writing `datasette-geojson`, since I'm always exporting with `datasette --get`. The workflow I'm always looking for is something like this:\r\n\r\n```sh\r\ncd alltheplaces-datasette\r\ndatasette query dunkin_in_suffolk -f geojson -o dunkin_in_suffolk.geojson\r\n```\r\n\r\nI think this probably needs either a new plugin hook separate from `register_output_renderer` or a way to use that without going through the HTTP stack. Or maybe a render mode that writes to a stream instead of a response. 
Maybe there's a new key in the dictionary that `register_output_renderer` returns that handles CLI exports.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1193090967, "label": "Proposal: datasette query"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1699#issuecomment-1092321966", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1699", "id": 1092321966, "node_id": "IC_kwDOBm6k_c5BG4Ku", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-08T00:20:32Z", "updated_at": "2022-04-08T00:20:56Z", "author_association": "OWNER", "body": "If we do this I'm keen to have it be more than just an alternative to the existing `sqlite-utils` command - especially since if I add `sqlite-utils` as a dependency of Datasette in the future that command will be installed as part of `pip install datasette` anyway.\r\n\r\nMy best thought on how to differentiate them so far is plugins: if Datasette plugins that provide alternative outputs - like `.geojson` and `.yml` and suchlike - also work for the `datasette query` command that would make a lot of sense to me.\r\n\r\nOne way that could work: a `--fmt geojson` option to this command which uses the plugin that was registered for the specified extension.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1193090967, "label": "Proposal: datasette query"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1549#issuecomment-1087428593", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1549", "id": 1087428593, "node_id": "IC_kwDOBm6k_c5A0Nfx", "user": {"value": 536941, "label": "fgregg"}, "created_at": "2022-04-04T11:17:13Z", "updated_at": "2022-04-04T11:17:13Z", "author_association": "CONTRIBUTOR", "body": "another way to get the behavior of downloading the file is to use the download attribute of the anchor tag\r\n\r\nhttps://developer.mozilla.org/en-US/docs/Web/HTML/Element/a#attr-download", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1077620955, "label": "Redesign CSV export to improve usability"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1698#issuecomment-1086784547", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1698", "id": 1086784547, "node_id": "IC_kwDOBm6k_c5AxwQj", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-03T06:10:24Z", "updated_at": "2022-04-03T06:10:24Z", "author_association": "OWNER", "body": "Warning added here: https://docs.datasette.io/en/latest/publish.html#publishing-to-google-cloud-run", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1190828163, "label": "Add a warning about bots and Cloud Run"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1697#issuecomment-1085323192", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1697", "id": 1085323192, "node_id": "IC_kwDOBm6k_c5AsLe4", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-04-01T02:01:51Z", "updated_at": 
"2022-04-01T02:01:51Z", "author_association": "OWNER", "body": "Huh, turns out `Request.fake()` wasn't yet documented.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1189113609, "label": "`Request.fake(..., url_vars={})`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1574#issuecomment-1084216224", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1574", "id": 1084216224, "node_id": "IC_kwDOBm6k_c5An9Og", "user": {"value": 33631, "label": "fs111"}, "created_at": "2022-03-31T07:45:25Z", "updated_at": "2022-03-31T07:45:25Z", "author_association": "NONE", "body": "@simonw I like that you want to go \"slim by default\". Do you want another PR for that or should I just wait?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1084193403, "label": "introduce new option for datasette package to use a slim base image"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1696#issuecomment-1083351437", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1696", "id": 1083351437, "node_id": "IC_kwDOBm6k_c5AkqGN", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-30T16:20:49Z", "updated_at": "2022-03-30T16:21:02Z", "author_association": "OWNER", "body": "Maybe like this:\r\n\r\n\"image\"\r\n\r\n```html\r\n
283 rows\r\n where dcode = 3 (Human Related: Other)\r\n
\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1186696202, "label": "Show foreign key label when filtering"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1692#issuecomment-1082663746", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1692", "id": 1082663746, "node_id": "IC_kwDOBm6k_c5AiCNC", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-30T06:14:39Z", "updated_at": "2022-03-30T06:14:51Z", "author_association": "OWNER", "body": "I like your design, though I think it should be `\"nomodule\": True` for consistency with the other options.\r\n\r\nI think `\"async\": True` is worth supporting too.", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1182227211, "label": "[plugins][feature request]: Support additional script tag attributes when loading custom JS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1692#issuecomment-1082661795", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1692", "id": 1082661795, "node_id": "IC_kwDOBm6k_c5AiBuj", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-30T06:11:41Z", "updated_at": "2022-03-30T06:11:41Z", "author_association": "OWNER", "body": "This is a good idea.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1182227211, "label": "[plugins][feature request]: Support additional script tag attributes when loading custom JS"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1695#issuecomment-1082617386", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1695", "id": 1082617386, "node_id": "IC_kwDOBm6k_c5Ah24q", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-30T04:46:18Z", "updated_at": "2022-03-30T04:46:18Z", "author_association": "OWNER", "body": "` selected = (column_qs, str(row[\"value\"])) in qs_pairs` is wrong.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1185868354, "label": "Option to un-filter facet not shown for `?col__exact=value`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1695#issuecomment-1082617241", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1695", "id": 1082617241, "node_id": "IC_kwDOBm6k_c5Ah22Z", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-30T04:45:55Z", "updated_at": "2022-03-30T04:45:55Z", "author_association": "OWNER", "body": "Relevant template: https://github.com/simonw/datasette/blob/e73fa72917ca28c152208d62d07a490c81cadf52/datasette/templates/table.html#L168-L172\r\n\r\nPopulated from here: https://github.com/simonw/datasette/blob/c496f2b663ff0cef908ffaaa68b8cb63111fb5f2/datasette/facets.py#L246-L253", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1185868354, "label": "Option to un-filter facet not shown for `?col__exact=value`"}, "performed_via_github_app": null} {"html_url": 
"https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1082476727", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/420", "id": 1082476727, "node_id": "IC_kwDOCGYnMM5AhUi3", "user": {"value": 770231, "label": "strada"}, "created_at": "2022-03-29T23:52:38Z", "updated_at": "2022-03-29T23:52:38Z", "author_association": "NONE", "body": "@simonw Thanks for looking into it and documenting the solution!\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1178546862, "label": "Document how to use a `--convert` function that runs initialization code first"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1693#issuecomment-1081861670", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1693", "id": 1081861670, "node_id": "IC_kwDOBm6k_c5Ae-Ym", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2022-03-29T13:18:47Z", "updated_at": "2022-05-20T20:36:30Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1693?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report\n> Merging [#1693](https://codecov.io/gh/simonw/datasette/pull/1693?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (65a5d5e) into [main](https://codecov.io/gh/simonw/datasette/commit/1465fea4798599eccfe7e8f012bd8d9adfac3039?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (1465fea) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n> :exclamation: Current head 65a5d5e differs from pull request most recent head ec2d1e4. Consider uploading reports for the commit ec2d1e4 to get more accurate results\n\n```diff\n@@ Coverage Diff @@\n## main #1693 +/- ##\n=======================================\n Coverage 91.67% 91.67% \n=======================================\n Files 36 36 \n Lines 4658 4658 \n=======================================\n Hits 4270 4270 \n Misses 388 388 \n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1693?src=pr&el=continue&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1693?src=pr&el=footer&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Last update [1d33fd0...ec2d1e4](https://codecov.io/gh/simonw/datasette/pull/1693?src=pr&el=lastupdated&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). 
Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1184850337, "label": "Bump black from 22.1.0 to 22.3.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1694#issuecomment-1081860312", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1694", "id": 1081860312, "node_id": "IC_kwDOBm6k_c5Ae-DY", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2022-03-29T13:17:30Z", "updated_at": "2022-03-29T13:17:30Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1694?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report\n> Merging [#1694](https://codecov.io/gh/simonw/datasette/pull/1694?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (83ff967) into [main](https://codecov.io/gh/simonw/datasette/commit/e73fa72917ca28c152208d62d07a490c81cadf52?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (e73fa72) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n```diff\n@@ Coverage Diff @@\n## main #1694 +/- ##\n=======================================\n Coverage 91.74% 91.74% \n=======================================\n Files 34 34 \n Lines 4565 4565 \n=======================================\n Hits 4188 4188 \n Misses 377 377 \n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1694?src=pr&el=continue&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1694?src=pr&el=footer&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Last update [e73fa72...83ff967](https://codecov.io/gh/simonw/datasette/pull/1694?src=pr&el=lastupdated&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). 
Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1184850675, "label": "Update click requirement from <8.1.0,>=7.1.1 to >=7.1.1,<8.2.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/421#issuecomment-1081079506", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/421", "id": 1081079506, "node_id": "IC_kwDOCGYnMM5Ab_bS", "user": {"value": 24938923, "label": "learning4life"}, "created_at": "2022-03-28T19:58:55Z", "updated_at": "2022-03-28T20:05:57Z", "author_association": "NONE", "body": "Sure, it is from the documentation example:\r\n[Extracting columns into a separate table](https://sqlite-utils.datasette.io/en/stable/cli.html#extracting-columns-into-a-separate-table)\r\n\r\n\r\n```\r\nwget \"https://github.com/wri/global-power-plant-database/blob/232a6666/output_database/global_power_plant_database.csv?raw=true\"\r\n\r\nsqlite-utils insert global.db power_plants \\\r\n 'global_power_plant_database.csv?raw=true' --csv\r\n# Extract those columns:\r\nsqlite-utils extract global.db power_plants country country_long \\\r\n --table countries \\\r\n --fk-column country_id \\\r\n --rename country_long name\r\n```\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1180427792, "label": "\"Error: near \"(\": syntax error\" when using sqlite-utils indexes CLI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1081047053", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/420", "id": 1081047053, "node_id": "IC_kwDOCGYnMM5Ab3gN", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-28T19:22:37Z", "updated_at": "2022-03-28T19:22:37Z", "author_association": "OWNER", "body": "Wrote about this in my weeknotes: https://simonwillison.net/2022/Mar/28/datasette-auth0/#new-features-as-documentation", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1178546862, "label": "Document how to use a `--convert` function that runs initialization code first"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1080141111", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/420", "id": 1080141111, "node_id": "IC_kwDOCGYnMM5AYaU3", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-28T03:25:57Z", "updated_at": "2022-03-28T03:54:37Z", "author_association": "OWNER", "body": "So now this should solve your problem:\r\n```\r\necho '[{\"name\": \"notaword\"}, {\"name\": \"word\"}]\r\n' | python3 -m sqlite_utils insert listings.db listings - --convert '\r\nimport enchant\r\nd = enchant.Dict(\"en_US\")\r\n\r\ndef convert(row):\r\n global d\r\n row[\"is_dictionary_word\"] = d.check(row[\"name\"])\r\n'\r\n```", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 1, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1178546862, "label": "Document how to use a `--convert` function that 
runs initialization code first"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1688#issuecomment-1079806857", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1688", "id": 1079806857, "node_id": "IC_kwDOBm6k_c5AXIuJ", "user": {"value": 9020979, "label": "hydrosquall"}, "created_at": "2022-03-27T01:01:14Z", "updated_at": "2022-03-27T01:01:14Z", "author_association": "NONE", "body": "Thank you! I went through the cookiecutter template, and published my first package here: https://github.com/hydrosquall/datasette-nteract-data-explorer", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1181432624, "label": "[plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1690#issuecomment-1079788375", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1690", "id": 1079788375, "node_id": "IC_kwDOBm6k_c5AXENX", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-26T22:43:00Z", "updated_at": "2022-03-26T22:43:00Z", "author_association": "OWNER", "body": "Then I can update this section of the documentation which currently recommends the above pattern: https://docs.datasette.io/en/stable/authentication.html#the-ds-actor-cookie", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1182141761, "label": "Idea: `datasette.set_actor_cookie(response, actor)`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1690#issuecomment-1079788346", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1690", "id": 1079788346, "node_id": "IC_kwDOBm6k_c5AXEM6", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-26T22:42:40Z", "updated_at": "2022-03-26T22:42:40Z", "author_association": "OWNER", "body": "I don't want to do a `response.set_actor_cookie()` method because I like `Response` not to carry too many Datasette-specific features.\r\n\r\nSo `datasette.set_actor_cookie(response, actor, expire_after=None)` would be a better place for this I think.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1182141761, "label": "Idea: `datasette.set_actor_cookie(response, actor)`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1689#issuecomment-1079779040", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1689", "id": 1079779040, "node_id": "IC_kwDOBm6k_c5AXB7g", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-26T21:35:57Z", "updated_at": "2022-03-26T21:35:57Z", "author_association": "OWNER", "body": "Fixed: https://docs.datasette.io/en/latest/internals.html#add-message-request-message-type-datasette-info", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1182065616, "label": "datasette.add_message() documentation is incorrect"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1688#issuecomment-1079582485", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/1688", "id": 1079582485, "node_id": "IC_kwDOBm6k_c5AWR8V", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-26T03:15:34Z", "updated_at": "2022-03-26T03:15:34Z", "author_association": "OWNER", "body": "Yup, you're right in what you figured out here: stand-alone plugins can't currently package static assets other then using the static folder.\r\n\r\nThe `datasette-plugin` cookiecutter template should make creating a Python package pretty easy though: https://github.com/simonw/datasette-plugin\r\n\r\nYou can run that yourself, or you can run it using this GitHub template repository: https://github.com/simonw/datasette-plugin-template-repository \r\n\r\n", "reactions": "{\"total_count\": 1, \"+1\": 1, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1181432624, "label": "[plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1688#issuecomment-1079550754", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1688", "id": 1079550754, "node_id": "IC_kwDOBm6k_c5AWKMi", "user": {"value": 9020979, "label": "hydrosquall"}, "created_at": "2022-03-26T01:27:27Z", "updated_at": "2022-03-26T03:16:29Z", "author_association": "NONE", "body": "> Is there a way to serve a static assets when using the plugins/ directory method instead of installing plugins as a new python package?\r\n\r\nAs a workaround, I found I can serve my statics from a non-plugin specific folder using the [--static](https://docs.datasette.io/en/stable/custom_templates.html#serving-static-files) CLI flag.\r\n\r\n```bash\r\ndatasette ~/Library/Safari/History.db \\\r\n --plugins-dir=plugins/ \\\r\n --static assets:dist/\r\n```\r\n\r\nIt's not ideal because it means I'll change the cache pattern path depending on how the plugin is running (via pip install or as a one off script), but it's usable as a workaround.\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1181432624, "label": "[plugins][documentation] Is it possible to serve per-plugin static folders when writing one-off (single file) plugins?"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/417#issuecomment-1079441621", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/417", "id": 1079441621, "node_id": "IC_kwDOCGYnMM5AVvjV", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-25T21:18:37Z", "updated_at": "2022-03-25T21:18:37Z", "author_association": "OWNER", "body": "Updated documentation: https://sqlite-utils.datasette.io/en/latest/cli.html#inserting-newline-delimited-json", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1175744654, "label": "insert fails on JSONL with whitespace"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/421#issuecomment-1079407962", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/421", "id": 1079407962, "node_id": "IC_kwDOCGYnMM5AVnVa", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-25T20:25:10Z", "updated_at": "2022-03-25T20:25:18Z", 
"author_association": "OWNER", "body": "Can you share either your whole `global.db` table or a shrunk down example that illustrates the bug?\r\n\r\nMy hunch is that you may have a table or column with a name that triggers the error.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1180427792, "label": "\"Error: near \"(\": syntax error\" when using sqlite-utils indexes CLI"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/422#issuecomment-1079406708", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/422", "id": 1079406708, "node_id": "IC_kwDOCGYnMM5AVnB0", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-25T20:23:21Z", "updated_at": "2022-03-25T20:23:21Z", "author_association": "OWNER", "body": "Fixing this would require a bump to 4.0 because it would break existing code.\r\n\r\nThe alternative would be to introduce a new `ignore_nulls=True` parameter which users can change to `ignore_nulls=False`. Or come up with better wording for that.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1181236173, "label": "Reconsider not running convert functions against null values"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1079404281", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/420", "id": 1079404281, "node_id": "IC_kwDOCGYnMM5AVmb5", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-25T20:19:50Z", "updated_at": "2022-03-25T20:19:50Z", "author_association": "OWNER", "body": "Now documented here: https://sqlite-utils.datasette.io/en/latest/cli.html#using-a-convert-function-to-execute-initialization", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1178546862, "label": "Document how to use a `--convert` function that runs initialization code first"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1079384771", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/420", "id": 1079384771, "node_id": "IC_kwDOCGYnMM5AVhrD", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-25T19:51:34Z", "updated_at": "2022-03-25T19:53:01Z", "author_association": "OWNER", "body": "This works:\r\n```\r\n% sqlite-utils insert dogs.db dogs dogs.json --convert '\r\nimport random\r\nprint(\"seeding\")\r\nrandom.seed(10)\r\nprint(random.random())\r\n\r\ndef convert(row):\r\n global random\r\n print(row)\r\n row[\"random_score\"] = random.random()\r\n'\r\nseeding\r\n0.5714025946899135\r\n{'id': 1, 'name': 'Cleo'}\r\n{'id': 2, 'name': 'Pancakes'}\r\n{'id': 3, 'name': 'New dog'}\r\n(sqlite-utils) sqlite-utils % sqlite-utils rows dogs.db dogs\r\n[{\"id\": 1, \"name\": \"Cleo\", \"random_score\": 0.4288890546751146},\r\n {\"id\": 2, \"name\": \"Pancakes\", \"random_score\": 0.5780913011344704},\r\n {\"id\": 3, \"name\": \"New dog\", \"random_score\": 0.20609823213950174}]\r\n```\r\nHaving to use `global random` inside the function is frustrating but apparently necessary. 
https://stackoverflow.com/a/56552138/6083", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1178546862, "label": "Document how to use a `--convert` function that runs initialization code first"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1079376283", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/420", "id": 1079376283, "node_id": "IC_kwDOCGYnMM5AVfmb", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-25T19:39:30Z", "updated_at": "2022-03-25T19:43:35Z", "author_association": "OWNER", "body": "Actually this doesn't work as I thought. This demo shows that the initialization code is run once per item, not a single time at the start of the run:\r\n```\r\n% sqlite-utils insert dogs.db dogs dogs.json --convert '\r\nimport random\r\nprint(\"seeding\")\r\nrandom.seed(10)\r\nprint(random.random())\r\n\r\ndef convert(row):\r\n print(row)\r\n row[\"random_score\"] = random.random()\r\n'\r\nseeding\r\n0.5714025946899135\r\nseeding\r\n0.5714025946899135\r\nseeding\r\n0.5714025946899135\r\nseeding\r\n0.5714025946899135\r\n```\r\nAlso that `print(row)` line is not being printed anywhere that gets to the console for some reason.\r\n\r\n... my mistake, that happened because I changed this line in order to try to get local imports to work:\r\n```python\r\n try:\r\n exec(code, globals, locals)\r\n return globals[\"convert\"]\r\n except (AttributeError, SyntaxError, NameError, KeyError, TypeError):\r\n```\r\nIt should be `locals[\"convert\"]`", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1178546862, "label": "Document how to use a `--convert` function that runs initialization code first"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1079243535", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/420", "id": 1079243535, "node_id": "IC_kwDOCGYnMM5AU_MP", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-25T17:25:12Z", "updated_at": "2022-03-25T17:25:12Z", "author_association": "OWNER", "body": "That documentation is split across a few places. 
This is the only bit that talks about `def convert()` pattern right now:\r\n\r\n- https://sqlite-utils.datasette.io/en/stable/cli.html#converting-data-in-columns\r\n\r\nBut that's for `sqlite-utils convert` - the documentation for `sqlite-utils insert --convert` at https://sqlite-utils.datasette.io/en/stable/cli.html#applying-conversions-while-inserting-data doesn't mention it.\r\n\r\nSince both `sqlite-utils convert` and `sqlite-utils insert --convert` apply the same rules to the code, they should link to a shared explanation in the documentation.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1178546862, "label": "Document how to use a `--convert` function that runs initialization code first"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1685#issuecomment-1079018557", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1685", "id": 1079018557, "node_id": "IC_kwDOBm6k_c5AUIQ9", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2022-03-25T13:16:48Z", "updated_at": "2022-03-25T13:16:48Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/datasette/pull/1685?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report\n> Merging [#1685](https://codecov.io/gh/simonw/datasette/pull/1685?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (933ce47) into [main](https://codecov.io/gh/simonw/datasette/commit/c496f2b663ff0cef908ffaaa68b8cb63111fb5f2?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (c496f2b) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n```diff\n@@ Coverage Diff @@\n## main #1685 +/- ##\n=======================================\n Coverage 91.74% 91.74% \n=======================================\n Files 34 34 \n Lines 4565 4565 \n=======================================\n Hits 4188 4188 \n Misses 377 377 \n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/datasette/pull/1685?src=pr&el=continue&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n> **Legend** - [Click here to learn more](https://docs.codecov.io/docs/codecov-delta?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/datasette/pull/1685?src=pr&el=footer&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Last update [c496f2b...933ce47](https://codecov.io/gh/simonw/datasette/pull/1685?src=pr&el=lastupdated&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). 
Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1180778860, "label": "Update jinja2 requirement from <3.1.0,>=2.10.3 to >=2.10.3,<3.2.0"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1078343231", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/420", "id": 1078343231, "node_id": "IC_kwDOCGYnMM5ARjY_", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-24T21:16:10Z", "updated_at": "2022-03-24T21:17:20Z", "author_association": "OWNER", "body": "Aha! This may be possible already: https://github.com/simonw/sqlite-utils/blob/396f80fcc60da8dd844577114f7920830a2e5403/sqlite_utils/utils.py#L311-L316\r\n\r\nAnd yes, this does indeed work - you can do something like this:\r\n\r\n```\r\necho '{\"name\": \"harry\"}' | sqlite-utils insert db.db people - --convert '\r\nimport time\r\n# Simulate something expensive\r\ntime.sleep(1)\r\n\r\ndef convert(row):\r\n row[\"upper\"] = row[\"name\"].upper()\r\n'\r\n```\r\nAnd after running that:\r\n```\r\nsqlite-utils dump db.db \r\nBEGIN TRANSACTION;\r\nCREATE TABLE [people] (\r\n [name] TEXT,\r\n [upper] TEXT\r\n);\r\nINSERT INTO \"people\" VALUES('harry','HARRY');\r\nCOMMIT;\r\n```\r\nSo this is a documentation issue - there's a trick for it but I didn't know what the trick was!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1178546862, "label": "Document how to use a `--convert` function that runs initialization code first"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1078328774", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/420", "id": 1078328774, "node_id": "IC_kwDOCGYnMM5ARf3G", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-24T21:12:33Z", "updated_at": "2022-03-24T21:12:33Z", "author_association": "OWNER", "body": "Here's how the `_compile_code()` mechanism works at the moment: https://github.com/simonw/sqlite-utils/blob/396f80fcc60da8dd844577114f7920830a2e5403/sqlite_utils/utils.py#L308-L342\r\n\r\nAt the end it does this:\r\n```python\r\n return locals[\"fn\"]\r\n```\r\nSo it's already building and then returning a function.\r\n\r\nThe question is if there's a sensible way to allow people to further customize that function by executing some code first, in a way that's easy to explain.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1178546862, "label": "Document how to use a `--convert` function that runs initialization code first"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1078322301", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/420", "id": 1078322301, "node_id": "IC_kwDOCGYnMM5AReR9", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-24T21:10:52Z", "updated_at": "2022-03-24T21:10:52Z", "author_association": "OWNER", "body": "I can think of three ways forward:\r\n\r\n- Figure out a pattern 
that gets that local file import workaround to work\r\n- Add another option such as `--convert-init` that lets you pass code that will be executed once at the start\r\n- Come up with a pattern where the `--convert` code can run some initialization code and then return a function which will be called against each value\r\n\r\nI quite like the idea of that third option - I'm going to prototype it and see if I can work something out.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1178546862, "label": "Document how to use a `--convert` function that runs initialization code first"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/420#issuecomment-1078315922", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/420", "id": 1078315922, "node_id": "IC_kwDOCGYnMM5ARcuS", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-24T21:09:27Z", "updated_at": "2022-03-24T21:09:27Z", "author_association": "OWNER", "body": "Yeah, this is WAY harder than it should be.\r\n\r\nThere's a clumsy workaround you could use which looks something like this: create a file `my_enchant.py` containing:\r\n\r\n```python\r\nimport enchant\r\nd = enchant.Dict(\"en_US\")\r\n\r\ndef check(word):\r\n return d.check(word)\r\n```\r\nThen run `sqlite-utils` like this:\r\n\r\n```\r\nPYTHONPATH=. cat items.json | jq '.data' | sqlite-utils insert listings.db listings - --convert 'my_enchant.check(value)' --import my_enchant\r\n```\r\nExcept I tried that and it doesn't work! I don't know the right pattern for getting `--import` to work with modules in the same directory.\r\n\r\nSo yeah, this is definitely a big feature gap.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1178546862, "label": "Document how to use a `--convert` function that runs initialization code first"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1684#issuecomment-1078126065", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1684", "id": 1078126065, "node_id": "IC_kwDOBm6k_c5AQuXx", "user": {"value": 536941, "label": "fgregg"}, "created_at": "2022-03-24T20:08:56Z", "updated_at": "2022-03-24T20:13:19Z", "author_association": "CONTRIBUTOR", "body": "would be nice if the behavior was\r\n\r\n1. try to facet all the columns\r\n2. for bigger tables try to facet the indexed columns\r\n3. for the biggest tables, turn off autofacetting completely\r\n\r\nThis is based on my assumption that what determines autofaceting is the rarity of unique values. Which may not be true!", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1179998071, "label": "Mechanism for disabling faceting on large tables only"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/399#issuecomment-1077671779", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/399", "id": 1077671779, "node_id": "IC_kwDOCGYnMM5AO_dj", "user": {"value": 25778, "label": "eyeseast"}, "created_at": "2022-03-24T14:11:33Z", "updated_at": "2022-03-24T14:11:43Z", "author_association": "CONTRIBUTOR", "body": "Coming back to this. 
I was about to add a utility function to [datasette-geojson]() to convert lat/lng columns to geometries. Thankfully I googled first. There's a SpatiaLite function for this: [MakePoint](https://www.gaia-gis.it/gaia-sins/spatialite-sql-latest.html#p0).\r\n\r\n```sql\r\nselect MakePoint(longitude, latitude) as geometry from places;\r\n```\r\n\r\nI'm not sure if that would work with `conversions`, since it needs two columns, but it's an option for tables that already have latitude, longitude columns.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1124731464, "label": "Make it easier to insert geometries, with documentation and maybe code"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1581#issuecomment-1077047295", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1581", "id": 1077047295, "node_id": "IC_kwDOBm6k_c5AMm__", "user": {"value": 536941, "label": "fgregg"}, "created_at": "2022-03-24T04:08:18Z", "updated_at": "2022-03-24T04:08:18Z", "author_association": "CONTRIBUTOR", "body": "this has been addressed by the datasette-hashed-urls plugin", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1089529555, "label": "when hashed urls are turned on, the _memory db has improperly long-lived cache expiry"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1582#issuecomment-1077047152", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1582", "id": 1077047152, "node_id": "IC_kwDOBm6k_c5AMm9w", "user": {"value": 536941, "label": "fgregg"}, "created_at": "2022-03-24T04:07:58Z", "updated_at": "2022-03-24T04:07:58Z", "author_association": "CONTRIBUTOR", "body": "this has been obviated by the datasette-hashed-urls plugin", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1090055810, "label": "don't set far expiry if hash is '000'"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1682#issuecomment-1076696791", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1682", "id": 1076696791, "node_id": "IC_kwDOBm6k_c5ALRbX", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-23T18:45:49Z", "updated_at": "2022-03-23T18:45:49Z", "author_association": "OWNER", "body": "The problem is here in `QueryView`: https://github.com/simonw/datasette/blob/d7c793d7998388d915f8d270079c68a77a785051/datasette/views/database.py#L206-L238\r\n\r\nIt should be resolving `database` based on the route path, as seen in other methods like this one: https://github.com/simonw/datasette/blob/d7c793d7998388d915f8d270079c68a77a785051/datasette/views/table.py#L270-L279\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1178521513, "label": "SQL queries against databases with different routes are broken"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1670#issuecomment-1076683297", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1670", "id": 1076683297, "node_id": "IC_kwDOBm6k_c5ALOIh", "user": {"value": 9599, "label": 
"simonw"}, "created_at": "2022-03-23T18:32:32Z", "updated_at": "2022-03-23T18:32:32Z", "author_association": "OWNER", "body": "Added this to news on https://datasette.io/ https://github.com/simonw/datasette.io/commit/fd3ec57cdd5b935f75cbf52a86b3aabf2c97d217", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1174423568, "label": "Ship Datasette 0.61"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1670#issuecomment-1076666293", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1670", "id": 1076666293, "node_id": "IC_kwDOBm6k_c5ALJ-1", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-23T18:16:29Z", "updated_at": "2022-03-23T18:16:29Z", "author_association": "OWNER", "body": "https://docs.datasette.io/en/stable/changelog.html#v0-61", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1174423568, "label": "Ship Datasette 0.61"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1670#issuecomment-1076665837", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1670", "id": 1076665837, "node_id": "IC_kwDOBm6k_c5ALJ3t", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-23T18:16:01Z", "updated_at": "2022-03-23T18:16:01Z", "author_association": "OWNER", "body": "https://github.com/simonw/datasette/releases/tag/0.61\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1174423568, "label": "Ship Datasette 0.61"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/pull/419#issuecomment-1076662556", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/419", "id": 1076662556, "node_id": "IC_kwDOCGYnMM5ALJEc", "user": {"value": 22429695, "label": "codecov[bot]"}, "created_at": "2022-03-23T18:12:47Z", "updated_at": "2022-03-23T18:12:47Z", "author_association": "NONE", "body": "# [Codecov](https://codecov.io/gh/simonw/sqlite-utils/pull/419?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) Report\n> Merging [#419](https://codecov.io/gh/simonw/sqlite-utils/pull/419?src=pr&el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (228f736) into [main](https://codecov.io/gh/simonw/sqlite-utils/commit/93fa79d30b1531bea281d0eb6b925c4e61bc1aa6?el=desc&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison) (93fa79d) will **not change** coverage.\n> The diff coverage is `n/a`.\n\n```diff\n@@ Coverage Diff @@\n## main #419 +/- ##\n=======================================\n Coverage 96.55% 96.55% \n=======================================\n Files 6 6 \n Lines 2498 2498 \n=======================================\n Hits 2412 2412 \n Misses 86 86 \n```\n\n\n\n------\n\n[Continue to review full report at Codecov](https://codecov.io/gh/simonw/sqlite-utils/pull/419?src=pr&el=continue&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n> **Legend** - [Click here to learn 
more](https://docs.codecov.io/docs/codecov-delta?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison)\n> `\u0394 = absolute (impact)`, `\u00f8 = not affected`, `? = missing data`\n> Powered by [Codecov](https://codecov.io/gh/simonw/sqlite-utils/pull/419?src=pr&el=footer&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Last update [93fa79d...228f736](https://codecov.io/gh/simonw/sqlite-utils/pull/419?src=pr&el=lastupdated&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison). Read the [comment docs](https://docs.codecov.io/docs/pull-request-comments?utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=Simon+Willison).\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1178484369, "label": "Ignore common generated files"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1670#issuecomment-1076652046", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1670", "id": 1076652046, "node_id": "IC_kwDOBm6k_c5ALGgO", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-23T18:02:30Z", "updated_at": "2022-03-23T18:02:30Z", "author_association": "OWNER", "body": "Two new things to add to the release notes from https://github.com/simonw/datasette/compare/0.61a0...main\r\n- https://github.com/simonw/datasette/issues/1678\r\n- https://github.com/simonw/datasette/issues/1675 (now also a documented API)", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1174423568, "label": "Ship Datasette 0.61"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1670#issuecomment-1076647495", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1670", "id": 1076647495, "node_id": "IC_kwDOBm6k_c5ALFZH", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-23T17:58:16Z", "updated_at": "2022-03-23T17:58:16Z", "author_association": "OWNER", "body": "I think the release notes are fine, but they need an opening paragraph highlighting the changes that are most likely to break backwards compatibility.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1174423568, "label": "Ship Datasette 0.61"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1574#issuecomment-1076645636", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1574", "id": 1076645636, "node_id": "IC_kwDOBm6k_c5ALE8E", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-23T17:56:35Z", "updated_at": "2022-03-23T17:56:35Z", "author_association": "OWNER", "body": "I'd actually like to switch to slim as the default - I think Datasette should ship the smallest possible container that can still support extra packages being installed using `apt-get install`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1084193403, "label": "introduce new option for datasette package to use a slim base image"}, 
"performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1665#issuecomment-1076644362", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1665", "id": 1076644362, "node_id": "IC_kwDOBm6k_c5ALEoK", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-23T17:55:39Z", "updated_at": "2022-03-23T17:55:39Z", "author_association": "OWNER", "body": "Thanks for the PR - I spotted an error about this and went through and fixed this in all of my repos the other day: https://github.com/search?o=desc&q=user%3Asimonw+google-github-actions%2Fsetup-gcloud%40v0&s=committer-date&type=Commits", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1173828092, "label": "Pin setup-gcloud to v0 instead of master"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1670#issuecomment-1076638278", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1670", "id": 1076638278, "node_id": "IC_kwDOBm6k_c5ALDJG", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-23T17:50:55Z", "updated_at": "2022-03-23T17:50:55Z", "author_association": "OWNER", "body": "Release notes are mostly written for the alpha, just need to clean them up a bit https://github.com/simonw/datasette/blob/c4c9dbd0386e46d2bf199f0ed34e4895c98cb78c/docs/changelog.rst#061a0-2022-03-19", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1174423568, "label": "Ship Datasette 0.61"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1681#issuecomment-1075438684", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1681", "id": 1075438684, "node_id": "IC_kwDOBm6k_c5AGeRc", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-22T17:45:50Z", "updated_at": "2022-03-22T17:49:09Z", "author_association": "OWNER", "body": "I would expect this to break against SQL views that include calculated columns though - something like this:\r\n\r\n```sql\r\ncreate view this_will_break as select pk + 1 as pk_plus_one, 0.5 as score from searchable;\r\n```\r\nConfirmed: the filter interface for that view plain doesn't work for any comparison against that table - except for `score > 0` since `0` is converted to an integer. 
`0.1` breaks though because it doesn't get converted as it doesn't match `.isdigit()`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1177101697, "label": "Potential bug in numeric handling where_clause for filters"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1681#issuecomment-1075437598", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1681", "id": 1075437598, "node_id": "IC_kwDOBm6k_c5AGeAe", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-22T17:44:42Z", "updated_at": "2022-03-22T17:45:04Z", "author_association": "OWNER", "body": "My hunch is that this mechanism doesn't actually do anything useful at all, because of the type conversion that automatically happens for data from tables based on the column type affinities, see:\r\n- #1671\r\n\r\nSo either remove the `self.numeric` type conversion bit entirely, or prove that it is necessary and upgrade it to be able to handle floating point values too.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1177101697, "label": "Potential bug in numeric handling where_clause for filters"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1671#issuecomment-1075435185", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1671", "id": 1075435185, "node_id": "IC_kwDOBm6k_c5AGdax", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-22T17:42:09Z", "updated_at": "2022-03-22T17:42:09Z", "author_association": "OWNER", "body": "Also made me realize that this query:\r\n```sql\r\nselect * from sortable where sortable > :p0\r\n```\r\nOnly works here thanks to the column affinity thing kicking in too: https://latest.datasette.io/fixtures?sql=select+*+from+sortable+where+sortable+%3E+%3Ap0&p0=70", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1174655187, "label": "Filters fail to work correctly against calculated numeric columns returned by SQL views because type affinity rules do not apply"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1671#issuecomment-1075432283", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1671", "id": 1075432283, "node_id": "IC_kwDOBm6k_c5AGctb", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-22T17:39:04Z", "updated_at": "2022-03-22T17:43:12Z", "author_association": "OWNER", "body": "Note that Datasette does already have special logic to convert parameters to integers for numeric comparisons like `>`:\r\n\r\nhttps://github.com/simonw/datasette/blob/c4c9dbd0386e46d2bf199f0ed34e4895c98cb78c/datasette/filters.py#L203-L212\r\n\r\nThough... it looks like there's a bug in that? 
It doesn't account for `float` values - `\"3.5\".isdigit()` return `False` - probably for the best, because `int(3.5)` would break that value anyway.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1174655187, "label": "Filters fail to work correctly against calculated numeric columns returned by SQL views because type affinity rules do not apply"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1671#issuecomment-1075428030", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1671", "id": 1075428030, "node_id": "IC_kwDOBm6k_c5AGbq-", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-22T17:34:30Z", "updated_at": "2022-03-22T17:34:30Z", "author_association": "OWNER", "body": "No, I think I need to use `cast` - I can't think of any way to ask SQLite \"for this query, what types are the columns that will come back from it?\"\r\n\r\nEven the details from the `explain` trick explored in #1293 don't seem to come back with column type information: https://latest.datasette.io/fixtures?sql=explain+select+pk%2C+text1%2C+text2%2C+[name+with+.+and+spaces]+from+searchable_view+where+%22pk%22+%3D+%3Ap0&p0=1", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1174655187, "label": "Filters fail to work correctly against calculated numeric columns returned by SQL views because type affinity rules do not apply"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1671#issuecomment-1075425513", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1671", "id": 1075425513, "node_id": "IC_kwDOBm6k_c5AGbDp", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-22T17:31:53Z", "updated_at": "2022-03-22T17:31:53Z", "author_association": "OWNER", "body": "The alternative to using `cast` here would be for Datasette to convert the `\"1\"` to a `1` in Python code before passing it as a param.\r\n\r\nThis feels a bit neater to me, but I still then need to solve the problem of how to identify the \"type\" of a column that I want to use in a query.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1174655187, "label": "Filters fail to work correctly against calculated numeric columns returned by SQL views because type affinity rules do not apply"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/339#issuecomment-1074479932", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/339", "id": 1074479932, "node_id": "IC_kwDOBm6k_c5AC0M8", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-21T22:22:34Z", "updated_at": "2022-03-21T22:22:34Z", "author_association": "OWNER", "body": "Closing this as obsolete since Datasette no longer uses Sanic.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 340396247, "label": "Expose SANIC_RESPONSE_TIMEOUT config option in a sensible way"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/276#issuecomment-1074479768", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/276", "id": 1074479768, "node_id": "IC_kwDOBm6k_c5AC0KY", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-03-21T22:22:20Z", "updated_at": "2022-03-21T22:22:20Z", "author_association": "OWNER", "body": "I'm closing this issue because this is now solved by a number of neat plugins:\r\n\r\n- https://datasette.io/plugins/datasette-geojson-map shows the geometry from SpatiaLite columns on a map\r\n- https://datasette.io/plugins/datasette-leaflet-geojson can be used to display inline maps next to each column", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 324835838, "label": "Handle spatialite geometry columns better"}, "performed_via_github_app": null}