[^/]+?$)\", table),\r\n ]\r\n```\r\nI'll use a `/t/` prefix for the moment, but this is probably something I'll fix in Datasette itself later.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 648435885, "label": "New pattern for views that return either JSON or HTML, available for plugins"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1519#issuecomment-974420619", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1519", "id": 974420619, "node_id": "IC_kwDOBm6k_c46FHqL", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-11-19T20:25:19Z", "updated_at": "2021-11-19T20:25:19Z", "author_association": "OWNER", "body": "The implementations of `path_with_removed_args` and `path_with_format`:\r\n\r\nhttps://github.com/simonw/datasette/blob/85849935292e500ab7a99f8fe0f9546e903baad3/datasette/utils/__init__.py#L228-L254\r\n\r\nhttps://github.com/simonw/datasette/blob/85849935292e500ab7a99f8fe0f9546e903baad3/datasette/utils/__init__.py#L710-L729", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058790545, "label": "base_url is omitted in JSON and CSV views"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1519#issuecomment-974398399", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1519", "id": 974398399, "node_id": "IC_kwDOBm6k_c46FCO_", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-11-19T20:08:20Z", "updated_at": "2021-11-19T20:22:02Z", "author_association": "OWNER", "body": "The relevant test is this one: https://github.com/simonw/datasette/blob/30255055150d7bc0affc8156adc18295495020ff/tests/test_html.py#L1608-L1649\r\n\r\nI modified that test to add `\"/fixtures/facetable?sql=select+1\"` as one of the tested paths, and dropped in an `assert False` to pause it in the debugger:\r\n```\r\n @pytest.mark.parametrize(\r\n \"path\",\r\n [\r\n \"/\",\r\n \"/fixtures\",\r\n \"/fixtures/compound_three_primary_keys\",\r\n \"/fixtures/compound_three_primary_keys/a,a,a\",\r\n \"/fixtures/paginated_view\",\r\n \"/fixtures/facetable\",\r\n \"/fixtures?sql=select+1\",\r\n ],\r\n )\r\n def test_base_url_config(app_client_base_url_prefix, path):\r\n client = app_client_base_url_prefix\r\n response = client.get(\"/prefix/\" + path.lstrip(\"/\"))\r\n soup = Soup(response.body, \"html.parser\")\r\n if path == \"/fixtures?sql=select+1\":\r\n> assert False\r\nE assert False\r\n```\r\nBUT... in the debugger:\r\n```\r\n(Pdb) print(soup)\r\n...\r\nThis data as\r\n json,\r\n testall,\r\n testnone,\r\n testresponse,\r\n CSV
\r\n```\r\nThose all have the correct prefix! But that's not what I'm seeing in my `Dockerfile` reproduction of the issue.\r\n\r\nSomething very weird is going on here.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058790545, "label": "base_url is omitted in JSON and CSV views"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1520#issuecomment-974308215", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1520", "id": 974308215, "node_id": "IC_kwDOBm6k_c46EsN3", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-11-19T18:29:26Z", "updated_at": "2021-11-19T18:29:26Z", "author_association": "OWNER", "body": "The solution that jumps to mind first is that it would be neat if routes could return something that meant \"actually my bad, I can't handle this after all - move to the next one in the list\".\r\n\r\nA related idea: it might be useful for custom views like my one here to say \"no actually call the default view for this, but give me back the response so I can modify it in some way\". Kind of like Django or ASGI middleware.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058803238, "label": "Pattern for avoiding accidental URL over-rides"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1521#issuecomment-974336020", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1521", "id": 974336020, "node_id": "IC_kwDOBm6k_c46EzAU", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-11-19T19:10:48Z", "updated_at": "2021-11-19T19:10:48Z", "author_association": "OWNER", "body": "There's a promising looking minimal Apache 2 proxy config here: https://stackoverflow.com/questions/26474476/minimal-configuration-for-apache-reverse-proxy-in-docker-container\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058815557, "label": "Docker configuration for exercising Datasette behind Apache mod_proxy"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1521#issuecomment-974433520", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1521", "id": 974433520, "node_id": "IC_kwDOBm6k_c46FKzw", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-11-19T20:32:29Z", "updated_at": "2021-11-19T20:32:29Z", "author_association": "OWNER", "body": "This configuration works great.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058815557, "label": "Docker configuration for exercising Datasette behind Apache mod_proxy"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/878#issuecomment-973635157", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/878", "id": 973635157, "node_id": "IC_kwDOBm6k_c46CH5V", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-11-19T01:07:08Z", "updated_at": "2021-11-19T01:07:08Z", "author_association": "OWNER", "body": "This exercise is proving so useful in getting my head around how the enormous and complex `TableView` class works again.\r\n\r\nHere's where I've got to now - I'm systematically working through the variables that are returned for HTML and for JSON copying across code to get it to work:\r\n\r\n```python\r\nfrom datasette.database import QueryInterrupted\r\nfrom datasette.utils import escape_sqlite\r\nfrom datasette.utils.asgi import Response, NotFound, Forbidden\r\nfrom datasette.views.base import DatasetteError\r\nfrom datasette import hookimpl\r\nfrom asyncinject import AsyncInject, inject\r\nfrom pprint import pformat\r\n\r\n\r\nclass Table(AsyncInject):\r\n @inject\r\n async def database(self, request, datasette):\r\n # TODO: all that nasty hash resolving stuff can go here\r\n db_name = request.url_vars[\"db_name\"]\r\n try:\r\n db = datasette.databases[db_name]\r\n except KeyError:\r\n raise NotFound(f\"Database '{db_name}' does not exist\")\r\n return db\r\n\r\n @inject\r\n async def table_and_format(self, request, database, datasette):\r\n table_and_format = request.url_vars[\"table_and_format\"]\r\n # TODO: be a lot smarter here\r\n if \".\" in table_and_format:\r\n return table_and_format.split(\".\", 2)\r\n else:\r\n return table_and_format, \"html\"\r\n\r\n @inject\r\n async def main(self, request, database, table_and_format, datasette):\r\n # TODO: if this is actually a canned query, dispatch to it\r\n\r\n table, format = table_and_format\r\n\r\n is_view = bool(await database.get_view_definition(table))\r\n table_exists = bool(await database.table_exists(table))\r\n if not is_view and not 
table_exists:\r\n raise NotFound(f\"Table not found: {table}\")\r\n\r\n await check_permissions(\r\n datasette,\r\n request,\r\n [\r\n (\"view-table\", (database.name, table)),\r\n (\"view-database\", database.name),\r\n \"view-instance\",\r\n ],\r\n )\r\n\r\n private = not await datasette.permission_allowed(\r\n None, \"view-table\", (database.name, table), default=True\r\n )\r\n\r\n pks = await database.primary_keys(table)\r\n table_columns = await database.table_columns(table)\r\n\r\n specified_columns = await columns_to_select(datasette, database, table, request)\r\n select_specified_columns = \", \".join(\r\n escape_sqlite(t) for t in specified_columns\r\n )\r\n select_all_columns = \", \".join(escape_sqlite(t) for t in table_columns)\r\n\r\n use_rowid = not pks and not is_view\r\n if use_rowid:\r\n select_specified_columns = f\"rowid, {select_specified_columns}\"\r\n select_all_columns = f\"rowid, {select_all_columns}\"\r\n order_by = \"rowid\"\r\n order_by_pks = \"rowid\"\r\n else:\r\n order_by_pks = \", \".join([escape_sqlite(pk) for pk in pks])\r\n order_by = order_by_pks\r\n\r\n if is_view:\r\n order_by = \"\"\r\n\r\n nocount = request.args.get(\"_nocount\")\r\n nofacet = request.args.get(\"_nofacet\")\r\n\r\n if request.args.get(\"_shape\") in (\"array\", \"object\"):\r\n nocount = True\r\n nofacet = True\r\n\r\n # Next, a TON of SQL to build where_params and filters and suchlike\r\n # skipping that and jumping straight to...\r\n where_clauses = []\r\n where_clause = \"\"\r\n if where_clauses:\r\n where_clause = f\"where {' and '.join(where_clauses)} \"\r\n\r\n from_sql = \"from {table_name} {where}\".format(\r\n table_name=escape_sqlite(table),\r\n where=(\"where {} \".format(\" and \".join(where_clauses)))\r\n if where_clauses\r\n else \"\",\r\n )\r\n from_sql_params ={}\r\n params = {}\r\n count_sql = f\"select count(*) {from_sql}\"\r\n sql_no_order_no_limit = (\r\n \"select {select_all_columns} from {table_name} {where}\".format(\r\n 
select_all_columns=select_all_columns,\r\n table_name=escape_sqlite(table),\r\n where=where_clause,\r\n )\r\n )\r\n\r\n page_size = 100\r\n offset = \" offset 0\"\r\n\r\n sql = \"select {select_specified_columns} from {table_name} {where}{order_by} limit {page_size}{offset}\".format(\r\n select_specified_columns=select_specified_columns,\r\n table_name=escape_sqlite(table),\r\n where=where_clause,\r\n order_by=order_by,\r\n page_size=page_size + 1,\r\n offset=offset,\r\n )\r\n\r\n # Fetch rows\r\n results = await database.execute(sql, params, truncate=True)\r\n columns = [r[0] for r in results.description]\r\n rows = list(results.rows)\r\n\r\n # Fetch count\r\n filtered_table_rows_count = None\r\n if count_sql:\r\n try:\r\n count_rows = list(await database.execute(count_sql, from_sql_params))\r\n filtered_table_rows_count = count_rows[0][0]\r\n except QueryInterrupted:\r\n pass\r\n\r\n\r\n vars = {\r\n \"json\": {\r\n # THIS STUFF is from the regular JSON\r\n \"database\": database.name,\r\n \"table\": table,\r\n \"is_view\": is_view,\r\n # \"human_description_en\": human_description_en,\r\n \"rows\": rows[:page_size],\r\n \"truncated\": results.truncated,\r\n \"filtered_table_rows_count\": filtered_table_rows_count,\r\n # \"expanded_columns\": expanded_columns,\r\n # \"expandable_columns\": expandable_columns,\r\n \"columns\": columns,\r\n \"primary_keys\": pks,\r\n # \"units\": units,\r\n \"query\": {\"sql\": sql, \"params\": params},\r\n # \"facet_results\": facet_results,\r\n # \"suggested_facets\": suggested_facets,\r\n # \"next\": next_value and str(next_value) or None,\r\n # \"next_url\": next_url,\r\n \"private\": private,\r\n \"allow_execute_sql\": await datasette.permission_allowed(\r\n request.actor, \"execute-sql\", database, default=True\r\n ),\r\n },\r\n \"html\": {\r\n # ... 
this is the HTML special stuff\r\n # \"table_actions\": table_actions,\r\n # \"supports_search\": bool(fts_table),\r\n # \"search\": search or \"\",\r\n \"use_rowid\": use_rowid,\r\n # \"filters\": filters,\r\n # \"display_columns\": display_columns,\r\n # \"filter_columns\": filter_columns,\r\n # \"display_rows\": display_rows,\r\n # \"facets_timed_out\": facets_timed_out,\r\n # \"sorted_facet_results\": sorted(\r\n # facet_results.values(),\r\n # key=lambda f: (len(f[\"results\"]), f[\"name\"]),\r\n # reverse=True,\r\n # ),\r\n # \"show_facet_counts\": special_args.get(\"_facet_size\") == \"max\",\r\n # \"extra_wheres_for_ui\": extra_wheres_for_ui,\r\n # \"form_hidden_args\": form_hidden_args,\r\n # \"is_sortable\": any(c[\"sortable\"] for c in display_columns),\r\n # \"path_with_replaced_args\": path_with_replaced_args,\r\n # \"path_with_removed_args\": path_with_removed_args,\r\n # \"append_querystring\": append_querystring,\r\n \"request\": request,\r\n # \"sort\": sort,\r\n # \"sort_desc\": sort_desc,\r\n \"disable_sort\": is_view,\r\n # \"custom_table_templates\": [\r\n # f\"_table-{to_css_class(database)}-{to_css_class(table)}.html\",\r\n # f\"_table-table-{to_css_class(database)}-{to_css_class(table)}.html\",\r\n # \"_table.html\",\r\n # ],\r\n # \"metadata\": metadata,\r\n # \"view_definition\": await db.get_view_definition(table),\r\n # \"table_definition\": await db.get_table_definition(table),\r\n },\r\n }\r\n\r\n # I'm just trying to get HTML to work for the moment\r\n if format == \"json\":\r\n return Response.json(dict(vars, locals=locals()), default=repr)\r\n else:\r\n return Response.html(repr(vars[\"html\"]))\r\n\r\n async def view(self, request, datasette):\r\n return await self.main(request=request, datasette=datasette)\r\n\r\n\r\n@hookimpl\r\ndef register_routes():\r\n return [\r\n (r\"/t/(?P<db_name>[^/]+)/(?P<table_and_format>[^/]+?$)\", Table().view),\r\n ]\r\n\r\n\r\nasync def check_permissions(datasette, request, permissions):\r\n \"\"\"permissions is a list of
(action, resource) tuples or 'action' strings\"\"\"\r\n for permission in permissions:\r\n if isinstance(permission, str):\r\n action = permission\r\n resource = None\r\n elif isinstance(permission, (tuple, list)) and len(permission) == 2:\r\n action, resource = permission\r\n else:\r\n assert (\r\n False\r\n ), \"permission should be string or tuple of two items: {}\".format(\r\n repr(permission)\r\n )\r\n ok = await datasette.permission_allowed(\r\n request.actor,\r\n action,\r\n resource=resource,\r\n default=None,\r\n )\r\n if ok is not None:\r\n if ok:\r\n return\r\n else:\r\n raise Forbidden(action)\r\n\r\n\r\nasync def columns_to_select(datasette, database, table, request):\r\n table_columns = await database.table_columns(table)\r\n pks = await database.primary_keys(table)\r\n columns = list(table_columns)\r\n if \"_col\" in request.args:\r\n columns = list(pks)\r\n _cols = request.args.getlist(\"_col\")\r\n bad_columns = [column for column in _cols if column not in table_columns]\r\n if bad_columns:\r\n raise DatasetteError(\r\n \"_col={} - invalid columns\".format(\", \".join(bad_columns)),\r\n status=400,\r\n )\r\n # De-duplicate maintaining order:\r\n columns.extend(dict.fromkeys(_cols))\r\n if \"_nocol\" in request.args:\r\n # Return all columns EXCEPT these\r\n bad_columns = [\r\n column\r\n for column in request.args.getlist(\"_nocol\")\r\n if (column not in table_columns) or (column in pks)\r\n ]\r\n if bad_columns:\r\n raise DatasetteError(\r\n \"_nocol={} - invalid columns\".format(\", \".join(bad_columns)),\r\n status=400,\r\n )\r\n tmp_columns = [\r\n column for column in columns if column not in request.args.getlist(\"_nocol\")\r\n ]\r\n columns = tmp_columns\r\n return columns\r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 648435885, "label": "New pattern for views that return either JSON or HTML, available for 
plugins"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1522#issuecomment-974506401", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1522", "id": 974506401, "node_id": "IC_kwDOBm6k_c46Fcmh", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-11-19T22:11:51Z", "updated_at": "2021-11-19T22:11:51Z", "author_association": "OWNER", "body": "This is frustrating: I have the following Dockerfile:\r\n```dockerfile\r\nFROM python:3-alpine\r\n\r\nRUN apk add --no-cache \\\r\n\tapache2 \\\r\n\tapache2-proxy \\\r\n\tbash\r\n\r\nRUN pip install datasette\r\n\r\nENV TINI_VERSION v0.18.0\r\nADD https://github.com/krallin/tini/releases/download/${TINI_VERSION}/tini-static /tini\r\nRUN chmod +x /tini\r\n\r\n# Append this to the end of the default httpd.conf file\r\nRUN echo $'ServerName localhost\\n\\\r\n\\n\\\r\n<Proxy *>\\n\\\r\n Order deny,allow\\n\\\r\n Allow from all\\n\\\r\n</Proxy>\\n\\\r\n\\n\\\r\nProxyPass /prefix/ http://localhost:8001/\\n\\\r\nHeader add X-Proxied-By \"Apache2\"' >> /etc/apache2/httpd.conf\r\n\r\nRUN echo $'Datasette' > /var/www/localhost/htdocs/index.html\r\n\r\nWORKDIR /app\r\n\r\nADD https://latest.datasette.io/fixtures.db /app/fixtures.db\r\n\r\nRUN echo $'#!/usr/bin/env bash\\n\\\r\nset -e\\n\\\r\n\\n\\\r\nhttpd -D FOREGROUND &\\n\\\r\ndatasette fixtures.db --setting base_url \"/prefix/\" -h 0.0.0.0 -p 8001 &\\n\\\r\n\\n\\\r\nwait -n' > /app/start.sh\r\n\r\nRUN chmod +x /app/start.sh\r\n\r\nEXPOSE 80\r\nENTRYPOINT [\"/tini\", \"--\", \"/app/start.sh\"]\r\n```\r\nIt works fine when I run it locally:\r\n```\r\ndocker build -t datasette-apache-proxy-demo .\r\ndocker run -p 5000:80 datasette-apache-proxy-demo\r\n```\r\nBut when I deploy it to Cloud Run with the following script:\r\n```bash\r\n#!/bin/bash\r\n# https://til.simonwillison.net/cloudrun/ship-dockerfile-to-cloud-run\r\n\r\nNAME=\"datasette-apache-proxy-demo\"\r\nPROJECT=$(gcloud config get-value project)\r\nIMAGE=\"gcr.io/$PROJECT/$NAME\"\r\n\r\ngcloud builds submit --tag 
$IMAGE\r\ngcloud run deploy \\\r\n --allow-unauthenticated \\\r\n --platform=managed \\\r\n --image $IMAGE $NAME \\\r\n --port 80\r\n```\r\nIt serves the `/` page successfully, but hits to `/prefix/` return the following 503 error:\r\n\r\n> Service Unavailable\r\n>\r\n> The server is temporarily unable to service your request due to maintenance downtime or capacity problems. Please try again later.\r\n>\r\n> Apache/2.4.51 (Unix) Server at datasette-apache-proxy-demo-j7hipcg4aq-uc.a.run.app Port 80\r\n\r\nCloud Run logs:\r\n\r\n\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058896236, "label": "Deploy a live instance of demos/apache-proxy"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/878#issuecomment-973568285", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/878", "id": 973568285, "node_id": "IC_kwDOBm6k_c46B3kd", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-11-19T00:29:20Z", "updated_at": "2021-11-19T00:29:20Z", "author_association": "OWNER", "body": "This is working!\r\n```python\r\nfrom datasette.utils.asgi import Response\r\nfrom datasette import hookimpl\r\nimport html\r\nfrom asyncinject import AsyncInject, inject\r\n\r\n\r\nclass Table(AsyncInject):\r\n @inject\r\n async def database(self, request):\r\n return request.url_vars[\"db_name\"]\r\n\r\n @inject\r\n async def main(self, request, database):\r\n return Response.html(\"Database: {}\".format(\r\n html.escape(database)\r\n ))\r\n\r\n async def view(self, request):\r\n return await self.main(request=request)\r\n\r\n\r\n@hookimpl\r\ndef register_routes():\r\n return [\r\n (r\"/t/(?P<db_name>[^/]+)/(?P<table_and_format>[^/]+?$)\", Table().view),\r\n ]\r\n```\r\nThis project will definitely show me if I actually like the `asyncinject` patterns or not.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 648435885, "label": "New pattern for views that return either JSON or HTML, available for plugins"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1521#issuecomment-974321391", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1521", "id": 974321391, "node_id": "IC_kwDOBm6k_c46Evbv", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-11-19T18:49:15Z", "updated_at": "2021-11-19T18:57:18Z", "author_association": "OWNER", "body": "This pattern looks like it can help: https://ahmet.im/blog/cloud-run-multiple-processes-easy-way/ - see example in https://github.com/ahmetb/multi-process-container-lazy-solution\r\n\r\nI got that demo working locally like this:\r\n\r\n```bash\r\ncd /tmp\r\ngit clone https://github.com/ahmetb/multi-process-container-lazy-solution\r\ncd multi-process-container-lazy-solution\r\ndocker build -t multi-process-container-lazy-solution .\r\ndocker run -p 5000:8080 --rm multi-process-container-lazy-solution\r\n```\r\n\r\nI want to use `apache2` rather than `nginx` though. I found a few relevant examples of Apache in Alpine:\r\n\r\n- https://github.com/Hacking-Lab/alpine-apache2-reverse-proxy/blob/master/Dockerfile\r\n- https://www.sentiatechblog.com/running-apache-in-a-docker-container\r\n- https://github.com/search?l=Dockerfile&q=alpine+apache2&type=code\r\n\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058815557, "label": "Docker configuration for exercising Datasette behind Apache mod_proxy"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1518#issuecomment-974285803", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1518", "id": 974285803, "node_id": "IC_kwDOBm6k_c46Emvr", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-11-19T17:56:48Z", "updated_at": "2021-11-19T18:14:30Z", "author_association": "OWNER", "body": "Very confused by this piece of code here: https://github.com/simonw/datasette/blob/1c13e1af0664a4dfb1e69714c56523279cae09e4/datasette/views/table.py#L37-L63\r\n\r\nI added it in https://github.com/simonw/datasette/commit/754836eef043676e84626c4fd3cb993eed0d2976 - in the new world that should probably be replaced by pure JSON.\r\n\r\nAha - this comment explains it: https://github.com/simonw/datasette/issues/521#issuecomment-505279560\r\n\r\n> I think the trick is to redefine what a \"cell_row\" is. Each row is currently a list of cells:\r\n> \r\n> https://github.com/simonw/datasette/blob/6341f8cbc7833022012804dea120b838ec1f6558/datasette/views/table.py#L159-L163\r\n> \r\n> I can redefine the row (the `cells` variable in the above example) as a thing-that-iterates-cells (hence behaving like a list) but that also supports `__getitem__` access for looking up cell values if you know the name of the column.\r\n\r\nThe goal was to support neater custom templates like this:\r\n```html+jinja\r\n{% for row in display_rows %}\r\n {{ row[\"First_Name\"] }} {{ row[\"Last_Name\"] }}
\r\n ...\r\n```\r\nThis may be an argument for continuing to allow non-JSON-objects through to the HTML templates. Need to think about that a bit more.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058072543, "label": "Complete refactor of TableView and table.html template"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/878#issuecomment-973542284", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/878", "id": 973542284, "node_id": "IC_kwDOBm6k_c46BxOM", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-11-19T00:16:44Z", "updated_at": "2021-11-19T00:16:44Z", "author_association": "OWNER", "body": "```\r\nDevelopment % cookiecutter gh:simonw/datasette-plugin\r\nYou've downloaded /Users/simon/.cookiecutters/datasette-plugin before. Is it okay to delete and re-download it? [yes]: yes\r\nplugin_name []: table-new\r\ndescription []: New implementation of TableView, see https://github.com/simonw/datasette/issues/878\r\nhyphenated [table-new]: \r\nunderscored [table_new]: \r\ngithub_username []: simonw\r\nauthor_name []: Simon Willison\r\ninclude_static_directory []: \r\ninclude_templates_directory []: \r\n```", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 648435885, "label": "New pattern for views that return either JSON or HTML, available for plugins"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/sqlite-utils/issues/342#issuecomment-973820125", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/342", "id": 973820125, "node_id": "IC_kwDOCGYnMM46C1Dd", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-11-19T07:25:55Z", "updated_at": "2021-11-19T07:25:55Z", "author_association": "OWNER", "body": "`alter=True` doesn't make sense to support here either, because `.lookup()` already adds missing columns: https://github.com/simonw/sqlite-utils/blob/3b8abe608796e99e4ffc5f3f4597a85e605c0e9b/sqlite_utils/db.py#L2743-L2746", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058196641, "label": "Extra options to `lookup()` which get passed to `insert()`"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1519#issuecomment-974309591", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1519", "id": 974309591, "node_id": "IC_kwDOBm6k_c46EsjX", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-11-19T18:31:32Z", "updated_at": "2021-11-19T18:31:32Z", "author_association": "OWNER", "body": "`base_url` has been a source of so many bugs like this! I often find them quite hard to replicate, likely because I haven't made myself a good Apache `mod_proxy` testing environment yet.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058790545, "label": "base_url is omitted in JSON and CSV views"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/sqlite-utils/issues/342#issuecomment-973801650", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/342", "id": 973801650, "node_id": "IC_kwDOCGYnMM46Cwiy", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-11-19T06:55:56Z", "updated_at": "2021-11-19T06:55:56Z", "author_association": "OWNER", "body": "`pk` needs to be an explicit argument to `.lookup()`. The rest could be `**kwargs` passed through to `.insert()`, like this hacked together version (docstring removed for brevity):\r\n\r\n```python\r\n def lookup(\r\n self,\r\n lookup_values: Dict[str, Any],\r\n extra_values: Optional[Dict[str, Any]] = None,\r\n pk=\"id\",\r\n **insert_kwargs,\r\n ):\r\n assert isinstance(lookup_values, dict)\r\n if extra_values is not None:\r\n assert isinstance(extra_values, dict)\r\n combined_values = dict(lookup_values)\r\n if extra_values is not None:\r\n combined_values.update(extra_values)\r\n if self.exists():\r\n self.add_missing_columns([combined_values])\r\n unique_column_sets = [set(i.columns) for i in self.indexes]\r\n if set(lookup_values.keys()) not in unique_column_sets:\r\n self.create_index(lookup_values.keys(), unique=True)\r\n wheres = [\"[{}] = ?\".format(column) for column in lookup_values]\r\n rows = list(\r\n self.rows_where(\r\n \" and \".join(wheres), [value for _, value in lookup_values.items()]\r\n )\r\n )\r\n try:\r\n return rows[0][pk]\r\n except IndexError:\r\n return self.insert(combined_values, pk=pk, **insert_kwargs).last_pk\r\n else:\r\n pk = self.insert(combined_values, pk=pk, **insert_kwargs).last_pk\r\n self.create_index(lookup_values.keys(), unique=True)\r\n return pk\r\n```\r\nI think I'll explicitly list the parameters, mainly so they can be typed and covered by automatic documentation.\r\n\r\nI do worry that I'll add more keyword arguments to `.insert()` in the future and forget to mirror them to `.lookup()` though.", "reactions": "{\"total_count\": 
0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058196641, "label": "Extra options to `lookup()` which get passed to `insert()`"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1519#issuecomment-974450232", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1519", "id": 974450232, "node_id": "IC_kwDOBm6k_c46FO44", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-11-19T20:41:53Z", "updated_at": "2021-11-19T20:42:19Z", "author_association": "OWNER", "body": "https://docs.datasette.io/en/stable/deploying.html#apache-proxy-configuration says I should use `ProxyPreserveHost on`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058790545, "label": "base_url is omitted in JSON and CSV views"}, "performed_via_github_app": null}
{"html_url": "https://github.com/simonw/datasette/issues/1521#issuecomment-974371116", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1521", "id": 974371116, "node_id": "IC_kwDOBm6k_c46E7ks", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-11-19T19:45:47Z", "updated_at": "2021-11-19T19:45:47Z", "author_association": "OWNER", "body": "https://github.com/krallin/tini says:\r\n\r\n> *NOTE: If you are using Docker 1.13 or greater, Tini is included in Docker itself. This includes all versions of Docker CE. To enable Tini, just [pass the `--init` flag to `docker run`](https://docs.docker.com/engine/reference/commandline/run/).*", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1058815557, "label": "Docker configuration for exercising Datasette behind Apache mod_proxy"}, "performed_via_github_app": null}