{"html_url": "https://github.com/simonw/datasette/issues/1470#issuecomment-938124652", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1470", "id": 938124652, "node_id": "IC_kwDOBm6k_c436qVs", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-10-07T20:17:53Z", "updated_at": "2021-10-07T20:18:55Z", "author_association": "OWNER", "body": "Here's the exception:\r\n```\r\n-> params[f\"p{len(params)}\"] = components[0]\r\n(Pdb) list\r\n603 \t\r\n604 \t # Figure out the SQL for next-based-on-primary-key first\r\n605 \t next_by_pk_clauses = []\r\n606 \t if use_rowid:\r\n607 \t next_by_pk_clauses.append(f\"rowid > :p{len(params)}\")\r\n608 ->\t params[f\"p{len(params)}\"] = components[0]\r\n609 \t else:\r\n610 \t # Apply the tie-breaker based on primary keys\r\n611 \t if len(components) == len(pks):\r\n612 \t param_len = len(params)\r\n613 \t next_by_pk_clauses.append(\r\n```\r\nDebugger shows that `components` is an empty array, so `components[0]` cannot be resolved:\r\n\r\n```\r\n-> params[f\"p{len(params)}\"] = components[0]\r\n(Pdb) params\r\n{'search': 'hello'}\r\n(Pdb) components\r\n[]\r\n```\r\n\r\nSo the bug is in this code: https://github.com/simonw/datasette/blob/adb5b70de5cec3c3dd37184defe606a082c232cf/datasette/views/table.py#L604-L617\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 995098231, "label": "?_sort=rowid with _next= returns error"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1470#issuecomment-938131806", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1470", "id": 938131806, "node_id": "IC_kwDOBm6k_c436sFe", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-10-07T20:28:30Z", "updated_at": "2021-10-07T20:28:30Z", "author_association": "OWNER", "body": "On further investigation this isn't related to `_search` at all - 
it happens when you explicitly sort by `_sort=rowid` and apply a `_next`\r\n\r\n- https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_next=200 works without an error (currently)\r\n- https://global-power-plants.datasettes.com/global-power-plants/global-power-plants?_next=200&_sort=rowid shows that error", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 995098231, "label": "?_sort=rowid with _next= returns error"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1480#issuecomment-938134038", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1480", "id": 938134038, "node_id": "IC_kwDOBm6k_c436soW", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-10-07T20:31:46Z", "updated_at": "2021-10-07T20:31:46Z", "author_association": "OWNER", "body": "I've had this problem too - my solution was to not use Cloud Run for databases larger than about 2GB, but the way you describe it here makes me think that maybe there is a workaround here which could get it to work.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1015646369, "label": "Exceeding Cloud Run memory limits when deploying a 4.8G database"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/1481#issuecomment-938142436", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1481", "id": 938142436, "node_id": "IC_kwDOBm6k_c436urk", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-10-07T20:44:43Z", "updated_at": "2021-10-07T20:44:43Z", "author_association": "OWNER", "body": "The 3.10 tests failed a lot. 
Trying to run this locally:\r\n\r\n```\r\n/tmp % pyenv install 3.10\r\npython-build: definition not found: 3.10\r\n\r\nThe following versions contain `3.10' in the name:\r\n 3.10.0a6\r\n 3.10-dev\r\n miniconda-3.10.1\r\n miniconda3-3.10.1\r\n\r\nSee all available versions with `pyenv install --list'.\r\n\r\nIf the version you need is missing, try upgrading pyenv:\r\n\r\n brew update && brew upgrade pyenv\r\n```\r\nSo trying:\r\n\r\n brew update && brew upgrade pyenv\r\n\r\nThen did this:\r\n\r\n```\r\n/tmp % brew upgrade pyenv \r\n==> Upgrading 1 outdated package:\r\npyenv 1.2.24.1 -> 2.1.0\r\n```\r\nThis decided to upgrade everything by downloading everything on the internet. Aah, Homebrew.\r\n\r\nBut it looks like I have `3.10.0` available to `pyenv` now.\r\n\r\n```\r\n/tmp % pyenv install 3.10.0\r\npython-build: use openssl@1.1 from homebrew\r\npython-build: use readline from homebrew\r\nDownloading Python-3.10.0.tar.xz...\r\n-> https://www.python.org/ftp/python/3.10.0/Python-3.10.0.tar.xz\r\nInstalling Python-3.10.0...\r\n...\r\n```\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1020436713, "label": "Fix compatibility with Python 3.10"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/1480#issuecomment-938171377", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1480", "id": 938171377, "node_id": "IC_kwDOBm6k_c4361vx", "user": {"value": 110420, "label": "ghing"}, "created_at": "2021-10-07T21:33:12Z", "updated_at": "2021-10-07T21:33:12Z", "author_association": "CONTRIBUTOR", "body": "Thanks for the reply @simonw. 
What services have you had better success with than Cloud Run for larger databases?\r\n\r\nAlso, what about my issue description makes you think there may be a workaround?\r\n\r\nIs there any instrumentation I could add to see at which point in the deploy the memory usage spikes? Should I be able to see this whether it's running under Docker locally, or do you suspect this is Cloud Run-specific?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1015646369, "label": "Exceeding Cloud Run memory limits when deploying a 4.8G database"}, "performed_via_github_app": null}