{"html_url": "https://github.com/simonw/datasette/issues/2019#issuecomment-1421600789", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2019", "id": 1421600789, "node_id": "IC_kwDOBm6k_c5Uu-gV", "user": {"value": 9599, "label": "simonw"}, "created_at": "2023-02-07T23:12:40Z", "updated_at": "2023-02-07T23:16:20Z", "author_association": "OWNER", "body": "Most complicated example of a paginated query: https://latest.datasette.io/fixtures?sql=select%0D%0A++pk1%2C%0D%0A++pk2%2C%0D%0A++content%2C%0D%0A++sortable%2C%0D%0A++sortable_with_nulls%2C%0D%0A++sortable_with_nulls_2%2C%0D%0A++text%0D%0Afrom%0D%0A++sortable%0D%0Awhere%0D%0A++(%0D%0A++++sortable_with_nulls+is+null%0D%0A++++and+(%0D%0A++++++(pk1+%3E+%3Ap0)%0D%0A++++++or+(%0D%0A++++++++pk1+%3D+%3Ap0%0D%0A++++++++and+pk2+%3E+%3Ap1%0D%0A++++++)%0D%0A++++)%0D%0A++)%0D%0Aorder+by%0D%0A++sortable_with_nulls+desc%2C%0D%0A++pk1%2C%0D%0A++pk2%0D%0Alimit%0D%0A++101&p0=h&p1=r\r\n\r\n```sql\r\nselect\r\n pk1,\r\n pk2,\r\n content,\r\n sortable,\r\n sortable_with_nulls,\r\n sortable_with_nulls_2,\r\n text\r\nfrom\r\n sortable\r\nwhere\r\n (\r\n sortable_with_nulls is null\r\n and (\r\n (pk1 > :p0)\r\n or (\r\n pk1 = :p0\r\n and pk2 > :p1\r\n )\r\n )\r\n )\r\norder by\r\n sortable_with_nulls desc,\r\n pk1,\r\n pk2\r\nlimit\r\n 101\r\n```\r\nGenerated by this page: https://latest.datasette.io/fixtures/sortable?_next=%24null%2Ch%2Cr&_sort_desc=sortable_with_nulls\r\n\r\nThe `_next=` parameter there decodes as `$null,h,r` - and those components are tilde-encoded, so this can be distinguished from an actual `$null` value which would be represented as `~24null`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1573424830, "label": "Refactor out the keyset pagination code"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/520#issuecomment-1421571810", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/520", "id": 1421571810, "node_id": "IC_kwDOCGYnMM5Uu3bi", "user": {"value": 167893, "label": "mcarpenter"}, "created_at": "2023-02-07T22:43:09Z", "updated_at": "2023-02-07T22:43:09Z", "author_association": "CONTRIBUTOR", "body": "Hey, isn't this essentially the same issue as #448 ?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1516644980, "label": "rows_from_file() raises confusing error if file-like object is not in binary mode"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2019#issuecomment-1421274434", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2019", "id": 1421274434, "node_id": "IC_kwDOBm6k_c5Utu1C", "user": {"value": 9599, "label": "simonw"}, "created_at": "2023-02-07T18:42:42Z", "updated_at": "2023-02-07T18:42:42Z", "author_association": "OWNER", "body": "I'm going to build completely separate tests for this in `test_pagination.py`.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1573424830, "label": "Refactor out the keyset pagination code"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1421177666", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/524", "id": 
1421177666, "node_id": "IC_kwDOCGYnMM5UtXNC", "user": {"value": 21095447, "label": "4l1fe"}, "created_at": "2023-02-07T17:39:00Z", "updated_at": "2023-02-07T17:39:00Z", "author_association": "NONE", "body": "> lets users make schema changes, so it's important to me that the tool work in a non-surprising way -- if you ask for a column of type X, you should get type X. If the column or table previously had CHECK constraints, they shouldn't be silently removed\r\n\r\nI've got your concern. Let's see if we will be replied on it and i'll close the issue some later.\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1572766460, "label": "Transformation type `--type DATETIME`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1421081939", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/524", "id": 1421081939, "node_id": "IC_kwDOCGYnMM5Us_1T", "user": {"value": 193185, "label": "cldellow"}, "created_at": "2023-02-07T16:42:25Z", "updated_at": "2023-02-07T16:43:42Z", "author_association": "NONE", "body": "Ha, yes, I might end up making something very niche. That's OK.\r\n\r\nI'm building a UI for [Datasette](https://datasette.io/) that lets users make schema changes, so it's important to me that the tool work in a non-surprising way -- if you ask for a column of type X, you should get type X. If the column or table previously had CHECK constraints, they shouldn't be silently removed. And so on. I had hoped that I could just lean on sqlite-utils, but I think it's a little too surprising.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1572766460, "label": "Transformation type `--type DATETIME`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1421055590", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/524", "id": 1421055590, "node_id": "IC_kwDOCGYnMM5Us5Zm", "user": {"value": 21095447, "label": "4l1fe"}, "created_at": "2023-02-07T16:25:31Z", "updated_at": "2023-02-07T16:25:31Z", "author_association": "NONE", "body": "> Ah, it looks like that is controlled by this dict: https://github.com/simonw/sqlite-utils/blob/main/sqlite_utils/db.py#L178\r\n> \r\n> I suspect you could overwrite the datetime entry to achieve what you want\r\n\r\nAnd thank you for pointing me to it. At least, i can make a monkey patch for my need...", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1572766460, "label": "Transformation type `--type DATETIME`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1421052195", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/524", "id": 1421052195, "node_id": "IC_kwDOCGYnMM5Us4kj", "user": {"value": 21095447, "label": "4l1fe"}, "created_at": "2023-02-07T16:23:17Z", "updated_at": "2023-02-07T16:23:57Z", "author_association": "NONE", "body": "Isn't your suggestion too fundamental for the utility?\r\n\r\nThe bigger flexibility, the bigger complexity. Your idea make sense defenitely, but how often do you make schema changes? 
And how many people could benefit from it, what do you think?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1572766460, "label": "Transformation type `--type DATETIME`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1421033725", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/524", "id": 1421033725, "node_id": "IC_kwDOCGYnMM5Us0D9", "user": {"value": 193185, "label": "cldellow"}, "created_at": "2023-02-07T16:12:13Z", "updated_at": "2023-02-07T16:12:13Z", "author_association": "NONE", "body": "I think the bigger issue is that `sqlite-utils` mixes mechanism (it implements the [12-step way to alter SQLite tables](https://www.sqlite.org/lang_altertable.html#otheralter)) and policy (it has an opinionated stance on what column types should be used).\r\n\r\nThat might be a design choice to make it accessible to users by providing a reasonable set of defaults, but it doesn't quite fit my use case.\r\n\r\nIt might make sense to extract a separate library that provides just the mechanisms, and then `sqlite-utils` would sit on top of that library with its opinionated set of policies.\r\n\r\nThat would be a very big change, though.\r\n\r\nI might take a stab at extracting the library, but just for the table schema migration piece, not all the other features that `sqlite-utils` supports. I wouldn't expect `sqlite-utils` to depend on it.\r\n\r\nPart of my motivation is that I want to provide some other abilities, too, like support for CHECK constraints. I see that the issue in this repo (https://github.com/simonw/sqlite-utils/issues/358) proposes a bunch of short-hand constraints, which I wouldn't want to accidentally expose to people -- I want a layer that is a 1:1 mapping to SQLite.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1572766460, "label": "Transformation type `--type DATETIME`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1421022917", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/524", "id": 1421022917, "node_id": "IC_kwDOCGYnMM5UsxbF", "user": {"value": 21095447, "label": "4l1fe"}, "created_at": "2023-02-07T16:06:03Z", "updated_at": "2023-02-07T16:08:58Z", "author_association": "NONE", "body": "> Do you see a way to enable it without affecting existing users or bumping the major version number?\r\n\r\nI don't see a clean solution, only extending code with a side variable that tells us we want to apply advanced types instead of basic.\r\n\r\nit could be a similiar command like `tranform-v2 --type column DATETIME` or a cli option `transform --adv-type column DATETIME` along with a dict that contains the advanced types. Then with knowledge that we run an advanced command we take that dictionary somehow, we can wrap the current and new dictionaries by a superdict and work with it everywhere according to the knowledge. 
This way shouldn't affect users who are using the previous lib versions and it have to be merged in the next major one.\r\n\r\nBut this way looks a bad design, too messy.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1572766460, "label": "Transformation type `--type DATETIME`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1420992261", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/524", "id": 1420992261, "node_id": "IC_kwDOCGYnMM5Usp8F", "user": {"value": 193185, "label": "cldellow"}, "created_at": "2023-02-07T15:45:58Z", "updated_at": "2023-02-07T15:45:58Z", "author_association": "NONE", "body": "I'd support that, but I'm not the author of this library.\r\n\r\nOne challenge is that would be a breaking change. Do you see a way to enable it without affecting existing users or bumping the major version number?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1572766460, "label": "Transformation type `--type DATETIME`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1420966995", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/524", "id": 1420966995, "node_id": "IC_kwDOCGYnMM5UsjxT", "user": {"value": 21095447, "label": "4l1fe"}, "created_at": "2023-02-07T15:29:28Z", "updated_at": "2023-02-07T15:29:28Z", "author_association": "NONE", "body": "I could, of course.\r\n\r\nDoest it worth bringing such the improvement to the library?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1572766460, "label": "Transformation type `--type DATETIME`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/pull/564#issuecomment-1420941334", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/564", "id": 1420941334, "node_id": "IC_kwDOBm6k_c5UsdgW", "user": {"value": 82988, "label": "psychemedia"}, "created_at": "2023-02-07T15:14:10Z", "updated_at": "2023-02-07T15:14:10Z", "author_association": "CONTRIBUTOR", "body": "Is this feature covered by any more recent updates to `datasette`, or via any plugins that you're aware of?", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 473288428, "label": "First proof-of-concept of Datasette Library"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1420809773", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/524", "id": 1420809773, "node_id": "IC_kwDOCGYnMM5Ur9Yt", "user": {"value": 193185, "label": "cldellow"}, "created_at": "2023-02-07T13:53:01Z", "updated_at": "2023-02-07T13:53:01Z", "author_association": "NONE", "body": "Ah, it looks like that is controlled by this dict: https://github.com/simonw/sqlite-utils/blob/main/sqlite_utils/db.py#L178\r\n\r\nI suspect you could overwrite the datetime entry to achieve what you want", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 
1572766460, "label": "Transformation type `--type DATETIME`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/sqlite-utils/issues/524#issuecomment-1420496447", "issue_url": "https://api.github.com/repos/simonw/sqlite-utils/issues/524", "id": 1420496447, "node_id": "IC_kwDOCGYnMM5Uqw4_", "user": {"value": 21095447, "label": "4l1fe"}, "created_at": "2023-02-07T09:57:38Z", "updated_at": "2023-02-07T09:57:38Z", "author_association": "NONE", "body": "> That said, it looks like the check is only enforced at the CLI level. If you use the API directly, I think it'll work.\r\n\r\nIt works, but a column becomes `TEXT`\r\n\r\n```python\r\nIn [1]: import sqlite_utils\r\nIn [2]: db = sqlite_utils.Database('events.sqlite')\r\nIn [3]: table = db['cards.chunk.get']\r\nIn [4]: table.columns_dict\r\nOut[4]:\r\n{'id': int,\r\n 'timestamp': float,\r\n 'data_chunk_number': int,\r\n 'user_id': str,\r\n 'meta_duplication_source_id': int,\r\n 'context_sort_attribute': str,\r\n 'context_sort_order': str}\r\n\r\nIn [5]: from datetime import datetime\r\nIn [7]: table.transform(types={'timestamp': datetime})\r\nIn [8]: table.columns_dict\r\nOut[8]:\r\n{'id': int,\r\n 'timestamp': str,\r\n 'data_chunk_number': int,\r\n 'user_id': str,\r\n 'meta_duplication_source_id': int,\r\n 'context_sort_attribute': str,\r\n 'context_sort_order': str}\r\n```\r\n\r\n```bash\r\n\u276f sqlite-utils schema events.sqlite cards.chunk.get\r\nCREATE TABLE \"cards.chunk.get\" (\r\n [id] INTEGER PRIMARY KEY NOT NULL,\r\n [timestamp] TEXT,\r\n ...\r\n```\r\n", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1572766460, "label": "Transformation type `--type DATETIME`"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2019#issuecomment-1420109153", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2019", "id": 1420109153, "node_id": "IC_kwDOBm6k_c5UpSVh", "user": {"value": 9599, "label": "simonw"}, "created_at": "2023-02-07T02:32:36Z", "updated_at": "2023-02-07T02:32:36Z", "author_association": "OWNER", "body": "Doing this as a class makes sense to me. There are a few steps:\r\n\r\n- Instantiate the class with the information it needs, which includes sort order, page size, tiebreaker columns and SQL query and parameters\r\n- Generate the new SQL query that will actually be executed - maybe this takes the optional `_next` parameter? 
This returns the SQL and params that should be executed, where the SQL now includes pagination logic plus order by and limit\r\n- The calling code then gets to execute the SQL query to fetch the rows\r\n- Last step: those rows are passed to a paginator method which returns `(rows, next)` - where `rows` is the rows truncated to the correct length (really just with the last one cut off if it's too long for the length) and `next` is either `None` or a token, depending on if there should be a next page.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1573424830, "label": "Refactor out the keyset pagination code"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2019#issuecomment-1420106315", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2019", "id": 1420106315, "node_id": "IC_kwDOBm6k_c5UpRpL", "user": {"value": 9599, "label": "simonw"}, "created_at": "2023-02-07T02:28:03Z", "updated_at": "2023-02-07T02:28:36Z", "author_association": "OWNER", "body": "So I think I can write an abstraction that applies keyset pagination to ANY arbitrary SQL query provided it is given the query, the existing params (so it can pick names for the new params that won't overlap with them), the desired sort order, any existing `_next` token AND the columns that should be used to tie-break any duplicates.\r\n\r\nThose tie breakers will be either the primary key(s) or `rowid` if none are provided.\r\n\r\nWhat about the case of SQL views, where offset/limit should be used instead? I'm inclined to have that as a separate pagination abstraction entirely, with the calling code deciding which pagination helper to use based on if keyset pagination makes sense or not.\r\n\r\nMight be easier to design a class structure for this starting with `OffsetPaginator`, then using that to inform the design of `KeysetPaginator`.\r\n\r\nMight put these in `datasette.utils.pagination` to start off with, then maybe extract them out to `sqlite-utils` later once they've proven themselves.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1573424830, "label": "Refactor out the keyset pagination code"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2019#issuecomment-1420104254", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2019", "id": 1420104254, "node_id": "IC_kwDOBm6k_c5UpRI-", "user": {"value": 9599, "label": "simonw"}, "created_at": "2023-02-07T02:24:46Z", "updated_at": "2023-02-07T02:24:46Z", "author_association": "OWNER", "body": "Even more complicated: https://latest.datasette.io/fixtures/sortable?sortable_with_nulls__notnull=1&_next=0~2E692704598586882%2Ce%2Cr&_sort=sortable_with_nulls_2\r\n\r\nThe rewritten SQL for that is:\r\n\r\n```sql\r\nselect * from (select pk1, pk2, content, sortable, sortable_with_nulls, sortable_with_nulls_2, text from sortable where \"sortable_with_nulls\" is not null)\r\n where (sortable_with_nulls_2 > :p2 or (sortable_with_nulls_2 = :p2 and ((pk1 > :p0)\r\n or\r\n(pk1 = :p0 and pk2 > :p1)))) order by sortable_with_nulls_2, pk1, pk2 limit 101\r\n```\r\nAnd it still has the same number of explain steps as the current SQL without the subselect.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, 
\"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1573424830, "label": "Refactor out the keyset pagination code"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2019#issuecomment-1420101175", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2019", "id": 1420101175, "node_id": "IC_kwDOBm6k_c5UpQY3", "user": {"value": 9599, "label": "simonw"}, "created_at": "2023-02-07T02:22:11Z", "updated_at": "2023-02-07T02:22:11Z", "author_association": "OWNER", "body": "A more complex example: https://latest.datasette.io/fixtures/sortable?_next=0~2E2650566289400591%2Ca%2Cu&_sort=sortable_with_nulls_2\r\n\r\nSQL:\r\n\r\n```sql\r\nselect pk1, pk2, content, sortable, sortable_with_nulls, sortable_with_nulls_2, text from sortable where (sortable_with_nulls_2 > :p2 or (sortable_with_nulls_2 = :p2 and ((pk1 > :p0)\r\n or\r\n(pk1 = :p0 and pk2 > :p1)))) order by sortable_with_nulls_2, pk1, pk2 limit 101\r\n```\r\n\r\nhttps://latest.datasette.io/fixtures?sql=select+pk1%2C+pk2%2C+content%2C+sortable%2C+sortable_with_nulls%2C+sortable_with_nulls_2%2C+text+from+sortable+where+%28sortable_with_nulls_2+%3E+%3Ap2+or+%28sortable_with_nulls_2+%3D+%3Ap2+and+%28%28pk1+%3E+%3Ap0%29%0A++or%0A%28pk1+%3D+%3Ap0+and+pk2+%3E+%3Ap1%29%29%29%29+order+by+sortable_with_nulls_2%2C+pk1%2C+pk2+limit+101&p0=a&p1=u&p2=0.2650566289400591\r\n\r\nHere's the explain: 49 steps long https://latest.datasette.io/fixtures?sql=explain+select+pk1%2C+pk2%2C+content%2C+sortable%2C+sortable_with_nulls%2C+sortable_with_nulls_2%2C+text+from+sortable+where+%28sortable_with_nulls_2+%3E+%3Ap2+or+%28sortable_with_nulls_2+%3D+%3Ap2+and+%28%28pk1+%3E+%3Ap0%29%0D%0A++or%0D%0A%28pk1+%3D+%3Ap0+and+pk2+%3E+%3Ap1%29%29%29%29+order+by+sortable_with_nulls_2%2C+pk1%2C+pk2+limit+101&p2=0.2650566289400591&p0=a&p1=u\r\n\r\nRewritten with a subselect:\r\n\r\n```sql\r\nselect * from (\r\n select pk1, pk2, content, sortable, sortable_with_nulls, sortable_with_nulls_2, text from sortable\r\n)\r\nwhere (sortable_with_nulls_2 > :p2 or (sortable_with_nulls_2 = :p2 and ((pk1 > :p0)\r\n or\r\n(pk1 = :p0 and pk2 > :p1)))) order by sortable_with_nulls_2, pk1, pk2 limit 101\r\n```\r\nhttps://latest.datasette.io/fixtures?sql=select+*+from+(%0D%0A++select+pk1%2C+pk2%2C+content%2C+sortable%2C+sortable_with_nulls%2C+sortable_with_nulls_2%2C+text+from+sortable%0D%0A)%0D%0Awhere+(sortable_with_nulls_2+%3E+%3Ap2+or+(sortable_with_nulls_2+%3D+%3Ap2+and+((pk1+%3E+%3Ap0)%0D%0A++or%0D%0A(pk1+%3D+%3Ap0+and+pk2+%3E+%3Ap1))))+order+by+sortable_with_nulls_2%2C+pk1%2C+pk2+limit+101&p2=0.2650566289400591&p0=a&p1=u\r\n\r\nAnd here's the explain for that - also 49 steps: https://latest.datasette.io/fixtures?sql=explain+select+*+from+%28%0D%0A++select+pk1%2C+pk2%2C+content%2C+sortable%2C+sortable_with_nulls%2C+sortable_with_nulls_2%2C+text+from+sortable%0D%0A%29%0D%0Awhere+%28sortable_with_nulls_2+%3E+%3Ap2+or+%28sortable_with_nulls_2+%3D+%3Ap2+and+%28%28pk1+%3E+%3Ap0%29%0D%0A++or%0D%0A%28pk1+%3D+%3Ap0+and+pk2+%3E+%3Ap1%29%29%29%29+order+by+sortable_with_nulls_2%2C+pk1%2C+pk2+limit+101&p2=0.2650566289400591&p0=a&p1=u", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1573424830, "label": "Refactor out the keyset pagination code"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2019#issuecomment-1420094396", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/2019", "id": 1420094396, "node_id": "IC_kwDOBm6k_c5UpOu8", "user": {"value": 9599, "label": "simonw"}, "created_at": "2023-02-07T02:18:11Z", "updated_at": "2023-02-07T02:19:16Z", "author_association": "OWNER", "body": "For the SQL underlying this page (the second page in that compound primary key paginated sequence): https://latest.datasette.io/fixtures/compound_three_primary_keys?_next=a%2Cd%2Cv\r\n\r\nThe explain for the default query: https://latest.datasette.io/fixtures?sql=explain+select%0D%0A++pk1%2C%0D%0A++pk2%2C%0D%0A++pk3%2C%0D%0A++content%0D%0Afrom%0D%0A++compound_three_primary_keys%0D%0Awhere%0D%0A++%28%0D%0A++++%28pk1+%3E+%3Ap0%29%0D%0A++++or+%28%0D%0A++++++pk1+%3D+%3Ap0%0D%0A++++++and+pk2+%3E+%3Ap1%0D%0A++++%29%0D%0A++++or+%28%0D%0A++++++pk1+%3D+%3Ap0%0D%0A++++++and+pk2+%3D+%3Ap1%0D%0A++++++and+pk3+%3E+%3Ap2%0D%0A++++%29%0D%0A++%29%0D%0Aorder+by%0D%0A++pk1%2C%0D%0A++pk2%2C%0D%0A++pk3%0D%0Alimit%0D%0A++101&p0=a&p1=d&p2=v\r\n\r\nThe explain for that query rewritten as this:\r\n\r\n```sql\r\nexplain\r\nselect\r\n *\r\nfrom\r\n (\r\n select\r\n pk1,\r\n pk2,\r\n pk3,\r\n content\r\n from\r\n compound_three_primary_keys\r\n )\r\nwhere\r\n (\r\n (pk1 > :p0)\r\n or (\r\n pk1 = :p0\r\n and pk2 > :p1\r\n )\r\n or (\r\n pk1 = :p0\r\n and pk2 = :p1\r\n and pk3 > :p2\r\n )\r\n )\r\norder by\r\n pk1,\r\n pk2,\r\n pk3\r\nlimit\r\n 101\r\n```\r\nhttps://latest.datasette.io/fixtures?sql=explain+select+*+from+%28select+%0D%0A++pk1%2C%0D%0A++pk2%2C%0D%0A++pk3%2C%0D%0A++content%0D%0Afrom%0D%0A++compound_three_primary_keys%0D%0A%29%0D%0A++where%0D%0A++%28%0D%0A++++%28pk1+%3E+%3Ap0%29%0D%0A++++or+%28%0D%0A++++++pk1+%3D+%3Ap0%0D%0A++++++and+pk2+%3E+%3Ap1%0D%0A++++%29%0D%0A++++or+%28%0D%0A++++++pk1+%3D+%3Ap0%0D%0A++++++and+pk2+%3D+%3Ap1%0D%0A++++++and+pk3+%3E+%3Ap2%0D%0A++++%29%0D%0A++%29%0D%0Aorder+by%0D%0A++pk1%2C%0D%0A++pk2%2C%0D%0A++pk3%0D%0Alimit%0D%0A++101&p0=a&p1=d&p2=v\r\n\r\nBoth explains have 31 steps and look pretty much identical.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1573424830, "label": "Refactor out the keyset pagination code"}, "performed_via_github_app": null} {"html_url": "https://github.com/simonw/datasette/issues/2019#issuecomment-1420088670", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/2019", "id": 1420088670, "node_id": "IC_kwDOBm6k_c5UpNVe", "user": {"value": 9599, "label": "simonw"}, "created_at": "2023-02-07T02:14:35Z", "updated_at": "2023-02-07T02:14:35Z", "author_association": "OWNER", "body": "Maybe the correct level of abstraction here is that pagination is something that happens to a SQL query that is defined as SQL and params, without an order by or limit. That's then wrapped in a sub-select and those things are added to it, plus the necessary `where` clauses depending on the page.\r\n\r\nNeed to check that the query plan for pagination of a subquery isn't slower than the plan for pagination as it works today.", "reactions": "{\"total_count\": 0, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 0, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1573424830, "label": "Refactor out the keyset pagination code"}, "performed_via_github_app": null}