pull_requests
6 rows where draft = 1 and state = "open"
id | node_id | number | state | locked | title | user | body | created_at | updated_at | closed_at | merged_at | merge_commit_sha | assignee | milestone | draft | head | base | author_association | repo | url | merged_by | auto_merge |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
301483613 | MDExOlB1bGxSZXF1ZXN0MzAxNDgzNjEz | 564 | open | 0 | First proof-of-concept of Datasette Library | simonw 9599 | Refs #417. Run it like this: datasette -d ~/Library Uses a new plugin hook - available_databases() | 2019-07-26T10:22:26Z | 2023-02-07T15:14:11Z | | | 4f425d2b39d1be10d7ef5c146480a3eb494d5086 | | | 1 | 947645d84710677ea50762016081a9fbc6b014a8 | a9453c4dda70bbf5122835e68f63db6ecbe1a6fc | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/564 | | |
560760145 | MDExOlB1bGxSZXF1ZXN0NTYwNzYwMTQ1 | 1204 | open | 0 | WIP: Plugin includes | simonw 9599 | Refs #1191 Next steps: - [ ] Get comfortable that this pattern is the right way to go - [ ] Implement it for all of the other pages, not just the table page - [ ] Add a new set of plugin tests that exercise ALL of these new hook locations - [ ] Document, then ship | 2021-01-25T03:59:06Z | 2021-12-17T07:10:49Z | | | 98f06a766317a40035962416cf3211d7a374866a | | | 1 | 05258469ae39bcaad17beb57c5b7eeab0d58a589 | 07e163561592c743e4117f72102fcd350a600909 | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/1204 | | |
580235427 | MDExOlB1bGxSZXF1ZXN0NTgwMjM1NDI3 | 241 | open | 0 | Extract expand - work in progress | simonw 9599 | Refs #239. Still needs documentation and CLI implementation. | 2021-02-25T16:36:38Z | 2021-02-25T16:36:38Z | | | 0bb6c7a38994627a64e7b3375931528e96b8c222 | | | 1 | 8d641ab08ac449081e96f3e25bd6c0226870948a | 38e688fb8bcb58ae888b676fe3f7dd0529b4eecc | OWNER | sqlite-utils 140912432 | https://github.com/simonw/sqlite-utils/pull/241 | | |
598213565 | MDExOlB1bGxSZXF1ZXN0NTk4MjEzNTY1 | 1271 | open | 0 | Use SQLite conn.interrupt() instead of sqlite_timelimit() | simonw 9599 | Refs #1270, #1268, #1249 Before merging this I need to do some more testing (to make sure that expensive queries really are properly cancelled). I also need to delete a bunch of code relating to the old mechanism of cancelling queries. [See comment below: this doesn't actually cancel the query due to a thread-local confusion] | 2021-03-22T17:34:20Z | 2021-03-22T21:49:27Z | | | a4fd7e5a761523881c031b4fee266a366e1c97bd | | | 1 | fb2ad7ada0b86a7fe4a576fe23236757c41eb05e | c4f1ec7f33fd7d5b93f0f895dafb5351cc3bfc5b | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/1271 | | |
729704537 | MDExOlB1bGxSZXF1ZXN0NzI5NzA0NTM3 | 1465 | open | 0 | add support for -o --get /path | ctb 51016 | Fixes https://github.com/simonw/datasette/issues/1459 Adds support for `--open --get /path` to be used in combination. If `--open` is provided alone, datasette will open a web page to a default URL. If `--get <url>` is provided alone, datasette will output the result of doing a GET to that URL and then exit. If `--open --get <url>` are provided together, datasette will open a web page to that URL. TODO items: - [ ] update documentation - [ ] print out error message when `--root --open --get <url>` is used - [ ] adjust code to require that `<url>` start with a `/` when `-o --get <url>` is used - [ ] add test(s) note, '@CTB' is used in this PR to flag code that needs revisiting. | 2021-09-08T14:30:42Z | 2021-09-08T14:31:45Z | | | 064e9511923fc4e50566bf9430b4a5b26f169357 | | | 1 | 9b66a7d9ba55bad8a3b409ede8855f4b4fff1f88 | d57ab156b35ec642549fb69d08279850065027d2 | CONTRIBUTOR | datasette 107914493 | https://github.com/simonw/datasette/pull/1465 | | |
1303909190 | PR_kwDOBm6k_c5NuBNG | 2053 | open | 0 | WIP new JSON for queries | simonw 9599 | Refs: - #2049 TODO: - [x] Read queries JSON - [ ] Implement error display with `"ok": false` and an errors key - [ ] Read queries HTML - [ ] Read queries other formats (plugins) - [ ] Canned read queries (dispatched to from table) - [ ] Write queries (a canned query thing) - [ ] Implement different shapes, refactoring to share code with table - [ ] Implement a sensible subset of extras, also refactoring to share code with table - [ ] Get all tests passing Documentation preview: https://datasette--2053.org.readthedocs.build/en/2053/ | 2023-04-05T23:26:15Z | 2023-05-26T23:13:03Z | | | 3f9f8455e9a7c3fd65d034f5432a31a548c613a7 | | | 1 | 007294008d925b7e5529e6d14add002b6b56ddb5 | dda99fc09fb0b5523948f6d481c6c051c1c7b5de | OWNER | datasette 107914493 | https://github.com/simonw/datasette/pull/2053 | | |
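The view above is just a SQL WHERE clause over the pull_requests table. A minimal sketch with Python's sqlite3, using a made-up two-column subset of the real schema (the actual database is the one built by github-to-sqlite):

```python
import sqlite3

# In-memory stand-in for the github-to-sqlite database; only the
# columns this page filters on are modelled, with sample rows.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE pull_requests (id INTEGER PRIMARY KEY, state TEXT, draft INTEGER)"
)
conn.executemany(
    "INSERT INTO pull_requests VALUES (?, ?, ?)",
    [(564, "open", 1), (1204, "open", 1), (99, "closed", 0)],
)
# The filter this page applies: draft pull requests that are still open.
rows = conn.execute(
    "SELECT id FROM pull_requests WHERE draft = 1 AND state = 'open' ORDER BY id"
).fetchall()
```

On the real database the same WHERE clause returns the six rows listed above.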
Advanced export
JSON shape: default, array, newline-delimited, object
CREATE TABLE [pull_requests] (
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [number] INTEGER,
   [state] TEXT,
   [locked] INTEGER,
   [title] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [body] TEXT,
   [created_at] TEXT,
   [updated_at] TEXT,
   [closed_at] TEXT,
   [merged_at] TEXT,
   [merge_commit_sha] TEXT,
   [assignee] INTEGER REFERENCES [users]([id]),
   [milestone] INTEGER REFERENCES [milestones]([id]),
   [draft] INTEGER,
   [head] TEXT,
   [base] TEXT,
   [author_association] TEXT,
   [repo] INTEGER REFERENCES [repos]([id]),
   [url] TEXT,
   [merged_by] INTEGER REFERENCES [users]([id]),
   [auto_merge] TEXT
);
CREATE INDEX [idx_pull_requests_merged_by] ON [pull_requests] ([merged_by]);
CREATE INDEX [idx_pull_requests_repo] ON [pull_requests] ([repo]);
CREATE INDEX [idx_pull_requests_milestone] ON [pull_requests] ([milestone]);
CREATE INDEX [idx_pull_requests_assignee] ON [pull_requests] ([assignee]);
CREATE INDEX [idx_pull_requests_user] ON [pull_requests] ([user]);
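Because [user], [repo] and [merged_by] are integer foreign keys, showing a username next to a pull request requires a join against the users table — which is how the table above gets its "simonw 9599" style labels. A hedged sketch, again with a toy in-memory database (table and column names follow the schema above; the inserted rows are a small sample from this page):

```python
import sqlite3

# Toy database with just the two tables and columns needed for the join.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE users (id INTEGER PRIMARY KEY, login TEXT);
CREATE TABLE pull_requests (
    id INTEGER PRIMARY KEY,
    title TEXT,
    [user] INTEGER REFERENCES users(id)
);
INSERT INTO users VALUES (9599, 'simonw'), (51016, 'ctb');
INSERT INTO pull_requests VALUES
    (301483613, 'First proof-of-concept of Datasette Library', 9599),
    (729704537, 'add support for -o --get /path', 51016);
""")
# Resolve the integer foreign key to the user's login.
labels = conn.execute("""
    SELECT pr.title, u.login
    FROM pull_requests pr JOIN users u ON pr.[user] = u.id
    ORDER BY pr.id
""").fetchall()
```

The [repo], [assignee], [milestone] and [merged_by] columns resolve the same way against their referenced tables, and the indexes above exist to keep those lookups fast.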