
issue_comments


41 rows where reactions = "{"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0}" sorted by node_id

id html_url issue_url node_id ▼ user created_at updated_at author_association body reactions issue performed_via_github_app
886969541 https://github.com/simonw/datasette/issues/1402#issuecomment-886969541 https://api.github.com/repos/simonw/datasette/issues/1402 IC_kwDOBm6k_c403hTF simonw 9599 2021-07-26T19:31:40Z 2021-07-26T19:31:40Z OWNER Datasette could do a pretty good job of this by default, using `twitter:card` and `og:url` tags - like on https://til.simonwillison.net/jq/extracting-objects-recursively I could also provide a mechanism to customize these - in particular to add images of some sort. It feels like something that should tie in to the metadata mechanism. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} feature request: social meta tags 951185411  
943632697 https://github.com/simonw/datasette/pull/1467#issuecomment-943632697 https://api.github.com/repos/simonw/datasette/issues/1467 IC_kwDOBm6k_c44PrE5 simonw 9599 2021-10-14T18:54:18Z 2021-10-14T18:54:18Z OWNER The test there failed because it turns out there's a whole bunch of places that set the `Access-Control-Allow-Origin` header. I'm going to close this PR and ship a fix that refactors those places to use the same code. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Add Authorization header when CORS flag is set 991575770  
988468238 https://github.com/simonw/datasette/issues/1528#issuecomment-988468238 https://api.github.com/repos/simonw/datasette/issues/1528 IC_kwDOBm6k_c466tQO 20after4 30934 2021-12-08T03:35:45Z 2021-12-08T03:35:45Z NONE FWIW I implemented something similar with a bit of plugin code:

```python
@hookimpl
def canned_queries(datasette: Datasette, database: str) -> Mapping[str, str]:
    # load "canned queries" from the filesystem under
    # www/sql/db/query_name.sql
    queries = {}
    sqldir = Path(__file__).parent.parent / "sql"
    if database:
        sqldir = sqldir / database
    if not sqldir.is_dir():
        return queries

    for f in sqldir.glob('*.sql'):
        try:
            sql = f.read_text('utf8').strip()
            if not len(sql):
                log(f"Skipping empty canned query file: {f}")
                continue
            queries[f.stem] = {"sql": sql}
        except OSError as err:
            log(err)
    return queries
```

{"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Add new `"sql_file"` key to Canned Queries in metadata? 1060631257
996103956 https://github.com/simonw/datasette/issues/1553#issuecomment-996103956 https://api.github.com/repos/simonw/datasette/issues/1553 IC_kwDOBm6k_c47X1cU simonw 9599 2021-12-16T19:14:38Z 2021-12-16T19:14:38Z OWNER This is a really interesting idea - kind of similar to how many APIs include custom HTTP headers informing of rate-limits. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} if csv export is truncated in non streaming mode set informative response header 1079111498  
997124280 https://github.com/simonw/datasette/issues/1546#issuecomment-997124280 https://api.github.com/repos/simonw/datasette/issues/1546 IC_kwDOBm6k_c47bui4 simonw 9599 2021-12-18T02:05:16Z 2021-12-18T02:05:16Z OWNER Sure - there are actually several levels to this.

The code that creates connections to the database is this: https://github.com/simonw/datasette/blob/83bacfa9452babe7bd66e3579e23af988d00f6ac/datasette/database.py#L72-L95

For files on disk, it does this:

```python
# For read-only connections
conn = sqlite3.connect(
    "file:my.db?mode=ro", uri=True, check_same_thread=False)

# For connections that should be treated as immutable:
conn = sqlite3.connect(
    "file:my.db?immutable=1", uri=True, check_same_thread=False)
```

For in-memory databases it runs this after the connection has been created:

```python
conn.execute("PRAGMA query_only=1")
```

SQLite `PRAGMA` queries are treated as dangerous: someone could run `PRAGMA query_only=0` to turn that previous option off, for example. So this function runs against any incoming SQL to verify that it looks like a `SELECT ...` and doesn't have anything like that in it: https://github.com/simonw/datasette/blob/83bacfa9452babe7bd66e3579e23af988d00f6ac/datasette/utils/__init__.py#L195-L204

You can see the tests for that here: https://github.com/simonw/datasette/blob/b1fed48a95516ae84c0f020582303ab50ab817e2/tests/test_utils.py#L136-L170 {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} validating the sql 1076057610
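The "looks like a `SELECT`, no `PRAGMA`" check linked in that comment can be sketched as follows — a simplified stand-in, not Datasette's actual implementation (the real function at the linked line range is stricter):

```python
import re

# Simplified sketch, not Datasette's actual code: accept only queries that
# start with SELECT, and reject anything mentioning PRAGMA anywhere.
SELECT_RE = re.compile(r"^\s*select\b", re.IGNORECASE)
PRAGMA_RE = re.compile(r"\bpragma\b", re.IGNORECASE)

def validate_sql(sql):
    """Return True if sql looks like a plain SELECT with no PRAGMA in it."""
    return bool(SELECT_RE.match(sql)) and not PRAGMA_RE.search(sql)

print(validate_sql("select * from repos"))  # True
print(validate_sql("PRAGMA query_only=0"))  # False
```

This over-rejects on purpose: a defence-in-depth filter for untrusted SQL should fail closed rather than try to enumerate every dangerous statement.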
1160712911 https://github.com/simonw/datasette/pull/1759#issuecomment-1160712911 https://api.github.com/repos/simonw/datasette/issues/1759 IC_kwDOBm6k_c5FLxLP simonw 9599 2022-06-20T17:58:37Z 2022-06-20T17:58:37Z OWNER This is a great idea. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Extract facet portions of table.html out into included templates 1275523220  
1259693536 https://github.com/simonw/datasette/issues/526#issuecomment-1259693536 https://api.github.com/repos/simonw/datasette/issues/526 IC_kwDOBm6k_c5LFWXg simonw 9599 2022-09-27T15:42:55Z 2022-09-27T15:42:55Z OWNER It's interesting to note WHY the time limit works against this so well. The time limit as-implemented looks like this: https://github.com/simonw/datasette/blob/5f9f567acbc58c9fcd88af440e68034510fb5d2b/datasette/utils/__init__.py#L181-L201 The key here is `conn.set_progress_handler(handler, n)` - which specifies that the handler function should be called every `n` SQLite operations. The handler function then checks to see if too much time has transpired and conditionally cancels the query. This also doubles up as a "maximum number of operations" guard, which is what's happening when you attempt to fetch an infinite number of rows from an infinite table. That limit code could even be extended to say "exit the query after either 5s or 50,000,000 operations". I don't think that's necessary though. To be honest I'm having trouble with the idea of dropping `max_returned_rows` mainly because what Datasette does (allow arbitrary untrusted SQL queries) is dangerous, so I've designed in multiple redundant defence-in-depth mechanisms right from the start. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Stream all results for arbitrary SQL and canned queries 459882902  
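The `conn.set_progress_handler(handler, n)` mechanism described in that comment is available directly in Python's `sqlite3` module. A minimal sketch of the time-limit idea (not Datasette's actual code):

```python
import sqlite3
import time

# Minimal sketch (not Datasette's implementation) of a query time limit
# enforced via SQLite's progress handler: the handler runs every n_ops
# virtual-machine operations, and returning non-zero interrupts the query.
def run_with_time_limit(conn, sql, limit_seconds=0.05, n_ops=1000):
    deadline = time.monotonic() + limit_seconds

    def handler():
        return 1 if time.monotonic() > deadline else 0

    conn.set_progress_handler(handler, n_ops)
    try:
        return conn.execute(sql).fetchall()
    finally:
        conn.set_progress_handler(None, n_ops)

conn = sqlite3.connect(":memory:")
# A deliberately expensive query: generate ten million rows
slow_sql = """
    with recursive nums(n) as (select 1 union all select n + 1 from nums)
    select n from nums limit 10000000
"""
try:
    run_with_time_limit(conn, slow_sql)
    outcome = "completed"
except sqlite3.OperationalError:
    outcome = "interrupted"
print(outcome)  # interrupted
```

Because the handler fires on an operation count, not a timer, the same hook doubles as the "maximum number of operations" guard the comment mentions.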
1313052863 https://github.com/simonw/datasette/issues/1886#issuecomment-1313052863 https://api.github.com/repos/simonw/datasette/issues/1886 IC_kwDOBm6k_c5OQ5i_ simonw 9599 2022-11-14T03:40:50Z 2022-11-14T03:40:50Z OWNER Tim Sherratt on Twitter: https://twitter.com/wragge/status/1591930345469153282

> Where do I start? The [#GLAMWorkbench](https://twitter.com/hashtag/GLAMWorkbench?src=hashtag_click) now includes a number of examples where GLAM data is harvested, processed, and then made available for exploration via Datasette.
>
> https://glam-workbench.net/
>
> For example the GLAM Name Index Search brings together 10+ million entries from 240 indexes and provides an aggregated search using the Datasette search-all plugin:
>
> https://glam-workbench.net/name-search/
>
> Most recently I converted PDFs of the Tasmanian Postal Directories to a big Datasette instance: https://updates.timsherratt.org/2022/09/15/from-pdfs-to.html the process is documented and reusable.

{"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Call for birthday presents: if you're using Datasette, let us know how you're using it here 1447050738
1368267484 https://github.com/simonw/datasette/pull/1967#issuecomment-1368267484 https://api.github.com/repos/simonw/datasette/issues/1967 IC_kwDOBm6k_c5Rjhrc simonw 9599 2022-12-31T19:15:50Z 2022-12-31T19:15:50Z OWNER My Firefox tab before: <img width="169" alt="image" src="https://user-images.githubusercontent.com/9599/210153361-42241d3b-7681-42ed-a491-c0d513ef41fe.png"> And after: <img width="169" alt="image" src="https://user-images.githubusercontent.com/9599/210153655-f74ef94f-7223-468b-9ca0-cf944393d993.png"> {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Add favicon to documentation 1503010009  
1672224611 https://github.com/simonw/datasette/issues/2133#issuecomment-1672224611 https://api.github.com/repos/simonw/datasette/issues/2133 IC_kwDOBm6k_c5jrB9j simonw 9599 2023-08-09T22:07:43Z 2023-08-09T22:07:43Z OWNER Documentation: https://docs.datasette.io/en/latest/plugins.html#seeing-what-plugins-are-installed {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} [feature request]`datasette install plugins.json` options 1841501975  
1722943484 https://github.com/simonw/datasette/pull/2052#issuecomment-1722943484 https://api.github.com/repos/simonw/datasette/issues/2052 IC_kwDOBm6k_c5msgf8 20after4 30934 2023-09-18T08:14:47Z 2023-09-18T08:14:47Z NONE This is such a well thought out contribution. I don't think I've seen such a thoroughly considered PR on any project in recent memory. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 1651082214  
1760552652 https://github.com/simonw/datasette/pull/2052#issuecomment-1760552652 https://api.github.com/repos/simonw/datasette/issues/2052 IC_kwDOBm6k_c5o7-bM simonw 9599 2023-10-12T23:59:21Z 2023-10-12T23:59:21Z OWNER I'm landing this despite the cog failures. I'll fix them on main if I have to. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} feat: Javascript Plugin API (Custom panels, column menu items with JS actions) 1651082214  
1030730108 https://github.com/simonw/sqlite-utils/issues/397#issuecomment-1030730108 https://api.github.com/repos/simonw/sqlite-utils/issues/397 IC_kwDOCGYnMM49b7F8 simonw 9599 2022-02-06T01:30:46Z 2022-02-06T01:30:46Z OWNER Updated documentation is here: https://sqlite-utils.datasette.io/en/latest/python-api.html#explicitly-creating-a-table {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Support IF NOT EXISTS for table creation 1123903919  
1065440445 https://github.com/simonw/sqlite-utils/issues/411#issuecomment-1065440445 https://api.github.com/repos/simonw/sqlite-utils/issues/411 IC_kwDOCGYnMM4_gVS9 simonw 9599 2022-03-11T19:52:15Z 2022-03-11T19:52:15Z OWNER Two new parameters to `.create_table()` and friends:

- `generated={...}` - generated column definitions
- `generated_stored={...}` - generated stored column definitions

These columns will be added at the end of the table, but you can use the `column_order=` parameter to apply a different order. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Support for generated columns 1160034488
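The SQL those two parameters would need to emit is SQLite's generated-columns syntax (SQLite 3.31+). A sketch of the underlying feature using the stdlib `sqlite3` module rather than the sqlite-utils API — the table and column names are illustrative only:

```python
import sqlite3

# Demonstrates the two flavours of SQLite generated columns that
# generated={...} (VIRTUAL) and generated_stored={...} (STORED) map to.
# Requires SQLite 3.31 or later.
conn = sqlite3.connect(":memory:")
conn.execute("""
    create table products (
        price integer,
        tax_rate real,
        -- VIRTUAL: computed on read, not stored on disk
        price_with_tax real generated always as (price * (1 + tax_rate)) virtual,
        -- STORED: computed on write and stored with the row
        price_doubled integer generated always as (price * 2) stored
    )
""")
conn.execute("insert into products (price, tax_rate) values (100, 0.5)")
row = conn.execute(
    "select price_with_tax, price_doubled from products"
).fetchone()
print(row)  # (150.0, 200)
```

Only `price` and `tax_rate` are written; SQLite derives the other two columns, which is why they can safely land at the end of the table.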
1232419522 https://github.com/simonw/sqlite-utils/pull/480#issuecomment-1232419522 https://api.github.com/repos/simonw/sqlite-utils/issues/480 IC_kwDOCGYnMM5JdTrC simonw 9599 2022-08-31T03:33:27Z 2022-08-31T03:33:27Z OWNER Tests look great, thank you! {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} search_sql add include_rank option 1355433619  
1258437060 https://github.com/simonw/sqlite-utils/issues/490#issuecomment-1258437060 https://api.github.com/repos/simonw/sqlite-utils/issues/490 IC_kwDOCGYnMM5LAjnE simonw 9599 2022-09-26T18:24:44Z 2022-09-26T18:24:44Z OWNER Just saw your great write-up on this: https://jeqo.github.io/notes/2022-09-24-ingest-logs-sqlite/ {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Ability to insert multi-line files 1382457780  
1539058795 https://github.com/simonw/sqlite-utils/pull/519#issuecomment-1539058795 https://api.github.com/repos/simonw/sqlite-utils/issues/519 IC_kwDOCGYnMM5bvCxr simonw 9599 2023-05-08T21:12:52Z 2023-05-08T21:12:52Z OWNER This is a really neat fix, thank you. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Fixes breaking DEFAULT values 1505568103  
1556210844 https://github.com/simonw/sqlite-utils/issues/545#issuecomment-1556210844 https://api.github.com/repos/simonw/sqlite-utils/issues/545 IC_kwDOCGYnMM5cweSc simonw 9599 2023-05-21T15:44:10Z 2023-05-21T15:44:10Z OWNER It looks like `nargs=-1` on a positional argument isn't yet supported - opened an issue here: - https://github.com/Textualize/trogon/issues/4 {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Try out Trogon for a tui interface 1718517882  
1606415188 https://github.com/simonw/sqlite-utils/issues/235#issuecomment-1606415188 https://api.github.com/repos/simonw/sqlite-utils/issues/235 IC_kwDOCGYnMM5fv_NU simonw 9599 2023-06-26T01:46:47Z 2023-06-26T01:47:01Z OWNER I just tested this in a brand new virtual environment using the macOS Python 3:

```bash
pipenv shell --python /Applications/Xcode.app/Contents/Developer/usr/bin/python3
```

Then in that virtual environment I ran:

```bash
pip install sqlite-utils
# Confirm the right one is on the path:
which sqlite-utils

curl "https://data.nasa.gov/resource/y77d-th95.json" | \
  sqlite-utils insert meteorites.db meteorites - --pk=id
sqlite-utils extract meteorites.db meteorites recclass
```

This threw the same error reported above. Then I did this:

```bash
rm meteorites.db
pip install sqlean.py

curl "https://data.nasa.gov/resource/y77d-th95.json" | \
  sqlite-utils insert meteorites.db meteorites - --pk=id
sqlite-utils extract meteorites.db meteorites recclass
```

And that second time it worked correctly. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Extract columns cannot create foreign key relation: sqlite3.OperationalError: table sqlite_master may not be modified 810618495
1646686424 https://github.com/simonw/sqlite-utils/issues/567#issuecomment-1646686424 https://api.github.com/repos/simonw/sqlite-utils/issues/567 IC_kwDOCGYnMM5iJnDY simonw 9599 2023-07-22T22:52:34Z 2023-07-22T22:52:34Z OWNER Splitting off an issue for `prepare_connection()` since Alex got the PR in seconds before I shipped 3.34! {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Plugin system 1801394744  
1683076325 https://github.com/simonw/sqlite-utils/issues/577#issuecomment-1683076325 https://api.github.com/repos/simonw/sqlite-utils/issues/577 IC_kwDOCGYnMM5kUbTl simonw 9599 2023-08-17T22:48:36Z 2023-08-17T22:48:36Z OWNER I'm inclined to just go with the `.transform()` method and not attempt to keep around the method that involves updating `sqlite_master` and then add code to detect if that's possible (or catch if it fails) and fall back on the other mechanism. It would be nice to drop some code complexity, plus I don't yet have a way of running automated tests against Python + SQLite versions that exhibit the problem. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Get `add_foreign_keys()` to work without modifying `sqlite_master` 1817289521  
547373739 https://github.com/simonw/datasette/issues/594#issuecomment-547373739 https://api.github.com/repos/simonw/datasette/issues/594 MDEyOklzc3VlQ29tbWVudDU0NzM3MzczOQ== willingc 2680980 2019-10-29T11:21:52Z 2019-10-29T11:21:52Z NONE Just an FYI for folks wishing to run datasette with Python 3.8, I was able to successfully use datasette with the following in a virtual environment:

```
pip install uvloop==0.14.0rc1
pip install uvicorn==0.9.1
```

{"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} upgrade to uvicorn-0.9 to be Python-3.8 friendly 506297048
552276247 https://github.com/simonw/datasette/issues/594#issuecomment-552276247 https://api.github.com/repos/simonw/datasette/issues/594 MDEyOklzc3VlQ29tbWVudDU1MjI3NjI0Nw== simonw 9599 2019-11-11T03:13:00Z 2019-11-11T03:13:00Z OWNER #622 {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} upgrade to uvicorn-0.9 to be Python-3.8 friendly 506297048  
501541902 https://github.com/simonw/sqlite-utils/issues/26#issuecomment-501541902 https://api.github.com/repos/simonw/sqlite-utils/issues/26 MDEyOklzc3VlQ29tbWVudDUwMTU0MTkwMg== simonw 9599 2019-06-13T04:15:22Z 2019-06-13T16:55:42Z OWNER So maybe something like this:

```
curl https://api.github.com/repos/simonw/datasette/pulls?state=all | \
  sqlite-utils insert git.db pulls - \
    --flatten=base \
    --flatten=head \
    --extract=user:users:id \
    --extract=head_repo.license:licenses:key \
    --extract=head_repo.owner:users \
    --extract=head_repo \
    --extract=base_repo.license:licenses:key \
    --extract=base_repo.owner:users \
    --extract=base_repo
```

Is the order of those nested `--extract` lines significant, I wonder? It would be nice if the order didn't matter and the code figured out the right execution plan on its own. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Mechanism for turning nested JSON into foreign keys / many-to-many 455486286
640116494 https://github.com/simonw/datasette/issues/791#issuecomment-640116494 https://api.github.com/repos/simonw/datasette/issues/791 MDEyOklzc3VlQ29tbWVudDY0MDExNjQ5NA== simonw 9599 2020-06-06T20:50:41Z 2020-06-06T20:50:41Z OWNER I have a better idea: a feed reader! You can insert URLs to feeds, then have a command which fetches the latest entries from them into a separate table. Then implement favorites as a canned query, let you search your favorites, etc. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Tutorial: building a something-interesting with writable canned queries 628572716  
642522285 https://github.com/simonw/datasette/issues/394#issuecomment-642522285 https://api.github.com/repos/simonw/datasette/issues/394 MDEyOklzc3VlQ29tbWVudDY0MjUyMjI4NQ== LVerneyPEReN 58298410 2020-06-11T09:15:19Z 2020-06-11T09:15:19Z NONE Hi @wragge,

This looks great, thanks for the share! I refactored it into a self-contained function, binding on a random available TCP port (multi-user context). I am using subprocess API directly since the `%run` magic was leaving defunct process behind :/

![image](https://user-images.githubusercontent.com/58298410/84367566-b5d0d500-abd4-11ea-96e2-f5c05a28e506.png)

```python
import socket

from signal import SIGINT
from subprocess import Popen, PIPE

from IPython.display import display, HTML
from notebook.notebookapp import list_running_servers


def get_free_tcp_port():
    """
    Get a free TCP port.
    """
    tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    tcp.bind(('', 0))
    _, port = tcp.getsockname()
    tcp.close()
    return port


def datasette(database):
    """
    Run datasette on an SQLite database.
    """
    # Get current running servers
    servers = list_running_servers()

    # Get the current base url
    base_url = next(servers)['base_url']

    # Get a free port
    port = get_free_tcp_port()

    # Create a base url for Datasette using the proxy path
    proxy_url = f'{base_url}proxy/absolute/{port}/'

    # Display a link to Datasette
    display(HTML(f'<p><a href="{proxy_url}">View Datasette</a> (Click on the stop button to close the Datasette server)</p>'))

    # Launch Datasette
    with Popen(
        [
            'python', '-m', 'datasette', '--',
            database,
            '--port', str(port),
            '--config', f'base_url:{proxy_url}'
        ],
        stdout=PIPE,
        stderr=PIPE,
        bufsize=1,
        universal_newlines=True
    ) as p:
        print(p.stdout.readline(), end='')
        while True:
            try:
                line = p.stderr.readline()
                if not line:
                    break
                print(line, end='')
                exit_code = p.poll()
            except KeyboardInterrupt:
                p.send_signal(SIGINT)
```

Ideal… {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} base_url configuration setting 396212021
645030262 https://github.com/simonw/datasette/issues/850#issuecomment-645030262 https://api.github.com/repos/simonw/datasette/issues/850 MDEyOklzc3VlQ29tbWVudDY0NTAzMDI2Mg== simonw 9599 2020-06-16T21:51:01Z 2020-06-16T21:51:39Z OWNER File locking is interesting here. https://docs.aws.amazon.com/lambda/latest/dg/services-efs.html

> Amazon EFS supports [file locking](https://docs.aws.amazon.com/efs/latest/ug/how-it-works.html#consistency) to prevent corruption if multiple functions try to write to the same file system at the same time. Locking in Amazon EFS follows the NFS v4.1 protocol for advisory locking, and enables your applications to use both whole file and byte range locks.

SQLite can apparently work on NFS v4.1. I think I'd rather set things up so there's only ever one writer - so a Datasette instance could scale reads by running lots more lambda functions but only one function ever writes to a file at a time. Not sure if that's feasible with Lambda though - maybe by adding some additional shared state mechanism like Redis? {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Proof of concept for Datasette on AWS Lambda with EFS 639993467
671072223 https://github.com/simonw/datasette/issues/919#issuecomment-671072223 https://api.github.com/repos/simonw/datasette/issues/919 MDEyOklzc3VlQ29tbWVudDY3MTA3MjIyMw== simonw 9599 2020-08-09T16:26:17Z 2020-08-09T16:26:17Z OWNER Should be released in a couple of minutes: https://travis-ci.org/github/simonw/datasette/builds/716328883 {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Travis should not build the master branch, only the main branch 675727366  
609455243 https://github.com/simonw/datasette/issues/716#issuecomment-609455243 https://api.github.com/repos/simonw/datasette/issues/716 MDEyOklzc3VlQ29tbWVudDYwOTQ1NTI0Mw== simonw 9599 2020-04-05T17:47:33Z 2020-04-05T17:47:33Z OWNER You start `git bisect` by giving it a known bad commit and a known good one:

```
git bisect start master 286ed28
```

Then you tell it to start running your script:

```
git bisect run python ../datasette-issue-716/check_view_name.py
```

Here's what I got:

```
(datasette) ~/Dropbox/Development/datasette $ git bisect start master 286ed28
Bisecting: 30 revisions left to test after this (roughly 5 steps)
[dc80e779a2e708b2685fc641df99e6aae9ad6f97] Handle scope path if it is a string
(datasette) ~/Dropbox/Development/datasette $ git bisect run python ../datasette-issue-716/check_view_name.py
running python ../datasette-issue-716/check_view_name.py
Traceback (most recent call last):
...
Bisecting: 15 revisions left to test after this (roughly 4 steps)
[7c6a9c35299f251f9abfb03fd8e85143e4361709] Better tests for prepare_connection() plugin hook, refs #678
running python ../datasette-issue-716/check_view_name.py
Traceback (most recent call last):
...
Bisecting: 7 revisions left to test after this (roughly 3 steps)
[0091dfe3e5a3db94af8881038d3f1b8312bb857d] More reliable tie-break ordering for facet results
running python ../datasette-issue-716/check_view_name.py
Traceback (most recent call last):
...
Bisecting: 3 revisions left to test after this (roughly 2 steps)
[ce12244037b60ba0202c814871218c1dab38d729] Release notes for 0.35
running python ../datasette-issue-716/check_view_name.py
Traceback (most recent call last):
...
Bisecting: 0 revisions left to test after this (roughly 1 step)
[4d7dae9eb75e5430c3ee3c369bb5cd9ba0a148bc] Added a bunch more plugins to the Ecosystem page
running python ../datasette-issue-716/check_view_name.py
Traceback (most recent call last):
...
70b915fb4bc214f9d064179f87671f8a378aa127 is the first bad commit
commit 70b915fb4bc214f9d064179f87671f8a378aa127
Author: Simon Willison <swillison@gmail.com>
Date:   Tue Feb 4 12:26:17 2020 -0800

    Datasette.render_template() method, closes #577

    Pull request #664.

:040000 040000 def9e31252e056845609…
```

{"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} extra_template_vars() sending wrong view_name for index 594168758
634254943 https://github.com/simonw/datasette/issues/744#issuecomment-634254943 https://api.github.com/repos/simonw/datasette/issues/744 MDEyOklzc3VlQ29tbWVudDYzNDI1NDk0Mw== simonw 9599 2020-05-26T20:15:11Z 2020-05-26T20:15:11Z OWNER Oh no! It looks like `dirs_exist_ok` is Python 3.8 only. This is a bad fix, it needs to work on older Pythons too. Re-opening. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} link_or_copy_directory() error - Invalid cross-device link 608058890
744757558 https://github.com/simonw/datasette/issues/1143#issuecomment-744757558 https://api.github.com/repos/simonw/datasette/issues/1143 MDEyOklzc3VlQ29tbWVudDc0NDc1NzU1OA== simonw 9599 2020-12-14T22:42:10Z 2020-12-14T22:42:10Z OWNER This may involve a breaking change to the CLI settings interface, so I'm adding this to the 1.0 milestone. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} More flexible CORS support in core, to encourage good security practices 764059235  
749749948 https://github.com/simonw/datasette/issues/509#issuecomment-749749948 https://api.github.com/repos/simonw/datasette/issues/509 MDEyOklzc3VlQ29tbWVudDc0OTc0OTk0OA== simonw 9599 2020-12-22T20:03:10Z 2020-12-22T20:03:10Z OWNER If you open multiple files with the same filename, e.g. like this:

    datasette fixtures.db templates/fixtures.db plugins/fixtures.db

You'll now get this:

<img width="702" alt="Datasette__fixtures__fixtures_2__fixtures_3" src="https://user-images.githubusercontent.com/9599/102928494-a4069f00-444d-11eb-9c1e-382a4e27bb51.png"> {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Support opening multiple databases with the same stem 456568880
754911290 https://github.com/simonw/datasette/issues/1171#issuecomment-754911290 https://api.github.com/repos/simonw/datasette/issues/1171 MDEyOklzc3VlQ29tbWVudDc1NDkxMTI5MA== rcoup 59874 2021-01-05T21:31:15Z 2021-01-05T21:31:15Z NONE We did this for [Sno](https://sno.earth) under macOS — it's a PyInstaller binary/setup which uses [Packages](http://s.sudre.free.fr/Software/Packages/about.html) for packaging.

* [Building & Signing](https://github.com/koordinates/sno/blob/master/platforms/Makefile#L67-L95)
* [Packaging & Notarizing](https://github.com/koordinates/sno/blob/master/platforms/Makefile#L121-L215)
* [Github Workflow](https://github.com/koordinates/sno/blob/master/.github/workflows/build.yml#L228-L269) has the CI side of it

FYI (if you ever get to it) for Windows you need to get a code signing certificate. And if you want automated CI, you'll want to get an "EV CodeSigning for HSM" certificate from GlobalSign, which then lets you put the certificate into Azure Key Vault. Which you can use with [azuresigntool](https://github.com/vcsjones/AzureSignTool) to sign your code & installer. (Non-EV certificates are a waste of time, the user still gets big warnings at install time). {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} GitHub Actions workflow to build and sign macOS binary executables 778450486
786812716 https://github.com/simonw/datasette/issues/1240#issuecomment-786812716 https://api.github.com/repos/simonw/datasette/issues/1240 MDEyOklzc3VlQ29tbWVudDc4NjgxMjcxNg== simonw 9599 2021-02-26T18:18:18Z 2021-02-26T18:18:18Z OWNER Agreed, this would be extremely useful. I'd love to be able to facet against custom queries. It's a fair bit of work to implement but it's not impossible. Closing this as a duplicate of #972. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Allow facetting on custom queries 814591962  
792233255 https://github.com/simonw/datasette/pull/1223#issuecomment-792233255 https://api.github.com/repos/simonw/datasette/issues/1223 MDEyOklzc3VlQ29tbWVudDc5MjIzMzI1NQ== simonw 9599 2021-03-07T07:41:01Z 2021-03-07T07:41:01Z OWNER This is fantastic, thanks so much for tracking this down. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Add compile option to Dockerfile to fix failing test (fixes #696) 806918878  
797158641 https://github.com/simonw/datasette/issues/670#issuecomment-797158641 https://api.github.com/repos/simonw/datasette/issues/670 MDEyOklzc3VlQ29tbWVudDc5NzE1ODY0MQ== simonw 9599 2021-03-12T00:59:49Z 2021-03-12T00:59:49Z OWNER

> Challenge: what's the equivalent for PostgreSQL of opening a database in read only mode? Will I have to talk users through creating read only credentials?

It looks like the answer to this is yes - I'll need users to set up read-only credentials. Here's a TIL about that: https://til.simonwillison.net/postgresql/read-only-postgresql-user {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Prototoype for Datasette on PostgreSQL 564833696
709633080 https://github.com/simonw/datasette/issues/865#issuecomment-709633080 https://api.github.com/repos/simonw/datasette/issues/865 MDEyOklzc3VlQ29tbWVudDcwOTYzMzA4MA== simonw 9599 2020-10-15T22:58:51Z 2020-10-15T22:58:51Z OWNER It looks like there are places where Datasette might return a redirect that doesn't take `base_url` into account - I'm planning on fixing those here, after which I think `ProxyPassReverse` should no longer be necessary. https://github.com/simonw/datasette/issues/1025#issuecomment-709632136 {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} base_url doesn't seem to work when adding criteria and clicking "apply" 644582921  
843291675 https://github.com/simonw/datasette/issues/856#issuecomment-843291675 https://api.github.com/repos/simonw/datasette/issues/856 MDEyOklzc3VlQ29tbWVudDg0MzI5MTY3NQ== simonw 9599 2021-05-18T15:56:45Z 2021-05-18T15:56:45Z OWNER Tables and views get "stream all rows" at the moment, so one workaround is to define a SQL view for your query - this only works for queries that don't take any parameters though (although you may be able to define a view and then pass it extra fields using the Datasette table interface, like on https://latest.datasette.io/fixtures/paginated_view?content_extra__contains=9)

I've explored this problem in a bit more detail in https://github.com/simonw/django-sql-dashboard and I think I have a pattern that could work. For your canned query, you could implement the pattern yourself by setting up two canned queries that look something like this: https://github-to-sqlite.dogsheep.net/github?sql=select+rowid%2C+sha%2C+author_date+from+commits+order+by+rowid+limit+1000

```sql
select rowid, sha, author_date from commits order by rowid limit 1000
```

That gets you the first set of 1,000 results. The important thing here is to order by a unique column, in this case `rowid` - because then subsequent pages can be loaded by a separate canned query that looks like this:

```sql
select rowid, sha, author_date from commits where rowid > :after order by rowid limit 1000
```

https://github-to-sqlite.dogsheep.net/github?sql=select+rowid%2C+sha%2C+author_date+from+commits+where+rowid+%3E+%3Aafter+order+by+rowid+limit+1000&after=1000

You then need to write code which knows how to generate these queries - start with the first query with no `where` clause (or if you are using `rowid` you can just use the second query and pass it `?after=0` for the first call) - then keep calling the query, passing in the last rowid you received as the `after` parameter. Basically this is an implementation of keyset pagination with a smart client.

When Datasette grows the ability to do this itself it will work by executing this mechanism inside the Python code, which is how the "stream all rows" option for tables works at the moment. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Consider pagination of canned queries 642296989
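The "smart client" loop that comment describes can be sketched in Python against a local SQLite table standing in for the two canned queries (the table contents here are made up for the demo):

```python
import sqlite3

# Build a demo table; SQLite gives each row an implicit rowid (1, 2, 3, ...)
conn = sqlite3.connect(":memory:")
conn.execute("create table commits (sha text)")
conn.executemany(
    "insert into commits (sha) values (?)",
    [(f"sha{i}",) for i in range(2500)],
)

def fetch_all_rows(conn, page_size=1000):
    """Keyset pagination: each page asks for rowid > last-seen-rowid."""
    after = 0  # rowid > 0 matches everything, so this covers the first call
    while True:
        page = conn.execute(
            "select rowid, sha from commits where rowid > ? "
            "order by rowid limit ?",
            (after, page_size),
        ).fetchall()
        if not page:
            break
        yield from page
        after = page[-1][0]  # last rowid seen becomes the next cursor

rows = list(fetch_all_rows(conn))
print(len(rows))  # 2500, fetched in three pages of at most 1,000 rows
```

Unlike `OFFSET`-based paging, each page here is an indexed range scan from the cursor position, so the cost per page stays flat no matter how deep into the table the client has paged.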
855369819 https://github.com/simonw/datasette/issues/283#issuecomment-855369819 https://api.github.com/repos/simonw/datasette/issues/283 MDEyOklzc3VlQ29tbWVudDg1NTM2OTgxOQ== simonw 9599 2021-06-06T09:40:18Z 2021-06-06T09:40:18Z OWNER

> One note on using this pragma I got an error on starting datasette `no such table: pragma_database_list`.
>
> I diagnosed this to an older version of sqlite3 (3.14.2) and upgrading to a newer version (3.34.2) fixed the issue.

That issue is fixed in #1276. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} Support cross-database joins 325958506
864092515 https://github.com/simonw/sqlite-utils/pull/277#issuecomment-864092515 https://api.github.com/repos/simonw/sqlite-utils/issues/277 MDEyOklzc3VlQ29tbWVudDg2NDA5MjUxNQ== simonw 9599 2021-06-18T14:47:57Z 2021-06-18T14:47:57Z OWNER This is a neat improvement. {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} add -h support closes #276 923612361  
873156408 https://github.com/simonw/datasette/issues/1387#issuecomment-873156408 https://api.github.com/repos/simonw/datasette/issues/1387 MDEyOklzc3VlQ29tbWVudDg3MzE1NjQwOA== simonw 9599 2021-07-02T17:37:30Z 2021-07-02T17:37:30Z OWNER Updated documentation is here: https://docs.datasette.io/en/latest/deploying.html#apache-proxy-configuration {"total_count": 1, "+1": 0, "-1": 0, "laugh": 0, "hooray": 0, "confused": 0, "heart": 1, "rocket": 0, "eyes": 0} absolute_url() behind a proxy assembles incorrect http://127.0.0.1:8001/ URLs 935930820  


CREATE TABLE [issue_comments] (
   [html_url] TEXT,
   [issue_url] TEXT,
   [id] INTEGER PRIMARY KEY,
   [node_id] TEXT,
   [user] INTEGER REFERENCES [users]([id]),
   [created_at] TEXT,
   [updated_at] TEXT,
   [author_association] TEXT,
   [body] TEXT,
   [reactions] TEXT,
   [issue] INTEGER REFERENCES [issues]([id])
, [performed_via_github_app] TEXT);
CREATE INDEX [idx_issue_comments_issue]
                ON [issue_comments] ([issue]);
CREATE INDEX [idx_issue_comments_user]
                ON [issue_comments] ([user]);
Powered by Datasette