[{"html_url": "https://github.com/simonw/datasette/issues/1143#issuecomment-744757558", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1143", "id": 744757558, "node_id": "MDEyOklzc3VlQ29tbWVudDc0NDc1NzU1OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-12-14T22:42:10Z", "updated_at": "2020-12-14T22:42:10Z", "author_association": "OWNER", "body": "This may involve a breaking change to the CLI settings interface, so I'm adding this to the 1.0 milestone.", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 764059235, "label": "More flexible CORS support in core, to encourage good security practices"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/issues/1171#issuecomment-754911290", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1171", "id": 754911290, "node_id": "MDEyOklzc3VlQ29tbWVudDc1NDkxMTI5MA==", "user": {"value": 59874, "label": "rcoup"}, "created_at": "2021-01-05T21:31:15Z", "updated_at": "2021-01-05T21:31:15Z", "author_association": "NONE", "body": "We did this for [Sno](https://sno.earth) under macOS \u2014 it's a PyInstaller binary/setup which uses [Packages](http://s.sudre.free.fr/Software/Packages/about.html) for packaging.\r\n\r\n* [Building & Signing](https://github.com/koordinates/sno/blob/master/platforms/Makefile#L67-L95)\r\n* [Packaging & Notarizing](https://github.com/koordinates/sno/blob/master/platforms/Makefile#L121-L215)\r\n* [Github Workflow](https://github.com/koordinates/sno/blob/master/.github/workflows/build.yml#L228-L269) has the CI side of it\r\n\r\nFYI (if you ever get to it) for Windows you need to get a code signing certificate. And if you want automated CI, you'll want to get an \"EV CodeSigning for HSM\" certificate from GlobalSign, which then lets you put the certificate into Azure Key Vault. 
Which you can use with [azuresigntool](https://github.com/vcsjones/AzureSignTool) to sign your code & installer. (Non-EV certificates are a waste of time, the user still gets big warnings at install time).\r\n", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 778450486, "label": "GitHub Actions workflow to build and sign macOS binary executables"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/pull/1223#issuecomment-792233255", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1223", "id": 792233255, "node_id": "MDEyOklzc3VlQ29tbWVudDc5MjIzMzI1NQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-03-07T07:41:01Z", "updated_at": "2021-03-07T07:41:01Z", "author_association": "OWNER", "body": "This is fantastic, thanks so much for tracking this down.", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 806918878, "label": "Add compile option to Dockerfile to fix failing test (fixes #696)"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/issues/1240#issuecomment-786812716", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1240", "id": 786812716, "node_id": "MDEyOklzc3VlQ29tbWVudDc4NjgxMjcxNg==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-02-26T18:18:18Z", "updated_at": "2021-02-26T18:18:18Z", "author_association": "OWNER", "body": "Agreed, this would be extremely useful. I'd love to be able to facet against custom queries. It's a fair bit of work to implement but it's not impossible. 
Closing this as a duplicate of #972.", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 814591962, "label": "Allow facetting on custom queries"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/issues/1387#issuecomment-873156408", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1387", "id": 873156408, "node_id": "MDEyOklzc3VlQ29tbWVudDg3MzE1NjQwOA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-07-02T17:37:30Z", "updated_at": "2021-07-02T17:37:30Z", "author_association": "OWNER", "body": "Updated documentation is here: https://docs.datasette.io/en/latest/deploying.html#apache-proxy-configuration", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 935930820, "label": "absolute_url() behind a proxy assembles incorrect http://127.0.0.1:8001/ URLs"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/issues/1402#issuecomment-886969541", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1402", "id": 886969541, "node_id": "IC_kwDOBm6k_c403hTF", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-07-26T19:31:40Z", "updated_at": "2021-07-26T19:31:40Z", "author_association": "OWNER", "body": "Datasette could do a pretty good job of this by default, using `twitter:card` and `og:url` tags - like on https://til.simonwillison.net/jq/extracting-objects-recursively\r\n\r\nI could also provide a mechanism to customize these - in particular to add images of some sort.\r\n\r\nIt feels like something that should tie in to the metadata mechanism.", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 
951185411, "label": "feature request: social meta tags"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/pull/1467#issuecomment-943632697", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1467", "id": 943632697, "node_id": "IC_kwDOBm6k_c44PrE5", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-10-14T18:54:18Z", "updated_at": "2021-10-14T18:54:18Z", "author_association": "OWNER", "body": "The test there failed because it turns out there's a whole bunch of places that set the `Access-Control-Allow-Origin` header. I'm going to close this PR and ship a fix that refactors those places to use the same code.", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 991575770, "label": "Add Authorization header when CORS flag is set"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/issues/1528#issuecomment-988468238", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1528", "id": 988468238, "node_id": "IC_kwDOBm6k_c466tQO", "user": {"value": 30934, "label": "20after4"}, "created_at": "2021-12-08T03:35:45Z", "updated_at": "2021-12-08T03:35:45Z", "author_association": "NONE", "body": "FWIW I implemented something similar with a bit of plugin code:\r\n\r\n```python\r\n@hookimpl\r\ndef canned_queries(datasette: Datasette, database: str) -> Mapping[str, str]:\r\n # load \"canned queries\" from the filesystem under\r\n # www/sql/db/query_name.sql\r\n queries = {}\r\n\r\n sqldir = Path(__file__).parent.parent / \"sql\"\r\n if database:\r\n sqldir = sqldir / database\r\n\r\n if not sqldir.is_dir():\r\n return queries\r\n\r\n for f in sqldir.glob('*.sql'):\r\n try:\r\n sql = f.read_text('utf8').strip()\r\n if not len(sql):\r\n log(f\"Skipping empty canned query file: {f}\")\r\n continue\r\n queries[f.stem] = { \"sql\": sql }\r\n except OSError as 
err:\r\n log(err)\r\n\r\n return queries\r\n\r\n\r\n\r\n```", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1060631257, "label": "Add new `\"sql_file\"` key to Canned Queries in metadata?"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/issues/1546#issuecomment-997124280", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1546", "id": 997124280, "node_id": "IC_kwDOBm6k_c47bui4", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-18T02:05:16Z", "updated_at": "2021-12-18T02:05:16Z", "author_association": "OWNER", "body": "Sure - there are actually several levels to this.\r\n\r\nThe code that creates connections to the database is this: https://github.com/simonw/datasette/blob/83bacfa9452babe7bd66e3579e23af988d00f6ac/datasette/database.py#L72-L95\r\n\r\nFor files on disk, it does this:\r\n```python\r\n# For read-only connections\r\nconn = sqlite3.connect( \"file:my.db?mode=ro\", uri=True, check_same_thread=False)\r\n# For connections that should be treated as immutable:\r\nconn = sqlite3.connect( \"file:my.db?immutable=1\", uri=True, check_same_thread=False)\r\n```\r\nFor in-memory databases it runs this after the connection has been created:\r\n```python\r\nconn.execute(\"PRAGMA query_only=1\")\r\n```\r\nSQLite `PRAGMA` queries are treated as dangerous: someone could run `PRAGMA query_only=0` to turn that previous option off for example.\r\n\r\nSo this function runs against any incoming SQL to verify that it looks like a `SELECT ...` and doesn't have anything like that in it.\r\n\r\nhttps://github.com/simonw/datasette/blob/83bacfa9452babe7bd66e3579e23af988d00f6ac/datasette/utils/__init__.py#L195-L204\r\n\r\nYou can see the tests for that here: https://github.com/simonw/datasette/blob/b1fed48a95516ae84c0f020582303ab50ab817e2/tests/test_utils.py#L136-L170", "reactions": 
"{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1076057610, "label": "validating the sql"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/issues/1553#issuecomment-996103956", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1553", "id": 996103956, "node_id": "IC_kwDOBm6k_c47X1cU", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-12-16T19:14:38Z", "updated_at": "2021-12-16T19:14:38Z", "author_association": "OWNER", "body": "This is a really interesting idea - kind of similar to how many APIs include custom HTTP headers informing of rate-limits.", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1079111498, "label": "if csv export is truncated in non streaming mode set informative response header"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/pull/1759#issuecomment-1160712911", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1759", "id": 1160712911, "node_id": "IC_kwDOBm6k_c5FLxLP", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-06-20T17:58:37Z", "updated_at": "2022-06-20T17:58:37Z", "author_association": "OWNER", "body": "This is a great idea.", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1275523220, "label": "Extract facet portions of table.html out into included templates"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/issues/1886#issuecomment-1313052863", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1886", "id": 1313052863, "node_id": "IC_kwDOBm6k_c5OQ5i_", "user": {"value": 9599, "label": "simonw"}, "created_at": 
"2022-11-14T03:40:50Z", "updated_at": "2022-11-14T03:40:50Z", "author_association": "OWNER", "body": "Tim Sherratt on Twitter: https://twitter.com/wragge/status/1591930345469153282\r\n\r\n> Where do I start? The [#GLAMWorkbench](https://twitter.com/hashtag/GLAMWorkbench?src=hashtag_click) now includes a number of examples where GLAM data is harvested, processed, and then made available for exploration via Datasette.\r\n>\r\n> https://glam-workbench.net/\r\n>\r\n> For example the GLAM Name Index Search brings together 10+ million entries from 240 indexes and provides an aggregated search using the Datasette search-all plugin:\r\n>\r\n> https://glam-workbench.net/name-search/\r\n>\r\n> Most recently I converted PDFs of the Tasmanian Postal Directories to a big Datasette instance: https://updates.timsherratt.org/2022/09/15/from-pdfs-to.html the process is documented and reusable.", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1447050738, "label": "Call for birthday presents: if you're using Datasette, let us know how you're using it here"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/pull/1967#issuecomment-1368267484", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/1967", "id": 1368267484, "node_id": "IC_kwDOBm6k_c5Rjhrc", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-12-31T19:15:50Z", "updated_at": "2022-12-31T19:15:50Z", "author_association": "OWNER", "body": "My Firefox tab before:\r\n\r\n\r\n\r\nAnd after:\r\n\r\n\r\n", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 1503010009, "label": "Add favicon to documentation"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/issues/283#issuecomment-855369819", "issue_url": 
"https://api.github.com/repos/simonw/datasette/issues/283", "id": 855369819, "node_id": "MDEyOklzc3VlQ29tbWVudDg1NTM2OTgxOQ==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2021-06-06T09:40:18Z", "updated_at": "2021-06-06T09:40:18Z", "author_association": "OWNER", "body": "> One note on using this pragma I got an error on starting datasette `no such table: pragma_database_list`.\r\n> \r\n> I diagnosed this to an older version of sqlite3 (3.14.2) and upgrading to a newer version (3.34.2) fixed the issue.\r\n\r\nThat issue is fixed in #1276.", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 325958506, "label": "Support cross-database joins"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/issues/394#issuecomment-642522285", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/394", "id": 642522285, "node_id": "MDEyOklzc3VlQ29tbWVudDY0MjUyMjI4NQ==", "user": {"value": 58298410, "label": "LVerneyPEReN"}, "created_at": "2020-06-11T09:15:19Z", "updated_at": "2020-06-11T09:15:19Z", "author_association": "NONE", "body": "Hi @wragge,\r\n\r\nThis looks great, thanks for the share! I refactored it into a self-contained function, binding on a random available TCP port (multi-user context). 
I am using subprocess API directly since the `%run` magic was leaving defunct process behind :/\r\n\r\n![image](https://user-images.githubusercontent.com/58298410/84367566-b5d0d500-abd4-11ea-96e2-f5c05a28e506.png)\r\n\r\n```python\r\nimport socket\r\n\r\nfrom signal import SIGINT\r\nfrom subprocess import Popen, PIPE\r\n\r\nfrom IPython.display import display, HTML\r\nfrom notebook.notebookapp import list_running_servers\r\n\r\n\r\ndef get_free_tcp_port():\r\n \"\"\"\r\n Get a free TCP port.\r\n \"\"\"\r\n tcp = socket.socket(socket.AF_INET, socket.SOCK_STREAM)\r\n tcp.bind(('', 0))\r\n _, port = tcp.getsockname()\r\n tcp.close()\r\n return port\r\n\r\n\r\ndef datasette(database):\r\n \"\"\"\r\n Run datasette on an SQLite database.\r\n \"\"\"\r\n # Get current running servers\r\n servers = list_running_servers()\r\n\r\n # Get the current base url\r\n base_url = next(servers)['base_url']\r\n\r\n # Get a free port\r\n port = get_free_tcp_port()\r\n\r\n # Create a base url for Datasette suing the proxy path\r\n proxy_url = f'{base_url}proxy/absolute/{port}/'\r\n\r\n # Display a link to Datasette\r\n display(HTML(f'
<a href=\"{proxy_url}\">View Datasette</a> (Click on the stop button to close the Datasette server)
'))\r\n\r\n # Launch Datasette\r\n with Popen(\r\n [\r\n 'python', '-m', 'datasette', '--',\r\n database,\r\n '--port', str(port),\r\n '--config', f'base_url:{proxy_url}'\r\n ],\r\n stdout=PIPE,\r\n stderr=PIPE,\r\n bufsize=1,\r\n universal_newlines=True\r\n ) as p:\r\n print(p.stdout.readline(), end='')\r\n while True:\r\n try:\r\n line = p.stderr.readline()\r\n if not line:\r\n break\r\n print(line, end='')\r\n exit_code = p.poll()\r\n except KeyboardInterrupt:\r\n p.send_signal(SIGINT)\r\n```\r\n\r\nIdeally, I'd like some extra magic to notify users when they are leaving the closing the notebook tab and make them terminate the running datasette processes. I'll be looking for it.", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 396212021, "label": "base_url configuration setting"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/issues/509#issuecomment-749749948", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/509", "id": 749749948, "node_id": "MDEyOklzc3VlQ29tbWVudDc0OTc0OTk0OA==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-12-22T20:03:10Z", "updated_at": "2020-12-22T20:03:10Z", "author_association": "OWNER", "body": "If you open multiple files with the same filename, e.g. 
like this:\r\n\r\n datasette fixtures.db templates/fixtures.db plugins/fixtures.db\r\n\r\nYou'll now get this:\r\n\r\n\r\n", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 456568880, "label": "Support opening multiple databases with the same stem"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/issues/526#issuecomment-1259693536", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/526", "id": 1259693536, "node_id": "IC_kwDOBm6k_c5LFWXg", "user": {"value": 9599, "label": "simonw"}, "created_at": "2022-09-27T15:42:55Z", "updated_at": "2022-09-27T15:42:55Z", "author_association": "OWNER", "body": "It's interesting to note WHY the time limit works against this so well.\r\n\r\nThe time limit as-implemented looks like this:\r\n\r\nhttps://github.com/simonw/datasette/blob/5f9f567acbc58c9fcd88af440e68034510fb5d2b/datasette/utils/__init__.py#L181-L201\r\n\r\nThe key here is `conn.set_progress_handler(handler, n)` - which specifies that the handler function should be called every `n` SQLite operations.\r\n\r\nThe handler function then checks to see if too much time has transpired and conditionally cancels the query.\r\n\r\nThis also doubles up as a \"maximum number of operations\" guard, which is what's happening when you attempt to fetch an infinite number of rows from an infinite table.\r\n\r\nThat limit code could even be extended to say \"exit the query after either 5s or 50,000,000 operations\".\r\n\r\nI don't think that's necessary though.\r\n\r\nTo be honest I'm having trouble with the idea of dropping `max_returned_rows` mainly because what Datasette does (allow arbitrary untrusted SQL queries) is dangerous, so I've designed in multiple redundant defence-in-depth mechanisms right from the start.", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 
0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 459882902, "label": "Stream all results for arbitrary SQL and canned queries"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/issues/594#issuecomment-547373739", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/594", "id": 547373739, "node_id": "MDEyOklzc3VlQ29tbWVudDU0NzM3MzczOQ==", "user": {"value": 2680980, "label": "willingc"}, "created_at": "2019-10-29T11:21:52Z", "updated_at": "2019-10-29T11:21:52Z", "author_association": "NONE", "body": "Just an FYI for folks wishing to run datasette with Python 3.8, I was able to successfully use datasette with the following in a virtual environment:\r\n\r\n```\r\npip install uvloop==0.14.0rc1\r\npip install uvicorn==0.9.1\r\n```\r\n", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 506297048, "label": "upgrade to uvicorn-0.9 to be Python-3.8 friendly"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/issues/594#issuecomment-552276247", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/594", "id": 552276247, "node_id": "MDEyOklzc3VlQ29tbWVudDU1MjI3NjI0Nw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2019-11-11T03:13:00Z", "updated_at": "2019-11-11T03:13:00Z", "author_association": "OWNER", "body": "#622", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 506297048, "label": "upgrade to uvicorn-0.9 to be Python-3.8 friendly"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/issues/670#issuecomment-797158641", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/670", "id": 797158641, "node_id": "MDEyOklzc3VlQ29tbWVudDc5NzE1ODY0MQ==", "user": {"value": 
9599, "label": "simonw"}, "created_at": "2021-03-12T00:59:49Z", "updated_at": "2021-03-12T00:59:49Z", "author_association": "OWNER", "body": "> Challenge: what's the equivalent for PostgreSQL of opening a database in read only mode? Will I have to talk users through creating read only credentials?\r\n\r\nIt looks like the answer to this is yes - I'll need users to setup read-only credentials. Here's a TIL about that: https://til.simonwillison.net/postgresql/read-only-postgresql-user", "reactions": "{\"total_count\": 1, \"+1\": 0, \"-1\": 0, \"laugh\": 0, \"hooray\": 0, \"confused\": 0, \"heart\": 1, \"rocket\": 0, \"eyes\": 0}", "issue": {"value": 564833696, "label": "Prototoype for Datasette on PostgreSQL"}, "performed_via_github_app": null}, {"html_url": "https://github.com/simonw/datasette/issues/716#issuecomment-609455243", "issue_url": "https://api.github.com/repos/simonw/datasette/issues/716", "id": 609455243, "node_id": "MDEyOklzc3VlQ29tbWVudDYwOTQ1NTI0Mw==", "user": {"value": 9599, "label": "simonw"}, "created_at": "2020-04-05T17:47:33Z", "updated_at": "2020-04-05T17:47:33Z", "author_association": "OWNER", "body": "You start `git bisect` by giving it a known bad commit and a known good one:\r\n```\r\ngit bisect start master 286ed28 \r\n```\r\nThen you tell it to start running your script:\r\n```\r\ngit bisect run python ../datasette-issue-716/check_view_name.py\r\n```\r\nHere's what I got:\r\n```\r\n(datasette) ~/Dropbox/Development/datasette $ git bisect start master 286ed28\r\nBisecting: 30 revisions left to test after this (roughly 5 steps)\r\n[dc80e779a2e708b2685fc641df99e6aae9ad6f97] Handle scope path if it is a string\r\n(datasette) ~/Dropbox/Development/datasette $ git bisect run python ../datasette-issue-716/check_view_name.py\r\nrunning python ../datasette-issue-716/check_view_name.py\r\nTraceback (most recent call last):\r\n...\r\nBisecting: 15 revisions left to test after this (roughly 4 steps)\r\n[7c6a9c35299f251f9abfb03fd8e85143e4361709] Better 
tests for prepare_connection() plugin hook, refs #678\r\nrunning python ../datasette-issue-716/check_view_name.py\r\nTraceback (most recent call last):\r\n...\r\nBisecting: 7 revisions left to test after this (roughly 3 steps)\r\n[0091dfe3e5a3db94af8881038d3f1b8312bb857d] More reliable tie-break ordering for facet results\r\nrunning python ../datasette-issue-716/check_view_name.py\r\nTraceback (most recent call last):\r\n...\r\nBisecting: 3 revisions left to test after this (roughly 2 steps)\r\n[ce12244037b60ba0202c814871218c1dab38d729] Release notes for 0.35\r\nrunning python ../datasette-issue-716/check_view_name.py\r\nTraceback (most recent call last):\r\n...\r\nBisecting: 0 revisions left to test after this (roughly 1 step)\r\n[4d7dae9eb75e5430c3ee3c369bb5cd9ba0a148bc] Added a bunch more plugins to the Ecosystem page\r\nrunning python ../datasette-issue-716/check_view_name.py\r\nTraceback (most recent call last):\r\n...\r\n70b915fb4bc214f9d064179f87671f8a378aa127 is the first bad commit\r\ncommit 70b915fb4bc214f9d064179f87671f8a378aa127\r\nAuthor: Simon Willison